-
Paper 142 - Session title: Toolboxes 2
11:10 Felyx: open source solution for satellite cal/val and large EO dataset analysis
Piollé, Jean-François (1); Shutler, Jamie (2); Poulter, Dave (3); Herlédan, Sylvain (4); Goryl, Philippe (5); Donlon, Craig (5); Guidetti, Veronica (5) 1: Ifremer, France; 2: University of Exeter, UK; 3: Pelamis Software, UK; 4: OceanDataLab, France; 5: European Space Agency
Felyx is a web tool to facilitate EO data analytics, developed by IFREMER, PML and Pelamis under ESA funding. It is a free, open-source software solution, written in Python and JavaScript, whose aim is to provide Earth Observation data producers and users with a flexible and reusable tool that allows the quality and performance of data streams (satellite, in situ and model) to be easily monitored and studied. It builds on the concept of the former HR-DDS system implemented for various projects (GHRSST, Medspiration, GlobColour and GlobWave) but extends the principle further to also incorporate multi-sensor match-up database capabilities. It is deployable anywhere and even includes interaction mechanisms between deployed instances.
The primary concept of felyx is to work as an extraction tool, subsetting source data over predefined target areas (which can be static or moving): these data subsets, and associated metrics, can then be accessed by users or client applications either as raw files, as automatic alerts and reports generated periodically, or through a flexible web interface enabling statistical analysis and visualization. Felyx presents itself as an open-source suite of tools enabling:
subsetting large local or remote collections of Earth Observation data over predefined sites (geographical boxes) or moving targets (ship, buoy, hurricane), and storing the extracted data locally (referred to as miniprods). These miniprods constitute a much smaller, representative subset of the original collection on which any kind of processing or assessment can be performed without having to cope with heavy data volumes.
computing statistical metrics over these miniprods using, for instance, a set of usual statistical operators (mean, median, RMS, etc.), fully extensible and applicable to any variable of a dataset (a minimal sketch of this extraction-and-metrics idea follows this list). These metrics are stored in a fast search engine, queryable by humans and automated applications.
reporting or alerting, based on user-defined inference rules, through various media (emails, Twitter feeds, etc.) and devices (phones, tablets).
analysing miniprods and metrics through a web interface that allows users to dig into this base of information and extract useful knowledge through multidimensional interactive display functions (time series, scatterplots, histograms, maps).
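The extraction-and-metrics workflow above can be illustrated with a minimal Python sketch. The felyx API itself is not described in this abstract, so everything below (the function names extract_miniprod and compute_metrics, the synthetic SST field) is a hypothetical illustration of the concept, not felyx code.

```python
# Illustrative sketch of the felyx extraction/metrics concept.
# NOTE: these function names are hypothetical -- the real felyx API
# is not described in the abstract.
import numpy as np

def extract_miniprod(field, lats, lons, site_box):
    """Subset a 2D field over a static site (lat/lon bounding box)."""
    lat_min, lat_max, lon_min, lon_max = site_box
    rows = (lats >= lat_min) & (lats <= lat_max)
    cols = (lons >= lon_min) & (lons <= lon_max)
    return field[np.ix_(rows, cols)]  # the "miniprod"

def compute_metrics(miniprod):
    """Usual statistical operators applied to a miniprod variable."""
    valid = miniprod[np.isfinite(miniprod)]
    return {
        "mean": float(np.mean(valid)),
        "median": float(np.median(valid)),
        "rms": float(np.sqrt(np.mean(valid ** 2))),
        "count": int(valid.size),
    }

# Example: a synthetic SST field on a 0.1-degree grid, one static site
lats = np.arange(40.0, 50.0, 0.1)
lons = np.arange(-10.0, 0.0, 0.1)
sst = 285.0 + np.random.randn(lats.size, lons.size)
miniprod = extract_miniprod(sst, lats, lons, (44.0, 46.0, -6.0, -4.0))
print(compute_metrics(miniprod))
```

In the real system, such metrics would be pushed to the fast search engine mentioned above rather than printed.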
Among many other applications, users may want to use felyx for:
monitoring and assessing the quality of Earth observations (e.g. satellite products and time series) through statistical analysis and/or comparison with other data sources
assessing and inter-comparing geophysical inversion algorithms
observing a given phenomenon, collecting and accumulating various parameters over a defined area
combining different sources of data for synergy applications
The services provided by felyx are generic, deployable on users' own premises and adaptable enough to integrate any kind of parameter. Users can operate their own felyx instance at any location, on datasets and parameters of their own interest, and the various instances will be able to interact with each other, creating a web of felyx systems enabling aggregation and cross-comparison of miniprods and metrics from multiple sources.
Initially, two instances are operated simultaneously during a six-month demonstration phase, at IFREMER (on sea surface temperature and ocean wave datasets) and at PML (on ocean colour). A first release is planned to be tested and assessed in the context of Sentinel-3 validation. This presentation will focus on examples of usage for SST or ocean colour, making full use of all felyx features (matchups over buoys, reports, etc.).
-
Paper 1652 - Session title: Toolboxes 2
10:30 QUANTOOLS: the spectral tools for mineral mapping
Kopackova, Veronika (1); Koucka, Lucie (1); Rogass, Christian (2); Mielke, Christian (2); Boesche, Nina (2) 1: Czech Geological Survey, Czech Republic; 2: Helmholtz Centre Potsdam, GFZ German Research Centre For Geosciences
The latest version of the QUANTools will be presented at the conference. These tools allow automatic detection of multiple absorption-feature parameters to classify high-spectral-resolution data. A key issue lies in the detection of bad bands (e.g., noise, atmospheric effects), as these can cause false detections of absorption maxima. This unfavourable parameter differs between HS datasets, as sensors differ in their noise levels and atmospheric conditions differ between acquisitions. The QUANTools are designed in such a way that users can first use a graphical interface to detect bad bands and exclude them from further analysis. They can thereby define an optimal spectral window size as well as tune other parameters (the number of neighbouring bands/wavelengths that are statistically assessed to detect deviating bands/wavelengths). As multiple absorption features and their respective wavelength positions are detected, the newly proposed method has the potential to become a new mapping technique suitable for environments with high heterogeneity and dynamics. Furthermore, the derived depths of the detected absorption features can then be correlated with the abundances of the corresponding minerals/materials.
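QUANTools itself is implemented in IDL, but the core idea just described (masking bad bands, removing the continuum, and locating absorption features by wavelength position and depth) can be sketched in a few lines of Python. The functions below are illustrative stand-ins under that assumption, not QUANTools code; the depths they return correspond to the quantity the abstract proposes to correlate with mineral abundances.

```python
# Conceptual sketch of continuum removal and absorption-feature
# detection on a single spectrum (not QUANTools code).
import numpy as np

def continuum_removed(wavelengths, reflectance):
    """Divide a spectrum by its upper convex hull (the continuum).

    Bad bands are assumed to have been excluded from the inputs already.
    """
    w = np.asarray(wavelengths, dtype=float)
    r = np.asarray(reflectance, dtype=float)
    hull = []
    for i in range(len(w)):
        # Keep only hull points that make a right (clockwise) turn,
        # so the chain stays above the spectrum (upper hull).
        while len(hull) >= 2:
            o, a = hull[-2], hull[-1]
            cross = (w[a] - w[o]) * (r[i] - r[o]) - (r[a] - r[o]) * (w[i] - w[o])
            if cross >= 0:
                hull.pop()
            else:
                break
        hull.append(i)
    continuum = np.interp(w, w[hull], r[hull])
    return r / continuum

def absorption_features(wavelengths, cr_spectrum, min_depth=0.02):
    """Return (wavelength position, depth) of each local absorption minimum."""
    features = []
    for i in range(1, len(cr_spectrum) - 1):
        if cr_spectrum[i] < cr_spectrum[i - 1] and cr_spectrum[i] < cr_spectrum[i + 1]:
            depth = 1.0 - cr_spectrum[i]  # depth below the continuum
            if depth >= min_depth:
                features.append((wavelengths[i], depth))
    return features
```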
One of the main advantages of using these new tools is that prior definition of endmembers is not required to classify hyperspectral data. As inputs, diverse reflectance as well as emissivity data, either in the form of spectral libraries or image data, can be used to model the desired parameters. In this way, QUANTools also allows diverse sensor information to be fused and integrated.
Mineral mapping examples utilizing the VIS/NIR/SWIR/TIR regions will be demonstrated using multi-temporal high-spectral-resolution image data (e.g., HyMap (2009 and 2010) and AHS (2011) data). Furthermore, an example of how these tools can be used to detect Rare Earth Elements (REE) will be demonstrated, as well as their potential for mineral mapping when employing simulated EnMAP data. In addition, it will be demonstrated which minerals can be differentiated using simulated Sentinel-2 and Landsat-8 data.
The QUANTools have been created in the IDL programming language and can be used under ENVI/IDL (version 5.0 and higher).
ACKNOWLEDGEMENTS
The present research is being undertaken within the framework of grant no. LH13266 (Hyper Algo), funded by the Ministry of Education, Youth and Sports of the Czech Republic. The new technique was tuned and tested using hyperspectral datasets acquired under the following grants: grant no. 205/09/1989 (HypSo), funded by the Czech Science Foundation; EO-MINERS, grant no. 244242, funded by the European Commission (FP7); and the DeMinTIR project, funded by EUFAR.
-
Paper 1807 - Session title: Toolboxes 2
10:50 Nansat - a scientist friendly Python toolbox for processing 2D satellite EO raster data
Vines, Aleksander; Korosov, Anton; Hansen, Morten Wergeland; Yamakawa, Asuka; Olaussen, Tor Nansen Center, Norway
Nansat is an open source, scientist friendly Python toolbox for processing 2D satellite EO raster data (https://github.com/nansencenter/nansat). Nansat provides high flexibility in a command-line interface to facilitate easy development and testing of scientific algorithms, easy analysis of geospatial data, and efficient operational processing. To enable the use of several data sources for co-location, comparison and validation, Nansat takes advantage of the existing open-source GDAL C++ library, which is used to read geospatial data from a number of different sources and to perform basic operations on the data. A core feature of the system is that data is read from file only when needed, and subsets of the full dataset can be extracted. This is more efficient than reading the full dataset into memory, as is necessary with, e.g., Matlab and similar software, in particular if the data is stored on a remote server and accessed via, e.g., OPeNDAP.
Nansat contains a package called mappers, which adds metadata to complement the retrieved raster data. Presently, 40 datasets from different satellite instruments and numerical models can be opened with the mappers package, in addition to any gridded datasets following the NetCDF-CF standard.
A set of functions to perform common operations on the data, like georeferencing and re-projection, averaging, transect extraction and analysis, as well as visualization and generation of high quality maps, is available. In addition to this, Nansat contains a set of common export functions.
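A minimal usage sketch follows, based on the pattern documented in the Nansat repository; the input filename is a hypothetical example, and exact method signatures may differ between Nansat versions.

```python
# Sketch of typical Nansat usage (method names follow the documented
# pattern; signatures may vary between versions).
from nansat import Nansat, Domain

# Open a dataset; metadata is added by the matching mapper, and the
# raster itself is read from file only when a band is requested.
n = Nansat('sst_product.nc')          # hypothetical input file
print(n)                              # list available bands and metadata

sst = n['sea_surface_temperature']    # triggers the actual read

# Re-project onto a regular lon/lat grid covering a region of interest
d = Domain('+proj=longlat +datum=WGS84 +no_defs',
           '-te -4 52 10 60 -ts 700 400')
n.reproject(d)

# Export a quick-look figure and a NetCDF-CF copy of the data
n.write_figure('sst_quicklook.png')
n.export('sst_subset.nc')
```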
-
Paper 2015 - Session title: Toolboxes 2
11:30 Enabling Community Collaboration, Information sharing, and Best Practices Software Development with NASA’s Earthdata Suite
Mitchell, Andrew Edward (1); Lowe, Dawn (1); Pilone, Daniel (2) 1: NASA, United States of America; 2: Raytheon/Element 84, United States of America
NASA’s Earth Observing System (EOS) Data and Information System (EOSDIS) is a system of systems enabling the coordination and sharing of scientific research and data across the Earth science community. EOSDIS enables research, algorithm development and testing, and data discovery and access. It is a multi-petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services, from EOS instrument data collection to science data processing to full access to EOS and other Earth science data. EOSDIS has continually evolved to improve the discoverability, accessibility, and usability of its high-impact NASA data products.
NASA has recently deployed several major services under the EOSDIS umbrella and, by taking advantage of these operational and highly scalable services, has lowered the barrier of entry for producing high-value offerings for end users. To expand on this model, NASA is looking at how to design, build, and deploy core services that encourage and accelerate affordable, discipline-specific capabilities and collaboration.
The NASA Earthdata website is a collaborative environment integrating information from across EOSDIS. Earthdata is the entry point for EOSDIS data, articles, documentation, and collaboration, and it leverages NASA’s Common Metadata Repository (CMR) to provide comprehensive search capabilities. Earthdata offers new and experienced users an organized view of EOSDIS resources and the latest events. EOSDIS DAACs are key contributors to Earthdata, providing the latest information on Atmosphere, Solar Radiance, Cryosphere, Human Dimensions, Land, and Ocean science.
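As an illustration of the kind of programmatic search CMR enables, the sketch below queries CMR's public collection search endpoint; the keyword and output handling are a minimal example, not an Earthdata recommendation.

```python
# Minimal collection search against NASA's Common Metadata Repository
# (CMR), the search backend behind Earthdata.
import requests

resp = requests.get(
    'https://cmr.earthdata.nasa.gov/search/collections.json',
    params={'keyword': 'sea level', 'page_size': 5},
    timeout=30,
)
resp.raise_for_status()

# The JSON response follows an Atom-like layout: feed -> entry list
for entry in resp.json()['feed']['entry']:
    print(entry['title'])
```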
In addition to supporting information sharing, NASA has deployed a comprehensive suite of tools to make software development best practices available to the wide variety of EOSDIS development efforts. NASA’s Earthdata Code Collaborative (ECC) offers tools and support for several key aspects of EOSDIS projects: requirements gathering, review, and tracking; user story and work tracking; source code control; automated builds and continuous integration; automated deployment; and team and user collaboration.
The ECC offers EOSDIS-affiliated projects immediate access to proven development workflows and, equally important, provides a central repository for sharing expertise and reusing existing EOSDIS-developed assets. Beyond standard software development support, the ECC has been used as a project's community support tool, as a scalable change management and distribution mechanism for standards and XML schemas, and as a full-service user support capability for operational systems.
This presentation draws on NASA’s experience developing the Sea Level Change Portal with the ECC to discuss lessons learned in designing and developing shared, scalable information and tool-development resources; in identifying, capturing, and codifying community best practices; and in how NASA balances the drive for cost-effective collaboration and development with diverse community needs and expectations.
-
Paper 2564 - Session title: Toolboxes 2
10:10 Sentinel-1 TOPSAR Interferometry with the DIAPASON InSAR software
Ordoqui, Patrick (1); Mora, Oscar (1); Koudogbo, Fifamè Nadège (1); Ganas, Athanassios (2) 1: Altamira-Information, Spain; 2: National Observatory of Athens (NOA), Greece
This paper presents radar interferometry results with Sentinel-1 data in TOPSAR (IW) mode, using the DIAPASON interferometric processing software. The processing steps required to produce interferograms from data in TOPSAR mode will be presented. TOPSAR (Terrain Observation with Progressive Scans SAR) is a beam-steering technique that combines a large swath width (250 km) with a moderate geometric resolution (5 m by 20 m). The backward-to-forward antenna steering in the azimuth direction avoids scalloping and results in a higher-quality image compared to ScanSAR mode.
Due to the specifics of the TOPSAR mode (the data consist of a series of bursts with a high Doppler centroid variation within each burst), a highly accurate azimuth coregistration is required (to within 1/1000 of a pixel) in order to ensure phase continuity in the interferometric product. Failure to meet this accuracy during the coregistration process will lead to artifacts and phase jumps in the interferometric phase.
We therefore present the coregistration scheme of the DIAPASON software, which is suitable for TOPSAR interferometry. The coregistration flow chart includes a deramping step to shift the slave image spectrum to baseband, a burst-level geometric coregistration, later corrected by a beam-level incoherent cross-correlation (ICC), and finally an iterative coregistration grid refinement based on the enhanced spectral diversity (ESD) method. Once the SAR images are coregistered, the image spectrum is restored by a re-ramping operation, then each subswath is debursted and mosaicked before interferogram generation is performed.
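The ESD refinement step can be summarized in a short sketch: the double-difference phase between forward- and backward-looking interferograms in a burst overlap is converted into a residual azimuth shift. The function below applies the standard ESD relation; the variable names and the surrounding I/O are illustrative assumptions, not DIAPASON code.

```python
# Sketch of the enhanced spectral diversity (ESD) estimate used to
# refine azimuth coregistration toward the ~1/1000-pixel level.
import numpy as np

def esd_azimuth_shift(ifg_fwd, ifg_bwd, delta_f_ovl, az_time_interval):
    """Residual azimuth shift (in pixels) from one burst-overlap pair.

    ifg_fwd, ifg_bwd : complex interferograms of the same overlap
        region, formed from the forward- and backward-looking bursts.
    delta_f_ovl : Doppler centroid difference between the two looks
        in the overlap (Hz).
    az_time_interval : azimuth sampling interval (s).
    """
    # Double-difference phase, averaged coherently over the overlap
    dd = ifg_fwd * np.conj(ifg_bwd)
    phi_esd = np.angle(np.nansum(dd))
    # Standard ESD relation: phi_esd = 2*pi * delta_f_ovl * dt_az
    dt_az = phi_esd / (2.0 * np.pi * delta_f_ovl)
    return dt_az / az_time_interval  # convert seconds to pixels
```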
The method has been used to produce co-seismic interferograms of the Nepal earthquake (which occurred on 25 April 2015) and the Chile Illapel earthquake (which took place on 16 September 2015). For each pair, the residual azimuth coregistration error is estimated at a few thousandths of a pixel, and the absence of phase artifacts is verified in order to validate the method.
In the framework of the RASOR (Rapid Analysis and Spatialisation of Risk) project, 19 Sentinel-1 images acquired over Santorini (Greece) were processed to generate ground motion maps of the caldera. The Santorini volcano's last major explosive eruption was about 3600 years ago. This event formed a large crater, or caldera, which is now flooded by the sea. The PSI results show a subsidence signal in several areas near the caldera, as well as on the Kameni islands, which lie in the middle of Santorini's large flooded crater. More data will be processed to establish a pattern; moreover, modelling will be applied in order to determine whether the detected subsidence is due to volcano dynamics. The results of this analysis will also be presented.