-
Paper 106 - Session title: EO Open Science Posters
OPEN-12 - Access and processing facilities for satellite data, tailored for Black Sea basin
Constantin, Sorin; Crăciunescu, Vasile; Constantinescu, Ștefan; Șerban, Florin Terrasigna, Romania
The Black Sea is a unique marine environment, with specific environmental, economic and social problems to be tackled by the scientific and administrative communities. Regional studies exploiting remote sensing data are considerably fewer than for other seas surrounding Europe (e.g. the Mediterranean, Baltic or North Sea). Efficient analysis and interpretation of EO data can contribute to a better understanding of the environmental challenges that the Black Sea is facing. Special attention therefore needs to be given to actions that foster larger-scale usage of satellite information by stakeholders. Terrasigna dedicates significant efforts to this goal, and its actions are visible through the projects it has developed. One such initiative is the Earth Science Data Access and Processing Service for Black Sea (ESPOSS) project. Another is ESA's Thematic Exploitation Platform for the Coastal Theme (C-TEP), which kicked off in 2015 with Terrasigna as part of the consortium.
In order to meet users' demands for simple interfaces to inspect, download, process and analyze the data, Terrasigna developed ESPOSS, a web-based EO data access and processing application for the Black Sea. The application incorporates geospatial data from various sources (satellite images, in-situ data, model outputs) and serves as a working tool for the scientific community and the relevant decision makers. It offers services based on Open Geospatial Consortium (OGC) standards for data retrieval (WMS, WCS, WFS) and server-side processing (WPS). The services were built upon open-source solutions.
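As an illustration of the OGC access pattern described above, the sketch below retrieves a map layer from a WMS endpoint using the OWSLib Python library. The ESPOSS service URL and layer name are hypothetical placeholders; only the OWSLib calls themselves are standard.

```python
# Minimal sketch of WMS data retrieval with OWSLib; the ESPOSS endpoint
# and layer name below are hypothetical placeholders.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/esposs/wms", version="1.1.1")
print(list(wms.contents))  # enumerate the layers the service advertises

img = wms.getmap(
    layers=["modis_sst_8day"],          # hypothetical layer name
    srs="EPSG:4326",
    bbox=(27.0, 40.5, 42.0, 47.5),      # approximate Black Sea extent
    size=(800, 450),
    format="image/png",
    transparent=True,
)
with open("black_sea_sst.png", "wb") as f:
    f.write(img.read())
```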
Users have access, among other geospatial data, to a wealth of satellite data coming from various sources and sensors, all related to Black Sea environmental parameters. To date, ESPOSS provides access to products such as MERIS Chlorophyll Level 3 data (monthly averaged products), MERIS Total Suspended Matter (TSM), and MODIS-derived Level 3 products based on 8-day composites, available from NASA's Ocean Color Data Distribution Center (Sea Surface Temperature, Chlorophyll Concentration and Diffuse Attenuation Coefficient at 490 nm). ESPOSS has also integrated daily forecast data on significant wave height and total water level for the Black Sea (kindly provided by the Kassandra Storm Surge Modelling System consortium). In-situ data from national and international institutions (e.g. the EUXINUS buoy network) will also be made available.
While ESPOSS addresses the needs of less experienced EO users, C-TEP aims to provide the scientific community with a complete working environment in which users have access to large volumes of data, computing resources and processing software. C-TEP is designed as a collaborative platform where access will be granted to users through different types of portals (web, using the HTTP protocol, or the Remote Desktop Protocol), depending on the nature of the connection. C-TEP should anticipate the user needs that might arise from the unprecedented amount of EO data that will be provided by current and future missions. Within the larger platform, several smaller test applications, called Child TEPs, are going to be developed. Terrasigna will be responsible for designing the Child TEP tailored to the Black Sea basin.
ESPOSS is mainly developed to disseminate higher processing levels of data (Level 3 and Level 4), while C-TEP is designed to handle large amounts of Level 1 and Level 2 information. The development of the two platforms implies the use of similar technologies for specific modules, such as the OGC standard web services. The client interfaces might also share some common features.
ESA activities for increasing remote sensing data usage will have a major impact on the activities carried out by stakeholders interested in the Black Sea basin. The ESPOSS and C-TEP platforms will make a significant contribution.
-
Paper 113 - Session title: EO Open Science Posters
OPEN-45 - IPSentinel: The Collaborative Infrastructure for Sentinel Data in Portugal
Santos, José (1); Crisógono, Paulo (1); Caetano, Mário (1); Barbeiro, Andreia (1); Silva, Marisa (1); Patrício, Paulo (1); Anjos, Bruno (2) 1: Directorate-General for the Territory Development (DGT); 2: Portuguese Sea and Atmosphere Institute, I. P. (IPMA, IP)
The Directorate-General for the Territory Development (DGT) and the Portuguese Sea and Atmosphere Institute (IPMA), as Portuguese national institutions with responsibilities in the management and processing of satellite imagery, have started a new project, financially supported by the EEA Grants and managed by the Directorate-General for Sea Policy (DGPM), for the implementation and coordination of a new technological infrastructure for the storage and dissemination of images from the Sentinel satellites acquired under the Copernicus programme. This creates a national platform that will act as a "National Mirror" for Portugal, designated the Portuguese Infrastructure for Sentinel data (IPSentinel).
IPSentinel will be part of the global infrastructure operated under the responsibility of the European Space Agency (ESA). It will store all images obtained by the different Sentinel satellites over a geographical area of interest to Portugal, creating privileged access to a wide range of Earth Observation (EO) data for national users and interested entities.
The implementation of this technical infrastructure followed the development of a new architecture that fulfils the national interest and consists of the daily ingestion of data provided by Sentinel-1A/B, Sentinel-2A/B and Sentinel-3A/B/C. Furthermore, a WebService will be developed to provide derivative products from the Sentinel missions, compliant with Open Geospatial Consortium (OGC) standards. The end users of these products will be public institutions, which can profit from them and define accurate policies using these derivative satellite products. The entry point for these services will be the Portuguese website Copernicus.PT, which will display all scientific information on the Copernicus programme in the framework of this national project.
The technical development of this infrastructure will be based on ESA software, the Data Hub Service (DHuS), which provides a complete system for data visualization and dissemination. The routines for daily data ingestion into the DGT/IPMA servers, for the data levels provided by ESA for the Sentinels, will be developed as external scripts under the GNU GPL v3 license.
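Since DHuS exposes a standard OpenSearch interface, a national mirror built on it can be queried as sketched below. The IPSentinel host name is a hypothetical placeholder; the query parameters follow the generic DHuS search API.

```python
# Sketch of querying a DHuS-based mirror via its OpenSearch API;
# the host name is a hypothetical placeholder.
import requests

DHUS = "https://ipsentinel.example.pt/dhus"  # hypothetical national mirror
query = 'platformname:Sentinel-2 AND footprint:"Intersects(38.7, -9.1)"'

resp = requests.get(
    f"{DHUS}/search",
    params={"q": query, "rows": 10, "start": 0},
    auth=("user", "password"),
)
resp.raise_for_status()
print(resp.text[:500])  # Atom XML listing the matching Sentinel products
```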
-
Paper 152 - Session title: EO Open Science Posters
OPEN-47 - Next generation Sentinel data hub
Milčinski, Grega (1); Batič, Matej (1); Kadunc, Miha (1); Oštir, Krištof (2) 1: Sinergise, Slovenia; 2: ZRC-SAZU, Slovenia
Copernicus data, provided by the European Commission from a series of Sentinel satellites built and operated by the European Space Agency, are revolutionizing Earth observation. Free, full and open access to data with very short revisit times, high spatial resolution and good spectral resolution is crucial for many applications. The portfolio of possible products is limitless. However, the current gap between the data available through Copernicus and the final users is simply too large to bridge. For example, Sentinel-2 Level 1C products, which can be downloaded via a not very straightforward interface, are huge, delivered in a complex data format, and require additional processing - issues which eliminate the vast majority of potential users. Adding the temporal component - a new image of an area several times a week, otherwise a most welcome fact - reduces the number of organizations capable of managing the data to just a few. The Sentinels Scientific Data Hub has 7,500 registered users, a number not as large as one would have expected.
Some EO data providers are trying to tackle the issue in an old-fashioned way, offering derivative products generated in a semi-automatic or even manual way. But due to the labor required to process the data, the costs of using such products are still too high. We tried to approach it from a different angle, thinking about which products and services we can offer without any manual effort on our side that are easily accessible by end users. At first glance there are quite a few. We will demonstrate a working prototype of a fully automatic archiving, processing and distribution rolling-archive infrastructure, which downloads the data from the Hub, performs basic processing and makes it available in full resolution through a web-based GIS client, a WMS service or specialized APIs capable of streaming raw data, statistics, time-lapses and similar services. On-the-fly processing and visualization of satellite data make it possible to show not only true- and false-color images but also vegetation indices and similar analyses based on different sensor bands. We are adding automatic land use classification, crop mask and crop state, drought and flood identification and similar products as well. All of these services can be used either in third-party tools or within our cloud GIS, Geopedia, which makes it possible to analyze the data in combination with other alphanumeric or spatial datasets.
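The band arithmetic behind the on-the-fly vegetation index rendering mentioned above is straightforward; a minimal sketch follows, with random arrays standing in for Sentinel-2 red (B04) and near-infrared (B08) tiles from the rolling archive.

```python
# NDVI from red and near-infrared reflectance; arrays are stand-ins
# for tiles the service would read from its rolling archive.
import numpy as np

red = np.random.rand(512, 512).astype(np.float32)   # stand-in for B04
nir = np.random.rand(512, 512).astype(np.float32)   # stand-in for B08

ndvi = (nir - red) / np.maximum(nir + red, 1e-6)    # guard against divide-by-zero
print(ndvi.min(), ndvi.max())
```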
We will also share our experience of building this service, the challenges we had to overcome and the limitations that remain in the product.
-
Paper 205 - Session title: EO Open Science Posters
OPEN-26 - Novel 3D representation for cloud
Lo Tauro, Agata MIUR, Italy
This research is an opportunity to discuss how well investments in the development of surveying and novel "open 3D mapping" through international programmes may translate into operational, useful and relevant tools for professionals in the territory. Environmental analysts will be given the floor to provide feedback on their experience with geomatic applications and on their needs, under a pluri-disciplinary approach. The suitability of existing applications will be analyzed, as well as the efficiency of current support mechanisms for regions wishing to take up innovative applications and open data for "land conservation". The interdisciplinary research will analyze the following topics: strategic planning, environmental impact assessment, water conservation, etc., from regional, national, European and international strategies. This is a work in progress.
-
Paper 310 - Session title: EO Open Science Posters
OPEN-41 - Aviso+: altimetry products & services in 2016
Rosmorduc, Vinca (1); Bronner, Emilie (2); Maheu, Caroline (3); Mertz, Francoise (1) 1: CLS, France; 2: Cnes, France; 3: Akka, France
Since the launch of Topex/Poseidon, more than 23 years ago, satellite altimetry has evolved in parallel with the user community and oceanography. As a result of this evolution, we now have:
- a wider choice of products, increasingly easy to use, spanning from complete GDRs to pre-computed sea level anomalies, gridded datasets and indicators such as the MSL index or ENSO index;
- a mature approach, combining altimetric data from various satellites and merging data acquired using different observation techniques, including altimetry, to give us a global view of the ocean;
- data available in real or near-real time for operational use.
Different services are available to choose between the various datasets, or to download, extract or even visualize the data. A smartphone application, AvisOcean, and an online extraction tool (Online Data Extraction Service) were also launched in October 2014, providing information about the data and their updates. An overview of available products and services, and how to access them today, will be presented.
-
Paper 311 - Session title: EO Open Science Posters
OPEN-32 - Extracting Classes of Pixels with Similar Evolution in Satellite Image Time Series – Fundamentals and Main Algorithms
Cucu-Dumitrescu, Catalin (1); Serban, Florin (1); Stoica, Adrian (1); Vaduva, Corina (2); Pačes, Martin (3); Iapaolo, Michele (4) 1: TERRASIGNA Ltd, Romania; 2: University Politehnica of Bucharest - CeoSpaceTech, Romania; 3: EOX GmBH, Austria; 4: ESA ESRIN
The principles and methods for a future prototype Satellite Image Time Series (SITS) classifier are presented. The prototype can extract classes of pixels which experience similar evolution in time, also offering the possibility to assess the development of a scene based on alternative approaches. The solutions are provided by different integrated algorithms, which can be selected according to the needs of the application scenario. Visualization modes for the classification results are suggested.
The results presented in this poster derive from the ongoing ESA GSTP project: Data Mining for Analysis and exploitation of next generation of Time Series (DAMATS).
-
Paper 312 - Session title: EO Open Science Posters
OPEN-33 - Semantic Search in Satellite Imagery – Methods and Algorithms for a Future CBIR Prototype
Cucu-Dumitrescu, Catalin (1); Serban, Florin (1); Stoica, Adrian (1); Vaduva, Corina (2); Iapaolo, Michele (3) 1: TERRASIGNA Ltd, Romania; 2: University Politehnica of Bucharest - CeoSpaceTech; 3: ESA ESRIN
A tile-based search engine for satellite imagery is presented. This engine shall be the core segment of a future Content-Based Image Retrieval (CBIR) prototype service for finding similar patches over large amounts of data. Both optical and radar scenes can be used. Each tile will be characterized by specific descriptors adapted to the particularities of EO images. Positive and negative examples will define a specific class, and semantic annotation will then be performed by means of machine learning algorithms. Several approaches to visualizing the search results will be considered, aiming to highlight the patches or scenes most similar to a specific query area.
The results presented in this poster derive from the ongoing ESA GSTP project: Open Source Image Retrieval - Integration of Developed tools (OSIRIDE).
-
Paper 347 - Session title: EO Open Science Posters
OPEN-4 - Interactive Downloader for Radar Altimetry Data
Ghosh, Surajit (1); Choudhury, Sunil (1); Thakur, Praveen (1); Gupta, Prasun Kumar (1); Garg, Vaibhav (1); Nandy, Subrata (1); Aggarwal, Shivprasad (1); Sharma, Rashmi (2); Bhattacharyya, Soumya (3) 1: Indian Institute of Remote Sensing, Dehradun,India; 2: Space Applications Centre, Ahmedabad, India; 3: National Institute of Technology, Durgapur, India
Satellite radar altimetry was primarily designed for studying large-scale oceanic phenomena. Nowadays satellite altimetry is also used for the monitoring and assessment of continental hydrologic components. Radar altimetry data sets are freely available for research purposes. The datasets, both raw and processed, are provided by the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), Archiving, Validation and Interpretation of Satellite Oceanographic data (AVISO), the Indian Space Research Organisation (ISRO) and several other institutes and agencies. In general the data sets are available in Network Common Data Form (NetCDF). AVISO and ISRO provide sensor interim data records from different radar altimeters. A few online services are also available for downloading and extracting altimetry data for a particular geographic location. However, there is no standalone tool for downloading and managing radar altimetry data. The main purpose of the present study was to develop a standalone tool for downloading radar altimetry data from the ISRO and AVISO servers. The interface and basic structure were designed and developed in Java. The tool, Interactive Downloader for Radar Altimetry data (IDRAd), was designed in such a manner that users can interact with it in several ways: they can simply pass the ground-track number of the desired satellite, or import a shapefile or KML file of the preferred geographic area, to download the data from the remote servers. To handle different kinds of vector data, viz. shapefiles and KML files, a Python script using the GDAL and OGR packages was employed, as Python, GDAL and OGR are fully capable of processing spatial data. The tool also facilitates database maintenance of the downloaded data. A database driver and query processor (DBQP) was developed for managing the data, and natural language processing was used for all database queries. C++ and a few DOS commands were also used to meet the software requirements.
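The GDAL/OGR handling described above can be illustrated as follows: open a user-supplied vector file and derive the bounding box used to select ground tracks. The file name and the downstream query are illustrative only.

```python
# Sketch of deriving an area-of-interest bounding box with GDAL/OGR;
# file name and downstream use are illustrative.
from osgeo import ogr

ds = ogr.Open("area_of_interest.kml")   # OGR reads shapefiles and KML alike
layer = ds.GetLayer(0)
xmin, xmax, ymin, ymax = layer.GetExtent()
print(f"select tracks crossing lon [{xmin}, {xmax}], lat [{ymin}, {ymax}]")
```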
-
Paper 375 - Session title: EO Open Science Posters
OPEN-31 - Spaceknow Intelligence - cloud solution to analytics of satellite imagery
Reinstein, Michal; Simanek, Jakub Spaceknow, Inc., United States of America
Our primary contribution resides in the way we combine the latest cloud technology with machine learning algorithms for real-time object classification and generalized change detection in multispectral, multi-resolution satellite images. To achieve this goal we developed a proprietary platform with an interface enabling easy, unified access to a number of satellite providers. This one-point access to satellite imagery exploits the Google App Engine and Amazon AWS technologies to provide Intelligence - an integrated geospatial processing dataflow accessible through RESTful APIs or a user-friendly web interface.
Our system can be used for advanced on-the-fly analytics of satellite imagery, aided by relevant social media information and news feeds, integrated into a single interface optimized for the best user experience. Selected spatio-temporal data from any location on Earth can be analyzed in real time; distributed batch processing can be delivered regardless of the size of the dataset (limited only by the availability of the images on the providers' side).
Using our own APIs we are able to monitor industrial areas in order to estimate economic indicators for whole countries or for selected individual facilities, e.g. the monitoring of industrial facilities all across China. Using proprietary algorithms for such big-data aggregation and analytics, we measure the level of manufacturing activity and report it as the Satellite Manufacturing Index (SMI). In this way we have analyzed over 2.2 billion individual satellite observations, corresponding to an area of 0.5 million square kilometers and spanning the last 14 years.
In the case of natural disasters, or for safety and security applications, our monitoring system can easily be used for automatic alerting. For this purpose we deploy a generalized change detection algorithm exploiting unsupervised machine learning approaches. The level of detail can be tuned for the desired application, ranging from monitoring a single construction site to damage assessment of tsunami-affected areas. For real-time object classification we apply state-of-the-art machine learning algorithms, both supervised and unsupervised, to high-resolution satellite imagery. Though the algorithms are experimental, their deployment in the cloud means that the ever-growing annotated datasets improve overall performance over time. The specific target application is mostly customer-driven, but currently we offer a cloud solution for the detection and localization of buildings, cars and trucks. The most powerful solution always lies in fusing data of different modalities and sources of information, even beyond the scope of satellite imagery.
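The proprietary algorithm itself is not described in the abstract; as a point of reference, a textbook unsupervised baseline for change detection on two co-registered acquisitions is image differencing followed by automatic thresholding, sketched below.

```python
# Generic unsupervised change detection baseline (not Spaceknow's
# proprietary algorithm): band differencing plus Otsu thresholding.
import numpy as np
from skimage.filters import threshold_otsu

before = np.random.rand(256, 256).astype(np.float32)  # stand-in, older image
after = np.random.rand(256, 256).astype(np.float32)   # stand-in, newer image

diff = np.abs(after - before)
change_mask = diff > threshold_otsu(diff)             # True where change is significant
print(f"{change_mask.mean():.1%} of pixels flagged as changed")
```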
We aim to offer our platform as a universal, multipurpose tool to the scientific community, researchers and analysts, as well as to commercial customers and companies, all with different needs and technical abilities.
-
Paper 451 - Session title: EO Open Science Posters
OPEN-38 - The Pericles Space Case: Preserving Earth Observation Data for the Future.
Pandey, Praveen; Muller, Christian B.USOC, Belgium
PERICLES (Promoting and Enhancing the Reuse of Information throughout the Content Lifecycle exploiting Evolving Semantics) is an FP7 project started in February 2013. It aims at preserving large and complex data sets by design.
PERICLES is coordinated by King's College London, UK, and its partners are the University of Borås (Sweden), CERT (Greece), DotSoft (Greece), Georg-August-Universität Göttingen (Germany), the University of Liverpool (UK), Space Application Services (Belgium), XEROX France and the University of Edinburgh (UK).
Two additional partners provide the two case studies: Tate Gallery (UK) brings the digital art and media case study and B.USOC (Belgian Users Support and Operations Centre) brings the space science case study.
PERICLES addresses the lifecycle of large and complex data sets in order to cater for the evolution of the context of data sets and user communities, including groups unanticipated when the data were created. The semantics of data sets are thus also expected to evolve, and the project includes elements which could address the reuse of data sets at times when the data providers, and even their institutions, are no longer available. PERICLES applies the Linked Resource Model (LRM). B.USOC supports experiments on the International Space Station and is the curator of the collected data and operations history. The B.USOC operations team includes B.USOC and SpaceApps personnel and is thus ideally configured to participate in this project. As a first test of the concept, B.USOC has chosen to analyse the SOLAR payload, flying since 2008 on the ESA COLUMBUS module of the ISS. Solar observation data are prime candidates for long-term data preservation, as variability of the solar spectral irradiance has an influence on the Earth's climate. The paradigm of these observations has changed considerably in the last fifty years: scientists once aimed at determining with high accuracy the "solar constant", the total solar energy per surface unit received at the top of the Earth's atmosphere, whereas the same quantity is now known as the total solar irradiance and has been shown by thirty years of space observations to vary by about one tenth of a percent in synchronism with the solar cycle. More recently, larger variations have been detected at UV wavelengths, but their effects on climate and atmospheric chemistry are still a matter of scientific discussion.
The PERICLES project goes much further than its application to the single SOLAR case: it intends to develop into a new scheme for acquiring data from new missions.
One of its main elements will be to establish the basis of data preservation already at acquisition time, instead of having to replay the mission at the end of its space segment operations, as is planned for SOLAR.
The different tools developed in the PERICLES project are now being tested on an authorized slice of ISS data. In particular, the "Process Extractor Tool" and the "Anomaly Detector" have been evaluated in the B.USOC environment. A new encapsulation tool called PERICAT has been developed and will be demonstrated in the SOLAR case by gathering all the SOLAR elements and data into a single archive.
Applications of the PERICLES process to other experiments managed by B.USOC and to ESA's Long Term Data Preservation programme (HSO) are under consideration and will be discussed, as well as an extension to the Earth observation programme.
-
Paper 557 - Session title: EO Open Science Posters
OPEN-42 - The Status and Future Directions of the GRACE Mission
Tapley, Byron (1); Flechtner, Frank (2); Watkins, Michael (1); Bettadpur, Srinivas (1) 1: University of Texas at Austin, United States of America; 2: German Research Centre for Geosciences, Germany
The twin satellites of the Gravity Recovery and Climate Experiment (GRACE) were launched on March 17, 2002 and have operated for over 14 years. The mission objective is to sense the spatial and temporal variations of the Earth's mass through their effects on the gravity field at the GRACE satellite altitude. The major cause of the time-varying mass is water motion, and the GRACE mission has provided a continuous, decade-long measurement sequence which characterizes the seasonal cycle of mass transport between the oceans, land, cryosphere and atmosphere; its inter-annual variability; and the climate-driven secular, or long-period, mass transport signals. In 2012, a complete reanalysis of the mission data, referred to as the RL05 data release, was initiated. The monthly solutions from this effort were released in mid-2013, with the mean fields following in subsequent years. The mission is entering the final phase of operations. The current mission operations strategy emphasizes extending the mission lifetime to achieve overlap with the GRACE Follow-On mission. This presentation will review the mission status and the projections for mission lifetime, describe the issues that influence the operations philosophy, discuss recent science results and summarize the content of the science data products during this transition period.
-
Paper 642 - Session title: EO Open Science Posters
OPEN-5 - DIMITRI: A tool for sensor radiometry monitoring
Berthelot, Béatrice G. (1); Arias, Manuel (2); Bouvet, Marc (3); Camlong, Nathalie (1); Mettel, Pascal (1); Alhammoud, Bahjat (2) 1: Magellium, France; 2: Argans Limited, UK; 3: ESA, ESTEC, The Netherlands
DIMITRI is an ESA tool developed to monitor the radiometry of sensors using bright and dark sites located in the ocean and deserts.
Long time series of satellite data (MERIS, MODIS/Aqua, AATSR, ATSR-2, VEGETATION and PARASOL) are available in the database; most of them are available from 2002 up to now. Sentinel-2/MSI, Sentinel-3/OLCI and Landsat-8/OLI acquisitions are now also managed by the application.
The software contains functionalities (i) to automate data downloading, storage, access and selection, (ii) to assess the quality of the data used for sensor radiometry monitoring by defining quality indicators, and (iii) to control cloud contamination by means of an automatic cloud screening.
Once the data are ingested, the software allows the user to monitor the sensor radiometry using dark ocean sites located in the South Indian Ocean and the South Pacific Gyre, bright targets when sunglint is present in the acquisitions, and bright desert sites located in Libya.
Two types of methodologies are implemented in DIMITRI. The first, called "Sensor to Sensor comparison", allows TOA reflectances acquired under quasi-identical geometrical conditions by two sensors to be compared. The second, called "Sensor to Simulation comparison", allows radiative transfer simulations of TOA reflectance to be compared with TOA measurements. Three methods are available, using the Rayleigh, sunglint and desert sites.
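A minimal sketch of the sensor-to-sensor statistic, assuming matchups of TOA reflectance already extracted for near-identical viewing geometries (the numbers are illustrative placeholders, not DIMITRI output):

```python
# Ratio statistics between a monitored and a reference sensor over
# matched acquisitions; values are illustrative placeholders.
import numpy as np

ref_toa = np.array([0.052, 0.048, 0.050, 0.051])   # reference sensor matchups
mon_toa = np.array([0.054, 0.049, 0.052, 0.053])   # monitored sensor matchups

ratio = mon_toa / ref_toa
print(f"mean ratio {ratio.mean():.4f}, i.e. bias ~ {100 * (ratio.mean() - 1):.2f}%")
```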
Results are displayed graphically by the application, and a synthesis of the radiometry evolution is stored in a database.
-
Paper 835 - Session title: EO Open Science Posters
OPEN-46 - Australia’s Regional Copernicus Data Access/Analysis Hub Initiative
Ross, Jonathon (1); Hicks, Richard (2); Witte, Christian (3); Woodcock, Robert (4); Lewis, Adam (1); Adams, Matthew (5); Thankappan, Medhavy (1) 1: Geoscience Australia, Australia; 2: New South Wales Government Office of Environment and Heritage; 3: Queensland Government Department of Science, Information Technology and Innovation; 4: Commonwealth Scientific and Industrial Research Organisation; 5: Landgate Western Australia
The European Union’s Copernicus programme will change the global Earth observation landscape, providing vast amounts of data on an operational basis over the long term. However, the huge data volumes that are the strength of Copernicus also present its major challenge. Ensuring that this volume of data is made available in forms that make it usable is very challenging. Old paradigms based on each individual user downloading all data to local systems for their own applications will not scale sufficiently to support the volumes of data that Copernicus will produce.
Particular technical challenges exacerbate these issues in the region around Australia: South-East Asia and the Pacific. Bandwidth is often limited, and data storage for huge volumes of data can be problematic. Tackling these problems at the level of individual institutions or users, where data is downloaded many times, is even more problematic; and when implemented, such ‘silo’ type solutions create barriers to collaboration across domains and disciplines that Copernicus, by virtue of the comprehensiveness and consistency of the data it offers, makes possible.
Australia is using its location in the region, and its expertise, to support the European Union to address these challenges.
Through the Regional Copernicus Data Access/Analysis Hub, Australia will greatly improve access to Copernicus data for users in the South-East Asia and Pacific region. This will include providing access to very large volumes of Copernicus data ready for use and analysis. This will help avoid key barriers, particularly those caused by limited bandwidth in some parts of the region, and those related to the challenges of storing the Petabytes of data that Copernicus will generate in multiple locations.
As well as overcoming significant technical challenges that would otherwise prevent effective exploitation of Copernicus data within a specific country in the region, the Regional Copernicus Data Access/Analysis Hub is also intended to provide a platform to enhance collaboration across borders.
By enabling users from across nations, and across disciplines and sectors, to work together 'around' the same data, and to share and combine their results, barriers to cooperation and collaboration are broken down. This, in turn, enables people to work together more effectively in pursuit of the goals of fora such as the Asia-Pacific Economic Cooperation and the Association of South East Asian Nations. Tackling challenges like sustainable livelihoods, growth of the blue economy and climate change becomes easier.
This paper reviews the history and status of the initiative, describes the unique approaches that are being taken to establish the data infrastructure, discusses how it will enable effective exploitation of Copernicus data across the South East Asia and Pacific region, and discusses how federal and state governments are collaborating to establish something that ‘gives back’, in a very concrete way, to the nations that provide the satellite data that is so important to Australia.
-
Paper 898 - Session title: EO Open Science Posters
OPEN-6 - E-merge Network - An Interactive Environment for Alerting about the Emergency Situations
Ogrizovic, Vukan (2); Ignjatovic Stupar, Danijela (1) 1: International Space University, Strasbourg, France; 2: University of Belgrade, Faculty of Civil Engineering, Belgrade, Serbia
"E-merge Net" is an acronym for Emergency Network. It is a software solution based on a symbiosis of a monitoring web application and a smart-phone app. We have chosen the name after the main purpose of the system, which is to merge the emergency services with the wide network of the users informing them about the urgent traffic, weather or any other type of the incident, using their mobile devices, equipped with GNSS sensors. Having the precise position of the incident, with a snapshot of the spot of the incident, the operator in the monitoring centre can quickly and safely decide what kind of the action is needed and which specific emergency service should be alarmed (e.g. ambulance, police, fire department).
The E-merge network allows ordinary people to have real-time access to the situation in the places where they live or where they want to travel. For example, if a weather-related traffic disturbance is registered in a street someone is about to pass through, that user of the system can choose another route. Or, if a communal accident is reported near someone's property, he or she can react quickly and come home in time to protect it. To use this application, the following space data are needed:
- Digital terrain or elevation models
- Thematic information concerning infrastructure, specific areas of interest or population density
- GNSS positions (2-5 m accuracy, 24/7/365 availability)
- Street maps (5 m accuracy)
- Rapid maps (4 m resolution, 1-6 h temporal frequency) based on optical and radar data. After confirmation of the alert, the rapid mapping centre will use Sentinel-1 (Extra Wide Swath Mode) and Sentinel-3 (Surface Topography Mission, Sea and Land Surface Temperature Radiometer, Ocean and Land Colour Instrument) products to create maps of the accident. The map showing the details of the event will be transferred to our monitoring centre, where the emergency services will respond to the situation. The resulting information will be transferred to all smartphone users via their E-mergeNet apps. Since geo-coded information about the event is received, the emergency services can check the official property ownership records and contact the owner of the place where the event happens.
The advantage of this application is its self-expanding nature. Since the smartphone app is free of charge, it becomes interesting and popular with everyone who wants to be informed about situations that can affect them or their property. Also, if users notice something that falls within the scope of this application, there is no need for a phone call; a tap on the app icon suffices.
-
Paper 916 - Session title: EO Open Science Posters
OPEN-48 - Copernicus Data and Exploitation Platform – a German national collaborative ground segment
Keuck, Vanessa; Hoffmann, Jörn; Staudenrausch, Helmut DLR, Germany
Copernicus establishes an operational European Earth Observation capacity. Copernicus collects data from multiple sources, processes them and provides users with reliable and up-to-date information regarding environmental and security issues. ESA implements the Sentinel data access infrastructure and has invited Participating States to establish national collaborative ground segments, aiming at the additional use of the Copernicus datasets.
The goal of the German Copernicus Data and Exploitation Platform is to set up an infrastructure for data access and value-added product generation. The functional design is based on user requirements and different application scenarios. It is conceptually structured into one interface layer, two service groups and governance. The Search & Portrayal interface will feature a Portal Website, a Marketplace and the Service Provisioning, enabling the user to access all infrastructure elements, to publish user datasets/products and to deploy and launch their own applications. The archive component will contain a mirror providing fast access to the mission data, local storage for value-added products and a catalogue service with an external interface. As one of the requirements is to archive all Sentinel data - all time steps, world-wide - an intermediate archive with bulk data access functionality is additionally foreseen. Key challenges for the interface design are the high data rates (560 Mbit rate, 8PSK, 290 MHz) and the massive data volume accumulating over the years (~12.5 PB by 2018). It is difficult to predict the actual use in detail. Therefore, the functionalities (ingestion service, subscription service, subsetting and distribution service, download service) and the processing facilities (applications, service hosting, hosted processing) have to be designed in a scalable manner.
-
Paper 1012 - Session title: EO Open Science Posters
OPEN-1 - DeDop3: the tool to process altimetry data yourself
Roca, Mònica (1); Cotton, David (2); Fomferra, Norman (3); Brockley, David (4); Garcia-Mondéjar, Albert (1); Escolà, Roger (1); Moyano, Gorka (1); Pattle, Mark (1); Baker, Steven (4); Tournadre, Jean (5); Chapron, Bertrand (5); Fabry, Pierre (6); Bercher, Nicolas (6); Ray, Chris (1); Walczowski, Waldemar (7); Bulczak (Kaczmarska), Anna (8); Martín-Puig, Cristina (9); Galin, Natalia (10); Donlon, Craig (11) 1: isardSAT -UK; 2: SatOC; 3: Brockmann Consulting; 4: UCL-MSSL; 5: Ifremer; 6: Along-Track; 7: IO PAN (IO PAS); 8: isardSAT -PL; 9: NOAA; 10: isardSAT's Consultant; 11: ESA/ESTEC
The recent development of SAR altimetry, or more properly delay Doppler altimetry, as implemented on CryoSat-2 and soon to be operated on Sentinel-3, opens an exciting new era for the scientific community. This new approach offers scientists an opportunity to develop new processing schemes and derive new and improved products, and so maximise the benefits of the measurements available from upcoming missions.
Historically, in conventional altimetry, the understanding of the Level 1b processor was in the hands of the instrument engineers with system expertise. This was logical, as the subsequent levels of processing only needed the results produced by the Level 1b processor (the Level 1b product) and the information contained in it. There were not many different correct ways of processing the raw data up to Level 1b.
However, this set-up is not appropriate for SAR altimetry, or altimeter delay Doppler processing. The links between the Level 1B processing and the Level 2 processing, particularly the retracking of the waveforms, are very strong. For instance, different ways of performing the delay Doppler processing lead to different L1B waveform shapes, and processing peculiarities have a noticeable effect on the L1B waveform, leading to changes in the geophysical retrievals.
Due to this strong link between the Level 1B processing and the final geophysical retrievals, it is important that the SAR altimetry scientific community gains a much better understanding of the Level 1B processor and is involved in new developments. For various reasons (e.g. the novelty of the processing, the previous unavailability of adequate documentation, the restricted availability of the low levels of data), this understanding is currently limited, both in general terms of what delay Doppler processing is, and in particular regarding the different options chosen for the different missions. This is the focus of the DeDop3 project.
The novel data, as they are processed, offer a new world of possibilities yet to be discovered, analysed and understood. New approaches are still to be developed, and they must be developed to maximise the possibilities that these data offer us.
The DeDop3 project provides the scientific community with the means to understand and use low-level altimetry data and how these data are processed, by providing a fully adaptable and configurable delay Doppler processor and a friendly user interface (a tool) to help users interact with this processor. The DeDop processor will offer different options from which users will be able to choose according to their particular field of interest. Examples of the new options are: surface focusing (particularly relevant for special targets like coasts, rivers or lakes), any kind of weighting along and across track, different azimuth processing approaches, stack masking, new stacking algorithms (e.g. ACDC), Sigma-0 at stack level, etc.
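The configurability described above might be expressed as follows; this is a hypothetical sketch, with option names and the processing entry point invented for illustration rather than taken from the actual DeDop interface.

```python
# Hypothetical configuration of a delay Doppler L1B processor; option
# names and run_l1b() are illustrative, not the real DeDop API.
config = {
    "azimuth_processing": "exact",        # vs. an approximate method
    "along_track_weighting": "hamming",   # along-track window choice
    "stack_masking": "default",
    "stacking_algorithm": "acdc",         # Amplitude/Dilation Compensation
    "sigma0_at_stack_level": True,
}

def run_l1b(l1a_file: str, cfg: dict) -> str:
    """Placeholder for invoking the configurable processor."""
    print(f"processing {l1a_file} with {cfg}")
    return l1a_file.replace("L1A", "L1B")

print(run_l1b("measurement_L1A.nc", config))
```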
DeDop3 is open source and the code will be freely available, in such a way that users will be able to explore the code and its possibilities and to modify it to their own needs.
The tool also comes with various demonstrations of new features that can be investigated and retrieved when using these lower data processing levels. They are presented as successful case-studies:
Iceberg Detection: detection of targets emerging from the sea surface, for example icebergs, ships, lighthouses and small islands.
Ocean Wind / Wave Modelling: exploitation of high-resolution altimetry for ocean wind-wave modelling.
Sea Ice: improved discrimination of sea ice through the use of stack data.
Inland Water: evaluation of the use of delay Doppler L1B-S and L2 products for monitoring rivers and inland waters, focussing on the Amazon basin.
Transponder: calibration of the main scientific parameters of the altimeter: range, datation and Sigma-0.
Attitude Estimation: demonstration of the retrieval of the pitch of the Sentinel satellite, independent of the star trackers.
ACDC (Stack L1B waveform modelling): ACDC, for Amplitude Compensation and Dilation Compensation, is a new method of forming the delay Doppler map. The objective of this study is to evaluate the stability of the ACDC.
Polar Ocean Eddies: improved estimation of the SSH across mesoscale eddies in the sub-polar Arctic, along the West Spitsbergen Current.
3D Stack Modelling: performing a 3D fit of the overall stack to test and quantify whether the precision and accuracy of the geophysical retrievals are improved.
Instead of dealing with the core processor itself, novice community users will invoke the DeDop processor mainly through a platform-neutral DeDop user interface, which we propose to be composed of the DeDop shell and the DeDop workbench. The shell provides a command-line interface to the core processor, while the workbench provides a graphical user interface allowing easy configuration and operation of the DeDop shell. The workbench's target users are community scientists wishing to learn, modify or extend the DeDop configuration and/or code and then use the tool for comparisons between the various outputs generated by the DeDop core. However, the workbench may also be used by operators who create configurations of the shell, e.g. for bulk data processing in batch mode or for generating on-demand processing requests.
The DeDop workbench will be used to create and manage named DeDop configurations, to invoke the DeDop core processor with a given configuration, and finally to read in the processor outputs for exploration and comparison with former outputs. It will have a clear, comprehensive, intuitive and accessible graphical user interface and will comprise a flexible and extendible set of data visualisation and analysis functions for the L1A, L1BS and L1B outputs. As stated before, the ultimate aim of the DeDop user interface is to attract community scientists to use and modify the processing code and let them become acquainted with the new Level 1 altimeter products.
-
Paper 1020 - Session title: EO Open Science Posters
OPEN-43 - COSMO-SkyMed Open Call: an opportunity for the international scientific community and National SMEs
Battagliere, Maria Libera; Dini, Luigi; Daraio, Maria Girolamo; Sacco, Patrizia; Virelli, Maria; Coletta, Alessandro; Piperno, Osvaldo Italian Space Agency, Italy
COSMO-SkyMed (Constellation of Small satellites for Mediterranean basin Observation) is an Italian Earth Observation (EO) dual-use (civilian and defence) space system devoted to environmental monitoring and surveillance applications and to disaster risk management and recovery. The constellation is composed of four satellites equipped with X-band Synthetic Aperture Radar (SAR) and positioned on the same orbital plane. It was deployed stepwise from June 2007 to November 2010 and has been fully operational since May 2011.
Currently, three of the four satellites have completed their nominal operational life (5.25 years), but the constellation is still operating and providing data with the required image quality (the nominal end of life due to the "consumables", i.e. fuel sizing, battery life, etc., is 7 years). In the last year, the Italian Space Agency (ASI) announced the extension of their operational phase until 2016.
Due to the dual-use nature of the system, COSMO-SkyMed data access is regulated by means of an appropriate and well-defined data policy. Users are classified mainly into two separate domains: the civilian domain and the defence domain. In the civilian domain, users can be both institutional and commercial. The former include international partners, national and international administrations, agencies, ministries, universities, research centres, etc., and they are managed and coordinated directly by ASI under the signature of specific agreements. The latter are managed by e-GEOS, an ASI (20%)/Telespazio (80%) company, which is the commercial provider of COSMO-SkyMed products.
Thanks to the features of the COSMO-SkyMed constellation, Italy plays a key role in the international EO context, having provided an invaluable contribution during a number of emergency situations worldwide and being one of the most exploited SAR missions during awareness and disaster events. It is a good example of a space-based technology serving science and society. In addition, SAR data are suitable for a number of applications: maritime applications, including ship and oil spill detection, coastal monitoring, sea ice monitoring and the extraction of wind and wave information; and land applications, including topographic and thematic mapping, agriculture and forestry applications, DEM extraction and subsidence measurements. In these fields, ASI continues to promote the use of COSMO-SkyMed data through the issue, in February 2015, of an "Open Call for Science" and an "Open Call for Small and Medium Enterprises (SMEs)", with the main objective of promoting the improvement of existing applications and the development of new technologies and algorithms based on EO information. The first call is addressed to the scientific community (national and international), while the second is addressed to national SMEs (including start-ups and spin-offs). The opportunity to submit a proposal is permanently open, and the selected projects are supported through the provision of a COSMO-SkyMed data set free of charge (maximum 100 scenes). So far, ASI has received close to 50 proposals for new projects.
This paper will focus on the status and results obtained by ASI through the above-mentioned open calls, in particular the one addressed to the international scientific community, also considering a quantitative analysis of the received proposals (PIs' nationalities, areas of interest, fields of application, etc.).
-
Paper 1120 - Session title: EO Open Science Posters
OPEN-44 - The ESA Earth Observation Mission Software
Zundo, Michele; De Bartolomei, Maurizio; Duesmann, Berthyl; Piñol Sole, Montserrat ESA, The Netherlands
The Earth Observation Mission Software (EOCFI SW) is a set of multiplatform software libraries made available free of charge to any user involved in supporting Earth Observation mission preparation and exploitation. The EOCFI SW is typically used in Ground Segment operational facilities and in tools developed by ESA and its partners for Earth Observation missions.
This software has been widely used in completed missions (Envisat, GOCE), is used in missions already in or soon entering the operational phase (e.g. SMOS, Aeolus, CryoSat, Swarm, Sentinel-1, Sentinel-2, Sentinel-3) and in ongoing approved missions (e.g. the Sentinels, EarthCARE, SEOSAT), and is planned to be used in future missions (e.g. MetOp-SG, Sentinel-5P, Jason-CS).
The EOCFI SW libraries provide functionalities for a wide range of high-level computations:
orbit (e.g. interpolation, propagation using different models);
attitude (e.g. interpolation, attitude law);
target pointing (e.g. direct/inverse geo-location with DEM);
geometric properties of calculated targets;
instrument swath computation and zone intersection;
zone/station visibility events;
observation opportunities for instruments (time segments and coverage).
Low-level functions are also provided, for example to read and write several file formats and to perform coordinate and time transformations.
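A typical call sequence through such a library can be sketched with hypothetical Python-style wrappers (the real EOCFI APIs are in C, C++ and Java, and the function names below are invented for illustration):

```python
# Hypothetical sketch of an orbit-propagation-plus-geolocation flow;
# names are illustrative wrappers, not the real EOCFI API.
def orbit_init_from_file(orbit_file: str) -> dict:
    return {"file": orbit_file}                       # placeholder orbit handle

def propagate(orbit: dict, utc: str) -> dict:
    return {"pos_km": (7000.0, 0.0, 0.0), "vel_kms": (0.0, 7.5, 0.0)}

def geolocate(state: dict, look_angle_deg: float) -> tuple:
    return (45.0, 8.0, 0.0)                           # (lat, lon, alt) placeholder

orbit = orbit_init_from_file("S3A_ORBIT.EOF")
state = propagate(orbit, "2016-05-10T12:00:00Z")
print(geolocate(state, look_angle_deg=1.5))
```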
Thanks to its modular design, the EOCFI SW can be used for the development of facilities and tools belonging to different domains, such as mission analysis, mission planning, instrument data processors, instrument performance simulators and ad-hoc tools like test data generators. The EOCFI SW is written in C (C++ and Java APIs are also available) and is available for several platforms, such as Linux, Microsoft Windows and Mac OS X.
The software is actively maintained by ESA, and a support team is available at ESA/ESTEC to help users with the libraries. Two releases per year are provided in order to resolve problems, improve or extend existing functionalities, improve the quality of the development and deployment process and the runtime performance, and adapt the software to the needs of new missions.
Before being released, the software goes through a formal validation process that includes the automated execution of the acceptance tests on all platforms and for all APIs. The software engineering validation process, as well as other steps of the development, is aided by tools implementing a so-called continuous integration process that ensures overall software quality and allows quick deployment of urgent releases if needed. The scientific validation performed on the EOCFI SW includes accuracy assessment and comparison with real data and with the output of other state-of-the-art software tools and libraries.
More information related to the EOCFI SW can be found at:
http://eop-cfi.esa.int/index.php/mission-cfi-software/eocfi-software
-
Paper 1312 - Session title: EO Open Science Posters
OPEN-29 - Discrete Global Grid Systems and the Living Planet
Purss, Matthew Brian John (1); Gibb, Robert (2); Samavati, Faramarz (3); Peterson, Perry (4); Percivall, George (5); Lewis, Adam (1); Thankappan, Medhavy (1) 1: Geoscience Australia, Australia; 2: Landcare Research New Zealand; 3: University of Calgary; 4: Pyxis; 5: Open Geospatial Consortium
The rapid growth in global sensor networks is leading to an explosion in the volume, velocity and variety of geospatial and geoscientific data. This, coupled with the increasing integration of geospatial data into our everyday lives, is also driving an increasing expectation of spatial information on demand and with minimal delay. However, there remains a gap between these expectations and the present reality of accessible information products and tools that help us understand the Living Planet. A solution can only be achieved through the development and implementation of common analytical frameworks that enable large-scale interoperability across multiple data infrastructures.
Success has been achieved using discrete global grid systems (DGGS). A DGGS is a form of Earth reference system that represents the Earth using a tessellation of discrete nested cells, and it is designed to ensure a repeatable representation of measurements that is better suited to today's requirements and technologies than systems designed primarily for navigation and manual charting. A DGGS presents a common framework capable of linking very large multi-resolution and multi-domain datasets together, enabling the next generation of analytic processes to be applied.
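The nested-cell idea can be illustrated with a toy quadtree over a latitude/longitude rectangle; operational DGGS use equal-area polyhedral tessellations, so this sketch demonstrates only the hierarchical addressing property.

```python
# Toy hierarchical cell addressing: each extra digit refines the cell,
# and a cell's ID is prefixed by the IDs of all its ancestors.
def cell_id(lat: float, lon: float, depth: int) -> str:
    south, north, west, east = -90.0, 90.0, -180.0, 180.0
    digits = []
    for _ in range(depth):
        mid_lat, mid_lon = (south + north) / 2, (west + east) / 2
        quad = 0
        if lat >= mid_lat:
            south = mid_lat
            quad += 2
        else:
            north = mid_lat
        if lon >= mid_lon:
            west = mid_lon
            quad += 1
        else:
            east = mid_lon
        digits.append(str(quad))
    return "".join(digits)

print(cell_id(52.52, 13.40, 8))   # deeper addresses only append digits
```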
There are many possible DGGS, each with its own advantages and disadvantages; however, there exists a common set of key characteristics that enables the standardization of DGGS. Through the Open Geospatial Consortium (OGC), a new standard has been developed to define these essential characteristics of DGGS infrastructures and the core functional algorithms necessary to support the operation of, and interoperability between, DGGS.
This paper describes the key elements of the new OGC DGGS Core Standard and how DGGS relate to the Living Planet. Examples from a number of conformant DGGS implementations will be used to demonstrate the immense value that can be derived from the adoption of a DGGS approach to issues of serious concern for the planet we live on.
-
Paper 1357 - Session title: EO Open Science Posters
OPEN-22 - The FAU GIS-Wiki: Supporting education with tailor-made, open and interactive training materials
Braun, Matthias (1); Feilhauer, Hannes (2); Saß, Björn (3) 1: University of Erlangen-Nuremberg, Germany; 2: University of Erlangen-Nuremberg, Germany; 3: University of Erlangen-Nuremberg, Germany
GIS and remote sensing techniques are important factors in the education of environmental scientists. Increasingly, these technologies are also used in school education. However, getting started with the powerful software packages, and keeping an overview of the various systems available, is not always easy. The FAU GIS-Wiki is an open, extendable platform to support students and beginners in their GIS and remote sensing education. It is tailor-made for the FAU basic education classes, but is also accessible and usable from outside. It provides an easily accessible and searchable knowledge base. The content covers software examples, tutorials on different techniques, and compilations of and links to information that would otherwise be laborious to compile and query. The content is communicated using different tools such as screencasts, video tutorials and simple graphs and text. The wiki platform (www.gis.wiki.fau.de) covers examples from commercial software like ArcGIS, but open-source solutions like QGIS and R are also strongly supported. The platform is gradually being extended, partly in students-for-students efforts.
-
Paper 1379 - Session title: EO Open Science Posters
OPEN-36 - Datacubes in Action: Array Databases as Enabling Innovation
Baumann, Peter Jacobs University, Germany
A paradigm shift is becoming reality. We begin to see the datacubes behind the millions of files. We start combining heterogeneous datacubes in an ad-hoc fashion. And we begin to overcome the age-old, technology-imposed divide between data and metadata, supported by query languages like OGC WCPS for geo datacubes and ISO SQL/MDA for general multi-dimensional arrays.
Pioneered by rasdaman, the NewSQL technology of Array Databases has set out to "significantly transform the way that scientists in different areas of Earth Science will be able to access and use data in a way that hitherto was not possible", as the European Commission and international reviewers have attested for rasdaman, based on "proven evidence". Notably, rasdaman is the blueprint for datacube standards in OGC, INSPIRE and ISO. Hence, Agile Analytics based on scalable, distributed processing of complex Big Data queries is becoming reality. For example, the intercontinental EarthServer federation is advancing its European and Australian 100 TB datacubes to PB size in 2016.
In our presentation we briefly introduce Array Database technology, its background and its standardization status, exemplified by rasdaman.
The main part will be a live demonstration of datacube queries on 3-D satellite time series and 4-D weather datacubes, with the aim of stimulating creativity for novel access paradigms.
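A flavour of such a datacube query, sent as an OGC WCPS request over HTTP; the endpoint and coverage/band names are hypothetical, while the request structure follows the WCPS standard.

```python
# Sketch of a WCPS query slicing a 3-D time series and returning band
# arithmetic as an image; endpoint and coverage name are hypothetical.
import requests

wcps = ('for $c in (S2_TIMESERIES) return encode('
        '(($c.nir - $c.red) / ($c.nir + $c.red))[ansi("2016-05-01")], '
        '"image/tiff")')

resp = requests.get(
    "https://example.org/rasdaman/ows",
    params={"service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": wcps},
)
print(resp.status_code, resp.headers.get("Content-Type"))
```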
-
Paper 1549 - Session title: EO Open Science Posters
OPEN-7 - DESIRE Simulation Tool to demonstrate Data Products for Security Applications
Letterio, Federico (1); Tonetti, Stefania (1); Barrilero Portomeñe, Omar (2); Valcarce Gonzalez-Roson, Fernando (2); Fabrizi, Roberto (3); Massip, Pierre (4); Ranchin, Thierry (4); Regan, Amanda (5); Cornara, Stefania (1) 1: DEIMOS Space, Spain; 2: ISDEFE, Spain; 3: DEIMOS Imaging, Spain; 4: ARMINES, France; 5: ESA
The Data Products for Security Applications (DESIRE) activity is a simulation tool development aimed at demonstrating the added value of including a high-resolution thermal infrared (TIR) imager within different space-borne architecture options comprising different capabilities, i.e. SAR and optical. The DESIRE tool comprises techniques and methods to demonstrate realistic combinations of the data products resulting from these architecture options.
The simulator tool has been developed considering system designers and scientists as potential users, who can use the tool to assess and visualise the added value of TIR data when combined with other data types.
The DESIRE Tool development has been based on mission scenarios that address the priority areas identified in the Copernicus services for security e.g. Border Security, Maritime Surveillance and EU External Action Support. Particular relevant scenarios taken into account for the simulator tool user needs analysis have been Oil Spill Detection, Maritime Ship Surveillance, Industrial Site Monitoring and Urban Heat Islands (UHI).
The simulator tool comprises an external interface capable of ingesting different input products at different processing levels (from L0 to L2, depending on data type), a processing chain for each data type to bring the products up to L3, a co-registration module, different data combinations and data-fusion techniques (in order to generate merged maps or maps with information extracted from different products) and a set of modules to customize and validate the data-fusion products depending on the scenario under investigation.
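Purely as a hedged illustration of the flow just described (the function and module names below are invented, not the actual DESIRE interfaces), the chain can be sketched as follows:

    # Sketch of a DESIRE-style chain: per-data-type processing to L3,
    # then co-registration and scenario-specific fusion and validation.
    def process_to_l3(product):
        """Per-data-type chain (TIR, SAR or optical) bringing a product to L3."""
        return product  # placeholder for calibration/geocoding/retrieval steps

    def coregister(layers):
        """Resample all Level-3 layers onto a common grid (placeholder)."""
        return layers

    def run_scenario(products, fuse, validate):
        l3_layers = [process_to_l3(p) for p in products]
        stack = coregister(l3_layers)
        merged = fuse(stack)        # e.g. merged maps or extracted information
        return validate(merged)     # scenario-dependent validation module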
DESIRE has been implemented as a flexible, configurable and modular simulation tool, to be used for existing and firmly planned in-orbit capability and to combine these with real or synthetic TIR data products. The DESIRE simulator tool is based on the OpenSF simulation framework.
The modular design of DESIRE allows for the future extension of simulator tool functionality with additional processing modules in order to deal with a wider range of scenarios and in-orbit architectures.
The DESIRE simulator tool has been tested using thermal infrared, optical and SAR data products generated from ASTER, MASTER, MODIS, ERS and ENVISAT. The tests focused on four selected scenarios:
Deepwater Horizon disaster for the Oil Spill scenario, validated against NOAA archive data;
San Francisco Bay area for Ship Detection, validated using visual detection;
Monroe and Diablo Canyon for the Industrial Site Monitoring scenario;
Madrid city area for the UHI scenario, using in-situ data from weather stations.
The paper will present the simulator tool and the test cases results, highlighting how TIR data can add value for the selected scenarios and how using a high-resolution thermal infrared imager flying with other kinds of in-orbit capabilities can fill identified measurement gaps.
A further demonstration campaign is currently under way and the results will be presented. It is based on a high-resolution Sentinel-3/SLSTR-like TIR sensor, with the objective of assessing its impact when the resolution is increased to ~20 m. New real datasets from different optical and SAR sensors will be used to demonstrate the simulator's flexibility and robustness to different possible inputs. New in-situ ground truth (including AIS) will also be considered for extending the validation campaign. The outcome of this activity will be two-fold: consolidating confidence in the results that DESIRE can provide and identifying further lines of growth for the proposed concept.
Moreover, the results of an analysis of possible mission architectures will be presented, with the objective of paving the way towards possible further work. Different mission concepts and operational schemes meeting the identified user needs have been studied, with the objective of assessing the impact on the space system of adding TIR assets alongside existing SAR and optical ones.
-
Paper 1634 - Session title: EO Open Science Posters
OPEN-28 - The monitoring of urban environments and built-up structures in a seismic area: web-based GIS mapping and 3D visualization tools for the assessment of urban resources
Montuori, Antonio (1); Costanzo, Antonio (1); Gaudiosi, Iolanda (1); Vecchio, Antonio (1); Gervasi, Anna (1); Minasi, Mario (1); Falcone, Sergio (1); La Piana, Carmelo (1); D'Amico, Maria (1); Russo, Emiliano (1); Romano, Dolores (2); Buongiorno, Maria Fabrizia (1); Musacchio, Massimo (1); Stramondo, Salvatore (1); Doumaz, Fawzi (1); Casula, Giuseppe (1); Caserta, Arrigo (1); Speranza, Fabio (1); Chiappini, Massimo (1); Guerra, Ignazio (3); Porco, Giacinto (2); Tiberti, Mara Monica (1); Bianchi, Maria Giovanna (1); Luzi, Guido (4); Pannaccione Apa, Maria Ilaria (1); Compagnone, Letizia (5); Cuomo, Massimo (5); De Marco, Michele (1); Pacor, Francesca (1); Basili, Roberto (1) 1: Istituto Nazionale di Geofisica e Vulcanologia (INGV), Italy; 2: SISMLAB company, Spin-Off of Università della Calabria (UniCal), Italy; 3: Physics Department, Università della Calabria (UniCal), Italy; 4: Centre Tecnològic de Telecomunicacions de Catalunya (CTTC), Spain; 5: Advanced Computer Systems (ACSYS), Italy
The monitoring of urban landscapes and buildings for risk mitigation purposes is a challenging task that involves two operational issues. On the one hand, there is the complexity of managing and assimilating huge quantities of reliable measurements able to observe and monitor anthropogenic and natural phenomena, which may impact urban environments, buildings and human lives. On the other hand, there is the ongoing difficulty of meeting user requirements by providing user-friendly results that are easily understandable and manageable by the end users involved in environmental and structural monitoring. Both issues are relevant in the case of natural hazards, where time-continuous multi-temporal and multi-spatial information is needed for the effective safeguarding of landscapes, structures and human lives. In this framework, the assimilation of remotely sensed data with geophysical investigations and in-situ measurements could represent a suitable approach for the multi-risk mitigation of urban areas and resources.
In this work, an infrastructural system for the monitoring of urban environments and structures in a seismic area is presented, which consists of remote sensing sensors, geophysical investigations and in situ measurements. The conceived methodology consists of three modular interconnected steps.
In the first step, seismogenic analyses are used together with space-borne and airborne remote sensing tools for the long-term monitoring of hazard-prone areas at regional scale. Within such a framework, airborne aeromagnetic surveys are used to improve the knowledge of seismogenic structures, providing physical and geometric properties of detected fault systems. The latter are used with geophysical investigations of the area to simulate different deterministic shaking scenarios. These investigations are further integrated with both surface deformation analysis, provided through the interferometric processing of COSMO-SkyMed Synthetic Aperture Radar (SAR) measurements, and land-cover mapping of the observed area, obtained by the classification of multi-spectral GeoEye-1 optical imagery.
In the second step, the geological setting and geotechnical surveys of the area are jointly exploited with airborne and ground-based remote sensing sensors to provide the short-term monitoring of urban areas at basin scale. Within such a framework, the evaluation of seismic site effects is accomplished to describe the response of the soil with respect to the simulated earthquakes. These investigations are then complemented with (i) airborne Light Detection And Ranging (LiDAR) measurements providing the digital terrain and surface models of the area, (ii) high-resolution land cover mapping through the spectral-based classification of airborne hyperspectral data, and (iii) surface displacement analysis through the interferometric processing of Ground-Based (GB) SAR measurements.
Finally, the third step consists of combining proximal remote sensing tools for the real-time and near-real-time monitoring of buildings in terms of structural, geometrical and material properties. Within such a framework, a terrestrial laser scanner (TLS) is used to provide the 3D modelling of buildings. Moreover, noise measurements inside buildings and interferometric Real Aperture Radar (RAR) data are used to assess the vibrating properties of the observed structure. All these surveys are then integrated with non-destructive testing techniques (e.g. InfraRed thermal camera, flat jack testing, crack pattern analysis, etc.) to evaluate and monitor the main structural and material properties of the building.
The products provided by each investigation step are assimilated and visualized as geospatial data in a dedicated web-based Geographic Information System (GIS) platform and a 3D visualization software for the monitoring of urban landscapes and buildings, respectively. Both value-added final products could be very useful to support decision-making policy, risk mitigation plans and restoration activities.
Some meaningful results of the proposed approach, obtained within the PON MASSIMO[i] project (Monitoraggio in Area Sismica di SIstemi MOnumentali), are shown for the monitoring of cultural heritage within the seismic area of the Calabria Region.
[i] The present work is supported and funded by the Ministry of Education, University and Research (MIUR) under the project PON01-02710 "MASSIMO" - "Monitoraggio in Area Sismica di SIstemi MOnumentali".
-
Paper 1686 - Session title: EO Open Science Posters
OPEN-39 - Research and Service Support: bringing users to data
Delgado Blasco, Jose Manuel (2,3); Sabatino, Giovanni (2,3); Cuccu, Roberto (2,3); Rivolta, Giancarlo (2,3); Marchetti, Pier Giorgio (1) 1: ESA/ESRIN European Space Agency, via Galileo Galilei, 1, 00044 Frascati (Italy); 2: ESA Research and Service Support, via Galileo Galilei, 1, 00044 Frascati (Italy); 3: Progressive Systems Srl, Parco Scientifico di Tor Vergata, 00133 Roma (Italy)
The ESA Research and Service Support (RSS) service has the mission to support Earth Observation (EO) data exploitation through an operational pilot of the new paradigm "bring users to data". This approach fits well the challenges posed by larger data availability, from more missions, more frequently. It dramatically lowers the barrier to carrying out research activities and developing algorithms and downstream services, by opening such possibilities to new users and by reducing the effort and resources needed by those already involved in EO. This objective is achieved: 1) by offering tools and services to the EO community, granting fast and easy "virtualised" data access (i.e. without the need to transfer the data to the scientist's own infrastructure) and access to "real" scalable processing resources (offered by RSS), and 2) by supporting researchers in developing new algorithms and by enabling results visualization, verification, validation and sharing.
The RSS service offer is composed of several elements supporting different phases of the research process flow. It includes e-collaboration environments to find and share information, reference and sample datasets, "virtual" access to a huge EO data archive without the need to download data onto the scientist's or developer's own resources, customised cloud toolboxes where scientists and developers alike can fine-tune their algorithms on selected datasets, an on-demand processing environment where fine-tuned algorithms can be integrated and made available as EO applications for on-demand and/or massive processing, and results visualization tools.
RSS users are EO scientists, researchers and - during the prototyping phase - service providers. According to users' experience, resorting to the RSS service can generate significant savings in time and effort (~months), besides the more intuitive savings related to processing and storage; for large dataset processing these can amount to hundreds of CPU hours and tens of TB, respectively. It is worth noting that such savings can, in turn, generate higher scientific productivity and faster time to market.
Two processing needs are addressed by RSS: (i) algorithm development and post-processing activity, and (ii) big data processing.
As far as the algorithm development process is concerned, including the fine-tuning phase, the RSS CloudToolbox is the basic tool offered by RSS to EO researchers. This tool is a customised virtual machine with pre-installed software, powerful enough to speed up the development phase. Once the algorithm is deemed stable, it can subsequently be integrated into the RSS processing environment, thus bringing it close to the data, whether the scientist plans to run it on massive datasets (big data processing) or to make it available to the scientific community as a web application.
The RSS service makes available in its processing environment several applications based on EO algorithms covering different thematic areas such as Land, Marine, Atmosphere, Security and Emergency response. Among the available applications, it is possible for instance to compute a SAR interferogram using GAMMA or DORIS, perform multi-temporal analysis using the P-SBAS algorithm provided by CNR-IREA, create MERIS global vegetation maps at regional level, retrieve land surface temperatures, or use ESA operational processors integrated in the environment. All these tools can be freely tested and used on available datasets.
It is worth mentioning that there is a permanent ESA call for proposals offering EO scientists the possibility to perform bulk processing and/or validation of their own algorithms on the large amount of ESA EO data by means of the RSS on-demand processing environment.
In this environment, high-performance computing resources, using Grid and Cloud technologies, provide the necessary flexibility to allow quick access to data and fast production of processing results. The RSS physical infrastructure comprises about 70 processing nodes with a total of 500 GB of RAM and 350 CPUs. This represents the RSS base capacity, which is on average sufficient to satisfy users' processing requirements. When processing requests exceed the RSS base capacity, it is possible to scale up the resources by seamlessly federating additional clusters deployed on the Cloud.
RSS provides "virtualised" data access to all current and past ESA missions (ERS, Envisat, Earth Explorers), Third Party Missions, as well as Copernicus missions. Besides, RSS offers a data provisioning service, available to authorised researchers, making specific datasets available on demand if not yet present in the RSS "virtual" data catalogue.
In the big data era, which started with the launch of Sentinel-1A in April 2014, data volume and processing requirements are becoming more and more challenging. Hence, the EO scientific community accessing and using RSS resources will experience even greater benefits across all activities related to the EO research process, including algorithm development, data access, processing and results analysis.
-
Paper 1737 - Session title: EO Open Science Posters
OPEN-9 - SUCE: Suitability Coverage Engine
Brodsky, Lukas (1); Vobora, Vaclav (1); Nalevka, Ondrej (1); Jindrova, Marketa (1); Stoica, Adrian (2); Buciu, Claudiu (2); Birtas, Dan (2) 1: Gisat, Czech Republic; 2: Terrasigna, Romania
Optical remote sensing represents the most widely used type of remote sensing at present. However, one significant disadvantage often has to be dealt with: cloud cover, which is frequently present in various forms and extents. For diverse mapping and monitoring analyses, cloud-free imagery is needed. This can either be ensured by selecting scenes and dates with no cloud coverage or, if the area of interest is larger, by creating a cloud-free mosaic using cloud-free scenes or parts of several neighbouring scenes. The question remains: what is the most efficient process to select imagery with no clouds, and how can a mosaic best be created? Up to now, searching for suitable cloud-free imagery or datasets in various archives and creating an appropriate mosaic has been a time-consuming activity which had to be done manually and often involved excessive data downloading in order to perform visual checks before a decision could be made. Yet this process often did not lead to the best available results, especially when looking for cloud-free full coverage of a larger area.
The main objective of the SUCE project is to define a concept and architecture, and to provide a prototype, which would effectively select EO products based on advanced user criteria and analytic needs. Such an automated process would significantly streamline the current workflow and provide better results, avoiding both manual filtering and the transfer of useless data.
SUCE offers a solution to speed up the process of selecting suitable satellite imagery and avoid downloading needless data. In this automated process, all available data are analysed based on their metadata containing cloud cover information, so no imagery needs to be downloaded for the analysis. A portal has been developed where users can enter their specific requirements; the request is processed by the SUCE Engine based on those requirements, and a list of appropriate datasets is returned. Several scenarios are considered, including seamless full-coverage mapping and time series monitoring. There is a number of mapping programmes running in Europe that could immediately profit from the SUCE Engine implementation. They include the Copernicus Land Monitoring Services (e.g. LC/LU, HR Layers, Urban Atlas, Riparian Zones, NATURA 2000 sites) or EC CAP monitoring activities (yearly CwRS campaigns).
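A minimal sketch of such metadata-driven selection is given below, assuming catalogue records expose a cloud-cover percentage and a footprint geometry (shapely-style objects); the names are hypothetical and this is not the actual SUCE Engine logic:

    # Greedy cloud-free coverage selection using catalogue metadata only;
    # no imagery is downloaded at any point.
    def select_scenes(records, aoi, max_cloud=5.0):
        usable = sorted(
            (r for r in records if r["cloud_pct"] <= max_cloud),
            key=lambda r: r["cloud_pct"],          # clearest scenes first
        )
        selected, uncovered = [], aoi
        for rec in usable:
            if uncovered.intersects(rec["footprint"]):
                selected.append(rec)
                uncovered = uncovered.difference(rec["footprint"])
                if uncovered.is_empty:             # full AOI coverage reached
                    break
        return selected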
When comparing the current manual selection process and the SUCE automated process, the advantages of the latter are apparent: a) the time spent shortens from the order of hours to the order of minutes; b) there is no need for excessive data downloading; c) no manual processing is necessary. The SUCE solution significantly shortens the time and simplifies the process of selecting suitable cloud-free datasets for various analyses. It should be of substantial help to users from various areas and make their work more efficient.
The entire software development process follows an open-source approach, and the first implementation results will be available to the public in March 2016.
-
Paper 1798 - Session title: EO Open Science Posters
OPEN-37 - The Nansen-Cloud: a Scientific Platform as a Service for the Nansen Group
Hansen, Morten Wergeland; Korosov, Anton; Olaussen, Tor; Vines, Aleksander; Hamre, Torill Nansen Center, Norway
We introduce a new concept, called a Scientific Platform as a Service (SPaaS), to aid the management and synergistic use of EO data, in particular from satellite remote sensing but also including in-situ and model data. The SPaaS implementation can be understood as an advanced cloud client providing the integration of scientific processing and analysis tools, algorithms, data stored at various locations (e.g., Sentinel collaborative ground segments), and various data catalogs, via an application programming interface (API). The SPaaS allows users to work on their local desktops, using the local CPU, with integration to cloud systems for analysing EO data with their preferred tools.
In this manner, the SPaaS will allow users to focus on their scientific research without worrying about where the data is stored or its format, or about the maintenance of the infrastructure or software. The open-source Nansen-Cloud will be based on mature components, synchronizing already existing, open-source and cross-platform tools (e.g., PostGIS, OPeNDAP, Vagrant, Ansible, OpenShift, OpenStack). The Nansen-Cloud will contribute to better handling of the daily stream of large amounts of satellite data and to the synergistic integration of information products from, e.g., NORMAP, NMDC, the SIOS Knowledge Centre and other European services. The platform will (1) advance ocean research and contribute to the development of priority topics such as climate and environment, (2) support marine services, (3) be openly extensible and (4) be an excellent tool in teaching and training at global and national scales, with little or no setup overhead.
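As one concrete example of this pattern, OPeNDAP (listed among the components above) already lets a user on a local desktop subset remote data without downloading whole files; the sketch below uses a hypothetical dataset URL and variable name:

    # Open a remote dataset over OPeNDAP and transfer only the needed slice.
    from netCDF4 import Dataset

    ds = Dataset("http://example.org/opendap/arctic_sst")  # hypothetical endpoint
    sst = ds.variables["sst"][0, 100:200, 100:200]         # only this subset moves
    print(float(sst.mean()))
    ds.close()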
-
Paper 1858 - Session title: EO Open Science Posters
OPEN-50 - Copernicus Network and Security Services
Mazzucca, Paolo (1,2); Buscemi, Gioacchino (2); Nardelli, Manuel (1,2) 1: RHEA to ESA/ESRIN, Italy; 2: ESA, European Space Agency
This paper focuses on a new and advanced Network and Security Ground Segment designed to support Europe's Copernicus programme [1] for data circulation and dissemination to the Copernicus user communities. The Copernicus programme provides reliable, timely and accurate services to manage environmental data, understand and mitigate the effects of climate change, and provide support in emergency and crisis situations. The programme relies on the provision of robust data, predominantly from a fleet of Earth observation satellites called Sentinels, supplemented by data from Member States' satellites. The Copernicus user scenario is characterized by a set of heterogeneous user communities [2] with specific needs and a hardly predictable evolution of the expected dissemination capacity over time. The network and security infrastructure has been designed to cope with the large amount of data, near-real-time applications and high-throughput data transfer, while ensuring a strong level of security.
The Copernicus Data Network is based on a twin centralized infrastructure with high-speed internet access, deploying a 10 Gbps access network across Europe. The utilization of state-of-the-art LAN technology at the different facilities and the virtualization of the dissemination services complement the Copernicus Network and Security elements. The security infrastructure is composed of a set of sophisticated security elements that protect the Copernicus perimeter and enforce the ESA Earth Observation (EO) security policy. A two-tier firewall and Intrusion Detection and Prevention System (IDPS), along with a centralised solution providing Distributed Denial of Service (DDoS) defence, WEB proxy and mail service with antivirus, perform pro-active and reactive security screening at 10+ Gbps performance.
The network and security operations are managed by the service provider, which interfaces with ESA via a single point of contact for any activity concerning Copernicus network and security services. The entry point is the Service Desk, which registers, classifies and assigns tickets to the relevant units (e.g. second-level support, specialists, service partners) for resolution. All operations are supported by a set of tools which provide full visibility of the network status, behaviour and trends. Each element is constantly monitored by the appropriate monitoring tool, which shows a real-time graphical view of the network and gathers network traffic statistics. All security layers generate a large amount of information, alarms and logs that are constantly analysed by Security Operation Centre (SOC) engineers. Cyber-attacks identified by the SOC are addressed in close coordination with the ESA EO security team and according to the agreed security operational procedures (SecOps).
Data dissemination functions are based on the central Internet Access service and on the Pick-Up Point service. Both services provide access to the Sentinel data via the public Internet. A structured virtualization of the dissemination services ensures efficient data delivery to the communities. The Pick-Up Point comprises 24 Virtual Machines, each with a specific set of applications, work schedule and role.
The systematic processing of the acquired data corresponds to a sustained (24h/7d) generation of a continuous stream of user products (Sentinel-1, -2, -3, A series). Data download is available to users via terrestrial networks through data hubs, with output rates up to 10 Gbps. Users can self-register at the data hubs and access several months of data via rolling archives; access to the full Sentinel long-term archive is being made available too.
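For illustration, data hubs of this kind typically expose an OpenSearch-style query interface alongside the web portal; a hedged sketch (hub URL, query fields and credentials are placeholders) of a self-registered user listing recent Sentinel-1 products:

    # Query a data hub's OpenSearch endpoint for the last day of Sentinel-1
    # products; the response is an Atom feed describing matching products.
    import requests

    resp = requests.get(
        "https://examplehub.copernicus.eu/dhus/search",
        params={"q": "platformname:Sentinel-1 AND ingestiondate:[NOW-1DAY TO NOW]"},
        auth=("username", "password"),   # self-registered hub credentials
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.text[:500])               # first lines of the Atom feed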
References
[1] Copernicus Programme Web Site - http://www.copernicus.eu/
[2] ESA Sentinel mission web site - https://sentinel.esa.int/web/sentinel/home
[3] Sentinel Online web site - https://sentinel.esa.int/web/sentinel/sentinel-data-access
-
Paper 1892 - Session title: EO Open Science Posters
OPEN-15 - An operations model for a virtual laboratory for ecosystem services (ESLab)
Anttila, Saku; Holmberg, Maria; Vanhala, Pekka; Hynninen, Mikko; Vihervaara, Petteri; Haaspuro, Tiina; Forsius, Martin Finnish Environment Institute, Finland
Experiences from the development of an operations model for a virtual laboratory for ecosystem services (ESLab) are presented. The general ESLab concept was presented in Holmberg et al. (2015). The necessity for the ESLab operations model arose from observed gaps in the interaction between ecosystem model developers, end users of the information, and the technical services that publish the information. The ESLab operations model is a framework on which the development of ESLab is constructed, emphasizing cooperation between the actors involved. It considers three aspects: (i) the ecosystem modelling and information framework, which selects and provides the ecosystem information; (ii) the technical services, which process and publish the results on the web for visualization, analysis and download, using standardized practices for data and metadata management; and (iii) the end users, who ultimately define the specific information needs and guide the agile development of the services ESLab provides. The development of the operations model is presented using the first manifestation of the ESLab virtual laboratory, ESLab-Luonnikas, a web service presenting the role of natural environments in municipal-level carbon budgets. In ESLab-Luonnikas, carbon budgets are presented for the 301 municipalities in Finland, including scenarios on how three different forest management actions can be expected to affect the carbon budget in the coming time periods 2011-2020, 2020-2030 and 2030-2040. These results are based on the automation of LUONNIKAS model calculations by the ARONIA institute. The identified end users for this information include citizens, NGOs, media, government planners and experts, scientists, as well as representatives of education and livelihoods. The development of ESLab-Luonnikas started with an enhanced SWOT analysis, followed by initial requirement mapping from the identified end users. The actual creation of ESLab-Luonnikas then involved several workshops between the modellers and the technical developers, resulting in the automation of the LUONNIKAS calculations; finally, the first version of the ESLab-Luonnikas web service was built with the Tableau data visualization software. A clear need for the formalization of interaction methods between the actors involved was identified. For end-user interaction, we found that the agile development of a service can be enhanced by using service design methodology, which yields user-identified requirements for the information and technical services. Similarly, for the interaction between modellers and technical developers, we found that bottlenecks hindering mutual understanding can be identified and methods for improved collaboration developed. Here we present experiences and best practices observed during the first year of the operations model's development. In the near future, our aim is to extend ESLab services to forecast trends and trade-offs in e.g. biodiversity variables and carbon balance under various land use and climate change scenarios. These services are planned to also utilize Copernicus EO data sets.
References: Holmberg, M., Akujärvi, A., Anttila, S., Arvola, L., Bergström, I., Böttcher, K., Feng, X., Forsius, M., Huttunen, I., Huttunen, M., Laine, Y., Lehtonen, H., Liski, J., Mononen, L., Rankinen, K., Repo, A., Piirainen, V., Vanhala, P. and Vihervaara, P. (2015). ESLab application to a boreal watershed in southern Finland: preparing for a virtual research environment of ecosystem services. Landscape Ecol. 30: 561-577. DOI 10.1007/s10980-014-0122-z.
-
Paper 1910 - Session title: EO Open Science Posters
OPEN-23 - First ESA Massive Open Online Course
Boudinaud, Laure ESA, Italy
Free online courses are a new way for universities, companies and all kinds of organisations to share their knowledge and reach a worldwide range of potential learners. For such organisations, and more particularly for ESA, instructing citizens, especially digital natives, is of great interest, as crowdsourcing and general awareness represent an incredible asset for the agency. Indeed, including mass knowledge in the agency's activities and capitalizing on the digital revolution help to maximize the scientific benefits of Earth Observation (EO) data.
Open education is at the heart of ESA activities, in order to inspire the next generation of scientists. As the learning landscape is evolving, using new technologies is essential to reach them. The platform enables the MOOC's students to exchange views on the course material with one another and with the scientists leading the course. ESA intends to inspire learners by revealing all the opportunities that space can offer.
The MOOC "Monitoring Climate from Space"[1] has attracted a very wide range of learners around the world, which makes it "massive" indeed. The first ESA MOOC started in June 2015 and gathered around 10,000 students on the FutureLearn platform. FutureLearn is wholly owned by The Open University, which has over 40 years of expertise in online learning. The platform is currently used by over one million learners in more than 190 countries. The goal of ESA's MOOC has been to explain the use of Earth-orbiting satellites to monitor the state of the climate, for educational and decision-making purposes. The course lasts five weeks, during which learners are guided by scientists expert in satellite data.
The course content is organized in five sections, one per week, each of which focuses on a specific topic: a general presentation of EO data, its management, current methods, techniques and technology, and future challenges. The course goes into more detailed analysis of environmental elements such as ice thickness, aerosols, sea level and soil moisture. The content is based on measurements captured by recent EO missions (CryoSat, SMOS, GOCE and the Sentinel satellites, among others), as well as on findings from the Climate Change Initiative (CCI).
First feedback has been released, and data analysis has shown that the completion rate of the course is remarkably high (around 30%, whereas most MOOCs' completion rates rarely exceed 12%). Similarly, the proportion of active learners posting comments and participating in the interactive activities offered by the course is proof of the interest learners have found in it. These very positive results have led to the decision to run the MOOC a second time in November 2015. Moreover, another MOOC is under preparation for March 2016, a sequel of the first one with a deeper focus on the technical side of Earth Observation.
These MOOCs are designed for current and future policy makers, educators and anyone interested in climate change, as understanding the datasets and underlying theories of Earth observation is essential to inform their work and gives a robust basis to decision-making processes. "Monitoring Climate from Space" helps to increase awareness of the threat climate change presents to our planet. With this introduction to the use of the data acquired by EO missions, learners can better understand how such data can support various applications and environmental policies.
[1] https://www.futurelearn.com/courses/climate-from-space
-
Paper 1996 - Session title: EO Open Science Posters
OPEN-16 - MAESTRO: an Infrastructure for Earth-Observation Data Validation, Processing, Cataloguing and Access
Hajduch, Guillaume; Goacolou, Manuel; de Joux, Romain; Longépé, Nicolas; Husson, Romain; Vincent, Pauline CLS, France
In recent years the ability to access large quantities of Earth-Observation (EO) satellite images has greatly increased. Moreover, the volume of single EO products has drastically increased, together with the need to use them for more advanced applications based on computation-intensive algorithms. Access to larger volumes of data allows emerging applications to be addressed, but also creates more stringent needs for efficient data processing (both near real time and offline), data management and efficient visualisation.
In this article we present the MAESTRO infrastructure, designed for efficient data management, inventory, near-real-time and deferred processing, and visualisation of multi-sensor Earth-Observation data. The MAESTRO system is based on a Service Oriented Architecture (SOA) and standards of the Open Geospatial Consortium (OGC). This design ensures:
- Multi-sensor data cataloguing and inventory, including optical (Sentinel-2, Landsat), SAR (Sentinel-1, COSMO-SkyMed, TerraSAR-X and TanDEM-X, Radarsat-1 and Radarsat-2, Envisat/ASAR, ERS-1/2...) and scatterometer data
- On-line visualisation of EO data, auxiliary data and derived information, for product inspection and validation and for the identification of scientific test cases
- EO data access for end users, based on standard web services
- Planning and ordering of multi-sensor EO data acquisitions
- Scheduling of both near-real-time (NRT) and offline processing
- Scalable data processing based on distributed processing nodes and load balancing
- User-friendly operational monitoring of the processing
It addresses multiple typologies of users and services:
- Researchers, for the collocation of multi-sensor EO data with auxiliary information for calibration and validation applications
- End users for the discovery and visualisation of EO data
- EO service users for the planning of activities collocated with EO data acquisition
- Operators of ground stations and processing centres: for the planning of acquisitions, the data management, and the supervision of data processing and delivery of information to end users.
The MAESTRO system is used in support of the Quality Control, Calibration and Validation of the Sentinel-1 products, as part of the Sentinel-1 Mission Performance Centre. It allows fast inspection of the products with collocation with auxiliary information, and the management of offline calibration processing chains.
The MAESTRO system is routinely used to operate the CLS/VIGISAT ground station. This station is equipped to process EO data from multiple SAR systems and is a collaborative ground station for Sentinel-1. The operations performed at this station cover both NRT applications (for instance, operation of the EMSA/CleanSeaNet service) and offline processing.
The MAESTRO system is also used to promote the usage of EO data for a wide range of applications, covering maritime security (FP7/SeaU project...), marine renewable energy (CNES/SOMEMR project), NRT sea monitoring (CLS/VIGISAT Collaborative Station and MCGS project) and global swell monitoring (MCGS, ESA/SOPRANO project). Demonstrations are available on the following web site: https://eoda.cls.fr
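As an illustration of the standards-based end-user data access mentioned above, an OGC WMS 1.3.0 GetMap request against such a system could look like the following (server path and layer name are hypothetical):

    http://example-server/maestro/wms?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap
        &LAYERS=S1_sea_state&STYLES=&CRS=EPSG:4326&BBOX=43,-6,51,3
        &WIDTH=1024&HEIGHT=768&FORMAT=image/png&TIME=2015-10-01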
-
Paper 2018 - Session title: EO Open Science Posters
OPEN-25 - Máme rádi Slunce / I Love My Sun - Educational project for Czech primary schools. Results from 2014-2015
Humlová, Dana (1); Mošna, Zbyšek (2); Urbář, Jaroslav (2); Macúšová, Eva (2) 1: University Hospital in Motol, Czech Republic; 2: Institute of Atmospheric Physics, Czech Academy of Sciences
In Autumn 2014 we visited a primary school in Tanvald (pop. 7,000), Northern Bohemia, to start a series of educational presentations for children aged approximately 7 to 11 years. The project was inspired by the work of Tulunay et al. (2013, J. Space Weather Space Clim., 3). The goal of the project is to educate children about topics connected to the Sun, space weather, and Sun-planet interactions. It covers the Solar System, the discovery of sunspots, the existence of solar cycles, CMEs, the solar wind, the magnetosphere, auroras, the ISS and so on.
During the lesson the children first draw/paint their idea of the Sun, then attend a presentation with discussion, and finally draw the Sun again. Up to now (October 2015) we have visited 16 classes in several schools in Tanvald, Prague, Vlašim, and Železnice, and collected more than 250 pairs of pictures.
Our results show that the children were very interested in the topic and actively participated during the lessons. Typically, while the first drawings depict usual children's ideas of the Sun (for example a yellow Sun with a face and radial rays, or a local landscape with the Sun and blue sky), the second picture reflects newly obtained motifs such as the magnetosphere protecting the Earth against the disturbed Sun, auroras, or dark space with planets and space vehicles. We believe that through this hands-on educational tool the children become acquainted with basic concepts of solar-terrestrial interactions and the importance of space weather for their generation.
-
Paper 2028 - Session title: EO Open Science Posters
OPEN-13 - Connecting Science, Development and Operations on the EODC Earth Observation Platform
Briese, Christian; Mücke, Werner; Kidd, Richard; Wagner, Wolfgang EODC, Austria
The mission of the Earth Observation Data Centre for Water Resources Monitoring (EODC) is to enable the effective and efficient usage of Big Earth Observation (EO) data and to facilitate cooperation among the scientific, public and private sectors. The concept of EODC's IT infrastructure and framework brings scientists, developers and data analysts together on a single virtual platform, thereby fostering productive and collaborative working methods. It follows a novel paradigm in EO, driven by the requirements and challenges currently imposed on the EO communities: bringing the algorithms and software to where the data are located, instead of vice versa. This is the only promising approach to handling the ever-increasing amounts of data in an efficient way.
With this contribution we give an introduction to EODC's virtual research, development and operations environments, building on the three key pillars of the EODC framework: (1) the Science Integration and Development Platform (SIDP), a fully equipped cloud computing infrastructure; (2) the Near Real-time Operations and Rolling Archive (NORA), a high-availability storage and processing cluster; and (3) the Global Testing and Reprocessing Facility (GTR), a Top100 (Nov. 2014) high-performance supercomputer. This contribution presents how these IT capacities can be employed for collaborative method and software development and testing, as well as for accessing and processing a Petabyte-scale archive of EO data.
Furthermore, the search and discovery functions for the EO data through a custom-made metadata database are demonstrated, as well as the functionalities for submitting, scheduling and monitoring processing jobs on the processing cluster. This is complemented by selected tailor-made exploitation services powered by EODC's infrastructure, covering the topics of operational cropland monitoring, wetland delineation and soil moisture retrieval. The complete suite of resources underpinning the EODC framework is accessible via a central, browser-based web portal and interactive delivery platform, which provides the necessary tools for building, testing, (near-real-time) delivery and analysis of value-added EO products. By providing an open environment that connects science, development and operations, the EODC is a catalyst for open international cooperation amongst public and private organisations and fosters the use of Earth observation for local and global monitoring tasks.
-
Paper 2064 - Session title: EO Open Science Posters
OPEN-10 - PREDICTIS: a tool for ionosphere parameter prediction
Auge, Emmanuel (1); Decerprit, Guillaume (1); Lacoste, Frédéric (2); Rougerie, Sebastien (2); Boudesocque, Clément (1); Ginestet, Arnaud (1) 1: NOVELTIS France; 2: CNES France
The prediction of ionospheric parameters was the subject of much research in the twentieth century. This research, based on the development of a global network of HF sounders, led to the development of empirical models describing the main parameters of the ionosphere. These models, which are still the reference today, offer a global representation of the ionosphere by estimating monthly median parameters across the globe according to solar activity. The forecast of ionosphere parameters from these empirical models is usually a long-term prediction (several months), which is itself based on an approximate prediction of solar activity.
For a decade, new investigations have led to improvements in local and very short-term (a few hours) forecasts.
Beyond a few hours, forecast errors can no longer justify the use of methodologies more complex than empirical models. It nevertheless remains necessary to predict the ionosphere accurately enough to anticipate the propagation conditions for a system that depends on the state of the ionosphere, and to anticipate the parameters needed to establish an HF link (e.g. ground to satellite). In this context, it must be possible to select in advance (2-3 weeks) the transmitting station that presents the highest probability of establishing a link with the target satellite. A shorter-term forecast (5 days to a week) should then offer the opportunity to confirm the choice of the selected broadcast station and to refine potential connections between the selected station and the satellite.
To meet this need, the tool “PREDICTIS” has been developed by NOVELTIS in the framework of a CNES contract.
PREDICTIS is a software tool aimed at giving predictions of ionosphere parameters over different periods (from a few days to a few weeks). It is based on the ionosphere state of the preceding days, the seasonal variations that lead to periodic (~monthly) patterns, the solar cycle that leads to annual variations, and the existence of chaotic areas (e.g. near the poles).
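As a toy sketch only (the actual PREDICTIS algorithm is not published here), a predictor combining those ingredients might blend persistence over the preceding days with a periodic climatology for the same epoch; the weighting and inputs below are invented for illustration:

    # Toy blend of persistence and climatology for a TEC map forecast.
    import numpy as np

    def predict_tec(recent_maps, climatology_map, w_persist=0.6):
        """recent_maps: (n_days, nlat, nlon) array of the preceding days;
        climatology_map: (nlat, nlon) median map for the target epoch."""
        persistence = recent_maps[-3:].mean(axis=0)
        return w_persist * persistence + (1.0 - w_persist) * climatology_map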
PREDICTIS has been shown to perform excellently when predicting TEC maps and foF2. Its performance exceeds that of the usual NeQuick model alone and represents a significant step forward.
-
Paper 2071 - Session title: EO Open Science Posters
OPEN-2 - A set of Software Tools supporting EO Satellites: Instrument Swath and Visualisation
Pinol Sole, Montserrat; Zundo, Michele ESA, Netherlands, The
This paper presents the software applications for mission analysis and 3D visualization distributed by the ESA-ESTEC EOP System Support Division to users part of the ESA Earth Observation Earth Explorer and Copernicus satellites community.
These software tools can be used to perform mission analysis activities related to instrument swath coverage over regions of interest and ground station contact (ESOV NG, EOMER), or to display satellite mission scenarios as high-resolution 3D animations (SAMIplay). These tools can be and have been used in preparatory feasibility studies (e.g. to analyse coverage and revisit time), to support downlink and ground station visibility analysis, and in support of Calibration and Validation activities, e.g. to plan on-ground campaigns during satellite commissioning or the scheduling of ground transponders.
The Earth observation Swath and Orbit Visualization tool (ESOV NG) is a 2D orbit and swath visualization application, delivered with a predefined set of missions (Sentinel-1, -2, -3, -5P, -6, SWARM, CryoSat, SMOS, Aeolus, EarthCARE, ...), although it is possible to define any other satellite mission. The tool is multi-platform, available for Mac OS X, Linux and Windows.
EOMER (Earth Observation Mission Evaluation and Representation) is a Windows application for multi-satellite and swath visualization in 2D/3D, tailored to ESA Earth Observation missions (currently supporting Sentinel 1, 2, 3).
Both ESOV NG and EOMER applications provide the user with the means to visualize the ground-track and instrument swaths of ESA Earth Observation satellites and assist in understanding when and where satellite measurements are made and ground station contact is possible.
The SAMIplay (SAtellite MIssion display) application displays stunning high-resolution 3D and 2D animations of ESA Earth Observation satellites. SAMIplay displays the swath ground tracks of the on-board instruments, the entry into the area of visibility between the satellite and the ground stations, as well as solar array and antenna deployments and thruster firing events. The user can drive the various camera views or generate standalone animations for kiosk-type applications. The missions currently supported are Sentinel-1, -2, -3, SWARM, CryoSat, SMOS and Aeolus. The application runs on desktop platforms (Mac OS X, Windows) and mobile platforms (iOS based, e.g. iPad).
The coherence and accuracy of the orbital and geometrical calculations within the ESOV NG and SAMIplay applications is ensured by the use of the embedded Earth Observation CFI Software libraries (EOCFI SW). In ESOV NG, the libraries are used to obtain the orbit ground-track, instrument swaths, and passes over a selected area of interest or ground station, while in SAMIplay the EOCFI SW libraries perform orbit, attitude and swath related calculations.
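The EOCFI SW libraries are distributed by ESA and their API is not reproduced here; as a rough, openly available stand-in for the kind of orbit propagation involved, the public sgp4 package can propagate a state vector from a two-line element set (the TLE below is the ISS example from the sgp4 documentation):

    # Propagate a satellite state vector from a TLE using the sgp4 package
    # (a stand-in illustration; ESOV NG itself uses the EOCFI SW libraries).
    from sgp4.api import Satrec, jday

    l1 = "1 25544U 98067A   19343.69339541  .00001764  00000-0  40967-4 0  9997"
    l2 = "2 25544  51.6439 211.2001 0007417  17.6667  85.6398 15.50103472202482"
    sat = Satrec.twoline2rv(l1, l2)
    jd, fr = jday(2019, 12, 9, 12, 0, 0)
    err, r, v = sat.sgp4(jd, fr)   # position (km) and velocity (km/s), TEME frame
    print(err, r)                   # err == 0 on success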
The EOMER application instead makes use of the SatX and GanttX components developed by Taitus to support the orbital calculations and their timeline visualisation, respectively.
NOTES
More information related to ESOV NG and EOMER can be found at:
http://eop-cfi.esa.int/index.php/applications/esov
http://eop-cfi.esa.int/index.php/applications/eomer
-
Paper 2102 - Session title: EO Open Science Posters
OPEN-34 - On the Potential of Big Data Capabilities for the Validation of a Weather Forecasting System
Iannitto, Giuseppe; Del Frate, Fabio; De Rosa, Michele GEO-K, Italy
A key component of EO projects is the validation of EO data products against ground truth. In the validation process, data can be collected from various ground-based sources and sensors (in-situ measurements, instruments, crowd-sourcing, open-source platforms), then quality-controlled, and finally compared with the satellite products in order to obtain validated retrievals.
The objective of this work is to develop a system that uses big data capabilities and tools for validation purposes, in particular for the assessment of a new weather nowcasting system, based on a predictive model exploiting Meteosat Second Generation (MSG) imagery.
-
Paper 2124 - Session title: EO Open Science Posters
OPEN-11 - Space-to-Ground Data Viewer and DFDL for Space Library
Zundo, Michele (1); Pinol Sole, Montserrat (1); Mestre, Rui (2); Gutierrez, Antonio (2) 1: ESA, Netherlands, The; 2: DEIMOS Engenharia
The Space to Ground (S2G) Data Viewer is an extensible and flexible tool to inspect the contents of the communication data units exchanged between satellites (including their instruments) and the payload data-processing ground system. Space-to-ground testing for payload data, as well as the related activity of generating data for testing the science data processors, requires the analysis and visualisation of telemetry data files produced by satellites, ground equipment or simulators. Satellite house-keeping telemetry and science instrument data are transmitted to the ground sensor stations according to the CCSDS standard formats; consequently, the data files on ground can be formatted as CADUs (Channel Access Data Units), TFs (Transfer Frames) or ISPs (Instrument Source Packets), and ad-hoc tools are usually developed anew for each mission by the ground test engineers and Level 1 processor developers.
The S2G Data Viewer was therefore developed as a generic tool supporting both the visualization and basic diagnostics of data generated on board current ESA Earth Observation Earth Explorer and Copernicus satellites, avoiding in this way the effort and cost of re-developing such tools for each mission. The S2G Data Viewer interprets binary files containing concatenated CADUs, TFs or ISPs, and presents them as hierarchical lists of available data units, displaying the header and content fields and the associated values in raw, engineering and binary format. The values are checked against user-specified constraints (e.g. correctness of Spacecraft ID, valid APID, data ranges), sequences (e.g. SSC and Frame Counter) are checked for continuity, and each data unit is also checked with respect to checksums and error correction fields.
The tool furthermore provides additional functionalities such as a hexadecimal viewer allowing direct low-level data inspection, searching, filtering and data transformation functions (CADU unscrambling and TF/ISP extraction). The tool stores all the data definitions as pre-configured files and includes mission configurations for Sentinel-1, -2, -3, -5P, SWARM, SMOS, Aeolus and EarthCARE. Parsing data of any other mission is possible by defining and importing new DFDL binary definition schemas within S2G.
To support flexible binary data parsing, the S2G development makes use of a generic binary data binding library based on the Data Format Description Language (DFDL). Published as an Open Grid Forum Proposed Recommendation [1], DFDL is a modeling language that has evolved into an open standard for describing general text and binary data. This language allows the description of text, binary, and legacy data formats in a vendor-neutral, declarative manner.
The DFDL for Space (DFDL4S) library is the underlying software library used by the S2G Data Viewer. It comprises the capability to use DFDL schemas to read, write and interpret CADU, TF or ISP data files. This library can therefore be used to support, in a straightforward manner, the generation of test data in any specified binary format (e.g. in simulators) and the reading of ISPs (e.g. in Level 1 processors).
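To make the packet terminology concrete, the sketch below decodes the fixed 6-byte primary header of a CCSDS Space Packet (ISP) as laid down in the CCSDS 133.0-B standard; this is an independent illustration in Python, not the DFDL4S API:

    # Decode the 6-byte CCSDS Space Packet primary header: version (3 bits),
    # type (1), secondary header flag (1), APID (11), sequence flags (2),
    # sequence count (14), packet data length (16).
    import struct

    def parse_isp_header(raw: bytes):
        w0, w1, w2 = struct.unpack(">HHH", raw[:6])
        return {
            "version":      w0 >> 13,
            "type":         (w0 >> 12) & 0x1,
            "sec_hdr_flag": (w0 >> 11) & 0x1,
            "apid":         w0 & 0x7FF,
            "seq_flags":    w1 >> 14,
            "seq_count":    w1 & 0x3FFF,
            "data_length":  w2,   # packet data field length minus one
        }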
References
[1] Michael J. Beckerle and Stephen M. Hanson, “Data Format Description Language (DFDL) v1.0 Specification” in Open Grid Forum (https://www.ogf.org/ogf/doku.php/standards/dfdl/dfdl), GFD-P-R-207, September 2014.
NOTES
More information related to S2G Data Viewer application and DFDL4S library can be found at:
http://eop-cfi.esa.int/index.php/applications/s2g-data-viewer
http://eop-cfi.esa.int/index.php/applications/dfdl4s
-
Paper 2190 - Session title: EO Open Science Posters
OPEN-49 - NetCDF4 data dissemination in the context of the Copernicus Marine Service
Sala, Joan (1); Romero, Laia (1); Jolibois, Tony (2); de Dianous, Rémi (2) 1: Altamira-Information, Spain; 2: CLS (Collecte Localisation Satellites), France
Marine information systems like the operational Copernicus Marine Service (CMEMS) are widely used to enable access to sea data by a significant worldwide user community of nearly 5,000 registered users from intergovernmental bodies, European agencies, regional and national service providers and the private sector. The central information system (CIS) beneath this service is the result of more than 6 years of intensive research and improvements carried out in the course of the MyOcean projects (2009-2013) under FP7 and H2020. The Central Information System covers all the service needs, from data discovery and download with authentication to viewing and catalogue management. This abstract focuses on the evolutions of the new version of the distributed data dissemination software component named MOTU, present in all the oceanographic data production units spread across Europe.
The MOTU open-source servlet provides REST (Representational State Transfer) services to registered users who wish to download data, with or without subsetting, through a queue system and with authentication relying on a Central Authentication System (CAS). MOTU 2.0 presented strong limitations in the management of NetCDF4 data and in its intensive memory usage.
In the scope of CMEMS, MOTU 3.0 will include several new features, the most attractive being compatibility with compressed NetCDF4 data, for fast download and increased disk management efficiency. Other improvements, such as support for the OGC WCS 2.0 protocol and better log management to gather download service statistics, will also be included in the upgraded version presented here.
For the design of the MOTU 3.0 subset service, TDS (THREDDS Data Server) has been considered due to its overall performance and full compatibility with NetCDF4. However, TDS alone may present operational limitations, particularly in the handling of large volumes and numbers of concurrent users. Hence, MOTU 3.0 will be based on the new NetCDF Subset Service (NCSS), in conjunction with the latest NetCDF Java library, in order to provide geographic, temporal and depth subsetting of the NetCDF files. Moreover, the use of the MOTU servlet as a proxy of the subset service will add robustness and performance through the queue management system, and will work around the around-the-world subsetting and depth selection limitations of the NCSS implementation.
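For reference, an NCSS subset request of the kind MOTU 3.0 will proxy looks like the following (server, dataset path and variable name are hypothetical):

    http://example-server/thredds/ncss/global_analysis/sst.nc
        ?var=sea_surface_temperature
        &north=46&south=40&east=5&west=-2
        &time_start=2015-12-01T00:00:00Z&time_end=2015-12-02T00:00:00Z
        &vertCoord=0&accept=netcdf4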
In conclusion, the upgraded MOTU software envisaged under the Copernicus Marine Service in 2016 will allow users to benefit from file compression and the new OGC WCS standard, with a system that will increase in robustness and performance.
-
Paper 2215 - Session title: EO Open Science Posters
OPEN-30 - EO Data Service: Enabling Technology for Exploitation of Earth Observation Products in the Big Data Era
Mantovani, Simone (1); Barboni, Damiano (1); Natali, Stefano (1); Baumann, Peter (2); Hogan, Patrick (3) 1: Meteorological and Environmental Earth Observation, Italy; 2: Jacobs University Bremen, Bremen, Germany; 3: NASA Ames Research Center, USA
The use of Earth Observation (EO) data is becoming more and more challenging as a consequence of the volume and variety of the data and the needs of the users that exploit it. Relevant tools for Big Data exploitation have been realized in the framework of European Space Agency and European Commission programmes (Earthnet - https://earth.esa.int/, Copernicus - http://www.copernicus.eu) to demonstrate the ease of data exploration and to deliver satellite data in its native context of a virtual globe via any standard internet browser (Chrome, Firefox, Safari, IE).
Key technologies for data exploitation (Multi-sensor Evolution Analysis [1], rasdaman [2], NASA Web World Wind [3]) are used to implement effective geospatial data analysis tools with OGC standard interfaces (Web Map Service (WMS) [4], Web Coverage Service (WCS) [5], and Web Coverage Processing Service (WCPS) [6]): the EO Datacube (http://eodatacube.eu) and EO Data Service (http://eodataservice.org) portals offer dynamic and interactive access functionalities that improve and facilitate access to massive Earth Science data. Several use cases combining heterogeneous products in the Atmosphere, Land and Ocean domains have been implemented within national and international initiatives:
EarthServer/EarthServer-2, [7], to support the Climate & Atmosphere and Land & Ocean communities on exploiting full mission of heterogeneous products including satellite, forecasts, ground-measurements
MaaS - MEA as a Service for Third Party Missions, to improve the accessibility and dissemination of the Landsat-8 data available at the European Space Agency, providing users with advanced data access and retrieval capabilities such as subsetting and on-the-fly processing (e.g. filtering, cloud masking, NDVI)
EOCHA Data Portal [8], a joint initiative between The World Bank and the European Space Agency, to provide Climate-Health experts with an EO web platform for the analysis of links between climate and health risks in Africa
InSAR Italy Open Data portal [9], to support the dissemination of ground deformation maps over the Italian territory, measured using multi-temporal satellite Synthetic Aperture Radar Interferometry (InSAR) techniques
In the framework of the EarthServer-2 project, Big Data Analytics tools will be enabled on datacubes of Copernicus Sentinel products to support agile analytics on this new generation of sensors, for both expert and non-expert users.
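As a sketch of how such a datacube can be queried through the WCPS interface listed above, the snippet below submits a WCPS expression to a rasdaman-backed WCS endpoint; the endpoint URL, coverage name and axis labels are illustrative assumptions, not the actual portal configuration.

```python
# Sketch of a WCPS query sent to a rasdaman-backed WCS endpoint; the
# endpoint, coverage name and axis labels are hypothetical.
import urllib.parse
import urllib.request

ENDPOINT = "https://example.eodataservice.org/rasdaman/ows"

# Average a hypothetical NDVI datacube over one year for a small lat/lon window.
wcps_query = """
for $c in (NDVI_Cube)
return avg($c[Lat(44.0:45.0), Long(11.0:12.0),
              ansi("2015-01-01":"2015-12-31")])
"""

params = {
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",  # WCS processing extension used for WCPS
    "query": wcps_query,
}

with urllib.request.urlopen(ENDPOINT + "?" + urllib.parse.urlencode(params)) as r:
    print(r.read().decode())       # a single scalar value for this query
```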
This study provides an overview of the ongoing Big Data access and exploitation initiatives, in view of the upcoming data streams from the Copernicus programme.
The current study is co-financed by the European Space Agency under the MaaS project (ESRIN Contract No. 4000114186/15/I-LG) and the European Union’s Horizon 2020 research and innovation programme under the EarthServer-2 project (Grant Agreement No. 654367).
REFERENCES
[1] http://www.meeo.it/wp/products-and-services/mea-multisensor-evolution-analysis/
[2] http://www.rasdaman.org
[3] http://webworldwind.org
[4] http://www.opengeospatial.org/standards/wms
[5] http://www.opengeospatial.org/standards/wcs
[6] http://www.opengeospatial.org/standards/wcps
[7] http://eodataservice.org
-
Paper 2240 - Session title: EO Open Science Posters
OPEN-35 - GeoMultiSens – Scalable Multisensoral Analysis of Satellite Remote Sensing Data
Scheffler, Daniel (1); Sips, Mike (2); Behling, Robert (1); Dransch, Doris (2); Eggert, Daniel (2); Fajerski, Jan (3); Freytag, Johann-Christoph (4); Griffiths, Patrick (5); Hollstein, André (1); Hostert, Patrick (5,6); Köthur, Patrick (2); Peters, Mathias (4); Pflugmacher, Dirk (5); Rabe, Andreas (5); Reinefeld, Alexander (3); Schintke, Florian (3); Segl, Karl (1) 1: Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Section Remote Sensing, Telegrafenberg, Potsdam, Germany, 14473; 2: Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Section Geoinformatics, Telegrafenberg, Potsdam, Germany, 14473; 3: Zuse Institute Berlin, Distributed Algorithms and Supercomputing, Takustraße 7, Berlin, Germany, 14195; 4: Humboldt University Berlin, Databases and Information Systems, Rudower Chaussee 25, Berlin, Germany, 12489; 5: Humboldt University Berlin, Geography Department, Unter den Linden 6, Berlin, Germany, 10099; 6: Humboldt University Berlin, Integrated Research Institute on Transformations of Human-Environment Systems (IRI THESys), Unter den Linden 6, Berlin, Germany, 10099
For more than 40 years, remote sensing satellite missions have been scanning the Earth's surface globally. They are ideal instruments for monitoring spatio-temporal changes. A comprehensive analysis of these data has the potential to support solutions to major global change challenges related to climate change, population growth, water scarcity, or loss of biodiversity. However, such a comprehensive analysis is a challenging task: (a) there is a lack of Big Data-adapted analysis tools, (b) the number of available sensors will steadily increase over the next years, and (c) technological advancements allow data to be measured at higher spatial, spectral, and temporal resolutions than ever before. These developments create an urgent need to better analyze huge and heterogeneous data volumes in support of, for example, global change research.
The interdisciplinary research consortium of the “GeoMultiSens” project focuses on developing an open-source, scalable and modular Big Data system that combines data from different sensors and analyzes data in the petabyte range (10¹⁵ bytes). The most important modules are: (1) data acquisition, (2) pre-processing and homogenization, (3) storage, (4) analysis, and (5) visual exploration. The data acquisition module enables users to specify a region and time interval of interest, to identify the available remote sensing scenes in different data archives, to assess how these scenes are distributed in space and time, and to decide which scenes to use for a specific analysis. The homogenization module uses novel and state-of-the-art algorithms to combine the selected remote sensing scenes from different sensors into a common data set. The data storage module optimises the storage and processing of petabytes of data in a parallel and failure-tolerant manner; its core technology is XtreemFS (http://www.xtreemfs.org). The analysis module implements image classification and time series analysis algorithms. The visual exploration module supports users in assessing the analysis results. All modules are adapted to a map-reduce processing scheme to allow very fast information retrieval and parallel computing within the Apache Flink processing system (http://flink.apache.org). Finally, a Visual Analytics approach integrates the individual modules and provides a visual interface to each step in the analysis pipeline.
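To make the map-reduce scheme concrete, the sketch below mimics its shape in plain Python: scenes are homogenized in parallel (the map step) and folded into a composite (the reduce step). Both functions are hypothetical placeholders for the project's actual algorithms, which run on Flink rather than in Python.

```python
# Illustrative map-reduce shape; resample_to_common_grid() and
# merge_composite() are hypothetical stand-ins for GeoMultiSens algorithms.
from concurrent.futures import ProcessPoolExecutor
from functools import reduce

def resample_to_common_grid(scene_path: str) -> dict:
    """Map step (placeholder): homogenize one scene onto a common grid."""
    return {"path": scene_path, "grid": "common"}

def merge_composite(acc: dict, scene: dict) -> dict:
    """Reduce step (placeholder): fold homogenized scenes into one composite."""
    acc.setdefault("scenes", []).append(scene["path"])
    return acc

scene_paths = ["scene_a.tif", "scene_b.tif", "scene_c.tif"]  # placeholder inputs

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:                    # parallel map
        homogenized = list(pool.map(resample_to_common_grid, scene_paths))
    composite = reduce(merge_composite, homogenized, {})  # sequential reduce
    print(composite)
```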
The “GeoMultiSens” Big Data system will store and process remotely sensed data from space-borne multispectral sensors of high and medium spatial resolution, such as Sentinel-2, Landsat 5/7/8, SPOT 1-6, ASTER, ALOS AVNIR-2 and RapidEye. Our poster presents the overall scientific concept of the system and technical details of its most important modules. We discuss the scientific challenges of the system and present our ideas to address them.
-
Paper 2245 - Session title: EO Open Science Posters
OPEN-40 - Bridging the gap between conservation practitioner and remote sensing science
Leidner, Allison (1); Headley, Rachel (2); Palumbo, Ilaria (3); Rose, Robert Andrew (4); Wegmann, Martin (5) 1: NASA Earth Science Division, United States of America; 2: Cobblestone Science, LLC, United States of America; 3: European Commission - Joint Research Centre, Italy. Institute for Environment and Sustainability, Land Resource Management Unit; 4: College of William & Mary, United States of America; 5: Remote Sensing and Biodiversity Research University of Wuerzburg, Germany
The rapid advancements in remote sensing technology offer a unique opportunity to address the increasing rates of ecosystem degradation and biodiversity loss. Although satellite data have been available for decades, and the use of remote sensing has been increasing, many barriers still exist to leveraging these data for the benefit of conservation planning and management. It is crucial to bridge the gap between the conservation community's needs and the remote sensing community's awareness of what imagery and tools would most contribute to conservation activities.
Conservationists address the most critical environmental challenges, and remotely sensed data can contribute significantly to this effort. However, the conservation community needs to formulate its requirements for spatial, spectral, and temporal datasets more clearly, informed by what capabilities already exist and by realistic requests for current and future missions. Conversely, satellite missions can better support users whose specialties lie outside remote sensing through wider data distribution and more explanation of how satellite data are translated into information.
The recent launch of the ESA Sentinel constellation opens the way to new products for conservation. The higher spatial and temporal resolution provided by the new satellites will dramatically improve our capacity to observe and monitor habitat status and its change over time. New and existing datasets (e.g. Landsat 8 and the earlier Landsat series) need to be critically evaluated by the conservation community to provide valuable feedback to the space agencies.
Recently, a group of conservationists and remote sensing scientists published the 10 highest-priority conservation challenges that could be addressed using remote sensing technologies (Rose et al., 2014). Building on this important first step, a conservation remote sensing community, the Conservation Remote Sensing Network (CRSNet), has formed to increase conservation effectiveness through enhanced integration of remote sensing technologies in research and applications. CRSNet is a growing community, now including over 450 members from all sectors, and through its members it is linked to other initiatives and groups such as GEO BON, NASA, ESA, and CEOS. In the coming years, CRSNet will play a critical role in increasing capacity for conservation remote sensing, developing best-practice guidelines, communicating critical information, and fostering research and collaboration, with the overarching goal of reducing the barriers to conservationists' use of satellite data and helping bridge the gap between the conservation and remote sensing communities.
Using satellite data to create information vital to conservation is essential to addressing the most critical environmental challenges of our world. This talk will focus on the future of conservation remote sensing, the role remote sensing will play in addressing conservation priorities, and opportunities for the conservation community to better engage remote sensing scientists in collaborative efforts to improve conservation outcomes through integrated remote sensing technologies.
-
Paper 2283 - Session title: EO Open Science Posters
OPEN-19 - Mobile Devices for Community-Based Forest Biodiversity Monitoring
Pratihast, Arun Kumar (1); Decuyper, Mathieu (1); Mora, Brice (1,2); Herold, Martin (1) 1: Laboratory of Geo-Information Science and Remote Sensing, Wageningen University,The Netherlands; 2: GOFC-GOLD Land Cover Project Office, Wageningen, The Netherlands
Effective assessment and protection of biodiversity and ecosystem services rely on the provision of accurate and timely field data that can support monitoring activities. International policy negotiations (UN CBD), supported by the scientific community (IPBES, GEO BON), have identified the need for improved biodiversity monitoring systems. Recently, advancements in Information and Communications Technologies (ICTs) using handheld devices have enabled local communities to monitor species' populations and their habitats in an efficient and cost-effective way. In this research we present an integrated data collection system based on mobile devices that streamlines the community-based data collection, transmission and visualization process. We discuss preliminary lessons learned from protected areas in the UNESCO Kafa Biosphere Reserve in southwestern Ethiopia, home to many endemic species. We also assess the accuracy and reliability of community-based data and propose a way to integrate them into remote sensing workflows (e.g. near real-time satellite degradation monitoring). The results show that local communities are able to provide data with accuracies comparable to expert measurements, while incurring lower costs. Furthermore, the results confirm that communities are more effective in monitoring habitat biodiversity. For example, communities are able to monitor animal species and their habitats year-round, while intensive expert biodiversity assessments remain incomplete because of the limited timeframe for data collection. This study demonstrates the high potential of integrated community-based near real-time biodiversity monitoring systems, with Sentinel-2 data streams as a key satellite data source.
-
Paper 2369 - Session title: EO Open Science Posters
OPEN-20 - Crowdsourcing Support to Copernicus Services: a Vegetation Mapping APP to Integrate Sentinel Data
De Vecchi, Daniele (1,2); Dell'Acqua, Fabio (1,2) 1: University of Pavia, Italy; 2: EUCENTRE foundation, Italy
Crowdsourcing is increasingly referred to as an important source of geospatial information that scientists may decide to incorporate in experiments, or eventually even in operational contexts. The concept of crowdsourcing is broad and sometimes confusing; in this paper, we focus on the “dense network of observers” side of crowdsourcing: we intend to encourage people wandering or cycling around with a smartphone to use it to collect vegetation-related information in rural areas.
A basic app [1][2] has already been developed in the context of the regional SEGUICI project [3], and a framework has been designed alongside it to easily merge information coming from different types of terminals, including desktop and other units. All of the above is available as open source to encourage its circulation and further development. A data fusion system for vegetation monitoring and for water needs assessment and forecasting is envisaged, fusing in-situ and crowdsourced data with multispectral and radar satellite data from the first two Sentinels, in order to ensure better water resources management.
The app will collect geo-localized reports in the form of pictures, accompanied by the user's own assessment, guided by interaction with the app itself (e.g. proposing images showing different possible conditions of the selected vegetation). Pictures, labels, flags and other information items are uploaded to a server capable of ingesting and classifying Sentinel images as well. The processing system could provide interesting additional information to complement the Copernicus Land Monitoring Service called “Natura2000” [4]. Vegetation maps generated by that service are updated every 6 years, while the proposed service could be used to generate “side maps” reporting on possible changes since the latest official map.
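As a purely illustrative sketch of such a geo-localized report, the snippet below posts a single observation to a server; the endpoint and field names are hypothetical assumptions, not the actual SEGUICI API.

```python
# Sketch of uploading one geo-localized vegetation report; the endpoint
# and the JSON field names are hypothetical, not the SEGUICI API.
import json
import urllib.request

report = {
    "lat": 45.19, "lon": 9.16,                  # GPS fix from the device
    "timestamp": "2016-05-10T14:32:00Z",        # observation time (UTC)
    "crop_condition": "water_stress_moderate",  # user's guided self-assessment
    "photo_id": "img_0042.jpg",                 # picture uploaded separately
}

req = urllib.request.Request(
    "https://api.example.org/reports",          # hypothetical endpoint
    data=json.dumps(report).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)                          # 200/201 on successful ingestion
```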
The envisaged system sits halfway between satellite-only products and other Copernicus services such as LUCAS [5], in which in-situ validation is carried out by a number of experts collecting samples at selected locations for further analysis.
Potential end-users, including irrigation consortia, have already been contacted and involved in the context of the SEGUICI project, and they will be kept in the loop on these further developments.
References
[1] D. A. Galeazzo, D. De Vecchi, F. Dell’Acqua, “Citizens as Sensors: from a multi-purpose framework to App implementation”, Earth Observation Open Science 2.0, 12-14 October 2015, Frascati, Italy.
[2] D. A. Galeazzo, D. De Vecchi, F. Dell'Acqua, P. Demattei. “A small step towards citizen sensor: a multi-purpose framework for mobile apps”. International Geoscience and Remote Sensing Symposium IGARSS, 26-31 July, Milan, Italy, 2015.
[3] SEGUICI project, available online at: http://www.seguici.eu/
[4] Natura2000, Copernicus Local Land Monitoring Service, available online at: http://land.copernicus.eu/local/natura/view
[5] LUCAS, Copernicus In-situ Land Monitoring Service, available online at: http://land.copernicus.eu/in-situ/lucas/view
-
Paper 2401 - Session title: EO Open Science Posters
OPEN-24 - Phytoplankton Seasonality and Coral Reef Biology: ESA Bilko LearnEO! Lesson
Racault, Marie-Fanny; Raitsos, Dionysios E. Plymouth Marine Laboratory (PML), Plymouth PL1 3DH, United Kingdom
Being at the base of the marine food web, phytoplankton provide a source of food for the larvae of many coral reef species, including fish, crustaceans and molluscs. The bloom timing (phenology), along with the magnitude of phytoplankton availability, is a determinant of larval survival. In the Red Sea, coral reef ecosystems have adapted to survive and thrive in one of the most saline and warm seas in the world. However, long-term, large-scale biological datasets for studying this extreme environment are rare, being mainly limited to satellite-based observations of ocean colour. Hence, there is an important opportunity to promote outreach activities that make use of readily accessible satellite products. In this context, we have developed a Web-based lesson introducing ocean-colour observations and phenology indices to non-remote-sensing experts. The lesson was written as part of the ESA Bilko “Learn Earth Observation (LearnEO!)” programme. It introduces phenology metrics relevant to monitoring the seasonality of phytoplankton using satellite ocean-colour data from the ESA Ocean Colour Climate Change Initiative (OC-CCI) project. It allows high school and university students, as well as researchers, to investigate phytoplankton dynamics in major coral reefs of the Red Sea, and offers them further insight into why this information is key for fisheries management. This lesson will help broaden the community able to explore and utilise ocean-colour data, supporting responsible stewardship of the marine ecosystem.
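To illustrate the kind of phenology metric such a lesson can introduce, the sketch below estimates bloom initiation from a chlorophyll time series using a common threshold criterion (first exceedance of the long-term median plus 5%); the values are invented for illustration and are not OC-CCI data.

```python
# Bloom-timing sketch: first week chlorophyll-a exceeds the series
# median + 5%, a common threshold criterion; data are illustrative.
import statistics

weeks = list(range(1, 13))                    # week index within a season
chl = [0.20, 0.21, 0.22, 0.25, 0.40, 0.65,    # chlorophyll-a (mg m-3)
       0.80, 0.70, 0.45, 0.30, 0.24, 0.21]

threshold = statistics.median(chl) * 1.05     # median + 5% criterion

bloom_start = next(w for w, c in zip(weeks, chl) if c > threshold)
bloom_peak = weeks[chl.index(max(chl))]

print(f"threshold={threshold:.3f} mg m-3, "
      f"bloom starts week {bloom_start}, peaks week {bloom_peak}")
```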
-
Paper 2468 - Session title: EO Open Science Posters
OPEN-3 - Spatial and Temporal Collocation of Instrument Observations for Earth Observation Missions
Pinol Sole, Montserrat; Mezzera, Cecilia; Duesmann, Berthyl ESA, The Netherlands
With the increasing number of Earth Observation missions in flight, activities based on combining data from instruments on different satellites have a growing potential. As the number of missions grows, so do the opportunities for observation of the same area by different instruments within a limited time period. The main question for the Earth Observation community is then where and when these observation opportunities will take place.
Several software solutions exist to analyse the coverage of individual instruments. Examples of tools provided by the ESA EOP System Support Division will be presented to cover this aspect. These software tools allow users to identify coverage opportunities prior to launch as well as during the commissioning phase and nominal operations.
This study goes a step further and presents the results of analysing instrument collocation opportunities for different types of instruments (optical, radar, altimeters) available on different satellites. Providing a mechanism to easily identify these opportunities would benefit users involved in instrument calibration activities, as well as users interested in combining data from different types of products acquired over the same geographical area within a given period of time between observations.
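A minimal sketch of such a collocation criterion is given below: two ground tracks, reduced to (time, latitude, longitude) samples, are matched within a maximum time separation and a maximum great-circle distance. The sample tracks and thresholds are illustrative assumptions, not output of the ESA tools.

```python
# Collocation sketch: find observation pairs close in both time and space.
# Ground-track samples are illustrative tuples, not real orbit data.
from datetime import datetime, timedelta
from math import radians, cos, sin, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def collocations(track_a, track_b, max_dt=timedelta(minutes=15), max_km=50.0):
    """Return (a, b) sample pairs within max_dt in time and max_km in space."""
    return [(a, b) for a in track_a for b in track_b
            if abs(a[0] - b[0]) <= max_dt
            and haversine_km(a[1], a[2], b[1], b[2]) <= max_km]

# Illustrative samples: (time, latitude, longitude)
t0 = datetime(2016, 5, 9, 12, 0)
track_a = [(t0, 75.0, 10.0), (t0 + timedelta(minutes=50), -75.0, 170.0)]
track_b = [(t0 + timedelta(minutes=10), 75.2, 10.4)]

print(collocations(track_a, track_b))  # one pair matches both criteria
```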
As an example, prior to the CryoSat launch, the overlap between Envisat (AATSR Nadir and ASAR Wide Mode instruments) and CryoSat/SIRAL was simulated for an acquisition planning test. The regions of interest were two latitude bands around the poles (from 60 to 90 degrees North and from 60 to 90 degrees South). Only overlapping Envisat-CryoSat observations with a time difference below 15 minutes were taken into account. The main conclusions regarding the spatial and temporal collocation were the following:
- There were at most two opportunities per Envisat orbit, one in the North Pole region and one in the South Pole region
- A pattern of 2 days with opportunities followed by 3 days without any overlap was observed
- The maximum duration of the opportunities was about 30 seconds for ASAR Wide-SIRAL and 40 seconds for AATSR Nadir-SIRAL
A case study comparing acquisitions by EarthCARE/ATLID and EarthCARE/MSI with respect to Sentinel-5P/TROPOMI is also analysed. The results are presented in terms of revisit time between observations and their geographical distribution. Constraints related to instrument operations, e.g. being active only within a certain range of sun zenith angles (daytime operations), are also considered.
A particular case within this type of analysis is a constellation of two or more satellites carrying identical instruments, such as Sentinel-1A and Sentinel-1B. To illustrate this case, the coverage of the instruments of the Sentinel-3 constellation is studied.
-
Paper 2675 - Session title: EO Open Science Posters
OPEN-8 - Lightweight Software Tools for Geospatial Data Representation in the Web
Panidi, Evgeny; Terekhov, Anton; Kazakov, Eduard; Kapralov, Evgeny Saint-Petersburg State University, Russian Federation
Modern platforms for geospatial data representation and processing in the Web typically use server-side or cloud-based solutions. In the general case, a physical or virtual dedicated server is required to store geospatial data or to publish Web-based geospatial applications. As a result, it is extremely difficult to integrate geospatial data and software tools with ordinary Web resources that use virtual hosting, mostly due to the absence of ready-to-use frameworks for this purpose. Similarly, geospatial Web applications in practice lack the ability to use client-side computational resources for data processing. This situation leads to difficulties in the implementation of small- and medium-scale projects, and imposes restrictions when building infrastructure for scientific research. For example, cloud-based solutions are very helpful for storing and processing remote sensing Big Data; however, in research and development projects we need to publish and retrieve relatively small datasets and developed software tools in a grid-like manner (e.g. using a peer-to-peer grid system).
In our study, we estimate the possibilities of using the client's computational resources when exploiting geospatial Web services. Among other things, we assess the feasibility of designing lightweight and portable geospatial Web services and applications, which makes it possible to avoid binding to a particular cloud infrastructure or requiring a dedicated server. Previously, we designed techniques and implemented software tools for building geospatial Web services that process data online on the client side, transmitting the required software components to the client. At the current stage of our study, we work on implementation techniques for the portable server-side software needed for Web publishing of geospatial data. In this context, we develop solutions that comply with Open Geospatial Consortium (OGC) standards and allow ordinary hosting to be used for publishing geospatial data services in the Web.
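As a minimal illustration of the lightweight-publishing idea (a sketch, not the tools developed in this study), the snippet below serves a small GeoJSON dataset using nothing but the Python standard library, the kind of footprint compatible with ordinary hosting.

```python
# Minimal GeoJSON server using only the standard library; the feature
# content is a placeholder dataset.
import http.server
import json

FEATURES = {
    "type": "FeatureCollection",
    "features": [{
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [30.3, 59.95]},
        "properties": {"name": "sample station"},
    }],
}

class GeoJSONHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(FEATURES).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/geo+json")
        self.send_header("Content-Length", str(len(body)))
        self.send_header("Access-Control-Allow-Origin", "*")  # client-side apps
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    http.server.HTTPServer(("", 8000), GeoJSONHandler).serve_forever()
```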
The study was partially supported by the Russian Foundation for Basic Research (RFBR), research project No. 13-05-12079 ofi_m.
-
Paper 2724 - Session title: EO Open Science Posters
OPEN-21 - ESERO CZ - Digital Tools for EO Education
Mares, Petr (1); Štych, Přemysl (2); Holman, Lukas (1); Kusak, Radim (1) 1: ESERO CZ, Czech Republic; 2: Charles University in Prague
This contribution presents the activities and outcomes of ESERO Czech Republic, part of ESA's official network of education offices (ESERO - European Space Education Resource Office). Among the nine ESERO offices in Europe, the Czech office specializes in EO topics and the use of digital technologies in education. Alongside classical education programmes focusing on EO (analysis and interpretation of satellite images), a newly prepared tablet app for secondary schools and further possibilities for developing tablet/mobile apps using virtual reality will be presented.
In the Czech Republic, ESERO is implemented by a consortium of educational and scientific organisations (Scientica Agency, Charles University in Prague, Astronomical Institute of the ASCR, Czech Technical University in Prague, Tereza Association, iQLandia, Palacky University in Olomouc), and all outputs and programmes are available free of charge.