Data Fusion for Remote Sensing of Fires and Floods in the Sentinels Era

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Environmental Remote Sensing".

Deadline for manuscript submissions: closed (30 June 2023) | Viewed by 18146

Special Issue Editors


Dr. Gloria Bordogna
Guest Editor
CNR IREA, Via Bassini 15, 20133 Milan, Italy
Interests: fuzzy logic and soft computing for the representation and management of imprecision and uncertainty of textual and geographic information; volunteered geographic information user-driven quality assessment in citizen science; crowdsourced information spatiotemporal analytics; information retrieval on the web; flexible query languages for information retrieval and geographic information systems; ill-defined environmental knowledge representation and management; multisource geographic information fusion and synthesis

Dr. Daniela Stroppiana
Guest Editor
Institute for Electromagnetic Sensing of the Environment, Italian National Research Council (IREA-CNR), 7-00185 Roma, Italy
Interests: burned area mapping; multi-spectral image processing; time series analysis; assessment of fire impacts

Special Issue Information

Dear Colleagues,

This Special Issue aims to collect scientific contributions proposing multisource Data Fusion methods for the remotely sensed mapping, monitoring and assessment of fires and floods in the Sentinels Era.

Natural hazards such as fires and floods are phenomena with a large spatial dimension and impact. Their mapping, monitoring and assessment can be carried out using satellite remote-sensing platforms. The Sentinel missions have opened a new era for operational monitoring, providing reliable and up-to-date information for ecosystem monitoring and management.

Climate change poses further challenges by exacerbating the impacts of extreme events on ecosystems; in this context, remotely sensed data, particularly from the Earth Observation Sentinel missions, can support environmental analysis and monitoring.

Beyond the Sentinels, Earth Observation encompasses several other missions orbiting the Earth, providing data with distinct spectral, spatial and temporal granularities. Algorithms and methods able to fully exploit the complementarity of these systems are needed to deliver robust, operational semantic information that is easily interpretable by humans.

Moreover, the integration of remotely sensed data with ground/in situ measurements calls for new data fusion methods and applications to improve traditional approaches to the remote sensing of natural resources.

This Special Issue aims to collect a broad set of scientific contributions proposing Data Fusion methods for the remotely sensed mapping, monitoring and assessment of fires and floods using multisource sensors and data, including Sentinel and other remote-sensing data, in situ sensors and human sensors. Topics of interest include both theoretical and applied themes; see the keyword list below.

Dr. Gloria Bordogna
Dr. Daniela Stroppiana
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Multisource and multi-scale image data fusion from Sentinel missions
  • Multi-mission data fusion (Sentinel and non-Sentinel missions)
  • Multiscale, multispectral and multi-temporal remote-sensing data fusion
  • Pixel-level, attribute-level, feature-level and object-level remote-sensing data fusion
  • Quality enhancement by remote-sensing data fusion
  • Uncertainty reduction by remote-sensing data fusion
  • Concurrent and complementary remote-sensing data fusion
  • Numeric and symbolic remote-sensing data fusion
  • Soft fusion strategies in remote sensing
  • Remote-sensing data fusion for susceptibility, vulnerability, hazard and risk

Published Papers (3 papers)


Research

22 pages, 7523 KiB  
Article
A Novel Water Index Fusing SAR and Optical Imagery (SOWI)
by Bin Tian, Fangfang Zhang, Fengkai Lang, Chen Wang, Chao Wang, Shenglei Wang and Junsheng Li
Remote Sens. 2022, 14(21), 5316; https://0-doi-org.brum.beds.ac.uk/10.3390/rs14215316 - 24 Oct 2022
Cited by 1 | Viewed by 3551
Abstract
Continuous and accurate acquisition of surface water distribution is important for water resources evaluation, especially high-precision flood monitoring. During surface water extraction, optical imagery is strongly affected by clouds, while synthetic aperture radar (SAR) imagery is easily influenced by numerous physical factors; thus, water extraction methods based on single-sensor imagery cannot obtain high-precision water extents across multiple scenarios. Here, we integrated the radar backscattering coefficient of ground objects into the Normalized Difference Water Index to construct a novel SAR and Optical Imagery Water Index (SOWI), and extracted the water extents of five study areas. We compared two previous automatic extraction methods based on single-sensor imagery and evaluated the accuracy of the extraction results. Compared with using optical or SAR imagery alone, accuracy in all five regions improved by 1–18%. The fusion-derived products achieved user accuracies of 95–99% and Kappa coefficients of 85–97%. SOWI was then applied to monitor the flood disaster induced by heavy rainfall in Henan Province in 2021, obtaining a time series of flood inundation extent. Our results verify SOWI's capability for continuous, high-precision monitoring, accurately identifying waterbodies beneath clouds and algal blooms. By reducing random noise, SOWI mitigates the defects of SAR and smooths the roughness of water boundaries. SOWI is suitable for high-precision water extraction in a wide range of scenarios and has great potential for use in flood disaster monitoring and water resources statistics.
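The abstract does not reproduce the SOWI formula, so the sketch below only illustrates the general idea of blending a normalized-difference optical index with rescaled SAR backscatter into a single water indicator; the weighting, band choice and dB range are assumptions for demonstration, not the published SOWI definition.

    import numpy as np

    def ndwi(green, nir, eps=1e-6):
        """McFeeters' Normalized Difference Water Index from optical bands."""
        return (green - nir) / (green + nir + eps)

    def fused_water_index(green, nir, sigma0_vv_db, alpha=0.5,
                          sigma_min=-25.0, sigma_max=0.0):
        """Illustrative SAR-optical water index (not the published SOWI).

        VV backscatter (dB) is rescaled to [0, 1] and inverted, since open
        water scatters specularly and returns little energy to the sensor;
        the result is blended with NDWI mapped to the same range. The weight
        alpha and the dB range are assumptions made for this sketch.
        """
        sar_term = 1.0 - np.clip((sigma0_vv_db - sigma_min) /
                                 (sigma_max - sigma_min), 0.0, 1.0)
        opt_term = (ndwi(green, nir) + 1.0) / 2.0  # map NDWI from [-1, 1] to [0, 1]
        return alpha * opt_term + (1.0 - alpha) * sar_term

    # Toy pixel: high green reflectance, low NIR, low backscatter -> water-like score
    print(fused_water_index(np.array([0.12]), np.array([0.03]), np.array([-22.0])))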

26 pages, 14466 KiB  
Article
A Google Earth Engine Approach for Wildfire Susceptibility Prediction Fusion with Remote Sensing Data of Different Spatial Resolutions
by Sepideh Tavakkoli Piralilou, Golzar Einali, Omid Ghorbanzadeh, Thimmaiah Gudiyangada Nachappa, Khalil Gholamnia, Thomas Blaschke and Pedram Ghamisi
Remote Sens. 2022, 14(3), 672; https://0-doi-org.brum.beds.ac.uk/10.3390/rs14030672 - 30 Jan 2022
Cited by 41 | Viewed by 9744
Abstract
The effects of the spatial resolution of remote sensing (RS) data on wildfire susceptibility prediction are not fully understood. In this study, we evaluate the effects of coarse (Landsat 8 and SRTM) and medium (Sentinel-2 and ALOS) spatial resolution data on wildfire susceptibility prediction using random forest (RF) and support vector machine (SVM) models. In addition, we investigate the fusion of the predictions from the different spatial resolutions using Dempster–Shafer theory (DST) and 14 wildfire conditioning factors. Seven factors are derived separately from the coarse and medium spatial resolution datasets for the whole forest area of Guilan Province, Iran. All conditioning factors are used to train and test the SVM and RF models in the Google Earth Engine (GEE) software environment, along with an inventory dataset from comprehensive global positioning system (GPS)-based field survey points of wildfire locations. These locations are evaluated and combined with coarse-resolution satellite data, namely the thermal anomalies product of the Moderate Resolution Imaging Spectroradiometer (MODIS), for the period 2009 to 2019. We assess the performance of the models using four-fold cross-validation with the receiver operating characteristic (ROC) curve method. The area under the curve (AUC) of the ROC yields 92.15% and 91.98% accuracy for the SVM and RF models, respectively, on the coarse RS data. In comparison, the AUCs for the medium RS data are 92.5% and 93.37%, respectively. Remarkably, the highest AUC value of 94.71% is achieved by the RF model when the coarse and medium resolution datasets are combined through DST.
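For readers unfamiliar with the fusion step, the following minimal sketch shows how Dempster's rule of combination merges two mass functions over the frame {fire, no fire}, in the spirit of the DST fusion of coarse- and medium-resolution predictions described above; the mass values are invented for illustration and do not reproduce the authors' GEE implementation.

    def dempster_combine(m1, m2):
        """Dempster's rule of combination for two mass functions whose keys
        are frozensets of hypotheses (the full frame carries ignorance)."""
        combined, conflict = {}, 0.0
        for a, ma in m1.items():
            for b, mb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + ma * mb
                else:
                    conflict += ma * mb  # mass placed on incompatible hypotheses
        if conflict >= 1.0:
            raise ValueError("sources are in total conflict")
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    FIRE, NOFIRE = frozenset({"fire"}), frozenset({"no_fire"})
    THETA = FIRE | NOFIRE  # frame of discernment

    # Invented per-pixel masses, e.g. from the medium- and coarse-resolution models
    m_medium = {FIRE: 0.70, NOFIRE: 0.10, THETA: 0.20}
    m_coarse = {FIRE: 0.55, NOFIRE: 0.25, THETA: 0.20}
    print(dempster_combine(m_medium, m_coarse))  # fused belief in "fire" rises to ~0.82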

21 pages, 41697 KiB  
Article
Mapping Outburst Floods Using a Collaborative Learning Method Based on Temporally Dense Optical and SAR Data: A Case Study with the Baige Landslide Dam on the Jinsha River, Tibet
by Zhongkang Yang, Jinbing Wei, Jianhui Deng, Yunjian Gao, Siyuan Zhao and Zhiliang He
Remote Sens. 2021, 13(11), 2205; https://0-doi-org.brum.beds.ac.uk/10.3390/rs13112205 - 04 Jun 2021
Cited by 7 | Viewed by 3155
Abstract
Outburst floods resulting from giant landslide dams can cause devastating damage along hundreds or thousands of kilometres of a river. Accurate and timely delineation of flood-inundated areas is essential for disaster assessment and mitigation. There have been significant advances in flood mapping using remote sensing images in recent years, but little attention has been devoted to outburst flood mapping. The short duration of these events and observation constraints from cloud cover significantly challenge outburst flood mapping. This study used the outburst flood of the Baige landslide dam on the Jinsha River on 3 November 2018 as an example to propose a new flood mapping method that combines optical images from Sentinel-2, synthetic aperture radar (SAR) images from Sentinel-1 and a Digital Elevation Model (DEM). First, in the cloud-free region, a comparison of four spectral indexes calculated from a time series of Sentinel-2 images indicated that the normalized difference vegetation index (NDVI) with a threshold of 0.15 provided the best separation of the flooded area. Subsequently, in the cloud-covered region, an analysis of dual-polarization RGB false color composite images and backscattering coefficient differences from Sentinel-1 SAR data revealed a clear response to the changes in ground roughness caused by the flood. We then built a flood extent prediction model based on the random forest algorithm. Training samples consisted of 13 feature vectors obtained from the Hue-Saturation-Value color space, backscattering coefficient differences/ratios and DEM data, with a label set derived from the flood extent mapped from the Sentinel-2 images. Finally, a field investigation and a confusion matrix were used to test the prediction accuracy of the end-of-flood map. The overall accuracy and Kappa coefficient were 92.3% and 0.89, respectively. The full extent of the outburst flood was successfully obtained within five days of its occurrence. The multi-source data merging framework and the large-scale sample preparation method with SAR images proposed in this paper provide a practical demonstration for similar machine learning applications using remote sensing.
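The classification step can be illustrated with a minimal scikit-learn sketch: a random forest is trained on per-pixel feature vectors (standing in for the HSV channels, backscatter difference/ratio and DEM features) with labels from the optically derived flood mask, and assessed with a confusion matrix and Kappa coefficient. The data below are synthetic placeholders, not the study's training set.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import confusion_matrix, cohen_kappa_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the per-pixel feature table (13 features such as
    # HSV channels of the dual-pol composite, backscatter difference/ratio, DEM).
    n_pixels, n_features = 5000, 13
    X = rng.normal(size=(n_pixels, n_features))
    # Synthetic stand-in for labels taken from the Sentinel-2-derived flood mask
    # in the cloud-free region (1 = flooded, 0 = not flooded).
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_pixels) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    rf.fit(X_train, y_train)
    y_pred = rf.predict(X_test)

    print(confusion_matrix(y_test, y_pred))
    print("Kappa:", cohen_kappa_score(y_test, y_pred))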
