Article

Detection of Levee Damage Based on UAS Data—Optical Imagery and LiDAR Point Clouds

Department of Photogrammetry, Remote Sensing and Spatial Information Systems, Faculty of Geodesy and Cartography, Warsaw University of Technology, Pl. Politechniki 1, 00-661 Warsaw, Poland
* Author to whom correspondence should be addressed.
ISPRS Int. J. Geo-Inf. 2020, 9(4), 248; https://0-doi-org.brum.beds.ac.uk/10.3390/ijgi9040248
Submission received: 19 March 2020 / Revised: 15 April 2020 / Accepted: 16 April 2020 / Published: 17 April 2020

Abstract

This paper presents a methodology for levee damage detection based on Unmanned Aerial System (UAS) data. In this experiment, the data were acquired from a UAS platform equipped with a laser scanner and a digital RGB (Red, Green, Blue) camera. Laser scanning point clouds were used to generate the Digital Terrain Model (DTM), and the images were used to produce the RGB orthophoto. The main aim of the paper is to present a methodology that combines elevation data with a vegetation index derived from the RGB orthophoto to indicate potential places of levee failure. Both types of multi-temporal data collected from the UAS platform are applied separately: elevation and optical data. Two DTMs from different time periods were compared: the first was generated from an airborne laser scanning (ALS) point cloud and the second was derived from UAS Laser Scanning (ULS) data. Archival and new orthophotos were converted to Green-Red Vegetation Index (GRVI) raster datasets. From the GRVI rasters, change detection for unvegetated ground areas was analysed using a dynamically indicated threshold. The result of this approach is the localisation of places for which the change in height correlates with the appearance of unvegetated ground. This simple, automatic method provides a tool for specialist monitoring of levees, the critical objects protecting against floods.


1. Introduction

Hazard monitoring is becoming increasingly important. Systematically acquired satellite images are commonly used for monitoring phenomena that may devastate large areas, for example, tornadoes, earthquakes, large forest fires and flood hazards [1,2,3]. However, there are also threats to smaller areas that cannot be monitored with satellite imagery, even though the failure of the affected structures can cause severe flood-related hazards. Levees, also called dykes, embankments, floodbanks or stopbanks, are small-sized linear facilities. They play a very important role in flood protection, but they are only one of the tools available for flood control [4]. They are usually monitored and periodically measured with direct surveying techniques that guarantee high accuracy, but the long distances between cross-sections limit detailed analysis of their entire surface. Considering various remote sensing data sources, levee defects cannot be seen in satellite imagery because its resolution is not high enough. These defects include soil loss, soil movement and damage caused by animals and people. Therefore, airborne data can be used in this application in order to detect the hazardous areas [5,6]. Nowadays, levee monitoring can be successfully conducted by means of remote sensing techniques, and in most cases aerial photogrammetric data are used, especially high-resolution aerial imagery [7,8] and LiDAR (Light Detection and Ranging) point clouds [9]. LiDAR data have many advantages in comparison to photogrammetric data, the most important being penetration through vegetation. LiDAR data can deliver reliable information about bare ground; however, their resolution (expressed as point cloud density) cannot be compared to aerial images, which can be collected with centimetre resolution. The comparison of both techniques has been a topic of scientific publications for years [10], initially concerning high-altitude manned platforms. Unfortunately, the disadvantage of manned platforms is the frequency and cost of data acquisition: photogrammetric flights are usually performed only every few years for the same area. For corridor objects such as levees, acquiring data using an Unmanned Aerial System (UAS) may become a more cost-efficient alternative to high-altitude photogrammetry [7,8,11]. In the literature, several articles focus on using spectral information from UAS-borne images in order to analyse and detect vegetation [12,13,14]. Typically, RGB (Red, Green, Blue) orthophotos are the primary product generated from UAS imagery; however, UAS data can also deliver rasters of vegetation indices such as the Normalised Difference Vegetation Index (NDVI) or the Green-Red Vegetation Index (GRVI). The latter index is interesting in the case of UAS because most of these platforms are equipped only with optical RGB cameras. Some references address the use of multi-temporal UAS data in change-detection analysis [15,16,17,18,19]. References to the use of UAS laser scanning mostly present experiments over high vegetation, where it is most justified [20]. The increased temporal and spatial resolution of UAS acquisitions should provide more effective monitoring of levee conditions and identification of potential flood hazards. This is especially true since the use of other, moderate-resolution airborne data to evaluate changes in small areas is often limited by technology, resources or environmental conditions.
In this article, a solution utilising the UAS platform is presented as a more cost-efficient alternative to high-altitude photogrammetry in levee monitoring. UAS data can also be used in multi-temporal analyses together with typical aerial data. Unmanned aerial systems have already found application in various fields, which can be attributed to the short UAS data processing time, high accuracy and the non-invasive character of the measurement. Additionally, UAS platforms can be equipped with various sensors: laser scanners and almost all types of cameras, including optical, near-infrared, ultraviolet and thermal.
The aim of this article is to present a methodology for monitoring levees and detecting their defects in order to reduce the negative effects of floods, if they occur. The datasets used to develop this methodology, LiDAR point clouds and RGB images, come from two time periods and two different sources. One dataset was acquired using a UAS, whereas the archival datasets were collected from an airplane. The LiDAR point clouds and RGB images were processed into a differential elevation model and an unvegetated-ground change-detection raster, respectively. This information was then used by hydrologists for spatial analyses of the levee condition, which is crucial for the evaluation of flood-protection infrastructure.
This paper is part of an implementation project in which multi-source and multi-temporal data make it possible to evaluate levee condition. In the following sections, the methodology for using products derived from the data obtained with the UAS platform, Digital Terrain Models (DTMs) and orthophotos, is presented. Furthermore, a practical example is given, with a discussion of the differential model and the vegetation index raster, including the selection of the threshold value necessary to distinguish unvegetated ground in the change detection. The final sections present a summary, conclusions and future work related to the implementation of the methodology within the IT (information technology) system for levee management called SAFEDAM [6,21].

2. Methodology of UAS Data Application in the IT System

This section presents the methodology that uses UAS data for levee monitoring and its implementation in the IT system, together with a description of the data tested in the experiment.

2.1. Description of the Workflow

Figure 1 presents the methodology for detecting levee damage developed within the SAFEDAM project, applicable when RGB and LiDAR data are available. It is based on UAS or high-altitude data. In the methodology, two remote sensing data sources are utilised: point clouds obtained from laser scanners and the RGB orthophoto. They are used in parallel for the analysis of elevation and land cover change detection.
The laser scanning data described in Section 2.2 were used for generating the digital terrain models in external software dedicated to LiDAR data, including trajectory post-processing (Applanix POSPac) and point cloud classification into two classes, ground and non-ground, followed by DTM generation (Terrasolid). A raster of height differences (dDTM, the differential DTM) was calculated from two DTMs from different time periods in a further step to detect elevation changes (e.g., landslides) that may imply a defect and, as a result, a hazard. In the experiment, low-altitude UAS Laser Scanning (ULS) and typical high-altitude airborne laser scanning (ALS) datasets were used. In the IT system, if appropriate data are available, either ULS-based or ALS-based DTMs can be subtracted. The dDTM must be reclassified so that areas with changes can be correctly identified and the result is not influenced by the accuracy of the datasets. The reclassification threshold value selected in the methodology is described in Section 3.1.
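For illustration, the differencing and threshold reclassification step can be sketched as follows. This is a minimal example assuming the two DTMs have already been resampled to a common grid and loaded as aligned NumPy arrays; the function name, the synthetic input and the default threshold are illustrative, not part of the SAFEDAM implementation.

```python
import numpy as np

def ddtm_hazard_mask(dtm_new, dtm_old, threshold=0.25):
    """Subtract two co-registered DTM rasters and flag cells whose absolute
    height change exceeds the chosen threshold (in metres)."""
    ddtm = dtm_new - dtm_old                    # differential DTM (dDTM)
    hazard = np.abs(ddtm) > threshold           # NaN comparisons yield False
    return ddtm, hazard

# Illustrative call with synthetic 0.25 m grids; real input would be the
# ULS- and ALS-based DTMs described in the text.
dtm_uls = np.random.normal(140.0, 0.05, (400, 400))   # "new" DTM (2017)
dtm_als = dtm_uls.copy()                               # "old" DTM (2011)
dtm_als[100:120, 200:230] += 0.6                       # simulated height change
ddtm, hazard = ddtm_hazard_mask(dtm_uls, dtm_als)
print(hazard.sum(), "cells exceed the 0.25 m change threshold")
```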
The aerial images described in Section 2.2 were processed in external photogrammetric software (Agisoft Metashape for the UAS data and Trimble Inpho for the archival aerial images) to orthorectify the optical data using the DTM. In the described methodology, both RGB orthophotos are used to calculate the GRVI. The index is used to detect the “new” unvegetated ground areas that appeared between the two data acquisitions, because the lack of vegetation is also a hazard where levee safety is considered. By analysing both rasters of unvegetated ground, the differences (places where unvegetated ground newly appears) are found. The crucial problem in the application of the GRVI index is finding appropriate index values for unvegetated area detection; this is related to the different dates, times and conditions of data collection. This issue is presented in Section 3.2.
“New” unvegetated ground areas and the dDTM are delivered to identify areas of possible levee damage for further analysis by a specialist. During interpretation, areas where both conditions are met (|dDTM| > threshold and a detected “new” unvegetated ground area) must be verified carefully to confirm a potential failure; however, meeting only the first condition (|dDTM| > threshold) can also be interpreted as a hazard, in contrast to noticing only a “new” unvegetated ground area, which could be the effect of a simple change in vegetation caused by other reasons such as a drought.
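A sketch of this decision logic, assuming both layers are available as boolean rasters on the same grid (the category codes are illustrative):

```python
import numpy as np

def interpretation_classes(ddtm_exceeded, new_unvegetated):
    """Combine the two boolean layers into simple interpretation codes:
    2 = height change and new unvegetated ground (verify carefully),
    1 = height change only (still treated as a potential hazard),
    0 = no height change (including new unvegetated ground alone)."""
    classes = np.zeros(ddtm_exceeded.shape, dtype=np.uint8)
    classes[ddtm_exceeded] = 1
    classes[ddtm_exceeded & new_unvegetated] = 2
    return classes
```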

2.2. Description of the Tested Data

The test area where the data were acquired is a levee in Annopol, Poland, near the Vistula River (Figure 2); it is approximately 6 km long. The aerial images from the airplane were acquired in June 2015 with a Ground Sampling Distance (GSD) of 25 cm, and laser scanning data with an average density of 4 points per square metre were acquired in 2011, within the national programme of collecting ALS data [22]. The UAS data were acquired in May 2017 with the Hawk Moth platform equipped with two sensors, a YellowScan Surveyor laser scanner and a Sony Alpha a6000 RGB camera, in eight corridor flight missions over the levees and their surroundings [19].
The density of the ULS point cloud was 180 points per square metre. The ULS point cloud was classified in Terrasolid software, which implements Axelsson's algorithm for ground classification. DTMs were generated from the classified ULS and ALS point clouds at a resolution of 0.25 m, which was investigated by the authors in an earlier article [19]. Detailed information about the data is presented in Table 1.
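For illustration only, gridding already-classified ground points into a regular DTM can be sketched as a simple per-cell aggregation; this is not the filtering or interpolation performed in Terrasolid, and the function and variable names are hypothetical.

```python
import numpy as np

def grid_ground_points(x, y, z, cell=0.25):
    """Aggregate already-classified ground points into a regular grid by
    taking the mean elevation per cell; empty cells remain NaN."""
    col = ((x - x.min()) / cell).astype(int)
    row = ((y.max() - y) / cell).astype(int)
    shape = (row.max() + 1, col.max() + 1)
    z_sum = np.zeros(shape)
    z_cnt = np.zeros(shape)
    np.add.at(z_sum, (row, col), z)   # accumulate elevations per cell
    np.add.at(z_cnt, (row, col), 1)   # count points per cell
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(z_cnt > 0, z_sum / z_cnt, np.nan)
```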
The ground resolution of the images obtained with the UAS platform was 0.025 m, a consequence of acquiring them simultaneously with the ULS data, and the final orthophoto was generated at a resolution of 0.10 m. Such a resolution is adequate for detecting unvegetated ground for levee damage assessment. Considering image orientation, the archival aerial images from 2015 were already oriented with exterior orientation parameters provided by the National Geodetic and Cartographic Resources. The UAS images were georeferenced in a bundle adjustment and orthorectified in Agisoft Metashape software using control and check points (1 × 1 m black-and-white chessboard targets, marked and measured in the field with a Leica GS15 GNSS antenna and CS15 controller using the Real-Time Kinematic method with an accuracy of 0.02 m).

3. Results of Change-Detection

This section is divided into three parts presenting the use of elevation data (Section 3.1), optical data (Section 3.2) and their application in the IT system (Section 3.3). The first part consists of detecting elevation changes between the DTMs generated from laser scanning data acquired in different time periods; the analysis of height changes based on the differential DTM indicates places of possible hazard using only information from elevation change detection. The second part consists of searching for changes in the levee land cover; thus, in this part, RGB images and vegetation indices calculated from the image bands are used. The last part describes the application of the above-mentioned products in the presented IT system for levee monitoring.

3.1. Elevation Data

The first part of the methodology presented in Figure 1 consists of detecting changes between elevation data. Two DTMs acquired in different periods were compared (Figure 3). The first DTM was obtained from archival ALS data and was treated as the reference; the second DTM was generated from the UAS LiDAR data. The DTMs were then subtracted and the differential DTM (dDTM) was analysed. Such a raster derived from two DTMs is a popular source for the interpretation of mass movements [23,24,25]. In the experiment, three height thresholds were considered, 0.10 m, 0.25 m and 0.50 m, proposed as visualisation variants to support end-users in interpreting the possible occurrence of hazards related to potential levee damage. For the 0.10 m threshold, many areas could be qualified as a hazard. This may be the result of low vegetation, which mainly grows on levee slopes and their surroundings; these places can also be influenced by the accuracy of the LiDAR data or by limited penetration of the laser beam through dense vegetation. With the 0.50 m threshold, almost nothing in the dDTM was classified as a hazard. Thus, 0.25 m was chosen as the best threshold value for the visualisation in a system supporting hydrologists' work. In Figure 4, the reclassified differences between the DTMs are presented. Height differences exceeding an absolute value of 0.25 m are marked with a blue-red colour scale as a hazard and need to be verified and compared with the results of the orthoimage analysis. The analysed difference raster is largely free of vegetation effects because vegetation is filtered out of the LiDAR point cloud during DTM generation. However, a small impact of dense vegetation (crops, young coniferous trees or dense high grass) can still be observed in the differential model. If such an influence appears as low values (usually falling in the 0.25–0.50 m class), it can be interpreted using the orthomosaic.
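A minimal sketch of the band reclassification used for this visual comparison, assuming the dDTM is available as a NumPy array (the function name is illustrative; the band limits follow the thresholds discussed above):

```python
import numpy as np

def ddtm_bands(ddtm):
    """Classify absolute dDTM values into the bands used for visual comparison:
    0: < 0.10 m, 1: 0.10-0.25 m, 2: 0.25-0.50 m, 3: > 0.50 m."""
    return np.digitize(np.abs(ddtm), bins=[0.10, 0.25, 0.50])
```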

3.2. Optical Data

The second part of the methodology presented in Figure 1 consists of detecting changes in the levees from the RGB images by determining the Green-Red Vegetation Index (GRVI), which has been used or examined in several other publications [26,27,28]. The GRVI is calculated with Formula (1):
GRVI = (ρ_G − ρ_R) / (ρ_G + ρ_R)     (1)
where:
  • ρ_R is the reflectance of the visible red band of the image;
  • ρ_G is the reflectance of the visible green band of the image.
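A minimal sketch of the per-pixel GRVI computation, assuming the green and red bands of the orthophoto are available as NumPy arrays (the function name and the zero-denominator handling are illustrative):

```python
import numpy as np

def grvi(green, red):
    """Per-pixel Green-Red Vegetation Index, Formula (1): (G - R) / (G + R).
    Pixels where the denominator is zero are returned as NaN."""
    green = np.asarray(green, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    denom = green + red
    out = np.full(green.shape, np.nan)
    np.divide(green - red, denom, out=out, where=denom > 0)
    return out
```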
Photos taken from the UAS platform are collected at different times and dates; thus, for the approach to be applicable, it must focus on producing a reliable result of index processing, namely a raster of unvegetated area. The image processing methodology does not include radiometric correction of the photos based on marked control targets, white-point calibration or sensor response curves; however, radiometric adjustment was applied in the photogrammetric software during orthomosaic generation before the orthomosaics were added to the system, which unifies the radiometry within one area for one date. Proper detection of unvegetated areas is closely related to the GRVI threshold. Motohka et al. [26] proposed a value close to GRVI = 0, which can be an effective threshold to distinguish green vegetation from other types of ground cover. It was noted that this index changes with the seasons, especially in autumn. In this case, it was decided that the GRVI index serves as the basis for unvegetated ground detection.
In the presented methodology, the GRVI was calculated from both aerial and UAS images. An example of its values is shown in Figure 5. As a next step, in order to distinguish unvegetated ground from other objects in the image, the threshold value of the index needs to be determined. For both aerial and UAS images, the threshold values depend on several factors, for example, the time of image acquisition, sensor type and sunlight. Therefore, the values are not constant and need to be assigned every time new RGB images are used. The raster files representing unvegetated ground must be subtracted in order to identify new unvegetated ground that appeared within the period of analysis. This change-detection process generates a raster file representing the new unvegetated ground.
In order to find the most appropriate GRVI threshold value, an experiment was conducted using reference unvegetated ground areas. Polygons created within these training sites were used to assess classification accuracy across a range of GRVI thresholds for both datasets (Figure 6). As a next step, the raster files presenting the GRVI values were classified into two classes according to the chosen threshold values: unvegetated ground and vegetated ground. Finally, it was possible to evaluate which threshold value is the most suitable. Therefore, the areas of the vectorised test fields were compared with the unvegetated ground class included within the polygons. The results of the comparison are presented in Table 2.
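The threshold assessment can be sketched as follows, assuming the reference polygons have been rasterised to the GRVI grid as a boolean mask; the function and the example threshold range are illustrative and mirror the comparison reported in Table 2.

```python
import numpy as np

def threshold_accuracy(grvi_raster, reference_mask, thresholds):
    """For each candidate threshold, the percentage of pixels inside the
    reference (unvegetated) polygons that are classified as unvegetated
    ground (GRVI below the threshold)."""
    valid_ref = reference_mask & ~np.isnan(grvi_raster)
    total = valid_ref.sum()
    return {t: 100.0 * ((grvi_raster < t) & valid_ref).sum() / total
            for t in thresholds}

# Example threshold range as in Table 2 (0.06 down to -0.05 in 0.01 steps):
# accuracy = threshold_accuracy(grvi_uas, reference_mask,
#                               np.round(np.arange(0.06, -0.06, -0.01), 2))
```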
As can be concluded from Table 2, there are differences between the thresholds for the GRVI calculated from the aerial images and from the UAS images (different time period, sensor etc.). For the given threshold values, the percentage of pixels correctly classified as unvegetated ground is different. According to Table 2, the threshold that guarantees at least 95% correctness in unvegetated area detection is lower for the GRVI from the aerial images than for the GRVI from the UAS images (0.02 and 0.03, respectively).
Based on the results presented in Table 2, a graph was generated (Figure 7). It shows two curves representing the percentage of detected unvegetated ground within the UAS and aerial images for the selected thresholds. As can be noticed, for negative threshold values the percentage of detected unvegetated ground increases rapidly, and then the curves gradually become flatter, which suggests that an appropriate threshold value has been reached.
By analysing the results in Table 2 and the GRVI thresholds presented in Figure 8, the best GRVI threshold values for both the high-altitude and the UAS images can be chosen. In Figure 8a, the unvegetated ground classification based on the aerial images is presented; large differences between the selected thresholds can easily be noticed. Similar remarks can be made for Figure 8b, in which the unvegetated ground areas based on the UAS images are shown. According to Table 2, for the selected test fields the correctness of the detected unvegetated ground increases with a growing threshold value; however, the higher the threshold value, the slower this increase. Additionally, it is worth analysing the graphic representation of the GRVI threshold in Figure 7, which shows that with a higher threshold the unvegetated ground class may be overestimated. Based on Table 2 as well as Figure 8, the following threshold values were selected: 0.02 for high-altitude photogrammetry and 0.03 for the UAS imagery. According to the results presented in Table 2, higher threshold values detect the unvegetated ground surface more completely, but the GRVI classification rasters indicate an overestimation of the unvegetated ground class.
After choosing the most appropriate GRVI threshold value for both datasets using a dynamic tool that visualises the reclassification result as a transparent overlay on the orthophoto at a large scale, the new unvegetated ground class can be distinguished in the change-detection analysis for the whole area. Thus, the raster files presenting the unvegetated ground, classified according to the selected thresholds, must be subtracted. As a result, the raster file showing the “new” unvegetated ground (uncovered-ground change-detection raster) was obtained.
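A minimal sketch of this final subtraction step, assuming the two GRVI rasters are co-registered and the dataset-specific thresholds have already been chosen (the function name and the example thresholds are illustrative):

```python
import numpy as np

def new_unvegetated_ground(grvi_old, grvi_new, thr_old, thr_new):
    """Pixels classified as unvegetated ground (GRVI below the dataset-specific
    threshold) in the newer orthophoto but not in the older one."""
    return (grvi_new < thr_new) & ~(grvi_old < thr_old)

# With the thresholds selected in the experiment (aerial 0.02, UAS 0.03):
# new_unveg = new_unvegetated_ground(grvi_aerial_2015, grvi_uas_2017, 0.02, 0.03)
```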

3.3. Application of the Results

The idea behind using UAS data for levee monitoring is that the differential digital terrain model and the “new” unvegetated ground detection should provide a preliminary assessment of “hazards”. These potential hazard areas would subsequently be analysed by an expert, who would provide a more thorough evaluation of the site, looking for ground movement over a large area of the levee. Such layers can support hydrologists' work if orthophotos and DTMs for both analysed time periods are available. Based on the results shown in Figure 9 and Figure 10, a few places representing changes in the levee construction were found. Figure 10 illustrates profiles showing height differences in the levee between the two periods (UAS DTM and ALS DTM). The areas marked with blue rectangles in Figure 10 are the results of the analysis that indicate a potential danger. These areas were found based on the assumed approach, i.e., the differential DTM and the “new” unvegetated ground class. In these places, both height differences and new unvegetated ground occurred (Figure 9), even though the reference ALS data were collected almost four years before the optical aerial data; in this application, up-to-date data collection is crucial. The profiles in Figure 10 confirm the presumption of a landslide of the levee side, a landslide of the levee shaft and random elevation changes caused by animal or human activity. In the analysed test area, dozens of damaged places were verified and all of them were indicated and confirmed with the presented method, which confirms its usefulness.

4. Discussion

Utilising differential DTMs from LiDAR and aerial imagery for landslide inventories is a well-known methodology [23,25]. Because of the higher resolution of UAS data, this technology has recently been introduced to erosion monitoring. Previous studies have used DSMs from high-resolution images for automated measurement of erosion on loess soil [17]. In other applications, digital terrain models and orthoimages from the UAS were also used [15]. In our approach, we applied separate processing and analysis of UAS-derived elevation and optical data as a tool supporting the SAFEDAM system [6,21]. We showed how this methodology helps in finding potential places of levee failure. All damages measured manually were also found with the proposed algorithm, i.e., indicated by the analysis of differential DTMs and by the GRVI index providing information about “new” unvegetated ground areas.
Considering the fact that all potentially detected damages must be manually verified or measured in the field, we decided to develop a simple semi-automated tool for levee monitoring. The presented results showed that there is high potential for using UAS data in erosion monitoring.
Monitoring erosion and changes based on DTMs, especially in areas not covered with vegetation, is not a very complex task, assuming accurate and multi-temporal data. The challenge may lie in taking vegetation into account in the analysis and using vegetation indices. In vegetation analysis and detection, the near-infrared band is crucial; however, in [28], using indices based on RGB images for biomass estimation was shown to be very promising. In [16], the influence of different factors on the values of indices from RGB images was examined; the GRVI index showed the greatest accuracy variations associated with flight altitude. In other research [26], the GRVI index was useful in indicating vegetation phenology, especially autumn leaf colouring. What is worth highlighting, according to the mentioned articles, is that there is a problem with setting one threshold value in vegetation analysis. This is because the index values depend on the conditions under which the images were taken, from the lighting to the flight height. Similar conclusions can be drawn from the experiment described in this article. Thus, in the task of detecting vegetated/non-vegetated areas, it is not possible to set one GRVI threshold value. Several factors impact the determination of the GRVI threshold. The main factor is the time when the images are acquired: during dry weather the vegetation condition is worse and, as a result, dry vegetation may also be classified as unvegetated ground using the GRVI index. Additionally, there were differences in image resolution in the data used in this approach: the resolution of the aerial orthophoto was 0.25 m, whereas the UAS orthophoto is characterised by a pixel size of 0.10 m, which may also influence the GRVI threshold value.
Although UAS LiDAR is becoming more popular, most UAS data are still images. The methodology presented in this article is one of the first approaches in which both UAS images and LiDAR are used for the detection of damage such as erosion. Zhou [11] described that small damages of several centimetres can cause internal erosion; however, the detection of such small differences between two datasets is quite difficult considering the orientation errors of both data sources, photogrammetric and LiDAR. Considering that the accuracy of ULS data is better than 0.10 m [18,19], it is possible to detect elevation changes of 0.15–0.20 m using multi-temporal datasets. Detecting an elevation change below 10 cm is very challenging; thus, a threshold of 25 cm was selected to indicate areas not influenced by incorrect point cloud filtering or weak penetration through vegetation. Based on the conducted experiment, the detection of damage in the levee structure using remote sensing data can support the work of specialists who must monitor levees, without fully automated procedures.
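A simple error-propagation estimate, assuming the vertical errors of the two DTMs (Table 1) are independent, is consistent with this range:

σ_dDTM = √(σ_ULS² + σ_ALS²) = √(0.10² + 0.15²) ≈ 0.18 m,

which is of the same order as the 0.15–0.20 m changes quoted above and below the 0.25 m reclassification threshold, and which also suggests why changes under about 0.10 m cannot be reliably separated from data noise.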

5. Conclusions

In this article, the methodology of detecting defects in levees using RGB imagery and LiDAR point clouds from airborne platforms (including the UAS) was presented. The data used in the experiment were acquired in two different periods. In the methodology, two aspects were analysed: elevation differences and changes in land cover. While generating the DTMs, it is important to distinguish the ground class properly; otherwise, large differences in the differential model will falsely indicate a hazard. The GRVI index was used to identify land cover changes. In order to distinguish the unvegetated ground class, a threshold value must be selected. The threshold may differ for various images (different sensor, time, date, vegetation status, weather etc.). Thus, it is not recommended to choose one value, but rather to enable the manual selection of the most appropriate threshold value in order to detect unvegetated ground. Additionally, there may be other differences between the analysed levees, i.e., they may be built differently, there may be some damage, or the structure of the levees may change between the data acquisitions. Therefore, an individual analysis approach to each area is recommended. The aim of the methodology is then to deliver reliable data resulting from the DTM and orthoimage processing from two periods. This makes it possible to detect changes and potential damage in the levee area using the differential DTM and the extracted “new” unvegetated ground layer. The observer should thus initially focus their attention on these places.
The experiment demonstrated that the methodology works properly. Additionally, the proposed methodology shows great potential because the results are not dependent on the prevailing conditions, the time of data acquisition or the use of different types of sensors. However, it must be highlighted that the implementation of the methodology is not fully automated. The results need to be verified by hydrology specialists, who have better knowledge about the levee structure, after the potential damage has been indicated with the presented method. Many complex approaches can be developed in this application regarding automatic classification of change detection. Current change-detection approaches based on the three bands of an RGB image and an elevation raster are not enough for fully reliable detection. Nevertheless, analyses other than those proposed in this paper, based on differential DTM rasters and detection of new unvegetated ground areas, could be included in an automatic methodology. When looking for a method of detecting non-vegetated surfaces, it is possible to analyse the normalised Digital Surface Model (nDSM) and sigma0, a geometrical index describing the vertical distribution of the point cloud. There is also the option to use information from the intensity of the laser beam. Such approaches can be more difficult for multi-temporal analysis of elevation models and orthoimages. They would also be hard to implement in tools dedicated to specialists who normally do not work with geospatial data. We strongly believe that these issues are a good topic for future work.
Utilising an unmanned aerial system has many advantages compared to high-altitude photogrammetry. The data can be acquired more often, and the method is more cost-efficient. A UAS can also deliver better quality data than airborne platforms: a higher LiDAR point cloud density and images with higher spatial resolution. In systems that aim to create simple land cover maps based on classification with a vegetation index, this resolution helps to detect places of interest covering only a small area. In the proposed approach, the analysis ends with a raster that serves as an additional layer for the expert. This seemingly simple methodology also addresses the problem of differing spatial resolutions, because the same index values cannot be used for datasets acquired with different sensors.

Author Contributions

Krzysztof Bakuła designed the methodology and the application of the UAS data in the IT system created within the SAFEDAM project and was responsible for funding acquisition. Magdalena Pilarska carried out the major analyses related to the GRVI index; Adam Salach and Krzysztof Bakuła worked with the ULS data in the presented investigation; Zdzisław Kurczyński coordinated the research, took part in the analysis of the results and was responsible for project administration. All authors contributed to the final version of the manuscript text and participated in preparing the graphic part of the paper. All authors have read and agreed to the published version of the manuscript.

Funding

The presented results were obtained within the project "Advanced technologies in the prevention of flood hazard (SAFEDAM)", financed by the National Centre for Research and Development within the Defence and Security Programme.

Acknowledgments

The authors would like to thank the MSP and the Institute of Meteorology and Water Management—National Research Institute for their co-operation with photogrammetric works and providing the UAS images and laser scanning data used in the presented study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tralli, D.M.; Blom, R.G.; Zlotnicki, V.; Donnellan, A.; Evans, D.L. Satellite remote sensing of earthquake, volcano, flood, landslide and coastal inundation hazards. ISPRS J. Photogramm. Remote Sens. 2005, 59, 185–198.
  2. Kussul, N.; Skakun, S.; Shelestov, A.; Lavreniuk, M.; Yailymov, B.; Kussul, O. Regional scale crop mapping using multi-temporal satellite imagery. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 45–52.
  3. Weintrit, B.; Osińska-Skotak, K.; Pilarska, M. Feasibility study of flood risk monitoring based on optical satellite data. Misc. Geogr. 2018, 22, 172–180.
  4. Yen, B.C. Hydraulics and Effectiveness of Levees for Flood Control. In US–Italy Research Workshop on the Hydrometeorology, Impacts, and Management of Extreme Floods, 1995. Available online: https://www.engr.colostate.edu/ce/facultystaff/salas/us-italy/papers/44yen.pdf (accessed on 14 April 2020).
  5. Long, G.; Mawdesley, M.J.; Smith, M.; Taha, A. Simulation of airborne LiDAR for the assessment of its role in infrastructure asset monitoring. In Proceedings of the International Conference on Computing in Civil and Building Engineering; Tizani, W., Ed.; Nottingham University Press: Nottingham, UK, 2010.
  6. Kurczyński, Z.; Bakuła, K. SAFEDAM-zaawansowane technologie wspomagające przeciwdziałanie zagrożeniom związanym z powodziami [SAFEDAM-advanced technologies supporting the prevention of flood hazards]. Arch. Fotogram. Kartogr. Teledetekcji 2016, 28, 39–52.
  7. Tournadre, V.; Pierrot-Deseilligny, M.; Faure, P.H. UAV photogrammetry to monitor dykes-calibration and comparison to terrestrial lidar. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 40, 143.
  8. Zhou, Y.; Rupnik, E.; Faure, P.H.; Pierrot-Deseilligny, M. GNSS-assisted integrated sensor orientation with sensor pre-calibration for accurate corridor mapping. Sensors 2018, 18, 2783.
  9. Bakuła, K.; Ostrowski, W.; Szender, M.; Plutecki, W.; Salach, A.; Górski, K. Possibilities for using lidar and photogrammetric data obtained with an unmanned aerial vehicle for levee monitoring. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 773–780.
  10. Baltsavias, E.P. A comparison between photogrammetry and laser scanning. ISPRS J. Photogramm. Remote Sens. 1999, 54, 83–94.
  11. Zhou, Y. 100% Automatic Metrology with UAV Photogrammetry and Embedded GPS and Its Application in Dike Monitoring. Ph.D. Thesis, Université Paris-Est, Paris, France, 2019.
  12. Berni, J.A.J.; Zarco-Tejada, P.J.; Suárez, L.; González-Dugo, V.; Fereres, E. Remote sensing of vegetation from UAV platforms using lightweight multispectral and thermal imaging sensors. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2009, 38, 6.
  13. Hunt, E.R.; Hively, W.D.; Fujikawa, S.; Linden, D.; Daughtry, C.S.; McCarty, G. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305.
  14. Torres-Sánchez, J.; López-Granados, F.; Peña, J.M. An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Comput. Electron. Agric. 2015, 114, 43–52.
  15. D’Oleire-Oltmanns, S.; Marzolff, I.; Peter, K.D.; Ries, J.B. Unmanned aerial vehicle (UAV) for monitoring soil erosion in Morocco. Remote Sens. 2012, 4, 3390–3461.
  16. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113.
  17. Eltner, A.; Baumgart, P.; Maas, H.; Faust, D. Multi-temporal UAV data for automatic measurement of rill and interrill erosion on loess soil. Earth Surf. Process. Landforms 2015, 40, 741–755.
  18. Bakuła, K.; Ostrowski, W.; Pilarska, M.; Szender, M.; Kurczyński, Z. Evaluation and calibration of fixed-wing multisensor UAV mobile mapping system: Improved results. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 189–195.
  19. Salach, A.; Bakuła, K.; Pilarska, M.; Ostrowski, W.; Górski, K.; Kurczyński, Z. Accuracy Assessment of Point Clouds from LiDAR and Dense Image Matching Acquired Using the UAV Platform for DTM Creation. ISPRS Int. J. Geo-Inf. 2018, 7, 342.
  20. Wallace, L.; Lucieer, A.; Malenovský, Z.; Turner, D.; Vopěnka, P. Assessment of forest structure using two UAV techniques: A comparison of airborne laser scanning and structure from motion (SfM) point clouds. Forests 2016, 7, 62.
  21. Weintrit, B.; Bakuła, K.; Jędryka, M.; Bijak, W.; Ostrowski, W.; Wziątek, D.Z.; Ankowski, A.; Kurczyński, Z. Emergency rescue management supported by UAV remote sensing data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 563–567.
  22. Kurczyński, Z.; Bakuła, K. The selection of aerial laser scanning parameters for countrywide digital elevation model creation. In Proceedings of the 13th SGEM GeoConference of Informatics, Geoinformatics and Remote Sensing, Albena, Bulgaria, 16–22 June 2013; Volume 2, pp. 695–702.
  23. Dewitte, O.; Jasselette, J.C.; Cornet, Y.; Van Den Eeckhaut, M.; Collignon, A.; Poesen, J.; Demoulin, A. Tracking landslide displacements by multi-temporal DTMs: A combined aerial stereophotogrammetric and LIDAR approach in western Belgium. Eng. Geol. 2008, 99, 11–22.
  24. Allemand, P.; Delacourt, C.; Gasperini, D.; Kasperski, J.; Pothérat, P. Thirty Years of Evolution of the Sedrun Landslide (Swisserland) from Multitemporal Orthorectified Aerial Images, Differential Digital Terrain Models and Field Data. Int. J. Remote Sens. Appl. 2011, 1, 30–36.
  25. Zieher, T.; Perzl, F.; Rössel, M.; Rutzinger, M.; Meißl, G.; Markart, G.; Geitner, C. A multi-annual landslide inventory for the assessment of shallow landslide susceptibility–Two test cases in Vorarlberg, Austria. Geomorphology 2016, 259, 40–54.
  26. Motohka, T.; Nasahara, K.N.; Oguma, H.; Tsuchida, S. Applicability of green-red vegetation index for remote sensing of vegetation phenology. Remote Sens. 2010, 2, 2369–2387.
  27. Nagai, S.; Saitoh, T.M.; Kobayashi, H.; Ishihara, M.; Suzuki, R.; Motohka, T.; Nasahara, K.N.; Muraoka, H. In situ examination of the relationship between various vegetation indices and canopy phenology in an evergreen coniferous forest, Japan. Int. J. Remote Sens. 2012, 33, 6202–6214.
  28. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
Figure 1. Methodology of detecting a levee hazard area based on the UAS and high-altitude photogrammetry data. LiDAR: Light Detection and Ranging; RGB: Red, Green, Blue; DTM: Digital Terrain Model; dDTM: differential DTM; GRVI: Green-Red Vegetation Index.
Figure 2. Levee test area in Annopol near the Vistula River (red filled polygon), presented on the orthophoto.
Figure 3. Digital terrain models generated from airborne laser scanning (ALS) and ULS data with the calculated differential model of elevation.
Figure 4. Results of differential DTM reclassification using distinct threshold values (0.10 m, 0.25 m and 0.50 m) and an orthophoto (2015) of the test area.
Figure 5. Example of GRVI calculated from the aerial RGB images compared with the orthophoto (aerial images: June 2015, UAS images: May 2017).
Figure 6. Example of reference polygons used in the best GRVI threshold determination (red polygons—aerial images, blue polygons—UAS images).
Figure 7. Curves showing the percentage of detected ground on the UAS (May 2017) and aerial (June 2015) images for the selected threshold values.
Figure 8. Visualisation of selected GRVI thresholds calculated from aerial (a) and UAS (b) images.
Figure 9. Example profiles presented on the new unvegetated ground class (a–c) and the DTM differences (d–f), indicating potential danger for the levee (landslide), for three examples.
Figure 10. Profiles presenting the three examples of differences between DTMs detected on the differential DTM and the new unvegetated ground class, with an indication of potential danger related to a landslide of the levee shaft (a), a landslide of the levee side (b) and random elevation changes (c), presented in two dimensions in Figure 9.
Table 1. Characteristics of the data used in the experiment. UAS: Unmanned Aerial System; ULS: UAS Laser Scanning.

Dataset | Acquisition Date | Resolution/Density | Accuracy | Sensor
aerial orthophoto | June 2015 | 0.25 m | 0.50 m horizontal | UltraCam-Xp
UAS orthophoto | May 2017 | 0.10 m | 0.10 m horizontal | Sony Alpha 6000
aerial LiDAR | October 2011 | 4 points/m² | 0.15 m vertical, 0.40 m horizontal | N/A
ULS | May 2017 | 180 points/m² | 0.10 m vertical, 0.20 m horizontal | YellowScan Surveyor
Table 2. Comparison of the detected unvegetated ground areas within reference polygons for different GRVI threshold values.

GRVI Threshold | Aerial orthophoto: No. of px (resol. 0.25 m) | Aerial orthophoto: Area (m²) | Aerial orthophoto: Area (%) | UAS orthophoto: No. of px (resol. 0.10 m) | UAS orthophoto: Area (m²) | UAS orthophoto: Area (%)
0.06 | 57,165 | 3572.8 | 99.8 | 256,321 | 2563.2 | 99.3
0.05 | 56,964 | 3560.2 | 99.4 | 255,051 | 2550.5 | 98.9
0.04 | 56,624 | 3539.0 | 98.8 | 252,990 | 2529.9 | 98.1
0.03 | 56,111 | 3506.9 | 97.9 | 248,710 | 2487.1 | 96.4
0.02 | 55,316 | 3457.2 | 96.5 | 240,165 | 2401.7 | 93.1
0.01 | 54,038 | 3377.4 | 94.3 | 218,384 | 2183.8 | 84.6
0.00 | 51,523 | 3220.2 | 89.9 | 179,917 | 1799.2 | 69.7
−0.01 | 43,762 | 2735.1 | 76.4 | 124,356 | 1243.6 | 48.2
−0.02 | 33,653 | 2103.3 | 58.7 | 82,727 | 827.3 | 32.1
−0.03 | 19,134 | 1195.9 | 33.4 | 49,612 | 496.1 | 19.2
−0.04 | 4045 | 252.8 | 7.1 | 27,018 | 270.2 | 10.5
−0.05 | 17 | 1.1 | 0.0 | 14,019 | 140.2 | 5.4
