Review

Recent Advances in Unmanned Aerial Vehicles Forest Remote Sensing—A Systematic Review. Part II: Research Applications

by Riccardo Dainelli, Piero Toscano, Salvatore Filippo Di Gennaro * and Alessandro Matese
Institute of BioEconomy (IBE), National Research Council (CNR), 50145 Florence, Italy
* Author to whom correspondence should be addressed.
The two authors contributed equally to this work.
Submission received: 16 February 2021 / Revised: 19 March 2021 / Accepted: 24 March 2021 / Published: 27 March 2021
(This article belongs to the Special Issue Forestry Applications of Unmanned Aerial Vehicles (UAVs) 2020)

Abstract:
Sustainable forest management aims to maintain the income from woody goods for companies while preserving non-productive functions as a benefit for the community. Thanks to progress in platforms and sensors and to the opening of a dedicated market, unmanned aerial vehicle–remote sensing (UAV–RS) is strengthening its key role in the forestry sector as a tool for sustainable management. The use of UAVs (Unmanned Aerial Vehicles) in precision forestry has increased exponentially in recent years, as demonstrated by the more than 600 references published from 2018 until mid-2020 found in the Web of Science database by searching for “UAV” + “forest”. This result is even more striking when compared with a similar search for “UAV” + “agriculture”, which returns about 470 references, and shows how UAV–RS research in forestry is gaining popularity. In Part II of this review, the analysis of the main findings of the 227 reviewed papers reveals numerous strengths concerning technical research issues. UAV–RS is widely applied to obtain accurate information on practical parameters (height, diameter at breast height (DBH), and biomass). The effectiveness and soundness of the research demonstrate that UAV–RS is now ready to be applied in a real management context. Some critical issues and barriers to transferring research products are also evident, namely, (1) hyperspectral sensors are little used, and their novel applications should build on the capability of acquiring tree spectral signatures, especially for pest and disease detection; (2) automatic processes for image analysis are poorly flexible or based on proprietary software, at the expense of flexible and open-source tools that can foster researcher activities and support technology transfer among all forestry stakeholders; and (3) a clear lack exists in sensor and platform interoperability for large-scale applications and for enabling data interoperability.

1. Introduction

It is now beyond doubt that scientific production on the use of UAVs (Unmanned Aerial Vehicles) for forest remote sensing is growing exponentially, especially in recent years [1]. In this framework, it may prove useful to organize and analyze unmanned aerial vehicle–remote sensing (UAV–RS) scientific knowledge and the solutions based on it, to make them easily accessible both to researchers and to forestry stakeholders, such as foresters, consulting companies, and public bodies, with a view to improving sustainable forest management and conservation [2].
The present review is organized into two parts (Part I and Part II) and provides a systematic analysis of the recent peer-reviewed literature (2018–mid-2020) on the use of UAVs in forestry remote sensing (RS) applications. The focus is on exploring new research papers dealing with both purely scientific issues and practical applications in the forestry sector. Part I deals with general aspects of applying UAVs in natural, semi-natural, and artificial forestry ecosystems, with the aims of (1) creating a selected bibliographic dataset that other researchers can draw on, (2) analyzing general and recent trends in RS forest monitoring, and (3) revealing gaps in the general research framework. The targeted audience is mainly represented by forest stakeholders (entrepreneurs, technicians, public authorities) and young researchers who want to approach UAV–RS in forestry. Furthermore, even skilled researchers could benefit from Part I as an underpinning for their activities [3].
In this paper (Part II), the authors intend to give a clear picture of recent advances in forest UAV remote sensing and to increase general understanding of the current research topic by synthesizing technical key points from previous scientific papers. In particular, the specific goals are threefold: (1) to give an overview of the technical issues addressed by researchers and identify the most popular forestry topics, laying the foundations for the widespread adoption of practical applications among forestry sector stakeholders; (2) to show how different UAV systems (platform + sensor) are used for multipurpose tasks and provide advice on the pros and cons of the highlighted solutions; and (3) to reveal critical points (not only at the technological level) where additional effort is needed and suggest future directions to address the research questions that could not be resolved with this synthesis. Part II of the present review is primarily aimed at a skilled audience, such as expert researchers, technicians, and consultants.
After a brief synthesis of the workflow for the dataset creation, the Materials and Methods Section presents the research questions used to guide the systematic analysis, along with the specific technical issues investigated for each topic. Based on a rigorous review, the results are elaborated to address the pivotal research questions for the set of studies, previously clustered into six forestry topics. Here, the authors present, where applicable, a focus for each topic sub-section on hyperspectral sensors, comparison with other RS platforms, and the gathering and use of field data. Then, the discussion section focuses on the comparison of different UAV system solutions, the technical advantages and drawbacks of the reviewed UAV technology, and considerations on the technology transfer of UAV–RS into a real management context. Finally, the main conclusions stemming from Part II of the present review are drawn, outlining both popular topics and strengths and providing some suggestions for filling research gaps and future hot points.

2. Materials and Methods

2.1. Workflow for Dataset Creation

The specific details for searching the literature comprehensively and retrieving the final dataset are described in the first paper of this series (Part I) [3]. Briefly, the authors entered into the Web of Science search engine a combination of semantic keywords for the investigated topic (“UAV”, “unmanned aerial vehicle”, “UAS” (Unmanned Aerial System), “unmanned aerial system”, “drone” + “forest” or “forestry”). Through the exclusion criteria available in the search engine, papers are filtered by time boundaries (from 2018 to 30 June 2020), document type (original articles, conference papers, book chapters), and language (English). The second filtering step consists of applying customized exclusion criteria, namely (1) paper adequacy to the review aims, (2) relevance of the publishing source and paper citations, and (3) online availability within the bibliographic resources of our research Institute. Finally, a database of 227 peer-reviewed works was created to analyze and discuss both generic (Part I) [3] and specific issues (Part II) on the use of UAV–RS in the forestry sector with regard to recent applications (2018–mid-2020).
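As a purely illustrative sketch of this two-step filtering, the snippet below applies the time, document type, and language criteria to a hypothetical bibliographic export; the file name and column labels are assumptions for illustration and do not reproduce the actual Web of Science export format.

```python
import pandas as pd

# Purely illustrative: "wos_export.csv" and its column labels ("Title", "Keywords",
# "Year", "Document Type", "Language") are assumptions, not the real export format.
records = pd.read_csv("wos_export.csv")

platform_terms = ["uav", "unmanned aerial vehicle", "uas",
                  "unmanned aerial system", "drone"]
forest_terms = ["forest", "forestry"]

# Keep records whose title or keywords mention both a platform term and a forest term.
text = (records["Title"].fillna("") + " " + records["Keywords"].fillna("")).str.lower()
has_platform = text.apply(lambda t: any(term in t for term in platform_terms))
has_forest = text.apply(lambda t: any(term in t for term in forest_terms))

filtered = records[
    has_platform & has_forest
    & records["Year"].between(2018, 2020)                                   # time boundary
    & records["Document Type"].isin(["Article", "Proceedings Paper", "Book Chapter"])
    & (records["Language"] == "English")
]
filtered.to_csv("candidate_dataset.csv", index=False)
```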

2.2. Topic Classification and Related Research Questions

Papers have been categorized into six main topics following the main research issues investigated in the selected papers. The addressed topics are listed below, while a brief description of each topic is reported in Part I [3]. After categorization, information extraction from each paper is carried out following three degrees of detail. Firstly, general data for paper identification (publication year, author, title, publication name, and publication type) are collected. Secondly, technical details common to all research papers, regarding sensor type, study location (continent and country), forest type, plant group, and species, are extracted; moreover, a focus on machine learning techniques (crown segmentation algorithms/object detection methods), the use of or comparison with other RS platforms, and ground truth data is carried out, when relevant. Thirdly, for an in-depth analysis, specific information is gathered considering the particularities of each topic.
This synthesis is then used to address the research questions (RQs), which are reported below together with the topic list and topic-specific information gathered in the third step of paper analysis. The authors compiled the RQs that allowed both a systematic report of the results and a comparison between different papers within the same category.
1. Setting and accuracy of imagery products:
The acquisition, preprocessing, and processing of UAV data for forest structure characterization are the key issues within this topic. The photogrammetry software packages and, where available, the versions used for processing UAV data are also analyzed.
RQs: What flight and imagery processing settings have the researchers focused on? What are the most frequently generated products? How do these products and their creation differ between scientific papers?
2. Tree detection and inventory parameters:
This topic encompasses the retrieval of several forest inventory parameters, such as height, diameter at breast height (DBH), crown area, etc. Considering that individual tree detection and crown delineation are hot issues within this topic, the focus is on crown segmentation algorithms and object detection methods. Additionally, information on the software packages employed to implement machine learning techniques is gathered.
RQs: What are the most investigated forest structural parameters? What purpose does their estimation serve? What are the main machine learning techniques that have been used for tree detection and delineation?
3. Aboveground biomass/volume estimation:
Biomass estimation is a key issue in UAV–RS forest monitoring. The authors collect the biomass parameter on which the assessment is centered (i.e., aboveground biomass (AGB), overall volume, growing stock volume, carbon content). The allometric equations or modeling methods used for retrieving and upscaling UAV-monitored biomass are also highlighted.
RQs: What purpose does the biomass estimation serve? What are the main machine learning techniques that have been used for tree detection and delineation? What kind of allometric equations do the authors use?
4. Pest and disease detection:
Examining papers dealing with forest health monitoring, the review focuses on the host–pathogen system. Therefore, the tree species involved in the pathogen outbreak are identified, and the causal agent (disease, insect pest, or abiotic stressor) is classified.
RQs: Which diseases or pests are mainly discussed? In which types of forests is disease incidence most severe? With which sensors are diseases monitored?
5. Species recognition and invasive plant detection:
In this topic, the recognition goals of the selected studies are underlined. In particular, papers are classified according to their main aim, i.e., the assessment of species (dominant or not) in forest stands or the spatial analysis of weed/alien plant invasions. A brief description of the general context in which the studies are carried out and the target species are also specified. Moreover, as in the case of the inventory parameters topic, crown segmentation algorithms and object detection methods are analyzed due to their importance in crown delineation and tree detection.
RQs: What are the main goals for which tree recognition is applied? Which invasive species are mapped the most and in which type of ecosystem/research place? What are the main machine learning techniques that have been used for tree detection and delineation?
6. Conservation, restoration, and fire monitoring:
This section includes a broad range of sub-topics. Hence, there is a need to clarify the main objective of each study. The present review categorizes the selected papers into different groups according to the following research purposes: land-use classification, biodiversity conservation, fire monitoring, post-disturbance monitoring, and restoration.
RQs: What are the main sensors employed in the conservation sub-topic? In which restoration cases is UAV technology used? What are the main research applications, and which sensors are utilized in fire monitoring?
At the end of each topic section, the authors present, where applicable, a focus on hyperspectral sensors, comparison with other RS platforms, and the gathering and use of field data. The RQs that helped us report findings and structure the related discussions are the following:
RQs: Which applications are hyperspectral sensors used for? How are other RS platforms used, and what pros and cons emerge? What are the inventory parameters collected as ground data, and how are they used?

3. Results

3.1. Global Results: A Brief Recap of Part I

A total of 227 papers were selected to form the final dataset of the present systematic review. The global analysis results, together with a display of the overall dataset, are reported in Part I [3]. Below, the authors report a brief synthesis of the general results achieved after applying the second level of the filtering criteria (adequacy, relevance, and paper availability) in Part I. Even in the short timespan analyzed (2018–mid-2020), the increasing trend in the number of scientific publications in UAV-based forestry research seems to be confirmed, considering also the projection for 2020. Analyzing the geographic distribution of research sites, Asia and Europe are top-ranked for scientific production (33% and 31%, respectively), while Africa presents only five research items. The researchers have focused mainly on natural forests (142 papers) and, among plant groups, on broadleaf woodland (37%). Tree detection and inventory parameters is the most discussed topic (42%), while pest and disease detection is poorly addressed (7%). Setting and accuracy of imagery products and AGB estimation are particularly discussed in natural forests and, as could be expected, species recognition and invasive plant detection is not tackled in the planted/irregular forest. Regarding sensors, it turns out that the RGB (Red Green Blue) camera stands out by far from the other technologies, having been used in almost 51% of the selected papers, especially for gathering tree inventory parameters.
From here on, the authors systematically report findings on the selected UAV forestry topics that researchers have dwelt on and on the recent advances gained.

3.2. Setting and Accuracy of Imagery Products

Twenty-two research papers deal with the setting and accuracy of imagery products; they are reported in Table 1. All of these studies are carried out on natural/irregular forests, except for [4,5,6].
As can be inferred from Table 1, the most used photogrammetry software is AgiSoft PhotoScan (Agisoft LLC, St. Petersburg, Russia), from version 1.2.2 to 1.4.6 [4,7,8,9,10,11,12,13,14,15,16,17]. Some of the reviewed studies perform a comparison among leading structure from motion (SfM) packages for UAV image processing [5,18,19]. In particular, Brach et al. [19] examine six of the photogrammetric software packages most frequently used in forestry applications: AgiSoft PhotoScan, DroneDeploy (San Francisco, CA, USA), Pix4Dmapper (Pix4D S.A., Lausanne, Switzerland), APS (Menci Software S.r.l., Arezzo, Italy), PrecisionMapper (PrecisionHawk, Raleigh, NC, USA), and Maps Made Easy (Drones Made Easy, San Diego, CA, USA). In addition to photogrammetric software performance, researchers have focused on image acquisition settings in terms of flying height [16,18], overlap [7,13,16], timing [10,20,21], and resolution [16].
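To make the role of these acquisition settings concrete, the following minimal sketch applies the standard photogrammetric relations between flying height, ground sample distance (GSD), image footprint, and forward overlap; the sensor values and flight parameters are illustrative assumptions, not settings taken from any reviewed study.

```python
# Standard photogrammetric flight-planning relations; all numeric values are
# illustrative assumptions, not settings from any reviewed study.

def ground_sample_distance(pixel_size_m, focal_length_m, flight_height_m):
    """GSD (m/pixel) of a nadir-looking frame camera."""
    return pixel_size_m * flight_height_m / focal_length_m

def trigger_interval(footprint_along_m, forward_overlap, speed_m_s):
    """Seconds between exposures needed to reach the requested forward overlap."""
    baseline = footprint_along_m * (1.0 - forward_overlap)
    return baseline / speed_m_s

pixel_size = 2.4e-6       # 2.4 um pixel pitch (assumed sensor)
focal_length = 8.8e-3     # 8.8 mm lens (assumed)
flight_height = 30.0      # low altitude, in the range discussed by Seifert et al. [16]
image_rows = 3648         # frame size along track, in pixels (assumed)

gsd = ground_sample_distance(pixel_size, focal_length, flight_height)
footprint_along = gsd * image_rows
print(f"GSD: {gsd * 100:.2f} cm/pixel, along-track footprint: {footprint_along:.1f} m")
print(f"Shutter every {trigger_interval(footprint_along, 0.95, 3.0):.2f} s "
      f"at 3 m/s for 95% forward overlap")
```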
The comparison of UAV digital aerial photogrammetry (DAP) imagery products with other imagery sources represents an interesting issue within this topic. A terrestrial laser scanning (TLS)-derived ground point cloud is employed as the ground reference for the UAV digital terrain model (DTM) in teak plantations [4], while Graham et al. [22] compare error values of the UAV digital elevation model (DEM) with airborne laser scanning (ALS) point clouds generated in stratified classes of canopy cover.
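A minimal sketch of such an accuracy check is given below: elevations extracted from the UAV DTM at a set of checkpoints are compared with reference elevations (e.g., from TLS or ALS ground returns) through RMSE, bias, and standard deviation; the numeric values are illustrative placeholders.

```python
import numpy as np

# Sketch of a vertical accuracy check of a UAV DTM against reference elevations
# (e.g., TLS or ALS ground returns); the checkpoint values are illustrative placeholders.

def vertical_error_stats(dtm_z, reference_z):
    """Return RMSE, mean error (bias), and standard deviation of DTM minus reference (m)."""
    diff = np.asarray(dtm_z, float) - np.asarray(reference_z, float)
    return np.sqrt(np.mean(diff ** 2)), diff.mean(), diff.std(ddof=1)

uav_dtm_z = np.array([312.4, 315.1, 310.8, 318.0, 316.2])    # UAV DTM elevations (m)
reference_z = np.array([312.1, 315.4, 310.5, 318.3, 316.0])  # reference elevations (m)

rmse, bias, std = vertical_error_stats(uav_dtm_z, reference_z)
print(f"RMSE = {rmse:.2f} m, bias = {bias:+.2f} m, std = {std:.2f} m")
```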
Only a few studies utilize a hyperspectral sensor. Hakala et al. [11] and Oliveira et al. [15] use a 2D Fabry–Pérot interferometer-based (FPI) camera (prototype 2012b) (VTT Photonic Devices and Measurement Solutions, Espoo, Finland), while Yu et al. [23] present a pipeline to correct the spatial coordinates of images acquired by a Gaia sky-mini pushbroom hyperspectral imaging system.
UAVs are compared with other RS platforms using airborne lidar [12] or spaceborne optical sensors, such as in [14], in which mangroves are mapped also with Pleiades multispectral images. Moreover, aircraft imagery products are used as a reference for accuracy assessment [13,24].
Within the present topic, only four studies carry out intensive field campaigns. Diaz et al. [5] and Aguilar et al. [4] acquire data from a TLS. Jayathunga et al. [12] collect, in two consecutive years, DBH, species, and height of eight canopy trees in each of the 105 plots, while Ruwaimana et al. [14] obtain measurements from mangrove plots and record the existing land use/cover of the study area. This scarcity of field data gathering demonstrates how research efforts are aimed at image acquisition settings and processing workflows.
Among the most representative papers, Seifert et al. [16] aim to increase knowledge on the influence of altitude, image overlap, and image resolution on a multi-view geometry (MVG) reconstruction of a forest from RGB video-based UAV imagery. Low flight altitudes (15–30 m) yield the highest reconstruction detail, particularly in combination with high image overlaps (95% forward and 70% side overlap). Conversely, lower altitudes require a reduced UAV speed to avoid motion blur and hence reduce UAV endurance and the area covered with one battery load. Diaz et al. [5] propose a multi-camera array for acquiring nadir-oblique images and conclude that, in open-canopy deciduous forest patches, the system increases the estimation of maximum canopy height by 50% in comparison to a single-camera system. Discussing multiplatform data fusion for lidar, Guan et al. [6] present a lidar data registration framework that avoids manual effort and overcomes errors due to forest complexity and irregularity. Unlike [25], they merge three different datasets (namely, backpack and UAV lidar with multi-scan terrestrial lidar), reaching a satisfactory data registration accuracy (horizontal root-mean-square error (RMSE) < 30 cm and vertical RMSE < 20 cm).
Table 1. Reviewed studies for setting and accuracy of imagery products topic: sensor, research place, tree species, and photogrammetry software.
Research Paper | Sensor | Research Place | Species | Photogrammetry Software
Fraser and Congalton [18] | RGB | USA | Various | AgiSoft PhotoScan, Pix4Dmapper Pro 3.2
Frey et al. [7] | RGB | Germany | Norway spruce | AgiSoft PhotoScan
Goodbody et al. [10] | Multispectral | Canada | Maple, birch | AgiSoft PhotoScan
Hakala et al. [11] | Hyperspectral | England | Unspecified | AgiSoft PhotoScan
Jayathunga et al. [12] | RGB | Japan | Sakhalin fir (dominant), various | AgiSoft PhotoScan 1.3.2
Ni et al. [13] | RGB | China | Larch | AgiSoft PhotoScan 1.2.2
Ruwaimana et al. [14] | Multispectral | Malaysia | Mangroves | AgiSoft PhotoScan
Aguilar et al. [4] | RGB | Ecuador | Teak | AgiSoft PhotoScan 1.4.4
Brach et al. [19] | RGB | Poland | Unspecified | Various (6)
Graham et al. [22] | RGB | Canada | Douglas fir, hybrid spruce, western redcedar | Pix4Dmapper Pro 4.1
Kellner et al. [20] | Lidar | Czech Republic | European beech, Norway spruce | -
Oliveira et al. [15] | Hyperspectral | Brazil/Finland | Tropical forest, birch | AgiSoft PhotoScan 1.2.6
Polewski et al. [25] | Lidar | China | Metasequoia, poplar | LiForest 1
Seifert et al. [16] | RGB (Video) | Germany | Norway spruce, sycamore maple, silver birch | AgiSoft PhotoScan 1.4.0
Tomastik et al. [17] | RGB | Slovakia | Beech, larch, Norway spruce | AgiSoft PhotoScan 1.4.6
Wallace et al. [8] | RGB | Chile | Various | AgiSoft PhotoScan 1.4.1
Diaz et al. [5] | RGB | Argentina | Ponderosa pine | MicMac 2, Pix4Dmapper Pro 3.0.17 and 4.1.24, VisualSFM 3
Fletcher and Mather [21] | RGB, Multispectral | Australia | Eucalyptus | Pix4Dmapper 4.2.6-4.3.33
Graham et al. [24] | RGB | Canada | Douglas fir, hybrid spruce, western redcedar | Pix4Dmapper Pro 4.1
Guan et al. [6] | Lidar | China | Scots pine, Manchurian red pine | LIDAR360 1
Jurjevic et al. [9] | RGB | Croatia | Various | AgiSoft PhotoScan 1.4.3
Yu et al. [23] | Hyperspectral | China | Various | DPW Photomod 4
1 GreenValley International, Berkeley, CA, USA. 2 Free and open-source software (Rupnik et al. 2017), https://github.com/micmacIGN/micmac (accessed on 16 February 2021). 3 Free for personal, non-profit, or academic use (Wu, 2013), http://ccwu.me/vsfm/ (accessed on 16 February 2021). 4 Racurs, Moscow, Russia. Abbreviations: RGB = Red Green Blue.

3.3. Tree Detection and Inventory Parameters

Most of the papers analyzed in the present review fall within the tree detection and inventory parameters topic, consisting of 95 research works (Table 2). The topic encompasses a wide range of research issues. Detection techniques and inventory parameter extraction are often joined together in the workflow of the reviewed studies. Detection mainly concerns the localization of single whole trees through the forest structure, both in planted [26,27,28] and in natural/irregular forests [29,30,31]. In some cases, even newly grown leaves [32,33], seedlings [34], stumps [35], fallen logs [36], and forest gaps [37] are the detected targets. Moreover, detection (especially of crowns) may be followed by object classification [38,39,40,41]. The most frequent inventory parameters extracted in the reviewed studies, alone or in combination, are height [42,43,44,45,46,47], DBH [48,49,50,51,52,53], crown diameter [54,55,56,57], Leaf Area Index (LAI) [58,59,60], and thermal data [61,62,63]. In addition to the common tree features, tasks regarding phenology [64,65], phenotyping [66,67,68], light behavior [69,70], photosynthetically active radiation [71], and solar radiation to infer seedling growth [72] are also accomplished. Furthermore, the first group of dendrometric characteristics is often related to works that have practical or technology transfer purposes.
More than half of the papers (49) use specific algorithms or machine learning techniques to automatically detect, segment, and classify trees. Many studies segment tree crowns through the watershed algorithm [39,73,74,75,76,77,78] and similar ones, such as the inverse watershed [79,80] and the marker-controlled watershed [54,81,82,83]. Others use convolutional neural networks (CNNs) based on existing architectures [38,41,84,85] or novel ones [86]. Comparisons among segmentation techniques are also carried out [87,88,89,90]. As shown in Table 2, the mainstream software tools are eCognition (Trimble Geospatial, Westminster, CO, USA), Matlab (MathWorks, Natick, MA, USA), and R (with the lidR and rLiDAR packages) (free software, https://www.r-project.org/ (accessed on 16 February 2021)).
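As an illustration of the marker-controlled watershed approach cited above, the following minimal Python sketch (using scikit-image) labels individual crowns in a canopy height model by treating local maxima as tree tops; the height threshold and minimum tree-top distance are illustrative assumptions.

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Sketch of marker-controlled watershed crown segmentation on a canopy height model
# (CHM); the height threshold and minimum tree-top distance are illustrative assumptions.

def segment_crowns(chm, min_height=2.0, min_distance=5):
    """Label individual tree crowns in a CHM given as a 2D array of heights (m)."""
    canopy_mask = chm > min_height                      # exclude ground and low vegetation
    # Local maxima of the CHM are treated as tree tops and used as watershed markers.
    tops = peak_local_max(chm, min_distance=min_distance, labels=canopy_mask.astype(int))
    markers = np.zeros(chm.shape, dtype=int)
    markers[tuple(tops.T)] = np.arange(1, len(tops) + 1)
    # The watershed runs on the inverted CHM, so each crown "fills" downward from its top.
    return watershed(-chm, markers, mask=canopy_mask)

# chm = ...  # e.g., a CHM raster exported from the photogrammetric or lidar workflow
# crown_labels = segment_crowns(chm)
```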
Forest hyperspectral applications are quite limited in this topic (only four papers). Image recording is carried out with customized solutions [78,91,92] or with an off-the-shelf sensor, i.e., FireflEYE 185 (Cubert GmbH, Germany) [93].
Regarding other RS platform comparison/use (13 papers), authors focus mainly on the use of aircraft ALS as opposed to UAV–DAP for estimating tree height [94,95] and canopy features [96,97]. Satellite images are used as ancillary data for observing plant phenology [98] and calculating new spectral indexes [93]; on the contrary, St-Onge and Grandin [99] utilize a spaceborne sensor as the main data source for measuring individual tree height.
About 70% of the research works gather field data. For the sake of brevity, only a few papers that stress this issue (citing the word “field” in the title) are briefly reported—Hentz et al. [74] and Ganz et al. [77] measure height on standing trees, while Jin et al. [100] also record age and DBH, together with height.
Highlighting the most interesting works, Gu et al. [83] focus on the segmentation routine. In particular, they compare the accuracy of segmentation using structural versus spectral information, concluding that single tree crowns segmented using spectral lightness are more accurate than those obtained with a canopy height model (CHM) approach. Abdollahnejad et al. [79] successfully consider the extrapolation of UAV data to a larger area, using a correlation with Pleiades satellite imagery (R2 = 0.94). Finally, recent research [101] analyzes three data sources (field survey, UAV–DAP, and UAV–ALS) to measure height and DBH in a high-value timber forest; regarding tree height, the authors conclude that UAV–DAP could underestimate this inventory parameter, while traditional field height measurement is likely to be influenced by the individual species.
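A minimal sketch of the kind of field-versus-UAV height comparison underlying such conclusions is shown below; the height values are illustrative placeholders, not data from [101].

```python
import numpy as np
from scipy import stats

# Sketch of a field-versus-UAV tree height comparison; the values below are
# illustrative placeholders, not data from the reviewed studies.

field_h = np.array([14.2, 18.7, 21.3, 16.5, 25.1, 19.8])   # field-measured heights (m)
uav_h = np.array([13.6, 18.1, 20.2, 16.0, 23.9, 19.1])     # UAV-DAP heights (m)

fit = stats.linregress(field_h, uav_h)
bias = np.mean(uav_h - field_h)                            # negative bias = underestimation
rmse = np.sqrt(np.mean((uav_h - field_h) ** 2))
print(f"R2 = {fit.rvalue ** 2:.2f}, slope = {fit.slope:.2f}, "
      f"bias = {bias:+.2f} m, RMSE = {rmse:.2f} m")
```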
Table 2. Reviewed studies for tree detection and inventory parameters topic: sensor, tree species, segmentation algorithm/object detection method, software for object detection, other remote sensing (RS) platforms use/comparison, and field data gathering.
Research Paper | Sensor | Species | Segmentation/Object Detection Algorithm | Software | Field Data
Abdollahnejad et al. [72]RGBScots pine
Abdollahnejad et al. [79]RGBScots pineInverse watershedArcGIS 1
Alexander et al. [29]RGBTropical forest
Bagaram et al. [37]RGBBeech, Turkey oakContrast split segmentationeCognition Developer
Balsi et al. [73]LidarHazelWatershed-
Carr and Slyder [102]RGBVariousIteratively growing from seed points)Customized tool
Chen et al. [33]RGBTropical savannaConstrained energy minimization/Otsu method-
Demir [28]RGBStone pine
Fankhauser et al. [45]RGBPonderosa pine, lodgepole pine
Feduck et al. [34]RGBWhite spruce, lodgepole pineCART algorithm/Multiresolution Salford Predictive Modeler v. 70 2/eCognition 9.1
Goodbody et al. [39]RGBVariousWatershed by immersion segmentation/Random forest (OBIA)-
Guerra-Hernandez et al. [94]RGBEucaliptus
Guo et al. [60]MultispectralMangrove
Hentz et al. [74]RGBEucalyptus, Loblolly pineWatershedArcGIS 10.4
Huang et al. [81]RGBFragrant olive, Buddist pineMarker-controlled watershedMatlab
Iizuka et al. [61]Thermal, RGBAcacia crassicarpa Northern wattle
Iizuka et al. [75]RGBJapanese cypressWatershedSAGA-GIS 3 5.0.0
Kattenborn et al. [52]RGBSouthern beech
Klosterman et al. [98]RGBOak, maple
Kuzelka and Surovy [51]RGBNorway spruce, European beech
Lin et al. [32]RGB Big leaved mahogany
Mayr et al. [76]RGBSavannahWatershedSAGA-GIS
Medauar et al. [103]MultispectralEucaliptus
Mokros et al. [49]RGBBeech
Morales et al. [85]RGBMoriche palmCNN semantic level segmentation-
Puliti et al. [35]RGBUnspecifiedIterative region growingR software
Qiu et al. [104]RGBPearSeed region growing-
Rosca et al. [92]HyperspectralTropical forest
Shin et al. [105]MultispectralPonderosa pineTop-to-bottom region growingR software
Thomson et al. [91]HyperspectralTropical forest
Webster et al. [62]Thermal, RGBNorway spruce
Yan et al. [106]LidarMetasequoia, poplarImproved NCut segmentation/Marker controlled watershedMicrosoft Visual Studio 2013 4 (programed using C++) and MATLAB 2016a/ArcGIS 10.2
Blonder et al. [68]MultispectralQuaking aspen
Brieger et al. [54]RGBLarch, Siberian pine, Siberian spruceMarker-controlled inverse watershedR software
Carl et al. [40]RGB, Multispectral Black locustOBIA with multiresolution segmentation/CNN developed by the authorseCognition Developer/Python 5
Chakraborty et al. [107]RGBUnspecifiedMulti-resolutioneCognition Developer
Chung et al. [95]RGBVarious
dos Santos et al. [38]RGBDipteryx alataFaster R-CNN, YOLOv3, RetinaNetBase codes freely available
Durfee et al. [108]MultispectralJuniper
Fawcett et al. [109]RGBOil palmCentroidal Voronoi tessellationR software
Ganz et al. [77]Lidar, RGBDouglas firWatershed-
Gulci [55]RGBCalabrian pineForestCAS R software
He et al. [43]RGBCunninghamia, pine
Huang et al. [46]RGBVarious
Imangholiloo et al. [78]Hyperspectral, RGBScots pine, Norway spruceWatershedSAGA-GIS 2.3.2
Jayathunga et al. [96]RGBVarious
Krause et al. [44]RGBScots pine
Lendzioch et al. [59]RGBNorway spruceManual-
Li et al. [57]LidarMetasequoia, poplarTop-to-bottom region growing-
Liang et al. [50]LidarScots pine, Norway spruce, birch
Maturbongs et al. [89]RGBDouglas firVariable window filter, Graph-theoretical, WatershedR software, Tree Extraction software, ArcGIS 10.5.1
Nuijten et al. [82]MultispectralVariousMarker-controlled watershedR software
Panagiotidis et al. [36]RGBNorway spruce, Scots pineCanny approach edge detection/Hough transformationMatlab 2017b
Park et al. [65]RGBTropical forestStochastic gradient tree BoostingR software
Puliti et al. [97]RGBNorway spruce, Scots pine
Rissanen et al. [69]RGBVarious
Santini et al. [66]Thermal, Multispectral, RGBAleppo pine
Santini et al. [67]Multispectral Black pine
Schneider et al. [110]LidarBeech, tropical forest
Shashkov et al. [111]RGBPine, birch, spruce, fir
St-Onge and Grandin [99]RGBBlack spruce
Tian et al. [42]RGBMetasequoia
Wang et al. [26]RGBOil palmHOG features and SVM classifier-
Wu et al. [87]Lidar Kew treeWatershed, Polynomial fitting, Individual tree crown segmentation, Point cloud Segmentation-
Xu et al. [31]RGB Saxaul
Xu et al. [93]Hyperspectral Manchurian red pine, larch
Yancho et al. [112]LidarWestern hemlock, Western red-cedar, Douglas firModified top-to-bottom region growingR software
Yilmaz and Gungor [90]RGBVariousPolynomial fitting based methodologyMatlab
Yin and Wang et al. [56]LidarMangroveSeeded region growing, Marker controlled watershedMatlab 2017a
Yurtseven et al. [41]RGBVariousOBIA (unspecified)eCognition
Zhang et al. [58]RGBDahurian larch, white birch, aspenAlgorithm by Liu, 2017-
Zeng et al. [71]LidarCyclocarya paliurus
Apostol et al. [113]RGBNorway spruce, beechMultiresolution segmentation/OBIAeCognition ver. 7
Balkova et al. [80]MultispectralBeech, spruce, firInverse watershed -
Brullhardt et al. [70]RGBVarious
Chen et al. [84]LidarUnspecifiedFully CNN-
Dalla Corte et al. [48]Lidar Camden white gum
D’Odorico et al. [64]MultispectralWhite spruce
Dong et al. [114]LidarVariousGraph-based segmentation-
du Toit et al. [115]LidarDouglas firTop-to-bottom region growing R software
Gil-Docampo et al. [30]MultispectralScots pine, birch, rowan
Gu et al. [83]RGBVariousMarker-controlled watershed -
Hastings et al. [88]LidarVarious5 algorithms available in lidR packageR software
Hu et al. [86]RGBLarchNeural network developed by the authors-
Isibue and Pingel [116]RGBVarious
Jin et al. [100]RGBPines, Japanese larch, Manchurian fir
Jurado et al. [117]RGB, MultispectralOak, pine, eucalyptusK-means clustering algorithm-
Krisanski et al. [53]RGB White peppermint
Kuzelka et al. [118]LidarNorway spruce, Scots pineVoronoi diagram-
Li et al. [119]RGBMongolian pine, larchSuperpixel segmentation, half-Gaussian fitting methodMatlab 2018a
Marzahn et al. [63]Thermal, MultispectralVarious
Moe et al. [101]RGBMonarch birch, castor aralia, Japanese oak
Picos et al. [27]LidarEucalyptus
Vanderwel et al. [47]RGBLodgepole pine, white spruce, trembling aspen
Yan et al. [120]LidarMetasequoia, poplarSelf-adaptive bandwidth mean shiftMatlab 2016a
1 ESRI, Redlands, CA, USA. 2 Minitab LLC, State College, PA, USA. 3 System for Automated Geoscientific Analyses, free software (Böhner et al. 2006), https://sourceforge.net/projects/saga-gis/ (accessed on 16 February 2021). 4 Microsoft, Redmond, WA, USA. 5 Free software, https://www.python.org/ (accessed on 16 February 2021). Abbreviations: CART = Classification and Regression Trees; OBIA = Object-Based Image Analysis; CNN = Convolutional Neural Network; NCut = Normalized Cut; R-CNN = Region-Based Convolutional Neural Network; YOLO = You Only Look Once; HOG = Histogram of Oriented Gradients; SVM = Support Vector Machine. The symbol ● indicates field data gathering in the research.

3.4. Aboveground Biomass/Volume Estimation

Biomass estimation is certainly encompassed in the inventory parameters topic but, considering its importance in research and professional forestry from an economic and ecological point of view, we chose to report the related results in a dedicated section. Moreover, this choice is corroborated by the high number of selected papers (45) (Table 3). All reviewed studies start from one or more remotely sensed basic inventory parameters (i.e., height, DBH, crown diameter) and then estimate the biomass. Most papers refer to biomass as AGB (43%) and volume (31%), in particular stem volume [121,122,123,124,125]. Other authors refer to biomass as carbon stock [126,127,128,129], pruning wood biomass [130], or woody debris volume [131].
Once biophysical parameters such as DBH, height, or canopy diameter have been estimated by UAV, biomass can be calculated mainly in two ways: through allometric equations or through targeted models. Allometric equations provide an efficient way to estimate biomass but require genus- or species-specific equations. Nevertheless, numerous studies use allometric equations from the literature [128,132,133,134,135,136,137,138,139]. The other way biomass can be estimated is through general (multi-species) and often newly developed models. The most popular models among the reviewed studies are simple or multiple linear regression models [124,127,130,140,141,142,143,144,145,146]. It is worth noting that machine learning regression techniques are also exploited, in particular random forest regression [122,126,147,148], support vector regression [122,126,149], k-nearest neighbors [150], and artificial neural network models [126].
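As a minimal illustration of the allometric route, the sketch below applies a generic power-law equation of the form AGB = a(DBH² H)^b to UAV-derived tree attributes; the coefficients and input values are placeholders, not taken from any cited equation.

```python
import numpy as np

# Generic power-law allometry AGB = a * (DBH^2 * H)^b applied to UAV-derived attributes;
# the coefficients and tree values below are placeholders, not a published equation.

def tree_agb(dbh_cm, height_m, a=0.05, b=0.95):
    """Aboveground biomass (kg) of a tree from DBH (cm) and height (m)."""
    return a * (dbh_cm ** 2 * height_m) ** b

dbh = np.array([18.5, 24.0, 31.2, 27.8])       # UAV-estimated DBH (cm)
height = np.array([14.1, 17.6, 21.0, 19.3])    # UAV-estimated height (m)

per_tree = tree_agb(dbh, height)
print(f"Plot AGB: {per_tree.sum():.1f} kg over {len(dbh)} trees")
```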
For retrieving single tree basic features, many studies (46%) utilize segmentation algorithms. As in Section 3.3, the watershed algorithm and its variants are the most used [122,125,130,135,137,145,151,152,153,154].
The hyperspectral sensor is used only by Zou et al. [155]. They collect hyperspectral images to predict the growth rate of a Chinese fir (Cunninghamia lanceolata) plantation through a 176-band off-the-shelf camera (Gaiasky-mini2-VN, Zuolihanguang, Beijing, China).
Overall, 38% of the reviewed studies handle images gathered from other RS platforms. Manned aircraft are always equipped with a lidar sensor, either for establishing a reference dataset [156], for comparing results with respect to UAV [124,127,135,140,142,152], or for constructing reliable DTMs [140,148,157]. Satellite platforms, instead, are exploited to upscale UAV-generated biomass estimates to broader areas. Both optical and radar sensors carried by different platforms are used, such as Sentinel-2, WorldView-2, and Gaofen-2 (optical) and ALOS-2 PALSAR-2, Sentinel-1, and Gaofen-3 SAR (radar) [122,147,149].
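The upscaling idea can be sketched as follows: UAV-derived plot-level AGB serves as the response variable, satellite band values as predictors, and the fitted model is then applied wall-to-wall; all arrays and settings are synthetic placeholders used only for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Sketch of upscaling UAV plot-level AGB with satellite predictors: the UAV estimates
# act as the response, satellite band values as features, and the fitted model is
# applied to every pixel of the wider scene. All arrays are synthetic placeholders.

rng = np.random.default_rng(0)
plot_bands = rng.random((60, 6))          # 60 UAV plots x 6 satellite bands
plot_agb = rng.random(60) * 200           # UAV-estimated AGB per plot (Mg/ha)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(plot_bands, plot_agb)

scene_pixels = rng.random((1000, 6))      # satellite bands for the full scene (flattened)
agb_map = model.predict(scene_pixels)
print(f"Predicted AGB range: {agb_map.min():.0f}-{agb_map.max():.0f} Mg/ha")
```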
All the selected studies, except for a few (seven papers), present field data. Following the selection criterion reported in Section 3.3, five articles are highlighted. Together with the measurement of common tree inventory parameters (DBH, height, crown diameter) [123,129,132,138,158,159], the demanding task of mangrove species recognition is accomplished by Otero et al. [132] and Wang et al. [158]. Moreover, Fernandes et al. [129] take several soil samples for carbon content analysis and measure directly in the field the aboveground fresh weight of the endemic grey willow (Salix atrocinerea), for which no allometric equation was found.
Regarding the handful of noteworthy papers, Hyyppa et al. [121] compute stem volume using lidar in a regular conifer stand with a relative RMSE of 10.1% compared to TLS. The peculiarity is that the data are obtained with an under-canopy flying UAV, manually piloted with the help of video goggles receiving a live video feed from the onboard camera. Puliti et al. [138] assess the growing stock volume through UAV–ALS, asking whether this can be carried out without field data. Using field data only for independent validation, they conclude that the need for fieldwork can be reduced to the acquisition of some data for quality control. Regarding carbon dynamics simulation, Fujimoto et al. [160] develop an end-to-end process based on UAV–SfM and the individual-based model FORMIND 3.2 (free software, http://formind.org/model/ (accessed on 16 February 2021)). They visualize carbon dynamics with a time horizon up to 2100, inferring that the investigated forest will bear a reasonable demand for timber.
Table 3. Reviewed studies for aboveground biomass (AGB)/volume estimation topic: sensor, tree species, biomass parameter, allometric equation/modeling method, segmentation algorithm, other RS platforms use/comparison, and field data gathering.
Research Paper | Sensor | Species | Biomass Parameter | Allometric Equation/Modeling Method | Crown Segmentation Algorithm | Other RS Platform Use/Comparison | Field Data
Alonzo et al. [151]RGBBlack spruce, white spruce, birch, aspenAGBSum of individual crown volumesWatershed, Mean-shift
Giannetti et al. [140]Lidar, MultispectralBeech, silver fir, Norway spruce, Scots pine, birch Growing stock volumeMultivariate linear regression models (3) Aircraft
Jayathunga et al. [141]RGBVariousMerchantable volume, carbon stockGeneralized linear mixed model Helicopter
Lin et al. [161]RGBMinjiang firAGBNew allometric equation (predictor variable: H)Multi-resolution segmentation
Liu et al. [150]Lidar Kew treeAGBPartial least squares, K-nearest neighbors modelsPoint cloud segmentation
Otero et al. [132]RGBMangrovesAGBAllometric equation by Ong et al. (2004)
Pena et al. [133]RGB, MultispectralPoplarDry biomassBiomass = a (DBH)2 × H
Puliti et al. [157]RGBNorway spruce, Scots pine, birchGrowing stock volumeHierarchical model-based inference Satellite (main)
Brede et al. [153]LidarBeech, oak, Norway spruce, firVolumeTreeQSM methodMarker-controlled inverse watershed
Cao et al. [162]Lidar, RGBDawn redwood, poplarAGBExponential predictive models (3)
Domingo et al. [142]MultispectralTropical forestBiomass (generic)Multiple linear regression models (5)
Fujimoto et al. [160]RGBCypress, cedarAGBFORMIND modelForestCAS
Gonzalez-Jaramillo et al. [134]RGB, MultispectralTropical forestAGBAllometric equation by Chave et al. (2005), NDVI-based equation by Das and Singh (2014)
Guerra-Hernandez et al. [152]RGBEucalyptusGrowing stock volumePower function models (2)WatershedAircraft
Jayathunga et al. [148]RGBErman’s birch (dominant), Sakhalin firLiving biomassSimple/multiple linear regression model, Random Forest Aircraft
Li et al. [126]RGBMangrovesAbove ground carbon stockRandom Forest, Support Vector Regression, Artificial Neural Network (Regression models)Multi-resolution segmentation
McClelland et al. [127]Lidar, RGBVariousCarbon contentFunctional and linear model by Zhao et al. 2009 Aircraft
Navarro et al. [149]RGB Red mangroveAGBSupport vector regression models (2)ForestCASSatellite
Ni et al. [156]RGBLarch (dominant), birch, aspenAGBPower function models (3) Aircraft
Ota et al. [163]RGBTeakAGBType 1 Tobit model
Qiu et al. [154]LidarMangrovesAGBIndividual tree-based inference methodMarker-controlled watershedSatellite
Shen et al. [164]RGB, Multispectral Kew treeVolumePartial least squares regression models (2)
Swinfield et al. [128]Lidar, RGB, MultispectralTropical forestAboveground carbon densityAllometric equation by Jucker et al. (2017)
Tian et al. [159]Lidar Euphrates poplarBiomass (generic)Customized allometric model for GLAS lidar
Vaglio Laurin et al. [135]RGBTropical forestAGBAllometric equation by Chave et al. (2014)WatershedAircraft
Wang et al. [139]LidarVariousAGBFinnish species-specific allometric equationsAlgorithm by Wang et al. (2016)
Wang et al. [165]LidarMangrovesAGBG~LiDAR model
Windrim et al. [131]RGB Radiata pineWoody debris volumeV = π (d2/4) L (cylinder volume)Convolutional Neural Network
Xu et al. [146]Lidar, RGBJapanese zelkova, Paper mulberry, Blue sandlewoodVolume, AGBMultivariate linear regression
Zou et al. [155]HyperspectralChinese firBiomass (generic) -
Di Gennaro et al. [130]Multispectral Sweet chestnutPruning wood biomassLinear modelWatershed
d’Oliveira et al. [143]Lidar Rubber tree, Brazil nut AGBMultiple linear regression models (2) Aircraft
Fernandes et al. [129]RGB, MultispectralVariousCarbon stockNon-linear regression model
Hyyppa et al. [121]Lidar Scots pineStem volumeMethod by Hyyppä et al. (2020)
Iizuka et al. [122]RGBJapanese cypressStem volumeRandom forest regression, Support vector regressionWatershedSatellite
Jones et al. [144]RGB Gray mangrove Biomass (generic)Linear regression models
Kotivuori et al. [123]RGBUnspecifiedStem volumeNon-linear regression model Aircraft
Lu et al. [136]Lidar Black locustAGBAllometric equation by the Chinese state forestry administrationPoint cloud segmentation
Navarro et al. [137]RGB Gray mangroveAGBAllometric equations by Owers et al. (2018) and Fu and Wu (2011)Marker-controlled watershed
Puliti et al. [138]LidarScots pine, Norway spruce, birchGrowing stock volumeSpecies-specific allometric modelsAlgorithm by Dalponte and Coomes (2016)
Puliti et al. [122]Lidar, RGB Radiata pineStem volumeMultiple linear regression models Aircraft
Wang et al. [158]LidarMangrovesAGBG∼LiDAR∼S2 model Satellite
Yrttimaa et al. [125]RGBScots PineStem volumeStem taper curveMarker-controlled watershed
Zhou et al. [145]Lidar, RGBLarch, Chinese pineBiomass (generic)Linear models (stepwise regression method)Watershed
Zhu et al. [147]RGBSonneratia apetalaAGBRandom forest regression Satellite
Abbreviations: AGB = Aboveground Biomass; DBH = Diameter at Breast Height; H = Height; NDVI = Normalized Difference Vegetation Index; GLAS = Geoscience Laser Altimeter System. The symbol ● indicates field data gathering in the research.

3.5. Pest and Disease Detection

Pest and disease detection is the least represented topic in this review (17 papers) (Table 4). The researchers dealing with this issue mainly focus on coniferous forests (59%), in particular on various species of pine [166,167,168,169,170,171,172] and on Norway spruce (Picea abies) [173,174,175]. A broad range of pathogenic organisms is investigated, including insects [167,168,169,175,176,177], fungi [172,173,174,178], bacteria [179], and nematodes [170,171]. Other target stressors for plants are a hemiparasitic plant (Amyema miquelii) [180] and herbicide-induced stress [166], while Padua et al. [181] discuss a case study regarding three key pests and pathogens affecting sweet chestnut (Castanea sativa).
UAV imagery products are mostly generated by using RGB sensors, also in combination with hyperspectral, multispectral, and thermal ones (53%). Hyperspectral sensors are used mainly in the wavelength range from visible to near-infrared (450–950 nm) but with a different number of spectral channels and hyperspectral imaging systems. Thus, a prototype FPI hyperspectral camera with 24 bands is used in [175], an off-the-shelf Headwall Nano-Hyperspec (Headwall Photonics Inc., Bolton, MA, USA) in [178], while Zhang et al. [168] compose their hyperspectral system with a UHD 185 imaging spectrometer (Cubert GmbH, Ulm, Baden-Württemberg, Germany).
Other RS platform products are exploited for several purposes within this topic. Regarding aircraft, Smigaj et al. [172] acquire a lidar dataset to calculate CHM and then multiple structural metrics, while Nasi et al. [175] use the same hyperspectral sensor mounted both on a UAV and on a Cessna aircraft to compare images with a different spatial extent. Multispectral imagery from spaceborne sensors is used in combination with UAV products to increase the monitored area for detection of physiological stress by using RapidEye [166] and Landsat 8 [167].
Fieldwork and surveys are mainly aimed at recognizing the presence and severity of stress-induced damage. Moreover, fundamental dendrometric variables, such as DBH and tree height [173,175,180], and weather parameters [172] are surveyed.
Regarding the most interesting research works, the authors have selected the papers by Kloucek et al. [176], Cardil et al. [169], and Smigaj et al. [172]. Kloucek et al. [176] evaluate the possibilities of UAV-mounted RGB and modified near-infrared sensors to detect bark beetle infestation (Ips typographus) at different stages for individual trees. They assess the severity of infestation damage through various vegetation indices and, using a retrospective time series, recognize still-green but already infested trees (the so-called green attack). For quantifying pine processionary moth (Thaumetopoea pityocampa) defoliation, the results by Cardil et al. [169] show the effectiveness of using the NDVI (Normalized Difference Vegetation Index) in mixed forests; moreover, they achieve a high accuracy (81.8%) in automatically classifying pines as non-defoliated, partially defoliated, or completely defoliated. Using a thermal sensor in a Scots pine (Pinus sylvestris) monoculture plantation, Smigaj et al. [172] monitor changes in forest physiological status due to red band needle blight (Dothistroma septosporum). They find a fairly good correlation between canopy temperature depression and disease level (R2 up to 0.41), which may be related to loss of cellular integrity, necrosis, and desiccation.
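A minimal sketch of an NDVI-based defoliation rating in the spirit of Cardil et al. [169] is reported below; the class thresholds are illustrative assumptions and are not the values used in that study.

```python
import numpy as np

# NDVI-based defoliation rating sketch; the class thresholds are illustrative
# assumptions, not the values used by Cardil et al. [169].

def ndvi(nir, red):
    """Per-pixel NDVI from near-infrared and red reflectance arrays."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-9)

def defoliation_class(crown_mean_ndvi):
    """Rate a crown as completely, partially, or non-defoliated from its mean NDVI."""
    if crown_mean_ndvi < 0.3:
        return "completely defoliated"
    if crown_mean_ndvi < 0.6:
        return "partially defoliated"
    return "non-defoliated"

crown_ndvi = ndvi(nir=[0.42], red=[0.18]).mean()   # mean NDVI over one crown (illustrative)
print(defoliation_class(crown_ndvi))
```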
Table 4. Reviewed studies for pest and disease detection topic: sensor, research place, tree species, type of pest/disease/stress, and field data gathering.
Research Paper | Sensor | Research Place | Species | Type of Stressor | Field Data
Brovkina et al. [173]MultispectralCzech RepublicNorway spruceHoney fungus (Armillaria ostoyae)
Dash et al. [166]MultispectralNew Zealand Radiata pineHerbicide-induced stress
Ganthaler et al. [174]RGBAustriaNorway spruceNeedle bladder rust (Chrysomyxa rhododendri)
Maes et al. [180]RGB, ThermalAustraliaGrey box, red ironbarkBox mistletoe (Amyema miquelii)
Nasi et al. [175]HyperspectralFinlandNorway spruceEuropean spruce bark beetle (Ips typographus)
Otsu et al. [167]RGBSpainBlack pine, Scots pinePine processionary moth (Thaumetopoea pityocampa)
Padua et al. [181]MultispectralPortugalSweet chestnutChestnut ink disease (Phytophthora cinnamomi), chestnut blight (Cryphonectria parasitica), oriental chestnut gall wasp (Dryocosmus kuriphilus)
Sandino et al. [178]RGB, HyperspectralAustraliaPaperbark tea treesMyrtle rust (Austropuccinia psidii)
Zhang et al. [168]HyperspectralChinaManchurian red pineChinese pine caterpillar (Dendrolimus tabulaeformis)
Barmpoutis et al. [182]RGBGreeceFirStress (general)
Cardil et al. [169]MultispectralSpainScots pine, holm oakPine processionary moth (Thaumetopoea pityocampa)
Dell et al. [179]RGBMalaysia Red mahoganyBacterial wilt (Ralstonia spp.)
Jung and Park [170]MultispectralKoreaPinePine wilt disease (Bursaphelenchus xylophilus)
Kloucek et al. [176]RGB, MultispectralCzech RepublicNorway spruce, mountain-ash, beech, silver firEuropean spruce bark beetle (Ips typographus)
Lee and Park [171]RGBKoreaPinePine wilt disease (Bursaphelenchus xylophilus)
Safonova et al. [177]RGBRussiaSiberian fir (dominant)Four-eyed fir bark beetle (Polygraphus proximus)
Smigaj et al. [172]ThermalScotlandScots pineRed band needle blight (Dothistroma septosporum)
The symbol ● indicates field data gathering in the research.

3.6. Species Recognition and Invasive Plant Detection

The selected papers (24) and their recognition goals are listed in Table 5. Within this topic, all the UAV applications are carried out in natural/irregular forests and are distributed among all the continents except Africa. There are also two "special" types of study site, namely an arboretum [183,184] and a plant nursery [185]. Moreover, it should be noted that the complete range of UAV-mountable optical sensors, both passive and active, is successfully utilized for species detection goals throughout the selected papers.
A total of nine research works aim to identify invasive plants, especially in places where the biodiversity rate is high, such as in subtropical and tropical forests (Ecuador, Chile, Costa Rica, Malaysia), in mountain zones (China), or in isolated ecosystems (New Zealand). The invasive plants chosen as targets belong to perennial creepers, i.e., lianas and bitter vine (Mikania micrantha) [186,187,188], conifers, i.e., Pinus spp. [189,190], and broad-leaved trees, i.e., Acacia spp. [190,191] and black locust (Robinia pseudoacacia) [192].
Crown segmentation algorithms and object detection methods are widely tested and discussed for species detection. The most used automatic segmentation method is the multiresolution segmentation algorithm [193,194,195,196]. Machine learning techniques are especially exploited to recognize and classify plant species. Among the most frequent are support vector machines [183,188,197,198,199,200], random forest [184,189,191,193,194,201], convolutional neural networks [192,202,203,204], and k-nearest neighbors [184,188,199,205]. In implementing machine learning algorithms, the authors are supported by different software packages, such as eCognition [188,194,198], ENVI (Harris Geospatial Solutions, Boulder, CO, USA) [183,185], and R [191,206].
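As an illustration of this classification workflow, the minimal sketch below trains a random forest on per-crown feature vectors and reports the overall accuracy; the features and species labels are synthetic stand-ins for per-crown spectral/textural statistics and field-identified species.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Sketch of per-crown species classification with a random forest; features and labels
# are synthetic stand-ins for image-derived statistics and field-identified species.

rng = np.random.default_rng(42)
crown_features = rng.random((200, 10))           # 200 crowns x 10 image-derived features
species_labels = rng.integers(0, 4, size=200)    # 4 hypothetical species classes

x_train, x_test, y_train, y_test = train_test_split(
    crown_features, species_labels, test_size=0.3, random_state=42)

clf = RandomForestClassifier(n_estimators=500, random_state=42)
clf.fit(x_train, y_train)
print(f"Overall accuracy: {accuracy_score(y_test, clf.predict(x_test)):.2f}")
```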
Applications of hyperspectral sensors are numerous within this topic (eight papers). The hyperspectral cameras, which are exploited for detecting the spectral features of different tree species, can be classified into two major groups. Many authors use commercial solutions, such as the Cubert UHD 185 [199], the Senop Rikola (Rikola Ltd., Oulu, Finland) [201,204], and two FPI-based models by Senop [196,205]. Conversely, the UAVs flown by Tuominen et al. [184] and Nezami et al. [202] mount FPI-based prototype cameras. All the cited hyperspectral cameras acquire images from the visible to the near-infrared spectral range, except that of Tuominen et al. [184], who also use a short-wave infrared hyperspectral sensor.
In this section, the comparison/use of other RS platforms refers to only a few cases, mainly because of the difficulty of accomplishing the tree species identification task with airborne or spaceborne sensors, which are often associated with a low spatial resolution. Nevertheless, Rivas-Torres et al. [195] use Landsat 8 fusion imagery to obtain the vegetation map of the Galapagos archipelago, while Kattenborn et al. [190] test the upscaling of the UAV-estimated species cover to the spatial scale of Sentinel-1 and Sentinel-2. Aircraft are used to obtain an ALS-derived CHM [201] and to acquire both lidar and multispectral datasets to compare with the same data acquired by UAV [189].
All the studies carrying out field campaigns (13) survey features of the target plant, such as species [183,196,199,200], flower counts [191], crown shape [193], liana load [186], and spectral signals [198]. Additionally, other tree-related parameters are gathered, i.e., height, DBH, health status, and age [189,197,201,205,206].
Among the noteworthy papers, mangrove species discrimination through the NDVI is performed by Yaney-Keller et al. [197]. At 100 cm/pixel resolution, UAV-derived tree metrics are not statistically different from ground measurements, unlike at 10 cm/pixel, indicating lower accuracy as the resolution becomes extremely fine. In very recent research work, Casapia et al. [206] quantify the abundance of non-timber palm species, i.e., they delineate palm crowns using color and textural information and then test four different machine learning techniques, yielding the best results with the random forest classifier (85% overall accuracy). Moving to European forests, de Sa et al. [191] seek to map the invasion of Portuguese coastal areas by an alien species (Sydney golden wattle, Acacia longifolia) through its flowering. The authors generate flower presence/absence maps using a supervised random forest, but the correlation between the number of flowers counted in the field and the area covered by flowering is weak.
Table 5. Reviewed studies for species recognition and invasive plant detection topic: sensor, research place, recognition goal, target species, crown segmentation algorithm/object detection method, and field data gathering.
Research Paper | Sensor | Research Place | Recognition Goal | Target Species | Crown Segmentation/Object Detection | Field Data
Cao et al. [199]HyperspectralChinaClassification of seven mangrove speciesMangrovesBottom-up region-merging/K-nearest neighbor, support vector machine
de Sa et al. [191]MultispectralPortugalMapping an invasive plant through its flowering Sydney golden wattleRandom forest
Franklin and Ahmed [193]MultispectralCanadaClassification of 4 hardwood speciesAspen, white birch, sugar maple, red mapleMulti-resolution/Random forest
Gini et al. [185]MultispectralItalyClassification of 11 species in a plant nurseryVariousMaximum likelihood classifier
Komarek et al. [183]RGB, Multispectral, ThermalCzech RepublicClassification of tree, shrub, and herbaceous species in an arboretumVarious (arboretum)Edge method/Support vector machine
Liu et al. [198]MultispectralChinaDetection of an endangered tree speciesFirmiana danxiaensisSVM/Bottom-up region-merging
Mishra et al. [194]MultispectralNepalMapping tree and shrub species along the Himalayan ecotoneHimalayan fir, bell rhododendronMulti-resolution, spectral difference segmentation/Random forest
Rivas-Torres et al. [195]RGBEcuadorMapping native and invasive vegetation in the Galapagos IslandsVariousMulti-resolution/Fuzzy membership function
Saarinen et al. [205]Hyperspectral, RGBFinlandQuantifying deciduous species richness as a biodiversity indicatorScots pine, Norway spruce, birch, alderWatershed/K-nearest neighbor
Tuominen et al. [184]HyperspectralFinlandRecognition of 26 tree species in an arboretumVarious (arboretum)K-nearest neighbor, random forest
Dash et al. [189]Lidar, MultispectralNew ZealandDetection of invasive exotic conifers in New ZealandScots pine, ponderosa pineRandom forests, logistic regression.
Kattenborn et al. [190]RGBChileMapping woody invasive species in ChileRadiata pine, gorse, Silver wattleMaximum entropy
Sothe et al. [200]Hyperspectral, RGBBrazilClassification of 12 tree species in a subtropical forestSubtropical forestSupport vector machine
Waite et al. [186]RGBMalaysiaAssessing liana infestation in a tropical forestLiana (tropical forest)
Wu et al. [188]RGBChinaMapping an invasive plant in a mountain areaBitter vineSupport vector machine, classification and regression tree, K-nearest neighbor
Yaney-Keller et al. [197]MultispectralCosta RicaNDVI discrimination between the 7 most abundant mangrove speciesMangrovesSupport vector machine
Yuan et al. [187]Thermal, MultispectralCosta RicaDetection of liana-infested areas in a tropical forestLiana (tropical forest)
Casapia et al. [206]RGBPeruIdentifying and quantifying economically important palm tree speciesPalmsRegion growing and merging/Various (4)
Kattenborn et al. [203]RGBNew ZealandMapping (i) tree species in primary forests, (ii) woody plant invasion, and (iii) vegetation successionVariousCNN semantic segmentation
Kentsch et al. [192]RGBJapanDetection of an invasive broadleaf tree in a coniferous coastal forestBlack locust
Miyoshi et al. [204]HyperspectralBrazilIdentifying single-tree species in a highly vegetated area Queen palmCNN
Miyoshi et al. [201]HyperspectralBrazilIdentifying 8 tree species in a highly diverse forestVariousRandom forest
Nezami et al. [202]Hyperspectral, RGBFinlandClassifying 3 major tree species in a boreal forestScots pine, Norway spruce, silver birch.3D convolutional neural networks
Sothe et al. [196]HyperspectralBrazilClassifying 16 tree species in subtropical forest fragmentsVariousMultiresolution region growing/Various (4)
The symbol ● indicates field data gathering in the research.

3.7. Conservation, Restoration, and Fire Monitoring

Conservation, restoration, and fire monitoring is a composite topic that presents multiple research issues, consisting of 24 papers in total (Table 6). Fire-related research accounts for the highest number of scientific works, in particular post-fire monitoring. This activity is conducted mainly for damage assessment with a multispectral sensor [207,208,209,210] and with RGB for planning rehabilitation measures [211]. In only one case [212], a fire prevention activity is carried out with the help of a lidar sensor for the characterization of forest fuels. For conservation targets, UAVs mount an RGB camera for predicting tree-related microhabitats [213], while a multispectral sensor is used in different case studies of plant conservation [214].
Restoration monitoring is a challenging task that can be tackled effectively through UAVs. The core issue is the monitoring of restoration in planted [215,216,217] or natural [218] forests. In addition, mine site revegetation is remotely sensed in planted forests [219], and conifer seedlings are detected along seismic lines in natural woodland [220].
Within this topic, drones are never equipped with a hyperspectral sensor.
Satellite platforms are utilized both for comparing multispectral datasets with respect to UAV data over a burned landscape [207] and for testing different land-use classifiers with SPOT6 [221] and Pleiades [208] imagery. Conversely, UAV-acquired images are used as a validation dataset to assess the effectiveness of satellites in post-fire monitoring [210] and in tracking spring phenology in trees [222]. Moreover, UAV products are examined to compensate for the fraction of vegetation cover computed with Sentinel-2 images [223]. Manned aircraft appear in only a few studies, where they are used to obtain an ALS-derived DTM and tree metrics [217] and to generate a hyperspectral dataset with the NASA Glenn HSI2 sensor [221].
Field measurements are widely used within this topic (14 papers). In addition to common tree characteristics, such as species [208] and health status [218,224,225], pruning height [212] and tree-related microhabitats [213] are also recorded. In land-use case studies, ground truth points for the generation of thematic maps [226,227] and the field reflectance of different land cover types [221] are surveyed.
Regarding the most valuable contributions, Khokthong et al. [224] monitor a biodiversity enrichment experiment by assessing the mortality of interplanted trees in an oil palm (Elaeis guineensis) monoculture. Although further long-term analysis of other control factors is required, the probability of mortality depends on the amount of oil palm canopy cover, which is related, in turn, to the light requirements of the interplanted species. In a Canadian research study [228], both aspen (Populus tremuloides) sucker density and height decrease significantly as the level of skidder traffic intensity increases. Therefore, the authors propose and successfully verify the suitability of winter harvesting to mitigate severe soil compaction. In the scope of fire prevention, Fernandez-Alvarez et al. [212] identify priority areas in the wildland-urban interface by using a UAV-mounted lidar; in this way, parameters such as pruning height and tree spacing are automatically and objectively obtained.
Table 6. Reviewed studies for conservation, restoration, and fire monitoring topic: sensor, main objective, species, and field data gathering.
Research Paper | Sensor | Main Objective | Species | Field Data
Baena et al. [214] | Multispectral | Plant conservation: issues on UAVs use and case studies | Various
Fernandez-Guisuraga et al. [207] | Multispectral | Post-fire monitoring: vegetation survey on burned areas | Maritime pine
Nagai et al. [225] | RGB | Forest disturbance: evaluation of heavy snow damage | Japanese cedar
Roder et al. [218] | RGB | Post-disturbance monitoring: an inventory of natural regeneration and deadwood | Norway spruce
Rossi et al. [208] | Multispectral | Post-fire monitoring: delineation of forest cover after mixed-severity fires | Various
Rupasinghe et al. [221] | RGB | Land use: mapping vegetation of coastal areas | Riparian vegetation
Whiteside et al. [219] | Multispectral | Restoration: monitoring mine site revegetation at scale | Unspecified
Almeida et al. [215] | Lidar | Restoration: quantifying forest changes from a mechanical thinning treatment | Various
Belmonte et al. [216] | RGB | Restoration: assessing the structure of a mixed-species plantation | Ponderosa pine
Berra et al. [222] | Multispectral | Ecosystem monitoring: tracking the temporal dynamics of spring phenology | Various
De Luca et al. [226] | Multispectral | Land use: vegetation layer classification in a structurally complex forest ecosystem | Cork oak
Fernandez-Alvarez et al. [212] | Lidar | Fire prevention: characterizing the forest fuels in a wildland-urban interface | Eucalyptus
Fraser and Congalton [227] | RGB | Land use: assessment of thematic map accuracy | Various
Iizuka et al. [223] | RGB, Multispectral | Land use: vegetation cover changing in a plantation forest | Northern wattle
Khokthong et al. [224] | RGB | Biodiversity enrichment: assessing the mortality of interplanted trees in an oil palm monoculture | Various
Paolinelli Reis et al. [217] | Multispectral | Restoration: assessing land cover as an indicator of management interventions | Eucalyptus
Rossi and Becke [211] | RGB | Post-fire monitoring: planning of adaptive rehabilitation measures | Tropical forest
Sealey and Van Rees [228] | Multispectral | Disturbance mitigation: evaluating suitable practices to lessen soil compaction | Aspen
Shin et al. [209] | Multispectral | Post-fire monitoring: classification of forest burn severity | Korean red pine
Yeom et al. [229] | RGB | Post-fire monitoring: forest fire damage assessment | Unspecified
Frey et al. [213] | RGB | Biodiversity conservation: predicting tree-related microhabitats for the selection of retention elements | European beech, Norway spruce, silver fir
Fromm et al. [220] | RGB | Restoration: detection of conifer seedlings in seismic lines | Various
Padua et al. [210] | RGB, Multispectral | Post-fire monitoring: assessment of the fire severity and multi-temporal analysis | Various
Sealey and Van Rees [230] | RGB | Post-disturbance monitoring: effect of residual slash coverage on forest regeneration | Trembling aspen
The symbol ● indicates field data gathering in the research.

4. Discussion

This section reports the discussions for each topic following the RQs formulated in Section 2.2 as a general guide. Cross issues regarding hyperspectral sensors, comparison with other RS platforms, and gathering and use of field data are argued. Moreover, a comprehensive discussion is dedicated to the application of machine learning techniques. Constraints and opportunities for technology transfer of UAV–RS to a real management context are also debated. Finally, brief considerations are drawn on the costs of different UAV solutions.
Regarding the setting and accuracy of imagery products, researchers focus mainly on forward and lateral overlap, acquisition timing through daytime and seasons, view angle, and ground resolution as an outcome of flight altitude × sensor resolution. It is worth noting that, although intensifying forward overlap can be performed without any additional cost in UAV campaigns [231], enhancing lateral overlap would increase the operational cost because more flight lines are required [232]. In some research works [5,145,161], an oblique angle (off-nadir) is set for image acquisition in association with a low flight altitude; this view from the side rather than from the top (nadir) can be challenging in the analysis when using traditional RS algorithms [233]. Nevertheless, off-nadir acquisition can contribute to the completeness of the image reconstruction, as evidenced by Nesbit [234]. The most representative imagery products are the point cloud, DTM, DEM, and CHM. In particular, 3D point clouds contain primary structural information for calculating forest attributes: by capturing both the ground surface and the tree vegetation, point clouds allow deriving the CHM, which represents tree height and is one of the main sources for estimating other forest attributes. When acquiring point clouds for deriving the CHM, it is recommended to acquire UAV images during the leaf-on season [235] to avoid negative results, such as image aberration. For UAV data acquisition, there are two major structural sources within the reviewed studies, DAP and ALS. The imagery acquired with DAP is then processed using the SfM technique to obtain a 3D point cloud. Sometimes, uniform forest texture, repeating patterns, and potential movement due to wind can be challenging for SfM matching algorithms, potentially leading to incomplete reconstruction or noisy point clouds [236]. Despite this fact, SfM is widely and successfully applied in forest monitoring. Point clouds derived from UAV–DAP and UAV–ALS are substantially denser than those acquired by aircraft-mounted ALS; this higher density results in a more continuous canopy representation and improved detection of canopy tops [45], even if the 3D quality is strongly related to flight parameters and camera settings [237]. DAP has recently become a very popular technique because it can require only a consumer-grade solution, which often includes an off-the-shelf RGB sensor, as shown by the current review [7,12,19]. As a major drawback in comparison to UAV–ALS, DAP is limited to the characterization of the outer canopy envelope [238], while lidar can acquire the vertical profile of vegetation, also operating in under-canopy conditions [121]. Finally, photogrammetry software products are quite numerous on the market and new ones are constantly appearing. They certainly play a fundamental role in UAV product reconstruction, but they present several critical points, such as a high demand for computation time and resources, the poor availability of open-source solutions, and, sometimes, limited user control over the processing workflow.
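As a hedged illustration of the CHM derivation described above, the minimal Python sketch below differences a DSM and a DTM; the file names are hypothetical, and the two rasters are assumed to be co-registered with identical shape and resolution.

```python
import numpy as np
import rasterio

# Minimal sketch with hypothetical file names; assumes the DSM and DTM are
# already co-registered rasters of identical shape and resolution.
with rasterio.open("dsm.tif") as dsm_src, rasterio.open("dtm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float32")
    dtm = dtm_src.read(1).astype("float32")
    profile = dsm_src.profile

chm = np.clip(dsm - dtm, 0, None)  # clamp negative heights to zero
profile.update(dtype="float32", count=1)

with rasterio.open("chm.tif", "w", **profile) as dst:
    dst.write(chm, 1)
```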
UAV technologies have proved to be a very effective and powerful tool in the modern management of forest inventories. Their applications in this area are based on detecting the entire tree or some specific features, such as newly grown leaves, seedlings, stumps, fallen logs, and forest gaps. Nevertheless, it is worth noting that the number of works dedicated to the evaluation of forest regeneration (seedlings) is surprisingly low. This could be considered a shortcoming for forest practitioners, who rely on regeneration status as essential information for management decisions, i.e., for replacing damaged seedlings [239]. In parallel, UAV–RS is fully applicable to obtaining accurate information on practical parameters (height, DBH, and crown diameter). The best-described parameter in vegetation assessment is height. Both individual tree delineation and area-based (stand-level) approaches are successfully used for a direct measure of vegetation height. High accuracy in data assessment is obtained with both main types of sensors, optical and lidar. In some cases, the reviewed papers investigate research-oriented tasks, such as phenology assessment, phenotyping, and canopy light behavior. The number of scientific papers encompassed within this topic is considerable, and their results show a high degree of effectiveness and soundness. This demonstrates that UAV–RS is now a ready-to-use tool applicable in a real management context. Regarding the assessment of basic forest structural parameters, UAV–RS is a quick and reliable technology, and research achievements can be made available to forest practitioners. The goal, however, is not entirely accomplished; there is a need for more effort by academia in technology transfer and in training technicians.
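A common first step in individual tree height measurement is locating tree tops as local maxima of the CHM. The sketch below illustrates this general idea only; the height threshold and window size are illustrative assumptions, not values from any reviewed study.

```python
import numpy as np
from scipy import ndimage

def detect_tree_tops(chm, min_height=2.0, window=5):
    """Return (row, col) indices where the CHM equals its local maximum
    within a window x window neighbourhood and exceeds min_height (m)."""
    local_max = ndimage.maximum_filter(chm, size=window)
    candidates = (chm == local_max) & (chm >= min_height)
    return np.argwhere(candidates)

# Example on a tiny synthetic CHM; thresholds are illustrative, not prescriptive.
chm = np.zeros((50, 50))
chm[20, 20] = 14.5
chm[35, 10] = 9.2
print(detect_tree_tops(chm))
```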
UAV–RS provides an alternative to field surveys for estimating tree biomass over various forest extents. Although field sampling is an accurate method, collecting and weighing samples of vegetation is time-consuming, labor-intensive, and destructive [240]. In this framework, UAVs can bridge the gap between ground observations and traditional manned aircraft or spaceborne platforms [148]. Tree biomass estimation is performed with two major research goals in mind: environmental assessment and the potential profitability of timber. The environmental research goal is pursued in natural forests, where biomass is often estimated as carbon stock [127,128,129]. Conversely, more practical issues are addressed, in various forms, for the estimation of timber biomass, namely, estimating the merchantable volume of mixed forests [140], assessing the AGB of high-value timber and its sustainability over time for logging operations [160], or quantifying pruning biomass as a coproduct for sale in a sweet chestnut orchard [130]. No remote sensing method directly measures biomass. Therefore, researchers use basic variables (height, DBH, crown diameter) to indirectly predict biomass through scaling equations/models, which are themselves developed from a relatively small number of trees that have been measured, harvested, and weighed [20]. Several studies use allometric equations from the literature; these equations relate biomass to measurable biophysical parameters (DBH, height, or canopy area) and represent an effective method for biomass estimation. Nevertheless, their use is strictly associated with tree genus or species, and their development needs calibration with direct biomass information [241]. The other approach for obtaining UAV-derived biomass is represented by general or, often, customized (newly developed) models. These require a formal analysis and are implemented through a fitting method, mainly simple and multiple linear regression or machine learning regression. Moreover, most or all of these models lean on field data for their construction or validation [129,132,138,158], as demonstrated also by the intensive campaigns performed by 82% of the reviewed studies. Despite these critical issues, custom models have the undoubted advantage of being specific to the forested area under investigation. Finally, a possible way to overcome allometric uncertainty could be the estimation of wood volume using very high-density laser scans [242].
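To make the fitting step concrete, the sketch below fits the classical log-log allometric form ln(AGB) = a + b·ln(DBH) + c·ln(H) by ordinary least squares on entirely hypothetical calibration values; in practice, the coefficients must be estimated from destructively sampled trees of the genus or species under study.

```python
import numpy as np

# Hypothetical calibration data (DBH in cm, height in m, AGB in kg);
# real coefficients must be fitted on destructively sampled trees.
dbh = np.array([12.0, 18.5, 25.1, 31.0, 40.2])
height = np.array([9.5, 13.2, 17.8, 21.0, 25.6])
agb = np.array([45.0, 120.0, 310.0, 560.0, 1150.0])

# Ordinary least squares on the log-log form ln(AGB) = a + b*ln(DBH) + c*ln(H).
X = np.column_stack([np.ones_like(dbh), np.log(dbh), np.log(height)])
coeffs, *_ = np.linalg.lstsq(X, np.log(agb), rcond=None)
a, b, c = coeffs

def predict_agb(dbh_cm, height_m):
    """Predict AGB (kg) from UAV-derived DBH and height values."""
    return np.exp(a) * dbh_cm ** b * height_m ** c

print(predict_agb(28.0, 19.0))
```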
Considering that threats to forests are increasing globally, monitoring forest health is of pivotal importance as part of sustainable forest management [243]. Compared with field assessments, UAV–RS may represent a very suitable solution by providing an objective and modulable approach, high spatiotemporal resolution, and quick results [166,237]. Hence, the identification of the causal agents inducing forest health decline becomes imperative. The present review collects papers that mostly deal with biotic stressors, such as insects, fungi, bacteria, and nematodes. In particular, the decay of forest health is mainly due to insects belonging to different orders, such as Coleoptera (European spruce bark beetle and four-eyed fir bark beetle), Lepidoptera (pine processionary moth and Chinese pine caterpillar), and Hymenoptera (oriental chestnut gall wasp). A relevant share of the forested area under investigation is composed of conifer trees, such as Norway spruce and various species of pine. These species are located in different countries (Czech Republic, Austria, Finland, Korea), where forests represent a key environmental and economic asset and where UAV–RS technology has been fairly widespread for a few years. UAV imagery products are mostly generated by using RGB sensors, which can detect reflectance increases in the red and green bands of discolored vegetation stressed by a pest or disease [244]. Being sensitive to a decrease in NIR (Near Infrared) reflectance, multispectral cameras are also used to identify and quantify forest defoliation [245]. Although progress still needs to be made in research, UAVs can allow managers to operate effectively to maintain healthy and productive forests, as demonstrated by the reviewed papers.
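As a simple, hedged illustration of how discoloration can be flagged from RGB imagery, the sketch below computes a green-red vegetation index from a hypothetical orthomosaic (assumed band order: 1 = red, 2 = green, 3 = blue); stressed crowns reflect relatively more in the red band, which lowers the index. The threshold is purely illustrative and site calibration would be required.

```python
import numpy as np
import rasterio

# Hypothetical file name and band order (1 = red, 2 = green, 3 = blue).
with rasterio.open("rgb_orthomosaic.tif") as src:
    red = src.read(1).astype("float32")
    green = src.read(2).astype("float32")

# Green-red vegetation index: lower values suggest discolored vegetation.
grvi = (green - red) / np.clip(green + red, 1e-6, None)
possibly_stressed = grvi < 0.0  # candidate pixels to inspect, not a diagnosis
```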
In the species recognition and invasive plant detection topic, all the research applications are carried out in natural/irregular forests, especially in places where the biodiversity rate is high, such as subtropical and tropical forests and boreal and cold-temperate woodlands. Species detection involves the classification of a few dominant hardwood trees, the recognition of mangrove species and economically important palms, and the discrimination of vegetation types (herbaceous, shrub, and tree). In some cases, researchers perform a proof of concept by detecting more than 10 tree species in a plant nursery [185] and an arboretum [184]. Invasive plant identification is also a research issue of great interest: isolated and highly biodiverse ecosystems are threatened by perennial creepers, conifers, and broad-leaved trees. Regarding future research, species recognition and invasive plant detection will benefit from the analysis of streamed imagery. Real-time processing will enable timely detection and, in the near future, the eradication of invasive plants could be performed together with the identification step [246] (i.e., through spraying drones). Finally, monitoring species richness and the presence of invasive species over time can give information about forest resilience, also after disturbance events, for predicting forest status and its capacity to recover [247].
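Because random forest is among the most frequently used classifiers in this topic, the following sketch shows the general training and accuracy-assessment pattern on placeholder data (randomly generated per-crown features and species labels); it does not reproduce the workflow of any specific reviewed study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder per-crown feature table (e.g., mean band values, texture, height
# statistics) and species labels; real features come from segmented crowns.
rng = np.random.default_rng(0)
X = rng.random((200, 6))
y = rng.integers(0, 4, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
print("Overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```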
Conservation, restoration, and fire monitoring is a composite topic encompassing a constellation of scientific issues. Restoration monitoring is performed for mine site revegetation and the evaluation of mechanical thinning. For seedling detection, drone imagery represents an unrivaled tool by providing a ground sampling distance that often reaches the sub-centimeter level [239]. Land use is addressed through thematic map creation or as an indicator of conservation interventions, while studies on biodiversity address plant conservation, species enrichment, and the prediction of tree-related microhabitats. The impacts of logging activities are evaluated in postharvest areas by analyzing the effect of residual slash coverage on forest regeneration and the suitable practices to lessen soil compaction. Wildfires are an increasing global concern, especially because of global warming and changes in land use [248]. UAV technologies are employed in only one of the reviewed studies for a crucial activity such as fire prevention. In contrast, post-fire monitoring is addressed by various studies, especially for planning adaptive rehabilitation measures and for classifying forest burn severity. Since the reflectance of a burned area has higher visible and SWIR (Short-Wave Infrared) values and lower NIR values in comparison to an untouched forest [249], burned vegetation can be determined by classification of a single-date post-fire image. However, for large and heterogeneous areas, an approach based on the temporal comparison of thermal anomalies is considered more reliable [250]. Moreover, post-fire monitoring is essential for preventing the danger of landslides or other secondary disasters [209]. To a broader extent, understanding forest dynamics and drivers of change is pivotal for preservation in a context of rapid change [250]; UAV–RS can regularly acquire up-to-date and affordable data for all the above-mentioned forest conservation purposes. This confirms the findings of Eugenio et al. [251], who state that the use of UAV for ecosystem conservation has gained prominence in different directions.
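As a simplified, hedged example of multi-temporal burned-area mapping, the sketch below thresholds the drop in NDVI between pre- and post-fire multispectral orthomosaics; the file names and band layout (band 3 = red, band 4 = NIR) are assumptions, and severity classes would require field calibration.

```python
import numpy as np
import rasterio

def ndvi(path, red_band=3, nir_band=4):
    """NDVI from a multispectral orthomosaic (hypothetical band layout)."""
    with rasterio.open(path) as src:
        red = src.read(red_band).astype("float32")
        nir = src.read(nir_band).astype("float32")
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Pre/post comparison: a marked drop in NDVI flags candidate burned pixels.
d_ndvi = ndvi("pre_fire.tif") - ndvi("post_fire.tif")
burned = d_ndvi > 0.3  # illustrative threshold; severity classes need field calibration
```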
The specific combinations of wavelengths identified through hyperspectral sensors could be used to improve forest inventory and health information and to increase information about biodiversity and natural disturbances. The present review includes 19 research papers in total that acquire UAV imagery with a hyperspectral sensor. Camera technologies vary from off-the-shelf packages to newly developed commercial products not yet on the market (used for testing) and customized/prototype solutions. The most used imaging approach is the line scan (i.e., the so-called pushbroom). Species detection (woodland tree/invasive plant) makes the widest use of hyperspectral sensors (42% of the reviewed studies within this topic). A performance assessment of vegetation indices retrieved through the hyperspectral sensor and a correct waveband region selection are key factors for identifying different species [252]. Moreover, the wavebands selected for classification are influenced by the taxonomic and structural features of the plant and by the methods and scale at which hyperspectral imagery is acquired [253]. There is a fast-growing trend in hyperspectral research in terms of technological developments (data collection, data resolution, spatial coverage, and processing) and in the application domain, where multidisciplinary approaches have emerged, including forestry applications [254]. Nevertheless, hyperspectral sensors are not yet widely available, especially in many developing countries, because of the high market prices of commercial packages. This deprives forest managers of a potential monitoring tool whose characteristics could be used to promptly block disease outbreaks or the diffusion of invasive species in tropical and subtropical woodlands.
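One simple way to explore waveband selection, sketched below under stated assumptions (synthetic per-crown spectra and species codes), is to rank hyperspectral bands by random forest feature importance; the reviewed studies generally use more rigorous selection schemes, so this is only an illustration of the idea.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic per-crown mean spectra (120 hypothetical wavebands) and species codes.
rng = np.random.default_rng(0)
spectra = rng.random((300, 120))
labels = rng.integers(0, 5, size=300)

# Rank wavebands by their contribution to species discrimination.
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(spectra, labels)
top_bands = np.argsort(rf.feature_importances_)[::-1][:10]
print("Most informative band indices:", top_bands)
```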
The comparison between UAV and other RS platforms is crucial for understanding how different technologies are used and for selecting the most suitable solution for research and management purposes. Spaceborne platforms are mainly employed for upscaling results derived from UAV to broader areas, as a source of complementary data, or in multi-temporal forest monitoring. In a few cases, satellite images are used as the main data source. Manned aircraft, often equipped with lidar, are used for providing reference datasets or for comparing images with a different spatial extent. It is worth mentioning that a new generation of high spatial resolution satellites with a revisit time of less than a week is coming to the fore [255], together with new sources of open satellite imagery, such as Landsat and Sentinel-2 [256]. Despite all of these developments, UAV flexibility makes them more suited than satellites to acquiring local imagery promptly at a very high spatial resolution [257]. UAVs also represent a growing alternative to manned aircraft, which, for a typical forest extent, are even more expensive than crew-based field campaigns [127]. As early as 2015, Tang and Shao [1] reasonably anticipated that drone remote sensing would overtake manned aircraft in the near future. Moreover, small unmanned aerial platforms have the advantage of being able to fly in response to specific events (post-fire monitoring, disease outbreaks) and can also fly under a cloudy sky [258]. Above all, however, it should be highlighted that, especially in professional forestry, the choice of RS platform must be guided by the specific purposes of the activity, also considering the technical skills and resources required for data management and processing.
More than 60% of the studies across the entire dataset collect ground data. Field campaigns are carried out mainly to gather basic inventory parameters (DBH, height, crown diameter), tree species, or health status. In several studies, ground control points are also acquired through GPS (Global Positioning System) for the geometric calibration of UAV-derived imagery products. Ground measurements of woodland features consume substantial resources in terms of both time and cost at any spatial scale. Moreover, in-field surveys are de facto impossible over large spatial and temporal scales [259]. Nevertheless, field data gathering still plays a crucial role in forest monitoring and management. Field data are often labeled as "ancillary data", but this definition seems somewhat reductive, considering their importance. Among the reviewed studies, they are indeed collected and used for dataset validation, accuracy assessment, result comparison, and regression analysis in biomass estimation. A few studies [123,138] try to avoid field data but are nevertheless obliged to rely on previous surveys or to collect new measurements for comparing results or validating their workflow.
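A typical use of field data as validation is the computation of error statistics between UAV-derived and ground-measured values; the short sketch below does this for tree height with hypothetical paired measurements.

```python
import numpy as np

# Hypothetical paired values: UAV-derived vs. field-measured tree heights (m).
uav_h = np.array([14.2, 18.9, 21.3, 12.7, 25.4])
field_h = np.array([13.8, 19.5, 20.6, 13.1, 26.0])

rmse = float(np.sqrt(np.mean((uav_h - field_h) ** 2)))
bias = float(np.mean(uav_h - field_h))
print(f"RMSE = {rmse:.2f} m, bias = {bias:.2f} m")
```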
The implementation of automatic processes is crucial to extract information and provide forest inventories from remote sensing data [260]. Object-based image analysis (OBIA) is usually composed of two main phases, namely, (1) image segmentation and (2) feature extraction and classification [261]. Crown segmentation algorithms and object detection methods are intensively exploited across the inventory parameters, biomass estimation, and species recognition topics to automatically detect, segment, and classify trees. Image segmentation is a critical step in the OBIA process. Most of the approaches for individual tree crown delineation interpret the CHM derived from photogrammetric or lidar 3D point clouds by applying different algorithms, such as the watershed algorithm and its variants, the multiresolution segmentation algorithm, and existing or newly developed convolutional neural network architectures. Machine learning offers great potential for the efficient processing of remote sensing data, and it is often used for the final object classification in OBIA. These techniques are especially exploited to recognize and classify plant species; among the most frequent are support vector machines, random forest, and convolutional neural networks. Automatic tree detection is a complex process in imagery processing. Different types and numerous objects can be identified in UAV images; therefore, robust algorithms are required to correctly detect the targeted objects [262]. When performing crown segmentation and machine learning, authors are supported by different software packages, both proprietary applications (i.e., eCognition, Matlab, and ENVI) and open-source applications (i.e., R, especially with the lidR and rLiDAR packages). Through built-in algorithms, they can identify objects from UAV imagery by exploiting spatial, spectral, and textural forest characteristics and thus extract specific tree features [263]. Machine learning algorithms are efficient tools and ensure standardization of the process. Their results are not necessarily more correct than those achieved through visual interpretation. However, they can automatically handle massive workloads, thus allowing monitoring to be extended over large forested areas; this is one of the undoubted advantages that has boosted their use in UAV forest monitoring. Nevertheless, the implementation and use of automatic processes for image analysis have some weak points. One of the major obstacles is the perceived lack of interpretability of these methods, which are often considered to be black-box models [264]. The full replicability of the results obtained with such techniques is made difficult by complex workflows (sometimes even for researchers) and by proprietary software that limits the ability to understand some processing outcomes [21]. This weak point strongly hinders the technology transfer of automatic image analysis and limits professional users to the mere acceptance of the results rather than the understanding of the entire processing workflow.
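To make the segmentation phase of OBIA tangible, the sketch below implements a marker-controlled watershed on a CHM array, with detected tree tops as markers; the resulting crown segments would then be summarized into features for a classifier such as the one sketched earlier. The thresholds and window sizes are illustrative assumptions only.

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_crowns(chm, min_height=2.0, min_distance=4):
    """Marker-controlled watershed: each detected tree top seeds one crown segment."""
    mask = chm > min_height
    tops = peak_local_max(chm, min_distance=min_distance, threshold_abs=min_height)
    markers = np.zeros(chm.shape, dtype=int)
    markers[tuple(tops.T)] = np.arange(1, len(tops) + 1)
    # Invert the CHM so crowns become catchment basins around each tree top.
    return watershed(-chm, markers=markers, mask=mask)
```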
Hereafter, this discussion section addresses two general aspects of UAV–RS forestry research: technology transfer and operational costs.
Research in forestry remote sensing often has an applied perspective, as demonstrated by the numerous reviewed papers (39%) containing words related to management in their abstracts. Tackling the challenges of knowledge transfer from scientists to managers is a key issue for the development of sustainable forestry. Hence, it is of pivotal importance to focus on how remote sensing advances can be converted into valuable and available tools for forest managers, who are often doubtful about the operational potential of RS techniques [265]. The objectives of technology transfer are often quite difficult to define by mutual consent of researchers and managers. To obtain the best achievements, there is a need for an explicit space for collaboration, and sometimes the work routine of managers must be challenged in order to seek adapted solutions [266]. This process could be assisted by a technology transfer professional and a pool of scientists with different expertise to avoid narrow solutions. In the case of UAV platforms, researchers can be considered both RS data producers and RS tool developers. As data producers, they can provide affordable and easy-to-use datasets; as developers, they can implement tailored tools using raw data and interacting closely with forest managers [267]. Therefore, after the identification of the objectives, researchers bear most of the burden of developing new solutions to make RS more accessible to professionals. The final step in conveying RS techniques is the evaluation by final users. Tailored tools must be easy to use and integrated into existing workflows [265]. Moreover, feedback from practitioners on real use and obtained results is also fundamental for continuously improving the proposed solutions. Having said that, what are the concrete actions to improve the technology transfer process? Certainly, one of the pillars on which technology transfer must be based is training. It can be carried out through remote classes combined with in-person visits by specialists to forestry companies. Secondly, the tailored processing routines developed by researchers should be embedded into flexible and interoperable open-source toolboxes. These tools could be a foundation that forest stakeholders could adapt to specific projects [267]. Thirdly, it is necessary to facilitate access to RS data to broaden the user base, also by implementing open and user-friendly environments to handle data [268]. This issue particularly concerns forest management in developing countries, considering that expertise in RS data acquisition and analysis is mainly held by stakeholders in wealthier countries [269]. Part I of the present review [3] reported how UAV–RS research studies are scarce in places where the forest heritage is enormous (especially in Africa and South America); technology transfer is essential, but, in these cases, its success does not depend only on researchers' efforts but involves all forestry stakeholders, especially public bodies.
UAV platform and sensor technologies are constantly evolving; moreover, forest monitoring can be considered a niche sector compared to other UAV civilian applications. Therefore, a thorough cost analysis of UAV for forest applications is not easy to complete, nor is it one of the main objectives of this study. However, it can be stated that simple remote sensing approaches can sometimes be deployed with only a minor investment [267]. Nevertheless, the fixed costs are often relatively high, and implementing a remote sensing approach can be prohibitive for forest stakeholders. Global costs include designing the operational plan, deploying the platform, and acquiring processing software. Not least, performing certain analyses requires technical skills, and therefore, trained external staff or training courses for in-house personnel are required. Below, some examples of budget assessments for UAV in forestry applications are reported. Padua et al. [270] estimate the costs of UAV systems in small areas (up to 50 ha) as follows: EUR 10,000 for disease detection and identification (multispectral), EUR 3000 for vegetation height maps (optical), and EUR 2000 for biomass estimation (optical). For broader areas (up to 5 km2), the estimated costs are EUR 30,000 and EUR 25,000 for forest inventory (lidar) and post-fire burn area estimation (multispectral), respectively. In the Canadian boreal forest, Chen et al. [271] measure vegetation height for 30 sites on seismic lines (overall extension of ca. 3 ha). They estimate a total cost, including the field crew, of about EUR 9300 for UAV–ALS and EUR 6800 for UAV–DAP. A possible solution for reducing the costs of UAV–RS could be represented by high-performing and fully free image processing software, which is increasingly appearing on the market, or by customized full packages directly usable by final users. This last option, even if more suitable, seems more difficult to pursue due to the different missions of the stakeholders involved in forest management. Moreover, the narrow UAV-forestry market provides relatively limited opportunities for companies to develop tools adapted to the specific goals of woodland monitoring. This is why the authors hope for greater collaboration among companies, researchers, and forestry stakeholders to implement new solutions that are increasingly suited to the needs of forest management.

5. Conclusions

The present study analyzes a substantial body of literature by gathering a comprehensive dataset (227 articles) dealing with UAV forest remote sensing over a recent timespan (2018–mid-2020). This Part II of the review discusses specific technical issues of applying UAV–RS research in different forest ecosystems. As reported also in Part I [3], the final assessment presents many positives, but a few weak points also emerge.
UAV–RS contributes to a better understanding of forest ecosystems by allowing insights into forest status and dynamics. Regarding strong points, UAV data acquisition is based mainly on two consolidated structural sources, i.e., DAP and ALS. The SfM processing technique is widely exploited and investigated, with positive results, in the reviewed papers. UAV–RS is fully applicable to obtaining accurate information on practical parameters (height, DBH, and crown diameter), with a considerable number of studies dealing with this topic. Their effectiveness and soundness demonstrate that UAV–RS is now ready to be applied in a real management context. Researchers have made considerable efforts to estimate tree biomass, especially through custom allometric models, which have the undoubted advantage of being specific to the forested area under investigation.
Nevertheless, challenges still exist regarding both purely technical issues (real-time image processing, the spread of hyperspectral sensors, RS platform interoperability) and general issues (i.e., flight regulatory regimes, technology transfer), and they can hinder UAV–RS progress. Therefore, improvable and unclear issues require additional research. To help managers maintain healthy and productive forests, the number of articles tackling pest and disease detection should be greatly increased. Real-time species recognition would be desirable for in-flight detection and the timely eradication of invasive plants, by merging the image acquisition and processing steps. Even though huge efforts are being made in this direction, great potential remains to be explored for hyperspectral sensors. Hyperspectral imagery is still too little used. Novel applications for species recognition or pest and disease detection should be based on the capability of acquiring the spectral signature, which is linked with tree characteristics and status. Greater interoperability would be desirable between UAV and freely available satellite data, even if the latter are not always easy to access for professionals. Spaceborne imagery could be used as a screening tool to pinpoint forest anomalies and then, where problems or peculiarities arise, UAVs could be deployed to perform in-depth monitoring. Although field data are widely gathered throughout the reviewed papers, they should be leveraged more for model validation, especially when using UAV to predict vegetation biomass, as this guarantees robust and transferable results. The use of automatic processes for image analysis is certainly destined to grow and represents a leap forward for UAV–RS forest monitoring; however, these processes sometimes appear to be black boxes due to the poor flexibility of proprietary software and the complexity of the workflows. These aspects can be very challenging and can hinder technology transfer and thus the full understanding (and conscious use) by non-specialist users, such as forest managers and professionals.
While the technical skills of researchers are appropriate for addressing the complexity of forest UAV–RS, there is still a lot that can be accomplished to bridge the gap between academia and forest stakeholders. To improve technology transfer, researchers should develop approaches that are robust to slightly different contexts and that take advantage of easy-to-collect data. Additionally, processing routines tailored to stakeholder needs should be embedded into flexible and interoperable open-source tools. To boost technology transfer, there is also a need for funding through public and dedicated resources. Moreover, cooperation projects could help spread RS monitoring in areas where it is lacking and where its use could be strongly recommended to help manage and protect a huge forest heritage. Of course, the burden of technology transfer cannot be entirely borne by researchers, and for this reason, the authors hope for an effective collaboration among all forestry stakeholders to develop new solutions increasingly tailored to forest management.
In completing this review, the authors came across only a few studies (not encompassed here) dealing with the use of UAVs as a tool for forestry operations. These are limited to aerial seeding [272,273] and to pesticide spraying [274], which, for the sake of completeness, is not yet allowed over large areas within the European Union. Nevertheless, the rapid advancement of UAV–RS seems to be unstoppable, and developing unmanned operating platforms could be one of the new frontiers of UAV research in forestry applications.

Author Contributions

Conceptualization, R.D., P.T., S.F.D.G., and A.M.; methodology, R.D., P.T., S.F.D.G., and A.M.; software, R.D.; formal analysis, R.D.; investigation, R.D.; data curation, R.D.; writing—original draft preparation, R.D.; writing—review and editing, R.D., P.T., S.F.D.G., and A.M.; visualization, R.D.; supervision, P.T., S.F.D.G., and A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available in the research articles cited throughout the text or summarized in Table 1 of Part I [3].

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Tang, L.; Shao, G. Drone Remote Sensing for Forestry Research and Practices. J. For. Res. 2015, 26, 791–797. [Google Scholar] [CrossRef]
  2. Ruiz, M.; Santos, L.; Van der Elstraeten, A. Drones for Community Monitoring of Forests. New Technologies for Self-Management of Indigenous Territories in Panama; FAO: Panama City, Panama, 2018. [Google Scholar]
  3. Dainelli, R.; Toscano, P.; Di Gennaro, S.F.; Matese, A. Recent advances in Unmanned Aerial Vehicles forest remote sensing–A systematic review. Part I: A general framework. Forests 2021, 12, 327. [Google Scholar] [CrossRef]
  4. Aguilar, F.J.; Rivas, J.R.; Nemmaoui, A.; Peñalver, A.; Aguilar, M.A. UAV-based digital terrain model generation under leaf-off conditions to support teak plantations inventories in tropical dry forests. A case of the coastal region of Ecuador. Sensors 2019, 19, 1934. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Díaz, G.M.; Mohr-Bell, D.; Garrett, M.; Muñoz, L.; Lencinas, J.D. Customizing unmanned aircraft systems to reduce forest inventory costs: Can oblique images substantially improve the 3D reconstruction of the canopy? Int. J. Remote Sens. 2020, 41, 3480–3510. [Google Scholar] [CrossRef]
  6. Guan, H.; Zhang, J.; Ma, Q.; Liu, M.; Wu, F.; Guo, Q.; Su, Y.; Hu, T.; Wang, R.; Ma, Q.; et al. A Novel Framework to Automatically Fuse Multiplatform LiDAR Data in Forest Environments Based on Tree Locations. IEEE Trans. Geosci. Remote Sens. 2020, 58, 2165–2177. [Google Scholar] [CrossRef]
  7. Frey, J.; Kovach, K.; Stemmler, S.; Koch, B. UAV photogrammetry of forests as a vulnerable process. A sensitivity analysis for a structure from motion RGB-image pipeline. Remote Sens. 2018, 10, 912. [Google Scholar] [CrossRef] [Green Version]
  8. Wallace, L.; Bellman, C.; Hally, B.; Hernandez, J.; Jones, S.; Hillman, S. Assessing the ability of image based point clouds captured from a UAV to measure the terrain in the presence of canopy cover. Forests 2019, 10, 284. [Google Scholar] [CrossRef] [Green Version]
  9. Jurjević, L.; Gašparović, M.; Milas, A.S.; Balenović, I. Impact of UAS image orientation on accuracy of forest inventory attributes. Remote Sens. 2020, 12, 404. [Google Scholar] [CrossRef] [Green Version]
  10. Goodbody, T.R.H.; Coops, N.C.; Hermosilla, T.; Tompalski, P.; Pelletier, G. Vegetation phenology driving error variation in digital aerial photogrammetrically derived Terrain Models. Remote Sens. 2018, 10, 1554. [Google Scholar] [CrossRef] [Green Version]
  11. Hakala, T.; Markelin, L.; Honkavaara, E.; Scott, B.; Theocharous, T.; Nevalainen, O.; Näsi, R.; Suomalainen, J.; Viljanen, N.; Greenwell, C.; et al. Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization. Sensors 2018, 18, 1417. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  12. Jayathunga, S.; Owari, T.; Tsuyuki, S. Evaluating the performance of photogrammetric products using fixed-wing UAV imagery over a mixed conifer-broadleaf forest: Comparison with airborne laser scanning. Remote Sens. 2018, 10, 187. [Google Scholar] [CrossRef] [Green Version]
  13. Ni, W.; Sun, G.; Pang, Y.; Zhang, Z.; Liu, J.; Yang, A.; Wang, Y.; Zhang, D. Mapping Three-Dimensional Structures of Forest Canopy Using UAV Stereo Imagery: Evaluating Impacts of Forward Overlaps and Image Resolutions with LiDAR Data as Reference. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 3578–3589. [Google Scholar] [CrossRef]
  14. Ruwaimana, M.; Satyanarayana, B.; Otero, V.; Muslim, A.M.; Muhammad Syafiq, A.; Ibrahim, S.; Raymaekers, D.; Koedam, N.; Dahdouh-Guebas, F. The advantages of using drones over space-borne imagery in the mapping of mangrove forests. PLoS ONE 2018, 13, 1–22. [Google Scholar] [CrossRef] [Green Version]
  15. Oliveira, R.A.; Tommaselli, A.M.G.; Honkavaara, E. Generating a hyperspectral digital surface model using a hyperspectral 2D frame camera. ISPRS J. Photogramm. Remote Sens. 2019, 147, 345–360. [Google Scholar] [CrossRef]
  16. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of drone altitude, image overlap, and optical sensor resolution on multi-view reconstruction of forest images. Remote Sens. 2019, 11, 1252. [Google Scholar] [CrossRef] [Green Version]
  17. Tomaštík, J.; Mokroš, M.; Surový, P.; Grznárová, A.; Merganič, J. UAV RTK/PPK method-An optimal solution for mapping inaccessible forested areas? Remote Sens. 2019, 11, 721. [Google Scholar] [CrossRef] [Green Version]
  18. Fraser, B.T.; Congalton, R.G. Issues in Unmanned Aerial Systems (UAS) data collection of complex forest environments. Remote Sens. 2018, 10, 908. [Google Scholar] [CrossRef] [Green Version]
  19. Brach, M.; Chan, J.C.W.; Szymański, P. Accuracy assessment of different photogrammetric software for processing data from low-cost UAV platforms in forest conditions. IForest 2019, 12, 435–441. [Google Scholar] [CrossRef]
  20. Kellner, J.R.; Armston, J.; Birrer, M.; Cushman, K.C.; Duncanson, L.; Eck, C.; Falleger, C.; Imbach, B.; Král, K.; Krůček, M.; et al. New Opportunities for Forest Remote Sensing Through Ultra-High-Density Drone Lidar. Surv. Geophys. 2019, 40, 959–977. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  21. Fletcher, A.; Mather, R. Hypertemporal imaging capability of uas improves photogrammetric tree canopy models. Remote Sens. 2020, 12, 1238. [Google Scholar] [CrossRef] [Green Version]
  22. Graham, A.; Coops, N.C.; Wilcox, M.; Plowright, A. Evaluation of ground surface models derived from unmanned aerial systems with digital aerial photogrammetry in a disturbed conifer forest. Remote Sens. 2019, 11, 84. [Google Scholar] [CrossRef] [Green Version]
  23. Yu, R.; Lyu, M.; Lu, J.; Yang, Y.; Shen, G.; Li, F. Spatial coordinates correction based on multi-sensor low-altitude remote sensing image registration for monitoring forest dynamics. IEEE Access 2020, 8, 18483–18496. [Google Scholar] [CrossRef]
  24. Graham, A.N.V.; Coops, N.C.; Tompalski, P.; Plowright, A.; Wilcox, M. Effect of ground surface interpolation methods on the accuracy of forest attribute modelling using unmanned aerial systems-based digital aerial photogrammetry. Int. J. Remote Sens. 2020, 41, 3287–3306. [Google Scholar] [CrossRef]
  25. Polewski, P.; Yao, W.; Cao, L.; Gao, S. Marker-free coregistration of UAV and backpack LiDAR point clouds in forested areas. ISPRS J. Photogramm. Remote Sens. 2019, 147, 307–318. [Google Scholar] [CrossRef]
  26. Wang, Y.; Zhu, X.; Wu, B. Automatic detection of individual oil palm trees from UAV images using HOG features and an SVM classifier. Int. J. Remote Sens. 2019, 40, 7356–7370. [Google Scholar] [CrossRef]
  27. Picos, J.; Bastos, G.; Míguez, D.; Alonso, L.; Armesto, J. Individual tree detection in a eucalyptus plantation using unmanned aerial vehicle (UAV)-LiDAR. Remote Sens. 2020, 12, 885. [Google Scholar] [CrossRef] [Green Version]
  28. Demir, N. Using UAVs for detection of trees from digital surface models. J. For. Res. 2018, 29, 813–821. [Google Scholar] [CrossRef]
  29. Alexander, C.; Korstjens, A.H.; Hankinson, E.; Usher, G.; Harrison, N.; Nowak, M.G.; Abdullah, A.; Wich, S.A.; Hill, R.A. Locating emergent trees in a tropical rainforest using data from an Unmanned Aerial Vehicle (UAV). Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 86–90. [Google Scholar] [CrossRef]
  30. Gil-Docampo, M.L.; Ortiz-Sanz, J.; Martínez-Rodríguez, S.; Marcos-Robles, J.L.; Arza-García, M.; Sánchez-Sastre, L.F. Plant survival monitoring with UAVs and multispectral data in difficult access afforested areas. Geocarto Int. 2020, 35, 128–140. [Google Scholar] [CrossRef]
  31. Xu, J.; Gu, H.; Meng, Q.; Cheng, J.; Liu, Y.; Jiang, P.; Sheng, J.; Deng, J.; Bai, X. Spatial pattern analysis of Haloxylon ammodendron using UAV imagery-A case study in the Gurbantunggut Desert. Int. J. Appl. Earth Obs. Geoinf. 2019, 83, 101891. [Google Scholar] [CrossRef]
  32. Lin, C.; Chen, S.Y.; Chen, C.C.; Tai, C.H. Detecting newly grown tree leaves from unmanned-aerial-vehicle images using hyperspectral target detection techniques. ISPRS J. Photogramm. Remote Sens. 2018, 142, 174–189. [Google Scholar] [CrossRef]
  33. Chen, S.Y.; Lin, C.; Tai, C.H.; Chuang, S.J. Adaptive window-based constrained energy minimization for detection of newly grown tree leaves. Remote Sens. 2018, 10, 96. [Google Scholar] [CrossRef] [Green Version]
  34. Feduck, C.; McDermid, G.J.; Castilla, G. Detection of coniferous seedlings in UAV imagery. Forests 2018, 9, 432. [Google Scholar] [CrossRef] [Green Version]
  35. Puliti, S.; Talbot, B.; Astrup, R. Tree-stump detection, segmentation, classification, and measurement using Unmanned aerial vehicle (UAV) imagery. Forests 2018, 9, 102. [Google Scholar] [CrossRef] [Green Version]
  36. Panagiotidis, D.; Abdollahnejad, A.; Surový, P.; Kuželka, K. Detection of fallen logs from high-resolution UAV images. N. Z. J. For. Sci. 2019, 49. [Google Scholar] [CrossRef]
  37. Bagaram, M.B.; Giuliarelli, D.; Chirici, G.; Giannetti, F.; Barbati, A. UAV remote sensing for biodiversity monitoring: Are forest canopy gaps good covariates? Remote Sens. 2018, 10, 1397. [Google Scholar] [CrossRef]
  38. dos Santos, A.A.; Marcato Junior, J.; Araújo, M.S.; Di Martini, D.R.; Tetila, E.C.; Siqueira, H.L.; Aoki, C.; Eltner, A.; Matsubara, E.T.; Pistori, H.; et al. Assessment of CNN-based methods for individual tree detection on images captured by RGB cameras attached to UAVS. Sensors 2019, 19, 3595. [Google Scholar] [CrossRef] [Green Version]
  39. Goodbody, T.R.H.; Coops, N.C.; Hermosilla, T.; Tompalski, P.; Crawford, P. Assessing the status of forest regeneration using digital aerial photogrammetry and unmanned aerial systems. Int. J. Remote Sens. 2018, 39, 5246–5264. [Google Scholar] [CrossRef]
  40. Carl, C.; Lehmann, J.R.K.; Landgraf, D.; Pretzsch, H. Robinia pseudoacacia L. in short rotation coppice: Seed and stump shoot reproduction as well as UAS-based spreading analysis. Forests 2019, 10, 235. [Google Scholar] [CrossRef] [Green Version]
  41. Yurtseven, H.; Akgul, M.; Coban, S.; Gulci, S. Determination and accuracy analysis of individual tree crown parameters using UAV based imagery and OBIA techniques. Meas. J. Int. Meas. Confed. 2019, 145, 651–664. [Google Scholar] [CrossRef]
  42. Tian, J.; Dai, T.; Li, H.; Liao, C.; Teng, W.; Hu, Q.; Ma, W.; Xu, Y. A novel tree height extraction approach for individual trees by combining TLS and UAV image-based point cloud integration. Forests 2019, 10, 537. [Google Scholar] [CrossRef] [Green Version]
  43. He, H.; Yan, Y.; Chen, T.; Cheng, P. Tree height estimation of forest plantation in mountainous terrain from bare-earth points using a DoG-coupled radial basis function neural network. Remote Sens. 2019, 11, 1271. [Google Scholar] [CrossRef] [Green Version]
  44. Krause, S.; Sanders, T.G.M.; Mund, J.P.; Greve, K. UAV-based photogrammetric tree height measurement for intensive forest monitoring. Remote Sens. 2019, 11, 758. [Google Scholar] [CrossRef] [Green Version]
  45. Fankhauser, K.E.; Strigul, N.S.; Gatziolis, D. Augmentation of traditional forest inventory and Airborne laser scanning with unmanned aerial systems and photogrammetry for forest monitoring. Remote Sens. 2018, 10, 1562. [Google Scholar] [CrossRef] [Green Version]
  46. Huang, H.; He, S.; Chen, C. Leaf abundance affects tree height estimation derived from UAV images. Forests 2019, 10, 931. [Google Scholar] [CrossRef] [Green Version]
  47. Vanderwel, M.C.; Lopez, E.L.; Sprott, A.H.; Khayyatkhoshnevis, P.; Shovon, T.A. Using aerial canopy data from UAVs to measure the effects of neighbourhood competition on individual tree growth. For. Ecol. Manage. 2020, 461, 117949. [Google Scholar] [CrossRef]
  48. Dalla Corte, A.P.; Rex, F.E.; de Almeida, D.R.A.; Sanquetta, C.R.; Silva, C.A.; Moura, M.M.; Wilkinson, B.; Zambrano, A.M.A.; da Cunha Neto, E.M.; Veras, H.F.P.; et al. Measuring individual tree diameter and height using gatoreye high-density UAV-lidar in an integrated crop-livestock-forest system. Remote Sens. 2020, 12, 863. [Google Scholar] [CrossRef] [Green Version]
  49. Mokroš, M.; Liang, X.; Surový, P.; Valent, P.; Čerňava, J.; Chudý, F.; Tunák, D.; Saloň, Š.; Merganič, J. Evaluation of Close-Range Photogrammetry Image Collection Methods for Estimating Tree Diameters. ISPRS Int. J. Geo-Inf. 2018, 7, 93. [Google Scholar] [CrossRef] [Green Version]
  50. Liang, X.; Wang, Y.; Pyörälä, J.; Lehtomäki, M.; Yu, X.; Kaartinen, H.; Kukko, A.; Honkavaara, E.; Issaoui, A.E.I.; Nevalainen, O.; et al. Forest in situ observations using unmanned aerial vehicle as an alternative of terrestrial measurements. For. Ecosyst. 2019, 6. [Google Scholar] [CrossRef] [Green Version]
  51. Kuželka, K.; Surový, P. Mapping forest structure using uas inside flight capabilities. Sensors 2018, 18, 2245. [Google Scholar] [CrossRef] [Green Version]
  52. Kattenborn, T.; Hernández, J.; Lopatin, J.; Kattenborn, G.; Fassnacht, F.E. Pilot study on the retrieval of DBH and diameter distribution of deciduous forest stands using cast shadows in uav-based orthomosaics. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 4, 93–99. [Google Scholar] [CrossRef] [Green Version]
  53. Krisanski, S.; Taskhiri, M.S.; Turner, P. Enhancing methods for under-canopy unmanned aircraft system based photogrammetry in complex forests for tree diameter measurement. Remote Sens. 2020, 12, 1652. [Google Scholar] [CrossRef]
  54. Brieger, F.; Herzschuh, U.; Pestryakova, L.A.; Bookhagen, B.; Zakharov, E.S.; Kruse, S. Advances in the derivation of Northeast Siberian forest metrics using high-resolution UAV-based photogrammetric point clouds. Remote Sens. 2019, 11, 1447. [Google Scholar] [CrossRef] [Green Version]
  55. Gülci, S. The determination of some stand parameters using SfM-based spatial 3D point cloud in forestry studies: An analysis of data production in pure coniferous young forest stands. Environ. Monit. Assess. 2019, 191. [Google Scholar] [CrossRef]
  56. Yin, D.; Wang, L. Individual mangrove tree measurement using UAV-based LiDAR data: Possibilities and challenges. Remote Sens. Environ. 2019, 223, 34–49. [Google Scholar] [CrossRef]
  57. Li, J.; Yang, B.; Cong, Y.; Cao, L.; Fu, X.; Dong, Z. 3D forest mapping using a low-cost UAV laser scanning system: Investigation and comparison. Remote Sens. 2019, 11, 717. [Google Scholar] [CrossRef] [Green Version]
  58. Zhang, D.; Liu, J.; Ni, W.; Sun, G.; Zhang, Z.; Liu, Q.; Wang, Q. Estimation of Forest Leaf Area Index Using Height and Canopy Cover Information Extracted from Unmanned Aerial Vehicle Stereo Imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 471–481. [Google Scholar] [CrossRef]
  59. Lendzioch, T.; Langhammer, J.; Jenicek, M. Estimating snow depth and leaf area index based on UAV digital photogrammetry. Sensors 2019, 19, 1027. [Google Scholar] [CrossRef] [Green Version]
  60. Guo, X.; Wang, L.; Tian, J.; Yin, D.; Shi, C.; Nie, S. Vegetation horizontal occlusion index (VHOI) from TLS and UAV image to better measure mangrove LAI. Remote Sens. 2018, 10, 1739. [Google Scholar] [CrossRef] [Green Version]
  61. Iizuka, K.; Watanabe, K.; Kato, T.; Putri, N.A.; Silsigia, S.; Kameoka, T.; Kozan, O. Visualizing the spatiotemporal trends of thermal characteristics in a peatland plantation forest in Indonesia: Pilot test using unmanned aerial systems (UASs). Remote Sens. 2018, 10, 1345. [Google Scholar] [CrossRef] [Green Version]
  62. Webster, C.; Westoby, M.; Rutter, N.; Jonas, T. Three-dimensional thermal characterization of forest canopies using UAV photogrammetry. Remote Sens. Environ. 2018, 209, 835–847. [Google Scholar] [CrossRef] [Green Version]
  63. Marzahn, P.; Flade, L.; Sanchez-Azofeifa, A. Spatial estimation of the latent heat flux in a tropical dry forest by using unmanned aerial vehicles. Forests 2020, 11, 604. [Google Scholar] [CrossRef]
  64. D’Odorico, P.; Besik, A.; Wong, C.Y.S.; Isabel, N.; Ensminger, I. High-throughput drone-based remote sensing reliably tracks phenology in thousands of conifer seedlings. New Phytol. 2020, 226, 1667–1681. [Google Scholar] [CrossRef] [PubMed]
  65. Park, J.Y.; Muller-Landau, H.C.; Lichstein, J.W.; Rifai, S.W.; Dandois, J.P.; Bohlman, S.A. Quantifying leaf phenology of individual trees and species in a tropical forest using unmanned aerial vehicle (UAV) images. Remote Sens. 2019, 11, 1534. [Google Scholar] [CrossRef] [Green Version]
  66. Santini, F.; Kefauver, S.C.; Resco de Dios, V.; Araus, J.L.; Voltas, J. Using unmanned aerial vehicle-based multispectral, RGB and thermal imagery for phenotyping of forest genetic trials: A case study in Pinus halepensis. Ann. Appl. Biol. 2019, 174, 262–276. [Google Scholar] [CrossRef] [Green Version]
  67. Santini, F.; Serrano, L.; Kefauver, S.C.; Abdullah-Al, M.; Aguilera, M.; Sin, E.; Voltas, J. Morpho-physiological variability of Pinus nigra populations reveals climate-driven local adaptation but weak water use differentiation. Environ. Exp. Bot. 2019, 166, 103828. [Google Scholar] [CrossRef]
  68. Blonder, B.; Graae, B.J.; Greer, B.; Haagsma, M.; Helsen, K.; Kapás, R.E.; Pai, H.; Rieksta, J.; Sapena, D.; Still, C.J.; et al. Remote sensing of ploidy level in quaking aspen (Populus tremuloides Michx.). J. Ecol. 2020, 108, 175–188. [Google Scholar] [CrossRef] [Green Version]
  69. Rissanen, K.; Martin-Guay, M.O.; Riopel-Bouvier, A.S.; Paquette, A. Light interception in experimental forests affected by tree diversity and structural complexity of dominant canopy. Agric. For. Meteorol. 2019, 278, 107655. [Google Scholar] [CrossRef]
  70. Brüllhardt, M.; Rotach, P.; Schleppi, P.; Bugmann, H. Vertical light transmission profiles in structured mixed deciduous forest canopies assessed by UAV-based hemispherical photography and photogrammetric vegetation height models. Agric. For. Meteorol. 2020, 281, 107843. [Google Scholar] [CrossRef]
  71. Zeng, K.; Zheng, G.; Ma, L.; Ju, W.; Pang, Y. Modelling Three-Dimensional Spatiotemporal Distributions of Forest Photosynthetically Active Radiation Using UAV-Based Lidar Data. Remote Sens. 2019, 11, 2806. [Google Scholar] [CrossRef] [Green Version]
  72. Abdollahnejad, A.; Panagiotidis, D.; Surový, P.; Ulbrichová, I. UAV Capability to Detect and Interpret Solar Radiation as a Potential Replacement Method to Hemispherical Photography. Remote Sens. 2018, 10, 423. [Google Scholar] [CrossRef] [Green Version]
  73. Balsi, M.; Esposito, S.; Fallavollita, P.; Nardinocchi, C. Single-tree detection in high-density LiDAR data from UAV-based survey. Eur. J. Remote Sens. 2018, 51, 679–692. [Google Scholar] [CrossRef] [Green Version]
  74. Hentz, Â.M.K.; Silva, C.A.; Dalla Corte, A.P.; Netto, S.P.; Strager, M.P.; Klauberg, C. Estimating forest uniformity in Eucalyptus spp. and Pinus taeda L. stands using field measurements and structure from motion point clouds generated from unmanned aerial vehicle (UAV) data collection. For. Syst. 2018, 27, 1–17. [Google Scholar] [CrossRef]
  75. Iizuka, K.; Yonehara, T.; Itoh, M.; Kosugi, Y. Estimating Tree Height and Diameter at Breast Height (DBH) from Digital surface models and orthophotos obtained with an unmanned aerial system for a Japanese Cypress (Chamaecyparis obtusa) Forest. Remote Sens. 2018, 10, 13. [Google Scholar] [CrossRef] [Green Version]
  76. Mayr, M.J.; Malß, S.; Ofner, E.; Samimi, C. Disturbance feedbacks on the height of woody vegetation in a savannah: A multi-plot assessment using an unmanned aerial vehicle (UAV). Int. J. Remote Sens. 2018, 39, 4761–4785. [Google Scholar] [CrossRef]
  77. Ganz, S.; Käber, Y.; Adler, P. Measuring tree height with remote sensing-a comparison of photogrammetric and LiDAR data with different field measurements. Forests 2019, 10, 694. [Google Scholar] [CrossRef] [Green Version]
  78. Imangholiloo, M.; Saarinen, N.; Markelin, L.; Rosnell, T.; Näsi, R.; Hakala, T.; Honkavaara, E.; Holopainen, M.; Hyyppä, J.; Vastaranta, M. Characterizing seedling stands using leaf-off and leaf-on photogrammetric point clouds and hyperspectral imagery acquired from unmanned aerial vehicle. Forests 2019, 10, 415. [Google Scholar] [CrossRef] [Green Version]
  79. Abdollahnejad, A.; Panagiotidis, D.; Surový, P. Estimation and extrapolation of tree parameters using spectral correlation between UAV and Pléiades data. Forests 2018, 9, 85. [Google Scholar] [CrossRef] [Green Version]
  80. Balková, M.; Bajer, A.; Patočka, Z.; Mikita, T. Visual exposure of rock outcrops in the context of a forest disease outbreak simulation based on a canopy height model and spectral information acquired by an unmanned aerial vehicle. ISPRS Int. J. Geo-Inf. 2020, 9, 325. [Google Scholar] [CrossRef]
  81. Huang, H.; Li, X.; Chen, C. Individual tree crown detection and delineation from very-high-resolution UAV images based on bias field and marker-controlled watershed segmentation algorithms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 2253–2262. [Google Scholar] [CrossRef]
  82. Nuijten, R.J.G.; Coops, N.C.; Goodbody, T.R.H.; Pelletier, G. Examining the multi-seasonal consistency of individual tree segmentation on deciduous stands using Digital Aerial Photogrammetry (DAP) and unmanned aerial systems (UAS). Remote Sens. 2019, 11, 739. [Google Scholar] [CrossRef] [Green Version]
  83. Gu, J.; Grybas, H.; Congalton, R.G. A comparison of forest tree crown delineation from unmanned aerial imagery using canopy height models vs. spectral lightness. Forests 2020, 11, 605. [Google Scholar] [CrossRef]
  84. Chen, S.W.; Nardari, G.V.; Lee, E.S.; Qu, C.; Liu, X.; Romero, R.A.F.; Kumar, V. SLOAM: Semantic lidar odometry and mapping for forest inventory. IEEE Robot. Autom. Lett. 2020, 5, 612–619. [Google Scholar] [CrossRef] [Green Version]
  85. Morales, G.; Kemper, G.; Sevillano, G.; Arteaga, D.; Ortega, I.; Telles, J. Automatic segmentation of Mauritia flexuosa in unmanned aerial vehicle (UAV) imagery using deep learning. Forests 2018, 9, 736. [Google Scholar] [CrossRef] [Green Version]
  86. Hu, X.; Li, D. Research on a Single-Tree Point Cloud Segmentation Method Based on UAV Tilt Photography and Deep Learning Algorithm. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 4111–4120. [Google Scholar] [CrossRef]
  87. Wu, X.; Shen, X.; Cao, L.; Wang, G.; Cao, F. Assessment of individual tree detection and canopy cover estimation using unmanned aerial vehicle based light detection and ranging (UAV-LiDAR) data in planted forests. Remote Sens. 2019, 11, 908. [Google Scholar] [CrossRef] [Green Version]
  88. Hastings, J.H.; Ollinger, S.V.; Ouimette, A.P.; Sanders-DeMott, R.; Palace, M.W.; Ducey, M.J.; Sullivan, F.B.; Basler, D.; Orwig, D.A. Tree species traits determine the success of LiDAR-based crown mapping in a mixed temperate forest. Remote Sens. 2020, 12, 309. [Google Scholar] [CrossRef] [Green Version]
  89. Maturbongs, B.Y.L.; Wing, M.G.; Strîmbu, B.M.; Burnett, J. Forest inventory sensitivity to UAS-based image processing algorithms. Ann. For. Res. 2019, 62, 87–108. [Google Scholar] [CrossRef]
  90. Yilmaz, V.; Güngör, O. Estimating crown diameters in urban forests with Unmanned Aerial System-based photogrammetric point clouds. Int. J. Remote Sens. 2019, 40, 468–505. [Google Scholar] [CrossRef]
  91. Thomson, E.R.; Malhi, Y.; Bartholomeus, H.; Oliveras, I.; Gvozdevaite, A.; Peprah, T.; Suomalainen, J.; Quansah, J.; Seidu, J.; Adonteng, C.; et al. Mapping the leaf economic spectrum across West African tropical forests using UAV-acquired hyperspectral imagery. Remote Sens. 2018, 10, 1532. [Google Scholar] [CrossRef] [Green Version]
  92. Roşca, S.; Suomalainen, J.; Bartholomeus, H.; Herold, M. Comparing terrestrial laser scanning and unmanned aerial vehicle structure from motion to assess top of canopy structure in tropical forests. Interface Focus 2018, 8. [Google Scholar] [CrossRef]
  93. Xu, N.; Tian, J.; Tian, Q.; Xu, K.; Tang, S. Analysis of vegetation red edge with different illuminated/shaded canopy proportions and to construct normalized difference canopy shadow index. Remote Sens. 2019, 11, 1192. [Google Scholar] [CrossRef] [Green Version]
  94. Guerra-Hernández, J.; Cosenza, D.N.; Rodriguez, L.C.E.; Silva, M.; Tomé, M.; Díaz-Varela, R.A.; González-Ferreiro, E. Comparison of ALS- and UAV(SfM)-derived high-density point clouds for individual tree detection in Eucalyptus plantations. Int. J. Remote Sens. 2018, 39, 5211–5235. [Google Scholar] [CrossRef]
  95. Chung, C.H.; Wang, C.H.; Hsieh, H.C.; Huang, C.Y. Comparison of forest canopy height profiles in a mountainous region of Taiwan derived from airborne lidar and unmanned aerial vehicle imagery. GIScience Remote Sens. 2019, 56, 1289–1304. [Google Scholar] [CrossRef]
  96. Jayathunga, S.; Owari, T.; Tsuyuki, S.; Hirata, Y. Potential of UAV photogrammetry for characterization of forest canopy structure in uneven-aged mixed conifer–broadleaf forests. Int. J. Remote Sens. 2020, 41, 53–73. [Google Scholar] [CrossRef]
  97. Puliti, S.; Solberg, S.; Granhus, A. Use of UAV photogrammetric data for estimation of biophysical properties in forest stands under regeneration. Remote Sens. 2019, 11, 233. [Google Scholar] [CrossRef] [Green Version]
  98. Klosterman, S.; Melaas, E.; Wang, J.; Martinez, A.; Frederick, S.; O’Keefe, J.; Orwig, D.A.; Wang, Z.; Sun, Q.; Schaaf, C.; et al. Fine-scale perspectives on landscape phenology from unmanned aerial vehicle (UAV) photography. Agric. For. Meteorol. 2018, 248, 397–407. [Google Scholar] [CrossRef]
  99. St-Onge, B.; Grandin, S. Estimating the height and basal area at individual tree and plot levels in Canadian subarctic lichen woodlands using stereo worldview-3 images. Remote Sens. 2019, 11, 248. [Google Scholar] [CrossRef] [Green Version]
  100. Jin, C.; Oh, C.-Y.; Shin, S.; Njungwi, N.W.; Choi, C. A comparative study to evaluate accuracy on canopy height and density using UAV, ALS, and fieldwork. Forests 2020, 11, 241. [Google Scholar] [CrossRef] [Green Version]
  101. Moe, K.T.; Owari, T.; Furuya, N.; Hiroshima, T. Comparing individual tree height information derived from field surveys, LiDAR and UAV-DAP for high-value timber species in Northern Japan. Forests 2020, 11, 223. [Google Scholar] [CrossRef] [Green Version]
  102. Carr, J.C.; Slyder, J.B. Individual tree segmentation from a leaf-off photogrammetric point cloud. Int. J. Remote Sens. 2018, 39, 5195–5210. [Google Scholar] [CrossRef]
  103. Medauar, C.C.; de Silva, S.A.; Carvalho, L.C.C.; Tibúrcio, R.A.S.; de Lima, J.S.S.; Medauar, P.A.S. Monitoring of eucalyptus sprouts control using digital images obtained by unmanned aerial vehicle. J. Sustain. For. 2018, 37, 739–752. [Google Scholar] [CrossRef]
  104. Qiu, Z.; Feng, Z.K.; Wang, M.; Li, Z.; Lu, C. Application of UAV photogrammetric system for monitoring ancient tree communities in Beijing. Forests 2018, 9, 735. [Google Scholar] [CrossRef] [Green Version]
  105. Shin, P.; Sankey, T.; Moore, M.M.; Thode, A.E. Evaluating unmanned aerial vehicle images for estimating forest canopy fuels in a ponderosa pine stand. Remote Sens. 2018, 10, 1266. [Google Scholar] [CrossRef] [Green Version]
  106. Yan, W.; Guan, H.; Cao, L.; Yu, Y.; Gao, S.; Lu, J.Y. An automated hierarchical approach for three-dimensional segmentation of single trees using UAV LiDAR data. Remote Sens. 2018, 10, 1999. [Google Scholar] [CrossRef] [Green Version]
  107. Chakraborty, K.; Saikom, V.; Borah, S.B.; Kalita, M.; Gupta, C.; Meitei, L.R.; Sarma, K.K.; Raju, P.L.N. Forest biometric parameter extraction using unmanned aerial vehicle to aid in forest inventory data collection. Curr. Sci. 2019, 117, 1194–1199. [Google Scholar] [CrossRef]
  108. Durfee, N.; Ochoa, C.G.; Mata-Gonzalez, R. The use of low-altitude UAV imagery to assess western juniper density and canopy cover in treated and untreated stands. Forests 2019, 10, 296. [Google Scholar] [CrossRef] [Green Version]
  109. Fawcett, D.; Azlan, B.; Hill, T.C.; Kho, L.K.; Bennie, J.; Anderson, K. Unmanned aerial vehicle (UAV) derived structure-from-motion photogrammetry point clouds for oil palm (Elaeis guineensis) canopy segmentation and height estimation. Int. J. Remote Sens. 2019, 40, 7538–7560. [Google Scholar] [CrossRef] [Green Version]
  110. Schneider, F.D.; Kükenbrink, D.; Schaepman, M.E.; Schimel, D.S.; Morsdorf, F. Quantifying 3D structure and occlusion in dense tropical and temperate forests using close-range LiDAR. Agric. For. Meteorol. 2019, 268, 249–257. [Google Scholar] [CrossRef]
  111. Shashkov, M.; Ivanova, N.; Shanin, V.; Grabarnik, P. Ground Surveys Versus UAV Photography: The Comparison of Two Tree Crown Mapping Techniques. In Proceedings of the Information Technologies in the Research of Biodiversity, Irkutsk, Russia, 11–14 September 2018; Bychkov, I., Voronin, V., Eds.; Springer: Cham, Switzerland, 2019; pp. 48–56. [Google Scholar]
  112. Yancho, J.M.M.; Coops, N.C.; Tompalski, P.; Goodbody, T.R.H.; Plowright, A. Fine-Scale Spatial and Spectral Clustering of UAV-Acquired Digital Aerial Photogrammetric (DAP) Point Clouds for Individual Tree Crown Detection and Segmentation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 1–18. [Google Scholar] [CrossRef]
  113. Apostol, B.; Petrila, M.; Lorenţ, A.; Ciceu, A.; Gancz, V.; Badea, O. Species discrimination and individual tree detection for predicting main dendrometric characteristics in mixed temperate forests by use of airborne laser scanning and ultra-high-resolution imagery. Sci. Total Environ. 2020, 698. [Google Scholar] [CrossRef] [PubMed]
  114. Dong, T.; Zhang, X.; Ding, Z.; Fan, J. Multi-layered tree crown extraction from LiDAR data using graph-based segmentation. Comput. Electron. Agric. 2020, 170, 105213. [Google Scholar] [CrossRef]
  115. du Toit, F.; Coops, N.C.; Tompalski, P.; Goodbody, T.R.H.; El-Kassaby, Y.A.; Stoehr, M.; Turner, D.; Lucieer, A. Characterizing variations in growth characteristics between Douglas-fir with different genetic gain levels using airborne laser scanning. Trees Struct. Funct. 2020, 34, 649–664. [Google Scholar] [CrossRef]
  116. Isibue, E.W.; Pingel, T.J. Unmanned aerial vehicle based measurement of urban forests. Urban For. Urban Green. 2020, 48, 126574. [Google Scholar] [CrossRef]
  117. Jurado, J.M.; Ramos, M.I.; Enríquez, C.; Feito, F.R. The impact of canopy reflectance on the 3D structure of individual trees in a Mediterranean Forest. Remote Sens. 2020, 12, 1430. [Google Scholar] [CrossRef]
  118. Kuželka, K.; Slavík, M.; Surový, P. Very high density point clouds from UAV laser scanning for automatic tree stem detection and direct diameter measurement. Remote Sens. 2020, 12, 1236. [Google Scholar] [CrossRef] [Green Version]
  119. Li, L.; Chen, J.; Mu, X.; Li, W.; Yan, G.; Xie, D.; Zhang, W. Quantifying understory and overstory vegetation cover using UAV-based RGB imagery in forest plantation. Remote Sens. 2020, 12, 298. [Google Scholar] [CrossRef] [Green Version]
  120. Yan, W.; Guan, H.; Cao, L.; Yu, Y.; Li, C.; Lu, J.Y. A self-adaptive mean shift tree-segmentation method using UAV LiDAR data. Remote Sens. 2020, 12, 515. [Google Scholar] [CrossRef] [Green Version]
  121. Hyyppä, E.; Hyyppä, J.; Hakala, T.; Kukko, A.; Wulder, M.A.; White, J.C.; Pyörälä, J.; Yu, X.; Wang, Y.; Virtanen, J.P.; et al. Under-canopy UAV laser scanning for accurate forest field measurements. ISPRS J. Photogramm. Remote Sens. 2020, 164, 41–60. [Google Scholar] [CrossRef]
  122. Iizuka, K.; Hayakawa, Y.S.; Ogura, T.; Nakata, Y.; Kosugi, Y.; Yonehara, T. Integration of multi-sensor data to estimate plot-level stem volume using machine learning algorithms-case study of evergreen conifer planted forests in Japan. Remote Sens. 2020, 12, 1649. [Google Scholar] [CrossRef]
  123. Kotivuori, E.; Kukkonen, M.; Mehtätalo, L.; Maltamo, M.; Korhonen, L.; Packalen, P. Forest inventories for small areas using drone imagery without in-situ field measurements. Remote Sens. Environ. 2020, 237, 111404. [Google Scholar] [CrossRef]
  124. Puliti, S.; Dash, J.P.; Watt, M.S.; Breidenbach, J.; Pearse, G.D. A comparison of UAV laser scanning, photogrammetry and airborne laser scanning for precision inventory of small-forest properties. Forestry 2020, 93, 150–162. [Google Scholar] [CrossRef]
  125. Yrttimaa, T.; Saarinen, N.; Kankare, V.; Viljanen, N.; Hynynen, J.; Huuskonen, S.; Holopainen, M.; Hyyppä, J.; Honkavaara, E.; Vastaranta, M. Multisensorial close-range sensing generates benefits for characterization of managed scots pine (Pinus sylvestris L.) stands. ISPRS Int. J. Geo-Inf. 2020, 9, 309. [Google Scholar] [CrossRef]
  126. Li, Z.; Zan, Q.; Yang, Q.; Zhu, D.; Chen, Y.; Yu, S. Remote estimation of mangrove aboveground carbon stock at the species level using a low-cost unmanned aerial vehicle system. Remote Sens. 2019, 11, 1018. [Google Scholar] [CrossRef] [Green Version]
  127. McClelland, M.P.; van Aardt, J.; Hale, D. Manned aircraft versus small unmanned aerial system—forestry remote sensing comparison utilizing lidar and structure-from-motion for forest carbon modeling and disturbance detection. J. Appl. Remote Sens. 2019, 14, 1. [Google Scholar] [CrossRef]
  128. Swinfield, T.; Lindsell, J.A.; Williams, J.V.; Harrison, R.D.; Gemita, E.; Schönlieb, C.B.; Coomes, D.A. Accurate Measurement of Tropical Forest Canopy Heights and Aboveground Carbon Using Structure from Motion. Remote Sens. 2019, 11, 928. [Google Scholar] [CrossRef] [Green Version]
  129. Fernandes, M.R.; Aguiar, F.C.; Martins, M.J.; Rico, N.; Ferreira, M.T.; Correia, A.C. Carbon stock estimations in a mediterranean riparian forest: A case study combining field data and UAV imagery. Forests 2020, 11, 376. [Google Scholar] [CrossRef] [Green Version]
  130. Di Gennaro, S.F.; Nati, C.; Dainelli, R.; Pastonchi, L.; Berton, A.; Toscano, P.; Matese, A. An automatic UAV based segmentation approach for pruning biomass estimation in irregularly spaced chestnut orchards. Forests 2020, 11, 308. [Google Scholar] [CrossRef] [Green Version]
  131. Windrim, L.; Bryson, M.; McLean, M.; Randle, J.; Stone, C. Automated mapping of woody debris over harvested forest plantations using UAVs, high-resolution imagery, and machine learning. Remote Sens. 2019, 11, 733. [Google Scholar] [CrossRef] [Green Version]
  132. Otero, V.; Van De Kerchove, R.; Satyanarayana, B.; Martínez-Espinosa, C.; Fisol, M.A.B.; Ibrahim, M.R.B.; Sulong, I.; Mohd-Lokman, H.; Lucas, R.; Dahdouh-Guebas, F. Managing mangrove forests from the sky: Forest inventory using field data and Unmanned Aerial Vehicle (UAV) imagery in the Matang Mangrove Forest Reserve, peninsular Malaysia. For. Ecol. Manage. 2018, 411, 35–45. [Google Scholar] [CrossRef]
  133. Peña, J.M.; de Castro, A.I.; Torres-Sánchez, J.; Andújar, D.; Martín, C.S.; Dorado, J.; Fernández-Quintanilla, C.; López-Granados, F. Estimating tree height and biomass of a poplar plantation with image-based UAV technology. AIMS Agric. Food 2018, 3, 313–323. [Google Scholar] [CrossRef]
  134. González-Jaramillo, V.; Fries, A.; Bendix, J. AGB estimation in a tropical mountain forest (TMF) by means of RGB and multispectral images using an unmanned aerial vehicle (UAV). Remote Sens. 2019, 11, 1413. [Google Scholar] [CrossRef] [Green Version]
  135. Vaglio Laurin, G.; Ding, J.; Disney, M.; Bartholomeus, H.; Herold, M.; Papale, D.; Valentini, R. Tree height in tropical forest as measured by different ground, proximal, and remote sensing instruments, and impacts on above ground biomass estimates. Int. J. Appl. Earth Obs. Geoinf. 2019, 82, 101899. [Google Scholar] [CrossRef]
  136. Lu, J.; Wang, H.; Qin, S.; Cao, L.; Pu, R.; Li, G.; Sun, J. Estimation of aboveground biomass of Robinia pseudoacacia forest in the Yellow River Delta based on UAV and Backpack LiDAR point clouds. Int. J. Appl. Earth Obs. Geoinf. 2020, 86, 102014. [Google Scholar] [CrossRef]
  137. Navarro, A.; Young, M.; Allan, B.; Carnell, P.; Macreadie, P.; Ierodiaconou, D. The application of Unmanned Aerial Vehicles (UAVs) to estimate above-ground biomass of mangrove ecosystems. Remote Sens. Environ. 2020, 242, 111747. [Google Scholar] [CrossRef]
  138. Puliti, S.; Breidenbach, J.; Astrup, R. Estimation of forest growing stock volume with UAV laser scanning data: Can it be done without field data? Remote Sens. 2020, 12, 1245. [Google Scholar] [CrossRef] [Green Version]
  139. Wang, Y.; Pyörälä, J.; Liang, X.; Lehtomäki, M.; Kukko, A.; Yu, X.; Kaartinen, H.; Hyyppä, J. In situ biomass estimation at tree and plot levels: What did data record and what did algorithms derive from terrestrial and aerial point clouds in boreal forest. Remote Sens. Environ. 2019, 232, 111309. [Google Scholar] [CrossRef]
  140. Giannetti, F.; Chirici, G.; Gobakken, T.; Næsset, E.; Travaglini, D.; Puliti, S. A new approach with DTM-independent metrics for forest growing stock prediction using UAV photogrammetric data. Remote Sens. Environ. 2018, 213, 195–205. [Google Scholar] [CrossRef]
  141. Jayathunga, S.; Owari, T.; Tsuyuki, S. The use of fixed–wing UAV photogrammetry with LiDAR DTM to estimate merchantable volume and carbon stock in living biomass over a mixed conifer–broadleaf forest. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 767–777. [Google Scholar] [CrossRef]
  142. Domingo, D.; Ørka, H.O.; Næsset, E.; Kachamba, D.; Gobakken, T. Effects of UAV image resolution, camera type, and image overlap on accuracy of biomass predictions in a tropical woodland. Remote Sens. 2019, 11, 948. [Google Scholar] [CrossRef] [Green Version]
  143. d’Oliveira, M.V.N.; Broadbent, E.N.; Oliveira, L.C.; Almeida, D.R.A.; Papa, D.A.; Ferreira, M.E.; Zambrano, A.M.A.; Silva, C.A.; Avino, F.S.; Prata, G.A.; et al. Aboveground biomass estimation in Amazonian tropical forests: A comparison of aircraft- and GatorEye UAV-borne LiDAR data in the Chico Mendes Extractive Reserve in Acre, Brazil. Remote Sens. 2020, 12, 1754. [Google Scholar] [CrossRef]
  144. Jones, A.R.; Raja Segaran, R.; Clarke, K.D.; Waycott, M.; Goh, W.S.H.; Gillanders, B.M. Estimating Mangrove Tree Biomass and Carbon Content: A Comparison of Forest Inventory Techniques and Drone Imagery. Front. Mar. Sci. 2020, 6, 1–13. [Google Scholar] [CrossRef] [Green Version]
  145. Zhou, X.; Zhang, X. Individual tree parameters estimation for plantation forests based on UAV oblique photography. IEEE Access 2020, 8, 96184–96198. [Google Scholar] [CrossRef]
  146. Xu, Z.; Li, W.; Li, Y.; Shen, X.; Ruan, H. Estimation of secondary forest parameters by integrating image and point cloud-based metrics acquired from unmanned aerial vehicle. J. Appl. Remote Sens. 2019, 14. [Google Scholar] [CrossRef] [Green Version]
  147. Zhu, Y.; Liu, K.; Myint, S.W.; Du, Z.; Li, Y.; Cao, J.; Liu, L.; Wu, Z. Integration of GF2 optical, GF3 SAR, and UAV data for estimating aboveground biomass of China’s largest artificially planted mangroves. Remote Sens. 2020, 12, 2039. [Google Scholar] [CrossRef]
  148. Jayathunga, S.; Owari, T.; Tsuyuki, S. Digital Aerial Photogrammetry for Uneven-Aged Forest Management: Assessing the Potential to Reconstruct Canopy Structure and Estimate Living Biomass. Remote Sens. 2019, 11, 338. [Google Scholar] [CrossRef] [Green Version]
  149. Navarro, J.A.; Algeet, N.; Fernández-Landa, A.; Esteban, J.; Rodríguez-Noriega, P.; Guillén-Climent, M.L. Integration of UAV, Sentinel-1, and Sentinel-2 data for mangrove plantation aboveground biomass monitoring in Senegal. Remote Sens. 2019, 11, 77. [Google Scholar] [CrossRef] [Green Version]
  150. Liu, K.; Shen, X.; Cao, L.; Wang, G.; Cao, F. Estimating forest structural attributes using UAV-LiDAR data in Ginkgo plantations. ISPRS J. Photogramm. Remote Sens. 2018, 146, 465–482. [Google Scholar] [CrossRef]
  151. Alonzo, M.; Andersen, H.E.; Morton, D.C.; Cook, B.D. Quantifying boreal forest structure and composition using UAV structure from motion. Forests 2018, 9, 119. [Google Scholar] [CrossRef] [Green Version]
  152. Guerra-Hernández, J.; Cosenza, D.N.; Cardil, A.; Silva, C.A.; Botequim, B.; Soares, P.; Silva, M.; González-Ferreiro, E.; Díaz-Varela, R.A. Predicting growing stock volume of eucalyptus plantations using 3-D point clouds derived from UAV imagery and ALS data. Forests 2019, 10, 905. [Google Scholar] [CrossRef] [Green Version]
  153. Brede, B.; Calders, K.; Lau, A.; Raumonen, P.; Bartholomeus, H.M.; Herold, M.; Kooistra, L. Non-destructive tree volume estimation through quantitative structure modelling: Comparing UAV laser scanning with terrestrial LIDAR. Remote Sens. Environ. 2019, 233, 111355. [Google Scholar] [CrossRef]
  154. Qiu, P.; Wang, D.; Zou, X.; Yang, X.; Xie, G.; Xu, S.; Zhong, Z. Finer resolution estimation and mapping of mangrove biomass using UAV LiDAR and worldview-2 data. Forests 2019, 10, 871. [Google Scholar] [CrossRef] [Green Version]
  155. Zou, X.; Liang, A.; Wu, B.; Su, J.; Zheng, R.; Li, J. UAV-based high-throughput approach for fast growing Cunninghamia lanceolata (Lamb.) cultivar screening by machine learning. Forests 2019, 10, 815. [Google Scholar] [CrossRef] [Green Version]
  156. Ni, W.; Dong, J.; Sun, G.; Zhang, Z.; Pang, Y.; Tian, X.; Li, Z.; Chen, E. Synthesis of leaf-on and leaf-off unmanned aerial vehicle (UAV) stereo imagery for the inventory of aboveground biomass of deciduous forests. Remote Sens. 2019, 11, 889. [Google Scholar] [CrossRef] [Green Version]
  157. Puliti, S.; Saarela, S.; Gobakken, T.; Ståhl, G.; Næsset, E. Combining UAV and Sentinel-2 auxiliary data for forest growing stock volume estimation through hierarchical model-based inference. Remote Sens. Environ. 2018, 204, 485–497. [Google Scholar] [CrossRef]
  158. Wang, D.; Wan, B.; Liu, J.; Su, Y.; Guo, Q.; Qiu, P.; Wu, X. Estimating aboveground biomass of the mangrove forests on northeast Hainan Island in China using an upscaling method from field plots, UAV-LiDAR data and Sentinel-2 imagery. Int. J. Appl. Earth Obs. Geoinf. 2020, 85, 101986. [Google Scholar] [CrossRef]
  159. Tian, J.; Wang, L.; Li, X.; Yin, D.; Gong, H.; Nie, S.; Shi, C.; Zhong, R.; Liu, X.; Xu, R. Canopy Height Layering Biomass Estimation Model (CHL-BEM) with Full-Waveform LiDAR. Remote Sens. 2019, 11, 1446. [Google Scholar] [CrossRef] [Green Version]
  160. Fujimoto, A.; Haga, C.; Matsui, T.; Machimura, T.; Hayashi, K.; Sugita, S.; Takagi, H. An end to end process development for UAV-SfM based forest monitoring: Individual tree detection, species classification and carbon dynamics simulation. Forests 2019, 10, 680. [Google Scholar] [CrossRef] [Green Version]
  161. Lin, J.; Wang, M.; Ma, M.; Lin, Y. Aboveground tree biomass estimation of sparse subalpine coniferous forest with UAV oblique photography. Remote Sens. 2018, 10, 1849. [Google Scholar] [CrossRef] [Green Version]
  162. Cao, L.; Liu, H.; Fu, X.; Zhang, Z.; Shen, X.; Ruan, H. Comparison of UAV LiDAR and digital aerial photogrammetry point clouds for estimating forest structural attributes in subtropical planted forests. Forests 2019, 10, 145. [Google Scholar] [CrossRef] [Green Version]
  163. Ota, T.; Ahmed, O.S.; Minn, S.T.; Khai, T.C.; Mizoue, N.; Yoshida, S. Estimating selective logging impacts on aboveground biomass in tropical forests using digital aerial photography obtained before and after a logging event from an unmanned aerial vehicle. For. Ecol. Manage. 2019, 433, 162–169. [Google Scholar] [CrossRef]
  164. Shen, X.; Cao, L.; Yang, B.; Xu, Z.; Wang, G. Estimation of forest structural attributes using spectral indices and point clouds from UAS-based multispectral and RGB imageries. Remote Sens. 2019, 11, 800. [Google Scholar] [CrossRef] [Green Version]
  165. Wang, D.; Wan, B.; Qiu, P.; Zuo, Z.; Wang, R.; Wu, X. Mapping height and aboveground biomass of mangrove forests on Hainan Island using UAV-LiDAR sampling. Remote Sens. 2019, 11, 2156. [Google Scholar] [CrossRef] [Green Version]
  166. Dash, J.P.; Pearse, G.D.; Watt, M.S. UAV multispectral imagery can complement satellite data for monitoring forest health. Remote Sens. 2018, 10, 1216. [Google Scholar] [CrossRef] [Green Version]
  167. Otsu, K.; Pla, M.; Vayreda, J.; Brotons, L. Calibrating the severity of forest defoliation by pine processionary moth with Landsat and UAV imagery. Sensors 2018, 18, 3278. [Google Scholar] [CrossRef] [Green Version]
  168. Zhang, N.; Zhang, X.; Yang, G.; Zhu, C.; Huo, L.; Feng, H. Assessment of defoliation during the Dendrolimus tabulaeformis Tsai et Liu disaster outbreak using UAV-based hyperspectral images. Remote Sens. Environ. 2018, 217, 323–339. [Google Scholar] [CrossRef]
  169. Cardil, A.; Otsu, K.; Pla, M.; Silva, C.A.; Brotons, L. Quantifying pine processionary moth defoliation in a pine-oak mixed forest using unmanned aerial systems and multispectral imagery. PLoS ONE 2019, 14, 1–19. [Google Scholar] [CrossRef]
  170. Jung, K.Y.; Park, J.K. Analysis of vegetation infection information using unmanned aerial vehicle with optical sensor. Sens. Mater. 2019, 31, 3319–3326. [Google Scholar] [CrossRef]
  171. Lee, K.-W.; Park, J.-K. Economic Evaluation of Unmanned Aerial Vehicle for Forest Pest Monitoring. J. Korea Acad. Coop. Soc. 2019, 20, 440–446. [Google Scholar] [CrossRef]
  172. Smigaj, M.; Gaulton, R.; Suárez, J.C.; Barr, S.L. Canopy temperature from an Unmanned Aerial Vehicle as an indicator of tree stress associated with red band needle blight severity. For. Ecol. Manage. 2019, 433, 699–708. [Google Scholar] [CrossRef]
  173. Brovkina, O.; Cienciala, E.; Surový, P.; Janata, P. Unmanned aerial vehicles (UAV) for assessment of qualitative classification of Norway spruce in temperate forest stands. Geo-Spatial Inf. Sci. 2018, 21, 12–20. [Google Scholar] [CrossRef] [Green Version]
  174. Ganthaler, A.; Losso, A.; Mayr, S. Using image analysis for quantitative assessment of needle bladder rust disease of Norway spruce. Plant Pathol. 2018, 67, 1122–1130. [Google Scholar] [CrossRef] [Green Version]
  175. Näsi, R.; Honkavaara, E.; Blomqvist, M.; Lyytikäinen-Saarenmaa, P.; Hakala, T.; Viljanen, N.; Kantola, T.; Holopainen, M. Remote sensing of bark beetle damage in urban forests at individual tree level using a novel hyperspectral camera from UAV and aircraft. Urban For. Urban Green. 2018, 30, 72–83. [Google Scholar] [CrossRef]
  176. Klouček, T.; Komárek, J.; Surový, P.; Hrach, K.; Janata, P.; Vašíček, B. The use of UAV mounted sensors for precise detection of bark beetle infestation. Remote Sens. 2019, 11, 1561. [Google Scholar] [CrossRef] [Green Version]
  177. Safonova, A.; Tabik, S.; Alcaraz-Segura, D.; Rubtsov, A.; Maglinets, Y.; Herrera, F. Detection of Fir Trees (Abies sibirica) Damaged by the Bark Beetle in Unmanned Aerial Vehicle Images with Deep Learning. Remote Sens. 2019, 11, 643. [Google Scholar] [CrossRef] [Green Version]
  178. Sandino, J.; Pegg, G.; Gonzalez, F.; Smith, G. Aerial mapping of forests affected by pathogens using UAVs, hyperspectral sensors, and artificial intelligence. Sensors 2018, 18, 944. [Google Scholar] [CrossRef] [Green Version]
  179. Dell, M.; Stone, C.; Osborn, J.; Glen, M.; McCoull, C.; Rimbawanto, A.; Tjahyono, B.; Mohammed, C. Detection of necrotic foliage in a young Eucalyptus pellita plantation using unmanned aerial vehicle RGB photography–a demonstration of concept. Aust. For. 2019, 82, 79–88. [Google Scholar] [CrossRef] [Green Version]
  180. Maes, W.H.; Huete, A.R.; Avino, M.; Boer, M.M.; Dehaan, R.; Pendall, E.; Griebel, A.; Steppe, K. Can UAV-based infrared thermography be used to study plant-parasite interactions between mistletoe and Eucalypt trees? Remote Sens. 2018, 10, 2062. [Google Scholar] [CrossRef] [Green Version]
  181. Pádua, L.; Hruška, J.; Bessa, J.; Adão, T.; Martins, L.M.; Gonçalves, J.A.; Peres, E.; Sousa, A.M.R.; Castro, J.P.; Sousa, J.J. Multi-temporal analysis of forestry and coastal environments using UASs. Remote Sens. 2018, 10, 24. [Google Scholar] [CrossRef] [Green Version]
  182. Barmpoutis, P.; Stathaki, T.; Kamperidou, V. Monitoring of Trees’ Health Condition Using a UAV Equipped with Low-cost Digital Camera. In Proceedings of the ICASSP 2019—2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brighton, UK, 12–17 May 2019; pp. 8291–8295. [Google Scholar] [CrossRef]
  183. Komárek, J.; Klouček, T.; Prošek, J. The potential of Unmanned Aerial Systems: A tool towards precision classification of hard-to-distinguish vegetation types? Int. J. Appl. Earth Obs. Geoinf. 2018, 71, 9–19. [Google Scholar] [CrossRef]
  184. Tuominen, S.; Näsi, R.; Honkavaara, E.; Balazs, A.; Hakala, T.; Viljanen, N.; Pölönen, I.; Saari, H.; Ojanen, H. Assessment of Classifiers and Remote Sensing Features of Hyperspectral Imagery and Stereo-Photogrammetric Point Clouds for Recognition of Tree Species in a Forest Area of High Species Diversity. Remote Sens. 2018, 10, 714. [Google Scholar] [CrossRef] [Green Version]
  185. Gini, R.; Sona, G.; Ronchetti, G.; Passoni, D.; Pinto, L. Improving tree species classification using UAS multispectral images and texture measures. ISPRS Int. J. Geo-Inf. 2018, 7, 315. [Google Scholar] [CrossRef] [Green Version]
  186. Waite, C.E.; van der Heijden, G.M.F.; Field, R.; Boyd, D.S. A view from above: Unmanned aerial vehicles (UAVs) provide a new tool for assessing liana infestation in tropical forest canopies. J. Appl. Ecol. 2019, 56, 902–912. [Google Scholar] [CrossRef]
  187. Yuan, X.; Laakso, K.; Marzahn, P.; Sanchez-Azofeifa, G.A. Canopy Temperature Differences between Liana-Infested and Non-Liana Infested Areas in a Neotropical Dry Forest. Forests 2019, 10, 890. [Google Scholar] [CrossRef] [Green Version]
  188. Wu, Z.; Ni, M.; Hu, Z.; Wang, J.; Li, Q.; Wu, G. Mapping invasive plant with UAV-derived 3D mesh model in mountain area—A case study in Shenzhen Coast, China. Int. J. Appl. Earth Obs. Geoinf. 2019, 77, 129–139. [Google Scholar] [CrossRef]
  189. Dash, J.P.; Watt, M.S.; Paul, T.S.H.; Morgenroth, J.; Pearse, G.D. Early detection of invasive exotic trees using UAV and manned aircraft multispectral and LiDAR Data. Remote Sens. 2019, 11, 1812. [Google Scholar] [CrossRef] [Green Version]
  190. Kattenborn, T.; Lopatin, J.; Förster, M.; Braun, A.C.; Fassnacht, F.E. UAV data as alternative to field sampling to map woody invasive species based on combined Sentinel-1 and Sentinel-2 data. Remote Sens. Environ. 2019, 227, 61–73. [Google Scholar] [CrossRef]
  191. de Sá, N.C.; Castro, P.; Carvalho, S.; Marchante, E.; López-Núñez, F.A.; Marchante, H. Mapping the flowering of an invasive plant using unmanned aerial vehicles: Is there potential for biocontrol monitoring? Front. Plant Sci. 2018, 9, 1–13. [Google Scholar] [CrossRef] [Green Version]
  192. Kentsch, S.; Caceres, M.L.L.; Serrano, D.; Roure, F.; Diez, Y. Computer vision and deep learning techniques for the analysis of drone-acquired forest images, a transfer learning study. Remote Sens. 2020, 12, 1287. [Google Scholar] [CrossRef] [Green Version]
  193. Franklin, S.E.; Ahmed, O.S. Deciduous tree species classification using object-based analysis and machine learning with unmanned aerial vehicle multispectral data. Int. J. Remote Sens. 2018, 39, 5236–5245. [Google Scholar] [CrossRef]
  194. Mishra, N.B.; Mainali, K.P.; Shrestha, B.B.; Radenz, J.; Karki, D. Species-level vegetation mapping in a Himalayan treeline ecotone using unmanned aerial system (UAS) imagery. ISPRS Int. J. Geo-Inf. 2018, 7, 445. [Google Scholar] [CrossRef] [Green Version]
  195. Rivas-Torres, G.F.; Benítez, F.L.; Rueda, D.; Sevilla, C.; Mena, C.F. A methodology for mapping native and invasive vegetation coverage in archipelagos: An example from the Galápagos Islands. Prog. Phys. Geogr. 2018, 42, 83–111. [Google Scholar] [CrossRef] [Green Version]
  196. Sothe, C.; De Almeida, C.M.; Schimalski, M.B.; La Rosa, L.E.C.; Castro, J.D.B.; Feitosa, R.Q.; Dalponte, M.; Lima, C.L.; Liesenberg, V.; Miyoshi, G.T.; et al. Comparative performance of convolutional neural network, weighted and conventional support vector machine and random forest for classifying tree species using hyperspectral and photogrammetric data. GIScience Remote Sens. 2020, 57, 369–394. [Google Scholar] [CrossRef]
  197. Yaney-Keller, A.; Tomillo, P.S.; Marshall, J.M.; Paladino, F.V. Using unmanned aerial systems (UAS) to assay mangrove estuaries on the Pacific coast of Costa Rica. PLoS ONE 2019, 14, 1–20. [Google Scholar] [CrossRef]
  198. Liu, C.; Ai, M.; Chen, Z.; Zhou, Y.; Wu, H. Detection of Firmiana danxiaensis Canopies by a Customized Imaging System Mounted on an UAV Platform. J. Sens. 2018, 2018. [Google Scholar] [CrossRef] [Green Version]
  199. Cao, J.; Leng, W.; Liu, K.; Liu, L.; He, Z.; Zhu, Y. Object-Based mangrove species classification using unmanned aerial vehicle hyperspectral images and digital surface models. Remote Sens. 2018, 10, 89. [Google Scholar] [CrossRef] [Green Version]
  200. Sothe, C.; Dalponte, M.; de Almeida, C.M.; Schimalski, M.B.; Lima, C.L.; Liesenberg, V.; Miyoshi, G.T.; Tommaselli, A.M.G. Tree species classification in a highly diverse subtropical forest integrating UAV-based photogrammetric point cloud and hyperspectral data. Remote Sens. 2019, 11, 1338. [Google Scholar] [CrossRef] [Green Version]
  201. Miyoshi, G.T.; Imai, N.N.; Tommaselli, A.M.G.; de Moraes, M.V.A.; Honkavaara, E. Evaluation of hyperspectral multitemporal information to improve tree species identification in the highly diverse atlantic forest. Remote Sens. 2020, 12, 244. [Google Scholar] [CrossRef] [Green Version]
  202. Nezami, S.; Khoramshahi, E.; Nevalainen, O.; Pölönen, I.; Honkavaara, E. Tree species classification of drone hyperspectral and RGB imagery with deep learning convolutional neural networks. Remote Sens. 2020, 12, 1070. [Google Scholar] [CrossRef] [Green Version]
  203. Kattenborn, T.; Eichel, J.; Wiser, S.; Burrows, L.; Fassnacht, F.E.; Schmidtlein, S. Convolutional Neural Networks accurately predict cover fractions of plant species and communities in Unmanned Aerial Vehicle imagery. Remote Sens. Ecol. Conserv. 2020, 1–15. [Google Scholar] [CrossRef] [Green Version]
  204. Miyoshi, G.T.; dos Arruda, M.S.; Osco, L.P.; Junior, J.M.; Gonçalves, D.N.; Imai, N.N.; Tommaselli, A.M.G.; Honkavaara, E.; Gonçalves, W.N. A novel deep learning method to identify single tree species in UAV-based hyperspectral images. Remote Sens. 2020, 12, 1294. [Google Scholar] [CrossRef] [Green Version]
  205. Saarinen, N.; Vastaranta, M.; Näsi, R.; Rosnell, T.; Hakala, T.; Honkavaara, E.; Wulder, M.A.; Luoma, V.; Tommaselli, A.M.G.; Imai, N.N.; et al. Assessing biodiversity in boreal forests with UAV-based photogrammetric point clouds and hyperspectral imaging. Remote Sens. 2018, 10, 338. [Google Scholar] [CrossRef] [Green Version]
  206. Casapia, X.T.; Falen, L.; Bartholomeus, H.; Cárdenas, R.; Flores, G.; Herold, M.; Coronado, E.N.H.; Baker, T.R. Identifying and quantifying the abundance of economically important palms in tropical moist forest using UAV imagery. Remote Sens. 2020, 12, 9. [Google Scholar] [CrossRef] [Green Version]
  207. Fernández-Guisuraga, J.M.; Sanz-Ablanedo, E.; Suárez-Seoane, S.; Calvo, L. Using unmanned aerial vehicles in postfire vegetation survey campaigns through large and heterogeneous areas: Opportunities and challenges. Sensors 2018, 18, 586. [Google Scholar] [CrossRef] [Green Version]
  208. Rossi, F.C.; Fritz, A.; Becker, G. Combining satellite and UAV imagery to delineate forest cover and basal area after mixed-severity fires. Sustainability 2018, 10, 2227. [Google Scholar] [CrossRef] [Green Version]
  209. Shin, J.I.; Seo, W.W.; Kim, T.; Park, J.; Woo, C.S. Using UAV multispectral images for classification of forest burn severity-A case study of the 2019 Gangneung forest fire. Forests 2019, 10, 1025. [Google Scholar] [CrossRef] [Green Version]
  210. Pádua, L.; Guimarães, N.; Adão, T.; Sousa, A.; Peres, E.; Sousa, J.J. Effectiveness of sentinel-2 in multi-temporal post-fire monitoring when compared with UAV Imagery. ISPRS Int. J. Geo-Inf. 2020, 9, 225. [Google Scholar] [CrossRef] [Green Version]
  211. Rossi, F.; Becker, G. Creating forest management units with Hot Spot Analysis (Getis-Ord Gi*) over a forest affected by mixed-severity fires. Aust. For. 2019, 1–10. [Google Scholar] [CrossRef]
  212. Fernández-álvarez, M.; Armesto, J.; Picos, J. LiDAR-based wildfire prevention in WUI: The automatic detection, measurement and evaluation of forest fuels. Forests 2019, 10, 148. [Google Scholar] [CrossRef] [Green Version]
  213. Frey, J.; Asbeck, T.; Bauhus, J. Predicting tree-related microhabitats by multisensor close-range remote sensing structural parameters for the selection of retention elements. Remote Sens. 2020, 12, 867. [Google Scholar] [CrossRef] [Green Version]
  214. Baena, S.; Boyd, D.S.; Moat, J. UAVs in pursuit of plant conservation-Real world experiences. Ecol. Inform. 2018, 47, 2–9. [Google Scholar] [CrossRef]
  215. Almeida, D.R.A.; Broadbent, E.N.; Zambrano, A.M.A.; Wilkinson, B.E.; Ferreira, M.E.; Chazdon, R.; Meli, P.; Gorgens, E.B.; Silva, C.A.; Stark, S.C.; et al. Monitoring the structure of forest restoration plantations with a drone-lidar system. Int. J. Appl. Earth Obs. Geoinf. 2019, 79, 192–198. [Google Scholar] [CrossRef]
  216. Belmonte, A.; Sankey, T.; Biederman, J.A.; Bradford, J.; Goetz, S.J.; Kolb, T.; Woolley, T. UAV-derived estimates of forest structure to inform ponderosa pine forest restoration. Remote Sens. Ecol. Conserv. 2019, 1–17. [Google Scholar] [CrossRef]
  217. Paolinelli Reis, B.; Martins, S.V.; Fernandes Filho, E.I.; Sarcinelli, T.S.; Gleriani, J.M.; Leite, H.G.; Halassy, M. Forest restoration monitoring through digital processing of high resolution images. Ecol. Eng. 2019, 127, 178–186. [Google Scholar] [CrossRef] [Green Version]
  218. Röder, M.; Latifi, H.; Hill, S.; Wild, J.; Svoboda, M.; Brůna, J.; Macek, M.; Nováková, M.H.; Gülch, E.; Heurich, M. Application of optical unmanned aerial vehicle-based imagery for the inventory of natural regeneration and standing deadwood in post-disturbed spruce forests. Int. J. Remote Sens. 2018, 39, 5288–5309. [Google Scholar] [CrossRef]
  219. Whiteside, T.G.; Bartolo, R.E. A robust object-based woody cover extraction technique for monitoring mine site revegetation at scale in the monsoonal tropics using multispectral RPAS imagery from different sensors. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 300–312. [Google Scholar] [CrossRef]
  220. Fromm, M.; Schubert, M.; Castilla, G.; Linke, J.; McDermid, G. Automated detection of conifer seedlings in drone imagery using convolutional neural networks. Remote Sens. 2019, 11, 2585. [Google Scholar] [CrossRef] [Green Version]
  221. Rupasinghe, P.A.; Simic Milas, A.; Arend, K.; Simonson, M.A.; Mayer, C.; Mackey, S. Classification of shoreline vegetation in the Western Basin of Lake Erie using airborne hyperspectral imager HSI2, Pleiades and UAV data. Int. J. Remote Sens. 2019, 40, 3008–3028. [Google Scholar] [CrossRef]
  222. Berra, E.F.; Gaulton, R.; Barr, S. Assessing spring phenology of a temperate woodland: A multiscale comparison of ground, unmanned aerial vehicle and Landsat satellite observations. Remote Sens. Environ. 2019, 223, 229–242. [Google Scholar] [CrossRef]
  223. Iizuka, K.; Kato, T.; Silsigia, S.; Soufiningrum, A.Y.; Kozan, O. Estimating and examining the sensitivity of different vegetation indices to fractions of vegetation cover at different scaling grids for early stage Acacia plantation forests using a fixed-wing UAS. Remote Sens. 2019, 11, 1816. [Google Scholar] [CrossRef] [Green Version]
  224. Khokthong, W.; Zemp, D.C.; Irawan, B.; Sundawati, L.; Kreft, H.; Hölscher, D. Drone-Based Assessment of Canopy Cover for Analyzing Tree Mortality in an Oil Palm Agroforest. Front. For. Glob. Chang. 2019, 2, 1–10. [Google Scholar] [CrossRef] [Green Version]
  225. Nagai, S.; Saitoh, T.M.; Kajiwara, K.; Yoshitake, S.; Honda, Y. Investigation of the potential of drone observations for detection of forest disturbance caused by heavy snow damage in a Japanese cedar (Cryptomeria japonica) forest. J. Agric. Meteorol. 2018, 74, 123–127. [Google Scholar] [CrossRef] [Green Version]
  226. De Luca, G.; Silva, J.M.N.; Cerasoli, S.; Araújo, J.; Campos, J.; Di Fazio, S.; Modica, G. Object-based land cover classification of cork oak woodlands using UAV imagery and Orfeo Toolbox. Remote Sens. 2019, 11, 1238. [Google Scholar] [CrossRef] [Green Version]
  227. Fraser, B.T.; Congalton, R.G. Evaluating the effectiveness of Unmanned Aerial Systems (UAS) for collecting thematic map accuracy assessment reference data in New England forests. Forests 2019, 10, 24. [Google Scholar] [CrossRef] [Green Version]
  228. Sealey, L.L.; Van Rees, K.C.J. Influence of skidder traffic on soil bulk density, aspen regeneration, and vegetation indices following winter harvesting in the Duck Mountain Provincial Park, SK. For. Ecol. Manag. 2019, 437, 59–69. [Google Scholar] [CrossRef]
  229. Yeom, J.; Han, Y.; Kim, T.; Kim, Y. Forest fire damage assessment using UAV images: A case study on goseong-sokcho forest fire in 2019. J. Korean Soc. Surv. Geod. Photogramm. Cartogr. 2019, 37, 351–357. [Google Scholar] [CrossRef]
  230. Sealey, L.L.; Van Rees, K.C.J. Assessment of residual slash coverage using UAVs and implications for aspen regeneration. J. Unmanned Veh. Syst. 2020, 8, 19–29. [Google Scholar] [CrossRef]
  231. Leberl, F.; Irschara, A.; Pock, T.; Meixner, P.; Gruber, M.; Scholz, S.; Wiechert, A. Point Clouds: Lidar versus 3D Vision. Photogramm. Eng. Remote Sens. 2010, 76, 1123–1134. [Google Scholar] [CrossRef]
  232. Goodbody, T.R.H.; Coops, N.C.; White, J.C. Digital Aerial Photogrammetry for Updating Area-Based Forest Inventories: A Review of Opportunities, Challenges, and Future Directions. Curr. For. Rep. 2019, 5, 55–75. [Google Scholar] [CrossRef] [Green Version]
  233. Whitehead, K.; Hugenholtz, C.H. Remote sensing of the environment with small unmanned aircraft systems (UASs), part 1: A review of progress and challenges. J. Unmanned Veh. Syst. 2014, 2, 69–85. [Google Scholar] [CrossRef]
  234. Nesbit, P.; Hugenholtz, C. Enhancing UAV–SfM 3D Model Accuracy in High-Relief Landscapes by Incorporating Oblique Images. Remote Sens. 2019, 11, 239. [Google Scholar] [CrossRef] [Green Version]
  235. Bohlin, J.; Bohlin, I.; Jonzén, J.; Nilsson, M. Mapping forest attributes using data from stereophotogrammetry of aerial images and field data from the national forest inventory. Silva Fenn. 2017, 51. [Google Scholar] [CrossRef] [Green Version]
  236. Iglhaut, J.; Cabo, C.; Puliti, S.; Piermattei, L.; O’Connor, J.; Rosette, J. Structure from Motion Photogrammetry in Forestry: A Review. Curr. For. Rep. 2019, 5, 155–168. [Google Scholar] [CrossRef] [Green Version]
  237. Dandois, J.; Olano, M.; Ellis, E. Optimal Altitude, Overlap, and Weather Conditions for Computer Vision UAV Estimates of Forest Structure. Remote Sens. 2015, 7, 13895–13920. [Google Scholar] [CrossRef] [Green Version]
  238. White, J.; Stepper, C.; Tompalski, P.; Coops, N.; Wulder, M. Comparing ALS and Image-Based Point Cloud Metrics and Modelled Forest Inventory Attributes in a Complex Coastal Forest Environment. Forests 2015, 6, 3704–3732. [Google Scholar] [CrossRef]
  239. Surový, P.; Kuželka, K. Acquisition of forest attributes for decision support at the forest enterprise level using remote-sensing techniques-a review. Forests 2019, 10, 273. [Google Scholar] [CrossRef] [Green Version]
  240. Srestasathiern, P.; Siripon, S.; Wasuhiranyrith, R.; Kooha, P.; Moukomla, S. Estimating above ground biomass for eucalyptus plantation using data from unmanned aerial vehicle imagery. In Proceedings of the Remote Sensing for Agriculture, Ecosystems, and Hydrology XX, Berlin, Germany, 10–13 September 2018; Volume 10783. [Google Scholar] [CrossRef]
  241. Paul, K.I.; Radtke, P.J.; Roxburgh, S.H.; Larmour, J.; Waterworth, R.; Butler, D.; Brooksbank, K.; Ximenes, F. Validation of allometric biomass models: How to have confidence in the application of existing models. For. Ecol. Manag. 2018, 412, 70–79. [Google Scholar] [CrossRef]
  242. Gonzalez de Tanago, J.; Lau, A.; Bartholomeus, H.; Herold, M.; Avitabile, V.; Raumonen, P.; Martius, C.; Goodman, R.C.; Disney, M.; Manuri, S.; et al. Estimation of aboveground biomass of large tropical trees with terrestrial LiDAR. Methods Ecol. Evol. 2018, 9, 223–234. [Google Scholar] [CrossRef] [Green Version]
  243. Lausch, A.; Erasmi, S.; King, D.; Magdon, P.; Heurich, M. Understanding Forest Health with Remote Sensing-Part II—A Review of Approaches and Data Models. Remote Sens. 2017, 9, 129. [Google Scholar] [CrossRef] [Green Version]
  244. Carter, G.A. Responses of Leaf Spectral Reflectance to Plant Stress. Am. J. Bot. 1993, 80, 239–243. [Google Scholar] [CrossRef]
  245. Jensen, J.R. Introductory Digital Image Processing: A Remote Sensing Perspective, 4th ed.; Pearson Series in Geographic Information Science; Pearson: London, UK, 2015; ISBN 978-0-13-405816-0. [Google Scholar]
  246. Hill, D.J.; Babbar-Sebens, M. Promise of UAV-Assisted Adaptive Management of Water Resources Systems. J. Water Resour. Plan. Manag. 2019, 145, 02519001. [Google Scholar] [CrossRef]
  247. Dupuis, C.; Lejeune, P.; Michez, A.; Fayolle, A. How can remote sensing help monitor tropical moist forest degradation?-A systematic review. Remote Sens. 2020, 12, 1087. [Google Scholar] [CrossRef] [Green Version]
  248. Quintano, C.; Fernández-Manso, A.; Calvo, L.; Marcos, E.; Valbuena, L. Land surface temperature as potential indicator of burn severity in forest Mediterranean ecosystems. Int. J. Appl. Earth Obs. Geoinf. 2015, 36, 1–12. [Google Scholar] [CrossRef]
  249. Quintano, C.; Fernández-Manso, A.; Fernández-Manso, O.; Shimabukuro, Y.E. Mapping burned areas in Mediterranean countries using spectral mixture analysis from a uni-temporal perspective. Int. J. Remote Sens. 2006, 27, 645–662. [Google Scholar] [CrossRef]
  250. Gómez, C.; Alejandro, P.; Hermosilla, T.; Montes, F.; Pascual, C.; Ruiz, L.Á.; Álvarez-Taboada, F.; Tanase, M.A.; Valbuena, R. Remote sensing for the Spanish forests in the 21st century: A review of advances, needs, and opportunities. For. Syst. 2019, 28, 1–33. [Google Scholar] [CrossRef]
  251. Eugenio, F.C.; Schons, C.T.; Mallmann, C.L.; Schuh, M.S.; Fernandes, P.; Badin, T.L. Remotely piloted aircraft systems and forests: A global state of the art and future challenges. Can. J. For. Res. 2020, 50, 705–716. [Google Scholar] [CrossRef]
  252. Ghiyamat, A.; Shafri, H.Z.M. A Review on Hyperspectral Remote Sensing for Homogeneous and Heterogeneous Forest Biodiversity Assessment. Int. J. Remote Sens. 2010, 31, 1837–1856. [Google Scholar] [CrossRef]
  253. Hennessy, A.; Clarke, K.; Lewis, M. Hyperspectral Classification of Plants: A Review of Waveband Selection Generalisability. Remote Sens. 2020, 12, 113. [Google Scholar] [CrossRef] [Green Version]
  254. Upadhyay, V.; Kumar, A. Hyperspectral Remote Sensing of Forests: Technological Advancements, Opportunities and Challenges. Earth Sci. Inform. 2018, 11, 487–524. [Google Scholar] [CrossRef]
  255. Stone, C.; Mohammed, C. Application of Remote Sensing Technologies for Assessing Planted Forests Damaged by Insect Pests and Fungal Pathogens: A Review. Curr. For. Rep. 2017, 3, 75–92. [Google Scholar] [CrossRef]
  256. Schepaschenko, D.; See, L.; Lesiv, M.; Bastin, J.F.; Mollicone, D.; Tsendbazar, N.E.; Bastin, L.; McCallum, I.; Laso Bayas, J.C.; Baklanov, A.; et al. Recent Advances in Forest Observation with Visual Interpretation of Very High-Resolution Imagery. Surv. Geophys. 2019, 40, 839–862. [Google Scholar] [CrossRef] [Green Version]
  257. Lehmann, J.; Nieberding, F.; Prinz, T.; Knoth, C. Analysis of Unmanned Aerial System-Based CIR Images in Forestry—A New Perspective to Monitor Pest Infestation Levels. Forests 2015, 6, 594–612. [Google Scholar] [CrossRef] [Green Version]
  258. Lechner, A.M.; Foody, G.M.; Boyd, D.S. Applications in Remote Sensing to Forest Ecology and Management. One Earth 2020, 2, 405–412. [Google Scholar] [CrossRef]
  259. Coops, N.C. Characterizing forest growth and productivity using remotely sensed data. Curr. For. Rep. 2015, 1, 195–205. [Google Scholar] [CrossRef]
  260. Özcan, A.H.; Hisar, D.; Sayar, Y.; Ünsalan, C. Tree crown detection and delineation in satellite images using probabilistic voting. Remote Sens. Lett. 2017, 8, 761–770. [Google Scholar] [CrossRef]
  261. Hossain, M.D.; Chen, D. Segmentation for Object-Based Image Analysis (OBIA): A Review of Algorithms and Challenges from Remote Sensing Perspective. ISPRS J. Photogramm. Remote Sens. 2019, 150, 115–134. [Google Scholar] [CrossRef]
  262. Duarte, L.; Silva, P.; Teodoro, A. Development of a QGIS Plugin to Obtain Parameters and Elements of Plantation Trees and Vineyards with Aerial Photographs. ISPRS Int. J. Geo-Inf. 2018, 7, 109. [Google Scholar] [CrossRef] [Green Version]
  263. Zimudzi, E.; Sanders, I.; Rollings, N.; Omlin, C.W. Remote sensing of mangroves using unmanned aerial vehicles: Current state and future directions. J. Spat. Sci. 2019, 1–18. [Google Scholar] [CrossRef]
  264. Jain, P.; Coogan, S.C.P.; Subramanian, S.G.; Crowley, M.; Taylor, S.; Flannigan, M.D. A review of machine learning applications in wildfire science and management. Environ. Rev. 2020, 28, 478–505. [Google Scholar] [CrossRef]
  265. Vanden Borre, J.; Paelinckx, D.; Mücher, C.A.; Kooistra, L.; Haest, B.; De Blust, G.; Schmidt, A.M. Integrating remote sensing in Natura 2000 habitat monitoring: Prospects on the way forward. J. Nat. Conserv. 2011, 19, 116–125. [Google Scholar] [CrossRef]
  266. Kennedy, R.E.; Townsend, P.A.; Gross, J.E.; Cohen, W.B.; Bolstad, P.; Wang, Y.Q.; Adams, P. Remote sensing change detection tools for natural resource managers: Understanding concepts and tradeoffs in the design of landscape monitoring projects. Remote Sens. Environ. 2009, 113, 1382–1396. [Google Scholar] [CrossRef]
  267. Huylenbroeck, L.; Laslier, M.; Dufour, S.; Georges, B.; Lejeune, P.; Michez, A. Using remote sensing to characterize riparian vegetation: A review of available tools and perspectives for managers. J. Environ. Manag. 2020, 267, 110652. [Google Scholar] [CrossRef] [PubMed]
  268. Turner, W.; Rondinini, C.; Pettorelli, N.; Mora, B.; Leidner, A.K.; Szantoi, Z.; Buchanan, G.; Dech, S.; Dwyer, J.; Herold, M.; et al. Free and Open-Access Satellite Data Are Key to Biodiversity Conservation. Biol. Conserv. 2015, 182, 173–176. [Google Scholar] [CrossRef] [Green Version]
  269. Sá, C.; Grieco, J. Open Data for Science, Policy, and the Public Good. Rev. Policy Res. 2016, 33, 526–543. [Google Scholar] [CrossRef]
  270. Pádua, L.; Vanko, J.; Hruška, J.; Adão, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, sensors, and data processing in agroforestry: A review towards practical applications. Int. J. Remote Sens. 2017, 38, 2349–2391. [Google Scholar] [CrossRef]
  271. Chen, S.; McDermid, G.; Castilla, G.; Linke, J. Measuring Vegetation Height in Linear Disturbances in the Boreal Forest with UAV Photogrammetry. Remote Sens. 2017, 9, 1257. [Google Scholar] [CrossRef] [Green Version]
  272. Elliott, S. The potential for automating assisted natural regeneration of tropical forest ecosystems. Biotropica 2016, 48, 825–833. [Google Scholar] [CrossRef]
  273. Novikov, A.I.; Ersson, B.T. Aerial seeding of forests in Russia: A selected literature analysis. IOP Conf. Ser. Earth Environ. Sci. 2019, 226, 012051. [Google Scholar] [CrossRef]
  274. Leroy, B.M.L.; Gossner, M.M.; Lauer, F.P.M.; Petercord, R.; Seibold, S.; Jaworek, J.; Weisser, W.W. Assessing insecticide effects in forests: A tree-level approach using unmanned aerial vehicles. J. Econ. Entomol. 2019, 112, 2686–2694. [Google Scholar] [CrossRef] [PubMed]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
