Review

Prospects of Improving Agricultural and Water Productivity through Unmanned Aerial Vehicles

1 Water Research Commission of South Africa, 4 Daventry St, Lynnwood Manor, Pretoria 0081, South Africa
2 Centre for Transformative Agricultural and Food Systems, School of Agricultural, Earth and Environmental Sciences, University of KwaZulu-Natal (UKZN), Scottsville, Pietermaritzburg 3209, South Africa
3 International Water Management Institute (IWMI-SA), 141 Cresswell St, Weavind Park, Silverton, Pretoria 0184, South Africa
4 Geomatics Department, Tshwane University of Technology, Staatsartillerie Road, Pretoria 0001, South Africa
5 Agriculture Research Council Institute for Soil, Climate and Water (ARC-ISCW), Pretoria 0001, South Africa
6 Discipline of Agro-meteorology, School of Agricultural, Earth and Environmental Sciences, University of KwaZulu-Natal (UKZN), Scottsville, Pietermaritzburg 3209, South Africa
7 Department of Geography, Environmental Studies and Tourism, University of the Western Cape (UWC), Robert Sobukwe Road, Bellville, Cape Town 7535, South Africa
8 Discipline of Geography and Environmental Science, School of Agricultural, Earth and Environmental Sciences, University of KwaZulu-Natal (UKZN), Scottsville, Pietermaritzburg 3209, South Africa
* Authors to whom correspondence should be addressed.
Submission received: 24 May 2020 / Revised: 11 June 2020 / Accepted: 18 June 2020 / Published: 1 July 2020

Abstract

Unmanned Aerial Vehicles (UAVs) are an alternative to costly and time-consuming traditional methods for improving agricultural water management and crop productivity through the acquisition, processing and analysis of high-resolution spatial and temporal crop data at field scale. UAVs mounted with multispectral and thermal cameras facilitate the monitoring of crops throughout the growing cycle, allowing timely detection of and intervention in case of anomalies. The use of UAVs in smallholder agriculture is poised to ensure food security at household level and improve agricultural water management in developing countries. This review synthesises the use of UAVs in the smallholder agriculture sector in developing countries. It highlights the role of UAV-derived normalised difference vegetation index (NDVI) in assessing crop health, evapotranspiration, water stress and disaster risk reduction. The focus is on providing more accurate statistics on irrigated areas and crop water requirements, and on improving water productivity and crop yield. UAVs facilitate access to agro-meteorological information at field scale and in near real-time, information that is important for irrigation scheduling and other on-field decision-making. The technology improves smallholder agriculture by facilitating access to information on crop biophysical parameters in near real-time for improved preparedness and operational decision-making. Coupled with accurate meteorological data, the technology allows precise estimation of crop water requirements and crop evapotranspiration at high spatial resolution. Timely access to crop health information helps inform operational decisions at the farm level, thus enhancing rural livelihoods and wellbeing.

1. Introduction

Although agricultural production has increased substantially in recent years, the demand for agricultural products has also risen significantly, with aggregate agricultural consumption expected to increase by 69% by 2050 [1]. The increase in demand for food, fibre and feed is mostly fuelled by a surge in the global population, which is estimated to reach 9 billion by 2050 [2]. The need to feed a growing population has driven the intensification and extensification of agricultural land, with agriculture now consuming about 70% of the available freshwater resources worldwide [3]. One of the urgent challenges currently facing sub-Saharan Africa (SSA) is to increase crop yields while reducing the amount of water used in crop production, that is, ‘more crops per drop’ [4]. This is particularly urgent in smallholder farming, which experiences marginal production due to a range of biophysical and management-related factors [5]. In most arid and semi-arid regions, water is a scarce resource, making improved crop water productivity a necessity [6]. Improved agricultural water management requires innovative and evidence-based solutions applied throughout the whole agricultural value chain [7,8]. The use of Unmanned Aerial Vehicles (UAVs), also known as drones, offers advanced crop image analytics at high spatial and temporal resolution and crop monitoring in near real-time, important elements in agricultural water management [9].
Unmanned Aerial Vehicles are remotely controlled aircraft mounted with a Global Positioning System (GPS) and specialised thermal and multispectral sensors to collect geo-referenced, high-resolution images without cloud interference [10]. Their use at field or farm scale allows access to real-time agro-meteorological information and crop monitoring at different stages throughout the cropping cycle. The use of UAVs in smallholder agriculture, particularly in regions facing water scarcity, could prove worthwhile as they provide useful information for informing operational decisions at the farm level [11], thus helping to mitigate the risk of crop failure and low yields. Real-time monitoring of crops at field scale, which facilitates timely intervention throughout the growing cycle, results in improved crop and water productivity. Systematic crop health monitoring and the availability of agro-meteorological information in real-time enable smallholder farmers to make informed tactical and operational decisions, such as when to plant and irrigate, and when and where to apply nutrients and chemicals, among other important decisions [12]. The opportunities offered by UAVs bring climate-smart agriculture (CSA) and precision agriculture to smallholder farmers [13].
Sequential monitoring of crops allows farmers to detect subtle changes that are not easily identified by the human eye [14]. For example, multispectral drone imagery can be used to assess crop health through indices such as the Normalised Difference Vegetation Index (NDVI) or the Normalised Difference Red Edge (NDRE) index. NDVI, in particular, enables an analysis of the intensity of solar radiation absorption in specific bands, and therefore of the health condition of the monitored crops [15]. Traditionally, NDVI has been derived from imagery obtained from space-borne satellite sensors such as Landsat, Satellite Pour l’Observation de la Terre (SPOT) and the Moderate Resolution Imaging Spectroradiometer (MODIS), but the temporal and spatial resolutions of the resulting products are generally too low to provide accurate crop monitoring at field level, especially at the scale suitable for informing smallholder farmers. Unmanned Aerial Vehicles offer NDVI analysis of crops at 0.05 to 1 metre resolution, suitable for monitoring the condition of individual plants with optimal accuracy, and at a scale that fits the farm sizes of smallholder farmers. Multispectral cameras capture RGB and NIR bands from which indices such as NDVI and NDRE are derived, spectral capabilities that make UAVs suitable for monitoring crop health and density, water stress and weed detection [16]. When mounted with a thermal camera, UAVs are effective for measuring evapotranspiration (ET) and water stress [17].
Accurate and detailed geospatial drone data are also useful in disaster situations where crop damage occurs due to extreme weather events, as they allow the level of crop loss to be precisely estimated by comparing pre-disaster and post-disaster imagery [18]. The pre- and post-disaster information is important for insurance companies as they move towards insuring smallholder farmers against extreme weather events [19]. Drones can provide evidence on the state of damage to support the development of index-based crop insurance, thereby reducing risk and vulnerability while contributing towards achieving the 2030 Global Agenda on Sustainable Development [20].
Satellite and UAV imagery serve the same purpose but differ significantly in spatial and temporal resolution. The choice between the two is determined by the scale and accuracy the user requires [9,21]. The advantage of UAVs over space-borne sensors is that UAVs offer low-cost imaging at high spatial resolution and at user-determined revisit periods [22]. Thus, UAVs are appropriate for assessing rapid changes in crop phenology, stress and crop health in near real-time [23]. However, the use of UAVs could prove costly and challenging for smallholder farmers, who may lack the resources to own and maintain UAVs, as well as the know-how and capacity to operate them and to collect, process and interpret remotely acquired data. Collective or communal ownership and the use of extension services could therefore be an effective alternative to reach smallholder farmers with the technology at low cost.
The high spatial resolution offered by UAVs at low altitudes allows plant-level analyses and detection of anomalies that are not possible with satellite images taken at high altitudes. Although there are high-resolution satellites such as QuickBird, RapidEye, WorldView and IKONOS, as well as hyperspectral and Light Detection and Ranging (LiDAR) remote sensing, these are very expensive and generally time consuming to process [24]. The high altitudes and low temporal resolution of space-borne sensors make them suitable only over large areas where changes occur slowly [25], significantly limiting their use at field level. Although the high level of spectral detail in hyperspectral images provides better capabilities to detect crop anomalies, such images are complex to process due to the high number of bands [26]. Although LiDAR imaging could offer similar services to drones, it samples positions without RGB and thus only creates monochromatic datasets, which can be difficult to interpret [27]. Furthermore, high-altitude sensors tend to be susceptible to atmospheric energy attenuation and impurities. Meanwhile, UAVs can acquire images even on cloudy days with short flight preparation times, which was not previously possible [28]. Drones are reducing the cost of remote sensing as new and affordable models are introduced [29]. This review discusses the importance of UAVs in agricultural water management and crop health as an alternative for improving productivity, with a particular focus on smallholder agriculture. Within the context of this study, we define smallholder farmers as farmers cultivating plots of about two hectares or less, who generally grow crops for household consumption and are thus important for household food security [5].

2. Methods

Methodological Framework

Figure 1 is a graphical representation of the methodological framework developed to explore the importance of UAVs in enhancing agricultural water management, with a focus on smallholder agriculture. The increasing frequency and intensity of droughts, as well as rising temperatures that increase ET, are worsening water scarcity challenges in many regions [11,30]. In turn, water scarcity results in increased aridity and shifts in agro-ecological zones, thereby affecting crop yields [31]. The challenge of depleting water resources requires innovative technologies to improve water use efficiency and enhance crop productivity under water-limited conditions without increasing pressure on already strained water resources.
We identified five thematic areas where UAVs can improve agricultural water management and productivity: (i) mapping agricultural fields more accurately, (ii) assessing crop stress and health, (iii) crop yield modelling, (iv) modelling evapotranspiration (ET), and (v) estimating crop water productivity (Figure 1). The aim is to enhance agriculture and water productivity through UAVs in comparison to current remote sensing products and methods. We provide detailed information and procedures on how UAVs can improve agriculture and water productivity and improve food, nutritional and water security in the context of climate change.
This study offers detailed and practical insights, from multidisciplinary perspectives, on how UAVs enhance agricultural productivity and water management among smallholder farmers and improve rural livelihoods through enhancing adaptation capacity and building resilience. A detailed comparison between space-borne sensors and UAV sensed products is provided, focusing on how UAVs can improve the accuracy of remotely sensed products at field level. The study illustrates how several UAV applications could be deployed to promote precision farming and climate change adaptation among often-poor smallholder farmers. This group of farmers is characterised by low access to climate information and services as well as technologies that enhance productivity and adaptive capacities.

3. Results

3.1. Mapping Agriculture Fields Using UAVs

Accurate geospatial information on agriculture is a critical requirement for planning and decision-making, particularly when intending to increase and improve smallholder irrigated agriculture [32,33]. Most of the freely available space-borne satellite images have been used to produce coarse landuse/cover maps, but the advent of UAVs has improved mapping accuracy because of their high resolution, although over a smaller coverage [34]. The use of UAVs in mapping landuse makes it possible to monitor smallholder farming fields, which are generally too small to be detected by readily available moderate to low resolution satellite images [9]. Smallholder farming plots measure about two hectares in area per farmer [35].
Smallholder farming areas are generally detected as one contiguous agricultural area (Figure 2b) by low to moderate resolution satellites, yet their accurate mapping is important as, for example in southern Africa, they occupy about 80% of the cultivated land and contribute about 90% of the agricultural produce [36]. Therefore, mapping accurate and detailed agricultural fields is important for policy and decision making, especially for addressing the climate resilience of agricultural livelihoods. High resolution satellite images that could offer the same accuracy as UAVs are costly, which limits their use [37]. Unlike space-borne satellites, UAVs are not limited by cloud cover because the temporal resolution (acquisition time) is user-defined and can be adjusted to local weather conditions [21]. UAVs can be deployed repeatedly at flexible mission times and altitudes to acquire agricultural data. Images acquired by UAVs allow observation of single plants, patches and ultimately patterns over the fields, something that is not possible with space-borne satellite images [38]. These advantages, coupled with ultra-high spatial resolutions, make UAVs best suited for mapping crops planted in narrow rows at optimal accuracies. The main limiting factor of UAV imagery compared to satellite imagery is the small coverage per image.
As the resolution of UAVs can be as high as 0.05 metre depending on the flight altitude, they are appropriate for accurately mapping small agricultural fields. Figure 2 compares the accuracy of irrigated area maps derived from UAV imagery (Figure 2a) and satellite imagery (Figure 2b). The cultivated land is predominantly a centre pivot irrigated area (Figure 2a, derived from UAV imagery), yet Figure 2b represents the whole scheme area as cultivated. The irrigated area calculated from the UAV map is 40,647 ha, about 17,000 ha less than the 57,803 ha calculated from the satellite map. Thus, satellite images tend to overestimate cultivated areas. Although accuracy may improve with high-resolution space-borne sensors, there is always an overestimation, especially at field scale. Overestimating irrigated areas has the disadvantage of misinforming policy and decision-making. Accurate estimates are important for understanding the ecological footprint of food production and assessing the potential of irrigation development with limited land and water resources.
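As a hedged illustration of the area comparison above (not the authors' workflow), the short Python sketch below estimates irrigated area in hectares from a binary classification raster, assuming a GeoTIFF in a metric coordinate system produced from UAV imagery; the file name and class coding are hypothetical.

import numpy as np
import rasterio

# Minimal sketch: count irrigated pixels in a UAV-derived classification raster
# and convert the count to hectares. Assumes pixel value 1 = irrigated land.
with rasterio.open("uav_irrigated_mask.tif") as src:   # hypothetical file
    mask = src.read(1)                                 # first (and only) band
    pixel_area_m2 = src.res[0] * src.res[1]            # pixel width x height in metres

irrigated_ha = np.count_nonzero(mask == 1) * pixel_area_m2 / 10_000.0
print(f"Irrigated area: {irrigated_ha:,.0f} ha")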
The accuracy of UAVs in mapping agricultural fields is supported by previous studies. For instance, a study done in Córdoba and Seville in Spain illustrated the accuracy and capability of UAV remotely sensed data in characterising weeds between and within rows of sunflower and cotton crops across the growing season [39]. In a related study, de Castro et al. illustrated the utility of UAVs in discriminating Cynodon dactylon grass weeds in the vineyards of Lleida in Spain [40]. These studies discriminated the weeds from crops with overall classification accuracies ranging between 71% and 80%.

3.2. Assessing Crop Water Stress and Health Using UAV Derived Indices

Water Stress Indices (WSI) are useful crop parameters for mitigating drought impacts, as well as for irrigation scheduling. Remote sensing provides various products that are used to acquire ecological information from the interpretation and analysis of thermal, multispectral, and hyperspectral image bands. Reflectance of the electromagnetic radiation (EMR) on plants or vegetation differs depending on chlorophyll content, type of plant, sugar content and water content within the plant tissues [41]. Precise interpretation of spectral reflectance through high-resolution UAV images can reveal water and nutrient deficiencies, as well as information on plant health. During the growing process, a plant requires water, carbon dioxide and light for photosynthesis to occur, thereby producing sugar and oxygen [42]. Besides these requirements, plants also need nutrients for plant cell and tissue development [42]. Lack of these components leads to plant stress and the symptoms are mainly observed through the defoliation of older leaves and decrease in biomass. Modern agricultural UAV platforms can be mounted with both multispectral and thermal cameras to simultaneously acquire information on both crop health and water stress at high resolution.
One indicator of a healthy plant is the chlorophyll content in the leaves [43]. Chlorophyll strongly absorbs visible light, while healthy leaf structures reflect Near Infrared (NIR) radiation. Healthy plants with good photosynthetic activity can therefore be analysed by comparing the reflectance of NIR and visible light [42,44]. These plant characteristics are assessed through vegetation indices, which are mathematical transformations of image bands used to extract certain spectral properties qualitatively and quantitatively, such as vegetation cover, vigour and growth dynamics [44]. Vegetation indices are tailor-made for specific applications, and each has its own advantages. Thus, vegetation indices can be used to enhance classification and assessment algorithms for plant health [45]. Vegetation indices also provide information on plant growth, as healthy plants absorb more visible light and reflect more near-infrared (NIR) [46]. Many vegetation indices use the NIR and Red bands, including the Normalised Difference Vegetation Index (NDVI), the Soil Adjusted Vegetation Index (SAVI), the Ashburn Vegetation Index (AVI) and the Enhanced Vegetation Index (EVI), among others [44]. The most commonly used vegetation index is the NDVI (Equation (1)), which is directly used to monitor and characterise canopy growth and plant vigour [44]. The NDVI is expressed as [47]:
NDVI = (NIR − Red) / (NIR + Red)    (1)
where NIR is the Near Infrared band and Red is the red band.
The NDVI is important for providing information on the variability of crop health, as well as for large-scale monitoring of plantations, assessing changes in the field, quantifying crop acreage and analysing crop loss [48]. It is also used to estimate plant attributes such as physiological status, yield production, crop distribution and irrigation mapping [9,49,50]. Another modification of the NDVI used to assess changes in plant health is the Soil Adjusted Vegetation Index (SAVI) (Equation (2)) [51]. SAVI takes into consideration variations in soil properties. It minimises the effect of soil brightness by introducing a soil calibration factor into the NDVI equation (Equation (1)) to account for first-order soil-vegetation optical interactions. SAVI is defined as [52]:
SAVI = ((NIR − Red) / (NIR + Red + L)) × (1 + L)    (2)
where L is a soil adjustment constant, selected according to vegetation density, that serves as a surrogate for the leaf area index (LAI) [53].
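To make Equations (1) and (2) concrete, the Python sketch below computes NDVI and SAVI from co-registered Red and NIR reflectance arrays, as would be extracted from a calibrated multispectral UAV orthomosaic; the reflectance values and the soil adjustment factor L = 0.5 are assumptions for illustration only.

import numpy as np

# Illustrative surface reflectance (0-1) for the Red and NIR bands.
red = np.array([[0.08, 0.12], [0.30, 0.10]])
nir = np.array([[0.45, 0.40], [0.32, 0.50]])

ndvi = (nir - red) / (nir + red)                  # Equation (1)

L = 0.5                                           # soil adjustment factor (assumed default)
savi = ((nir - red) / (nir + red + L)) * (1 + L)  # Equation (2)

print(ndvi)
print(savi)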
The crop water stress index (CWSI) is another index for assessing the level of stress in a plant, derived from canopy temperature extracted from the thermal band, as the CWSI is correlated with transpiration rate and soil moisture [54,55]. High resolution UAV-derived NDVI can be combined with other indices such as the CWSI and the Canopy Chlorophyll Content Index (CCCI) to accurately delineate agricultural fields and monitor crop health [56].
An analysis of crop water deficit or water stress at field level using high-resolution UAV-derived indices is the basis for effective irrigation scheduling [57]. The CWSI identifies water stress in crops 24–48 h before it can be detected by visual observation [57]. In most cases, the accuracy of space-borne remote sensing water stress techniques has been hampered by their low spatio-temporal resolution, yet the advent of thermal and multispectral UAVs has transformed and enhanced the accuracy of crop water stress estimates due to their high resolution [58]. For example, a study done in China successfully mapped the water stress status of maize at farm level to inform precision irrigation as a substitute for CWSI modelling [59].
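As a hedged sketch of how a UAV thermal mosaic could be turned into a stress map, the example below applies the widely used empirical CWSI formulation, CWSI = (Tc − Twet)/(Tdry − Twet), where Twet and Tdry are the canopy temperatures of a well-watered and a non-transpiring reference, respectively; the temperatures and baselines are hypothetical, and the studies cited above may have used different formulations.

import numpy as np

# Per-pixel canopy temperature from a UAV thermal mosaic (deg C, illustrative).
canopy_temp = np.array([[28.5, 31.2], [34.0, 29.8]])
t_wet = 26.0   # well-watered (fully transpiring) baseline, assumed
t_dry = 38.0   # non-transpiring (dry) baseline, assumed

cwsi = (canopy_temp - t_wet) / (t_dry - t_wet)
cwsi = np.clip(cwsi, 0.0, 1.0)   # 0 = unstressed, 1 = fully stressed
print(cwsi)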

3.3. Estimating Crop Yield through UAVs

The use of high-resolution satellite images to assess crop vigour and yield has generally been limited by high costs, particularly of hyperspectral images [60]. Other limitations of space-borne images in crop yield estimation include cloud noise and the complex and heterogeneous nature of farming systems in smallholder farming areas, which are difficult to detect with low resolution images. The advent of UAVs has bridged the gap between space-borne satellites and the use of remotely sensed products in smallholder farmlands on the one hand, and costly, labour-intensive and time-consuming conventional field surveys of crops on the other [10]. Thus, the availability of low-cost UAVs has opened new possibilities to remotely sense crop status and yields, even on complex smallholder farms, with improved accuracy.
High resolution multi-temporal UAV images have transformed the monitoring of crop development, providing crucial planning information to farmers. UAV-sensed images provide accurate information on crop biomass, yield and cropped area, enabling farmers and decision makers to effectively manage and monitor crops for optimum benefit [61,62]. Studies have shown that there is a high correlation between vegetation spectral indices extracted from satellite images and green biomass and yield [63,64]. Therefore, combining vegetation spectral indices and green biomass is important for estimating yield before harvesting. Crop yield is defined as crop production per unit area and is a product of the complex interaction between soil conditions (physical and chemical), management (cultivar and spacing) and meteorological conditions (water and thermal) [4]. Crop yield estimation enhances preparedness, as it is part of early warning, providing decision makers with timely information on crop deficit or surplus [11].
High-resolution remotely sensed data have become an alternative to traditional methods for estimating crop yield, as they are more accurate and more cost- and time-effective. Traditional methods are generally costly, time consuming and prone to errors, which often result in poor crop yield assessment [65]. The three remote sensing approaches for estimating crop yield are those based on (i) empirical statistical models, (ii) water consumption balance models and (iii) biomass estimation models [66,67]. UAVs provide more accurate information on crop height and biomass timeously and at user-defined temporal resolution. Crop height and biomass are important components for assessing the growth rate and health of crops [68,69]. Plant height and biomass data are also important for assessing the effect of genetic variation in crops, crop development and yield potential [70]. The two components are essential for optimising site-specific crop management and yield predictions [61].
The most widely used model for estimating crop yield is the Monteith equation (Equation (3)), which estimates the accumulation of biomass as a proportion of accumulated absorbed photosynthetically active radiation (APAR) [71]. According to Monteith, crop yield is expressed as [71]:
CY = ε × Σ APAR(t)    (kg/m2)    (3)
where CY is the crop yield, that is, the biomass (kg/m2) accumulated over period t; ε (in g/MJ) is the light use efficiency; t is the period over which accumulation takes place; and APAR is the absorbed photosynthetically active radiation.
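A minimal sketch of Equation (3), assuming daily incident PAR and a canopy fraction of absorbed PAR (as could be estimated from UAV NDVI) over a hypothetical 120-day season; the light use efficiency and radiation values are illustrative, not measured.

import numpy as np

epsilon = 1.5                          # light use efficiency, g dry matter per MJ APAR (assumed)
days = 120                             # length of the accumulation period, days
par = np.full(days, 10.0)              # incident PAR, MJ/m2/day (assumed constant)
fapar = np.linspace(0.1, 0.8, days)    # fraction of PAR absorbed by the canopy (illustrative)

apar = fapar * par                     # absorbed PAR per day, MJ/m2/day
biomass_kg_m2 = epsilon * apar.sum() / 1000.0  # accumulated dry biomass, kg/m2
print(f"Accumulated biomass: {biomass_kg_m2:.2f} kg/m2")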
Variability in light use efficiency (ε) in plants is caused by varying nutrient and water levels [72]. Studies show that when crops are not water stressed and temperature is optimal, ε is a relatively constant property of plants [73,74,75]. UAVs mounted with multispectral sensors produce more accurate and reliable biomass and light use efficiency indices for estimating crop yields. UAVs are currently being applied in China to estimate rice yields [76].

3.4. Modelling Crop Evapotranspiration (ET) Using UAVs

Crop ET, or crop water use, is the largest water loss from agriculture, and its accurate quantification is critical for improved agricultural water management, irrigation scheduling and knowledge of crop water requirements [77]. Estimating ET has become an active research area in recent years due to the challenges of water scarcity and climate change. Evapotranspiration has historically been estimated using various field techniques such as the soil water balance, weighing lysimetry, Bowen ratio, eddy covariance and surface renewal. Most of these techniques rely on theoretical derivations and major assumptions [78]. In addition, they represent point measurements and require footprint estimates to determine the surface area that the measurement represents [78]. Scintillometry has recently overcome these difficulties to some extent by estimating ET over wider, heterogeneous landscapes, but it still relies on the other components of the energy balance, which are difficult to measure directly in heterogeneous landscapes [79].
With the advent of satellite sensors and multispectral imagery of the earth’s surface, remote sensing has evolved as a technique that allows ET to be measured more efficiently and economically on a large spatial scale or field scale using the shortened energy balance equation [80,81]. A number of models have been developed that include the Surface Energy Balance Index (SEBI), the Surface Energy Balance Algorithm for Land (SEBAL), the Mapping Evapotranspiration at high Resolution with Internalized Calibration (METRIC) and Surface Energy Balance System (SEBS). These models vary in their specific methodologies, but generally use the visible, red, and infrared bands of the earth surface reflectance together with terrain and vegetation properties to determine ET, which is extrapolated between consecutive satellite overpasses using representative meteorological data.
SEBI relies on the difference between the dry and wet limits that are used to derive pixel-by-pixel ET from the relative evaporative fraction [82]. SEBAL uses visible, near-infrared and thermal-infrared reflectance to determine the instantaneous fluxes of the shortened energy balance equation [83]. The SEBS model estimates the atmospheric turbulent fluxes and evaporative fraction by using remote sensing products and observed data [84]. The SEBS model requires the following input data: land surface albedo, emissivity, temperature, fractional vegetation cover, leaf area index and the height of the vegetation cover, as well as air pressure, temperature, relative humidity and wind speed at a reference height [84]. It also requires longwave and shortwave downward radiation as input data.
The METRIC algorithm is a successor of the SEBAL model, but with slope and aspect influences included [85,86]. The algorithm uses weather data (air temperature, wind speed, solar radiation and relative humidity) and satellite radiance data at various bands as inputs. Previous studies have shown that the METRIC algorithm (Equation (4)) produces more accurate results than other ET models [87,88,89]. The latent energy flux is calculated as the residual of the surface energy balance equation, expressed as [86]:
LE = Rn − G − H    (4)
where LE is the latent energy flux density consumed by ET, Rn is net radiation, G is the soil heat flux density and H is the sensible heat flux density (the energy exchanged in heating or cooling the air); all units are in W/m2.
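A hedged sketch of the energy balance residual in Equation (4), not the full METRIC workflow: given per-pixel net radiation, soil heat flux and sensible heat flux in W/m2, it computes LE and converts it to an instantaneous ET rate using the latent heat of vaporisation; all flux values below are illustrative.

import numpy as np

rn = np.array([[620.0, 600.0], [580.0, 640.0]])  # net radiation, W/m2 (illustrative)
g = 0.1 * rn                                     # soil heat flux, here assumed to be 10% of Rn
h = np.array([[180.0, 220.0], [260.0, 150.0]])   # sensible heat flux, W/m2 (illustrative)

le = rn - g - h                                  # Equation (4): latent energy flux, W/m2
lambda_v = 2.45e6                                # latent heat of vaporisation, J/kg (~20 deg C)
et_mm_per_hour = le * 3600.0 / lambda_v          # 1 kg of water over 1 m2 equals 1 mm depth
print(et_mm_per_hour)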
Initially, the METRIC model was developed using Landsat satellite imagery, but with the advent of high-resolution UAV imagery there is potential to further improve the accuracy of ET estimation [90]. The METRIC-UAV approach uses the thermal band, which is sensed directly by the UAV's thermal camera. The thermal band is used to calculate the surface energy balance, and the thermal information is used to estimate the sensible heat flux density (H), the soil heat flux density (G) and net radiation (Rn). For Rn, surface temperature is used to estimate longwave thermal emission by the surface. The approach is similar to the METRIC model developed by Allen et al. [86], but uses high resolution multispectral UAV imagery.
High-resolution UAVs equipped with both multispectral and thermal sensors are best positioned for estimating ET more accurately, as they capture both types of images at the same time [17,90]. Many of the problems associated with satellite data used in ET models, such as coarse resolution, fixed satellite overpass times and cloud cover, are addressed with the use of UAVs. ET changes rapidly according to microclimatic variables and soil water availability. The ability to determine ET at variable, or on-demand, intervals using UAVs is therefore a significant benefit over satellites, and studies have shown the potential for UAVs to become useful tools in monitoring crop water use [59,90,91].

3.5. Use of UAVs in Estimating Crop Water Productivity

The use of UAVs is showing its worth by providing previously unavailable, spatially explicit information required to understand and improve crop and water productivity. Improvements in agricultural water management and, particularly, crop water productivity allow the agricultural sector to share water equitably with other competing sectors. Water productivity (WP) (Equation (5)) is a quantitative term that refers to the relationship between the volume of water used in crop production and the amount of crop produced, expressed in kg/m3 [92,93]. Thus, crop WP is a measure of output from a given agricultural system in relation to the water it consumes and is expressed as [94]:
WP = Agricultural benefit / Water consumed    (5)
where the agricultural benefit is the actual harvested yield; WP can be expressed in units of mass (kg/m3), as the monetary value (income) of that yield (US$/m3), or as its nutritional value (kcal/m3).
Water consumed (the denominator of Equation (5)) refers to water that is directly consumed by crops [95]. WP is critical in understanding food and water relations while offering a basis for the assessment of water use efficiency and the water footprint embedded in global food trade [93]. Thus, WP can be determined efficiently through accurate measurements of crop evapotranspiration (ETc). There are other methods used to estimate WP, such as using the quantity of water supplied to a field, but not all water supplied to a field is consumed by crops, as part of it always finds its way into the drainage system or evaporates [8]. Water that is directly consumed by crops is efficiently measured through actual or crop ET (ETc) [8]. As already alluded to, ET is a better measure of the water that is consumed by crops. Crop ET is the consumption of water through ET, which is incorporated into a product and cannot be readily reused [96].
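A worked example of Equation (5) under assumed values: a harvested yield of 5000 kg/ha and a seasonal ETc of 450 mm, noting that 1 mm of ET over one hectare corresponds to 10 m3 of water.

yield_kg_per_ha = 5000.0                 # harvested yield, kg/ha (assumed)
seasonal_etc_mm = 450.0                  # seasonal crop evapotranspiration, mm (assumed)

water_consumed_m3_per_ha = seasonal_etc_mm * 10.0   # 1 mm over 1 ha = 10 m3
wp_kg_per_m3 = yield_kg_per_ha / water_consumed_m3_per_ha
print(f"Crop water productivity: {wp_kg_per_m3:.2f} kg/m3")   # ~1.11 kg/m3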
The high-resolution images from UAVs are vital for mapping accurate agricultural fields (Figure 2). The exclusion of other landuses provides accurate estimates of ETc. UAVs could provide adequate spatial and temporal detail for estimating ET, ultimately offering a plausible understanding of WP [97]. Land surface heat flux models, together with UAV-measured land surface temperature (LST) at high resolution, are used to estimate ET. A UAV mounted with a thermal camera is flown over a field and the thermal images are concatenated into mosaics of LST, which are an input for the Two-Source Energy Balance (TSEB) modelling scheme and are used to partition the fluxes into soil and canopy components (soil evaporation and canopy transpiration) [98].
Thus, UAV images can be used to build high spatial resolution ET maps of up to 1 m. More accurate ETc estimates are derived from ET maps developed using data acquired by UAVs at field scale. The UAV-derived ET is then compared with measured ET to assess the accuracy of the modelled ET. Existing ET datasets have presented challenges when estimating crop ET because of their low resolution relative to small agricultural fields. For example, the ET data derived from the Global Land Evaporation Amsterdam Model (GLEAM) have a spatial resolution of 25 km, while those derived from MODIS (MOD16) and the Satellite Application Facility on Land Surface Analysis (LSA-SAF), at a spatial resolution of 1 km, are too coarse for characterising ETc and WP in smallholder plots of about two hectares in area [35]. Although there are improvements and promise with the Water Productivity Open-access Portal (WaPOR) dataset, the product currently uses three resolutions (250 m, 100 m and 30 m) depending on the region [99]. Also, studies have shown that most of these ET datasets tend to either overestimate or underestimate ET [100,101].

3.6. Other Uses of UAVs in Agriculture

Another important use of UAV technology in agriculture is in disaster risk reduction assessments. Twenty-two percent of the economic damage caused by natural disasters occurs in the agricultural sector, often resulting in yield reductions of about 20–40% every year [102]. Most of the disasters that affect agriculture are climate-induced, including hailstorms, fires caused by heatwaves, cyclonic floods and winds, as well as droughts. The impact of these disasters can be reduced by systematically applying disaster risk reduction practices [103]. For example, UAVs could play an important role in assessing pre-disaster conditions, immediate impacts after the occurrence of a disaster, and post-disaster analysis. Coupled with satellite images, which do not depend on ground infrastructure, UAVs can be used to develop index-based weather insurance that targets smallholder farmers [104].
The Food and Agriculture Organisation (FAO) has used drones in the Philippines as part of efforts to reduce the impacts of drought and flooding [24]. UAVs mounted with multispectral cameras can relay information on upland agricultural risks such as landslides and erosion, informing agricultural communities about the risks and helping to reduce the impacts. As already alluded to, remotely sensed data from drones are important for estimating crop yield to provide precise warning on the food situation [105]. Such information gives decision-makers enough lead time to prepare.

4. Discussion

4.1. UAV Image Processing and Interpretation

Most agricultural UAVs come with software to process and interpret drone images, in addition to the existing remote sensing software packages used in image calibration, processing and interpretation. Existing remote sensing packages can be used to align multispectral UAV images so that the bands match in a single readable file [28]. In cases where the study area is large and a sequence of images needs to be collected in each flight mission to cover the whole study area, image orthorectification and mosaicking are achieved by using rational polynomial coefficients (RPCs) or the image geometric correction tools that come with image processing software [106].
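As a minimal, hedged sketch of the mosaicking step (independent of any vendor software), the example below stitches a set of already-orthorectified UAV tiles into a single GeoTIFF using the open-source rasterio package; the tile and output file names are hypothetical.

import rasterio
from rasterio.merge import merge

tile_paths = ["tile_01.tif", "tile_02.tif", "tile_03.tif"]   # hypothetical orthorectified tiles
sources = [rasterio.open(p) for p in tile_paths]

mosaic, transform = merge(sources)        # stitch overlapping tiles into one array
profile = sources[0].profile.copy()
profile.update(height=mosaic.shape[1], width=mosaic.shape[2], transform=transform)

with rasterio.open("field_mosaic.tif", "w", **profile) as dst:
    dst.write(mosaic)

for src in sources:
    src.close()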

4.2. Cost-Benefit Analyses for Using UAV vs. Traditional Satellite Images

The main difference between space-borne and UAV data is in the spatio-temporal resolution, which determines the accuracy of the products derived from them. Space-borne images are generally characterised by a coarse ground sampling distance and a low ground resolved distance, which makes them less accurate in characterising phenomena on the landscape when compared to the ultra-fine spatial resolution data from UAV platforms [107]. Although WorldView and GeoEye images could equally provide high-resolution data, the acquisition of visible and analysable scenes may be less frequent and is highly affected by cloud cover, characteristics that limit their ability to capture specific plant phenological phases due to fixed acquisition times [108]. Meanwhile, UAVs offer user-defined temporal resolution at low operational costs [109]. The hyper-spatial resolution (of up to a centimetre) provided by UAVs facilitates a comprehensive characterisation of the spatial heterogeneity of agricultural landscapes at field level. WorldView-3, the space-borne satellite offering the closest accuracy, has a spatial resolution of 30 cm and is very costly [109]. However, the short flight endurance and limited payload of UAVs remain their most glaring weakness for application in agriculture [109].
UAVs have been gradually improving in terms of their sensor technologies alongside the improvement of space-borne sensors. They can now carry multiple active and passive sensors across the electromagnetic spectrum to derive data that could be used to calibrate and validate satellite products. They have become useful field instruments, particularly in areas that are inaccessible [107]. Whereas space-borne sensors are highly susceptible to atmospheric impurities and weather conditions because they operate at high altitudes, which affects the quality of the detected signal, UAV missions are user-defined and can be adjusted to avoid bad weather conditions. Generally, space-borne sensors measure top-of-atmosphere reflectance, which must be calibrated to surface reflectance, whereas UAVs measure surface reflectance directly due to their low-altitude flights. For farm-level synoptic monitoring capabilities and applications, UAVs are the most suitable platform, whereas space-borne platforms are sufficient and suitable for landscape-scale applications [10].
The choice between space-borne and UAV data depends on the objectives, scale and accuracy the user needs, as well as the available financial resources. High resolution satellite images are costly, while the cost of UAV images continues to fall [22]. The main disadvantage of UAV images, however, is their limited coverage. Further research is required to explore the advantages of fusing UAV and satellite images, although recent studies indicate that the two platforms can complement each other [110].
Based on the value of UAVs in transforming smallholder agriculture, the following points are highlighted:
As individual ownership of UAVs by smallholder farmers, as well as of the software required for pre-processing the data, could be beyond the reach of many because of limited financial resources, communal ownership could be an option, particularly for irrigation schemes. The UAVs could be operated by extension officers trained to fly them, pre-process and analyse the data, and pass the information on to smallholder farmers. Within the context of Africa, the use of drones provides a unique opportunity to involve youth in agriculture as drone pilots and as processors of the data who provide a service to farmers.
In most cases, agricultural UAVs come equipped with the relevant image processing software, which receives support and updates from the manufacturers. This is an important advantage of UAVs over space-borne remote sensing, as the cost of image processing software is included in the purchase of the hardware. Space-borne remote sensing requires image processing software that is acquired separately from vendors who are not the sensor manufacturers.
Data storage for both space-borne and UAV data has been made easier in recent years by the advent of high-end computer systems, cloud data storage and improved internet connectivity [111]. Cloud-based platforms allow many users to interact with the drone data simultaneously, manipulating or retrieving information at the same time. These cloud-based data storage platforms continue to become more affordable [111].
The availability of thermal and multispectral UAV images acquired at the same time is enhancing the development of more accurate ET datasets. Existing satellite-derived ET datasets are generally of coarse resolution, which makes them unsuitable at field scale.
The use of thermal and multispectral UAVs is revolutionising smallholder agriculture by tackling agricultural challenges and other tasks collectively, thereby bringing precision agriculture to previously disadvantaged farming households.
With limited land and water resources available for agricultural expansion, UAVs can turn smallholder farms that currently lack technology into smart farmlands by inspecting crops and generating data within a short space of time and at low cost, and by surveying fields in near real-time to enable precise application of inputs and irrigation scheduling [112]. Three niche areas of UAV application that allow farms to be converted into small but effective smart enterprises are: scouting for problems, monitoring to prevent yield losses, and planning crop management operations [113].
The impact of extreme weather events on smallholder agriculture demands urgent insurance mechanisms to enhance resilience to climate change. The high accuracy of UAV images and their user-defined temporal resolution make them suitable for developing precise index-based crop insurance for the benefit of both smallholder farmers and insurers.

5. Conclusions

Unmanned aerial vehicles are fast becoming key components of agricultural research and industry as an important source of previously unavailable agro-meteorological data at field scale. They offer opportunities for mainstreaming climate-smart and precision agriculture into smallholder farming through improved crop health monitoring and agricultural water management, as they provide high-resolution images acquired at user-defined temporal resolution and at low altitudes, sufficient to effectively monitor crops in near real-time. To date, their use in smallholder farming has been limited by a lack of resources and skills to acquire and operate UAVs; the perception that they are expensive has failed to consider the benefits that would be unlocked through their use in smallholder agriculture. While noting the high costs of ownership, our review recommends communal ownership of UAVs by smallholder farmers to reduce operational costs, as UAVs have the potential to improve agriculture and water productivity. Specifically, we recommend that drones be targeted as an opportunity to increase youth participation in agriculture, which is a priority for most African countries.
The application of UAVs in smallholder agriculture would extend the benefits of remote sensing to previously disadvantaged smallholder farmers, not only by providing high resolution images at user-defined temporal resolution, but also by automating data collection, processing and analysis at low cost. This improves mapping accuracy, stress classification, irrigation scheduling and yield prediction for small croplands. UAV remote sensing can transform smallholder agriculture by improving the mapping of small croplands, assessing crop water and nutrient deficiencies, improving crop water productivity and crop yield estimation, and improving crop evapotranspiration estimation with high accuracy. Applications of UAVs in smallholder farming could significantly improve input efficiency, environmental sustainability and the nutrition of farmers, as well as farm income and livelihoods.
Although UAVs have the disadvantage of low coverage, this could turn out to be an important attribute for smallholder farmers, as they can monitor the state of individual crops in real-time and act before damage spreads, an effective way to maximise crop yield and quality. Water stress indices are not only important for irrigation scheduling, but also for drought assessment and mitigation. As a result, water stress indices such as the CWSI and the Water Deficit Index (WDI) provide a better option for agricultural water management and climate change adaptation. UAVs can improve the farming practices of smallholder farmers and, hence, in the long term, could contribute to climate change adaptation and building resilience.

Author Contributions

L.N., conceptualization, methodology, original draft preparation, review and editing; J.M., methodology, writing original draft, reviewing and editing; A.N., A.D.C., M.S., V.G.P.C., writing sections of the manuscript, reviewing and editing; T.M., conceptualization, resources, supervision and critical review. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Water Research Commission of South Africa (WRC) Project No WRC K5/2971//4 on “Use of drones in monitoring crop health, water stress, crop water requirements and improve on crop water productivity to enhance precision agriculture and irrigation scheduling”. This work is based on the research supported in part by the National Research Foundation of South Africa (Grant Number: 120412).

Acknowledgments

The authors would like to recognize the support of the uMngeni Resilience Project (URP), which is funded by the Adaptation Fund. The authors would also like to acknowledge the International Water Management Institute (IWMI) for the use of its drone.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alexandratos, N.; Bruinsma, J. World Agriculture towards 2030/2050: The 2012 Revision; ESA Working paper No. 12-03; FAO: Rome, Italy, 2012. [Google Scholar]
  2. Godfray, H.C.J.; Beddington, J.R.; Crute, I.R.; Haddad, L.; Lawrence, D.; Muir, J.F.; Pretty, J.; Robinson, S.; Thomas, S.M.; Toulmin, C.; et al. Food security: The challenge of feeding 9 billion people. Science 2010, 327, 812–818. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Cosgrove, W.J.; Loucks, D.P. Water management: Current and future challenges and research directions. Water Resour. Res. 2015, 51, 4823–4839. [Google Scholar] [CrossRef] [Green Version]
  4. Fan, M.; Shen, J.; Yuan, L.; Jiang, R.; Chen, X.; Davies, W.J.; Zhang, F. Improving crop productivity and resource use efficiency to ensure food security and environmental quality in China. J. Exp. Bot. 2011, 63, 13–24. [Google Scholar] [CrossRef] [PubMed]
  5. Mungai, L.M.; Snapp, S.; Messina, J.P.; Chikowo, R.; Smith, A.; Anders, E.; Richardson, R.B.; Li, G. Smallholder farms and the potential for sustainable intensification. Front. Plant Sci. 2016, 7, 1720. [Google Scholar] [CrossRef] [Green Version]
  6. Du, T.; Kang, S.; Zhang, J.; Davies, W.J. Deficit irrigation and sustainable water-resource strategies in agriculture for China’s food security. J. Exp. Bot. 2015, 66, 2253–2269. [Google Scholar] [CrossRef]
  7. Levidow, L.; Zaccaria, D.; Maia, R.; Vivas, E.; Todorovic, M.; Scardigno, A. Improving water-efficient irrigation: Prospects and difficulties of innovative practices. Agric. Water Manag. 2014, 146, 84–94. [Google Scholar] [CrossRef] [Green Version]
  8. Nhamo, L.; Mabhaudhi, T.; Magombeyi, M. Improving water sustainability and food security through increased crop water productivity in Malawi. Water 2016, 8, 411. [Google Scholar] [CrossRef]
  9. Nhamo, L.; van Dijk, R.; Magidi, J.; Wiberg, D.; Tshikolomo, K. Improving the accuracy of remotely sensed irrigated areas using post-classification enhancement through UAV capability. Remote Sens. 2018, 10, 712. [Google Scholar] [CrossRef] [Green Version]
  10. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef]
  11. Nhamo, L.; Mabhaudhi, T.; Modi, A. Preparedness or repeated short-term relief aid? Building drought resilience through early warning in southern Africa. Water SA 2019, 45, 20. [Google Scholar] [CrossRef] [Green Version]
  12. Jones, J.W.; Antle, J.M.; Basso, B.; Boote, K.J.; Conant, R.T.; Foster, I.; Godfray, H.C.J.; Herrero, M.; Howitt, R.E.; Janssen, S.; et al. Toward a new generation of agricultural system data, models, and knowledge products: State of agricultural systems science. Agric. Syst. 2017, 155, 269–288. [Google Scholar] [CrossRef] [PubMed]
  13. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
  14. Pongnumkul, S.; Chaovalit, P.; Surasvadi, N. Applications of smartphone-based sensors in agriculture: A systematic review of research. J. Sens. 2015. [Google Scholar] [CrossRef] [Green Version]
  15. Ishihara, M.; Inoue, Y.; Ono, K.; Shimizu, M.; Matsuura, S. The impact of sunlight conditions on the consistency of vegetation indices in croplands—Effective usage of vegetation indices from continuous ground-based spectral measurements. Remote Sens. 2015, 7, 14079–14098. [Google Scholar] [CrossRef] [Green Version]
  16. Barbedo, J.G.A. A review on the use of unmanned aerial vehicles and imaging sensors for monitoring and assessing plant stresses. Drones 2019, 3, 40. [Google Scholar] [CrossRef] [Green Version]
  17. Hoffmann, H.; Nieto, H.; Jensen, R.; Guzinski, R.; Zarco-Tejada, P.; Friborg, T. Estimating evaporation with thermal UAV data and two-source energy balance models. Hydrol. Earth Syst. Sci. 2016, 20, 697–713. [Google Scholar] [CrossRef] [Green Version]
  18. Chou, T.-Y.; Yeh, M.-L.; Chen, Y.C.; Chen, Y.H. Disaster monitoring and management by the Unmanned Aerial Vehicle technology. In Proceedings of the ISPRS TC VII Symposium—100 Years ISPRS, Vienna, Austria, 5–7 July 2010. [Google Scholar]
  19. Greatrex, H.; Hansen, J.; Garvin, S.; Diro, R.; Le Guen, M.; Blakeley, S.; Rao, K.; Osgood, D. Scaling Up Index Insurance for Smallholder Farmers: Recent Evidence and Insights; 1904–9005; CGIAR Research Program on Climate Change, Agriculture and Food Security (CCAFS): Copenhagen, Denmark, 2015; p. 32. [Google Scholar]
  20. Carter, M.; de Janvry, A.; Sadoulet, E.; Sarris, A. Index-Based Weather Insurance for Developing Countries: A Review of Evidence and a Set of Propositions for Up-Scaling; Fondation pour les Études et Recherches sur le Développement International: Paris, France, 2014; p. 42. [Google Scholar]
  21. Ruwaimana, M.; Satyanarayana, B.; Otero, V.; Muslim, A.M.; Syafiq, M.; Ibrahim, S.; Raymaekers, D.; Koedam, N.; Dahdouh-Guebas, F. The advantages of using drones over space-borne imagery in the mapping of mangrove forests. PLoS ONE 2018, 13, e0200288. [Google Scholar] [CrossRef] [Green Version]
  22. Manfreda, S.; McCabe, M.; Miller, P.; Lucas, R.; Pajuelo Madrigal, V.; Mallinis, G.; Ben Dor, E.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the use of unmanned aerial systems for environmental monitoring. Remote Sens. 2018, 10, 641. [Google Scholar] [CrossRef] [Green Version]
  23. Ballesteros, R.; Ortega, J.; Hernández, D.; Moreno, M. Applications of georeferenced high-resolution images obtained with unmanned aerial vehicles. Part I: Description of image acquisition and processing. Precis. Agric. 2014, 15, 579–592. [Google Scholar] [CrossRef]
  24. Sylvester, G. E-Agriculture in Action: Drones for Agriculture; Food and Agriculture Organization of the United Nations (FAO): Rome, Italy; International Telecommunication Union (ITU): Bangkok, Thailand, 2018; p. 126. [Google Scholar]
  25. Huang, C.; Chen, Y.; Zhang, S.; Wu, J. Detecting, extracting, and monitoring surface water from space using optical sensors: A review. Rev. Geophys. 2018, 56, 333–360. [Google Scholar] [CrossRef]
  26. Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P.J. Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: A review of sensor technology, measurement procedures, and data correction workflows. Remote Sens. 2018, 10, 1091. [Google Scholar] [CrossRef] [Green Version]
  27. Sevara, C.; Wieser, M.; Doneus, M.; Pfeifer, N. Relative radiometric calibration of airborne LiDAR data for archaeological applications. Remote Sens. 2019, 11, 945. [Google Scholar] [CrossRef] [Green Version]
  28. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
  29. Salamí, E.; Barrado, C.; Pastor, E. UAV flight experiments applied to the remote sensing of vegetated areas. Remote Sens. 2014, 6, 11051–11081. [Google Scholar] [CrossRef] [Green Version]
  30. Naumann, G.; Alfieri, L.; Wyser, K.; Mentaschi, L.; Betts, R.; Carrao, H.; Spinoni, J.; Vogt, J.; Feyen, L. Global changes in drought conditions under different levels of warming. Geophys. Res. Lett. 2018, 45, 3285–3296. [Google Scholar] [CrossRef]
  31. Solh, M.; van Ginkel, M. Drought preparedness and drought mitigation in the developing world's drylands. Weather Clim. Extrem. 2014, 3, 62–66. [Google Scholar] [CrossRef] [Green Version]
  32. Cai, X.; Magidi, J.; Nhamo, L.; van Koppen, B. Mapping Irrigated Areas in the Limpopo Province, South Africa; IWMI Working Paper 172; International Water Management Institute (IWMI): Colombo, Sri Lanka, 2017; p. 37. [Google Scholar]
  33. Scott, G.; Rajabifard, A. Sustainable development and geospatial information: A strategic framework for integrating a global policy agenda into national geospatial capabilities. Geo-Spat. Inf. Sci. 2017, 20, 59–76. [Google Scholar] [CrossRef] [Green Version]
  34. Sandbrook, C. The social implications of using drones for biodiversity conservation. Ambio 2015, 44, 636–647. [Google Scholar] [CrossRef] [Green Version]
  35. Graeub, B.E.; Chappell, M.J.; Wittman, H.; Ledermann, S.; Kerr, R.B.; Gemmill-Herren, B. The state of family farms in the world. World Dev. 2016, 87, 1–15. [Google Scholar] [CrossRef] [Green Version]
  36. Livingston, G.; Schonberger, S.; Delaney, S. Sub-Saharan Africa: The State of Smallholders in Agriculture; International Fund for Agricultural Development (IFAD): Rome, Italy, 2011; p. 36. [Google Scholar]
  37. Boyle, S.A.; Kennedy, C.M.; Torres, J.; Colman, K.; Perez-Estigarribia, P.E.; Noé, U. High-resolution satellite imagery is an important yet underutilized resource in conservation biology. PLoS ONE 2014, 9, e86908. [Google Scholar] [CrossRef]
  38. Xiang, H.; Tian, L. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV). Biosyst. Eng. 2011, 108, 174–190. [Google Scholar] [CrossRef]
  39. De Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An automatic random forest-obia algorithm for early weed mapping between and within crop rows using UAV imagery. Remote Sens. 2018, 10, 285. [Google Scholar] [CrossRef] [Green Version]
  40. De Castro, A.; Peña, J.; Torres-Sánchez, J.; Jiménez-Brenes, F.; López-Granados, F. Mapping Cynodon dactylon in vineyards using UAV images for site-specific weed control. Adv. Anim. Biosci. 2017, 8, 267–271. [Google Scholar] [CrossRef]
  41. Fang, Y.; Ramasamy, R. Current and prospective methods for plant disease detection. Biosensors 2015, 5, 537–561. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  42. Mee, C.Y.; Balasundram, S.K.; Hanif, A.H.M. Detecting and monitoring plant nutrient stress using remote sensing approaches: A review. Asian J. Plant Sci. 2017, 16, 1–8. [Google Scholar]
  43. Pavlovic, D.; Nikolic, B.; Djurovic, S.; Waisi, H.; Andjelkovic, A.; Marisavljevic, D. Chlorophyll as a measure of plant health: Agroecological aspects. Pestic. Phytomed. 2015, 29, 14. [Google Scholar] [CrossRef]
  44. Xue, J.; Su, B. Significant remote sensing vegetation indices: A review of developments and applications. J. Sens. 2017, 2017. [Google Scholar] [CrossRef] [Green Version]
  45. She, X.; Zhang, L.; Cen, Y.; Wu, T.; Huang, C.; Baig, M.H.A. Comparison of the continuity of vegetation indices derived from Landsat 8 OLI and Landsat 7 ETM+ data among different vegetation types. Remote Sens. 2015, 7, 13485–13506. [Google Scholar] [CrossRef] [Green Version]
  46. Hatfield, J.L.; Prueger, J.H. Value of using different vegetative indices to quantify agricultural crop characteristics at different growth stages under varying management practices. Remote Sens. 2010, 2, 562–578. [Google Scholar] [CrossRef] [Green Version]
  47. Townshend, J.R.; Justice, C. Analysis of the dynamics of African vegetation using the normalized difference vegetation index. Int. J. Remote Sens. 1986, 7, 1435–1445. [Google Scholar] [CrossRef]
  48. Boken, V.K.; Shaykewich, C.F. Improving an operational wheat yield model using phenological phase-based Normalized Difference Vegetation Index. Int. J. Remote Sens. 2002, 23, 4155–4168. [Google Scholar] [CrossRef]
  49. Filella, I.; Serrano, L.; Serra, J.; Penuelas, J. Evaluating wheat nitrogen status with canopy reflectance indices and discriminant analysis. Crop Sci. 1995, 35, 1400–1405. [Google Scholar] [CrossRef]
  50. Aparicio, N.; Villegas, D.; Casadesus, J.; Araus, J.L.; Royo, C. Spectral vegetation indices as nondestructive tools for determining durum wheat yield. Agron. J. 2000, 92, 83–91. [Google Scholar] [CrossRef]
  51. Costa, J.D.O.; Coelho, R.D.; Wolff, W.; José, J.V.; Folegatti, M.V.; Ferraz, S.F.D.B. Spatial variability of coffee plant water consumption based on the SEBAL algorithm. Sci. Agric. 2019, 76, 93–101. [Google Scholar] [CrossRef]
  52. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  53. Xu, M.; Lacey, C.; Armstrong, S. The feasibility of satellite remote sensing and spatial interpolation to estimate cover crop biomass and nitrogen uptake in a small watershed. J. Soil Water Conserv. 2018, 73, 682–692. [Google Scholar] [CrossRef]
  54. Fuentes, S.; De Bei, R.; Pech, J.; Tyerman, S. Computational water stress indices obtained from thermal image analysis of grapevine canopies. Irrig. Sci. 2012, 30, 523–536. [Google Scholar] [CrossRef]
  55. Dalezios, N.R.; Dercas, N.; Eslamian, S.S. Water scarcity management: Part 2: Satellite-based composite drought analysis. Int. J. Glob. Environ. Issues 2018, 17, 262–295. [Google Scholar] [CrossRef]
  56. Cammarano, D.; Fitzgerald, G.J.; Casa, R.; Basso, B. Assessing the robustness of vegetation indices to estimate wheat N in Mediterranean environments. Remote Sens. 2014, 6, 2827–2844. [Google Scholar] [CrossRef] [Green Version]
  57. Ihuoma, S.O.; Madramootoo, C.A. Recent advances in crop water stress detection. Comput. Electron. Agric. 2017, 141, 267–275. [Google Scholar] [CrossRef]
  58. Gago, J.; Douthe, C.; Coopman, R.; Gallego, P.; Ribas-Carbo, M.; Flexas, J.; Escalona, J.; Medrano, H. UAVs challenge to assess water stress for sustainable agriculture. Agric. Water Manag. 2015, 153, 9–19. [Google Scholar] [CrossRef]
  59. Zhang, L.; Zhang, H.; Niu, Y.; Han, W. Mapping maize water stress based on UAV multispectral remote sensing. Remote Sens. 2019, 11, 605. [Google Scholar] [CrossRef] [Green Version]
  60. Wahab, I.; Hall, O.; Jirström, M. Remote sensing of yields: Application of UAV imagery-derived NDVI for estimating maize vigor and yields in complex farming systems in Sub-Saharan Africa. Drones 2018, 2, 28. [Google Scholar] [CrossRef] [Green Version]
  61. Hunt, E.R., Jr.; Daughtry, C.S. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376. [Google Scholar] [CrossRef] [Green Version]
  62. Shi, Y.; Thomasson, J.A.; Murray, S.C.; Pugh, N.A.; Rooney, W.L.; Shafian, S.; Rajan, N.; Rouze, G.; Morgan, C.L.; Neely, H.L.; et al. Unmanned aerial vehicles for high-throughput phenotyping and agronomic research. PLoS ONE 2016, 11, e0159781. [Google Scholar] [CrossRef] [Green Version]
  63. Zhou, X.; Zheng, H.; Xu, X.; He, J.; Ge, X.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.; Tian, Y.; et al. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
  64. Jin, X.; Liu, S.; Baret, F.; Hemerlé, M.; Comar, A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114. [Google Scholar] [CrossRef] [Green Version]
  65. Tilly, N.; Hoffmeister, D.; Schiedung, H.; Hütt, C.; Brands, J.; Bareth, G. Terrestrial laser scanning for plant height measurement and biomass estimation of maize. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014. [Google Scholar] [CrossRef] [Green Version]
  66. Meroni, M.; Marinho, E.; Sghaier, N.; Verstrate, M.; Leo, O. Remote sensing-based yield estimation in a stochastic framework—Case study of durum wheat in Tunisia. Remote Sens. 2013, 5, 539–557. [Google Scholar] [CrossRef] [Green Version]
  67. Basso, B.; Cammarano, D.; Carfagna, E. Review of crop yield forecasting methods and early warning systems. In Proceedings of the First Meeting of the Scientific Advisory Committee of the Global Strategy to Improve Agricultural and Rural Statistics, Rome, Italy, 18–19 July 2013; p. 56. [Google Scholar]
  68. Tumlisan, G.Y. Monitoring Growth Development and Yield Estimation of Maize Using very High-Resolution UAV-Images in Gronau, Germany; University of Twente: Enschede, The Netherlands, 2017. [Google Scholar]
  69. Vriet, C.; Russinova, E.; Reuzeau, C. Boosting crop yields with plant steroids. Plant Cell 2012, 24, 842–857. [Google Scholar] [CrossRef] [Green Version]
  70. Govindaraj, M.; Vetriventhan, M.; Srinivasan, M. Importance of genetic diversity assessment in crop plants and its recent advances: An overview of its analytical perspectives. Genet. Res. Int. 2015, 2015. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  71. Monteith, J. Solar radiation and productivity in tropical ecosystems. J. Appl. Ecol. 1972, 9, 747–766. [Google Scholar] [CrossRef] [Green Version]
  72. Craine, J.M.; Dybzinski, R. Mechanisms of plant competition for nutrients, water and light. Funct. Ecol. 2013, 27, 833–840. [Google Scholar] [CrossRef]
  73. Onoda, Y.; Saluñga, J.B.; Akutsu, K.; Aiba, S.I.; Yahara, T.; Anten, N.P. Trade-off between light interception efficiency and light use efficiency: Implications for species coexistence in one-sided light competition. J. Ecol. 2014, 102, 167–175. [Google Scholar] [CrossRef]
  74. Slattery, R.A.; Walker, B.J.; Weber, A.P.; Ort, D.R. The impacts of fluctuating light on crop performance. Plant Physiol. 2018, 176, 990–1003. [Google Scholar] [CrossRef] [Green Version]
  75. Haxeltine, A.; Prentice, I. A general model for the light-use efficiency of primary production. Funct. Ecol. 1996, 10, 551–561. [Google Scholar] [CrossRef]
  76. Duan, B.; Fang, S.; Zhu, R.; Wu, X.; Wang, S.; Gong, Y.; Peng, Y. Remote estimation of rice yield with unmanned aerial vehicle (UAV) data and spectral mixture analysis. Front. Plant Sci. 2019, 10. [Google Scholar] [CrossRef] [Green Version]
  77. Fereres, E.; Soriano, M.A. Deficit irrigation for reducing agricultural water use. J. Exp. Bot. 2006, 58, 147–159. [Google Scholar] [CrossRef] [Green Version]
  78. Burba, G. Eddy Covariance Method for Scientific, Industrial, Agricultural and Regulatory Applications: A Field Book on Measuring Ecosystem Gas Exchange and Areal Emission Rates; LI-Cor Biosciences: Lincoln, NE, USA, 2013. [Google Scholar]
  79. Perez-Priego, O.; El-Madany, T.S.; Migliavacca, M.; Kowalski, A.S.; Jung, M.; Carrara, A.; Kolle, O.; Martín, M.P.; Pacheco-Labrador, J.; Moreno, G.; et al. Evaluation of eddy covariance latent heat fluxes with independent lysimeter and sapflow estimates in a Mediterranean savannah ecosystem. Agric. For. Meteorol. 2017, 236, 87–99. [Google Scholar] [CrossRef]
  80. Nouri, H.; Beecham, S.; Kazemi, F.; Hassanli, A.M. A review of ET measurement techniques for estimating the water requirements of urban landscape vegetation. Urban Water J. 2013, 10, 247–259. [Google Scholar] [CrossRef]
  81. Allen, R.; Irmak, A.; Trezza, R.; Hendrickx, J.M.; Bastiaanssen, W.; Kjaersgaard, J. Satellite-based ET estimation in agriculture using SEBAL and METRIC. Hydrol. Process. 2011, 25, 4011–4027. [Google Scholar] [CrossRef]
  82. Li, Z.-L.; Tang, R.; Wan, Z.; Bi, Y.; Zhou, C.; Tang, B.; Yan, G.; Zhang, X. A review of current methodologies for regional evapotranspiration estimation from remotely sensed data. Sensors 2009, 9, 3801–3853. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  83. Jovanovic, N.; Israel, S. Critical review of methods for the estimation of actual evapotranspiration in hydrological models. In Evapotranspiration-Remote Sensing and Modeling; InTech: London, UK, 2012; p. 24. [Google Scholar]
  84. Su, Z. The Surface Energy Balance System (SEBS) for estimation of turbulent heat fluxes. Hydrol. Earth Syst. Sci. 2002, 6, 85–100. [Google Scholar] [CrossRef]
  85. Gokool, S.; Chetty, K.; Jewitt, G.; Heeralal, A. Estimating total evaporation at the field scale using the SEBS model and data infilling procedures. Water SA 2016, 42, 673–683. [Google Scholar] [CrossRef] [Green Version]
  86. Allen, R.G.; Tasumi, M.; Trezza, R. Satellite-based energy balance for mapping evapotranspiration with internalized calibration (METRIC)—Model. J. Irrig. Drain. Eng. 2007, 133, 380–394. [Google Scholar] [CrossRef]
  87. Mokhtari, M.; Ahmad, B.; Hoveidi, H.; Busu, I. Sensitivity analysis of METRIC–based evapotranspiration algorithm. Int. J. Environ. Res. 2013, 7, 407–422. [Google Scholar]
  88. Madugundu, R.; Al-Gaadi, K.A.; Tola, E.; Hassaballa, A.A.; Patil, V.C. Performance of the METRIC model in estimating evapotranspiration fluxes over an irrigated field in Saudi Arabia using Landsat-8 images. Hydrol. Earth Syst. Sci. 2017, 21, 6135–6151. [Google Scholar] [CrossRef] [Green Version]
  89. Gibson, L.; Jarmain, C.; Su, Z.; Eckardt, F. Estimating evapotranspiration using remote sensing and the Surface Energy Balance System—A South African perspective. Water SA 2013, 39, 477–484. [Google Scholar] [CrossRef] [Green Version]
  90. Thorp, K.; Thompson, A.; Harders, S.; French, A.; Ward, R. High-throughput phenotyping of crop water use efficiency via multispectral drone imagery and a daily soil water balance model. Remote Sens. 2018, 10, 1682. [Google Scholar] [CrossRef] [Green Version]
  91. Elarab, M.; Torres-Rua, A.F.; Kustas, W.; Nieto, H.; Song, L.; Alfieri, J.G.; Prueger, J.H.; McKee, L.; Anderson, M.; Sanchez, L.; et al. Use of Aggieair UAS remote sensing data to estimate crop ET at high spatial resolution. In Proceedings of the Synergy in Science: Partnering for Solutions 2015 Annual Meeting, Minneapolis, MN, USA, 15–18 November 2015. [Google Scholar]
  92. Igbadun, H.E.; Mahoo, H.F.; Tarimo, A.K.; Salim, B.A. Crop water productivity of an irrigated maize crop in Mkoji sub-catchment of the Great Ruaha River Basin, Tanzania. Agric. Water Manag. 2006, 85, 141–150. [Google Scholar] [CrossRef]
  93. Liu, J.; Williams, J.R.; Zehnder, A.J.; Yang, H. GEPIC—Modelling wheat yield and crop water productivity with high resolution on a global scale. Agric. Syst. 2007, 94, 478–493. [Google Scholar] [CrossRef]
  94. Kijne, J.W.; Barker, R.; Molden, D.J. Water Productivity in Agriculture: Limits and Opportunities for Improvement; Comprehensive Assessment of Water Management in Agriculture Series 1; CABI International: Wallingford, UK, 2003. [Google Scholar]
  95. Mabhaudhi, T.; Chibarabada, T.; Modi, A. Water-food-nutrition-health nexus: Linking water to improving food, nutrition and health in Sub-Saharan Africa. Int. J. Environ. Res. Public Health 2016, 13, 107. [Google Scholar] [CrossRef] [Green Version]
  96. Molden, D.; Oweis, T.; Steduto, P.; Bindraban, P.; Hanjra, M.A.; Kijne, J. Improving agricultural water productivity: Between optimism and caution. Agric. Water Manag. 2010, 97, 528–535. [Google Scholar] [CrossRef]
  97. DeBell, L.; Anderson, K.; Brazier, R.E.; King, N.; Jones, L. Water resource management at catchment scales using lightweight UAVs: Current capabilities and future perspectives. J. Unmanned Veh. Syst. 2015, 4, 7–30. [Google Scholar] [CrossRef]
  98. Park, S.; Ryu, D.; Fuentes, S.; Chung, H.; Hernández-Montes, E.; O’Connell, M. Adaptive estimation of crop water stress in nectarine and peach orchards using high-resolution imagery from an unmanned aerial vehicle (UAV). Remote Sens. 2017, 9, 828. [Google Scholar] [CrossRef] [Green Version]
  99. FAO; IHE-Delft. WaPOR Quality Assessment. Technical Report on the Data Quality of the WaPOR FAO Database Version 1.0; Food and Agriculture Organisation of the United Nations (FAO): Rome, Italy; IHE Delft Institute for Water Education (IHE DELFT): Delft, The Netherlands, 2019; p. 134. [Google Scholar]
  100. Ramoelo, A.; Majozi, N.; Mathieu, R.; Jovanovic, N.; Nickless, A.; Dzikiti, S. Validation of global evapotranspiration product (MOD16) using flux tower data in the African savanna, South Africa. Remote Sens. 2014, 6, 7406–7423. [Google Scholar] [CrossRef] [Green Version]
  101. Kim, H.W.; Hwang, K.; Mu, Q.; Lee, S.O.; Choi, M. Validation of MODIS 16 global terrestrial evapotranspiration products in various climates and land cover types in Asia. KSCE J. Civ. Eng. 2012, 16, 229–238. [Google Scholar] [CrossRef]
  102. FAO. The impact of natural hazards and disasters on agriculture and food security and nutrition: A call for action to build resilient livelihoods. In Proceedings of the World Conference on Disaster Risk Reduction, Sendai, Japan, 17 March 2015; p. 16. [Google Scholar]
  103. Jayanthi, H.; Husak, G.J.; Funk, C.; Magadzire, T.; Chavula, A.; Verdin, J.P. Modeling rain-fed maize vulnerability to droughts using the standardized precipitation index from satellite estimated rainfall—Southern Malawi case study. Int. J. Disaster Risk Reduct. 2013, 4, 71–81. [Google Scholar] [CrossRef] [Green Version]
  104. Castillo, M.J.; Boucher, S.; Carter, M. Index insurance: Using public data to benefit small-scale agriculture. Int. Food Agribus. Manag. Rev. 2016, 19, 93. [Google Scholar]
  105. Singha, C.; Swain, K.C. Land suitability evaluation criteria for agricultural crop selection: A review. Agric. Rev. 2016, 37, 125–132. [Google Scholar] [CrossRef] [Green Version]
  106. Stratoulias, D.; Tolpekin, V.; De By, R.A.; Zurita-Milla, R.; Retsios, V.; Bijker, W.; Hasan, M.A.; Vermote, E. A workflow for automated satellite image processing: From raw VHSR data to object-based spectral information for smallholder agriculture. Remote Sens. 2017, 9, 1048. [Google Scholar] [CrossRef] [Green Version]
  107. McCabe, M.F.; Rodell, M.; Alsdorf, D.E.; Miralles, D.G.; Uijlenhoet, R.; Wagner, W.; Lucieer, A.; Houborg, R.; Verhoest, N.E.; Franz, T.E.; et al. The future of Earth observation in hydrology. Hydrol. Earth Syst. Sci. 2017, 21, 3879. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  108. Müllerová, J.; Brůna, J.; Bartaloš, T.; Dvořák, P.; Vítková, M.; Pyšek, P. Timing is important: Unmanned aircraft vs. satellite imagery in plant invasion monitoring. Front. Plant Sci. 2017, 8, 887. [Google Scholar] [CrossRef] [Green Version]
  109. Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture. Remote Sens. 2015, 7, 2971–2990. [Google Scholar] [CrossRef] [Green Version]
  110. Zhao, L.; Shi, Y.; Liu, B.; Hovis, C.; Duan, Y.; Shi, Z. Finer classification of crops by fusing UAV images and sentinel-2A data. Remote Sens. 2019, 11, 3012. [Google Scholar] [CrossRef] [Green Version]
  111. Wolfert, S.; Ge, L.; Verdouw, C.; Bogaardt, M.-J. Big data in smart farming—A review. Agric. Syst. 2017, 153, 69–80. [Google Scholar] [CrossRef]
  112. Wang, L.; Lan, Y.; Zhang, Y.; Zhang, H.; Tahir, M.N.; Ou, S.; Liu, X.; Chen, P. Applications and prospects of agricultural unmanned aerial vehicle obstacle avoidance technology in China. Sensors 2019, 19, 642. [Google Scholar] [CrossRef] [Green Version]
  113. Hunt, E.R., Jr.; Daughtry, C.S. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2017, 1–32. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Methodological framework to explore the importance of UAVs in agricultural water management.
Figure 2. Comparison of the accuracy of a UAV image-derived map (a) and a satellite image-derived map (b). (a) shows the actual shape of the agricultural fields, whereas (b) represents the whole scheme area as cultivated, thereby overestimating the cultivated area. (a) was developed by the authors using a drone, and (b) was derived from the irrigated area map developed by the International Water Management Institute (IWMI) (www.iwmi.cgiar.org/2018/06/irrigated-area-mapping-asia-and-africa/).
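To make the overestimation effect illustrated in Figure 2 concrete, the short sketch below (not taken from the study) simulates a fine-resolution UAV cultivated-area mask and degrades it to a coarse satellite grid in which any partially cultivated cell is flagged as fully cultivated. The 2 m and 30 m ground sampling distances, the toy field layout, and all variable names are illustrative assumptions, not values reported by the authors.

```python
# Illustrative sketch only (assumed resolutions and a toy field layout):
# why a coarse satellite-derived cultivated-area mask tends to overestimate
# area relative to a fine UAV-derived mask.
import numpy as np

uav_res = 2.0                     # m per pixel (assumed UAV resolution)
sat_res = 30.0                    # m per pixel (assumed satellite resolution)
block = int(sat_res // uav_res)   # UAV pixels per satellite pixel side (15)

# Toy 900 m x 900 m scheme with scattered rectangular fields (1 = cultivated).
uav_mask = np.zeros((450, 450), dtype=np.uint8)
rng = np.random.default_rng(0)
for _ in range(12):                       # 12 hypothetical fields
    r, c = rng.integers(0, 400, size=2)
    uav_mask[r:r + 40, c:c + 60] = 1      # roughly 80 m x 120 m per field

# "Satellite" view: a 30 m cell is flagged cultivated if any UAV pixel inside it is.
coarse = uav_mask.reshape(450 // block, block, 450 // block, block).max(axis=(1, 3))

uav_area_ha = uav_mask.sum() * uav_res ** 2 / 1e4
sat_area_ha = coarse.sum() * sat_res ** 2 / 1e4
print(f"UAV-derived cultivated area:       {uav_area_ha:8.1f} ha")
print(f"Satellite-derived cultivated area: {sat_area_ha:8.1f} ha")
print(f"Overestimation:                    {100 * (sat_area_ha / uav_area_ha - 1):5.1f} %")
```

Running this sketch shows the coarse mask reporting a substantially larger cultivated area than the fine mask, mirroring the pattern in Figure 2b, where mixed and fallow pixels within the scheme boundary are counted as cultivated.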