Review

A Bibliometric Review of the Use of Unmanned Aerial Vehicles in Precision Agriculture and Precision Viticulture for Sensing Applications

Department of Engineering, University of Sannio, 82100 Benevento, Italy
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Submission received: 11 February 2022 / Revised: 21 March 2022 / Accepted: 22 March 2022 / Published: 27 March 2022
(This article belongs to the Special Issue Drones for Precision Agriculture: Remote Sensing Applications)

Abstract

This review focuses on the use of unmanned aerial vehicles (UAVs) in precision agriculture and, specifically, in precision viticulture (PV), and is intended to present a bibliometric analysis of their developments in the field. To this aim, a bibliometric analysis of research papers published in the last 15 years is presented based on the Scopus database. The analysis shows that researchers from the United States, China, Italy, and Spain lead research on UAV applications in precision agriculture. In terms of employing UAVs in PV, researchers from Italy are rapidly extending their work, followed by those from Spain and, finally, the United States. Additionally, the paper provides a comprehensive study of the journals favored by academicians for submitting their work, the accessible funding organizations, and the leading nations, institutions, and authors conducting research on utilizing UAVs for precision agriculture. Finally, this study emphasizes the necessity of using UAVs in PV as well as future possibilities.

1. Introduction

Precision agriculture (PA) is becoming highly significant in today’s technologically advanced world and has been considered the farm of the future [1]. It is a modern farming management concept that uses digital techniques and technological advancements to monitor and optimize agricultural production processes [2]. PA uses modern technology and principles to manage the spatial and temporal variability in all aspects of agricultural production with the goal of improving crop performance [3,4]. Spatial and temporal variability denote variations, across the field and over time, that have significant impacts on agricultural production; examples include yield variability [5], field variability [6], soil variability [7], crop variability [8], and management variability [9]. The goal is to reduce economic costs, decrease the impact on the environment (e.g., by using less water and fertilizer), and at the same time increase the quality of food production [10]. Typically, unmanned aerial vehicles (UAVs), sensor technologies, satellite navigation and positioning technologies, and the internet of things (IoT) are used to achieve these goals. PA is increasingly aiding farmers in their work as it makes its way into fields across Europe [11]. Larger yields require greater financial investment because large amounts of fertilizers, pesticides, water, and other resources are needed; however, through their proper management, growers can achieve considerable savings on the expected expenses [12,13,14]. Furthermore, proper monitoring not only increases yield but also improves plant health and productivity, which allows growers to meet demand. Figure 1 depicts the documents released by scientists, researchers, and growers over the last ten years, demonstrating that the practice of PA is expanding steadily. PA has evolved into a digital farming management approach that monitors and optimizes agricultural production operations [15]. PA is now frequently practiced in the cultivation, monitoring, and harvesting of rice [16], wheat [17], maize [18], barley [19], soybean [20], potato [21], orange [22], olive [23], and many other crops.
Precision viticulture (PV) is a subset of PA in which the same technologies are applied specifically to grapevine care and development. Viticulture concerns the cultivation and study of grapes and refers to a set of activities in the vineyard [24]. Typically, viticulturists are mainly interested in monitoring and managing the vineyard [25], fertilizing and watering [26], managing the canopy [27], monitoring fruit growth and characteristics [28], choosing when to harvest [29], and trimming during specific months [30]. Viticulturists and winemakers are closely linked because vineyard management and grape characteristics provide the basis for wine-making [31]. A vast range of grape varieties is presently cultivated in the European Union for wine growing and viticulture [32].
Viticulture plays an important role in the economic development of several countries, particularly in Europe [33]. PV involves the practice of grape farmers and winemakers employing a range of information technologies to better sense and comprehend variability in their production systems; this information is then used to better match production inputs to desired or expected outputs [34]. Crop sensors and yield monitors, remote sensors, geographic information systems (GIS), and global navigation satellite systems (GNSS) are some of the components of PV’s technological advancements, and they represent a rising trend in the wine industry [35]. Furthermore, PV is growing substantially because of improved and cost-effective sensors, methodologies, and equipment for data acquisition from drones [36].
This bibliometric study analyzes the usage of UAVs in PA, and more specifically in PV, and presents the corresponding results. Readers interested in further information on PA and PV can find other bibliometric reviews, such as those on the research evolution of PA [37], advances in precision coffee growing research [38], wireless sensor networks in agriculture [39], IoT in PA [40], digital agriculture [41,42], an investigation of PV production, impact, and the interdisciplinary collaboration involved in viticulture [43], a bibliometric analysis of PV in Italy [44], and Mexican academicians’ contribution to viticulture [45]. Although numerous reviews have been published, to our knowledge, no bibliometric study has explored the use of UAVs in PV. The current article fills this gap in the literature by identifying the important concerns and potential associated with the use of UAVs in PV. The remaining sections of this manuscript are organized as follows: Section 2 discusses the materials and methods used for the bibliometric analysis, Section 3 briefly discusses why UAVs are needed in PA, Section 4 discusses why UAVs are needed in PV, Section 5 presents the analyzed results, and Section 6 summarizes the findings and future scope.

2. Materials and Methods

The usage of UAVs in PA (UAV-PA) and in PV (UAV-PV) is undeniably transforming traditional agricultural operations. The number of contributions has been rising constantly throughout the years; in 2020 alone, the number of publications using the terms UAV-PA and UAV-PV reached 224 and 40, respectively. It is difficult to keep track of key developments in the area at such a rapid rate of publishing. By statistically evaluating published data, bibliometric analytic tools can help speed up the review process in this context [41]. Authors working on the use of UAVs in PA use a variety of terminologies to refer to the same ideas, and in such a fragmented subject, bibliometric analysis becomes very important. Therefore, to identify the most important publications and the most influential sources, institutions, and nations, we have used bibliometric tools.

2.1. Materials

In this bibliometric review, articles from the Scopus database have been considered. Scopus is an online database that contains nearly all significant research publications and has built-in analytical capabilities for generating representative data. Additionally, the search results from Scopus can be exported to other software for additional post-processing. For instance, the present study used VOSviewer [46], which provides algorithms for bibliometrics-based analysis of publication datasets; co-citation, network analysis, collaboration, coupling, and co-word information extraction are also possible. A publication is the dataset’s base element, and authors, document type, document title, keywords, cited references, source, year, author address, times cited, and other fields are associated with each publication. The present study also used the science mapping analysis software tool (SciMAT). SciMAT has been chosen because it facilitates all stages of scientific mapping, from data preparation to the development of field maps [42,47], and also has a comprehensive preparation module that verifies the quality of the data [42].
The quality and thoroughness of the outcomes of this work may still be hampered by constraints in the analysis tools and the data filtering procedure. To begin with, the retrieved data could have left out certain publications: because we used the search strategy presented in Table 1, publications that did not contain the sought keywords in the title, abstract, or author keywords may have been excluded from the records. Some publications may also have been incorrectly eliminated as a result of manual selection. Finally, we acknowledge the VOSviewer and SciMAT tools for offering the data visualization functionalities utilized in this study.

2.2. Methods

Research on the application of UAV-PA and UAV-PV is trending, and further knowledge should be obtained to progress in these fields, which is why bibliometric analysis is vital. As a result, this review is conducted at the appropriate moment to give comprehensive knowledge of the utilization of UAV-PA and UAV-PV, as well as of future research possibilities.

2.2.1. Design of the Study

We defined and explored the following questions in order to correctly gain significant knowledge and conduct a bibliometric study on UAV-PA and UAV-PV:
  • Why are UAVs required in PA? Section 3 of this review explores the answer to this question. The purpose of this research question is to examine developments in the use of UAV-PA.
  • What technologies are employed in UAV-PA and UAV-PV? The answer to this question is explored in further detail in Section 4 and Section 5 of this review.
  • Which nations are pioneering research on the use of UAV-PA and UAV-PV? Section 4.1 and Section 5.1 answer this question with the respective publications.
  • Which journals are chosen by researchers for publication, which funding agencies are accessible, which prominent researchers are active in the field, and which universities/institutions are active in the field of UAV-PA and UAV-PV? The answers to these questions can be found in the subsections of Section 4 and Section 5, respectively. Section 6 contains information about the major universities/institutions active in the area of UAV-PA and UAV-PV; the future scope of PV’s technological development is also reported in that section.

2.2.2. Data Collection

To address the questions of Section 2.2.1, we examined publications between 1 January 2006 and 15 November 2021. The most widely utilized publication databases are Web of Science (WoS) and Scopus, and in the past there were no other options available [48] for choosing a database for bibliometric analysis. WoS and Scopus continue to have the highest data quality and completeness across several categories [49], and as a result, they are the most commonly employed for bibliometric analysis. Scopus has demonstrated wider coverage than WoS, which justifies our choice to use only the former.
The data were accessed on 15 November 2021 with the keywords reported in Table 1. The table also reports the type of database used, how many documents were considered, the time span of the data collection, the criteria for inclusion, and the software used to analyze the results.
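As an illustration of the search strategy, a hypothetical Scopus advanced-search query of the kind summarized in Table 1 can be assembled as follows; the keyword lists in this Python sketch are placeholders, not the exact terms used in this study.

# A hypothetical illustration of the kind of Scopus advanced-search query
# summarized in Table 1; the keyword lists below are placeholders, not the
# exact terms used in this study.
uav_terms = ['"unmanned aerial vehicle"', '"UAV"', '"drone"']
application_terms = ['"precision agriculture"', '"precision viticulture"']

query = (
    "TITLE-ABS-KEY((" + " OR ".join(uav_terms) + ") AND ("
    + " OR ".join(application_terms) + "))"
    " AND PUBYEAR > 2005 AND PUBYEAR < 2022"   # time span 2006-2021
)
print(query)
# The query string is pasted into the Scopus advanced search; the matching
# records are then exported (e.g., as CSV) for VOSviewer and SciMAT.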

2.2.3. Data Preparation

The fetched data could not be used directly because of missing values in relevant fields. Thus, a manual check was carried out to remove the articles with incomplete information. The filtered data were then uploaded to SciMAT and VOSviewer to generate the results.
The strategic diagram, thematic network structure, and thematic evolution structure for the most significant themes have been generated via SciMAT. The evolutionary map was constructed using the equivalence index, with solid lines indicating that clusters share a common theme (Figure 2). In comparison, dashed lines indicate a connection between non-major terminologies, whereas in some cases the absence of lines indicates that the topic gradually ceased to exist from one period to the next [42].
In total, 1084 documents related to the use of UAV-PA were exported from Scopus, with duplicates deleted from the list. Phrases that indicated the same concept were then grouped together, and the data were preprocessed before being entered into the software to create a strategic picture of the topic of research. To obtain the evolutionary map, the data were divided into three sub-periods (2006–2013, 2014–2018, and 2019–2021). SciMAT cannot create such super-periods automatically; thus, the sub-periods were entered manually. It can be seen that in the third sub-period (2019–2021) a large number of articles were published, providing a good indication of what the study field will look like in the future (as pictorially shown in Figure 1). To determine which topics were most essential and which clusters they belonged to, we used the keyword co-occurrence matrix. Data were normalized and clustered into groups based on the simple centers algorithm, using the equivalence index (EI) to create a network of links between the themes; a minimal illustration of this normalization is sketched below. We compressed the data to a frequency resolution of 2 to create the diagrams and established a maximum and minimum network size of 14 and 3, respectively, when examining the field of study. In the strategic diagram, the horizontal axis (centrality) depicts the importance of each subject and the number of links it has with other themes, while the vertical axis (density) depicts the number of connections within each cluster. This graph depicts the relationship between two or more topics throughout time; for more information about the quadrants, the reader is referred to [42].
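For readers unfamiliar with the co-word normalization mentioned above, the following minimal Python sketch computes a keyword co-occurrence matrix and the equivalence index on a toy set of documents; it illustrates the index definition only, and is not a reproduction of SciMAT’s implementation (the keyword sets are invented).

from itertools import combinations
from collections import Counter

# Toy keyword sets, one per document (illustrative only; the real input is the
# cleaned Scopus export described in this section).
docs = [
    {"uav", "precision agriculture", "vegetation index"},
    {"uav", "vegetation index", "ndvi"},
    {"uav", "precision viticulture", "ndvi"},
]

occ = Counter()    # c_i: number of documents containing keyword i
cooc = Counter()   # c_ij: number of documents containing both keywords i and j
for keywords in docs:
    occ.update(keywords)
    cooc.update(frozenset(pair) for pair in combinations(sorted(keywords), 2))

# Equivalence index e_ij = c_ij^2 / (c_i * c_j): the co-word normalization used
# to weight links before clustering; values near 1 mean that the two keywords
# almost always appear together.
equivalence = {}
for pair, c_ij in cooc.items():
    i, j = tuple(pair)
    equivalence[(i, j)] = c_ij ** 2 / (occ[i] * occ[j])

for (i, j), e in sorted(equivalence.items(), key=lambda item: -item[1]):
    print(f"{i} -- {j}: {e:.2f}")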

2.2.4. Data Analysis

In this bibliometric review, the strategies for data analysis are divided into the following categories:
  • Worldwide published documents, year-wise and country-wise, on adopting UAV-PA and UAV-PV: Under this analysis, we took the documents published during the considered time span that were fetched from the database, arranged them according to the year of publication, and presented them in graphical form. We applied the same method to country-specific publications.
  • Influential authors worldwide on adopting UAV-PA and UAV-PV: The world recognizes excellent research, and hence we chose three different categories to identify leading authors on adopting UAV-PA and UAV-PV: (i) on the basis of the author’s citation count, (ii) on the basis of the most cited documents, and (iii) on the basis of the number of documents produced by an author; the results are presented accordingly in this manuscript.
  • Most preferred journals and popular funding sponsors: The most preferred journals were ranked by the number of articles published in the specific journals on adopting UAV-PA and UAV-PV, and the same criterion was applied to the popular funding sponsors.
  • Leading institutions based on citation counts on adopting UAV-PA and UAV-PV: To identify leading institutions, we again focused on the citation count of the articles published by each institution on adopting UAV-PA and UAV-PV.

2.2.5. Evaluation of the Terminologies towards the Use of UAV-PA

The evaluation of PA terminologies since 2006, based on their h-index, is shown in Figure 2. These terms can be found by looking at the strategic diagram (Figure 3) produced with the help of SciMAT, a software tool that proved to be the most efficient in determining the trending keywords (themes). A few of the most frequently used terms (h-index greater than 10) that are relevant to this bibliometric study have been extracted from the strategic diagram; their thematic network structures are shown in Figure 4, and they are briefly introduced below.
  • Vegetation Index (VI): VI is a common term for a class of indices that are used in agriculture to derive a plant’s status via the observation of its reflected spectrum in multiple bands [50]. Plants absorb visible light and reflect near-infrared (NIR) light, which the human eye cannot see; stressed or dead leaves reflect more red light than healthy leaves. Another term associated with VI is the Normalized Difference VI (NDVI), where the health status is derived by comparing a plant’s reflected spectrum in the near-infrared with that in the visible range (red wavelengths). NDVI is useful for assessing how healthy the plants are and how much biomass they have: when a field is covered in healthy leaves, the NDVI value increases. However, in an area with a lot of vegetation, NDVI may not be able to detect very small changes in the plants. Other approaches based on spectral indices are available and quite commonly employed; for more details on indices, one may consult [51]. In addition to monitoring plant health, VI is very important in determining canopy height and chlorophyll content and in deciding when to start fertilization and irrigation.
  • UAVs: UAVs are becoming increasingly popular for monitoring, not only in agriculture but also in other important areas, such as power-line inspection, pipeline monitoring, building and structure monitoring, and so on. UAVs can fly autonomously over an area and take images of its various regions. The information regarding where vulnerabilities are present in the field is extracted from these snapshots using standard tools, and a decision support system uses this information to determine which resources (fertilizer, water, or others) are essential and in what quantities.
  • Unmanned Aerial Systems (UAS): The term UAS refers to the ensemble of drones, ground control systems, cameras, GNSS, software, and maintenance tools required to operate UAVs and enable them to fly autonomously or remotely. A UAS gives growers the freedom to make decisions online; as an example of a UAS, the article [52] shows how the UAV and the associated sensory devices support decision making.
    It is critical to note that the term remotely piloted aircraft (RPA) system has been used multiple times to refer to UAS. A review of RPA applications in PA is provided in [53], where the authors use the term RPA to refer to UAVs or drones. Additionally, the review article [54] discussed drones and RPAs, under either name, that have been used in agriculture. In [55], a technique for developing and constructing a prototype of a low-cost quadcopter-type RPA for precision agriculture applications is described.
  • Sensors: Sensors are becoming less expensive and more advanced as technology progresses. Sensors are the backbone of PA, providing vital information about variability in farm areas; they are also utilized to determine the viability of a given crop being grown on a farm area. Wireless sensors have been widely used to collect data from farm fields and interact with UASs for further processing and decision making. The reader may consult [56] for additional information on sensors and sensory devices.
  • Detection Methods: Various kinds of UAV-based detection methods are available in the literature, addressing, for example, disease detection [57], crop row detection [58], fruit detection [59], tree detection [60], and weed detection [61]. These detection methods further support the decision-making process on the farm.
  • Soils: Soil management is of foremost importance for PA [7]. Soil management is a way of partitioning the field into different categories depending on soil content. Soil samples can be collected from different points/locations in the field, soil quality can be measured in the laboratory using the collected samples, and management can then be implemented according to the categorization [62]. The color variations in images of the soil acquired by drones after plowing the fields play an important role in segregating the fields [7]. However, soil management is usually very expensive and time-consuming since, in order to be effective, it has to be run continuously. Similar outcomes, but with far less effort, can be obtained using UAVs equipped with RGB cameras: through the acquisition of several RGB images of the field, it is possible to infer whether the soil is sunny-wet, sunny-dry, shadow-wet, or shadow-dry, and to support other decisions, via off-line image processing [63].
  • Neural Networks (NNs): PA practices rely on accurate mapping of farmlands, and neural networks are widely used to manage and map UAV remote sensing data for the best outcomes. When applied to UAVs for PA, NNs have proved to be among the best remote sensing tools in several situations. Using a multispectral camera together with NNs, it has been shown that semantic segmentation of citrus orchards is highly achievable with deep neural networks [64]. Based on an NN technique, the methodology given in [65] proposes an automatic strategy for the large-scale mapping of date palm trees from very-high-spatial-resolution UAV images. NNs have also played an important role in spraying UAVs; for example, Khan et al. [66] proposed an accurate real-time recognition method based on NNs, which is critical for UAV-based sprayers.

3. Why UAV in PA?

As the world’s population grows, there is a clear need to enhance what is done on the farm, and UAVs are a reasonable means of improving the productivity and quality of the crops being cultivated [67]. The UAVs should hover over the field and detect areas with significant infestation that cannot be seen from a distance, allowing a range of pesticides to be applied exclusively to the smaller regions where the infections are present [68]. The objective of these efforts is to achieve productivity gains of 70% by 2050 to satisfy the growing requirements of the Earth’s population while decreasing the area under agriculture [69]. UAVs fall under the category of aerial robotics, i.e., the deployment of flying devices to perform beneficial tasks, including agriculture, which is the focus of this review. The designed UAV system should have the required payload capacity [70] and be able to fly over and survey the area being monitored [71]. In addition (but not limited to these), the system may include near-infrared and visible cameras aboard, thermal cameras [72], and multispectral and hyperspectral cameras.

3.1. Technologies of UAV in PA

UAVs have recently been employed in agriculture for large-scale inspections as well as for irrigation [73] and fertilization [74]. A drone’s payload is made up of all the sensors and actuators attached to it, i.e., (i) multispectral and hyperspectral cameras, (ii) infrared and RGB cameras, (iii) light detection and ranging (LiDAR) systems [75], and (iv) global navigation satellite system (GNSS) receivers. More research into the integration of multiple sensors on a UAV platform, such as RGB cameras, LiDAR, thermal cameras, and multi/hyperspectral cameras, is needed to improve PA estimation accuracy [76]. The necessity for a thermal camera aboard a UAV can be reduced by using the technique outlined in [77]. Due to the restricted area available for image collection, UAV photos seldom capture well-watered plants or arid areas. Approaches such as the one described in [77], which perform the mapping using just multispectral images and therefore do not require a thermal camera, might considerably decrease operating and investment expenses. A good quality image can be achieved utilizing the methods and control approaches suggested in [78]; that research also shows that the suggested methods can handle various image datasets, such as those acquired by frame cameras with variable sensor-to-object distances over crop fields.
When evapotranspiration is involved, it is sometimes necessary to utilize thermal cameras. The current conventional technique for aligning thermal imagery employs GPS logger data for the initial image location; however, it does not account for changes in meteorological conditions during the flight, leading to unsatisfactory outcomes. To improve this scenario, three alignment and calibration techniques based on RGB image alignment were developed in [79]. According to the findings of [80], the suggested thermal calibration technique based on temperature-controlled standards can provide appropriate precision and accuracy for UAV-based canopy temperature estimates applied to PA.

4. Why UAV in PV?

The key to success for any PV practice strongly depends on the collection of the maximum amount of geo-referenced information about the whole vineyard and on the technologies used to monitor it. Since the late twentieth century, UAVs have advanced quickly, and UAV remote sensing has been swiftly put into practice as agricultural remote sensing has improved. Vegetation coverage monitoring, growth tracking, and yield estimates are among the most common types of field growth data collected using UAV platforms [81]. In addition, UAVs have partially solved the shortcomings of terrestrial and high-altitude remote sensing, offering substantial support for PA crop information monitoring technologies [82].

4.1. Technologies of UAV in PV

The initial phase in the wine-making process is viticulture. The circumstances of the vineyard and human decisions in the vineyard determine the quality of the wine [83]. Human decisions in PV are based on the data received from the UAV sensors and attached tools [84]. The location of the vineyard influences the flavor of the grapes grown. The majority of PV research is focused on vegetation index information [85], and vegetation index data may be gathered using multispectral cameras placed on UAVs [86]. These cameras measure changes in factors that affect the environment, such as water quality and plant cover, and it is now feasible to build maps of vegetation coverage for the whole region under examination using them on UAVs. Using remotely sensed data, the NDVI has been utilized to reveal differences in grapevine performance. The NDVI is calculated using the formula
$\mathrm{NDVI} = \dfrac{\rho_{\mathrm{nir}} - \rho_{\mathrm{red}}}{\rho_{\mathrm{nir}} + \rho_{\mathrm{red}}},$
where $\rho_{\mathrm{nir}}$ and $\rho_{\mathrm{red}}$ are the reflectance levels in the near-infrared and in the red spectrum, respectively [87]. In numerous investigations, NDVI readings in vineyards have been related to the leaf area index (LAI). The relationship between NDVI and vineyard LAI is well known because NDVI is strongly linked to the gross quantity of chlorophyll: increasing leaf area leads to a higher gross quantity of chlorophyll per unit area of the vineyard. LAI is a major physiological component for characterizing crop growth models, while vegetation indices are used for expressing crop growth status. The NDVI does not, however, have a linear connection with LAI. Using UAV platforms, many studies on spectral data monitoring growth indicators such as LAI [88,89] have been conducted. The authors in [90] describe three models that use a quad-rotor UAV platform with a digital camera to examine the link between LAI and canopy coverage. A UAV fitted with hyperspectral cameras was deployed to evaluate different cultivars to illustrate the feasibility of LAI monitoring in the context of PA [91]. Various methods of calculating vegetation indices have been used over time; the most often studied vegetation indices are given in [92].
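For readers who wish to reproduce the band math, the following minimal Python sketch computes the NDVI defined above pixel-wise from two co-registered reflectance bands, such as those exported from a multispectral UAV orthomosaic; the toy arrays and the zero-denominator handling are illustrative assumptions, not part of the cited studies.

import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (rho_nir - rho_red) / (rho_nir + rho_red), computed pixel-wise.

    `nir` and `red` are co-registered reflectance bands (values in [0, 1]),
    e.g., exported from a multispectral UAV orthomosaic.
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Avoid division by zero on pixels with no signal (e.g., masked background).
    return np.where(denom > 0, (nir - red) / denom, np.nan)

# Toy 2x2 example: healthy vegetation reflects much more NIR than red.
nir = np.array([[0.60, 0.55], [0.30, 0.05]])
red = np.array([[0.08, 0.10], [0.20, 0.04]])
print(ndvi(nir, red))  # values around 0.7-0.8 suggest healthy canopy; values near 0 suggest bare soil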
Multispectral cameras collect data from the electromagnetic spectrum across different bands, or frequency intervals [93]. In particular, they are used for the NIR spectrum, specifically in the range 800–850 nm, since this band is important for determining the health of plants [94]: in the NIR region, plants reflect up to 60% of the incident electromagnetic energy [95]. For measuring vegetation on the ground, differences in reflected light in the NIR part of the spectrum are therefore critical. Multispectral remote sensing datasets are used to detect the light energy reflected from objects on the earth’s surface and to estimate various physical and chemical characteristics that are not visible to the naked eye; the measurements thus provide information about what is on the ground. For example, vegetation is typically indicated by pixels whose spectrum contains a large amount of NIR energy.
Growers start with the choice of the grape variety, and once that is settled, understanding the soil becomes important. Figure 5 shows a block diagram of the decision-making process for the categorization of the soil (mainly of the field plots on which the grapevine is cultivated). The acquisition of aerial photographs takes place in the first block. In the second block, the photographs are used to create a mosaicked image of the site under consideration [96]. This mosaicked image is also used to analyze the relationship between various surface soil properties, such as organic matter, moisture, clay, silt, sand, and other soil content, and the soil is then categorized accordingly. When the vegetation index values were utilized as input data to trained techniques, the best performance in the categorization of vineyard soil RGB images was obtained, with overall accuracy values around 0.98 and high sensitivity values for the soil class [97]. To monitor farmland soil parameters and crop growth, UAV remote sensing platforms have been equipped with high-resolution hyperspectral sensors [98].
The use of UAVs equipped with RGB cameras has some limitations; indeed, during the first tillage process, when the fields are usually covered in vegetation and/or crop leftovers, soil images cannot be captured. In addition, it is sometimes challenging to take photographs over the uneven terrain that affects grape production. Elevation, latitude, slope, and aspect are among the geographical elements that influence grape production [99]. For instance, in many of the world’s best wine areas, nearby water and mountains have a strong impact [100], as do temperature, sunshine, and wind [36]. Degree days are used to quantify the amount of heat that accumulates over the course of the growing season [101], and the amount of heat necessary for grapes to reach maturity varies depending on the grape type.
Photosynthesis and taste development require sunlight [102]; however, too much exposure to sunlight might result in sunburn and shriveling of the grapes. Therefore, when planning a vineyard, row orientation and sunshine are critical considerations, and it is necessary to make sure the afternoon sun shines on the non-exposed section of the fruit [103]. In [104], a technique for evaluating heat and radiative stress impacts in terms of temperature at the cluster and canopy level is suggested: a high-resolution thermal monitoring method is described, which uses a UAV and a wireless sensor network (WSN) to integrate remote and proximal sensing.
Irrigation is required frequently in the summer due to dry weather or a lack of water-holding capacity while, on the other hand, it is common practice to give a vine as little water as possible once it has reached full maturity [105]. In this regard, the amount of water stress is crucial in order to decide when irrigation should start, as well as its duration. Here too, UAV-mounted image acquisition equipment can be fruitfully exploited for the inspection of the targeted area, as sketched in Figure 6. For example, in [106], a model utilizing UAVs is developed to evaluate, on a plant-by-plant basis, stress sectors within the vineyard for optimal irrigation management and to detect geographic variability within the vineyards.
The quantity of water accessible to the vine and the nutrients it requires are determined by the soil type [107]. The main macronutrients required are nitrogen, phosphorus, and potassium [108]. From vineyard architecture to clonal and rootstock selection, viticultural decisions are made to suit the specific characteristics of each location [109]. For example, because grapevines are sensitive to phylloxera, a soil parasite, resistant rootstocks are frequently utilized to protect the vine [110].
Growers must consider vine density, row spacing, and row direction while creating a new vineyard so that remote sensing is simple [111]. Canopy management is one of the important measures growers must take; it requires continuous inspection throughout the year. However, these operations are time-consuming and difficult to carry out for the entire vineyard, and the use of photogrammetric methods has shown to be effective to this aim [27]. Increased airflow and sunshine in the fruiting zone and lower disease pressure may be achieved by canopy management [112]. By maintaining the vineyard floor, farmers may influence soil fertility and water availability [113]. Cover crops are mowed to limit competition or used to reduce surplus soil moisture, and plants that affect the growth of the vine are removed by tilling nitrogen-rich cover crops into the soil [114]. In [30], a novel approach for assessing vineyard trimming is suggested, wherein UAV technology is used to produce photogrammetric point clouds, which are then analyzed using object-based image analysis algorithms.
The biggest obstacle for viticulturists is the weather, not least because they have no control over it. In a particular year, hail, spring frost, drought, extreme heat, and rain can lower yields or degrade fruit quality. Pests and diseases also pose a danger to the vineyard’s long-term viability; powdery mildew is the most prevalent disease in most cases. In [115], the authors propose a spatial-spectral segmentation technique for analyzing hyperspectral imaging data obtained from UAVs and apply it to predicting powdery mildew infection levels in undamaged grape bunches before veraison. Beginning with bud break, farmers must be proactive in planning for and responding to these threats. Grapes fill with sugar as they get closer to ripeness, and when the fruit ripens, the sugar can leak through the skin, providing a valuable food supply for the naturally occurring fungi on the vines; when this happens, the fruit can develop a disease called Botrytis bunch rot [116]. Grapevine viruses such as leafroll and bacterial infections such as Pierce’s disease are spread by insects and require special care [117]. It is necessary to employ integrated pest management to identify the appropriate treatment approach; control measures range from cultural techniques to canopy management, vineyard floor operations, and perhaps pesticides.
One of the most severe obstacles to current grapevine farming practices is a lack of personnel. As technology develops and labor prices rise, we may expect increased mechanization in the vineyard. As a result, growers must now address the problem of sustainability and consider using an alternative strategy that incorporates UAVs to save time.

5. Results

This section provides the results of the bibliometric study on the global trends in adopting UAV-PA and UAV-PV, the popular journals for academicians to submit their work, the available funding organizations, and the leading nations, institutions, and authors conducting research on utilizing UAV-PA and UAV-PV.

5.1. Global Trends in Adopting the UAV-PA

The global trend in adopting UAV-PA is most inclined towards the USA, where many researchers are working, followed by China and then Italy, as Figure 7 also depicts.

5.1.1. Worldwide Published Documents by Countries on Adopting UAV-PA

More than 80 nations are working on the adoption of UAV-PA, with the top ten countries shown in Figure 8. This analysis was conducted using the Scopus database: according to the database, a total of 1084 relevant papers had been published globally as of 15 November 2021, with the ten leading nations’ publications depicted in Figure 8.
The USA has contributed a total of 238 papers and is the leading country in releasing documents on the use of UAV-PA, followed by China, which has published 202 documents, and Italy, which has published 120 documents. Spain is in fourth place with 91 documents, followed by Brazil in fifth place with 74 documents.

5.1.2. Worldwide Published Documents by Authors on Adopting UAV-PA

Around the world, numerous researchers have published research articles based on the use of UAV-PA. The year-wise publication count is shown in Figure 9. It can be observed from the figure that research on the use of UAV-PA is increasing year by year.
We classified leading authors on adoption of the UAV-PA into three categories, (i) on the basis of the author’s citation count, (ii) on the basis of the number of documents produced by an author, and, (iii) on the basis of the author’s most cited documents, and we presented the results accordingly.
The top 15 most popular authors, based on their citations in PA, are represented in Figure 10. According to Scopus, Dr. Francisca López Granados is the leading researcher in this subject, with a total of 1525 citations in this field, and she has also produced the maximum number of documents compared to other researchers in the same field (see Figure 11). She is a scientific researcher and member of the Imaging Group (Remote sensing applied to Precision Agriculture and Malherbology) at the Institute for Sustainable Agriculture (IAS) of the Spanish Council for Scientific Research (CSIC). Referring to Table 2, the most highly cited article on adopting UAVs in PA is titled “Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging”, with a total citation count of 371; it is authored by Juliane Bendig, Andreas Bolten, Simon Bennertz, Janis Broscheit, Silas Eichfuss and Georg Bareth, all associated with the Institute of Geography, GIS & RS, University of Cologne, Germany (at the time of publication of this article).

5.1.3. The Top Ten Journals with the Most Publications and Top Funding Sponsors on Adopting UAV-PA

Researchers’ top chosen journals for publishing their work on UAV adoption in PA are shown in Table 3. The ranking provided in the table is based on the number of documents published on adopting UAV-PA.
Researchers most often choose Remote Sensing to publish their work on using UAV-PA, with a total of 111+ papers. With 33+ papers, Computers and Electronics in Agriculture is the second most popular journal in this subject.
The top ten funding agencies in the world that have provided grants for the use of UAV-PA are listed in Figure 12. China’s National Science Foundation leads the way in financing research on using UAV-PA, with 51 articles published, followed by the European Commission with 37 documents. According to the papers released and available in the Scopus database, most of the UAV-PA funding came from Europe, the United States, Brazil, and China.

5.2. Global Trends in Adopting the UAV-PV

PV is fast acquiring international recognition. The worldwide trend in using UAV-PV is leaning towards Italy, where many academics are working, followed by Spain, and finally, the United States in the top three spots. The global trend in UAV-PV is shown in Figure 13.

5.2.1. Worldwide Published Documents by Countries on Adopting UAV-PV

Over 22 countries are working on UAV adoption in PV, with the top 10 countries indicated in Figure 14. Referring to the Scopus database, a total of 182 articles related to the use of UAV-PV had been published internationally as of 15 November 2021, with 10 countries’ publications shown in Figure 14.
Italy has contributed a total of 63 papers and is the leading country in terms of document release on UAV-PV, followed by Spain (45 documents) and the United States (31 documents). With 15 papers, Australia is in fourth place, followed by France and Portugal with 13 and 11 documents, respectively.

5.2.2. Worldwide Published Documents by Authors on Adopting UAV-PV

Hundreds of academicians from around the world have published articles on the use of UAV-PV. Figure 15 shows the number of publications by year. The graph shows that research on the usage of UAV-PV is increasing year by year; according to the database, the trend towards using UAV-PV began in 2011.
According to the articles published, the top 10 most highly cited researchers in PV are listed in Figure 16. Dr. Alessandro Matese of the National Research Council’s Institute of BioEconomy (CNR-IBE) in Florence, Italy, is the most cited researcher in the topic of UAV adoption in PV, with a total of 777 citations to his name. He also has the most articles published in this field (Figure 17), with 19, making him the most prolific author in this field. Referring to Table 4, the most highly cited article on adopting UAV-PV is titled “Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture”, with a total citation count of 312; it is authored by Alessandro Matese, Piero Toscano, Salvatore Filippo Di Gennaro, Lorenzo Genesio, Francesco Primo Vaccari, Jacopo Primicerio, Alessandro Zaldei, and Beniamino Gioli, associated with IBIMET CNR—Istituto di Biometeorologia, Consiglio Nazionale delle Ricerche, Firenze, Italy (at the time of publication of this article), and Claudio Belli and Roberto Bianconi, associated with Terrasystem s.r.l., Italy (at the time of publication of this article).

5.2.3. The Top Five Journals with the Most Publications and Funding Sponsors on Adopting UAV-PV

Table 5 shows the top journals chosen by researchers for publishing their work on UAV adoption in PV. The ranking provided in the table is based on the number of documents published on adopting UAV-PV.
Researchers most often choose to publish their work on utilizing UAV-PV in the journal Remote Sensing, with 29+ papers. Acta Horticulturae and Computers and Electronics in Agriculture are the second and third most popular journals, with 9 and 7 articles, respectively.
Figure 18 depicts the top five funding agencies on the globe that have given grants for the use of UAV-PV. With 16 papers, the European Commission leads funding research on the use of UAV-PV, followed by the European Regional Development Fund with 10 articles. According to articles published and available in the database, the majority of the funding came from Europe.

6. Findings and Discussion

This bibliometric analysis is intended to provide scholars working in this field with an update on UAV-PA and UAV-PV. This survey also identified the journals favored by researchers to publish their work, the accessible funding agencies, and the nations and authors (based on their citations) leading the research on using UAVs for PA. The survey revealed that the use of UAV-PA and UAV-PV plays an essential role in increasing crop quality and yield by remotely sensing the farm’s health and status. This approach greatly reduces labor costs, manpower, and time for growers and vineyard managers. Growers increase their production potential by using this approach, since it keeps the plants healthier for longer, allowing them to supply the world, which is the primary goal of using UAVs in PA. Viticulture is a key economic element in several European nations. PV refers to the practice of grape growers and winemakers using various information technology tools; these tools are required to better sense and grasp variability in their production systems, and that knowledge is then used to better match production inputs to desired or expected outputs. Crop sensors and yield monitors, remote sensors, GIS, and GNSS are among PV’s technical advances, which represent a growing trend in the wine business. The enhanced and cost-effective sensors, methods, and equipment for data gathering from UAVs are driving significant growth in PV; in a sense, UAVs are proving to be the backbone of PA. Furthermore, many other sensor options for vineyards are available, especially proximal sensors, which have proved to be one of the key elements for the management of treatments and interventions within the vineyards. For example, tractors and unmanned ground vehicles have been designed to perform site-specific tasks without human intervention, thanks to sensors that can monitor the plants’ health on the go [130]; an experiment was carried out without human intervention during the night using an all-terrain vehicle developed in [131], to which various sensors were attached to obtain an image of the vine in order to identify the berries; in [132], a technique based on optical sensors is proposed for different irrigation treatments in vineyards, allowing a non-invasive evaluation of plant water stress dynamics in a timely manner; and in [133], an experiment is conducted on 21 individual branches under various canopy treatments, utilizing a proximal sensor to improve the accuracy of vineyard yield estimation.
In the field of PV, there have been numerous developments in the use of UAV-PV. A few of these are mentioned here, although there are more throughout the paper.
  • Soil categorization: When the vegetation index values were utilized as input data to trained techniques, the best performance in the categorization of vineyard soil RGB images was obtained, with overall accuracy values around 0.98 and high sensitivity values for the soil class [97]. To monitor farmland soil parameters and crop growth, UAV remote sensing platforms have been equipped with high-resolution hyperspectral sensors [98].
  • Weed detection and control: In vineyards, bermudagrass is a major issue. The spectral closeness of grapevines and bermudagrass makes it tough to distinguish the two species using just the spectral information from a multi-band image sensor. Using ultra-high spatial resolution UAV pictures and object-based image analysis, this problem has been addressed, and the accuracy of this approach in distinguishing between grapevines and bermudagrass (Cynodon dactylon) is better than 97.7% [134]. Additionally, an algorithm is proposed in [135] for detecting and mapping the presence of bermudagrass based on spatial information, as well as for accurately mapping the presence of vines, cover crops, Cynodon dactylon, and bare soil in order to apply site-specific treatment to the vegetation. Furthermore, this research claims to be effective in controlling bermudagrass in a short amount of time. As a result, the combination of UAV imagery and the algorithm would enable farmers to continue cover crop-based management schemes in their vineyards while also controlling bermudagrass.
  • Disease detection: Disease detection is essential in preventing a disease from spreading further in the vineyard. If a disease spreads in the vineyard, it has severe economic effects for the growers, and detecting the disease in the vineyard is one of the most difficult tasks for viticulturists. A deep learning technique was reported in [136] to identify areas of infection in the grapevine using a UAV, by taking images in the visible domain and then processing them with convolutional neural networks to detect the symptoms; this paper claims that the technique is more than 95.8% accurate in detecting the infection. Flavescence dorée, a form of grapevine disease, can be identified using UAV multispectral data, as reported in [57]. This study also examines the potential for 20 variables, i.e., 11 related to vegetation indices, 5 dependent on spectral bands, and 4 associated with biophysical parameters, to be computed from UAV multispectral imagery in order to remotely distinguish symptomatic from asymptomatic areas in a vineyard.
  • Monitoring the vegetation and irrigation control: Due to the direct relation between radiation interception and the evaporative surface, canopy cover maps are used for irrigation management, primarily in order to calculate the basal evapotranspiration coefficient. Crop size and temporal development depend on the water supply, and crop canopy maps are accordingly measured to assess the spatial consistency of the irrigation system. The results of [137] showed that the green-red vegetation index (GRVI) is appropriate for assessing vegetation cover: when it came to recognizing phenological crop changes and detecting variability in field irrigation, the GRVI outperformed the NDVI. Motohka et al. [138] suggested the usage of the GRVI, which may be calculated using the formula
    $\mathrm{GRVI} = \dfrac{\rho_{\mathrm{green}} - \rho_{\mathrm{red}}}{\rho_{\mathrm{green}} + \rho_{\mathrm{red}}},$
    where $\rho_{\mathrm{green}}$ is the reflectance in the green band and $\rho_{\mathrm{red}}$ is the reflectance in the visible red band. The GRVI is used to identify (i) green vegetation: $\rho_{\mathrm{green}}$ is higher than $\rho_{\mathrm{red}}$ (GRVI > 0); (ii) soils: $\rho_{\mathrm{green}}$ is lower than $\rho_{\mathrm{red}}$ (GRVI < 0); and (iii) water or snow: $\rho_{\mathrm{green}}$ and $\rho_{\mathrm{red}}$ are almost the same (GRVI close to 0). A minimal computation sketch of the GRVI and this three-way interpretation is given after this list.
  • Grapevine maturity: It was discovered in [139] that by using spectral information gathered from a UAV, it is possible to distinguish between vines of various vigor in a Guyot-trained, mature vineyard of ‘Sangiovese’ located in Tuscany. A system for determining the ripeness of grape clusters has been developed by the researchers in Spain [140]. When a grape begins to become bluish, it is presumed to be ripe, and using simple image processing and filtering, it is possible to identify mature grape clusters in a short amount of time.
  • Yield estimation: Forecasting yields is critical for harvest management and for scheduling wine-making activities. Traditional yield prediction approaches are time-consuming and depend on manual sampling, making it challenging to account for vineyards’ inherent geographical variability. In [140], an unsupervised and automated method for detecting grape clusters in red grapevine varieties is established using a UAV photogrammetric technique and color indices, with $R^2$ values greater than 0.82; this precision in grape detection opens the door to production prediction in red grape vineyards. Every farmer aspires to forecast their vineyard’s yield in advance, and hence yield prediction is an important issue in vineyard management in order to achieve the required grape production and quality. In [141], an automated system is developed that can estimate yield (5 weeks before harvest) using high-resolution RGB photos and a UAV platform throughout the vineyard. A technique has also been developed in [142] for capturing multispectral imagery through a UAV, which is then processed together with artificial neural networks to create a relationship between the vegetation index, the vegetated fraction cover, and yield; this technique demonstrates that when machine learning is used, the outcomes are significantly more accurate. Although promising results were obtained earlier in the development process, more exact yield forecasts were achieved when images were captured nearer to the harvest date.
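As referenced in the bullet on vegetation monitoring and irrigation control above, the following minimal Python sketch implements the GRVI formula and the three-way interpretation (green vegetation, soil, water or snow); the tolerance used to decide when the two reflectances are "almost the same" is an assumption for illustration, not a value prescribed in [137,138].

import numpy as np

def grvi(green: np.ndarray, red: np.ndarray) -> np.ndarray:
    """GRVI = (rho_green - rho_red) / (rho_green + rho_red), pixel-wise."""
    green = green.astype(np.float64)
    red = red.astype(np.float64)
    denom = green + red
    return np.where(denom > 0, (green - red) / denom, np.nan)

def classify_grvi(g: np.ndarray, tol: float = 0.02) -> np.ndarray:
    """Map GRVI values to the three classes described above.

    `tol` is an assumed tolerance for 'green and red almost the same';
    it is not a threshold prescribed by the cited studies.
    """
    classes = np.full(g.shape, "water/snow", dtype=object)
    classes[g > tol] = "green vegetation"   # rho_green > rho_red
    classes[g < -tol] = "soil"              # rho_green < rho_red
    return classes

# Toy per-pixel reflectances in the green and red bands.
green = np.array([0.25, 0.10, 0.12])
red   = np.array([0.10, 0.18, 0.12])
g = grvi(green, red)
print(g)                 # approximately [ 0.43 -0.29  0.00 ]
print(classify_grvi(g))  # ['green vegetation' 'soil' 'water/snow']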
During the study, and utilizing the Scopus database, it was discovered that more than 80 countries were working on UAV adoption in PA as of 15 November 2021, with a large number of articles relevant to the use of UAVs for PA having been published internationally. The United States has provided the most papers and is the country that has released the most documents on the use of UAVs for PA, followed by China, Italy, Spain, and Brazil. The top three publications favored by researchers working on UAV adoption for PA are (i) Remote Sensing, (ii) Computers and Electronics in Agriculture, and (iii) Nongye Gongcheng Xuebao (Transactions of the Chinese Society of Agricultural Engineering). The three major funding agencies in this field are China’s National Science Foundation, the European Commission, and China’s Ministry for Science and Technology.
PV is becoming more popular internationally as well. Globally, the use of UAV-PV is trending towards Italy, where many researchers and academicians are working, followed by Spain and, finally, the United States. Researchers’ preferred journals on using UAV-PV are (i) Remote Sensing, (ii) Acta Horticulturae, and (iii) Computers and Electronics in Agriculture. The two primary financing agencies for this area are (i) the European Commission and (ii) the European Regional Development Fund.
The major institutions working in this field, ranked based on their citations, are also worth mentioning; they are listed in Table 6.

6.1. Some Light on Economic Analysis

This section reports some of the costs that small growers can incur if they want to use/implement UAV-PA technologies, so that a rough idea of a possible investment can be inferred. In particular, we briefly discuss the economic analysis in [36], where typical costs of service providers are given, and we also add further information regarding the costs in case an ad hoc custom solution is implemented.
Farmers using UAVs to perform PA are aiming for high-quality images with high precision, as well as the ability to take measurements of the field at any time they choose. Due to the high spatial resolution of the imagery, high operational flexibility, and low operational costs, UAVs have demonstrated that they can compete with traditional acquisition platforms (such as satellites and aircraft). It has been observed that for small landholders with an area of around 5 ha, UAVs appear to be the most cost-effective solution due to their low data acquisition cost and use of advanced sensor technologies [36]. In [36], three broad cost categories were considered for a campaign of 100 crop images captured by UAVs, with the service purchased from a third party: data acquisition, georeferencing and orthorectification, and image processing. The cost of data acquisition is €1500, which includes the cost of organizing and carrying out the acquisition campaign to obtain the raw images; the cost of georeferencing and orthorectification is €500; and the cost of image processing is €200, for a total of about €2200 per campaign. Depending on the location, services from UAV providers can cost anywhere from $12 per hectare for raw image files to $70 per hectare for an orthomosaic image in the United States [143]. If one does not wish to use a service provider, another viable option is the AgriQ [144], a $1400 UAV developed for PA. The AgriQ consists of three subsystems: the drone, the multispectral imaging system, and the open-source software that computes useful information for farmers [145]. The authors approached the problem from four angles: (1) the drone’s construction; (2) the vision algorithms; (3) the autonomous trajectory, taking into account all parameters in order to properly recover all of the crops’ visual information; and (4) the development of a low-cost multispectral imaging system with multiple bands.
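To put these figures side by side, a small back-of-envelope sketch in Python follows; the 5 ha plot size is the smallholding considered in [36], while the number of campaigns per season and the direct comparison across currencies are illustrative assumptions rather than results from the cited studies.

# Back-of-envelope comparison using only the figures quoted above; the 5 ha
# plot size comes from [36], while the number of acquisition campaigns per
# season is an illustrative assumption.
area_ha = 5
campaigns_per_season = 4  # assumed

third_party_campaign_eur = 1500 + 500 + 200              # acquisition + georef/ortho + processing
third_party_season_eur = third_party_campaign_eur * campaigns_per_season

service_low_usd = 12 * area_ha * campaigns_per_season    # raw imagery at $12/ha
service_high_usd = 70 * area_ha * campaigns_per_season   # orthomosaic at $70/ha

agriq_usd = 1400                                         # one-off purchase of the AgriQ UAV

print(f"Third-party campaigns per season: EUR {third_party_season_eur}")
print(f"US service provider per season:   USD {service_low_usd}-{service_high_usd}")
print(f"AgriQ hardware (one-off):         USD {agriq_usd} plus in-house operation")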

6.2. Future Possibilities

Acquiring and analyzing spectral data requires considerable additional cost and skill sets that are frequently unavailable; therefore, the identification of geometric indices that enable the monitoring of spatial variability using low-cost instruments based on photogrammetry techniques and high-resolution RGB cameras will be critical in the future. There is also a need for the rapid evolution of yield forecast methodologies based on UAV data and machine learning techniques in order to avoid routine human engagement. Various techniques for irrigation monitoring in a vineyard using UAVs have been developed, but research is still lagging; future research directions could include automated irrigation based on crop water needs and monitoring of vines based on their physiology, more precisely, the control of grapevine irrigation based on plant-based models. It is now necessary to develop a web-based one-stop service for UAV adoption in PV (and PA), which will assist in providing information related to supporting field operations in the vineyard, health monitoring of the vines and grapes, canopy management, and other key areas, as well as providing online digital content that is accessible from any location. Last but not least, a UAV system with the bare minimum payload required for maximum surveillance of the vineyards should be designed, so as to make it cost-effective and frequently usable by growers.

Author Contributions

Conceptualization, A.P.S. and L.G.; methodology, A.P.S. and A.Y.; formal analysis, A.P.S.; writing—original draft preparation, A.P.S.; writing—review and editing, V.M., A.Y., L.I. and L.G.; supervision, L.G.; funding acquisition, L.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work received funding from the Italian Ministry of University and Research, Programma Operativo Nazionale 2014–2020 (PON “R&I” 2014–2020), project VERITAS, CUP B64I20000820005.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data were retrieved from the Scopus database for the period 1 January 2006 to 15 November 2021 via https://0-www-scopus-com.brum.beds.ac.uk//.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
GIS: Geographic Information Systems
GPS: Global Positioning System
GRVI: Green-Red Vegetation Index
GNSS: Global Navigation Satellite System
IoT: Internet of Things
LAI: Leaf Area Index
NIR: Near Infrared
NDVI: Normalised Difference Vegetation Index
PA: Precision Agriculture
PV: Precision Viticulture
RGB: Red, Green, and Blue
RPA: Remotely Piloted Aircraft
UAV: Unmanned Aerial Vehicle
UAV-PA: Unmanned Aerial Vehicle in Precision Agriculture
UAV-PV: Unmanned Aerial Vehicle in Precision Viticulture
WSN: Wireless Sensor Network

References

  1. Langemeier, M.; Boehlje, M. What Will Be the Capabilities and Skills Needed to Manage the Farm of the Future? Farmdoc Dly. 2021, 11, 1–4. [Google Scholar]
  2. Zangina, U.; Buyamin, S.; Aman, M.N.; Abidin, M.S.Z.; Mahmud, M.S.A. A greedy approach to improve pesticide application for precision agriculture using model predictive control. Comput. Electron. Agric. 2021, 182, 105984. [Google Scholar] [CrossRef]
  3. Hedley, C. The role of precision agriculture for improved nutrient management on farms. J. Sci. Food Agric. 2015, 95, 12–19. [Google Scholar] [CrossRef]
  4. Beyer, M.; Wallner, M.; Bahlmann, L.; Thiemig, V.; Dietrich, J.; Billib, M. Rainfall characteristics and their implications for rain-fed agriculture: A case study in the Upper Zambezi River Basin. Hydrol. Sci. J. 2016, 61, 321–343. [Google Scholar] [CrossRef]
  5. Yang, C.; Everitt, J.H.; Du, Q.; Luo, B.; Chanussot, J. Using high-resolution airborne and satellite imagery to assess crop growth and yield variability for precision agriculture. Proc. IEEE 2012, 101, 582–592. [Google Scholar] [CrossRef]
  6. Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef]
  7. Huuskonen, J.; Oksanen, T. Soil sampling with drones and augmented reality in precision agriculture. Comput. Electron. Agric. 2018, 154, 25–35. [Google Scholar] [CrossRef]
  8. Khanal, S.; Fulton, J.; Shearer, S. An overview of current and potential applications of thermal remote sensing in precision agriculture. Comput. Electron. Agric. 2017, 139, 22–32. [Google Scholar] [CrossRef]
  9. Martínez, J.; Egea, G.; Agüera, J.; Pérez-Ruiz, M. A cost-effective canopy temperature measurement system for precision agriculture: A case study on sugar beet. Precis. Agric. 2017, 18, 95–110. [Google Scholar] [CrossRef]
  10. Onibonoje, M.O.; Nwulu, N. Synergistic Technologies for Precision Agriculture. In Artificial Intelligence and IoT-Based Technologies for Sustainable Farming and Smart Agriculture; IGI Global: Hershey, PA, USA, 2021; pp. 123–139. [Google Scholar]
  11. Lieve, W.V. Precision Agriculture and the Future of Farming in Europe; Scientific Foresight Unit (STOA): Brussels, Belgium, 2016. [Google Scholar]
  12. Xu, X.; Fan, L.; Li, Z.; Meng, Y.; Feng, H.; Yang, H.; Xu, B. Estimating Leaf Nitrogen Content in Corn Based on Information Fusion of Multiple-Sensor Imagery from UAV. Remote Sens. 2021, 13, 340. [Google Scholar] [CrossRef]
  13. Zhou, X.; He, J.; Chen, D.; Li, J.; Jiang, C.; Ji, M.; He, M. Human-robot skills transfer interface for UAV-based precision pesticide in dynamic environments. Assem. Autom. 2021, 41, 345–357. [Google Scholar] [CrossRef]
  14. Zhou, Z.; Majeed, Y.; Naranjo, G.D.; Gambacorta, E.M. Assessment for crop water stress with infrared thermal imagery in precision agriculture: A review and future prospects for deep learning applications. Comput. Electron. Agric. 2021, 182, 106019. [Google Scholar] [CrossRef]
  15. Basso, B.; Antle, J. Digital agriculture to design sustainable agricultural systems. Nat. Sustain. 2020, 3, 254–256. [Google Scholar] [CrossRef]
  16. Xiao, K.; Xiao, D.; Luo, X. Smart water-saving irrigation system in precision agriculture based on wireless sensor network. Trans. Chin. Soc. Agric. Eng. 2010, 26, 170–175. [Google Scholar]
  17. Gómez-Candón, D.; De Castro, A.; López-Granados, F. Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2014, 15, 44–56. [Google Scholar] [CrossRef] [Green Version]
  18. Bonfante, A.; Monaco, E.; Manna, P.; De Mascellis, R.; Basile, A.; Buonanno, M.; Cantilena, G.; Esposito, A.; Tedeschi, A.; De Michele, C.; et al. LCIS DSS—An irrigation supporting system for water use efficiency improvement in precision agriculture: A maize case study. Agric. Syst. 2019, 176, 102646. [Google Scholar] [CrossRef]
  19. Burkart, A.; Hecht, V.; Kraska, T.; Rascher, U. Phenological analysis of unmanned aerial vehicle based time series of barley imagery with high temporal resolution. Precis. Agric. 2018, 19, 134–146. [Google Scholar] [CrossRef]
  20. Brown, R.M.; Dillon, C.R.; Schieffer, J.; Shockley, J.M. The carbon footprint and economic impact of precision agriculture technology on a corn and soybean farm. J. Environ. Econ. Policy 2016, 5, 335–348. [Google Scholar] [CrossRef]
  21. Al-Gaadi, K.A.; Hassaballa, A.A.; Tola, E.; Kayad, A.G.; Madugundu, R.; Alblewi, B.; Assiri, F. Prediction of potato crop yield using precision agriculture techniques. PLoS ONE 2016, 11, e0162219. [Google Scholar] [CrossRef]
  22. Colaço, A.F.; Molin, J.P.; Rosell-Polo, J.R. Spatial variability in commercial orange groves. Part 1: Canopy volume and height. Precis. Agric. 2019, 20, 788–804. [Google Scholar] [CrossRef] [Green Version]
  23. Álamo, S.; Ramos, M.; Feito, F.; Cañas, A. Precision techniques for improving the management of the olive groves of southern Spain. Span. J. Agric. Res. 2012, 10, 583–595. [Google Scholar] [CrossRef] [Green Version]
  24. Winkler, A.J. General Viticulture; University of California Press: Berkeley, CA, USA, 1974. [Google Scholar]
  25. Sassu, A.; Gambella, F.; Ghiani, L.; Mercenaro, L.; Caria, M.; Pazzona, A.L. Advances in Unmanned Aerial System Remote Sensing for Precision Viticulture. Sensors 2021, 21, 956. [Google Scholar] [CrossRef]
  26. Spachos, P.; Gregori, S. Integration of wireless sensor networks and smart uavs for precision viticulture. IEEE Internet Comput. 2019, 23, 8–16. [Google Scholar] [CrossRef]
  27. López-Granados, F.; Torres-Sánchez, J.; Jiménez-Brenes, F.M.; Oneka, O.; Marín, D.; Loidi, M.; de Castro, A.I.; Santesteban, L.G. Monitoring vineyard canopy management operations using UAV-acquired photogrammetric point clouds. Remote Sens. 2020, 12, 2331. [Google Scholar] [CrossRef]
  28. De Castro, A.I.; Jiménez-Brenes, F.M.; Torres-Sánchez, J.; Peña, J.M.; Borra-Serrano, I.; López-Granados, F. 3-D characterization of vineyards using a novel UAV imagery-based OBIA procedure for precision viticulture applications. Remote Sens. 2018, 10, 584. [Google Scholar] [CrossRef] [Green Version]
  29. Di Gennaro, S.F.; Dainelli, R.; Palliotti, A.; Toscano, P.; Matese, A. Sentinel-2 validation for spatial variability assessment in overhead trellis system viticulture versus UAV and agronomic data. Remote Sens. 2019, 11, 2573. [Google Scholar] [CrossRef] [Green Version]
  30. Torres-Sánchez, J.; Marín, D.; De Castro, A.; Oria, I.; Jiménez-Brenes, F.; Miranda, C.; Santesteban, L.; López-Granados, F. Assessment of vineyard trimming and leaf removal using UAV photogrammetry. In Precision Agriculture’19; Wageningen Academic Publishers: Wageningen, The Netherlands, 2019; p. e0130479. [Google Scholar]
  31. Romboli, Y.; Di Gennaro, S.; Mangani, S.; Buscioni, G.; Matese, A.; Genesio, L.; Vincenzini, M. Vine vigour modulates bunch microclimate and affects the composition of grape and wine flavonoids: An unmanned aerial vehicle approach in a Sangiovese vineyard in Tuscany. Aust. J. Grape Wine Res. 2017, 23, 368–377. [Google Scholar] [CrossRef]
  32. Mondello, V.; Larignon, P.; Armengol Fortí, J.; Kortekamp, A.; Váczy, K.; Prezman, F.; Serrano, E.; Rego, C.; Mugnai, L.; Fontaine, F. Management of grapevine trunk diseases: Knowledge transfer, current strategies and innovative strategies adopted in Europe. Phytopathol. Mediterr. 2018, 57, 369–383. [Google Scholar]
  33. Santos, J.A.; Fraga, H.; Malheiro, A.C.; Moutinho-Pereira, J.; Dinis, L.T.; Correia, C.; Moriondo, M.; Leolini, L.; Dibari, C.; Costafreda-Aumedes, S.; et al. A review of the potential climate change impacts and adaptation options for European viticulture. Appl. Sci. 2020, 10, 3092. [Google Scholar] [CrossRef]
  34. Bramley, R.; Pearse, B.; Chamberlain, P. Being profitable precisely-a case study of precision viticulture from Margaret River. Aust. N. Z. Grapegrow. Winemak. 2003, 84–87. [Google Scholar]
  35. Arnó Satorra, J.; Martínez Casasnovas, J.A.; Ribes Dasi, M.; Rosell Polo, J.R. Precision viticulture. Research topics, challenges and opportunities in site-specific vineyard management. Span. J. Agric. Res. 2009, 7, 779–790. [Google Scholar] [CrossRef] [Green Version]
  36. Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture. Remote Sens. 2015, 7, 2971–2990. [Google Scholar] [CrossRef] [Green Version]
  37. Pallottino, F.; Biocca, M.; Nardi, P.; Figorilli, S.; Menesatti, P.; Costa, C. Science mapping approach to analyze the research evolution on precision agriculture: World, EU and Italian situation. Precis. Agric. 2018, 19, 1011–1026. [Google Scholar] [CrossRef]
  38. Santana, L.S.; Teodoro, A.J.d.S.; Santana, M.S.; Rossi, G.; Palchetti, E. Advances in Precision Coffee Growing Research: A Bibliometric Review. Agronomy 2021, 11, 1557. [Google Scholar] [CrossRef]
  39. Abdollahi, A.; Rejeb, K.; Rejeb, A.; Mostafa, M.M.; Zailani, S. Wireless Sensor Networks in Agriculture: Insights from Bibliometric Analysis. Sustainability 2021, 13, 12011. [Google Scholar] [CrossRef]
  40. Lara, M.d.J.D.; Bernabe, J.G.; Benítez, R.Á.G.; Toxqui, J.M.; Huerta, M.K. Bibliometric Analysis of the Use of the Internet of Things in Precision Agriculture. In Proceedings of the 2021 IEEE International Conference on Engineering Veracruz (ICEV), Boca del Rio, Mexico, 25–28 October 2021; pp. 1–5. [Google Scholar]
  41. Bertoglio, R.; Corbo, C.; Renga, F.M.; Matteucci, M. The Digital Agricultural Revolution: A Bibliometric Analysis Literature Review. IEEE Access 2021, 9, 134762–134782. [Google Scholar] [CrossRef]
  42. Sott, M.K.; Nascimento, L.d.S.; Foguesatto, C.R.; Furstenau, L.B.; Faccin, K.; Zawislak, P.A.; Mellado, B.; Kong, J.D.; Bragazzi, N.L. A Bibliometric Network Analysis of Recent Publications on Digital Agriculture to Depict Strategic Themes and Evolution Structure. Sensors 2021, 21, 7889. [Google Scholar] [CrossRef]
  43. Aleixandre-Benavent, R.; Aleixandre-Tudo, J.L.; Ferrer-Sapena, A.; Aleixandre, J.L.; Alcaide, G.G.; Du Toit, W. Bibliometric analysis of publications by South African viticulture and oenology research centres. S. Afr. J. Sci. 2012, 108, 1–11. [Google Scholar] [CrossRef] [Green Version]
  44. Costa, C.; Biocca, M.; Pallottino, F.; Nardi, P.; Figorilli, S. Structure of the precision agriculture research in Italy from 2000 to 2016: A term mapping approach. Chem. Eng. Trans. 2017, 58, 643–648. [Google Scholar]
  45. Sånchez, G.C.; Castro-López, L.; Méndez, S. Contribution of Mexican scholars to viticultural and oenological research: Where do we stand? Oeno One 2018, 52, 273–278. [Google Scholar]
  46. Van Eck, N.J.; Waltman, L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics 2010, 84, 523–538. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  47. Cobo, M.J.; López-Herrera, A.G.; Herrera-Viedma, E.; Herrera, F. SciMAT: A new science mapping analysis software tool. J. Am. Soc. Inf. Sci. Technol. 2012, 63, 1609–1630. [Google Scholar] [CrossRef]
  48. Waltman, L.; Larivière, V. Special issue on bibliographic data sources. Quant. Sci. Stud. 2020, 1, 360–362. [Google Scholar] [CrossRef]
  49. Visser, M.; van Eck, N.J.; Waltman, L. Large-scale comparison of bibliographic data sources: Scopus, Web of Science, Dimensions, Crossref, and Microsoft Academic. Quant. Sci. Stud. 2021, 2, 20–41. [Google Scholar] [CrossRef]
  50. Pop, S.; Cristea, L.; Luculescu, M.C.; Zamfira, S.C.; Boer, A.L. Vegetation Index Estimation in Precision Farming Using Custom Multispectral Camera Mounted on Unmanned Aerial Vehicle. In International Conference on Remote Engineering and Virtual Instrumentation; Springer: Cham, Switzerland, 2019; pp. 674–685. [Google Scholar]
  51. Shafi, U.; Mumtaz, R.; García-Nieto, J.; Hassan, S.A.; Zaidi, S.A.R.; Iqbal, N. Precision agriculture techniques and practices: From considerations to applications. Sensors 2019, 19, 3796. [Google Scholar] [CrossRef] [Green Version]
  52. Ezzy, H.; Charter, M.; Bonfante, A.; Brook, A. How the Small Object Detection via Machine Learning and UAS-Based Remote-Sensing Imagery Can Support the Achievement of SDG2: A Case Study of Vole Burrows. Remote Sens. 2021, 13, 3191. [Google Scholar] [CrossRef]
  53. Santos, L.M.d.; Barbosa, B.D.S.; Andrade, A.D. Use of remotely piloted aircraft in precision agriculture: A review. Dyna 2019, 86, 284–291. [Google Scholar]
  54. Ahmad, A.; Ordonez, J.; Cartujo, P.; Martos, V. Remotely piloted aircraft (RPA) in agriculture: A pursuit of sustainability. Agronomy 2020, 11, 7. [Google Scholar] [CrossRef]
  55. Morerira, M.; Ferraz, G.; Barbosa, B.; Iwasaki, E.; Ferraz, P.; Damasceno, F.; Rossi, G. Design and construction of a low-cost remotely piloted aircraft for precision agriculture applications. Agron. Res. 2019, 17, 1984–1992. [Google Scholar]
  56. Morais, R.; Mendes, J.; Silva, R.; Silva, N.; Sousa, J.J.; Peres, E. A versatile, low-power and low-cost IoT device for field data gathering in precision agriculture practices. Agriculture 2021, 11, 619. [Google Scholar] [CrossRef]
  57. Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.B.; Dedieu, G. Detection of Flavescence dorée grapevine disease using unmanned aerial vehicle (UAV) multispectral imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef] [Green Version]
  58. Bah, M.D.; Hafiane, A.; Canals, R. CRowNet: Deep network for crop row detection in UAV images. IEEE Access 2019, 8, 5189–5200. [Google Scholar] [CrossRef]
  59. Apolo-Apolo, O.; Martínez-Guanter, J.; Egea, G.; Raja, P.; Pérez-Ruiz, M. Deep learning techniques for estimation of the yield and size of citrus fruits using a UAV. Eur. J. Agron. 2020, 115, 126030. [Google Scholar] [CrossRef]
  60. Mohan, M.; Silva, C.A.; Klauberg, C.; Jat, P.; Catts, G.; Cardil, A.; Hudak, A.T.; Dia, M. Individual tree detection from unmanned aerial vehicle (UAV) derived canopy height model in an open canopy mixed conifer forest. Forests 2017, 8, 340. [Google Scholar] [CrossRef] [Green Version]
  61. Bah, M.D.; Hafiane, A.; Canals, R. Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef] [Green Version]
  62. Mouazen, A.M.; De Baerdemaeker, J.; Ramon, H. Towards development of on-line soil moisture content sensor using a fibre-type NIR spectrophotometer. Soil Tillage Res. 2005, 80, 171–183. [Google Scholar] [CrossRef]
  63. Al-Naji, A.; Fakhri, A.B.; Gharghan, S.K.; Chahl, J. Soil color analysis based on a RGB camera and an artificial neural network towards smart irrigation: A pilot study. Heliyon 2021, 7, e06078. [Google Scholar] [CrossRef]
  64. Osco, L.P.; Nogueira, K.; Ramos, A.P.M.; Pinheiro, M.M.F.; Furuya, D.E.G.; Gonçalves, W.N.; de Castro Jorge, L.A.; Junior, J.M.; dos Santos, J.A. Semantic segmentation of citrus-orchard using deep neural networks and multispectral UAV-based imagery. Precis. Agric. 2021, 22, 1171–1188. [Google Scholar] [CrossRef]
  65. Gibril, M.B.A.; Shafri, H.Z.M.; Shanableh, A.; Al-Ruzouq, R.; Wayayok, A.; Hashim, S.J. Deep convolutional neural network for large-scale date palm tree mapping from UAV-based images. Remote Sens. 2021, 13, 2787. [Google Scholar] [CrossRef]
  66. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Iqbal, J.; Wasim, A. Real-time recognition of spraying area for UAV sprayers using a deep learning approach. PLoS ONE 2021, 16, e0249436. [Google Scholar] [CrossRef]
  67. Das, S.; Chapman, S.; Christopher, J.; Choudhury, M.R.; Menzies, N.W.; Apan, A.; Dang, Y.P. UAV-thermal imaging: A technological breakthrough for monitoring and quantifying crop abiotic stress to help sustain productivity on sodic soils—A case review on wheat. Remote Sens. Appl. Soc. Environ. 2021, 23, 100583. [Google Scholar] [CrossRef]
  68. de Castro, A.I.; Shi, Y.; Maja, J.M.; Peña, J.M. UAVs for Vegetation Monitoring: Overview and Recent Scientific Contributions. Remote Sens. 2021, 13, 2139. [Google Scholar] [CrossRef]
  69. FAO. Declaration of the World Summit on Food Security; FAO: Rome, Italy, 2009. [Google Scholar]
  70. Madroñal, D.; Palumbo, F.; Capotondi, A.; Marongiu, A. Unmanned Vehicles in Smart Farming: A Survey and a Glance at Future Horizons. In Proceedings of the 2021 Drone Systems Engineering and Rapid Simulation and Performance Evaluation: Methods and Tools Proceedings, 2021 ACM, Budapest, Hungary, 18–20 January 2021; pp. 1–8. [Google Scholar]
  71. Zhang, H.; Wang, L.; Tian, T.; Yin, J. A Review of Unmanned Aerial Vehicle Low-Altitude Remote Sensing (UAV-LARS) Use in Agricultural Monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
  72. Awais, M.; Li, W.; Cheema, M.M.; Hussain, S.; Shaheen, A.; Aslam, B.; Liu, C.; Ali, A. Assessment of optimal flying height and timing using high-resolution unmanned aerial vehicle images in precision agriculture. Int. J. Environ. Sci. Technol. 2021, 19, 1–18. [Google Scholar] [CrossRef]
  73. Gago, J.; Douthe, C.; Coopman, R.E.; Gallego, P.P.; Ribas-Carbo, M.; Flexas, J.; Escalona, J.; Medrano, H. UAVs challenge to assess water stress for sustainable agriculture. Agric. Water Manag. 2015, 153, 9–19. [Google Scholar] [CrossRef]
  74. Bacco, M.; Ferro, E.; Gotta, A. UAVs in WSNs for agricultural applications: An analysis of the two-ray radio propagation model. In Proceedings of the SENSORS, 2014 IEEE, Valencia, Spain, 2–5 November 2014; pp. 130–133. [Google Scholar]
  75. Daponte, P.; De Vito, L.; Glielmo, L.; Iannelli, L.; Liuzza, D.; Picariello, F.; Silano, G. A review on the use of drones for precision agriculture. IOP Conf. Ser. Earth Environ. Sci. 2019, 275, 012022. [Google Scholar] [CrossRef]
  76. Shao, G.; Han, W.; Zhang, H.; Liu, S.; Wang, Y.; Zhang, L.; Cui, X. Mapping maize crop coefficient Kc using random forest algorithm based on leaf area index and UAV-based multispectral vegetation indices. Agric. Water Manag. 2021, 252, 106906. [Google Scholar] [CrossRef]
  77. Mokhtari, A.; Ahmadi, A.; Daccache, A.; Drechsler, K. Actual Evapotranspiration from UAV Images: A Multi-Sensor Data Fusion Approach. Remote Sens. 2021, 13, 2315. [Google Scholar] [CrossRef]
  78. Lin, Y.C.; Zhou, T.; Wang, T.; Crawford, M.; Habib, A. New orthophoto generation strategies from UAV and ground remote sensing platforms for high-throughput phenotyping. Remote Sens. 2021, 13, 860. [Google Scholar] [CrossRef]
  79. Maes, W.H.; Huete, A.R.; Steppe, K. Optimizing the processing of UAV-based thermal imagery. Remote Sens. 2017, 9, 476. [Google Scholar] [CrossRef] [Green Version]
  80. Han, X.; Thomasson, J.A.; Swaminathan, V.; Wang, T.; Siegfried, J.; Raman, R.; Rajan, N.; Neely, H. Field-Based Calibration of Unmanned Aerial Vehicle Thermal Infrared Imagery with Temperature-Controlled References. Sensors 2020, 20, 7098. [Google Scholar] [CrossRef] [PubMed]
  81. Wang, L.; Liu, J.; Yang, L.; Chen, Z.; Wang, X.; Ouyang, B. Applications of unmanned aerial vehicle images on agricultural remote sensing monitoring. Trans. Chin. Soc. Agric. Eng. 2013, 29, 136–145. [Google Scholar]
  82. Li, D.; Li, M. Research advance and application prospect of unmanned aerial vehicle remote sensing system. Geomat. Inf. Sci. Wuhan Univ. 2014, 39, 505–513. [Google Scholar]
  83. Matese, A.; Di Gennaro, S.F. Practical applications of a multisensor UAV platform based on multispectral, thermal and RGB high resolution images in precision viticulture. Agriculture 2018, 8, 116. [Google Scholar] [CrossRef] [Green Version]
  84. Karakizi, C.; Oikonomou, M.; Karantzalos, K. Spectral discrimination and reflectance properties of various vine varieties from satellite, UAV and proximate sensors. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 31. [Google Scholar] [CrossRef] [Green Version]
  85. Giovos, R.; Tassopoulos, D.; Kalivas, D.; Lougkos, N.; Priovolou, A. Remote Sensing Vegetation Indices in Viticulture: A Critical Review. Agriculture 2021, 11, 457. [Google Scholar] [CrossRef]
  86. Borgogno-Mondino, E.; Lessio, A.; Tarricone, L.; Novello, V.; de Palma, L. A comparison between multispectral aerial and satellite imagery in precision viticulture. Precis. Agric. 2018, 19, 195–217. [Google Scholar] [CrossRef]
  87. Matese, A.; Di Gennaro, S.F. Beyond the traditional NDVI index as a key factor to mainstream the use of UAV in precision viticulture. Sci. Rep. 2021, 11, 1–13. [Google Scholar] [CrossRef]
  88. Ata-Ul-Karim, S.T.; Zhu, Y.; Yao, X.; Cao, W. Determination of critical nitrogen dilution curve based on leaf area index in rice. Field Crop. Res. 2014, 167, 76–85. [Google Scholar] [CrossRef]
  89. Turner, D.; Lucieer, A.; Watson, C. Development of an Unmanned Aerial Vehicle (UAV) for hyper resolution vineyard mapping based on visible, multispectral, and thermal imagery. In Proceedings of the 34th International Symposium on Remote Sensing Of Environment, Sydney, Australia, 10–15 April 2011; p. 4. [Google Scholar]
  90. Córcoles, J.I.; Ortega, J.F.; Hernández, D.; Moreno, M.A. Estimation of leaf area index in onion (Allium cepa L.) using an unmanned aerial vehicle. Biosyst. Eng. 2013, 115, 31–42. [Google Scholar] [CrossRef]
  91. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  92. Zhu, G.; Ju, W.; Chen, J.; Liu, Y. A Novel Moisture Adjusted Vegetation Index (MAVI) to reduce background reflectance and topographical effects on LAI retrieval. PLoS ONE 2014, 9, e102560. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  93. Samiappan, S.; Turnage, G.; Hathcock, L.; Yao, H.; Kincaid, R.; Moorhead, R.; Ashby, S. Classifying common wetland plants using hyperspectral data to identify optimal spectral bands for species mapping using a small unmanned aerial systems—A case study. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 5924–5927. [Google Scholar]
  94. Blekos, K.; Tsakas, A.; Xouris, C.; Evdokidis, I.; Alexandropoulos, D.; Alexakos, C.; Katakis, S.; Makedonas, A.; Theoharatos, C.; Lalos, A. Analysis, Modeling and Multi-Spectral Sensing for the Predictive Management of Verticillium Wilt in Olive Groves. J. Sens. Actuator Netw. 2021, 10, 15. [Google Scholar] [CrossRef]
  95. Vanegas, F.; Bratanov, D.; Powell, K.; Weiss, J.; Gonzalez, F. A novel methodology for improving plant pest surveillance in vineyards and crops using UAV-based hyperspectral and spatial data. Sensors 2018, 18, 260. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  96. Iizuka, K.; Itoh, M.; Shiodera, S.; Matsubara, T.; Dohar, M.; Watanabe, K. Advantages of unmanned aerial vehicle (UAV) photogrammetry for landscape analysis compared with satellite data: A case study of postmining sites in Indonesia. Cogent Geosci. 2018, 4, 1498180. [Google Scholar] [CrossRef]
  97. Poblete-Echeverría, C.; Olmedo, G.F.; Ingram, B.; Bardeen, M. Detection and segmentation of vine canopy in ultra-high spatial resolution RGB imagery obtained from unmanned aerial vehicle (UAV): A case study in a commercial vineyard. Remote Sens. 2017, 9, 268. [Google Scholar] [CrossRef] [Green Version]
  98. Wang, J.; Wang, S.; Zou, D.; Chen, H.; Zhong, R.; Li, H.; Zhou, W.; Yan, K. Social Network and Bibliometric Analysis of Unmanned Aerial Vehicle Remote Sensing Applications from 2010 to 2021. Remote Sens. 2021, 13, 2912. [Google Scholar] [CrossRef]
  99. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Bessa, J.; Sousa, A.; Peres, E.; Morais, R.; Sousa, J.J. Vineyard properties extraction combining UAS-based RGB imagery with elevation data. Int. J. Remote Sens. 2018, 39, 5377–5401. [Google Scholar] [CrossRef]
  100. Zoto, J.; Musci, M.A.; Khaliq, A.; Chiaberge, M.; Aicardi, I. Automatic path planning for unmanned ground vehicle using uav imagery. In International Conference on Robotics in Alpe-Adria Danube Region; Springer: Cham, Switzerland, 2019; pp. 223–230. [Google Scholar]
  101. Cogato, A.; Pagay, V.; Marinello, F.; Meggio, F.; Grace, P.; De Antoni Migliorati, M. Assessing the feasibility of using sentinel-2 imagery to quantify the impact of heatwaves on irrigated vineyards. Remote Sens. 2019, 11, 2869. [Google Scholar] [CrossRef] [Green Version]
  102. Kopačková-Strnadová, V.; Koucká, L.; Jelének, J.; Lhotáková, Z.; Oulehle, F. Canopy top, height and photosynthetic pigment estimation using Parrot Sequoia multispectral imagery and the Unmanned Aerial Vehicle (UAV). Remote Sens. 2021, 13, 705. [Google Scholar] [CrossRef]
  103. Costa, J.; Egipto, R.; Sánchez-Virosta, A.; Lopes, C.; Chaves, M. Canopy and soil thermal patterns to support water and heat stress management in vineyards. Agric. Water Manag. 2019, 216, 484–496. [Google Scholar] [CrossRef]
  104. Di Gennaro, S.F.; Matese, A.; Gioli, B.; Toscano, P.; Zaldei, A.; Palliotti, A.; Genesio, L. Multisensor approach to assess vineyard thermal dynamics combining high-resolution unmanned aerial vehicle (UAV) remote sensing and wireless sensor network (WSN) proximal sensing. Sci. Hortic. 2017, 221, 83–87. [Google Scholar] [CrossRef]
  105. Soubry, I.; Patias, P.; Tsioukas, V. Monitoring Vineyards with UAV and Multi-sensors for the assessment of Water Stress and Grape Maturity. J. Unmanned Veh. Syst. 2017, 5, 37–50. [Google Scholar] [CrossRef] [Green Version]
  106. Romero, M.; Luo, Y.; Su, B.; Fuentes, S. Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management. Comput. Electron. Agric. 2018, 147, 109–117. [Google Scholar] [CrossRef]
  107. Poblete, T.; Ortega-Farías, S.; Moreno, M.A.; Bardeen, M. Artificial neural network to predict vine water status spatial variability using multispectral information obtained from an unmanned aerial vehicle (UAV). Sensors 2017, 17, 2488. [Google Scholar] [CrossRef] [Green Version]
  108. Salam, A. Internet of things in agricultural innovation and security. In Internet of Things for Sustainable Community Development; Springer: Cham, Switzerland, 2020; pp. 71–112. [Google Scholar]
  109. Puig Sirera, À.; Antichi, D.; Warren Raffa, D.; Rallo, G. Application of Remote Sensing Techniques to Discriminate the Effect of Different Soil Management Treatments over Rainfed Vineyards in Chianti Terroir. Remote Sens. 2021, 13, 716. [Google Scholar] [CrossRef]
  110. Reynolds, A. The grapevine, viticulture, and winemaking: A brief introduction. In Grapevine Viruses: Molecular Biology, Diagnostics and Management; Springer: Cham, Switzerland, 2017; pp. 3–29. [Google Scholar]
  111. Weiss, M.; Baret, F. Using 3D point clouds derived from UAV RGB imagery to describe vineyard 3D macro-structure. Remote Sens. 2017, 9, 111. [Google Scholar] [CrossRef] [Green Version]
  112. Volschenk, C.; Hunter, J. Effect of seasonal canopy management on the performance of Chenin blanc/99 Richter grapevines. S. Afr. J. Enol. Vitic. 2001, 22, 36–40. [Google Scholar] [CrossRef] [Green Version]
  113. Vance, A.J.; Reeve, A.L.; Skinkis, P. The role of canopy management in vine balance. In Corvallis, or Extension Service; Oregon State University: Corvallis, OR, USA, 2013. [Google Scholar]
  114. Cruz, A.; Botelho, M.; Silvestre, J.; Castro, R. Soil management: Introduction of tillage in a vineyard with a long-term natural cover. Ciênc. Téc. Vitiviníc. 2012, 27, 27–38. [Google Scholar]
  115. Abdulridha, J.; Ampatzidis, Y.; Roberts, P.; Kakarla, S.C. Detecting powdery mildew disease in squash at different stages using UAV-based hyperspectral imaging and artificial intelligence. Biosyst. Eng. 2020, 197, 135–148. [Google Scholar] [CrossRef]
  116. Mirás-Avalos, J.M.; Araujo, E.S. Optimization of Vineyard Water Management: Challenges, Strategies, and Perspectives. Water 2021, 13, 746. [Google Scholar] [CrossRef]
  117. Páscoa, R.N. In situ visible and near-infrared spectroscopy applied to vineyards as a tool for precision viticulture. In Comprehensive Analytical Chemistry; Elsevier: Amsterdam, The Netherlands, 2018; Volume 80, pp. 253–279. [Google Scholar]
  118. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef] [Green Version]
  119. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef] [Green Version]
  120. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef] [Green Version]
  121. Lelong, C.C.; Burger, P.; Jubelin, G.; Roux, B.; Labbé, S.; Baret, F. Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots. Sensors 2008, 8, 3557–3585. [Google Scholar] [CrossRef]
  122. Torres-Sánchez, J.; Pena, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
  123. Baluja, J.; Diago, M.P.; Balda, P.; Zorer, R.; Meggio, F.; Morales, F.; Tardaguila, J. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrig. Sci. 2012, 30, 511–522. [Google Scholar] [CrossRef]
  124. Zarco-Tejada, P.J.; Guillén-Climent, M.L.; Hernández-Clemente, R.; Catalina, A.; González, M.; Martín, P. Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV). Agric. For. Meteorol. 2013, 171, 281–294. [Google Scholar] [CrossRef] [Green Version]
  125. Tokekar, P.; Vander Hook, J.; Mulla, D.; Isler, V. Sensor planning for a symbiotic UAV and UGV system for precision agriculture. IEEE Trans. Robot. 2016, 32, 1498–1511. [Google Scholar] [CrossRef]
  126. Zarco-Tejada, P.J.; González-Dugo, V.; Williams, L.; Suarez, L.; Berni, J.A.; Goldhamer, D.; Fereres, E. A PRI-based water stress index combining structural and chlorophyll effects: Assessment using diurnal narrow-band airborne imagery and the CWSI thermal index. Remote Sens. Environ. 2013, 138, 38–50. [Google Scholar] [CrossRef]
  127. Mathews, A.J.; Jensen, J.L. Visualizing and quantifying vineyard canopy LAI using an unmanned aerial vehicle (UAV) collected high density structure from motion point cloud. Remote Sens. 2013, 5, 2164–2183. [Google Scholar] [CrossRef] [Green Version]
  128. Santesteban, L.; Di Gennaro, S.; Herrero-Langreo, A.; Miranda, C.; Royo, J.; Matese, A. High-resolution UAV-based thermal imaging to estimate the instantaneous and seasonal variability of plant water status within a vineyard. Agric. Water Manag. 2017, 183, 49–59. [Google Scholar] [CrossRef]
  129. Zarco-Tejada, P.J.; Catalina, A.; González, M.; Martín, P. Relationships between net photosynthesis and steady-state chlorophyll fluorescence retrieved from airborne hyperspectral imagery. Remote Sens. Environ. 2013, 136, 247–258. [Google Scholar] [CrossRef] [Green Version]
  130. Matese, A.; Di Gennaro, S.F. Technology in precision viticulture: A state of the art review. Int. J. Wine Res. 2015, 7, 69–81. [Google Scholar] [CrossRef] [Green Version]
  131. Aquino, A.; Millan, B.; Diago, M.P.; Tardaguila, J. Automated early yield prediction in vineyards from on-the-go image acquisition. Comput. Electron. Agric. 2018, 144, 26–36. [Google Scholar] [CrossRef]
  132. Matese, A.; Baraldi, R.; Berton, A.; Cesaraccio, C.; Di Gennaro, S.F.; Duce, P.; Facini, O.; Mameli, M.G.; Piga, A.; Zaldei, A. Estimation of water stress in grapevines using proximal and remote sensing methods. Remote Sens. 2018, 10, 114. [Google Scholar] [CrossRef] [Green Version]
  133. Hacking, C.; Poona, N.; Manzan, N.; Poblete-Echeverría, C. Investigating 2-D and 3-D proximal remote sensing techniques for vineyard yield estimation. Sensors 2019, 19, 3652. [Google Scholar] [CrossRef] [Green Version]
  134. Jiménez-Brenes, F.M.; López-Granados, F.; Torres-Sánchez, J.; Peña, J.M.; Ramírez, P.; Castillejo-González, I.L.; de Castro, A.I. Automatic UAV-based detection of Cynodon dactylon for site-specific vineyard management. PLoS ONE 2019, 14, e0218132. [Google Scholar] [CrossRef]
  135. de Castro, A.I.; Peña, J.M.; Torres-Sánchez, J.; Jiménez-Brenes, F.M.; Valencia-Gredilla, F.; Recasens, J.; López-Granados, F. Mapping cynodon dactylon infesting cover crops with an automatic decision tree-OBIA procedure and UAV imagery for precision viticulture. Remote Sens. 2020, 12, 56. [Google Scholar] [CrossRef] [Green Version]
  136. Kerkech, M.; Hafiane, A.; Canals, R. Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agric. 2018, 155, 237–243. [Google Scholar] [CrossRef]
  137. Chen, A.; Orlov-Levin, V.; Meron, M. Applying high-resolution visible-channel aerial scan of crop canopy to precision irrigation management. Multidiscip. Digit. Publ. Inst. Proc. 2018, 2, 335. [Google Scholar]
  138. Motohka, T.; Nasahara, K.N.; Oguma, H.; Tsuchida, S. Applicability of green-red vegetation index for remote sensing of vegetation phenology. Remote Sens. 2010, 2, 2369–2387. [Google Scholar] [CrossRef] [Green Version]
  139. Caruso, G.; Tozzini, L.; Rallo, G.; Primicerio, J.; Moriondo, M.; Palai, G.; Gucci, R. Estimating biophysical and geometrical parameters of grapevine canopies (‘Sangiovese’) by an unmanned aerial vehicle (UAV) and VIS-NIR cameras. Vitis 2017, 56, 63–70. [Google Scholar]
  140. Torres-Sánchez, J.; Mesas-Carrascosa, F.J.; Santesteban, L.G.; Jiménez-Brenes, F.M.; Oneka, O.; Villa-Llop, A.; Loidi, M.; López-Granados, F. Grape Cluster Detection Using UAV Photogrammetric Point Clouds as a Low-Cost Tool for Yield Forecasting in Vineyards. Sensors 2021, 21, 3083. [Google Scholar] [CrossRef]
  141. Di Gennaro, S.; Toscano, P.; Cinat, P.; Berton, A.; Matese, A. A precision viticulture UAV-based approach for early yield prediction in vineyard. In Precision Agriculture’19; Wageningen Academic Publishers: Wageningen, The Netherlands, 2019; pp. 370–378. [Google Scholar]
  142. Ballesteros, R.; Intrigliolo, D.S.; Ortega, J.F.; Ramírez-Cuesta, J.M.; Buesa, I.; Moreno, M.A. Vineyard yield estimation by combining remote sensing, computer vision and artificial neural network techniques. Precis. Agric. 2020, 21, 1242–1262. [Google Scholar] [CrossRef]
  143. Hunt Jr, E.R.; Daughtry, C.S. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376. [Google Scholar] [CrossRef] [Green Version]
  144. de Oca, A.M.; Flores, G. The AgriQ: A low-cost unmanned aerial system for precision agriculture. Expert Syst. Appl. 2021, 182, 115163. [Google Scholar] [CrossRef]
  145. Ardupilot. Planning a Mission with Waypoints and Events. 2020. Available online: http://ardupilot.org/copter/docs/common-planning-a-mission-with-waypoints-and-events.html (accessed on 12 March 2022).
Figure 1. Documents published in precision agriculture in the last 10 years (* prior to 15 November 2021).
Figure 2. Thematic evolution structure. This demonstrates that the evaluation and association of keywords increased from 5 to 17 between 2006 and 2021.
Figure 3. Strategic diagram shows keywords of (a) 2006–2013, (b) 2013–2018 and (c) 2018–2021. The horizontal axis (centrality) depicts the importance of each subject and the number of links it has with other themes, while the vertical axis (density) depicts the number of connections between each cluster.
Figure 4. Thematic network structures. The size of the clusters is proportional to the h-index of the associated documents and the significance of the theme in the research field, while the thickness of the lines indicates the strength of the relationship between the clusters. (a) Vegetation index; (b) drones; (c) UAS; (d) sensors; (e) detection method; (f) soil; (g) neural networks.
Figure 5. Decision on soil categorization.
Figure 6. A schematic of inspection from the drone.
Figure 7. Network showing the co-occurrence of the keywords PA and UAVs in the world (created using VOSviewer https://www.vosviewer.com// accessed on 15 November 2021).
Figure 8. Published documents worldwide on adopting UAV-PA.
Figure 9. Year-wise published documents worldwide on adopting UAV-PA (until 15 November 2021).
Figure 10. Most cited authors on adopting UAV-PA.
Figure 11. Most documents published by an author on adopting UAV-PA.
Figure 12. Popular funding sponsors with the most publications on adopting UAV-PA.
Figure 13. Network showing the co-occurrence of the keywords PV and UAVs in the world (created using VOSviewer https://www.vosviewer.com// accessed on 15 November 2021).
Figure 14. Published documents worldwide on adopting UAV-PV.
Figure 15. Year-wise published documents worldwide on adopting UAVs for PV (until 15 November 2021).
Figure 16. Most cited authors on adopting UAV-PV.
Figure 17. Most documents published by an author on adopting UAV-PV.
Figure 18. Popular funding sponsors with the most publications on adopting UAV-PV.
Table 1. Search strategy adopted to explore and retrieve relevant publications for the bibliometric analysis from the Scopus database.
Database: Scopus
Topic for PA: ‘Precision Agriculture’ and ‘UAV’; ‘Precision Agriculture’ and ‘UAS’
Topic for PV: ‘Precision Viticulture’ and ‘UAV’; ‘Precision Viticulture’ and ‘UAS’; ‘Vineyard’ and ‘UAV’; ‘Vineyard’ and ‘UAS’
Number of relevant documents considered in PA: 1084
Number of relevant documents considered in PV: 182
Time-span: 1 January 2006–15 November 2021
Criteria for inclusion: Title, abstract, and keywords should contain the search terms; only English documents.
Bibliometric software: SciMAT and VOSviewer
Table 2. Most cited documents on adopting UAV-PA.
Article Title | Author Details | Journal Name | Year | Citation Count
Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging [118] | Bendig J., Bolten A., Bennertz S., Broscheit J., Eichfuss S., Bareth G. | Remote Sensing | 2014 | 371
Evaluating multispectral images and vegetation indices for precision farming applications from UAV images [119] | Candiago S., Remondino F., De Giglio M., Dubbini M., Gattelli M. | Remote Sensing | 2015 | 335
Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture [120] | Honkavaara E., Saari H., Kaivosoja J., Pölönen I., Hakala T., Litkey P., Mäkynen J., Pesonen L. | Remote Sensing | 2013 | 334
Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture [36] | Matese A., Toscano P., Di Gennaro S.F., Genesio L., Vaccari F.P., Primicerio J., Belli C., Zaldei A., Bianconi R., Gioli B. | Remote Sensing | 2015 | 312
UAVs challenge to assess water stress for sustainable agriculture [73] | Gago J., Douthe C., Coopman R.E., Gallego P.P., Ribas-Carbo M., Flexas J., Escalona J., Medrano H. | Agricultural Water Management | 2015 | 281
Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots [121] | Lelong C.C.D., Burger P., Jubelin G., Roux B., Labbé S., Baret F. | Sensors | 2008 | 278
Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance [91] | Aasen H., Burkart A., Bolten A., Bareth G. | ISPRS Journal of Photogrammetry and Remote Sensing | 2015 | 273
Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV [122] | Torres-Sánchez J., Peña J.M., de Castro A.I., López-Granados F. | Computers and Electronics in Agriculture | 2014 | 267
Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV) [123] | Baluja J., Diago M.P., Balda P., Zorer R., Meggio F., Morales F., Tardaguila J. | Irrigation Science | 2012 | 250
Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV) [124] | Zarco-Tejada P.J., Guillén-Climent M.L., Hernández-Clemente R., Catalina A., González M.R., Martín P. | Agricultural and Forest Meteorology | 2013 | 198
Sensor Planning for a Symbiotic UAV and UGV System for Precision Agriculture [125] | Tokekar P., Hook J.V., Mulla D., Isler V. | IEEE Transactions on Robotics | 2016 | 197
Table 3. Most preferred journals on adopting UAV-PA.
Rank | Journal Name | Documents | h-Index
1 | Remote Sensing | 111 | 124
2 | Computers and Electronics in Agriculture | 33 | 115
3 | Nongye Gongcheng Xuebao Transactions of the Chinese Society of Agricultural Engineering | 30 | 51
4 | Sensors | 27 | 172
5 | Precision Agriculture | 23 | 63
6 | Nongye Jixie Xuebo Transactions of the Chinese Society for Agricultural Machinery | 15 | 42
7 | ISPRS Journal of Photogrammetry and Remote Sensing | 13 | 138
8 | IEEE Access | 13 | 127
9 | Agronomy | 13 | 30
10 | Drones | 10 | 18
Table 4. Most cited documents on adopting UAV-PV.
Article Title | Author Details | Journal Name | Year | Citation Count
Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture [36] | Matese A., Toscano P., Di Gennaro S.F., Genesio L., Vaccari F.P., Primicerio J., Belli C., Zaldei A., Bianconi R., Gioli B. | Remote Sensing | 2015 | 312
Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV) [123] | Baluja J., Diago M.P., Balda P., Zorer R., Meggio F., Morales F., Tardaguila J. | Irrigation Science | 2012 | 250
Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV) [124] | Zarco-Tejada P.J., Guillén-Climent M.L., Hernández-Clemente R., Catalina A., González M.R., Martín P. | Agricultural and Forest Meteorology | 2013 | 198
A PRI-based water stress index combining structural and chlorophyll effects: Assessment using diurnal narrow-band airborne imagery and the CWSI thermal index [126] | Zarco-Tejada P.J., González-Dugo V., Williams L.E., Suárez L., Berni J.A.J., Goldhamer D., Fereres E. | Remote Sensing of Environment | 2013 | 173
Visualizing and quantifying vineyard canopy LAI using an unmanned aerial vehicle (UAV) collected high density structure from motion point cloud [127] | Mathews A.J., Jensen J.L.R. | Remote Sensing | 2013 | 160
High-resolution UAV-based thermal imaging to estimate the instantaneous and seasonal variability of plant water status within a vineyard [128] | Santesteban L.G., Di Gennaro S.F., Herrero-Langreo A., Miranda C., Royo J.B., Matese A. | Agricultural Water Management | 2017 | 124
Relationships between net photosynthesis and steady-state chlorophyll fluorescence retrieved from airborne hyperspectral imagery [129] | Zarco-Tejada P.J., Catalina A., González M.R., Martín P. | Remote Sensing of Environment | 2013 | 105
Detection of Flavescence dorée grapevine disease using Unmanned Aerial Vehicle (UAV) multispectral imagery [57] | Albetis J., Duthoit S., Guttler F., Jacquin A., Goulard M., Poilvé H., Féret J.-B., Dedieu G. | Remote Sensing | 2017 | 84
A novel methodology for improving plant pest surveillance in vineyards and crops using UAV-based hyperspectral and spatial data [95] | Vanegas F., Bratanov D., Powell K., Weiss J., Gonzalez F. | Sensors | 2018 | 81
Table 5. Most preferred journals on adopting UAV-PV.
Rank | Journal Name | Documents | h-Index
1 | Remote Sensing | 29 | 124
2 | Acta Horticulturae | 9 | 58
3 | Computers and Electronics in Agriculture | 7 | 115
4 | Precision Agriculture | 4 | 63
5 | Sensors | 4 | 172
Table 6. Leading institutions working on adopting UAV-PA and UAV-PV.
Institutions Working on Adopting UAV-PA | Citations | Institutions Working on Adopting UAV-PV | Citations
Instituto de Agricultura Sostenible—CSIC, Spain | 2227 | Consiglio Nazionale delle Ricerche, Italy | 942
Universidad de Córdoba, Spain | 765 | Instituto de Agricultura Sostenible—CSIC, Spain | 916
Consiglio Nazionale delle Ricerche, Italy | 631 | Istituto Di Biometeorologia, Florence, Italy | 803
China Agricultural University, China | 565 | Università degli Studi di Torino, Italy | 685
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
