Article

A Farm Management Information System for Semi-Supervised Path Planning and Autonomous Vehicle Control

Hao Wang, Yaxin Ren and Zhijun Meng
1 Beijing Research Center of Intelligent Equipment for Agriculture, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
2 Research Faculty of Agriculture, Hokkaido University, Sapporo 065-8589, Japan
3 Beijing Research Center for Information Technology in Agriculture, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(13), 7497; https://0-doi-org.brum.beds.ac.uk/10.3390/su13137497
Submission received: 7 June 2021 / Revised: 29 June 2021 / Accepted: 30 June 2021 / Published: 5 July 2021
(This article belongs to the Special Issue Farming 4.0: Towards Sustainable Agriculture)

Abstract

This paper presents a farm management information system (FMIS) targeting improvements in the ease of use and sustainability of robot farming systems. The system integrates the functionalities of field survey, path planning, and monitoring and controlling agricultural vehicles in real time. First, a Grabcut-based semi-supervised field registration method is proposed for arable field detection from orthoimages taken by a drone with an RGB camera. It partitions a complex field into simple geometric entities with simple user interaction. The average Mean Intersection over Union is about 0.95 for field sizes ranging from 2.74 ha to 5.06 ha. In addition, desktop software and a web application are developed as the entity of the FMIS. Compared to existing FMISs, this system provides more advanced features for robot farming with simpler user interaction and better results. It allows clients to invoke web services and receive responses independently of programming language and platform. Moreover, the system is compatible with other services, users, and devices following the open-source access protocol. We have evaluated the system by controlling 5 robot tractors at a 2 Hz communication frequency. The communication protocols will be made publicly available to potential users.

1. Introduction

Farming activities involve a wide range of skills, methods, and processes that can be effectively supported by automated systems [1]. The precision agriculture (PA) system and its successor, the smart agriculture (SA) system, are considered to play an important role in boosting agricultural productivity, sustainability, and quality. Developing SA practices and diversifying sustainable agricultural systems require the implementation of elaborate management rules. During the past years, sophisticated farm management systems (FMSs) have been developed to plan, monitor, and control agricultural processes over the Internet and wireless sensor networks [2,3]. Specifically, FMSs with advanced Information and Communication Technologies (ICT) that collect and process data from different sources (such as temperature, humidity, soil moisture, etc.) are called farm management information systems (FMISs) [4,5]. The advent of ICT and artificial intelligence (AI) technologies, as well as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs) at reasonable costs, has enabled farmers to access large amounts of high-quality georeferenced data. Therefore, it is necessary to develop a dedicated FMIS that can cope with big data, which relates not only to the volume but also to the variety and velocity of the collected data [1].
In recent years, the targets of academic and commercial FMISs have evolved from record keeping, machinery management, and documentation for farms to budgeting, finance, quality assurance, sales, and decision support systems (DSSs) for agribusiness management [6]. Fonseca et al. [7] developed an agroecosystem management system, termed “Agro 4.0”, which provides tools for the collection, storage, analysis, visualization, and scenario simulation of sustainability-related information of rural properties. Rupnik et al. developed a cloud-based management system implementing several data analysis methods, which helps farmers better understand their data [8]. Singh et al. developed an FMIS, termed “Agri-Info”, which gathers information from various users through pre-configured devices and provides the required information to users automatically [9]. Academic FMISs have made great progress in field operational planning, which involves field partitioning and coverage path planning on 2D/3D terrain [10,11]. Optimizing field operations reduces the total non-working travel distance and soil compaction, increasing work efficiency and supporting sustainable land management. Based on various optimization algorithms, an FMIS with a field operational planning function determines the optimal predominant working direction, divides a field into working and non-working subareas, and generates complete or partial coverage path plans for each area [12,13]. Some of these algorithms have been integrated into commercial FMISs to optimize input resources such as seeds, fuel, and time. With the successful implementation of autosteering systems in agricultural vehicles, agricultural machinery makers such as John Deere, Fendt, New Holland, and CLAAS all fit their FMISs with telematics. However, these are all closed systems with limited interoperability. One systematic review identifies 81 different features and 53 obstacles of FMISs developed during the last decade [14]. It reveals that there is still no standardized system to enable cohesive interoperability among functionalities related to autonomous farming. Moreover, poor understandability and insufficient farmer skills are frequent obstacles to current FMISs: some systems are difficult for farmers to understand and use due to complex user interfaces and operational processes. Therefore, beyond automated data acquisition and versatile connectivity, the new requirements that autonomous farming introduces to FMISs are ease of use (EoU) and sustainability.
First of all, accurate and informative field information is a prerequisite of farm management. However, human experience and an extensive labor force are still necessary for the field mapping and registration process. For example, manually driving a tractor with an accurate real-time kinematic (RTK) GPS around the field is a common and reliable way to obtain the field boundary. In contrast, a UAV is more efficient than a UGV in field information acquisition, and a UAV equipped with an RGB camera has been exploited in various applications related to crop monitoring and spraying [15]. However, UAV-based field detection and field partitioning is an open area of research, and few methods have been developed. This research aims to simplify data acquisition using a UAV and to analyze the data by integrating human experience with artificial intelligence (AI). AI methods, such as Convolutional Neural Networks (CNNs), have had great success in disease detection, species classification, and plant phenotyping applications [16,17,18]. However, training CNN models requires a large amount of data. Moreover, a well-tuned CNN model cannot work well in all scenarios: once the model fails to detect the arable area, it is difficult for a user to tune the model or correct the result. Therefore, human interaction is important for the field registration process of an FMIS. In this research, a semi-supervised image segmentation method, Grabcut, is adopted for the field registration process, extracting the field with simple user interaction [19]. Grabcut and its extensions have been successfully applied in many applications, such as ship detection at sea [20] and specified road region extraction [21]. Applying Grabcut to the agricultural environment for field detection has not yet been researched.
In addition, the sustainability of a robot farming system is always an important criterion while conducting agricultural practices. Optimizing field operations can improve the working efficiency of agricultural machinery and reduce time and energy consumption. The operational planning of an FMIS is focused on generating complete or partial coverage navigation maps, both minimizing the non-working travel distance and improving yields [22,23]. Planning methods take vehicle capacity and kinematic constraints into account and calculate optimal paths for in-field navigation as well as headland turns [24,25]. However, field complexity and environmental uncertainties are still hurdles for existing DSSs and FMISs. Since agriculture takes place in a semi-constructed environment, a deterministic model cannot fully describe the nature inherent in agricultural production systems [6]. Instead of designing a complex path tracking controller as other researchers have [26,27,28], we propose a self-tuning mechanism for operational planning, by which the FMIS, as well as the automatic vehicles, can cope with the uncertainty of farming operations.
The aim of this research is not to construct a comprehensive FMIS, but rather to integrate agricultural robotics with an advanced FMIS. The objectives are to improve both the EoU and the sustainability of an FMIS from two aspects: field registration and operational planning. First, a semi-supervised field registration method is proposed to extract accurate field information from the UAV sensing data. Second, since operations at the headland are more complex than tracking parallel straight paths, a self-tuning mechanism is adopted in the operational planning process to cope with the uncertainties during headland turns. Lastly, an FMIS is delivered as a platform comprising a web application and desktop software; this entity is developed to evaluate the proposed algorithms and to demonstrate the workflow of the developed management system.

2. Methodology

The overall workflow of the FMIS is depicted in Figure 1. The FMIS consists of a UAV for the field survey, PC software with full accessibility to the server, a web application for users with portable devices, and a radio for nearby monitoring and emergency stop. The PC software is modified from “Geomation”, originally developed by Hitachi Solutions, Ltd. It runs on a Windows laptop with an Intel Core i5-7200U 2.50 GHz CPU and 8 GB RAM. The software platform was built on a PostgreSQL database communicating through a REST framework. The web application is based on HTML5 and JavaScript under the Node.js environment.
The field information acquisition is performed by a hexarotor UAV (EnRoute Co., Ltd., Fujimino, Japan) with a RedEdge camera (MicaSense Inc., Seattle, WA, USA), shown in Figure 2. We use the RGB channels of the image for field registration. The device specifications and settings are similar to the methods presented by Meinen [29] and Kattenborn [30], but with an average height of 50.0 m above ground for a better ground sampling distance. The PC software at the server side manages the sensing data acquired by the UAV and provides a user interface for field registration and operational planning. The working plan can be exported and then imported into agricultural robots, autonomous tractors in this research. The user can monitor and control robot tractors through both the PC software and the web application. Commands, working status, and the working plan are formatted as Extensible Markup Language (XML) information sets and transmitted over the HyperText Transfer Protocol (HTTP). Since HTTP is supported on all operating systems, the farm management system at the server side allows clients to invoke web services and receive responses independently of programming language and platform, whether on a tablet, smartphone, and so on. In addition, users near the farm can monitor and control the robots through a radio operating in the 150 MHz waveband. Its robot-control functionality is the same as that of the PC software and web application, while the radio additionally enables the user to stop the engine in an emergency. Some other functionalities, such as account management and machinery and equipment management, are not listed here.
The robot tractor uploads its working status to the server and requests commands from it following the simple object access protocol (SOAP) [10]. The dataflow between the server and each robot tractor is shown in Figure 3. After initialization, the robot downloads the working plan from the FMIS. The working plan includes the navigation map, path order, vehicle capacity, and agricultural implement size. The system enables users to modify machinery statuses, such as the velocity, engine speed, status of the power take-off (PTO), and position of the hitch. In addition, other machinery statuses (e.g., steer angle, position, and attitude) and the working status of the implement are uploaded to the server and the monitor simultaneously. The communication frequency in this research is set to 2 Hz for real-time control and monitoring.
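To make the exchange concrete, the sketch below shows how a robot-side client might format and upload its working status on the 2 Hz cycle. The endpoint URL, XML element names, and field set are illustrative assumptions, since the full protocol specification is to be published separately.

```python
# A minimal sketch of the robot-side status/command cycle, assuming a plain
# HTTP POST carrying an XML payload. The endpoint URL, element names, and
# field set are hypothetical; the actual protocol specification is to be
# published separately.
import time
import requests

SERVER_URL = "http://fmis.example.com/robot"  # hypothetical endpoint

def build_status_xml(robot_id: str, lat: float, lon: float,
                     heading: float, velocity: float,
                     pto_on: bool, hitch_pos: float) -> str:
    """Format the working status as an XML information set (names assumed)."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<status robot="{robot_id}">
  <position lat="{lat:.8f}" lon="{lon:.8f}"/>
  <attitude heading="{heading:.2f}"/>
  <velocity>{velocity:.2f}</velocity>
  <pto>{int(pto_on)}</pto>
  <hitch>{hitch_pos:.2f}</hitch>
</status>"""

while True:
    xml = build_status_xml("tractor-01", 43.07000000, 141.34000000,
                           90.00, 1.40, True, 0.30)
    # Upload the status; the server's reply carries the next command set.
    resp = requests.post(SERVER_URL, data=xml.encode("utf-8"),
                         headers={"Content-Type": "text/xml"}, timeout=0.4)
    # ... parse the command XML in resp.text and actuate the tractor ...
    time.sleep(0.5)  # approximates the 2 Hz communication frequency
```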

2.1. Field Registration with Simple User Interaction

Based on the accurate sensing information, the Grabcut-based field registration procedure and the path planning for headland turns are introduced in the following sections. The field image $I$ is formed by a set of pixels with precise georeferencing, $I = \{v_{xy}, P_{XY}\}$, where $(x, y)$ denotes the position of a pixel in the image domain $\Omega \subset \mathbb{R}^2$ and $(X, Y)$ is the position in the world coordinate system, represented by latitude and longitude. The elevation information is not used in this research. Let $L = \{l_{xy} \in \{0, 1\} : (x, y) \in \Omega\}$ be a set of labels for all the pixels in the image. The field registration algorithm modified from Grabcut treats field detection as a pixel labeling problem, assigning the pixel at position $(x, y)$ the label ‘0’ for the non-working area or ‘1’ for the working area [14]. Color pixels are modeled by two full-covariance Gaussian Mixture Models (GMMs), one for the working area and one for the non-working area. The GMM with $K$ components is defined as
$$D(v, l) = \sum_{i=1}^{K} \pi_i \, g_i(v; \mu_i, \Sigma_i), \quad \text{s.t.} \ \sum_{i=1}^{K} \pi_i = 1 \ \text{and} \ 0 \le \pi_i \le 1,$$
$$g(v; \mu, \Sigma) = \frac{1}{\sqrt{(2\pi)^3 |\Sigma|}} \, e^{-\frac{1}{2} (v - \mu)^T \Sigma^{-1} (v - \mu)}, \qquad (1)$$
• where $D(\cdot)$ is a full-covariance Gaussian mixture with $K$ components;
• $v$ is the pixel value in RGB color space;
• $l \in \{0, 1\}$ is the label of the pixel;
• $\pi_i$ is the mixture weighting coefficient;
• $g(\cdot)$ is the Gaussian probability distribution of the pixel in component $i$;
• $\mu_i, \Sigma_i$ are the mean and covariance of the Gaussian component.
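As an illustration of this color model, the following sketch initializes the two full-covariance GMMs from user-marked pixels, here using scikit-learn for brevity; the actual implementation follows the Grabcut procedure [19], and the stroke-mask encoding below is an assumption.

```python
# A sketch of initializing the two full-covariance color GMMs of Equation (1)
# from user strokes, using scikit-learn for brevity. The paper follows the
# GrabCut procedure [19]; the stroke-mask encoding here is an assumption.
import numpy as np
from sklearn.mixture import GaussianMixture

K = 5  # number of Gaussian components per model, as in GrabCut

def fit_color_models(image: np.ndarray, strokes: np.ndarray):
    """image: HxWx3 RGB array; strokes: HxW array with 1 = working-area
    stroke, 0 = non-working stroke, -1 = unlabeled pixel."""
    working_pixels = image[strokes == 1].reshape(-1, 3)
    non_working_pixels = image[strokes == 0].reshape(-1, 3)
    gmm_working = GaussianMixture(n_components=K,
                                  covariance_type="full").fit(working_pixels)
    gmm_non_working = GaussianMixture(n_components=K,
                                      covariance_type="full").fit(non_working_pixels)
    return gmm_working, gmm_non_working

# The regional term of Equation (3) below is the negative log-likelihood of
# each pixel under the GMM selected by its label:
#   -gmm_working.score_samples(pixels)      for l = 1
#   -gmm_non_working.score_samples(pixels)  for l = 0
```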
The task for working and non-working area classification is to minimize the following Gibbs energy function:
$$E(L) = E_r(L) + \gamma E_b(L), \qquad (2)$$
• where $L$ is the set of labels for all the pixels in the image;
• $\gamma$ is an experimentally determined tradeoff factor.
The energy function consists of a regional term $E_r$ and a boundary term $E_b$ balanced by a positive tradeoff factor $\gamma$. The regional term $E_r$, taking the GMM models into account, evaluates the fit of the segmentation to the pixel data as
$$E_r(L) = \sum_{(x, y) \in \Omega} -\log D(v_{xy}, l_{xy}), \qquad (3)$$
• where $(x, y) \in \Omega$ denotes the position of a pixel in the image domain.
The boundary term penalizes color discontinuity between neighboring pixels as follows:
$$E_b(L) = \sum_{\{(x, y), (i, j)\} \in C} \left[ l_{xy} \neq l_{ij} \right] e^{-\beta \| v_{xy} - v_{ij} \|^2}, \qquad (4)$$
• where $C$ is the set of pairs of 8-neighboring pixels;
• $\| v_{xy} - v_{ij} \|$ denotes the Euclidean distance between two pixel values.
To eliminate the effects of extremely high or low contrast, the parameter $\beta$ is chosen to be
$$\beta = \left( 2 \left\langle \| v_{xy} - v_{ij} \|^2 \right\rangle \right)^{-1}, \qquad (5)$$
where $\langle \cdot \rangle$ denotes the expectation over an image. The hard segmentation based on the defined Gibbs energy function is solved by the minimum cut algorithm, exactly as in Grabcut and Graph cut [19,31].
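A minimal sketch of this segmentation step, using the GrabCut implementation available in OpenCV, is given below; the stroke-mask encoding and the iteration count are illustrative assumptions, not the exact configuration of our FMIS.

```python
# A minimal sketch of the semi-supervised segmentation step using the GrabCut
# implementation in OpenCV. The stroke-mask encoding and iteration count are
# illustrative assumptions, not the exact configuration of our FMIS.
import cv2
import numpy as np

def register_field(orthoimage_bgr: np.ndarray, strokes: np.ndarray) -> np.ndarray:
    """orthoimage_bgr: HxWx3 uint8 image; strokes: HxW array with
    1 = working-area (white) stroke, 0 = non-working (black) stroke,
    -1 = unlabeled pixel. Returns an HxW label map (1 = working area)."""
    # Unlabeled pixels start as "probably background" and are re-estimated.
    mask = np.full(orthoimage_bgr.shape[:2], cv2.GC_PR_BGD, dtype=np.uint8)
    mask[strokes == 1] = cv2.GC_FGD  # definite working area
    mask[strokes == 0] = cv2.GC_BGD  # definite non-working area
    bgd_model = np.zeros((1, 65), np.float64)  # internal GMM parameters
    fgd_model = np.zeros((1, 65), np.float64)
    # Iteratively minimizes the Gibbs energy of Equation (2) via min-cut.
    cv2.grabCut(orthoimage_bgr, mask, None, bgd_model, fgd_model,
                5, cv2.GC_INIT_WITH_MASK)
    labels = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return labels.astype(np.uint8)
```

Repeating the call with refined strokes corresponds to the iterative user interaction described in Section 2.3.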

2.2. Headland Turning Methods with Self-Tuning Mechanism

Headland turning methods can be roughly classified into non-stop turning and turning with backward movement [32]. As a typical non-stop turning method, the U-turn is suitable for heavy-load vehicles and is the optimal option for vehicles with a wide implement or for skip-path working scenarios. Circle-back turning is suitable for small vehicles working on a limited headland; its humanoid turning maneuvers provide high after-turn accuracy without sacrificing speed. We implement the U-turn and Circle-back headland turning methods to minimize non-working path lengths at headlands. Two examples of headland turns using the U-turn and Circle-back turning are shown in Figure 4. Reference paths for headland turns are composed of elementary primitives, such as continuous curves and line segments. Applying a U-turn, the agricultural vehicle steers to full left, follows a short straight path, and then steers to full left again until entering the next path. The reference path using the default value of the minimum turning radius is depicted as a dashed line in Figure 4a. The robot vehicle deviates from the reference path due to uncertainties such as the lock-to-lock time delay, changing vehicle speed, and unknown tire-soil friction. Instead of designing a complex controller, our solution is to correct the reference path based on the measured minimum turning radius and to recalculate the forward movement distance $d$ following
$$d = w - 2R_f + \varepsilon, \quad \text{s.t.} \ w \ge 2R_f, \qquad (6)$$
• where $w$ is the distance between two adjacent paths;
• $R_f$ is the turning radius with forward movement;
• $\varepsilon$ is a relaxation factor to deal with the unknown disturbance.
In this research, the relaxation factor is set to the after-turn lateral deviation. The updated reference path of the U-turn is shown as a solid line in Figure 4a. When the path space is less than two times the minimum turning radius, Circle-back turning is used for generating the reference path. The central angle of forward turning, $\theta$, is determined by
$$\theta = \cos^{-1}\!\left( \frac{w - (R_f + R_b)}{R_f + R_b + \varepsilon} \right), \quad \text{s.t.} \ w < 2R_f, \qquad (7)$$
• where $R_b$ is the turning radius with backward movement.
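The following sketch illustrates how Equations (6) and (7) could be re-evaluated after each turn from the measured turning radii and the last after-turn deviation; the numeric values in the example call are assumptions for illustration.

```python
# A sketch of the self-tuning update of Equations (6) and (7): after each
# turn, the measured turning radii and the after-turn lateral deviation are
# fed back to regenerate the next reference turn. The numeric values in the
# example call are assumptions for illustration.
import math

def tune_headland_turn(w: float, r_f: float, r_b: float, epsilon: float) -> dict:
    """w: distance between adjacent paths [m]; r_f, r_b: measured forward/
    backward turning radii [m]; epsilon: relaxation factor, set to the
    after-turn lateral deviation."""
    if w >= 2.0 * r_f:
        # U-turn (Equation (6)): two full-lock arcs joined by a straight
        # segment of length d.
        return {"type": "U-turn", "d": w - 2.0 * r_f + epsilon}
    # Circle-back turn (Equation (7)): central angle of the forward arc.
    theta = math.acos((w - (r_f + r_b)) / (r_f + r_b + epsilon))
    return {"type": "Circle-back", "theta_rad": theta}

# A path spacing of 2.64 m (Section 3) with an assumed 2.5 m minimum turning
# radius forces a Circle-back turn.
print(tune_headland_turn(2.64, 2.5, 2.5, 0.03))
```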

2.3. Pilot Study

The Grabcut-based image segmentation method is implemented in the field registration process. User interaction and an iterative scheme are adopted to further improve segmentation accuracy. We select three field orthoimages as examples and illustrate the field registration process in Figure 5. The first image of each panel is the orthoimage processed from UAV sensing data. The second image of each panel is the orthoimage with user interactions. The field registration includes two-step user interaction. The first step is to initialize the two GMMs by roughly assigning some pixels to either the working or the non-working area; as shown in the second image of each panel in Figure 5, white strokes label the working area and black strokes label the non-working area. The second step is to repeat the segmentation process until a satisfactory result is obtained. The GMM parameters are updated in each iteration, during which new pixels are assigned to GMM components. The third image of each panel is the result of field segmentation after several iterations. The field edges in Figure 5a,b are uninterrupted and smooth. However, the segmentation result degrades in Figure 5c because of interference from the grass on the road. The resolution of the sensing data also affects the segmentation accuracy. In such cases, further user labeling is needed to refine the ambiguous segmentation. In most cases, however, the entire segmentation process can be conducted automatically using the initial rough labels. In these examples, we can register two or three fields at the same time; no additional labor force is necessary for irregular multi-field labeling.
Following the field registration, the operational planning algorithms automatically design the reference paths for the working area and feasible turns subject to the field shape and vehicle constraints. To express the length of the trajectory clearly, the origin of the coordinate system is shifted to (527180, 4769333) in the Universal Transverse Mercator (UTM) coordinate system. The working velocity of the robot is 1.4 m/s, which is the speed limit for the rotary tillage seeding machine. For safety and soil protection, the turning speed at the headland is set to 1.0 m/s. We analyze each turn by calculating the trajectory distance, time consumption, headland occupation, and after-turn lateral deviation.

3. Experimental Results

As the requirements of an FMIS for autonomous operations were identified, we developed PC software and a web application for users to access the server and manage the system through a web service. The graphical user interface (UI) of the web application is shown in Figure 6. The robot accesses the server through a preassigned ID and password. The icon’s color indicates the working status of the tractor: blue indicates the robot moving in the non-working area (outside the farmland), green indicates the robot working in the field, and the yellow icon in Figure 6 indicates the robot is manually paused or on standby. A demonstration of applying “Geomation” for vehicle control is available in Supplementary Materials: Video S1. The video shows the system controlling 5 robot tractors at the same time, with the communication frequency set to 2 Hz. The web application, “Weilai Farm”, is accessible at http://39.104.144.154:10000/robot.ai (accessed on 24 June 2021).
The data analysis for field registration is listed in Table 1. The pilot area index is consistent with panels a–c in Figure 5. The FMIS can cope with the orthoimage of a field as large as 5.06 ha without reliance on an external accelerator, such as a GPU. Mean Intersection over Union (mIoU) is employed as the accuracy metric [33]. The extracted field area is consistent with the manually labeled ground truth; the detected field boundaries can therefore reach centimeter-level accuracy, which is also bounded by the resolution of the field orthoimage. Compared to other methods, such as manually drawing boundaries or recording boundaries with GPS, the proposed field registration method is far less labor-intensive and more accurate. Its advantages become significant in the case of large farmland and irregularly shaped fields.
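For reference, a minimal sketch of the mIoU computation for this binary working/non-working labeling is given below.

```python
# A minimal sketch of the mIoU metric of Table 1: the IoU of the working and
# non-working classes against a manually labeled ground truth, averaged.
import numpy as np

def mean_iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """pred, truth: HxW arrays with 1 = working area, 0 = non-working area."""
    ious = []
    for cls in (0, 1):
        intersection = np.logical_and(pred == cls, truth == cls).sum()
        union = np.logical_or(pred == cls, truth == cls).sum()
        ious.append(intersection / union)  # assumes both classes occur
    return float(np.mean(ious))
```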
Figure 7a shows the implementation of the Circle-back turning method on a ten-path navigation map. The distance between the two adjacent paths is 2.64 m. The robot tractor starts at the upper headland. Turns with order {1,3,5,7,9} are at the lower headland, and the ones with order {2,4,6,8} are at the upper headland. Pink lines are the working trajectories of the robot tractor, and the blue lines are the turning paths. The second headland turn is enlarged in Figure 7b. The blue lines are generated reference paths for headland turns. Red circles show the trajectory of forward turning, black dots are the backward turning trajectory, and blue blocks illustrate the forward movement entering into the next path. It can be seen that the robot follows the reference path precisely at the headland.
The results of the nine headland turns are listed in Table 2. The after-turn deviations of the first two headland turns, which follow the initial reference paths, are as large as 7.1 cm and −6.6 cm, respectively. With the supervision of the self-tuning mechanism, the deviations converge to 1.0–3.6 cm in the following headland turns (from order 3 to order 8). The larger deviation after the 9th headland turn might be caused by a disturbance. The tolerance to disturbance can be adjusted by tuning the relaxation factor in Equation (7). In this way, the self-tuning mechanism makes the operational planning robust to disturbance and uncertainties.

4. Discussion

We adopt the semi-supervised learning based field registration method and the headland path planning method in the proposed FMIS to reduce production costs and to improve working quality. The proposed field registration method provides reliable field partitioning results by integrating human experience with a machine learning method. For robot operation planning, it is unnecessary, and indeed impossible, to design a universal path planning method for all working conditions. The proposed headland turning methods generate reference transfer paths while turning and make adjustments based on the measured turning radius. In this way, operational planning adapts to the uncertainties of a semi-constructed agricultural environment.
Compared with other primary studies, the main features of an FMIS, as well as the features configured in this study, are presented in Table 3. The listed feature categories come from multiple domains, including arable farming, greenhouse, livestock, orchard, multipurpose, and general FMISs that cannot be traced back to one domain [14]. The proposed FMIS has features such as data acquisition, operation plan generation, equipment management, data processing, and data management, which occur frequently over multiple domains. In addition, features related to the arable farming domain, for example, field monitoring and field management, are also configured in this study. In contrast, one of the most frequent features, financial management, is not present in this system. We furthermore present a combination of desktop software and a web application; therefore, the FMIS also offers Internet accessibility and multi-platform support. The configured features show that the proposed FMIS is specialized for fully autonomous farming. Moreover, the system is compatible with other services, users, and devices following the open-source access protocol.
One main difference between our study and the related work is that a semi-supervised field registration method is proposed to partition the complex field into simple geometric entities with simple user interaction. Using the semi-supervised learning method provides the following advantages for field registration:
  • It simplifies the field registration procedure. The field segmentation executes iteratively with few samples of working and non-working areas.
  • It is a purely color-based segmentation method without the limitation of field geometry and size.
The other main difference from related path planning work is the adoption of the self-tuning mechanism. The turning methods with the self-tuning mechanism are simple and flexible enough to adapt to changing environmental conditions. After-turn deviations of a robot tractor following the default reference turning paths are as large as 7.1 cm. Experiments reveal that the after-turn deviation converges to within 4 cm when the reference path is adjusted autonomously.

5. Conclusions

The sustainability of a robot farming system is an important criterion while conducting agricultural practices. We proposed a semi-supervised field registration method to extract field information from UAV sensing data. A UAV with an RGB camera enables farmers to access high-quality field information in an efficient and non-destructive way. It is also an economical method, since the RGB image can be captured by a common camera. The Grabcut-based field segmentation method can be implemented without reliance on a GPU, which other deep learning methods require [34]. In the path planning process, the proposed FMIS integrates two humanoid headland turning methods covering different path spacings. Fewer steering operations are also beneficial to soil protection at the headland as well as to the sustainability of the autonomous system.
Compared to existing FMISs, the system presented in this research provides more advanced features for robot farming, with simpler user interaction and better results. Users without a high level of education or sufficient farming skills can exploit the full potential of the proposed management system. We have evaluated the system by controlling 5 robot tractors at a 2 Hz communication frequency; its capacity should be evaluated in further tests. According to the above analysis, we can conclude that:
(1) An FMIS specialized for robot farming is developed. It allows clients to invoke web services and receive responses independently of programming language and platform.
(2) Ease of use and sustainability are proposed as criteria for developing an FMIS.
(3) Grabcut is adopted to process field orthoimages and to simplify field registration.
(4) Headland turning with a self-tuning mechanism adapts to environmental uncertainties.
Active use of fully autonomous or intelligent assisting systems to support farming work will increase in future FMISs. User interaction is still necessary during field registration. In addition, it is challenging for Grabcut to extract an arable field from a background with similar colors, such as the example in Figure 5c. Moreover, access points that mark where the field can be entered currently have to be assigned manually [11]. Deep learning methods, such as CNNs, have been applied to harness UAV-based sensing data to map vegetation patterns [30]. Their potential for identifying arable farmland as well as access points from high spatial resolution data should be further evaluated. In addition, the field’s Digital Elevation Model (DEM) derived from UAV photogrammetry shares the same spatial resolution as the orthoimage [35]. However, the inclusion of a DEM did not significantly improve the CNN models in plant species identification [18]. The merit of 3D information for field registration should be evaluated in future research. Future FMISs in smart agriculture should support automatic sensing, planning, controlling, and decision-making mechanisms [5]. In this regard, the uncertainty assessment and management capability of an FMIS can be improved by implementing such machine learning models.

Supplementary Materials

The following are available online at https://www.youtube.com/watch?v=vjGoxsCVxFU, Video S1: A Demonstration of FMIS.

Author Contributions

Conceptualization, H.W. and Z.M.; methodology, H.W. and Y.R.; software, H.W.; validation, H.W.; formal analysis, H.W. and Y.R.; investigation, H.W.; resources, H.W.; data curation, H.W.; writing—original draft preparation, H.W.; writing—review and editing, H.W. and Y.R.; visualization, H.W.; supervision, H.W.; project administration, H.W.; funding acquisition, H.W. and Z.M. All authors have read and agreed to the published version of the manuscript.

Funding

The research was partially funded by the National Key R&D Program of China, grant numbers 2019YFB1312305 and 2017YFD0700604.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy restrictions.

Acknowledgments

The authors thank Osamu Nishiguchi for his help in “Geomation” development.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lytos, A.; Lagkas, T.; Sarigiannidis, P.; Zervakis, M.; Livanos, G. Towards smart farming: Systems, frameworks and exploitation of multiple sources. Comput. Netw. 2020, 172, 107147.
  2. Othman, M.F.; Shazali, K. Wireless Sensor Network Applications: A Study in Environment Monitoring System. Procedia Eng. 2012, 41, 1204–1210.
  3. Kaloxylos, A.; Eigenmann, R.; Teye, F.; Politopoulou, Z.; Wolfert, S.; Shrank, C.; Dillinger, M.; Lampropoulou, I.; Antoniou, E.; Pesonen, L.; et al. Farm management systems and the Future Internet era. Comput. Electron. Agric. 2012, 89, 130–144.
  4. Kaloxylos, A.; Groumas, A.; Sarris, V.; Katsikas, L.; Magdalinos, P.; Antoniou, E.; Politopoulou, Z.; Wolfert, S.; Brewster, C.; Eigenmann, R.; et al. A cloud-based Farm Management System: Architecture and implementation. Comput. Electron. Agric. 2014, 100, 168–179.
  5. Saiz-Rubio, V.; Rovira-Más, F. From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management. Agronomy 2020, 10, 207.
  6. Fountas, S.; Carli, G.; Sørensen, C.; Tsiropoulos, Z.; Cavalaris, C.; Vatsanidou, A.; Liakos, B.; Canavari, M.; Wiebensohn, J.; Tisserye, B. Farm management information systems: Current situation and future perspectives. Comput. Electron. Agric. 2015, 115, 40–50.
  7. Da Fonseca, E.P.R.; Caldeira, E.; Filho, H.S.R.; e Oliveira, L.B.; Pereira, A.C.M.; Vilela, P.S. Agro 4.0: A data science-based information system for sustainable agroecosystem management. Simul. Model. Pract. Theory 2020, 102, 102068.
  8. Rupnik, R.; Kukar, M.; Vračar, P.; Košir, D.; Pevec, D.; Bosnić, Z. AgroDSS: A decision support system for agriculture and farming. Comput. Electron. Agric. 2019, 161, 260–271.
  9. Singh, S.; Chana, I.; Buyya, R. Agri-Info: Cloud Based Autonomic System for Delivering Agriculture as a Service. Internet Things 2020, 9, 100131.
  10. Oksanen, T.; Visala, A. Coverage path planning algorithms for agricultural field machines. J. Field Robot. 2009, 26, 651–668.
  11. Jin, J.; Tang, L. Coverage path planning on three-dimensional terrain for arable farming. J. Field Robot. 2011, 28, 424–440.
  12. Kroulik, M.; Hula, J.; Brant, V. Field trajectories proposals as a tool for increasing work efficiency and sustainable land management. Agron. Res. 2018, 16, 1752–1761.
  13. Řezník, T.; Herman, L.; Klocová, M.; Leitner, F.; Pavelka, T.; Šimon, T.; Trojanová, K.; Štampach, R.; Moshou, D.; Mouazen, A.; et al. Towards the Development and Verification of a 3D-Based Advanced Optimized Farm Machinery Trajectory Algorithm. Sensors 2021, 21, 2980.
  14. Tummers, J.; Kassahun, A.; Tekinerdogan, B. Obstacles and features of Farm Management Information Systems: A systematic literature review. Comput. Electron. Agric. 2019, 157, 189–204.
  15. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349.
  16. Yang, W.; Feng, H.; Zhang, X.; Zhang, J.; Doonan, J.H.; Batchelor, W.D.; Xiong, L.; Yan, J. Crop Phenomics and High-Throughput Phenotyping: Past Decades, Current Challenges, and Future Perspectives. Mol. Plant 2020, 13, 187–214.
  17. Kim, W.-S.; Lee, D.-H.; Kim, Y.-J. Machine vision-based automatic disease symptom detection of onion downy mildew. Comput. Electron. Agric. 2020, 168, 105099.
  18. Kattenborn, T.; Eichel, J.; Wiser, S.; Burrows, L.; Fassnacht, F.E.; Schmidtlein, S. Convolutional Neural Networks accurately predict cover fractions of plant species and communities in Unmanned Aerial Vehicle imagery. Remote Sens. Ecol. Conserv. 2020, 6, 472–486.
  19. Rother, C.; Kolmogorov, V.; Blake, A. “GrabCut”—Interactive foreground extraction using iterated graph cuts. In Proceedings of the ACM Transactions on Graphics, SIGGRAPH 2004, Los Angeles, CA, USA, 8–12 August 2004; pp. 309–314.
  20. Xu, C.; Zhang, D.; Zhang, Z.; Feng, Z. BgCut: Automatic Ship Detection from UAV Images. Sci. World J. 2014, 2014, 1–11.
  21. Zhou, H.; Kong, H.; Wei, L.; Creighton, D.; Nahavandi, S. Efficient Road Detection and Tracking for Unmanned Aerial Vehicle. IEEE Trans. Intell. Transp. Syst. 2015, 16, 297–309.
  22. Contente, O.; Lau, N.; Morgado, F.; Morais, R. A Path Planning Application for a Mountain Vineyard Autonomous Robot. In Advances in Intelligent Systems and Computing, Proceedings of the Robot 2015: Second Iberian Robotics Conference, Lisbon, Portugal, 19–21 November 2015; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2016; Volume 417, pp. 347–358.
  23. Hameed, I.; la Cour-Harbo, A.; Osen, O.L. Side-to-side 3D coverage path planning approach for agricultural robots to minimize skip/overlap areas between swaths. Robot. Auton. Syst. 2016, 76, 36–45.
  24. Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111.
  25. Wang, B.; Ren, J.; Cai, M. Car-Like Mobile Robot Path Planning in Rough Terrain with Danger Sources. In Proceedings of the 2019 Chinese Control Conference (CCC), Guangzhou, China, 27–30 July 2019; pp. 4467–4472.
  26. Jin, X.; Yin, G.; Chen, N. Advanced Estimation Techniques for Vehicle System Dynamic State: A Survey. Sensors 2019, 19, 4289.
  27. Kraus, T.; Ferreau, H.; Kayacan, E.; Ramon, H.; De Baerdemaeker, J.; Diehl, M.; Saeys, W. Moving horizon estimation and nonlinear model predictive control for autonomous agricultural vehicles. Comput. Electron. Agric. 2013, 98, 25–33.
  28. Tu, X.; Gai, J.; Tang, L. Robust navigation control of a 4WD/4WS agricultural robotic vehicle. Comput. Electron. Agric. 2019, 164, 104892.
  29. Meinen, B.U.; Robinson, D.T. Mapping erosion and deposition in an agricultural landscape: Optimization of UAV image acquisition schemes for SfM-MVS. Remote Sens. Environ. 2020, 239, 111666.
  30. Kattenborn, T.; Eichel, J.; Fassnacht, F. Convolutional Neural Networks enable efficient, accurate and fine-grained segmentation of plant species and communities from high-resolution UAV imagery. Sci. Rep. 2019, 9, 17656.
  31. Boykov, Y.; Veksler, O.; Zabih, R. Fast approximate energy minimization via graph cuts. IEEE Trans. Pattern Anal. Mach. Intell. 2001, 23, 1222–1239.
  32. Wang, H.; Noguchi, N. Autonomous maneuvers of a robotic tractor for farming. In Proceedings of the 2016 IEEE/SICE International Symposium on System Integration (SII), Sapporo, Japan, 13–15 December 2017; pp. 592–597.
  33. Lin, Z.; Zhang, Z.; Chen, L.-Z.; Cheng, M.-M.; Lu, S.-P. Interactive Image Segmentation with First Click Attention. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020; pp. 13336–13345.
  34. Yuheng, S.; Hao, Y. Image segmentation algorithms overview. arXiv 2017, arXiv:1707.02051.
  35. Langhammer, J.; Vacková, T. Detection and Mapping of the Geomorphic Effects of Flooding Using UAV Photogrammetry. Pure Appl. Geophys. 2018, 175, 3223–3245.
Figure 1. The workflow of an FMIS for robot farming.
Figure 2. UAV-imagery system used in this study.
Figure 3. Parameters for remote monitor and control.
Figure 4. Initial reference paths and self-tuned reference paths of headland turn. (a) Depicts the U-turn and (b) depicts the Circle-back turning. Dashed lines are reference paths generated by default parameters, and solid lines are updated reference paths by self-tuning mechanism.
Figure 5. Grabcut-based field registration process.
Figure 6. The user interface of the web application, termed “Weilai Farm”. The color of the robot’s icon indicates the working status of the tractor.
Figure 7. The navigation map with in-field paths and headland turns. The robot tractor starts at the upper headland following a blue curved line at (a). Turns with order {1,3,5,7,9} are at the lower headland, and the ones with order {2,4,6,8} are at the upper headland. The second headland turn is enlarged at (b). The blue lines are generated reference path for headland turns. Red circles show the trajectory of forward turning, black dots are the backward turning trajectory, and blue blocks illustrate the forward movement entering into the next path.
Table 1. Sensing data and field registration accuracy.

Pilot Area Index   Resolution [cm/pixel]   Average Field Size [m²]   Coverage [ha]   mIoU
01                 3.4                     24,820                    5.06            0.9761
02                 3.3                     13,024                    2.74            0.9558
03                 5.3                     17,600                    3.63            0.9276
Table 2. Analysis of headland turns with the Circle-back turning method.

Order            1      2      3      4      5      6      7      8      9
Trajectory [m]   20.2   20.3   20.0   20.6   20.7   20.2   20.7   20.6   20.6
Time [s]         24.8   24.8   24.6   25.0   25.2   24.6   25.4   24.8   25.2
Headland [m]     6.5    6.6    6.4    6.7    6.7    6.6    6.7    6.7    6.6
Deviation [cm]   7.1    −6.6   2.3    2.4    1.0    −3.4   3.6    −2.8   6.4
Table 3. Main features of an FMIS listed in primary studies [5,14]. ✓ = configured in the proposed FMIS (per Section 4); blank = not configured.

      Feature Categories            Configured
F1    Financial management
F2    Reporting                     Under development
F3    Data acquisition              ✓
F4    Operation plan generation     ✓
F5    Crop management
F6    Resource management
F7    Equipment management          ✓
F8    Field monitoring              ✓
F9    Data processing               ✓
F10   Fertilization management
F11   Human resource management
F12   Weather service
F13   Data management               ✓
F14   Field management              ✓
F15   Accounting
F16   Inventory management
F17   Internet accessibility        ✓
F18   Compatibility                 ✓
F19   Multi-platform support        ✓
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
