Article

A Range-Based Algorithm for Autonomous Navigation of an Aerial Drone to Approach and Follow a Herd of Cattle

by Manaram Gnanasekera, Jay Katupitiya, Andrey V. Savkin * and A.H.T. Eranga De Silva
School of Mechanical and Manufacturing Engineering, The University of New South Wales, Sydney, NSW 2052, Australia
* Author to whom correspondence should be addressed.
Submission received: 17 September 2021 / Revised: 19 October 2021 / Accepted: 25 October 2021 / Published: 29 October 2021
(This article belongs to the Section Sensor Networks)

Abstract: This paper proposes an algorithm that allows an autonomous aerial drone to approach and follow a steady or moving herd of cattle using only range measurements. The algorithm is insensitive to the complexity of the herd’s movement and to measurement noise. Once it has arrived at the herd of cattle, the aerial drone can follow it to a desired destination. The primary motivation for the development of this algorithm is the use of simple, inexpensive and robust sensing, namely range sensors. The algorithm does not depend on the accuracy of the range measurements, but rather on the rate of change of the range measurements. The proposed method is based on sliding mode control, which provides robustness. A mathematical analysis, simulations and experimental results with a real aerial drone are presented to demonstrate the effectiveness of the proposed method.

1. Introduction

Animal husbandry has been a social practice ever since the Neolithic Revolution. However, the techniques and technologies used in animal farming have evolved with time. As a result, modern-day farmers are progressively taking up technology in many ways to succeed in their farming endeavours. Automated feeding systems [1,2,3], smart farming techniques [4,5,6], robots used in farming [7,8,9], and intelligent farming technologies [10,11,12] can be highlighted as examples out of the vast range available in the literature.
Keeping track of farm animals is an essential task for a farmer, especially for the safety of the animals and of others. The sheepdogs and shepherds of olden days have been replaced by technology in today’s farming. An extensive amount of literature can be found on animal tracking, such as the GPS-based work of [13,14,15,16]. Specifically, most modern-day cattle tracking is done by attaching a GPS collar to the neck of the animal, as presented in [17,18]. Apart from GPS collars, other studies have used leg sensors [19] and ear sensors [18] for behaviour and location monitoring. Some researchers used different approaches, such as Wi-Fi [19], solar-powered sensors [20], and VHF collars [21] for livestock localization. Beyond technological monitoring, the study conducted in [22] presents a mobile robot used for flock control. The method proposed in [23] uses a distributed game-theoretic approach [24] to guarantee the safety of the herd from potential predation.
The study carried out in [25] presents a method for bird herding using an aerial drone, also known as an Unmanned Aerial Vehicle (UAV), based on a waypoint algorithm. Similarly, other studies present how drone-based bird herding could be applied around airports [26,27] using an n-wavefront algorithm.
We draw our attention to manoeuvring an aerial drone to an animal herd some distance away and to moving along with the herd while maintaining a predefined altitude. As in the military [28,29], surveying [30,31], cinematography [32,33] and transportation [34,35] sectors, agriculture [36,37] is an area where UAVs are used at large scale. Precision, easy deployability, security, cost and flexibility are the key features behind the increased use of UAVs [38].
This problem is related to navigating an autonomous vehicle towards a static or dynamic target [39]. An extensive amount of literature can be found on ground vehicles moving towards targets based on various control laws. The authors of [40] used a biologically inspired controller for vehicle guidance. The work presented in [41] proposed a control law which only uses angular information as input; as a result, the algorithm is robust to sensing uncertainty. Guidance laws have also been developed to navigate wheeled robots towards static or moving targets based on range information [42,43]. It should be noted that navigation laws based on sliding mode control are the backbone of most of these approaches, as can be found in [42,43,44,45], due to the various advantages described in [46]. In the case of aerial drones following ground targets, comparatively little work has been carried out. The work conducted in [47] uses sequential decision processing to track a mobile target. A Partially Observable Markov Decision Process (POMDP)-based method was used in [48] to track ground targets. Some publications developed complex computer vision [49,50] and deep learning [51] algorithms for object tracking. Control strategies based on smart, simple and practically implementable tracking methods are not widely available in the literature.
There are numerous methods presented in the literature to herd animals. Firstly, the GPS collar methods stated above are not economically viable, as GPS receivers with reasonable positioning accuracy are expensive. Moreover, GPS readings are generally inaccurate in remote areas unless some means of providing GPS corrections is made available. Secondly, solutions which use LiDAR and vision cameras suffer from limited range, limited field of view, poor visibility and occlusion. These sensors also have a very poor signal-to-noise ratio over long distances. Furthermore, they need to be pointed in the right direction for their readings to be useful; since directional information is not available at the start, such sensors cannot be used.
There are various methods used to navigate UAVs, including vision-based systems [52], GNSS-based systems [53] and on-board short-range sensors [54]. However, it is impossible to use vision and short-range sensing for long-range herding applications. Even though satellite-based positioning systems such as GNSS can be used in long-range applications, their positional inaccuracy can at times be hazardous for UAV applications.
The proposed method uses simple range sensing, and the rate of change of range measurements is used to fly the drone towards the herd. The proposed algorithm is insensitive to the accuracy of range measurements. The algorithm only requires the rate of change of range. In practice, several transmitter/receiver modules should be used. One of the range sensing modules is placed on the aerial drone, and the other modules are attached to some of the cattle in the herd. The average range sensed by the drone can be taken as representative of the distance to the herd. In comparison to other sensing methods, range sensing is economical and robust.
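The averaging step described above can be sketched as follows (a minimal illustration; the function name and plain averaging are our own choices, not from the paper):

```python
def herd_range(ranges):
    """Representative drone-to-herd distance: the mean of the UWB range
    readings (in metres), one per tagged animal in the herd."""
    if not ranges:
        raise ValueError("no range readings available")
    return sum(ranges) / len(ranges)
```

For example, with readings from three tagged animals, `herd_range([10.0, 12.0, 14.0])` gives 12.0 m.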
Please note that range data do not provide directional information; hence, range alone does not indicate the location of the herd. Instead of a unique destination, a range sensor provides a large subset of possible destinations. Hence, a search-type algorithm must be developed, in contrast to a deterministic algorithm that pursues a known destination. Moreover, given that range data are less accurate, an algorithm that depends on the rate of change of range data is more suitable than an algorithm that depends on accurate range sensing. Sliding mode controllers are ideal candidates for operation in the presence of uncertainties. Hence, the algorithm developed in this paper is based on sliding mode control methodology.
In this paper, we formulate the problem of drone navigation for following a ground-based herd, with the aerial drone receiving range-only measurements. The drone navigation law is based on sliding mode control. The distance from the drone to the herd is taken as the range measurement; no bearing information is needed. The results shown in this work confirm the validity of the algorithm and its suitability for practical applications.
The following contributions are made to the existing knowledge base.
  • The paper presents a control algorithm for a UAV to approach and follow a ground herd. To the best of the authors’ knowledge, the proposed work remains the only approach to address such an application.
  • The control algorithm is based on range data. More importantly, the algorithm is robust against noisy data. The performance of the proposed method and a state-of-the-art method was tested with noisy data in simulation, and our method showed better performance.
  • The UAV has the ability to intercept the herd’s location (finding the herd and moving along with it) and navigate along with it irrespective of the complexity of the herd’s path. We conducted multiple simulation and experimental tests at various complexity levels to confirm this point.
The rest of the paper is organized as follows. Section 2 provides the problem statement and solution development, the main results and their mathematical analysis. Section 3 presents the discretized algorithm and the dynamic model. Computer simulation and experimental results are presented in Section 4 and Section 5, respectively. Section 6 concludes the paper.

2. Problem Statement and Solution Development

The problem to be solved is formulated as follows. An aerial drone is initially at a certain location (the farm station) of the farm, far removed from the grazing area of a herd of cattle. The grazing area is a substantially large area of land, and the cattle as a herd may roam freely anywhere within it. The resting place or shed to which the cattle are herded for the night is located within this grazing area. Please note that this paper does not address the herding technologies that may be applied. The emphasis of this paper is on locating the herd and then tracking it as it moves toward the grazing land, as well as tracking it back to the shed. Once the drone has intercepted the herd, the GPS coordinates of the drone may be used to log the movement of the herd and to verify the arrival of the herd at the shed.
The drone navigates at a constant height h > 0 at all times. The location of the drone is given as (x(t), y(t)). The drone navigates in a heading direction ϕ(t), measured anti-clockwise from the x axis, where −π < ϕ(t) ≤ π. Let the drone travel at a constant speed u > 0 with an angular velocity ϕ̇(t). Then, the translational motion can be introduced as (1).
ẋ(t) = u cos(ϕ(t)); ẏ(t) = u sin(ϕ(t)). (1)
The angular velocity is always bounded as −ϕ̇_max ≤ ϕ̇(t) ≤ ϕ̇_max, where ϕ̇_max > 0.
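As a sketch, the kinematic model (1) together with the turn-rate bound can be Euler-integrated as follows (the function name, the step size and the wrap-around of ϕ are our choices, not from the paper):

```python
import math

def step_kinematics(x, y, phi, u, phi_dot, phi_dot_max, dt):
    """One Euler step of model (1): x_dot = u*cos(phi), y_dot = u*sin(phi),
    with the commanded turn rate clipped to [-phi_dot_max, phi_dot_max]."""
    phi_dot = max(-phi_dot_max, min(phi_dot_max, phi_dot))
    x += u * math.cos(phi) * dt
    y += u * math.sin(phi) * dt
    phi += phi_dot * dt
    phi = math.atan2(math.sin(phi), math.cos(phi))  # wrap phi back into (-pi, pi]
    return x, y, phi
```

With u = 5 ms⁻¹, zero heading and zero turn rate, one 0.1 s step moves the drone 0.5 m along the x axis.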
The herd may be stationary or moving on the ground, and (x_H(t), y_H(t)) denotes the coordinates of the herd. Let d(t) be the sole range information the drone has about the herd at any time t. Since the height h is perpendicular to the ground, we introduce the planar range l(t) as in (2). Let e(t) be a minor error component that occurs due to signal noise; for a given constant E, |e(t)| < E.
l(t) = √(d²(t) − h²) + e(t). (2)
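Recovering the planar range from the slant range in (2) is a single Pythagorean step; a minimal sketch (ignoring the noise term e(t), with a helper name of our own):

```python
import math

def planar_range(d, h):
    """Horizontal drone-to-herd distance l recovered from the slant range d
    and the known constant altitude h, as in (2) (noise term omitted)."""
    if d < h:
        raise ValueError("slant range cannot be shorter than the altitude")
    return math.sqrt(d * d - h * h)
```

For instance, a slant range of 5 m at an altitude of 3 m corresponds to a planar range of 4 m.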
Definition 1.
The drone arrives at the herd’s location when l(T) ≤ L, where T > 0 and L ∈ ℝ.
Let g(t) be a given function; its left limit g⁻(t) at t is introduced as in (3).
g⁻(t) := lim_{ξ→0, ξ>0} g(t − ξ). (3)
We introduce the sliding mode control law as (4).
ϕ̇(0) = ϕ̇_max, ϕ̇(t) = −ϕ̇⁻(t) sgn(l̈⁻(t)). (4)
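A hedged sketch of how law (4) could be evaluated in discrete time, estimating l̈ with a finite difference over the three most recent planar-range samples (the helper name and the finite-difference estimator are our assumptions, not the paper's implementation):

```python
def sliding_mode_turn_rate(l_hist, phi_dot_prev, phi_dot_max, dt):
    """Discrete-time evaluation in the spirit of law (4): keep the previous
    turn rate while the range acceleration is negative, flip it otherwise.
    l_hist holds the three most recent planar-range samples, oldest first."""
    l2, l1, l0 = l_hist[-3], l_hist[-2], l_hist[-1]
    l_ddot = (l0 - 2.0 * l1 + l2) / (dt * dt)  # finite-difference l-double-dot
    sign = 1.0 if l_ddot >= 0.0 else -1.0
    phi_dot = -phi_dot_prev * sign             # -phi_dot_prev * sgn(l_ddot)
    return max(-phi_dot_max, min(phi_dot_max, phi_dot))
```

While the range is decelerating (l̈ < 0) the drone keeps its current turn rate; when l̈ becomes positive the turn rate flips sign, producing the switching behaviour of a sliding mode controller.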
We assume that initially the drone is not too close to the herd’s location, as stated in Assumption 1.
Assumption 1.
The inequality l(0) > 3u/ϕ̇_max holds.
We introduce Lemma 1.
Lemma 1.
Suppose the angle between the drone’s initial heading and the direction towards the herd is less than π/2. Then, under Assumption 1, the control law (4) guides the drone towards the location of a stationary herd, i.e., a herd with (x_H(t), y_H(t)) ≡ (X_H, Y_H).
Proof of Lemma 1.
Let λ ( t ) be the angle between the drone’s heading ϕ ( t ) and the direction from the drone towards the herd. Then the following equations hold:
l̇(t) = −u cos(λ(t)); (5)
λ̇(t) = −ϕ̇(t) + u sin(λ(t))/l(t), (6)
see, e.g., [42]. Furthermore, there exist two circles of radius
R := u/ϕ̇_max (7)
that are tangent to the drone’s initial heading ϕ(0). One of these circles has the property that it intersects the straight line segment connecting the drone’s initial position and the herd. It follows from (4) that the drone first moves along this circle until its heading ϕ(t) coincides with the direction towards the steady herd. Indeed, it follows from Assumption 1 that the herd is outside of this circle and that the distance between the herd and any point of this circle is greater than u/ϕ̇_max. This and (6) imply that λ(t) is decreasing while the drone moves along this circle; hence, it follows from (5) that l̇(t) is decreasing, and therefore l̈(t) < 0. Furthermore, it is obvious that if the heading ϕ(t) at some t coincides with the direction towards the steady herd, then the sliding mode control law (4) will keep the heading along the line connecting the drone with the steady herd, which corresponds to the sliding surface l̈(t) = 0. Hence, the drone will move towards the herd along this straight line and intercept it. This completes the proof of Lemma 1.   □
To consider the problem of moving herds, we introduce Assumptions 2 and 3 together with the herd speed v_H(t) := √(ẋ_H(t)² + ẏ_H(t)²).
Assumption 2.
Let v_max, k ∈ ℝ be such that v_H(t) ≤ v_max < u and |v̇_H(t)| ≤ k.
Assumption 3.
Let the condition (v_H(t)/u)² + 2k/(u ϕ̇_max) < 1 hold.
Assumptions 2 and 3 guarantee that the speed and acceleration of the herd are not too large compared to the drone’s speed and angular velocity. It is obvious that if the herd is moving too fast, interception is impossible. In practice, the drone’s speed and angular velocity are known a priori from the drone’s specifications. The speed and acceleration of the herd can be obtained from preliminary studies of herds of cattle using, for example, aerial video surveillance.
Lemma 2.
If all the above assumptions hold, the control law (4) makes the drone navigate towards the moving herd and intercept it, provided that D > 2(u + v_max)/ϕ̇_max.
Proof of Lemma 2.
Let λ ( t ) be the angle between the drone’s heading ϕ ( t ) and the direction from the drone towards the herd, β ( t ) be the angle between the herd’s heading and the direction from the herd towards the drone. Then the following equations hold:
l̇(t) = −u cos(λ(t)) + v_H(t) cos(β(t)); (8)
λ̇(t) = −ϕ̇(t) + [u sin(λ(t)) − v_H(t) sin(β(t))]/l(t), (9)
see, e.g., [42]. Furthermore, as in the proof of Lemma 1, it follows from (4) that the drone first moves along the circle of radius (7) that is tangent to the drone’s initial heading ϕ(0) and intersects the straight line segment connecting the drone’s initial position and the herd’s initial position (x_H(0), y_H(0)). Indeed, it follows from Assumption 1 that the herd is outside of this circle at time 0. Moreover, it follows from (4) and (9) that when the drone moves along this circle and l(t) > 2(u + v_max)/ϕ̇_max, then λ̇(t) < −ϕ̇_max/2. This and (9) imply that λ(t) is decreasing while the drone moves along this circle. Furthermore, it follows from (8) and Assumption 2 that l̈(t) < 0 if λ̇(t) < −ϕ̇_max/2. This, (8) and Assumption 3 imply that l̇(t) < −ξ < 0 for some constant ξ > 0 whenever l(t) > 2(u + v_max)/ϕ̇_max. Therefore, since D satisfies D > 2(u + v_max)/ϕ̇_max, the proposed navigation law (4) is D-intercepting. This completes the proof of Lemma 2.   □
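A quick numerical check of the threshold used in the proof, under our reading of the extracted inequality as D > 2(u + v_max)/ϕ̇_max: for any planar range beyond that threshold, the pursuit term (u + v_max)/l stays below ϕ̇_max/2 (parameter values are illustrative):

```python
u, v_max, phi_dot_max = 5.0, 2.0, 1.0    # drone speed, herd speed bound, turn-rate bound
D = 2.0 * (u + v_max) / phi_dot_max      # candidate interception threshold (14 m here)
for l in (1.01 * D, 2.0 * D, 10.0 * D):  # any planar range beyond the threshold
    # the pursuit term appearing in (9) is bounded by (u + v_max)/l
    assert (u + v_max) / l < phi_dot_max / 2.0
```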
In the forthcoming sections, we test the performance of the control law (4).

3. The Proposed Algorithm and the Dynamic Model

We introduce the following function where l ˙ ( t ) denotes the derivative of l ( t ) .
f(t) = (l̇(t) + u)². (10)
We introduce the following discretized version of the control law (4) in Algorithm 1. The sample time is T, δ denotes the sample index, and ϕ₀ ∈ ℝ is a constant heading-angle increment.
Algorithm 1 Calculating ϕ(δT)
Input: f(δT), f((δ−1)T)
Output: ϕ(δT)
1: if f(δT) − f((δ−1)T) < 0 then
2:     ϕ(δT) ← ϕ((δ−1)T) + ϕ₀
3: else
4:     ϕ(δT) ← ϕ((δ−1)T) − ϕ₀
5: end if
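Algorithm 1 can be sketched in a few lines together with the error function (10); the sign convention inside f follows our reading of (10), and all names are our own:

```python
def f_error(l_dot, u):
    """Error function (10): zero when the drone closes on the herd at full
    speed, i.e. when the range rate l_dot equals -u."""
    return (l_dot + u) ** 2

def algorithm1_step(phi_prev, f_now, f_prev, phi_0):
    """One step of Algorithm 1: keep turning one way while f decreases,
    reverse the heading increment once f starts increasing."""
    if f_now - f_prev < 0.0:
        return phi_prev + phi_0
    return phi_prev - phi_0
```

For example, while f is decreasing the heading grows by ϕ₀ per sample; as soon as f increases, the increment reverses to −ϕ₀.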
We introduce the PD controller in (11) to find the torque τ needed to control the heading angle, where K_d, K_p ∈ ℝ are the controller gains. ω(t) is the angular velocity, I is the inertia of the UAV and β is the Coulomb resistance.
τ = K_d (ϕ̇(δT) − ϕ̇((δ−1)T)) + K_p (ϕ(δT) − ϕ((δ−1)T)). (11)
The motion model of the UAV can be introduced as (12).
ẋ(t) = u cos(ϕ(t)); ẏ(t) = u sin(ϕ(t)); ϕ̇(t) = ω(t); ω̇(t) = −(β/I) ω(t) + τ/I. (12)
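A minimal Euler integration of the motion model (12); for a constant torque τ, the yaw-rate equation settles at ω = τ/β, which provides a simple sanity check (parameter values and step size are illustrative, not from the paper):

```python
import math

def simulate(u, beta, I, tau, dt=0.001, steps=20000):
    """Euler-integrate the motion model (12) under a constant torque tau,
    starting from rest at the origin with zero heading."""
    x = y = phi = omega = 0.0
    for _ in range(steps):
        x += u * math.cos(phi) * dt
        y += u * math.sin(phi) * dt
        phi += omega * dt
        omega += (-(beta / I) * omega + tau / I) * dt
    return x, y, phi, omega
```

With β = 2, I = 1 and τ = 1, the yaw rate converges to τ/β = 0.5 rad/s after a few time constants, so the drone settles into a steady turn.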

4. Simulation Results

We conducted MATLAB-based simulations to test the performance of the proposed algorithm with both stationary and moving herds; the results of both cases are presented in this section. In all navigation (herd-tracking) plots, the x and y coordinates are expressed in meters. It is important to note that u = 5 ms⁻¹ in all simulations. In Section 4.1, the control algorithm is tested with static and dynamic herds. Five tests were conducted with dynamic herds (Figures 2–6), with the complexity of the travel paths varied to test the performance. Thereafter, in Section 4.2, two tests were conducted with noisy data. A state-of-the-art method [55] was made to perform the same two tests, and the results of the two methods (the proposed algorithm and the state-of-the-art method [55]) are compared.

4.1. Target Location Interception

In Figure 1, the UAV’s task is to travel towards a herd stationed at the coordinates (−250, −250). Since the direction to the herd differs from the UAV’s initial heading, an initial heading change of around 180° can be observed. This heading change was made by the control law (4) of the sliding mode controller. Whenever the control law applies ϕ̇(t) < 0, the controller is sliding on the sliding manifold, and when off the sliding surface, ϕ̇(t) > 0 is applied. In this simulation result, the transition from the sliding surface to off the sliding surface and then back onto the sliding surface occurs around (−230, −198). This makes the controller change the heading of the drone to reach the herd’s location. The UAV stopped at the herd’s location because this location corresponds to the origin of the sliding surface of the error-driven sliding mode controller.
In Figure 2, the UAV reaches the herd’s location and tracks the herd as it moves towards the final destination. In this particular case, the control law (4) maintains the control effort until the origin of the sliding surface is reached at coordinates (102, 75), which is the herd’s location. From this point onwards, the UAV moves along with the herd until the final destination. Please note that some circular movements of the UAV can be observed along the path taken by the herd (one location is zoomed). The main reason for this behaviour is that the UAV maintains a constant speed which is slightly greater than the speed at which the herd is moving. This causes the error states to drift off the sliding mode; however, the sliding mode controller regains control of the drone and brings it back to the herd’s location.
Similar to Figure 2, the UAV in Figure 3 maintained the control effort by making three heading changes before reaching the sliding surface origin around (149, 100), and then manoeuvred hand in hand with the herd until the herd reached its final destination. The same UAV circling due to the speed difference can be identified in this scenario as well. The only difference between Figure 2 and Figure 3 is the herd’s travel path: in Figure 2 the path is made of three linear segments, while in Figure 3 the path is complex and consists of some curved segments. In all three figures, the UAV found the herd’s initial location and did not miss the herd at any point while travelling towards the destination. These simulation results show the validity of the proposed algorithm.
To further investigate robustness, the algorithm was applied to a herd which moves along complex paths (Figure 4, Figure 5 and Figure 6). Equation (4) made the UAV in Figure 4 perform six heading changes before approaching the herd; according to Figure 4, the UAV reached the target around the coordinates (0, 160). Similar to all other figures, the UAV travelled along with the target from here onwards by remaining at the origin of the error space. In Figure 5, the UAV made five heading changes to reach the target around (10, 160), which corresponds to the origin of the error space. The path towards the destination in this scenario is complex; however, the UAV successfully tracked the herd to the final destination despite the path complexity. Similar to Figure 5, Figure 6 shows the results of another complex simulation. The control algorithm made three heading changes before reaching the origin of the error space at (125, 75). Thereafter, similar to Figure 5, the UAV successfully tracked the herd towards the destination.

4.2. Performance under Noise

To test the performance of the algorithm under noise, we introduced white Gaussian noise with an SNR of 0.001. The noise signal is presented in Figure 7. We tested the performance of the UAV with a noisy range signal and made the UAV navigate towards the target, which travelled in a piecewise linear trajectory. For comparison, we also made the navigation algorithm introduced in [55] follow the same target under noise. The obtained results are shown in Figure 8a. According to Figure 8a, the performance of the proposed method was not disturbed by the noise. However, the method proposed in [55] was not able to find the target location properly due to the effect of the noise (Figure 8b).
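The noise injection can be sketched as follows, treating the stated 0.001 SNR as a linear power ratio (our assumption; NumPy is assumed available and the function name is ours):

```python
import numpy as np

def add_awgn(signal, snr_linear, rng=None):
    """Add white Gaussian noise scaled so that the noise power equals
    the signal power divided by the (linear) SNR."""
    rng = np.random.default_rng() if rng is None else rng
    p_signal = float(np.mean(signal ** 2))
    p_noise = p_signal / snr_linear          # e.g. SNR 0.001 -> noise 1000x signal power
    noise = rng.normal(0.0, p_noise ** 0.5, size=signal.shape)
    return signal + noise
```

At an SNR of 0.001 the noise power is one thousand times the signal power, which is why the comparison in this subsection is a demanding test for both algorithms.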
To draw a proper conclusion, we also made both algorithms travel towards a target moving along a complex path (Figure 9). It is apparent from Figure 9a that the proposed method performed well with the noisy range signal. On the other hand, the method proposed in [55], as in the previous example, was not successful.

5. Experimental Results

The proposed algorithm was experimentally tested using a DJI Matrice 600 Pro drone (Figure 10c). The drone was serially connected to an Intel NUC Mini PC (Figure 10a). We automated the drone with the aid of the DJI Onboard Software Development Kit Version 3.6, which was installed on the Mini PC; the experiment code was written in C++. A Decawave TREK1000 range sensor, which can be deployed for measuring range distances, is shown in Figure 10b. Two of these range sensors are needed for the distance measurements (one fixed to the drone and the other attached to one of the cows in the herd), and the data transmission between the two sensors happens through Ultra-Wideband radio technology. Figure 11 shows the accuracy of three different range measurements over 720 s. The accuracy is within ±2 cm.
In the experiment, the drone’s speed was set to 0.21 ms⁻¹. In all navigation (herd-tracking) plots, the x and y coordinates are expressed in meters. Unlike in the simulations, the drone was subjected to disturbances such as wind during the experiment. From the starting coordinate until around (2.4, 0), the drone maintained its initial heading. Thereafter, based on the range signal, the control law (4) changed the heading direction towards the herd’s location, and the drone travelled directly towards the herd. However, as is evident in Figure 12, the trajectory of the drone after the heading change is not a straight line or a combination of straight lines; the leading cause of this wavering behaviour was the resistance produced by the wind. The herd’s location corresponds to the origin of the error space. Furthermore, it can also be observed that the drone circled at the herd’s location. The reason for this circular movement is that the speed of the herd is zero while the drone maintains a speed greater than zero; this speed difference gives rise to the circling. Apart from the circular movement at the herd’s location, Figure 12 shows that the control law performs well when the herd is static.
The drone then followed a herd from a particular location to a destination; Figure 13a and Figure 14a show the results obtained. The speed of the drone was made close to that of the herd in order to address the drone’s circling issue discussed in the simulations. The experiments with a dynamic target (moving herd) were conducted in an identical manner to the simulations. In Figure 13a, the control law (4) made the drone gradually reduce its heading and navigate in the direction of the herd. However, similar to the static scenario in Figure 12, the wind was a significant factor and caused a minor drift from the herd’s path. The drone in Figure 14a gradually increased its heading and moved towards the herd. Around (9, −6), the drone reached the target (herd); in other words, the controller reached the origin of the error space through the sliding surface. The main cause of the range errors between the targets and drones in Figure 13a and Figure 14a is the disturbance from the wind, which can significantly affect the speed of the drone. Nevertheless, the results presented in Figure 13a and Figure 14a show that the algorithm performs robustly in the presence of disturbances.

6. Conclusions

This paper presented a novel methodology that can be implemented on an aerial drone for following a herd of cattle. The proposed method uses range data, i.e., the distance between the herd of cattle and the drone, delivered by simple inexpensive sensors without requiring any directional information. The navigation algorithm guarantees that an autonomous aerial drone approaches and follows a herd of cattle using only range data. The robustness of the controller ensures guaranteed interception of the herd of cattle regardless of the system dynamics involved. Generally, range measurements are robust and resilient against signal noise. Moreover, the algorithm does not need the range data to be accurate, rather it only relies on the rate of change of range data, which substantially improves robustness. The theory behind the proposed navigation algorithm was presented in detail, and the simulations and experimental results demonstrated the effectiveness of the proposed method. Furthermore, the proposed algorithm has outperformed other work when tested under noise. An interesting direction for future research is to extend the proposed approach to a network of aerial drones, see, e.g., [56], combining the autonomous navigation approach of the current paper with various methods of control of networked systems, see, e.g., [57,58,59]. Another interesting direction for future research is combining the developed navigation algorithm with advanced robust control and estimation techniques (see, e.g., [60,61]) to handle large uncertainties and non-linearities that are typical for dynamic models of drone’s and herd’s motion.

Author Contributions

Conceptualization, M.G. and A.V.S.; methodology, M.G. and A.V.S.; software, M.G.; validation, M.G. and J.K.; formal analysis, M.G. and J.K.; investigation, M.G. and J.K.; resources, A.H.T.E.D.S. and J.K.; data curation, M.G.; writing—original draft preparation, M.G.; writing—review and editing, M.G., A.V.S. and J.K.; visualization, M.G.; supervision, J.K.; project administration, J.K.; funding acquisition, A.V.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Australian Research Council. Furthermore, this work received funding from the Australian Government, via grant AUSMURIB000001 associated with ONR MURI grant N00014-19-1-2571.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Papandroulakis, N.; Dimitris, P.; Pascal, D. An automated feeding system for intensive hatcheries. Aquac. Eng. 2002, 26, 13–26.
  2. Jorgensen, M.; Adams-Progar, A.; De Passille, A.; Rushen, J.; Godden, S.; Chester-Jones, H.; Endres, M. Factors associated with dairy calf health in automated feeding systems in the Upper Midwest United States. J. Dairy Sci. 2017, 100, 5675–5686.
  3. Reis, J.; Weldon, A.; Ito, P.; Stites, W.; Rhodes, M.; Davis, D.A. Automated feeding systems for shrimp: Effects of feeding schedules and passive feedback feeding systems. Aquaculture 2021, 541, 736800.
  4. Pivoto, D.; Waquil, P.D.; Talamini, E.; Finocchio, C.P.S.; Dalla Corte, V.F.; de Vargas Mores, G. Scientific development of smart farming technologies and their application in Brazil. Inf. Process. Agric. 2018, 5, 21–32.
  5. Jayaraman, P.P.; Yavari, A.; Georgakopoulos, D.; Morshed, A.; Zaslavsky, A. Internet of things platform for smart farming: Experiences and lessons learnt. Sensors 2016, 16, 1884.
  6. Gupta, M.; Abdelsalam, M.; Khorsandroo, S.; Mittal, S. Security and privacy in smart farming: Challenges and opportunities. IEEE Access 2020, 8, 34564–34584.
  7. Pitla, S.; Bajwa, S.; Bhusal, S.; Brumm, T.; Brown-Brandl, T.M.; Buckmaster, D.R.; Condotta, I.; Fulton, J.; Janzen, T.J.; Karkee, M.; et al. Ground and Aerial Robots for Agricultural Production: Opportunities and Challenges; CAST: Ames, IA, USA, 2020.
  8. Driessen, C.; Heutinck, L.F. Cows desiring to be milked? Milking robots and the co-evolution of ethics and technology on Dutch dairy farms. Agric. Hum. Values 2015, 32, 3–20.
  9. Boogaard, B.K.; Bock, B.B.; Oosting, S.J.; Kroch, E. Visiting a farm: An exploratory study of the social construction of animal farming in Norway and the Netherlands based on sensory perception. Int. J. Sociol. Agric. Food 2010, 17, 24–50.
  10. Putjaika, N.; Phusae, S.; Chen-Im, A.; Phunchongharn, P.; Akkarajitsakul, K. A control system in an intelligent farming by using arduino technology. In Proceedings of the 2016 Fifth ICT International Student Project Conference (ICT-ISPC), Nakhonpathom, Thailand, 27–28 May 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 53–56.
  11. Srivastava, A.; Vijay, S.; Negi, A.; Shrivastava, P.; Singh, A. DTMF based intelligent farming robotic vehicle: An ease to farmers. In Proceedings of the 2014 International Conference on Embedded Systems (ICES), Coimbatore, India, 3–5 July 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 206–210.
  12. Shandilya, U.; Khanduja, V. Intelligent Farming System With Weather Forecast Support and Crop Prediction. In Proceedings of the 2020 5th International Conference on Computing, Communication and Security (ICCCS), Patna, India, 14–16 October 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6.
  13. Turner, L.; Udal, M.; Larson, B.; Shearer, S. Monitoring cattle behavior and pasture use with GPS and GIS. Can. J. Anim. Sci. 2000, 80, 405–413.
  14. Maroto-Molina, F.; Navarro-García, J.; Príncipe-Aguirre, K.; Gómez-Maqueda, I.; Guerrero-Ginel, J.E.; Garrido-Varo, A.; Pérez-Marín, D.C. A Low-Cost IoT-Based System to Monitor the Location of a Whole Herd. Sensors 2019, 19, 2298.
  15. McGranahan, D.A.; Geaumont, B.; Spiess, J.W. Assessment of a livestock GPS collar based on an open-source datalogger informs best practices for logging intensity. Ecol. Evol. 2018, 8, 5649–5660.
  16. Bailey, D.; Trotter, M.; Knight, C.; Thomas, M. Use of GPS tracking collars and accelerometers for rangeland livestock production research. J. Anim. Sci. 2017, 95, 360.
  17. Polojärvi, K.; Colpaert, A.; Matengu, K.; Kumpula, J. GPS collars in studies of cattle movement: Cases of northeast Namibia and north Finland. In Engineering Earth; Springer: Berlin/Heidelberg, Germany, 2011; pp. 173–187.
  18. Bebe, F.N.; Hutchens, T.; Andries, K.M.; Bates, K.J.; Gipson, T.; Evans, M. Meat Goats in Hillside Pastures: Control of Undesirable Plant Species and GPS Collar Determination of Activity Patterns. J. Ky. Acad. Sci. 2015, 75, 69–80.
  19. Ungar, E.; Nevo, Y.; Baram, H.; Arieli, A. Evaluation of the IceTag leg sensor and its derivative models to predict behaviour, using beef cattle on rangeland. J. Neurosci. Methods 2018, 300, 127–137.
  20. Panckhurst, B.; Brown, P.; Payne, K.; Molteno, T.C. Solar-powered sensor for continuous monitoring of livestock position. In Proceedings of the 2015 IEEE Sensors Applications Symposium (SAS), Zadar, Croatia, 13–15 April 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1–6.
  21. Collins, G.H.; Petersen, S.L.; Carr, C.A.; Pielstick, L. Testing VHF/GPS collar design and safety in the study of free-roaming horses. PLoS ONE 2014, 9, e103189.
  22. Vaughan, R.; Sumpter, N.; Henderson, J.; Frost, A.; Cameron, S. Experiments in automatic flock control. Robot. Auton. Syst. 2000, 31, 109–117. [Google Scholar] [CrossRef]
  23. Nardi, S.; Mazzitelli, F.; Pallottino, L. A game theoretic robotic team coordination protocol for intruder herding. IEEE Robot. Autom. Lett. 2018, 3, 4124–4131. [Google Scholar] [CrossRef]
  24. Harmati, I.; Skrzypczyk, K. Robot team coordination for target tracking using fuzzy logic controller in game theoretic framework. Robot. Auton. Syst. 2009, 57, 75–86. [Google Scholar] [CrossRef]
  25. Paranjape, A.A.; Chung, S.J.; Kim, K.; Shim, D.H. Robotic herding of a flock of birds using an unmanned aerial vehicle. IEEE Trans. Robot. 2018, 34, 901–915. [Google Scholar] [CrossRef] [Green Version]
  26. Gade, S.; Paranjape, A.A.; Chung, S.J. Herding a flock of birds approaching an airport using an unmanned aerial vehicle. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Kissimmee, FL, USA, 5–9 January 2015; p. 1540. [Google Scholar]
  27. Gade, S.; Paranjape, A.A.; Chung, S.J. Robotic herding using wavefront algorithm: Performance and stability. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, San Diego, CA, USA, 4–8 January 2016; p. 1378. [Google Scholar]
  28. Obermeyer, K. Path planning for a UAV performing reconnaissance of static ground targets in terrain. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Chicago, IL, USA, 10–13 August 2009; p. 5888. [Google Scholar]
  29. Cevik, P.; Kocaman, I.; Akgul, A.S.; Akca, B. The small and silent force multiplier: A swarm UAV—Electronic attack. J. Intell. Robot. Syst. 2013, 70, 595–608. [Google Scholar] [CrossRef]
  30. Christiansen, M.P.; Laursen, M.S.; Jørgensen, R.N.; Skovsen, S.; Gislum, R. Designing and testing a UAV mapping system for agricultural field surveying. Sensors 2017, 17, 2703. [Google Scholar] [CrossRef] [Green Version]
  31. Siebert, S.; Teizer, J. Mobile 3D mapping for surveying earthwork projects using an Unmanned Aerial Vehicle (UAV) system. Autom. Constr. 2014, 41, 1–14. [Google Scholar] [CrossRef]
  32. Lagkas, T.; Argyriou, V.; Bibi, S.; Sarigiannidis, P. UAV IoT framework views and challenges: Towards protecting drones as “Things”. Sensors 2018, 18, 4015. [Google Scholar] [CrossRef] [Green Version]
  33. Mademlis, I.; Mygdalis, V.; Nikolaidis, N.; Montagnuolo, M.; Negro, F.; Messina, A.; Pitas, I. High-level multiple-UAV cinematography tools for covering outdoor events. IEEE Trans. Broadcast. 2019, 65, 627–635. [Google Scholar] [CrossRef]
  34. Mozaffari, M.; Saad, W.; Bennis, M.; Debbah, M. Optimal transport theory for cell association in UAV-enabled cellular networks. IEEE Commun. Lett. 2017, 21, 2053–2056. [Google Scholar] [CrossRef]
  35. Haidari, L.A.; Brown, S.T.; Ferguson, M.; Bancroft, E.; Spiker, M.; Wilcox, A.; Ambikapathi, R.; Sampath, V.; Connor, D.L.; Lee, B.Y. The economic and operational value of using drones to transport vaccines. Vaccine 2016, 34, 4062–4067. [Google Scholar] [CrossRef] [Green Version]
  36. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
  37. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef] [Green Version]
  38. Jurak, T.; Bajer, J.; Jilek, A.; Bares, M.; Silinger, K.; Sedlacek, T. Pros and Cons Analysis of a Flying-wing and a Canard Conceptions for a Special Purpose UAV in High Altitude. In Proceedings of the 2019 4th Technology Innovation Management and Engineering Science International Conference (TIMES-iCON), Bangkok, Thailand, 11–13 December 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–5. [Google Scholar]
  39. Delmerico, J.; Mueggler, E.; Nitsch, J.; Scaramuzza, D. Active autonomous aerial exploration for ground robot path planning. IEEE Robot. Autom. Lett. 2017, 2, 664–671. [Google Scholar] [CrossRef] [Green Version]
  40. Loizou, S.G.; Kumar, V. Biologically inspired bearing-only navigation and tracking. In Proceedings of the 2007 46th IEEE Conference on Decision and Control, New Orleans, LA, USA, 12–14 December 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 1386–1391. [Google Scholar]
  41. Bekris, K.E.; Argyros, A.A.; Kavraki, L.E. Angle-based methods for mobile robot navigation: Reaching the entire plane. In Proceedings of the IEEE International Conference on Robotics and Automation, 2004, Proceedings, ICRA’04, New Orleans, LA, USA, 26 April–1 May 2004; IEEE: Piscataway, NJ, USA, 2004; Volume 3, pp. 2373–2378. [Google Scholar]
  42. Teimoori, H.; Savkin, A.V. Equiangular navigation and guidance of a wheeled mobile robot based on range-only measurements. Robot. Auton. Syst. 2010, 58, 203–215. [Google Scholar] [CrossRef]
  43. Matveev, A.S.; Teimoori, H.; Savkin, A.V. Range-only measurements based target following for wheeled mobile robots. Automatica 2011, 47, 177–184. [Google Scholar] [CrossRef]
  44. Matveev, A.S.; Teimoori, H.; Savkin, A.V. A method for guidance and control of an autonomous vehicle in problems of border patrolling and obstacle avoidance. Automatica 2011, 47, 515–524. [Google Scholar] [CrossRef]
  45. Matveev, A.S.; Wang, C.; Savkin, A.V. Real-time navigation of mobile robots in problems of border patrolling and avoiding collisions with moving and deforming obstacles. Robot. Auton. Syst. 2012, 60, 769–788. [Google Scholar] [CrossRef]
  46. Utkin, V.I. Sliding Modes in Control and Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  47. Baek, S.S.; Kwon, H.; Yoder, J.A.; Pack, D. Optimal path planning of a target-following fixed-wing UAV using sequential decision processes. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 2955–2962. [Google Scholar]
  48. Vanegas, F.; Campbell, D.; Roy, N.; Gaston, K.J.; Gonzalez, F. UAV tracking and following a ground target under motion and localisation uncertainty. In Proceedings of the 2017 IEEE Aerospace Conference, Big Sky, MT, USA, 4–11 March 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–10. [Google Scholar]
  49. Li, Y.; Doucette, E.A.; Curtis, J.W.; Gans, N. Ground target tracking and trajectory prediction by UAV using a single camera and 3D road geometry recovery. In Proceedings of the 2017 American Control Conference (ACC), Seattle, WA, USA, 24–26 May 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1238–1243. [Google Scholar]
  50. Minaeian, S.; Liu, J.; Son, Y.J. Effective and efficient detection of moving targets from a UAV’s camera. IEEE Trans. Intell. Transp. Syst. 2018, 19, 497–506. [Google Scholar] [CrossRef]
  51. Price, E.; Lawless, G.; Ludwig, R.; Martinovic, I.; Bülthoff, H.H.; Black, M.J.; Ahmad, A. Deep neural network-based cooperative visual tracking through multiple micro aerial vehicles. IEEE Robot. Autom. Lett. 2018, 3, 3193–3200. [Google Scholar] [CrossRef] [Green Version]
  52. Hermand, E.; Nguyen, T.W.; Hosseinzadeh, M.; Garone, E. Constrained control of UAVs in geofencing applications. In Proceedings of the 2018 26th Mediterranean Conference on Control and Automation (MED), Zadar, Croatia, 19–22 June 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 217–222. [Google Scholar]
  53. Štroner, M.; Urban, R.; Reindl, T.; Seidl, J.; Brouček, J. Evaluation of the georeferencing accuracy of a photogrammetric model using a quadrocopter with onboard GNSS RTK. Sensors 2020, 20, 2318. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  54. Vanegas, F.; Gonzalez, F. Enabling UAV navigation with sensor and environmental uncertainty in cluttered and GPS-denied environments. Sensors 2016, 16, 666. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  55. Koh, K.C.; Cho, H.S. A smooth path tracking algorithm for wheeled mobile robots with dynamic constraints. J. Intell. Robot. Syst. 1999, 24, 367–385. [Google Scholar] [CrossRef]
  56. Li, X.; Savkin, A.V. Networked Unmanned Aerial Vehicles for Surveillance and Monitoring: A Survey. Future Internet 2021, 13, 174. [Google Scholar] [CrossRef]
  57. Matveev, A.S.; Savkin, A.V. Estimation and Control over Communication Networks; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
  58. Bemporad, A.; Heemels, M.; Johansson, M. Networked Control Systems; Springer: Berlin/Heidelberg, Germany, 2010; Volume 406. [Google Scholar]
  59. Xia, Y.; Fu, M.; Liu, G.P. Analysis and Synthesis of Networked Control Systems; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2011; Volume 409. [Google Scholar]
  60. Petersen, I.R.; Ugrinovskii, V.A.; Savkin, A.V. Robust Control Design Using H Methods; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2000. [Google Scholar]
  61. Petersen, I.R.; Savkin, A.V. Robust Kalman Filtering for Signals and Systems with Large Uncertainties; Springer Science & Business Media: Berlin/Heidelberg, Germany, 1999. [Google Scholar]
Figure 1. UAV travelling to the location of the stationary herd.
Figure 2. Herding on a piecewise linear path.
Figure 3. Herding on a curved path.
Figure 4. Herding on a complex path.
Figure 5. Herding on a second complex path.
Figure 6. Herding on a third complex path.
Figure 7. The white Gaussian noise signal.
Figure 8. Comparison with a herd travelling in a linear trajectory: (a) The proposed method following the herd using noisy range data. (b) The method proposed in [55] following the same herd under the same conditions.
Figure 9. Comparison with a herd travelling in a complex trajectory: (a) The proposed method following the herd using noisy range data. (b) The method proposed in [55] following the same herd under the same conditions.
Figure 10. Hardware used for the practical experiments: (a) Intel NUC mini PC; (b) Decawave TREK1000 range sensor; (c) Matrice 600 Pro drone.
Figure 11. Three different range measurements recorded over a period of 12 min.
Figure 12. The drone navigating towards a static herd.
Figure 13. (a) Drone navigating towards a herd moving in a linear path. (b) The distance between the drone and the herd expressed as an error over time.
Figure 14. (a) Drone navigating towards a herd moving in a curved path. (b) The distance between the drone and the herd expressed as an error over time.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Citation: Gnanasekera, M.; Katupitiya, J.; Savkin, A.V.; De Silva, A.H.T.E. A Range-Based Algorithm for Autonomous Navigation of an Aerial Drone to Approach and Follow a Herd of Cattle. Sensors 2021, 21, 7218. https://0-doi-org.brum.beds.ac.uk/10.3390/s21217218