Article

Research on Predictive Control Algorithm of Vehicle Turning Path Based on Monocular Vision

1 College of Mechanical and Electrical Engineering, Shihezi University, Shihezi 832000, China
2 School of Precision Instrument and Opto-Electronics Engineering, Tianjin University, Tianjin 300072, China
* Author to whom correspondence should be addressed.
Submission received: 10 January 2022 / Revised: 15 February 2022 / Accepted: 17 February 2022 / Published: 21 February 2022

Abstract

To address the problem that a monocular vision vehicle navigation system is limited by the field of view of its charge-coupled device (CCD) camera and cannot acquire turning path information throughout the turning process, which decreases the turning control accuracy, this paper proposes a turning control algorithm based on monocular vision turning path prediction. Firstly, the camera's distortion was corrected. Secondly, the camera imaging model was built, and the position information of the turning path was determined from the imaging position relationship. A vehicle motion model was then built in accordance with the vehicle steering mode. Lastly, the turning trajectory of the vehicle was estimated from the vehicle's front axle length and front-wheel adjustment parameters, and the turning point and turning operations were determined from the predicted relationship between the vehicle turning trajectory and the turning path position. The experimental results showed that the proposed algorithm can effectively measure the position parameters of the turning path and complete vehicle turning control. The maximum absolute errors of the intercept and slope of the turning path position parameters were 0.2525 m and 0.014, respectively, and the turning control accuracy was 0.093 m and 0.085 m, which meets the vehicle navigation turning control requirements. The research can also provide a theoretical reference for precise turning navigation control of other vehicles and other path guidance modes.

1. Introduction

The development of artificial intelligence and digitization has laid the foundation for intelligent animal husbandry. To free farmers from heavy labor, improve the quality of operations, and reduce the risk of zoonosis, intelligent animal husbandry has become an important trend and a necessary step toward large-scale, industrialized animal husbandry [1,2]. Intelligent machines have been designed to move through livestock facilities and independently complete a series of inspection and feeding tasks during production [3,4,5]. As an important part of intelligent animal husbandry, autonomous navigation and driving systems provide key technical support. Because modern animal husbandry involves many functions, breeding areas are divided into small zones to make effective use of resources and simplify management, and the lanes in each zone are very compact [2]. It is therefore difficult to find a suitable turning point for each turn, which motivates the study of turning control algorithms.
Machine vision technology has been widely used in the automatic navigation and driving of vehicles operating in confined spaces because of its advantages in information collection and its strong environmental adaptability [6]. Automatic turning control relies heavily on visual positioning algorithms and steering control algorithms. Three types of visual sensors are used for visual positioning: monocular, binocular, and multi-ocular vision systems [7]. In the literature [8,9,10,11,12,13,14,15,16,17,18,19,20], monocular vision technology is used to acquire spatial position: the transformation between the image coordinate system and the vehicle coordinate system is obtained through geometric relations, from which position information is derived. Various studies [21,22,23,24,25,26,27,28,29,30,31,32,33] have used binocular and multi-ocular stereo vision to acquire spatial location: images from two or more cameras with different orientations are matched, the parallax is calculated, the target distance is obtained, and the target is located. The above research mainly focuses on measuring the positions of points on a plane and does not solve for the position parameters of a line such as a turning path.
To regulate vehicle path tracking, researchers [34,35,36,37,38,39,40,41,42,43,44,45] have mostly used PID control and PID-based control algorithms such as PD, fuzzy PID, and nested PID. These algorithms work well for straight paths and paths of small curvature, but they are limited by the field of view of a navigation system based on a machine vision sensor: during a turn, the turning path leaves the camera's field of view, image data are lost, and the system can no longer provide the path position information needed for turning control.
GPS (the Global Positioning System) uses navigation satellites to measure distances to receivers and thereby locate objects on the ground. The application of this technology in agricultural production greatly improves production efficiency. However, the loss of GNSS signals caused by extreme weather or obstruction limits its application in complex agricultural environments [46,47]. To achieve high-precision positioning and autonomous navigation of agricultural machinery and to improve operating efficiency, navigation is often carried out by fusing GPS with laser radar, gyroscopes, depth cameras, and other sensors [48,49,50,51,52]. However, as the accuracy increases, so does the cost.
To address the above problem, this paper proposes a vehicle turning control algorithm based on monocular vision turning path prediction. The camera is calibrated first; the pinhole imaging model is then used to convert the aligned image parameters into the actual ground position parameters of the turning path; a vehicle motion model is built to predict the vehicle's turning trajectory and the turning point; and the corresponding steering actions are executed to realize turning control after the path leaves the field of view. The main contributions are as follows:
  • The turning path predictive control algorithm of animal husbandry machinery can provide theoretical reference for accurate navigation control under other path guidance modes.
  • The imaging model was built, and the turn path location parameters were calculated according to the camera imaging position relationship to determine the turn path location information.
  • On the basis of the vehicle motion model, structure size, and front-wheel adjustment parameters, we predicted the vehicle turning trajectory; the relative position between the predicted trajectory and the turning path measured by the CCD camera was then used to determine the turning point.
The remainder of this paper is divided as follows: Section 2 contains the details of the turning path prediction algorithm. Section 3 details the simulation and experimental test results. Finally, Section 4 and Section 5 state the discussion and conclusions of this study.

2. Materials and Methods

2.1. Camera Distortion Correction

The camera used was a TXY-616_720P industrial camera. Lens distortion arises from imperfections in the camera's lens set and curvature errors of the lens surfaces, and it severely limits the positioning accuracy of the CCD camera. Lens distortion primarily comprises radial and tangential distortion; their mathematical models are given in Formulas (1) and (2), respectively:
\begin{cases} \Delta x_r = x \left( k_1 (x^2 + y^2) + k_2 (x^2 + y^2)^2 \right) \\ \Delta y_r = y \left( k_1 (x^2 + y^2) + k_2 (x^2 + y^2)^2 \right) \end{cases} \quad (1)
\begin{cases} \Delta x_d = 2 p_1 x y + p_2 (3 x^2 + y^2) \\ \Delta y_d = p_1 (3 y^2 + x^2) + 2 p_2 x y \end{cases} \quad (2)
where Δxr and Δxd are the radial and tangential distortion components along the transverse direction, respectively; Δyr and Δyd are the radial and tangential distortion components along the longitudinal direction, respectively; k1 and k2 are the radial distortion coefficients; p1 and p2 are the tangential distortion coefficients; and x and y are the coordinates of a pixel in the pixel coordinate system whose origin is the optical center, with the horizontal direction as the horizontal axis and the vertical direction as the longitudinal axis. Let the corrected coordinates be (Xcorrect, Ycorrect); they are obtained through Formula (3):
\begin{cases} X_{correct} = x + \Delta x_r + \Delta x_d \\ Y_{correct} = y + \Delta y_r + \Delta y_d \end{cases} \quad (3)
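As an illustration of how Formulas (1)–(3) combine, the short Python sketch below evaluates the distortion model directly for a single pixel coordinate measured from the optical center. It is not the paper's implementation (which relied on a camera calibrator and OpenCV); the function name and the test point are illustrative, and the coefficients are those reported in Table 1.

```python
def apply_distortion_model(x, y, k1, k2, p1, p2):
    """Evaluate the radial (Formula (1)) and tangential (Formula (2))
    distortion terms at the pixel coordinate (x, y), measured from the
    optical center, and return the corrected coordinates of Formula (3)."""
    r2 = x * x + y * y
    radial = k1 * r2 + k2 * r2 ** 2
    dx_r, dy_r = x * radial, y * radial                 # Formula (1)
    dx_d = 2 * p1 * x * y + p2 * (3 * x * x + y * y)    # Formula (2)
    dy_d = p1 * (3 * y * y + x * x) + 2 * p2 * x * y
    return x + dx_r + dx_d, y + dy_r + dy_d             # Formula (3)

# Example with the coefficients of Table 1 at an off-center point.
print(apply_distortion_model(0.2, -0.1,
                             k1=-0.345128826247987, k2=0.191077749729993,
                             p1=2.450580059939514e-4, p2=-1.939065402701762e-4))
```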
This work used a camera calibrator and OpenCV to calculate and correct the camera distortion parameters. First, we used the CCD camera to capture images of a black-and-white checkerboard (square side length of 30 mm) from different angles; we then imported the collected images into the camera calibrator to calculate the camera distortion parameters, listed in Table 1. Finally, the parameters were entered into an OpenCV-based distortion correction program to undistort the camera images. The effect is shown in Figure 1.
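For reference, the workflow above can be sketched with OpenCV's checkerboard calibration as follows. This is a minimal sketch, not the paper's exact pipeline: the image folder, test frame, and board corner count are assumed values, and the paper performed the parameter estimation in a separate camera calibrator before passing the coefficients to OpenCV.

```python
import glob
import cv2
import numpy as np

PATTERN = (9, 6)          # inner-corner count of the checkerboard (assumed)
SQUARE_SIZE_MM = 30.0     # 30 mm squares, as used in the paper

# 3D object points of the board corners in the board plane (Z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE_MM

obj_points, img_points = [], []
for path in glob.glob("checkerboard/*.png"):          # hypothetical image folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Distortion coefficients are returned as (k1, k2, p1, p2, k3),
# matching the radial/tangential model of Formulas (1)-(2).
_, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)

# Undistort a frame from the navigation camera (Formula (3) applied per pixel).
frame = cv2.imread("frame.png")                       # hypothetical test frame
undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
```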

2.2. Camera Imaging Model Construction and Turning Path Location

2.2.1. Camera Imaging Model Construction

The vision-based autonomous navigation system primarily uses a CCD camera to collect, identify, and fit path image information, and then guides and controls the vehicle on the basis of the relative location of the path and the vehicle. As shown in Figure 2, CCD imaging can be simplified into an imaging model. In the figure, h denotes the camera height and θ is the camera acquisition angle. Point Z is the position of the camera optical center, point O is the intersection of the optical axis and the ground, and OZ is the camera optical axis. Plane ABCD is the image acquisition range of the CCD camera at the current position, and plane A'B'C'D' is the equivalent CCD image plane, which is perpendicular to the optical axis OZ; points A', B', C', D' are the positions at which points A, B, C, D are imaged on the image plane, respectively. Taking point O as the coordinate origin, we establish the equivalent image plane coordinate system U-O-V and the ground coordinate system X-O-Y.

2.2.2. Position Measurements of Turning Path

The position relationship between the image plane and the ground can be obtained from the spatial relationship between the image plane coordinate system and the ground coordinate system in the camera imaging model, as shown in Figure 3a,b. Point Q in Figure 3a is an arbitrary point on the plane ABCD with ground coordinates (Qx, Qy); Q' is the corresponding point on the image plane with coordinates (Qu, Qv). Using the collinearity of the actual point Q, the image point Q', and the camera optical center Z, the correspondence between ground coordinates and image-plane coordinates can be established, as given in Formula (4):
\begin{cases} Q_x = \dfrac{Q_u (Q_y + h \tan\theta)}{Q_v + h \tan\theta} \\[4pt] Q_y = \dfrac{Q_v \, h (\cos\theta + \sin\theta \tan\theta)}{h - Q_v \sin\theta} \end{cases} \quad (4)
As shown in Figure 4, the image resolution is H × W, and the slope and intercept of the turning path in the pixel coordinate system I-O-J are αp and βp, respectively. Using the relative position relationship between the pixel coordinate system and the equivalent image plane coordinate system, the slope αt and intercept βt of the turning path in the equivalent image plane are calculated as follows:
\begin{cases} \alpha_t = \alpha_p \\ \beta_t = \left( H - \alpha_p \dfrac{W}{2} - \beta_p \right) \times \dfrac{L_{MN}}{W} \end{cases} \quad (5)
In this formula, LMN is the actual length of the intersection line MN between the equivalent image plane and the ground, as shown in Figure 3b. From Formula (4), the conversion between the slope αt and intercept βt in the equivalent image plane and the actual slope α and intercept β of the turning path on the ground is obtained as Formula (6):
\begin{cases} \alpha = \dfrac{\alpha_t \, h (\cos\theta + \sin\theta \tan\theta)}{h - \beta_t \sin\theta} \\[4pt] \beta = \dfrac{\beta_t \, h (\cos\theta + \sin\theta \tan\theta)}{h - \beta_t \sin\theta} \end{cases} \quad (6)
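The chain from a fitted line in the image to its ground position can be sketched in Python as below. This follows Formulas (5) and (6) as reconstructed above; the denominator signs, the assumed 1280 × 720 resolution, and the example values of αp, βp, and LMN are assumptions for illustration only.

```python
import numpy as np

def pixel_line_to_ground(alpha_p, beta_p, H, W, L_MN, h, theta):
    """Convert a path line j = alpha_p * i + beta_p fitted in pixel
    coordinates into the ground-plane slope/intercept (alpha, beta)."""
    # Formula (5): pixel coordinate system -> equivalent image plane.
    alpha_t = alpha_p
    beta_t = (H - alpha_p * W / 2.0 - beta_p) * L_MN / W

    # Formula (6): equivalent image plane -> ground coordinate system X-O-Y.
    scale = h * (np.cos(theta) + np.sin(theta) * np.tan(theta))
    denom = h - beta_t * np.sin(theta)
    return alpha_t * scale / denom, beta_t * scale / denom

# Example with the test configuration (camera height 1.2 m, tilt 30 degrees).
print(pixel_line_to_ground(alpha_p=1.5, beta_p=-300.0,
                           H=720, W=1280, L_MN=2.0,
                           h=1.2, theta=np.radians(30.0)))
```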

2.3. Vehicle Movement Model and Turn Control Algorithm Design

2.3.1. Vehicle Motion Model Establishment

Vehicle steering mainly takes three forms: articulated steering, differential steering, and Ackermann steering. The turning control object in this paper is an agricultural vehicle whose steering mode is Ackermann steering. Ackermann steering was designed to accommodate the different steering angles required of the inner and outer steered wheels, which travel on arcs of different radii when the vehicle turns. According to Ackermann steering geometry, the steering mechanism is designed so that, when the vehicle follows a bend, the steering angle of the inner wheel is about 2–4 degrees greater than that of the outer wheel; the circles traced by the four wheels then meet approximately at an instantaneous steering center on the extension of the rear axle, allowing the vehicle to turn smoothly. The steering principle is shown in Figure 5a, where Or is the steering center of the vehicle, φ1 is the deflection angle of the outer wheel, φ2 is the deflection angle of the inner wheel, F is the center point of the front axle, R is the midpoint of the rear axle, and L is the wheelbase.
The vehicle motion model can be simplified into a two-wheel (bicycle) model, as shown in Figure 5b, where X-O-Y is the ground coordinate system acquired by the camera, δ is the front-wheel angle, and Or is the instantaneous center of rotation. Let the starting coordinates of the rear-axle midpoint R and the front-axle center F be (rx, ry) and (fx, fy), respectively. If the vehicle speed is V and the front-wheel angle is held at δ over a period Δt, the position coordinates of R and F after Δt are (rx(Δt), ry(Δt)) and (fx(Δt), fy(Δt)), given by Formula (7), where ω is the vehicle's angular velocity about Or, and Orx and Ory are the coordinates of the rotation center Or; ω and Or are calculated using Formulas (8) and (9).
\begin{bmatrix} r_x(\Delta t) \\ r_y(\Delta t) \\ f_x(\Delta t) \\ f_y(\Delta t) \end{bmatrix} = \begin{bmatrix} \cos\omega\Delta t & -\sin\omega\Delta t & 0 & 0 \\ \sin\omega\Delta t & \cos\omega\Delta t & 0 & 0 \\ 0 & 0 & \cos\omega\Delta t & -\sin\omega\Delta t \\ 0 & 0 & \sin\omega\Delta t & \cos\omega\Delta t \end{bmatrix} \times \begin{bmatrix} r_x - O_{rx} \\ r_y - O_{ry} \\ f_x - O_{rx} \\ f_y - O_{ry} \end{bmatrix} + \begin{bmatrix} O_{rx} \\ O_{ry} \\ O_{rx} \\ O_{ry} \end{bmatrix} \quad (7)
\omega = \frac{V \tan\delta}{L} \quad (8)
\begin{bmatrix} O_{rx} \\ O_{ry} \end{bmatrix} = \begin{bmatrix} \tan\left( \tan^{-1}\dfrac{f_y - r_y}{f_x - r_x} - \dfrac{\pi}{2} \right) & -1 \\ \tan\left( \tan^{-1}\dfrac{f_y - r_y}{f_x - r_x} - \delta - \dfrac{\pi}{2} \right) & -1 \end{bmatrix}^{-1} \times \begin{bmatrix} \tan\left( \tan^{-1}\dfrac{f_y - r_y}{f_x - r_x} - \dfrac{\pi}{2} \right) \times f_x - f_y \\ \tan\left( \tan^{-1}\dfrac{f_y - r_y}{f_x - r_x} - \delta - \dfrac{\pi}{2} \right) \times r_x - r_y \end{bmatrix} \quad (9)
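A minimal Python sketch of one integration step of this motion model, under the reconstruction of Formulas (7)–(9) above, is given below. The sign conventions for the heading and front-wheel angle, and the pairing of the two perpendicular lines with points R and F, follow standard Ackermann geometry and are assumptions rather than the paper's exact notation.

```python
import numpy as np

def motion_step(r, f, delta, V, L, dt):
    """One time step of the two-wheel model: r, f are the (x, y) positions of
    the rear-axle midpoint R and front-axle center F, delta is the front-wheel
    angle (rad), V the speed (m/s), L the wheelbase (m), dt the step (s)."""
    r, f = np.asarray(r, float), np.asarray(f, float)
    if abs(delta) < 1e-9:                          # straight driving: no rotation center
        heading = (f - r) / np.linalg.norm(f - r)
        return r + V * dt * heading, f + V * dt * heading

    omega = V * np.tan(delta) / L                  # Formula (8)

    # Rotation center Or (Formula (9)): intersection of the rear-axle line
    # through R and the front-wheel axis through F.
    psi = np.arctan2(f[1] - r[1], f[0] - r[0])     # vehicle heading
    k1, k2 = np.tan(psi - np.pi / 2), np.tan(psi + delta - np.pi / 2)
    A = np.array([[k1, -1.0], [k2, -1.0]])
    b = np.array([k1 * r[0] - r[1], k2 * f[0] - f[1]])
    Or = np.linalg.solve(A, b)

    # Formula (7): rotate R and F about Or by omega * dt.
    c, s = np.cos(omega * dt), np.sin(omega * dt)
    rot = np.array([[c, -s], [s, c]])
    return rot @ (r - Or) + Or, rot @ (f - Or) + Or

# Example: wheelbase 3.25 m, speed 1.8 km/h, front-wheel angle 10 degrees, dt = 0.01 s.
r, f = (0.0, 0.0), (0.0, 3.25)
for _ in range(100):
    r, f = motion_step(r, f, np.radians(10.0), 1.8 / 3.6, 3.25, dt=0.01)
print(r, f)
```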

2.3.2. Design of Turning Control Algorithm

A turning control method based on vehicle turning path prediction was devised on the basis of the path acquisition characteristics of a machine-vision vehicle navigation system. First, the turning trajectory of the vehicle is predicted from the vehicle structure size, the front-wheel adjustment parameters, and the driving speed. Then, the relative position of the predicted trajectory with respect to the path to be turned onto is determined on the basis of the CCD camera, and the steering point is identified. When the vehicle reaches the steering point, the steering wheel angle is adjusted so that the vehicle drives onto the predetermined turning path. The parameters of the angle sensor [53] used to measure the front-wheel rotation angles corresponding to different steering wheel angles are shown in Table 2, and the relationship between steering wheel rotation and front-wheel deflection was fitted using the MATLAB data fitting tool. The fitted relationship is given in Formula (10) and plotted in Figure 6.
y = a x^4 + b x^3 + c x^2 + d x + e \quad (10)
where a = 0.000004473, b = 0.00001372, c = 0.001587, d = 0.04146, and e = 0.001829; y is the front-wheel angle (rad); and x is the steering wheel angle (rad).
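Formula (10) can be evaluated directly, for example with numpy's polynomial helper; the sketch below assumes both angles are expressed in radians, as interpreted above.

```python
import numpy as np

# Coefficients a..e of Formula (10), highest order first (as np.polyval expects).
COEFFS = [0.000004473, 0.00001372, 0.001587, 0.04146, 0.001829]

def front_wheel_angle(steering_wheel_angle_rad):
    """Front-wheel angle (rad) predicted from the steering wheel angle (rad)."""
    return np.polyval(COEFFS, steering_wheel_angle_rad)

# Example: one full steering wheel turn (2*pi rad) -> predicted front-wheel angle in degrees.
print(np.degrees(front_wheel_angle(2 * np.pi)))
```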
The test vehicle's steering actuator is a stepper-motor-driven gear actuator, whose structure is shown in Figure 7. A stepper motor is used as the power source to drive the steering wheel through a gear transmission and thus realize front-wheel steering control. The working speed of the stepper motor was 5.3 rad/s and the gear ratio was 1:4, so the adjustment speed of the steering wheel was 1.33 rad/s. The wheelbase L of the vehicle was measured to be 3.25 m, and the maximum deflection angles of the front wheels to the left and right were 35° and 21°, respectively.
On the basis of the constructed vehicle motion model, the steering actuator (Figure 7), the angular relationship between the steering wheel and the front wheels, and the vehicle structural parameters, we calculated the turning trajectory of the vehicle for a right turn as an example, with the front-wheel adjustment angle ranging from 0° to 21°, a driving speed of 1.8 km/h, and a time step of Δt = 0.01 s. The results are shown in Figure 8.
As shown in Figure 8, the front-wheel adjustment during cornering consists of two stages: an angle adjustment stage and a fixed-angle stage. Segments RR1 and FF1 correspond to the steering wheel adjustment stage, and segments R1R2 and F1F2 correspond to the fixed steering wheel stage, in which the vehicle moves along a circular arc.
The best turning point of the vehicle is reached when the turning path is tangent to the trajectory of the center point of the vehicle's front axle. Since the angle between the turning path and the current driving direction is about 90°, the tangent point between the turning path and the vehicle trajectory lies in the F1F2 segment, i.e., the fixed steering wheel stage.
Let the turning path be expressed as y = αx + β. From the positional relationship at tangency, the vehicle trajectory is tangent to the transverse path to be turned onto when the distance S between the vehicle rotation center Or and the turning path during the F1F2 stage equals the current turning radius R of the front axle center; this defines the best turning point of the vehicle. S and R are calculated as follows:
S = \frac{\left| \alpha O_{rx} - O_{ry} + \beta \right|}{\sqrt{\alpha^2 + 1}} \quad (11)
R = \frac{L}{\sin\delta} \quad (12)
where Orx and Ory are the coordinates of the rotation center Or, L is the vehicle wheelbase, and δ is the front-wheel angle.
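The tangency test S = R can be sketched as follows; the tolerance and the example path parameters and rotation center are illustrative assumptions, not values from the paper.

```python
import numpy as np

def is_turning_point(alpha, beta, Or, L, delta, tol=0.05):
    """Return True when the distance from the rotation center Or to the
    turning path y = alpha*x + beta (Formula (11)) matches the front-axle
    turning radius (Formula (12)) within an assumed tolerance in metres."""
    S = abs(alpha * Or[0] - Or[1] + beta) / np.sqrt(alpha ** 2 + 1.0)  # Formula (11)
    R = L / np.sin(delta)                                              # Formula (12)
    return abs(S - R) <= tol

# Example: wheelbase 3.25 m, maximum right front-wheel angle of 21 degrees,
# with illustrative path parameters and rotation center.
print(is_turning_point(alpha=0.01, beta=12.16, Or=(-8.8, 3.0),
                       L=3.25, delta=np.radians(21.0)))
```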

3. Results

To verify the feasibility of the vehicle turning control algorithm proposed in this paper, we integrated the steering control algorithm designed here with the path detection algorithm developed in our earlier work. The industrial computer was a Lenovo laptop with an i7 6700HQ CPU, 8 GB of memory, and a GTX960M graphics card; the vehicle turning control program was written with VS2013 MFC. The program interface during the experiment is shown in Figure 9; the camera view is distortion-corrected, and the red guide line is the path navigation line. The program consists of three parts: vehicle status, serial port setting, and image source selection. The vehicle status includes four parameters (vehicle deflection angle, front wheel deviation, front-wheel deflection angle, and adjustment angle), which were calculated from the data transmitted by the program and the sensors. We built a test platform with an agricultural vehicle as the test object. Sensors, cameras, the laptop, and other equipment were installed as shown in Figure 10a,b, with the industrial computer placed on the passenger seat. In September 2019, the navigation test site at the North Campus factory of Shihezi University in Xinjiang was used for the turning path position measurement and vehicle turning tests.
As shown in Figure 11a, the CCD camera was installed at the front of the test vehicle at a height h of 1.2 m above the ground, with the angle θ between the optical axis and the ground plane set to 30°. The guide line of the navigation test path, shown in Figure 11b, was drawn in the test area; the guide line was yellow and 2.5 cm wide, the longitudinal approach path was 20 m long, and the turning path was 10 m long.
During the test, the vehicle was first guided along the longitudinal approach path using the straight-line path tracking algorithm developed in our earlier work. Cartesian coordinates were set up at the test site, and the turning track of the vehicle was recorded. The best turning point was found according to Formulas (11) and (12), its tangent was projected onto the established coordinate axes, and the actual measured values were obtained. The path blocking method was then adopted to conduct tests on turning paths A, B, and C for each front axle length. At the start of each turn, we recorded the turning path position parameters measured by the CCD camera (slope α and intercept β) and compared them with the actual values, while also evaluating the vehicle turning control accuracy. With a front axle length of 1.72 m, the results are shown in Table 3, and the linear functions of the measured and actual position parameters of the turning path are shown in Figure 12 and Figure 13. With a front axle length of 1.43 m, the results are shown in Table 4, and the corresponding linear functions are shown in Figure 14 and Figure 15.
When the front axle length was 1.72 m, the errors between the measured and actual positions of path A in α and β were −0.012 and 0.0985, respectively. Similarly, for path B the errors in α and β were −0.008 and 0.1193, and for path C they were −0.004 and 0.0686. The larger the angle between the transverse path and the longitudinal path, the smaller the slope error, and the algorithm meets the requirements of vehicle turning.
When the front axle length was 1.43 m, the errors between the measured and actual positions of path A in α and β were −0.014 and 0.0265, respectively. Similarly, for path B the errors in α and β were −0.01 and 0.2525, and for path C they were −0.008 and 0.0435. Again, the larger the angle between the transverse path and the longitudinal path, the smaller the slope error, and the algorithm meets the requirements of vehicle turning.
The following conclusions can be drawn from the above experimental results and data analysis. When the length of the front axle was 1.72 m, the fitted functions of the turning path in Figure 12 and Figure 13 show that the turning path predictive control algorithm had good stability for turning paths of 90° and 105°; when the turning path angle was greater than or equal to 120°, the algorithm showed noticeable errors. However, these errors did not affect the actual turning behavior and therefore remained within a range the algorithm can handle. When the front axle length was 1.43 m, Figure 14 and Figure 15 show that the error for turning path angles greater than 120° decreased, the turning control accuracy improved, and the stability of the control algorithm was better. As shown in Table 3 and Table 4, for both front axle sizes the proposed turning control algorithm effectively measured the position parameters of the path to be turned onto and identified the turning point. The maximum measurement errors of the slope of the turning path position parameters were 0.012 and 0.014, respectively, the maximum measurement errors of the intercept were 0.1193 m and 0.2525 m, respectively, and the turning control accuracy was 0.093 m and 0.085 m.
Because the test ground is not perfectly flat, the position of the CCD camera relative to the ground varies somewhat while the vehicle is driving, which introduces a certain error into the measurement of the position parameters of the path to be turned onto. In addition, fluctuations in vehicle speed cause a certain deviation between the predicted and actual trajectories. These two error sources lead to a deviation between the determined turning point and the ideal turning point, and hence to a certain error in the turning control. However, both the path position measurement error and the steering control error remain within the effective range, so the algorithm meets the turning control requirements for vehicle navigation operation.

4. Discussion

This paper studied a turning control algorithm based on turning path prediction, mainly addressing path position measurement, vehicle trajectory prediction, and turning point determination based on monocular vision.
(1) For the navigation path position measurement problem, we first corrected the camera's distortion, then constructed the imaging model and established the path position acquisition function according to the camera imaging position relationship, realizing the measurement of the path position parameters on the working surface.
(2) For steering point prediction, we established the vehicle motion model according to the vehicle steering mode, predicted the vehicle turning trajectory according to the vehicle structure size and front-wheel adjustment parameters, and determined the steering point according to the path position parameters.

5. Conclusions

The test results show that the monocular-vision vehicle steering control algorithm proposed in this paper can accurately identify steering points, mitigate the field-of-view limitation of monocular vision during turning, and improve the accuracy of vehicle steering control. The steering control algorithm is based mainly on the steering of the front axle. For the two front axle lengths tested, the maximum measurement error of the slope of the turning path position parameters was 0.014, the maximum measurement error of the intercept was 0.2525 m, and the turning control accuracy was 0.093 m and 0.085 m, respectively, meeting the steering control requirements for vehicle navigation operation.
In the future, a comparison with other experimental studies should be performed, and further research should be conducted along the following lines. The machine-vision-based turning predictive control algorithm needs to be improved: our tests showed that strong light or deep shadows make detection of the navigation guide line unstable, and excessive camera deflection introduces bias. The integration of GPS, multi-sensor fusion, and deep learning is worth studying and applying at this step. To make the cornering prediction algorithm more adaptable, more factors should be considered, such as the influence of speed changes at a given steering angle, tire pressure, and other conditions on vehicle cornering. In future work, these factors can be incorporated into the algorithm optimization to improve the accuracy of cornering predictive control.

Author Contributions

Conceptualization, Y.L. and J.L.; methodology, Y.L. and Q.Y.; software, Y.L.; validation, Y.L., W.Z.; formal analysis, Y.L.; resources, Y.L.; data curation, Y.L.; writing—original draft preparation, Y.L.; writing—review and editing, Y.L. and J.N.; project administration, J.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Innovation and Development Special Projects of Shihezi University (No. CXFZ202103) and the Achievement transformation and Technology Extension projects of Shihezi University (No. CGZH202103).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

Thanks goes to Li Jingbin and Nie Jing from Shihezi University for their guidance on paper writing; Yao Qingwang, Zhou Wenhao, Zhang Chongman, and other students for their help in this experiment; and the lab students for their support and help in this research.

Conflicts of Interest

The authors declare no conflict of interest and no inappropriate personal circumstances or interests. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. He, D.; Liu, D.; Zhao, K. Advances in Intelligent Perception and Behavior Detection of Animal Information in Precision Animal Husbandry. Trans. CSAM 2016, 47, 231–244.
2. Dai, D.-L.; Liu, Z.-H.; Zhao, C.; Qin, Q.; Zhang, C.-Y. Progress of Artificial Intelligence Technology in Animal Husbandry. Anim. Husb. Feed. Sci. 2021, 42, 112–119.
3. Yang, S.; Liang, S.; Zheng, Y.; Tan, Y.; Xiao, Z.; Li, B.; Liu, X. Integrated Navigation Models of a Mobile Fodder-Pushing Robot Based on a Standardized Cow Husbandry Environment. Trans. ASABE 2020, 63, 221–230.
4. Ting, Z.; Lai, W. Aircraft Pastoral Operational Navigation Monitoring System Based on GNSS. Electron. World 2016, 15, 114–116.
5. Yunyong, D. Research on Mapping and Navigation of Environment Inspection Robot in Animal Husbandry Farm. Master's Thesis, Jiangxi University of Science and Technology, Jiangxi, China, 2020.
6. Nitsche, M.A.; Cristóforis, P.D. Real-Time On-Board Image Processing Using an Embedded GPU for Monocular Vision-Based Navigation; Springer: Berlin/Heidelberg, Germany, 2012.
7. Chen, M.; Tang, Y.; Zou, X.; Huang, K.; Huang, Z.; Zhou, H.; Wang, C.; Lian, G. Three-Dimensional Perception of Orchard Banana Central Stock Enhanced by Adaptive Multi-Vision Technology. Comput. Electron. Agric. 2020, 174, 105508.
8. Gehrig, D.; Rüegg, M.; Gehrig, M.; Hidalgo-Carrió, J.; Scaramuzza, D. Combining Events and Frames Using Recurrent Asynchronous Multimodal Networks for Monocular Depth Prediction. IEEE Robot. Autom. Lett. 2021, 6, 2822–2829.
9. Zhang, Z.; Cao, Y.; Ding, M.; Zhuang, L.; Tao, J. Monocular Vision Based Obstacle Avoidance Trajectory Planning for Unmanned Aerial Vehicle. Aerosp. Sci. Technol. 2020, 106, 106199.
10. Xu, C.; Liu, Z.; Li, Z. Robust Visual-Inertial Navigation System for Low Precision Sensors Under Indoor and Outdoor Environments. Remote Sens. 2021, 13, 772.
11. Huang, L.; Wu, G.; Tang, W.; Wu, Y. Obstacle Distance Measurement Under Varying Illumination Conditions Based on Monocular Vision Using a Cable Inspection Robot. IEEE Access 2021, 99, 55955–55973.
12. Venkateswaran, N.; Hans, W.J.; Padmapriya, N. Deep Learning Based Robust Forward Collision Warning System with Range Prediction. Multimed. Tool Appl. 2021, 8, 1–19.
13. Xu, D.; Han, L.; Tan, M.; Li, Y.F. Ceiling-Based Visual Positioning for an Indoor Mobile Robot with Monocular Vision. IEEE Trans. Ind. Electron. 2009, 56, 1617–1628.
14. Guo, Y.; Xiao, Z.; Chen, H.; Huang, L. Planar-Based Visual Positioning for a Mobile Robot with Monocular Vision. In Proceedings of the International Conference on Cloud Computing and Security, Nanjing, China, 16–18 June 2017.
15. Han, Y.-X.; Zhang, Z.-S.; Dai, M. Monocular Vision System for Distance Measurement Based on Feature Points. Opt. Precis. Eng. 2011, 19, 1082–1087.
16. Nie, J.; Wang, N.; Li, J.; Wang, K.; Wang, H. Meta-learning prediction of physical and chemical properties of magnetized water and fertilizer based on LSTM. Plant Methods 2021, 17, 1–13.
17. Nie, J.; Li, Y.; She, S.; Chao, X. Magnetic shielding analysis for arrayed Eddy current testing. J. Magn. 2019, 24, 328–332.
18. Li, Y.; Yang, J. Few-shot cotton pest recognition and terminal realization. Comput. Electron. Agric. 2020, 169, 105240.
19. Wang, P.; Xiao, X.; Zhang, Z.; Sun, C. Study on the Position and Orientation Measurement Method with Monocular Vision System. Chin. Opt. Lett. 2010, 8, 55–58.
20. Royer, E.; Lhuillier, M.; Dhome, M.; Lavest, J. Monocular Vision for Mobile Robot Localization and Autonomous Navigation. Int. J. Comput. Vis. 2007, 74, 237–260.
21. Wang, M.D.; Han, B.L.; Luo, Q.S. Binocular Visual Navigation and Obstacle Avoidance of Mobile Robots Based on Speeded-Up Robust Features. Comput. Aid. Draft. Des. Manuf. 2013, 23, 22–28.
22. Zhang, Y.; Wang, X.; Astronautics, S.O. Landmark Fixed High-Precision Binocular Visual Navigation Method. J. Beijing Univ. Aeronaut. Astronaut. 2014, 40, 1305–1311.
23. Wang, L.; Zhen, L.; Zhang, Z. An On-Line Calibration Algorithm for External Parameters of Visual System Based on Binocular Stereo Cameras. Proc. SPIE Int. Soc. Opt. Eng. 2014, 9301.
24. Chen, Q.; Miao, X.; Jiang, H.; Wang, L.; Chen, J. Measurement of Tree Barriers in Transmission Line Corridors Based on Binocular Stereo Vision. In Proceedings of the 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), Shenzhen, China, 13–15 December 2018; Volume 2018.
25. Zhao, L.; Peng, T.; Zhao, Y.; Xia, P.; Xie, Y. Features Extraction of Flotation Froth Based on Equivalent Binocular Stereo Vision. IFAC PapersOnLine 2016, 49, 90–95.
26. Shen, T.; Liu, W.; Wang, J. Distance Measurement System Based on Binocular Stereo Vision. In Proceedings of the First IITA International Joint Conference on Artificial Intelligence, Hainan Island, China, 25–26 April 2009.
27. Zhang, C. Binocular Vision Navigation Method of Marine Garbage Cleaning Robot in Unknown Dynamic Scene. J. Coast. Res. 2020, 103, 864.
28. Cheng, L.; Dai, Y.; Peng, R.; Nong, X. Positioning and Navigation of Mobile Robot with Asynchronous Fusion of Binocular Vision System and Inertial Navigation System. Int. J. Adv. Robot. Syst. 2017, 14, 172988141774560.
29. Li, Y.; Nie, J.; Chao, X. Do we really need deep CNN for plant diseases identification? Comput. Electron. Agric. 2020, 178, 105803.
30. Li, Y.; Chao, X. ANN-Based Continual Classification in Agriculture. Agriculture 2020, 10, 178.
31. Li, Y.; Yang, J. Meta-learning baselines and database for few-shot classification in agriculture. Comput. Electron. Agric. 2021, 182, 106055.
32. Wu, J.; Snáel, V.; Abraham, A. A Vision-Based Navigation System of Mobile Tracking Robot. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Taipei, Taiwan, 18–21 August 2010; Volume 2010.
33. Zhang, Z.; Gong, M.; Peng, T. A New Non-Parallel Binocular Stereo Vision Ranging System Using Combinations of Linear and Nonlinear Methods. In Proceedings of the Sixth International Conference on Optical and Photonic Engineering, Shanghai, China, 8–11 May 2018.
34. Nishat, A. Design of PID Controller for High-Order Process via IMC Scheme in Frequency Domain. In International Conference on Computational and Characterization Techniques in Engineering and Sciences; Department of Electrical Engineering, National Institute of Technology: Telengana, India, 2018.
35. Zhou, H.; Chen, R.; Zhou, S.; Liu, Z. Design and Analysis of a Drive System for a Series Manipulator Based on Orthogonal-Fuzzy PID Control. Electronics 2019, 8, 1051.
36. Urrea, C.; Muñoz, J. Path Tracking of Mobile Robot in Crops. J. Intell. Robot. Syst. 2015, 80, 193–205.
37. Netto, S.M. Nested PID Steering Control for Lane Keeping in Autonomous Vehicles. Control Eng. Pract. 2011, 19, 1459–1467.
38. Dai, A.; Zhou, X.; Liu, X. Design and Simulation of a Genetically Optimized Fuzzy Immune PID Controller for a Novel Grain Dryer. IEEE Access 2017, 5, 14981–14990.
39. Li, Y.; Chao, X. Semi-supervised few-shot learning approach for plant diseases recognition. Plant Methods 2021, 17, 1–10.
40. Li, Y.; Yang, J.; Wen, J. Entropy-based redundancy analysis and information screening. Digit. Commun. Netw. 2021. In press.
41. Hui, L.; Xingqiao, L.; Jing, L. The Research of Fuzzy Immune Linear Active Disturbance Rejection Control Strategy for Three-Motor Synchronous System. CEAI 2015, 17, 50–58.
42. Li, S.; Xu, H.; Ji, Y.; Cao, R.; Zhang, M.; Li, H. Development of a Following Agricultural Machinery Automatic Navigation System. Comput. Electron. Agric. 2019, 158, 335–344.
43. Li, Y.; Chao, X. Toward Sustainability: Trade-Off Between Data Quality and Quantity in Crop Pest Recognition. Front. Plant Sci. 2021, 12, 811241.
44. Li, Y.; Chao, X. Distance-Entropy: An effective indicator for selecting informative data. Front. Plant Sci. 2022, 12, 3167.
45. Xuan, H.V.; Ivanov, V.E.; Dinh, T.N. Using signals from GLONASS/GPS navigation systems to correct the readings of a digital magnetic compass. ITM Web Conf. 2019, 30, 03006.
46. Zhang, M.; Ji, Y.; Li, S.; Cao, R.; Xu, H.; Zhang, Z. Research progress of agricultural machinery navigation technology. Trans. CSAM 2020, 51, 1–18.
47. Xiuri, P. Application of RTK GPS system in intelligent agricultural machinery and equipment. Nan Fang Nong Ji 2021, 52, 84–86.
48. Yang, Y.; Tiange, W.; Miao, W. Design of unmanned vehicle navigation system based on GPS and four wire lidar. J. Shenyang Univ. Technol. 2020, 39, 13–17, 34.
49. Yang, R.; Wang, G.; Gao, W.; Sun, Q.; Zhang, Y. An anti-interference MIMU/GPS vehicle integrated navigation algorithm based on IDNN-EKF. In Proceedings of the Position, Location and Navigation Symposium IEEE, Savannah, GA, USA, 11–14 April 2016.
50. Wu, D.; Wang, L. Research on autonomous navigation and positioning agricultural machinery equipment based on GPS and machine vision. J. Agric. Mech. Res. 2018, 40, 221–225.
51. Lingyu, M.; Lei, M.; Tao, Z. Full integrated navigation based on INS/GPS/magnetometer. Sens. Microsyst. 2019, 38, 33–35, 39.
52. Zhang, T.; Li, H.; Chen, D.; Huang, P.; Zhuang, X. Agricultural vehicle path tracking and navigation system based on multi-source sensor information fusion. Trans. CSAM 2015, 46, 37–42.
53. Beijing Torch Sensor Tech. Co., Ltd. Available online: http://www.tcsensor.com/product/?12-1.html (accessed on 7 January 2022).
Figure 1. Camera calibration diagram.
Figure 2. Camera imaging model.
Figure 3. Position relationship diagram. (a) Y-V relationship diagram. (b) X-U relationship diagram.
Figure 4. Coordinate system position relationship diagram.
Figure 5. Vehicle steering diagram. (a) Ackermann steering principle diagram. (b) Vehicle two-wheel steering diagram.
Figure 6. Fitting diagram of the relationship between the steering wheel and front-wheel angle.
Figure 7. Steering actuator.
Figure 8. Vehicle turning trajectory prediction.
Figure 9. Program interface.
Figure 10. Equipment layout. (a) Fixed position of equipment. (b) Position of angle sensor.
Figure 11. Turning test chart. (a) Camera installation drawing. (b) Schematic of the test path.
Figure 12. Measurement position of turning path (length of the front axle is 1.72 m).
Figure 13. Actual position of turning path (length of the front axle is 1.72 m).
Figure 14. Measurement position of turning path (length of the front axle is 1.43 m).
Figure 15. Actual position of turning path (length of the front axle is 1.43 m).
Table 1. Camera calibration parameters.
k1: −0.345128826247987
k2: 0.191077749729993
p1: 2.450580059939514 × 10−4
p2: −1.939065402701762 × 10−4
Table 2. Angle sensor parameters.
Range: 360° measurement, sinusoidal output; Repeatability: ±0.05%
Linear range: ±45°, ±30°, ±20°; Sensitivity: ≈40 mV/1° (Vin = 5 V)
Working voltage: DC 5 V (DC 3.8~8 V); Working current: <20 mA
Output: 1~4 V; Rotation mode: continuous rotation
Humidity range: 0~95% RH; Service temperature: −25 °C~+75 °C
Table 3. Test results (front axle length 1.72 m).
Path | Measured α | Measured β (m) | Actual α | Actual β (m) | Error α | Error β (m) | Steering accuracy (m)
A | 1.5469 | −6.641 | 1.5589 | −6.7395 | −0.012 | 0.0985 | 0.093
B | 1.8290 | −8.130 | 1.8298 | −8.2493 | −0.008 | 0.1193 | 0.075
C | 2.356 | −8.7432 | 2.3600 | −8.8118 | −0.004 | 0.0686 | 0.061
Average | 1.9106 | −7.8381 | 1.9162 | −7.9335 | −0.006 | 0.0955 | 0.084
Maximum | 2.356 | −8.7432 | 2.36 | −8.8118 | −0.004 | 0.1193 | 0.093
Table 4. Test results (front axle length 1.43 m).
Path | Measured α | Measured β (m) | Actual α | Actual β (m) | Error α | Error β (m) | Steering accuracy (m)
A | 1.5165 | −6.353 | 1.5179 | −6.3795 | −0.014 | 0.0265 | 0.085
B | 1.7982 | −7.980 | 1.8102 | −8.2325 | −0.012 | 0.2525 | 0.072
C | 2.314 | −8.135 | 2.320 | −8.1785 | −0.006 | 0.0435 | 0.083
Average | 1.8648 | −7.4893 | 1.8737 | −7.7175 | −0.011 | 0.1075 | 0.08
Maximum | 2.285 | −8.135 | 2.293 | −8.356 | −0.014 | 0.2525 | 0.085
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
