Article

A Machine Vision-Based Method for Monitoring Broiler Chicken Floor Distribution

Yangyang Guo, Lilong Chai, Samuel E. Aggrey, Adelumola Oladeinde, Jasmine Johnson and Gregory Zock

1 Department of Poultry Science, College of Agricultural & Environmental Sciences, University of Georgia, Athens, GA 30602, USA
2 College of Mechanical and Electronic Engineering, Northwest A&F University, Yangling, Shaanxi 712100, China
3 U.S. National Poultry Research Center, USDA-ARS, Athens, GA 30605, USA
* Author to whom correspondence should be addressed.
Submission received: 8 May 2020 / Revised: 26 May 2020 / Accepted: 1 June 2020 / Published: 3 June 2020
(This article belongs to the Collection Sensors in Agriculture and Forestry)

Abstract
The proper spatial distribution of chickens is an indication of a healthy flock. Routine inspections of broiler chicken floor distribution are done manually in commercial grow-out houses every day, which is labor intensive and time consuming. This task requires an efficient and automatic system that can monitor the chickens' floor distribution. In the current study, a machine vision-based method was developed and tested in an experimental broiler house. For the new method to recognize bird distribution in the images, the pen floor was virtually defined/divided into drinking, feeding, and rest/exercise zones. As broiler chickens grew, the images collected each day were analyzed separately to avoid biases caused by changes of body weight/size over time. About 7000 chicken areas/profiles were extracted from images collected from 18 to 35 days of age to build a back propagation (BP) neural network model for floor distribution analysis, and another 200 images were used to validate the model. The results showed that the identification accuracies of bird distribution in the drinking and feeding zones were 0.9419 and 0.9544, respectively. The correlation coefficient (R), mean square error (MSE), and mean absolute error (MAE) of the BP model were 0.996, 0.038, and 0.178, respectively, in our analysis of broiler distribution. Missed detections were mainly caused by interference from equipment (e.g., the feeder hanging chain and water line); studies are ongoing to address these issues. This study provides the basis for devising a real-time evaluation tool to detect broiler chicken floor distribution and behavior in commercial facilities.

1. Introduction

In commercial poultry houses, animal floor uniformity and distribution in drinking, feeding, and resting zones are critical for evaluating flock production, animal health, and wellbeing. The proper distribution of chickens is an indication of a healthy flock. Currently, daily routine inspections of broiler flock distributions are done manually in commercial grow-out houses, which is labor intensive and time consuming. This task requires an efficient system that can monitor chicken floor distribution and behavior automatically, to provide information for the early detection of potential problems [1,2,3].
Noncontact and nondestructive monitoring methods such as machine vision-based technology (MVT) have been suggested and tested for monitoring poultry and livestock behavior and for individual identification [4,5,6,7,8,9]. MVT has also been used to evaluate welfare status (e.g., lameness, estrus, pecking, etc.) [10,11,12,13] and for body size or weight assessments [14,15]. For poultry housing, different versions of MVT have been tested for the identification of specific behaviors under given scenarios (e.g., feeding and drinking as affected by environmental factors or enrichments) and of general group behavior (e.g., activity index and locomotion), with or without assistance from other sensors (e.g., radio-frequency identification and accelerometers) [16,17,18,19,20,21]. However, most existing procedures have limitations or high levels of uncertainty in monitoring group chicken behavior and distribution across feeding, drinking, and resting zones, owing to the higher animal density (>10,000 broiler chickens in a commercial facility) compared with other animal (e.g., cattle and swine) facilities [7,8,22]. The "optical flow" method for measuring broiler welfare based on optical flow statistics of flock movements recorded on video [19,22,23], and the "eYeNamic" system for gait score monitoring in broiler houses [17,24], are the most common vision-based methods. These systems represent the first proof of concept for broiler welfare evaluation via computer or machine vision-based methods. However, to our knowledge, there is no method with a high level of efficiency and accuracy for monitoring group bird behavior or for tracking individual birds in a commercial broiler house setting.
The back propagation (BP) neural network algorithm is a multilayer feedforward network trained according to the error back propagation algorithm [25], and is one of the most widely applied neural network models. BP networks can learn and store large sets of input–output mapping relations without the mathematical equation describing these mappings needing to be specified in advance. In recent years, BP neural network algorithms have been tested and have shown high accuracy in monitoring the production and behavior of large animals, e.g., pig pen floor behavior recognition, rating, and body weight prediction [26,27,28,29]. Applying this method to poultry monitoring requires modifications to the algorithm and training with a large number of animal images reflecting changes in body size.
The objectives of this study were to: (1) develop a machine vision-based method for monitoring broiler chicken floor distribution (i.e., the real-time number of birds in the drinking, feeding, and resting zones); (2) train the BP neural network model with broiler chicken images collected at different ages; and (3) test the new machine vision-based method in terms of its ability to identify the distribution of broiler chickens in the feeding and drinking zones of a research poultry facility.

2. Materials and Methods

2.1. Experimental Setup and Data Collection

This study was conducted in an experimental facility (Figure 1) at the Poultry Research Center at the University of Georgia, Athens, GA, USA. Six identical pens measuring 1.84 m (L) × 1.16 m (W) were used to raise Cobb 500 broiler chickens (21 broilers per pen) from d1 to d49 (from November 26, 2019 to January 14, 2020). Each pen was monitored with a high definition (HD) camera (PRO-1080MSFB, Swann Communications, Santa Fe Springs, CA) mounted on the ceiling (2.5 m above the floor) to capture video (15 frames/s at a resolution of 1440 × 1080 pixels). Videos were saved as AVI files in a video recorder (DVR-4580, Swann Communications, Santa Fe Springs, CA) (Figure 1). The collected videos were transferred to a data station in an office every three days. The files were later converted to images using MATLAB-R2019b.
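As an illustration of this conversion step, a minimal MATLAB sketch is given below; the file name and the one-frame-per-second sampling interval are assumptions for illustration, since the paper only states that MATLAB-R2019b was used for the conversion.

```matlab
% Minimal sketch: convert a recorded AVI file to still images with
% MATLAB's VideoReader. The file name and sampling rate are illustrative
% assumptions, not the authors' exact settings.
vid = VideoReader('pen1_d18.avi');      % hypothetical recording
frameStep = 15;                          % keep 1 frame/s at 15 frames/s
k = 0; idx = 0;
while hasFrame(vid)
    frame = readFrame(vid);
    k = k + 1;
    if mod(k, frameStep) == 0
        idx = idx + 1;
        imwrite(frame, sprintf('pen1_d18_%04d.png', idx));
    end
end
```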
For the machine vision-based method to recognize the bird distributions in the images, the pen floor was divided virtually into drinking, feeding, and rest/exercise zones (Figure 2). Broilers were raised antibiotic-free on reused litter made of pine shavings. Husbandry and management (e.g., feeding, drinking, lighting, bedding, and house air temperature) followed US industry standard protocols, and approval was obtained from the Institutional Animal Care and Use Committee (IACUC) at the University of Georgia.

2.2. Method for Target (Chicken) Detection

The target detection method in the current study was developed based on the color space classification tools in the MATLAB toolbox (MATLAB-R2019b, The MathWorks, Inc., Natick, MA). The toolbox provides a number of color space classification choices, such as L*a*b* (LAB) and pairwise combinations of the RGB (red, green, blue) channels: RG, RB, and GB. We compared the visualization effects of the different color space classification methods (Figure 3). The GB method was used in this study, as it (Figure 3d) had higher classification and visualization efficiencies (e.g., processing time and target extraction) than the other strategies.
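The exact GB computation is not spelled out in the text; the following minimal sketch assumes a simple per-pixel combination of the green and blue channels into a single grayscale map, against which the light-colored birds contrast with the darker litter.

```matlab
% Minimal sketch (assumed GB feature): combine the green and blue
% channels into one grayscale map for subsequent thresholding.
img = imread('pen1_d18_0001.png');   % hypothetical frame from above
G = double(img(:,:,2));
B = double(img(:,:,3));
gb = mat2gray(G + B);                % normalized green-blue intensity map
imshow(gb)
```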
Two-dimensional Otsu processing was applied to convert the original images into binary images [30]. Figure 2 (an original image) was used as an example to show the image processing steps: removal of the nontarget area (Figure 4a), generation of the binary image (Figure 4b), and further processing with morphological erosion to remove the background (Figure 4c). It can be observed that the nipple drinker pipe and the hanging chain of the tube feeder blocked parts of the top view of the chickens on the floor.
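A minimal sketch of the binarization and erosion steps follows, continuing from the GB map `gb` above. MATLAB's built-in graythresh implements the classical one-dimensional Otsu threshold; the two-dimensional variant of [30] would require custom code, so this is only an approximation of the authors' pipeline.

```matlab
% Minimal sketch: Otsu binarization followed by morphological erosion.
% graythresh is MATLAB's 1-D Otsu; the 2-D Otsu of [30] is not built in.
bw = imbinarize(gb, graythresh(gb));   % binary image of candidate birds
bw = imerode(bw, strel('disk', 2));    % erode to separate touching birds
bw = bwareaopen(bw, 50);               % remove small background specks
imshow(bw)
```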
In addition, the current method (i.e., the integration of GB color space and two-dimensional Otsu processing) was compared, in terms of image processing speed and visualization efficiency, to K-means [31] and Fuzzy C-Means (FCM) [32], two widely used classical clustering algorithms for identifying static or mobile targets. The K-means algorithm takes k as a parameter and divides n objects into k clusters such that similarity within a cluster is high while similarity between clusters is low [31]. FCM is a clustering method based on fuzzy sets which determines the degree to which each data point belongs to a cluster center via its membership value [32].
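For reference, the two baselines can be run on the same pixel data roughly as follows (kmeans from the Statistics and Machine Learning Toolbox, fcm from the Fuzzy Logic Toolbox); the choice of k = 2 and the bird/background label assignment are assumptions.

```matlab
% Minimal sketch of the K-means and FCM baselines on pixel intensities
% from the GB map `gb`; which cluster label corresponds to birds depends
% on the (arbitrary) cluster ordering and must be checked per image.
pix = gb(:);                           % one intensity feature per pixel
labKM = kmeans(pix, 2);                % K-means hard labels (1 or 2)
[~, U] = fcm(pix, 2);                  % FCM membership matrix (2 x N)
[~, labFCM] = max(U, [], 1);           % hard labels from memberships
bwKM  = reshape(labKM  == 2, size(gb));
bwFCM = reshape(labFCM' == 2, size(gb));
```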

2.3. Method for Counting Broiler Chickens

The top view area-based animal recognition method was used to determine the number of broilers in different zones on the floor [33]. For a machine vision-based method, recognizing chickens and counting them in an image relies on chicken profiles and their specific area sizes, because birds tend to congregate. The images collected each day were analyzed separately to avoid biases in the top view images caused by changes in body weight/size over time. Different chicken profiles were randomly selected from the images collected each day. The area of each chicken in the image was quantified first, and the average was then used as the day's reference area to estimate the number of chickens on the house floor, as expressed in Equation (1).
$$\bar{s}_i = \frac{s_i}{s}, \quad 1 \le i \le n \qquad (1)$$
where $\bar{s}_i$ is the normalized value of the ith area, $s_i$ is the area value of the ith area in the image, $s$ is the reference area of a single broiler, and $n$ is the number of areas detected in the image.
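A minimal sketch of this normalization, with regionprops supplying the detected region areas and an assumed single-bird reference area:

```matlab
% Minimal sketch of Equation (1): normalize each detected region's area
% by the day's single-bird reference area (value here is hypothetical).
stats = regionprops(bw, 'Area');   % areas of regions in the binary image
s_i   = [stats.Area];              % s_i: area of the i-th region
s_ref = 1800;                      % s: assumed single-bird area (pixels)
s_bar = s_i / s_ref;               % normalized areas per Equation (1)
```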
To determine the number of chickens in each zone, a specific BP neural network algorithm was developed and applied with reference to existing methods [29,34]. A BP network is a typical supervised neural network classifier which performs linear or nonlinear mapping from input to output and automatically extracts reasonable solving rules through learning; it has a certain generalization ability [25]. In the current study, the newly modified BP neural network model comprised an input layer (the area value of a chicken), a hidden layer, an output layer (the number of chickens), and node connections between the layers. On the MATLAB-R2019b software platform (MathWorks, Natick, MA), the feedforwardnet function was used to build the BP neural network.
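A minimal sketch of such a network is shown below; the hidden layer size and the training data are illustrative assumptions, and feedforwardnet trains with Levenberg-Marquardt back propagation by default rather than plain gradient descent.

```matlab
% Minimal sketch of the BP counting model: one input (normalized area),
% one hidden layer, one output (bird count). All numbers are illustrative.
areas  = [0.9 2.1 1.1 3.0 1.9 0.8];   % normalized region areas (inputs)
counts = [1   2   1   3   2   1  ];   % manually labeled bird counts
net = feedforwardnet(10);             % 10 hidden nodes (assumed)
net.divideParam.trainRatio = 0.70;    % split ratios from Section 3.2
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net  = train(net, areas, counts);     % Levenberg-Marquardt BP by default
pred = round(net(areas));             % predicted birds per region
```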
We manually selected high quality images collected during d18–d35 (1926 images in total) to train the model to identify broiler chickens in the drinking and feeding zones. An additional 196 randomly selected images from the same period were then used for verification. The quantification method is expressed in Equations (2)–(6). According to the zones defined in Figure 2, a chicken was considered to be in the drinking or feeding zone when more than 50% of its body fell within that zone.
$$D_{num} = \sum_{i=1}^{n} \mathrm{round}\left(\frac{A_{di}}{A_{sc}} \times N_i\right) \qquad (2)$$
$$F_{num} = \sum_{j=1}^{m} \mathrm{round}\left(\frac{A_{fj}}{A_{sc}} \times N_j\right) \qquad (3)$$
$$R_{accuracy} = \frac{T_{num}}{T_{truenum}} \qquad (4)$$
$$R_{miss} = \frac{T_{miss}}{T_{truenum}} \qquad (5)$$
$$R_{false} = \frac{T_{false}}{T_{truenum}} \qquad (6)$$
where $D_{num}$ is the number of broiler chickens detected in the drinking zone, $n$ is the number of extracted areas containing chickens in the drinking zone, $A_{di}$ is the size of the ith extracted area in the drinking zone, $A_{sc}$ is the standardized area size of a single chicken in the image, $N_i$ is the number of chicken-associated regions detected in $A_{di}$, $F_{num}$ is the number of broiler chickens detected in the feeding zone, $m$ is the number of extracted areas containing chickens in the feeding zone, $A_{fj}$ is the size of the jth extracted area in the feeding zone, $N_j$ is the number of chicken-associated regions detected in $A_{fj}$, $R_{accuracy}$ is the accuracy rate of the number of broilers in the drinking or feeding zone, $T_{num}$ is the number of broiler chickens detected automatically within the drinking or feeding zone, $T_{truenum}$ is the true number of broiler chickens in the drinking or feeding zone, $R_{miss}$ is the missed detection rate, $T_{miss}$ is the number of missed detections, $R_{false}$ is the false detection rate, and $T_{false}$ is the number of false detections.
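As a worked instance of Equations (4)–(6), the drinking-zone counts later reported in Table 2 reproduce the three rates:

```matlab
% Worked example of Equations (4)-(6) using the drinking-zone counts
% from Table 2: 671 true birds, 632 detected, 42 missed, 3 false.
Ttruenum = 671; Tnum = 632; Tmiss = 42; Tfalse = 3;
Raccuracy = Tnum   / Ttruenum;   % 0.9419
Rmiss     = Tmiss  / Ttruenum;   % 0.0626
Rfalse    = Tfalse / Ttruenum;   % 0.0045
```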

2.4. Evaluation Criteria and Statistical Analysis

In order to measure the calculated deviation (i.e., the difference between true and predicted results) and test the new BP model in chicken number determination, three test criteria, i.e., the correlation coefficient (R), mean square error (MSE), and mean absolute error (MAE), were applied [29].
$$R = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2} \qquad (7)$$
$$\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2 \qquad (8)$$
$$\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\frac{\left|y_i - \hat{y}_i\right|}{y_i} \qquad (9)$$
where $y_i$ is the actual number of broilers, $\hat{y}_i$ is the number obtained by model fitting, $\bar{y}$ is the average of the actual numbers, and $n$ is the number of samples.
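These criteria translate directly into code; the counts below are invented for illustration, and note that the MAE as written above is a relative form (error divided by $y_i$) rather than the conventional absolute-error mean.

```matlab
% Minimal sketch of the three test criteria; y and y_hat are hypothetical
% actual and fitted bird counts.
y     = [19 18 19 17 16];        % actual counts (illustrative)
y_hat = [19 18 18 17 15];        % model outputs (illustrative)
R   = 1 - sum((y - y_hat).^2) / sum((y - mean(y)).^2);
MSE = sum((y - y_hat).^2) / numel(y);
MAE = mean(abs(y - y_hat) ./ y); % relative form, as in the text
```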
A one-way ANOVA (MATLAB-R2019b) was used to test if there were significant differences in the detection speed of broiler chickens on the same image between the method developed in the current study (i.e., integration of GB Color Space and two-dimensional Otsu processing), K-means, and FCM. The effect was considered to be significant when the p-value was less than 0.05.
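A minimal sketch of this test with MATLAB's anova1 follows; the per-image times are simulated here from the d18 means and standard deviations later reported in Table 1, not real measurements.

```matlab
% Minimal sketch of the one-way ANOVA on per-image clustering times;
% columns are the three methods, rows the 30 test images (simulated).
t = [0.24 + 0.04*randn(30,1), ...     % this study
     5.17 + 0.69*randn(30,1), ...     % K-means
     16.79 + 1.26*randn(30,1)];       % FCM
p = anova1(t, {'This study','K-means','FCM'}, 'off');
significant = p < 0.05;               % significance criterion
```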

3. Results and Discussion

3.1. Individual Chicken Identification

Images collected on d18, d24, and d30 were randomly selected (30 images) to test the GB color space classification for chicken identification. Figure 5 shows the identification results (i.e., profile extraction of chickens) of the current method compared with the two existing methods, K-means and FCM. The current method extracted individual animals from the original images with a visualization efficiency (e.g., clearness and completeness of chicken areas) similar to the FCM method, and with a better visualization effect than the K-means method (p < 0.001). However, the current method required less time (clustering speed) than FCM (p < 0.001) (Table 1). As shown in Figure 5B, some target areas (i.e., chicken profiles) were lost during image processing with the K-means method.

3.2. Broiler Chicken BP Model Building/Training Results

To train the BP neural network model, video segments (6–8 min) from each hour of the recordings collected from d18 to d35 were used. In total, 1926 images were processed to extract 19,988 target chicken areas. Among these, 6896 target areas qualified as good (i.e., areas without overcrowding or occlusion) for building/training the broiler chicken BP network model. The ratios of vectors for model training, validation, and testing were 0.70, 0.15, and 0.15, respectively. The newly trained BP model had an MSE of 0.038, an MAE of 0.178, and an R of 0.996 (Figure 6).

3.3. Chicken Distribution Identification with BP Model

3.3.1. Total Chicken Numbers Identification

Figure 7 shows the number of broiler chickens automatically counted by the BP model. Originally, there were 21 chickens per pen, but two were sampled from each pen for health evaluation, leaving 19. The BP model correctly identified the chicken distribution at different ages by quantifying the number of chickens in each zone (Figure 7a,b). However, broiler chickens could be blocked in the top view images by equipment, including the feeder hanging chains and the water lines (Figure 7c,d).
The interference could be corrected during image background processing when only part of a body was blocked (e.g., a chicken partially blocked by the feeder chain was still recognizable, Figure 7a). The issue could not be corrected when a chicken was mostly obscured (>50% of its area); for example, one chicken was missed in each of Figure 7b,d. Interference caused by physical barriers could be addressed by using multiple cameras or a mobile camera. In addition, broiler chickens were observed spreading their wings and crowding together during data collection, which affected the accuracy of chicken counting (Figure 7c). For instance, there were only two broiler chickens in the bottom left area of Figure 7c, one of them spreading its wings, but they were misidentified as five chickens due to the expansion of the target area. Therefore, the chicken identification efficiency of this model needs to be improved before counting can be applied in a commercial setting, where thousands of birds are usually crowded together.

3.3.2. Chicken Distribution Identification in Drinking and Feeding Zones

Figure 8 shows the distribution of broiler chickens in the feeding and drinking zones identified by the BP model. For the same image, the model first analyzed the total number of chickens in the pen (Figure 8a) and then quantified their distribution in each zone (Figure 8b). From Figure 8b, we can ascertain that there were two birds in the drinking zone and 11 in the feeding zone. The rest (19 − 2 − 11 = 6) were considered to be in the rest/exercise zone.
Table 2 details the distribution analysis of 196 images randomly selected from images collected between d18 and d35 to verify the BP method. The identification accuracy rates for broiler chicken distribution in the drinking and feeding zones were 0.9419 and 0.9544, respectively. The missed detection rates were 0.0626 and 0.0498, respectively, primarily caused by chicken crowding behavior and occlusion by the feeder hanging chains and water system (Figure 8). The false detection rates were 0.0045 and 0.0037, respectively. The current method was developed using healthy chickens; in the future, the model will be applied to chickens raised under other conditions (e.g., disease, heat stress, etc.) in order to develop a real-time welfare-status evaluation capability.
It is a technical challenge to judge individual eating or drinking behavior using machine vision alone in commercial broiler houses with thousands of chickens. In the current study, we focused on floor distribution patterns (i.e., real-time counts of bird numbers in the feeding and drinking zones), as these are technically quantifiable and correlated with animal health and welfare: birds with underlying conditions such as lameness or a high gait score tend to be less active and stay closer to feeders/drinkers due to their limited mobility [10,13]. Our method differs from most existing methods; the "eYeNamic" system, for instance, evaluates a gait score based on a group activity index quantified from pixel changes in continuously recorded photos/videos. The broiler house is a physically challenging environment for vision-based monitoring due to equipment interference and dust [35,36], which usually lead to poor image quality and results. In the long term, we plan to develop a mobile imaging system equipped with a global positioning system (GPS) to track individual birds and assess their welfare/health based on their distribution patterns.

4. Conclusions

A machine vision-based method was developed and tested to identify broiler chicken floor distributions, including the total number of chickens on the house floor and their distribution in drinking and feeding zones. The following observations and conclusions were made from the current study:
(1) Advanced image processing techniques of GB color space and two-dimensional Otsu processing were integrated for image processing, which achieved a faster clustering speed than widely used existing methods (e.g., K-means and FCM) (p < 0.001);
(2) The BP neural network model developed to count the total number of birds on the floor and their distribution in the feeding and drinking zones had a correlation coefficient (R), mean square error (MSE), and mean absolute error (MAE) of 0.996, 0.038, and 0.178, respectively;
(3) The machine vision-based method was tested with accuracy rates of 0.9419 and 0.9544 in the drinking and feeding zones, respectively. The missed detections were primarily caused by facility interference such as feeder hanging chains and water lines in the chicken images. These issues can be addressed by using multiple cameras or a mobile imaging operation.
Work is ongoing to identify chicken head posture and distance from the feeder, drinker, and litter floor in order to evaluate real-time feeding, drinking, and foraging behaviors. The current findings provide the basis for the development of an automatic system to monitor poultry floor distribution and behavior in a commercial production system.

Author Contributions

Y.G., L.C., S.E.A., and A.O. conceived the research idea; Y.G., L.C., S.E.A., and A.O. developed the research methodology; S.E.A. and A.O. designed the poultry experiment; Y.G., L.C., S.E.A., A.O., and J.J. managed research resources; Y.G. developed the algorithm for data analysis; S.E.A. and J.J. managed the chickens; Y.G., L.C., S.E.A., A.O., J.J., and G.Z. collected the original data; Y.G. and L.C. analyzed the data; Y.G. and L.C. verified the results; Y.G., L.C., S.E.A., and A.O. wrote the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by USDA-ARS cooperative grants to the University of Georgia (58-6040-6-030 and 58-6040-8-034) and by USDA Agricultural Research Service project number 6040-32000-010-00-D.

Acknowledgments

This study was partially supported by cooperative grants 58-6040-6-030 (Lilong Chai) and 58-6040-8-034 (S. E. Aggrey) from the United States Department of Agriculture–Agricultural Research Service, and by an International Collaboration Research Grant (Office of Global Programs, College of Agricultural and Environmental Sciences, University of Georgia) awarded to Lilong Chai. Yangyang Guo thanks the China Scholarship Council for a scholarship to study at the University of Georgia, USA. The funders had no role in the study design and implementation, data collection and analysis, decision to publish, or preparation of this manuscript. We thank Marie Milfort for her technical assistance.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

  1. Ben Sassi, N.; Averós, X.; Estevez, I. Technology and poultry welfare. Animals 2016, 6, 62.
  2. Wang, K.Y.; Zhao, X.Y.; He, Y. Review on noninvasive monitoring technology of poultry behavior and physiological information. Trans. Chin. Soc. Agr. Eng. 2017, 33, 197–209.
  3. Li, N.; Ren, Z.; Li, D.; Zeng, L. Automated techniques for monitoring the behaviour and welfare of broilers and laying hens: Towards the goal of precision livestock farming. Animal 2020, 14, 617–625.
  4. Aydin, A.; Cangar, O.; Ozcan, S.E.; Bahr, C.; Berckmans, D. Application of a fully automatic analysis tool to assess the activity of broiler chickens with different gait scores. Comput. Electron. Agr. 2010, 73, 194–199.
  5. Porto, S.M.; Arcidiacono, C.; Anguzza, U.; Cascone, G. The automatic detection of dairy cow feeding and standing behaviours in free-stall barns by a computer vision-based system. Biosyst. Eng. 2015, 133, 46–55.
  6. Lao, F.D.; Teng, G.H.; Li, J.; Yu, L.; Li, Z. Behavior recognition method for individual laying hen based on computer vision. Trans. Chin. Soc. Agr. Eng. 2012, 28, 157–163.
  7. Lao, F.D.; Du, X.D.; Teng, G.H. Automatic recognition method of laying hen behaviors based on depth image processing. Trans. Chin. Soc. Agr. Eng. 2017, 48, 155–162.
  8. Guo, Y.; He, D.; Chai, L. A machine vision-based method for monitoring scene-interactive behaviors of dairy calf. Animals 2020, 10, 190.
  9. Pereira, D.F.; Miyamoto, B.C.; Maia, G.D.; Sales, G.T.; Magalhães, M.M.; Gates, R.S. Machine vision to identify broiler breeder behavior. Comput. Electron. Agr. 2013, 99, 194–199.
  10. Aydin, A. Development of an early detection system for lameness of broilers using computer vision. Comput. Electron. Agr. 2017, 136, 140–146.
  11. Guo, Y.; Zhang, Z.; He, D.; Niu, J.; Tan, Y. Detection of cow mounting behavior using region geometry and optical flow characteristics. Comput. Electron. Agr. 2019, 163, 104828.
  12. Viazzi, S.; Ismayilova, G.; Oczak, M.; Sonoda, L.T.; Fels, M.; Guarino, M.; Vranken, E.; Hartung, J.; Bahr, C.; Berckmans, D. Image feature extraction for classification of aggressive interactions among pigs. Comput. Electron. Agr. 2014, 104, 57–62.
  13. Aydin, A. Using 3D vision camera system to automatically assess the level of inactivity in broiler chickens. Comput. Electron. Agr. 2017, 135, 4–10.
  14. Lu, M.; Norton, T.; Youssef, A.; Radojkovic, N.; Fernández, A.P.; Berckmans, D. Extracting body surface dimensions from top-view images of pigs. Int. J. Agr. Biol. Eng. 2018, 11, 182–191.
  15. Mortensen, A.K.; Lisouski, P.; Ahrendt, P. Weight prediction of broiler chickens using 3D computer vision. Comput. Electron. Agr. 2016, 123, 319–326.
  16. Leroy, T.; Vranken, E.; Van Brecht, A.; Struelens, E.; Sonck, B.; Berckmans, D. A computer vision method for on-line behavioral quantification of individually caged poultry. Trans. ASABE 2006, 49, 795–802.
  17. Fernandez, A.P.; Norton, T.; Tullo, E.; van Hertem, T.; Youssef, A.; Exadaktylos, V.; Vranken, E.; Guarino, M.; Berckmans, D. Real-time monitoring of broiler flock's welfare status using camera-based technology. Biosyst. Eng. 2018, 173, 103–114.
  18. Pu, H.; Lian, J.; Fan, M. Automatic recognition of flock behavior of chickens with convolutional neural network and Kinect sensor. Int. J. Pattern Recogn. 2018, 32, 1850023.
  19. Dawkins, M.S.; Cain, R.; Merelie, K.; Roberts, S.J. In search of the behavioural correlates of optical flow patterns in the automated assessment of broiler chicken welfare. Appl. Anim. Behav. Sci. 2013, 145, 44–50.
  20. Li, L.; Zhao, Y.; Oliveira, J.; Verhoijsen, W.; Liu, K.; Xin, H. A UHF RFID system for studying individual feeding and nesting behaviors of group-housed laying hens. Trans. ASABE 2017, 60, 1337–1347.
  21. Nakarmi, A.D.; Tang, L.; Xin, H. Automated tracking and behavior quantification of laying hens using 3D computer vision and radio frequency identification technologies. Trans. ASABE 2014, 57, 1455–1472.
  22. Dawkins, M.S.; Lee, H.J.; Waitt, C.D.; Roberts, S.J. Optical flow patterns in broiler chicken flocks as automated measures of behaviour and gait. Appl. Anim. Behav. Sci. 2009, 119, 203–209.
  23. Dawkins, M.S.; Cain, R.; Roberts, S.J. Optical flow, flock behaviour and chicken welfare. Anim. Behav. 2012, 84, 219–223.
  24. Kashiha, M.; Pluk, A.; Bahr, C.; Vranken, E.; Berckmans, D. Development of an early warning system for a broiler house using computer vision. Biosyst. Eng. 2013, 116, 36–45.
  25. Li, J.; Cheng, J.H.; Shi, J.Y.; Huang, F. Brief introduction of back propagation (BP) neural network algorithm and its improvement. In Proceedings of the Advances in Computer Science and Information Engineering, Zhengzhou, China, 19–20 May 2012.
  26. Wang, C.Z.; Wang, D.; Zhang, H.H.; Zhang, Y. Research on pig's behavior recognition based on attitude angle. J. Yangzhou Univ. 2016, 37, 43–48.
  27. Wang, H.W.; Yin, Y.H.; Liu, Y.G. Apply BP neural network on synthesis evaluation of living pig. Microelectron. Comput. 2006, 12, 33.
  28. Zheng, L.M.; Tian, L.J.; Zhu, H.; Wang, W.; Yu, B.; Liu, Y.G.; Lin, Z.; Tang, Y. Study on pork grade evaluation of BP neural network based on MATLAB. Appl. Res. Comput. 2008, 25, 1642–1644.
  29. Wang, L.; Sun, C.H.; Li, W.Y.; Ji, Z.T.; Zhang, X.; Wang, Y.Z.; Lei, P.; Yang, X.T. Establishment of broiler quality estimation model based on depth image and BP neural network. Trans. Chin. Soc. Agr. Eng. 2017, 33, 199–205.
  30. Fan, J.L.; Zhao, F. Two-dimensional Otsu's curve thresholding segmentation method for gray-level images. Acta Electron. Sin. 2007, 35, 751.
  31. Ray, S.; Turi, R.H. Determination of number of clusters in k-means clustering and application in colour image segmentation. In Proceedings of the 4th International Conference on Advances in Pattern Recognition and Digital Techniques, New Delhi, India, 27–29 December 1999; pp. 137–143.
  32. Chuang, K.S.; Tzeng, H.L.; Chen, S.; Wu, J.; Chen, T.J. Fuzzy c-means clustering with spatial information for image segmentation. Comput. Med. Imag. Grap. 2006, 30, 9–15.
  33. Li, G.; Zhao, Y.; Chesser, G.D.; Lowe, J.W.; Purswell, J.L. Image processing for analyzing broiler feeding and drinking behaviors. In Proceedings of the 2019 ASABE Annual International Meeting, Boston, MA, USA, 7–10 July 2019.
  34. Xu, H.F.; Chen, H.Y.; Yuan, K. A BP neural network-based automatic windshield wiper controller. Adv. Mater. Res. 2012, 482, 31–34.
  35. Chai, L.; Xin, H.; Oliveira, J.; Wang, Y.; Wang, K.; Zhao, Y. Dust suppression and heat stress relief in cage-free hen housing. In Proceedings of the 10th International Livestock Environment Symposium (ILES X), Paper No. ILES18-012, Omaha, NE, USA, 25–27 September 2018.
  36. Chai, L.; Xin, H.; Wang, Y.; Oliveira, J.; Wang, K.; Zhao, Y. Mitigating particulate matter generations of a commercial cage-free henhouse. Trans. ASABE 2019, 62, 877–886.
Figure 1. Experimental setup for broiler chicken image data collection.
Figure 2. A top view of a pen and zone definition. The red box (1) represents the drinking area: the center of the nipple drinker is the center of the drinking area and its width is defined as one body length of a three-week-old broiler chicken; the yellow circle (2) represents the feeding area: the center of the tube feeder is the feeding area center; the radius of the feeding zone is the tube feeder radius plus the body length of a three-week-old broiler chicken; and the cyan box (3) represents the overall detection area of the pen, so any area not included in the drinking and feeding zones is considered the rest/exercise zone.
Figure 3. Comparison between different color spaces ((a) LAB, (b) RG, (c) RB, and (d) GB) in the classification process.
Figure 4. The image processing steps. (a) Nontarget area removal; (b) generation of binary image; (c) morphological erosion operation and background removal.
Figure 5. The visualization efficiency of three different methods (a, b, and c represent images taken on d18, d24 and d30, respectively).
Figure 6. Input–output correlation in the newly developed BP neural network model for identifying broiler chicken floor distribution (a–d correspond to the training set, validation set, test set, and overall results, respectively. The horizontal axis (Target) is the actual number of chickens; the vertical axis (Output) is the BP model output; "O" represents the input data of the model; "Fit" is the fitted relationship between input and output; "Y = T" means the output value equals the target value).
Figure 7. Number of chickens identified and their distribution determined with the newly developed BP model (chickens were three weeks old in (a,b) and four weeks old in (c,d); cyan rectangles represent target extraction zones of the BP method, and black rectangles represent missed target areas (chickens); yellow numbers indicate the true chicken count followed by the count recognized by the BP model).
Figure 8. Broiler chicken distribution in the feeding and drinking zones as identified by the BP model. (a) Total chickens tracked and identified; (b) chicken distribution in the drinking and feeding zones. The big red rectangle is the drinking zone, and the small red rectangles within it indicate detected broiler chickens. The yellow circle is the feeding zone, and the yellow rectangles within it indicate detected broiler chickens.
Table 1. Comparison between different methods in processing (clustering) speed for 30 images on different days.

Method       | Image Clustering Running Time (s, Mean ± SD, n = 30)
             | d18           | d24           | d30
K-means      | 5.17 ± 0.69   | 5.04 ± 0.66   | 5.31 ± 0.87
FCM 1        | 16.79 ± 1.26  | 14.93 ± 0.86  | 12.99 ± 0.81
This study 2 | 0.24 ± 0.04   | 0.26 ± 0.03   | 0.24 ± 0.04

1 FCM: Fuzzy C-Means. 2 The integration of GB color space and two-dimensional Otsu processing.
Table 2. Test of the BP model on 196 images on identification of chicken distribution in the feeding/drinking zones.

Zone     | True Chickens 1 | Detected Chickens 2 | Missed (Crowding) 3 | Missed (Occlusion) 3 | Missed (Others) 3 | False Detections | Rac    | Rmiss  | Rfalse
Drinking | 671             | 632                 | 8                   | 32                   | 2                 | 3                | 0.9419 | 0.0626 | 0.0045
Feeding  | 823             | 785                 | 8                   | 26                   | 7                 | 3                | 0.9544 | 0.0498 | 0.0037

1 True chicken numbers were obtained manually by inspecting the 196 images; 2 detected chicken numbers were obtained by applying the newly developed BP model to analyze the 196 images automatically; 3 missed detections were caused by multiple factors such as the crowding of chickens and occlusion by equipment such as feeder hanging chains and water lines. Rac: identification accuracy rate of the number of broilers in the drinking/feeding zone; Rmiss: missed detection rate; Rfalse: false detection rate.
