Automated Monitoring of Livestock and Poultry with Machine Learning Technology, Volume II

A special issue of Animals (ISSN 2076-2615). This special issue belongs to the section "Animal System and Management".

Deadline for manuscript submissions: 15 August 2024

Special Issue Editors

Guest Editor
Department of Poultry Science, University of Georgia, Athens, GA, USA
Interests: precision livestock farming; animal welfare and behavior; smart sensing; applied artificial intelligence

Guest Editor
Biosystems Engineering Department, Tarbiat Modares University, Tehran 14117-13116, Iran
Interests: machine vision; advanced and intelligent agricultural systems; application of non-destructive measurements; thermography

Guest Editor
Department of Poultry Science, Tarbiat Modares University, Tehran 14115-336, Iran
Interests: artificial intelligence to understand livestock; analysis of agriculture-animal systems (network, optimization, data science)

Special Issue Information

Dear Colleagues,

This is the extended second volume of the Special Issue “Automated Monitoring of Livestock and Poultry with Machine Learning Technology”. The first volume collected 12 publications covering a wide spectrum of the most recent research in precision livestock and poultry farming, spanning a wide range of agricultural animals, including laying hens, broilers, pigs, sheep, beef cattle, and horses.

The livestock and poultry industry, which supplies daily animal protein for humans, continues to grow through improved genetics, nutrition, stewardship, and welfare, with the aims of enhancing production efficiency; securing food safety, sufficiency, and sustainability; and feeding the growing human population. Maintaining and improving contemporary intensive production systems requires substantial natural and human resources and can greatly affect the economy, public health, the environment, and society. Automated monitoring encompasses the development and application of continuous, objective, and supportive sensing technologies and computer tools for sustainable and efficient animal production. Recent advances in computer hardware and machine learning modelling boost the performance of automated monitoring; they can assist producers in management decisions and provide early detection and prevention of disease and production inefficiencies. Automated monitoring with machine learning technology offers the animal industry solutions to challenges in precision/smart management, environment, nutrition, genetics, big data analytics, real-time monitoring, automation and robotics, welfare assessment, animal tracking, individual identification, behaviour recognition, and related areas.

We are pleased to invite original research and review papers from all over the globe. Contributing papers are expected to address advances in the production efficiency, safety, and sustainability of the animal industry and to explore the abovementioned areas through the development and application of automated monitoring with machine learning technology.

Dr. Guoming Li
Dr. Ahmad Banakar
Dr. Hamed Ahmadi
Dr. Lilong Chai
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Animals is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • artificial intelligence
  • modelling
  • precision livestock farming
  • machine learning
  • deep learning
  • metaheuristic optimization
  • IoT
  • sensor
  • computer vision
  • big data
  • robotics

Published Papers (4 papers)


Research

19 pages, 3281 KiB  
Article
An Integrated Gather-and-Distribute Mechanism and Attention-Enhanced Deformable Convolution Model for Pig Behavior Recognition
by Rui Mao, Dongzhen Shen, Ruiqi Wang, Yiming Cui, Yufan Hu, Mei Li and Meili Wang
Animals 2024, 14(9), 1316; https://doi.org/10.3390/ani14091316 - 27 Apr 2024
Abstract
The behavior of pigs is intricately tied to their health status, highlighting the critical importance of accurately recognizing pig behavior, particularly abnormal behavior, for effective health monitoring and management. This study addresses the challenge of accommodating frequent non-rigid deformations in pig behavior using deformable convolutional networks (DCN) to extract more comprehensive features by incorporating offsets during training. To overcome the inherent limitations of traditional DCN offset weight calculations, the study introduces the multi-path coordinate attention (MPCA) mechanism to enhance the optimization of the DCN offset weight calculation within the designed DCN-MPCA module, further integrated into the cross-scale cross-feature (C2f) module of the backbone network. This optimized C2f-DM module significantly enhances feature extraction capabilities. Additionally, a gather-and-distribute (GD) mechanism is employed in the neck to improve non-adjacent layer feature fusion in the YOLOv8 network. Consequently, the novel DM-GD-YOLO model proposed in this study is evaluated on a self-built dataset comprising 11,999 images obtained from an online monitoring platform focusing on pigs aged between 70 and 150 days. The results show that DM-GD-YOLO can simultaneously recognize four common behaviors and three abnormal behaviors, achieving a precision of 88.2%, recall of 92.2%, and mean average precision (mAP) of 95.3% with 6.0 MB of parameters and 10.0 G FLOPs. Overall, the model outperforms popular models such as Faster R-CNN, EfficientDet, YOLOv7, and YOLOv8 in monitoring pens with about 30 pigs, providing technical support for the intelligent management and welfare-focused breeding of pigs while advancing the transformation and modernization of the pig industry.
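The precision, recall, and mAP figures reported above all rest on matching predicted boxes to ground-truth boxes by intersection over union (IoU). As a minimal illustration of that matching step (not code from the paper), IoU for axis-aligned boxes can be computed as:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A detection counts as a true positive when its IoU with a ground-truth
# box of the same behavior class exceeds a threshold (commonly 0.5).
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
```

Precision and recall then follow from the true-positive/false-positive tallies at each confidence level, and mAP averages the resulting precision-recall curves over classes.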

30 pages, 12906 KiB  
Article
Research on Dynamic Pig Counting Method Based on Improved YOLOv7 Combined with DeepSORT
by Xiaobao Shao, Chengcheng Liu, Zhixuan Zhou, Wenjing Xue, Guoye Zhang, Jianyu Liu and Hongwen Yan
Animals 2024, 14(8), 1227; https://doi.org/10.3390/ani14081227 - 19 Apr 2024
Abstract
A pig inventory is a crucial component of achieving precise and large-scale farming. In complex pigsty environments, due to pigs’ stress reactions and frequent obstructions, it is challenging to count them accurately and automatically. This difficulty contrasts with most current deep learning studies, which rely on overhead views or static images for counting. This research proposes a video-based dynamic counting method, combining YOLOv7 with DeepSORT. By utilizing the YOLOv7 network structure and optimizing the second and third 3 × 3 convolution operations in the head network ELAN-W with PConv, the model reduces the computational demand and improves the inference speed without sacrificing accuracy. To ensure that the network acquires accurate position perception information at oblique angles and extracts rich semantic information, we introduce the coordinate attention (CA) mechanism before the three re-parameterized convolution (RepConv) paths in the head network, enhancing robustness in complex scenarios. Experimental results show that, compared to the original model, the improved model increases the mAP by 3.24, 0.05, and 1.00 percentage points for oblique, overhead, and all pig counting datasets, respectively, while reducing the computational cost by 3.6 GFLOPS. The enhanced YOLOv7 outperforms YOLOv5, YOLOv4, YOLOv3, Faster RCNN, and SSD in target detection with mAP improvements of 2.07, 5.20, 2.16, 7.05, and 19.73 percentage points, respectively. In dynamic counting experiments, the improved YOLOv7 combined with DeepSORT was tested on videos with total pig counts of 144, 201, 285, and 295, yielding errors of -3, -3, -4, and -26, respectively, with an average accuracy of 96.58% and a frame rate of 22 FPS. This demonstrates the model’s capability of performing the real-time counting of pigs in various scenes, providing valuable data and references for automated pig counting research.
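The dynamic-counting figures above can be sanity-checked with a short calculation. Assuming per-video accuracy is defined as 1 − |error| / true count (the paper's exact formula is not stated here), the reported errors yield an average close to the stated 96.58%:

```python
# Reproducing the counting-accuracy arithmetic from the reported figures.
# The per-video accuracy formula below is an assumption; the paper's exact
# definition may differ slightly, so the average here (~96.55%) only
# approximates the reported 96.58%.
true_counts = [144, 201, 285, 295]
errors = [-3, -3, -4, -26]

accuracies = [1 - abs(e) / n for e, n in zip(errors, true_counts)]
average = sum(accuracies) / len(accuracies)
print(f"average accuracy: {average:.2%}")
```

Note how the last video (error −26 of 295) dominates the drop in average accuracy, which is consistent with the abstract's emphasis on occlusion-heavy scenes being the hard case.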

18 pages, 10712 KiB  
Article
Improved YOLOv8 Model for Lightweight Pigeon Egg Detection
by Tao Jiang, Jie Zhou, Binbin Xie, Longshen Liu, Chengyue Ji, Yao Liu, Binghan Liu and Bo Zhang
Animals 2024, 14(8), 1226; https://doi.org/10.3390/ani14081226 - 19 Apr 2024
Abstract
In response to the high breakage rate of pigeon eggs and the significant labor costs associated with egg-producing pigeon farming, this study proposes an improved YOLOv8-PG (real versus fake pigeon egg detection) model based on YOLOv8n. Specifically, the Bottlenecks in the C2f modules of the YOLOv8n backbone and neck networks are replaced with the Fasternet-EMA Block and the Fasternet Block, respectively. The Fasternet Block is designed based on PConv (Partial Convolution) to reduce model parameter count and computational load efficiently. Furthermore, the incorporation of the EMA (Efficient Multi-scale Attention) mechanism helps mitigate interference from complex environments on pigeon-egg feature-extraction capabilities. Additionally, Dysample, an ultra-lightweight and effective upsampler, is introduced into the neck network to further enhance performance with lower computational overhead. Finally, the EXPMA (exponential moving average) concept is employed to optimize the SlideLoss and propose the EMASlideLoss classification loss function, addressing the issue of imbalanced data samples and enhancing the model’s robustness. The experimental results showed that the F1-score, mAP50-95, and mAP75 of YOLOv8-PG increased by 0.76%, 1.56%, and 4.45%, respectively, compared with the baseline YOLOv8n model. Moreover, the model’s parameter count and computational load are reduced by 24.69% and 22.89%, respectively. Compared to detection models such as Faster R-CNN, YOLOv5s, YOLOv7, and YOLOv8s, YOLOv8-PG exhibits superior performance. Additionally, the reduction in parameter count and computational load contributes to lowering the model deployment costs and facilitates its implementation on mobile robotic platforms.
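The EXPMA idea behind EMASlideLoss is the standard exponential moving average: a per-batch statistic (for example, the mean IoU that SlideLoss uses as its sliding threshold) is smoothed so the threshold adapts gradually rather than jumping with every batch. A minimal sketch, with an illustrative decay value and variable names not taken from the paper:

```python
# Exponential moving average: new estimate = decay * old + (1 - decay) * observation.
# High decay (close to 1) means the smoothed statistic changes slowly.
def ema_update(ema, value, decay=0.99):
    return decay * ema + (1 - decay) * value

ema = 0.5  # illustrative initial estimate of the batch-mean IoU
for batch_mean_iou in [0.52, 0.55, 0.54, 0.58]:
    ema = ema_update(ema, batch_mean_iou)
print(round(ema, 4))  # drifts slowly upward from 0.5
```

With the threshold smoothed this way, a single noisy batch cannot abruptly reclassify large numbers of samples as easy or hard, which is plausibly how the technique improves robustness to imbalanced data.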

13 pages, 2183 KiB  
Article
Deep Learning Methods for Tracking the Locomotion of Individual Chickens
by Xiao Yang, Ramesh Bahadur Bist, Bidur Paneru and Lilong Chai
Animals 2024, 14(6), 911; https://doi.org/10.3390/ani14060911 - 15 Mar 2024
Abstract
Poultry locomotion is an important indicator of animal health, welfare, and productivity. Traditional methodologies such as manual observation or the use of wearable devices encounter significant challenges, including potential stress induction and behavioral alteration in animals. This research introduced an innovative approach that employs an enhanced track anything model (TAM) to track chickens in various experimental settings for locomotion analysis. Utilizing a dataset comprising both dyed and undyed broilers and layers, the TAM model was adapted and rigorously evaluated for its capability in non-intrusively tracking and analyzing poultry movement using the mean intersection over union (mIoU) and the root mean square error (RMSE). The findings underscore TAM’s superior segmentation and tracking capabilities, particularly its exemplary performance against other state-of-the-art models, such as the YOLO (you only look once) models YOLOv5 and YOLOv8, and its high mIoU values (93.12%) across diverse chicken categories. Moreover, the model demonstrated notable accuracy in speed detection, as evidenced by an RMSE value of 0.02 m/s, offering a technologically advanced, consistent, and non-intrusive method for tracking and estimating the locomotion speed of chickens. This research not only substantiates TAM as a potent tool for detailed poultry behavior analysis and monitoring but also illuminates its potential applicability in broader livestock monitoring scenarios, thereby contributing to the enhancement of animal welfare and management in poultry farming through automated, non-intrusive monitoring and analysis.
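Once a model such as TAM yields a per-frame segmentation, locomotion speed follows from the displacement of the tracked centroid between frames, and the RMSE quoted above scores those estimates against a reference. A minimal sketch of that downstream arithmetic (the positions and frame interval are made-up numbers, not data from the study):

```python
import math

# Estimate per-interval locomotion speed (m/s) from a tracked centroid,
# then score the estimates with RMSE, as in the abstract's evaluation.
def speeds(centroids, dt):
    """Speeds between consecutive (x, y) positions in metres, dt seconds apart."""
    return [math.dist(a, b) / dt for a, b in zip(centroids, centroids[1:])]

def rmse(pred, true):
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(pred))

track = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.1)]  # hypothetical centroid track
est = speeds(track, dt=0.5)
print(rmse(est, [0.2, 0.3]))  # error against a hypothetical reference
```

In practice, the pixel-to-metre conversion and frame interval come from camera calibration and the video frame rate; both are omitted here for brevity.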
