A Computer Vision-Based Automatic System for Egg Grading and Defect Detection

Animals (Basel). 2023 Jul 19;13(14):2354. doi: 10.3390/ani13142354.

Abstract

Defective eggs diminish the value of laying hen production, particularly in cage-free systems with a higher incidence of floor eggs. To enhance quality, machine vision and image processing have facilitated the development of automated grading and defect detection systems. Additionally, egg measurement systems utilize weight-sorting for optimal market value. However, few studies have integrated deep learning and machine vision techniques for combined egg classification and weighing. To address this gap, a two-stage model was developed based on real-time multitask detection (RTMDet) and Random Forest networks to predict egg category and weight. The model combines convolutional neural network (CNN) and regression techniques to perform joint egg classification and weighing. RTMDet was used to sort and extract egg features for classification, and a Random Forest algorithm was used to predict egg weight based on the extracted features (major axis and minor axis). The results of the study showed that the best achieved accuracy was 94.8% and the best R² was 96.0%. In addition, the model can be used to automatically exclude non-standard-size eggs and eggs with exterior issues (e.g., calcium deposits, stains, and cracks). This detector is among the first models to jointly sort and weigh eggs; it classifies them into five categories (intact, crack, bloody, floor, and non-standard) and measures sizes up to jumbo. By implementing the findings of this study, the poultry industry can reduce costs and increase productivity, ultimately leading to better-quality products for consumers.
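To make the two-stage idea concrete, the sketch below illustrates the general workflow under stated assumptions; it is not the authors' code. Stage 1 (the RTMDet detector) is stood in for by a hypothetical binary egg mask, from which the major and minor axes are measured with OpenCV's ellipse fit; stage 2 regresses weight from those axes with scikit-learn's RandomForestRegressor. The training arrays are illustrative placeholders, not data from the study.

```python
# Minimal sketch of the paper's two-stage pipeline (assumptions noted above):
# stage 1 yields an egg mask; we measure major/minor axes from its contour,
# stage 2 predicts weight from those axes with a Random Forest regressor.
import cv2
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def egg_axes(mask: np.ndarray) -> tuple[float, float]:
    """Return (major_axis, minor_axis) in pixels from a binary egg mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)
    (_, _), (d1, d2), _ = cv2.fitEllipse(largest)  # fitted ellipse axis lengths
    return max(d1, d2), min(d1, d2)


# Hypothetical training data: [major, minor] axes (px) and ground-truth weights (g).
X_train = np.array([[56.0, 42.0], [60.5, 44.1], [63.2, 45.8], [58.4, 43.0]])
y_train = np.array([52.3, 58.9, 64.1, 55.6])

weight_model = RandomForestRegressor(n_estimators=100, random_state=0)
weight_model.fit(X_train, y_train)

# Stand-in for a detector output: a synthetic elliptical egg mask.
new_mask = np.zeros((120, 120), dtype=np.uint8)
cv2.ellipse(new_mask, (60, 60), (31, 23), 0, 0, 360, 255, -1)
major, minor = egg_axes(new_mask)
print(weight_model.predict([[major, minor]]))  # estimated weight in grams
```

In practice the axis features would come from the detector's per-egg masks or boxes rather than a synthetic mask, and the regressor would be trained on scale-measured weights; the choice of Random Forest mirrors the paper's second stage, while the mask-handling details here are assumptions for illustration.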

Keywords: deep learning; defect detection; egg quality; egg weight; laying hen production.

Grants and funding

The study was sponsored by the USDA-NIFA AFRI (2023-68008-39853), Egg Industry Center; Georgia Research Alliance (Venture Fund); Oracle America (Oracle for Research Grant, CPQ-2060433); University of Georgia (UGA) CAES Dean’s Office Research Fund; UGA Rural Engagement Seed Grant & UGA Global Engagement fund.