An Efficient and Effective Framework for Intestinal Parasite Egg Detection Using YOLOv5

Diagnostics (Basel). 2023 Sep 18;13(18):2978. doi: 10.3390/diagnostics13182978.

Abstract

Intestinal parasitic infections pose a grave threat to human health, particularly in tropical and subtropical regions. Traditional manual microscopy remains the gold-standard procedure for diagnosing intestinal parasite cysts or eggs, but it is costly, time-consuming (roughly 30 min per sample), highly tedious, and requires a specialist. Computer vision based on deep learning, by contrast, has made great strides in recent years. Despite significant advances in deep convolutional neural network architectures, little research has explored their potential in parasitology, specifically for intestinal parasites. This research presents a state-of-the-art transfer-learning framework for the detection and classification of intestinal parasite eggs from microscopy images, with the ultimate goal of ensuring prompt treatment for patients while alleviating the burden on experts. Our approach comprised two main stages: image pre-processing and augmentation in the first stage, and YOLOv5-based detection and classification in the second, followed by a performance comparison of the model variants across different parameters. Remarkably, our algorithms achieved a mean average precision of approximately 97% and a detection time of only 8.5 ms per sample on a dataset of 5393 intestinal parasite images. This approach holds tremendous potential to form a solid theoretical basis for real-time detection and classification in routine clinical examinations, addressing increasing demand and accelerating the diagnostic process. Our research contributes to the development of efficient and accurate technologies for intestinal parasite egg detection, advancing the field of medical imaging and diagnosis.

Keywords: CNN; YOLOv5; intestinal parasites; transfer learning.
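
For illustration only, the following is a minimal sketch of how a YOLOv5 detector of the kind described in the abstract could be loaded and run for inference via PyTorch Hub. The weights file name (best.pt), image file name, and confidence threshold are assumptions for the example and are not taken from the paper.

import torch

# Load a custom-trained YOLOv5 model through PyTorch Hub
# (the weights path 'best.pt' is hypothetical).
model = torch.hub.load('ultralytics/yolov5', 'custom', path='best.pt')
model.conf = 0.25  # minimum confidence for reported detections (assumed value)

# Run inference on a single micrograph (file name is illustrative).
results = model('parasite_egg_sample.jpg')
results.print()  # summary: detected classes, counts, and per-image inference time

# Bounding boxes, confidences, and class labels as a pandas DataFrame.
detections = results.pandas().xyxy[0]
print(detections)

In practice, a model trained on the authors' annotated egg images would be exported and loaded in this way to obtain per-sample detections within milliseconds, consistent with the reported 8.5 ms inference time.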

Grants and funding

The authors extend their appreciation to the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia, for funding this research through project number IFP-IMSIU-2023063. The authors also appreciate the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University (IMSIU) for supporting and supervising this project.