Accuracy vs. Energy: An Assessment of Bee Object Inference in Videos from On-Hive Video Loggers with YOLOv3, YOLOv4-Tiny, and YOLOv7-Tiny

Sensors (Basel). 2023 Jul 29;23(15):6791. doi: 10.3390/s23156791.

Abstract

A continuing trend in precision apiculture is to use computer vision methods to quantify characteristics of bee traffic in managed colonies at the hive's entrance. Since traffic at the hive's entrance is a contributing factor to the hive's productivity and health, we assessed the potential of three open-source convolutional network models, YOLOv3, YOLOv4-tiny, and YOLOv7-tiny, to quantify omnidirectional traffic in videos from on-hive video loggers mounted on regular, unmodified one- and two-super Langstroth hives, and compared their accuracies, energy efficacies, and operational energy footprints. We trained and tested the models with a 70/30 split on a dataset of 23,173 flying bees manually labeled in 5819 images from 10 randomly selected videos, and we manually evaluated the trained models on 3600 images from 120 randomly selected videos from different apiaries, years, and queen races. We designed a new energy efficacy metric as the ratio of the performance units a model delivers to the energy units required to make the model operational in a continuous hive monitoring data pipeline. In terms of accuracy, YOLOv3 ranked first, YOLOv7-tiny second, and YOLOv4-tiny third. All models underestimated the true amount of traffic because of false negatives. YOLOv3 was the only model with no false positives, but it had the lowest energy efficacy and the highest operational energy footprint in a deployed hive monitoring data pipeline; YOLOv7-tiny had the highest energy efficacy and the lowest operational energy footprint in the same pipeline. Consequently, YOLOv7-tiny is worth considering for training on larger bee datasets if a primary objective is the discovery of non-invasive computer vision models for traffic quantification with higher energy efficacies and lower operational energy footprints.
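The abstract characterizes the energy efficacy metric only as a ratio of performance units per unit of energy; the exact performance measure and energy units are defined in the full paper. As a minimal sketch of that ratio form, with P(m) and E(m) used here as placeholder symbols for the performance units attributed to a model m and the energy required to make m operational in the continuous hive monitoring data pipeline:

\[
  \mathrm{EE}(m) = \frac{P(m)}{E(m)}
\]

Under this form, a higher value means more quantification performance delivered per unit of energy consumed, which is how YOLOv7-tiny can rank highest in energy efficacy despite YOLOv3 being the most accurate model.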

Keywords: Apis mellifera; YOLO; artificial intelligence; computer vision; deep learning; energy efficacy; hive monitoring; honey bee; power efficiency; precision apiculture; precision pollination.

MeSH terms

  • Animals
  • Beekeeping*
  • Bees
  • Physical Phenomena
  • Urticaria*

Grants and funding

All hardware, woodenware, and bee packages used in this research were partially funded by three open science hive monitoring fundraisers on www.kickstarter.com (accessed on 5 June 2023) (see supplementary materials for details) and partially funded by the first author. All software used in this research (i.e., operating systems, compilers, interpreters, word processors, packages, and libraries) is open source, for which Utah State University paid no license fees. All power meters used in this investigation were self-funded by the first author.