Real-Time Food Intake Monitoring Using Wearable Egocentric Camera

Annu Int Conf IEEE Eng Med Biol Soc. 2020 Jul;2020:4191-4195. doi: 10.1109/EMBC44109.2020.9175497.

Abstract

With technological advancement, wearable egocentric camera systems have been studied extensively for food intake monitoring and the assessment of eating behavior. This paper provides a detailed description of the implementation of a CNN-based image classifier on a Cortex-M7 microcontroller. The proposed network classifies images captured by the wearable egocentric camera as food or no-food images in real time. Real-time food image detection can allow monitoring devices to consume less power, require less storage, and better preserve user privacy by saving only images detected as food images. A derivative of the pre-trained MobileNet is trained to detect food images among the camera-captured images. The proposed network requires 761.99 KB of flash and 501.76 KB of RAM, reflecting a trade-off between accuracy, computational cost, and memory footprint chosen for implementation on a Cortex-M7 microcontroller. The image classifier achieved an average precision of 82%±3% and an average F-score of 74%±2% when tested on 15,343 images (2,127 food images and 13,216 no-food images) spanning five full days collected from five participants.
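The sub-megabyte flash and RAM budget reported above is plausible largely because MobileNet replaces standard convolutions with depthwise-separable convolutions. A minimal sketch of that cost arithmetic is below; the kernel size and channel counts are illustrative assumptions, not figures from the paper:

```python
def standard_conv_cost(k, c_in, c_out, h, w):
    """Parameters and multiply-accumulates (MACs) for a standard k x k convolution."""
    params = k * k * c_in * c_out
    macs = params * h * w  # the kernel is applied at every output position
    return params, macs

def depthwise_separable_cost(k, c_in, c_out, h, w):
    """Parameters and MACs for a depthwise k x k conv followed by a 1x1 pointwise conv."""
    params = k * k * c_in + c_in * c_out
    macs = params * h * w
    return params, macs

# Illustrative layer: 3x3 kernel, 32 -> 64 channels on a 56x56 feature map
std_p, _ = standard_conv_cost(3, 32, 64, 56, 56)
sep_p, _ = depthwise_separable_cost(3, 32, 64, 56, 56)
print(std_p, sep_p, round(std_p / sep_p, 1))  # 18432 2336 7.9
```

The reduction factor, roughly 1/c_out + 1/k² in both parameters and MACs, is what lets a MobileNet derivative fit the Cortex-M7's limited flash and RAM.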

MeSH terms

  • Data Collection
  • Eating
  • Feeding Behavior*
  • Food
  • Humans
  • Wearable Electronic Devices*