Hierarchical Approach to Classify Food Scenes in Egocentric Photo-Streams

IEEE J Biomed Health Inform. 2020 Mar;24(3):866-877. doi: 10.1109/JBHI.2019.2922390. Epub 2019 Jun 12.

Abstract

Recent studies have shown that the environment where people eat can affect their nutritional behavior [1]. In this paper, we provide automatic tools for personalized analysis of a person's health habits through the examination of daily recorded egocentric photo-streams. Specifically, we propose a new automatic approach for the classification of food-related environments that is able to distinguish up to 15 such scenes. In this way, people can monitor the context around their food intake and get an objective insight into their daily eating routine. We propose a model that classifies food-related scenes organized in a semantic hierarchy. Additionally, we present and make available a new egocentric dataset composed of more than 33 000 images recorded by a wearable camera, on which our proposed model has been tested. Our approach obtains an accuracy and F-score of 56% and 65%, respectively, clearly outperforming the baseline methods.
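The abstract's core idea of classifying scenes through a semantic hierarchy can be sketched as follows. This is a minimal illustration, not the paper's method: the two-level hierarchy, the scene names, and the top-down decision rule below are all assumptions made for the example, and the paper's actual taxonomy covers 15 food-related categories.

```python
# Illustrative sketch: leaf-class probabilities from a flat scene
# classifier are aggregated up a (hypothetical) semantic hierarchy,
# and the final label is chosen top-down: parent first, then child.

# Hypothetical two-level hierarchy of food-related scenes
# (invented for illustration; not the paper's 15-class taxonomy).
HIERARCHY = {
    "eating": ["restaurant", "coffee_shop", "picnic_area"],
    "preparing": ["kitchen", "supermarket"],
}

def classify_hierarchical(leaf_probs):
    """Pick the most likely parent node (sum of its children's
    probabilities), then the best child within that parent."""
    parent_scores = {
        parent: sum(leaf_probs.get(leaf, 0.0) for leaf in leaves)
        for parent, leaves in HIERARCHY.items()
    }
    best_parent = max(parent_scores, key=parent_scores.get)
    best_leaf = max(HIERARCHY[best_parent],
                    key=lambda leaf: leaf_probs.get(leaf, 0.0))
    return best_parent, best_leaf

# Example softmax-like scores over the leaf scenes.
probs = {"restaurant": 0.30, "coffee_shop": 0.25, "picnic_area": 0.05,
         "kitchen": 0.35, "supermarket": 0.05}
parent, leaf = classify_hierarchical(probs)
```

Note how the hierarchy changes the decision: a flat argmax over `probs` would pick "kitchen" (0.35), whereas aggregating first selects the "eating" branch (0.60 total mass) and then "restaurant" within it.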

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Food / classification*
  • Humans
  • Image Processing, Computer-Assisted / methods*
  • Life Style
  • Machine Learning
  • Photography / classification*