SenseHunger: Machine Learning Approach to Hunger Detection Using Wearable Sensors

Sensors (Basel). 2022 Oct 11;22(20):7711. doi: 10.3390/s22207711.

Abstract

The perception of hunger and satiety is of great importance for maintaining a healthy body weight and avoiding chronic conditions such as obesity, underweight, or deficiency syndromes due to malnutrition. A number of disease patterns are characterized by a chronic loss of this perception. To the best of our knowledge, hunger and satiety have not previously been classified using non-invasive measurements. Aiming to develop an objective classification system, this paper presents a multimodal sensory system with associated signal processing and pattern recognition methods for hunger and satiety detection based on non-invasive monitoring. We used an Empatica E4 smartwatch, a RespiBan wearable device, and JINS MEME smart glasses to capture physiological signals from five healthy, normal-weight subjects sitting passively in a chair in states of hunger and satiety. After pre-processing the signals, we compared different feature extraction approaches, based either on manual feature engineering or on deep feature learning. Comparative experiments were carried out to determine the most appropriate sensor channel, device, and classifier for reliably discriminating between hunger and satiety states. Our experiments showed that the most discriminative features come from three sensor modalities: electrodermal activity (EDA), infrared thermopile (Tmp), and blood volume pulse (BVP).
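The abstract outlines a pipeline of signal windowing, feature extraction, classification, and subject-wise evaluation. The Python sketch below illustrates the general shape of such a pipeline; it is not the authors' implementation. The 30 s window length, the hand-crafted time-domain feature set, the random-forest classifier, the leave-one-subject-out evaluation, and the placeholder signal, labels, and subject IDs are all illustrative assumptions (the 4 Hz rate reflects the Empatica E4's EDA channel).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def window_signal(signal, fs, win_s=30, overlap=0.5):
    """Slice a 1-D physiological signal into fixed-length, overlapping windows."""
    length = int(fs * win_s)
    step = int(length * (1 - overlap))
    starts = range(0, len(signal) - length + 1, step)
    return np.stack([signal[s:s + length] for s in starts])


def handcrafted_features(windows):
    """Simple time-domain statistics per window: mean, std, linear slope, range."""
    t = np.arange(windows.shape[1])
    slope = np.polyfit(t, windows.T, 1)[0]  # linear trend of each window
    return np.column_stack([
        windows.mean(axis=1),
        windows.std(axis=1),
        slope,
        windows.max(axis=1) - windows.min(axis=1),
    ])


# Placeholder data standing in for a single sensor channel (e.g. EDA) recorded
# from five subjects; labels 0/1 stand in for hunger/satiety annotations.
rng = np.random.default_rng(0)
fs = 4                                     # assumed EDA sampling rate (Hz)
eda = rng.normal(size=5 * 600 * fs)        # placeholder: 5 subjects x 10 min
X = handcrafted_features(window_signal(eda, fs))
y = rng.integers(0, 2, size=len(X))        # placeholder hunger/satiety labels
groups = rng.integers(0, 5, size=len(X))   # placeholder subject IDs

# Subject-wise (leave-one-subject-out) evaluation of one candidate classifier.
clf = make_pipeline(StandardScaler(),
                    RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Mean leave-one-subject-out accuracy: {scores.mean():.2f}")
```

In this layout, comparing sensor channels, devices, or classifiers amounts to swapping the input signal or the estimator inside the pipeline while keeping the subject-wise cross-validation fixed.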

Keywords: artificial neural network; hunger; machine learning; multimodal sensing; non-invasive sensing; physiological signals; satiety.

MeSH terms

  • Body Weight
  • Humans
  • Hunger* / physiology
  • Machine Learning
  • Obesity
  • Wearable Electronic Devices*

Grants and funding

Research activities leading to this publication have been financially supported by the DAMP foundation within the grant SENSE “Systemic Nutritional Medicine”.