Improving the recognition of eating gestures using intergesture sequential dependencies

IEEE J Biomed Health Inform. 2015 May;19(3):825-31. doi: 10.1109/JBHI.2014.2329137. Epub 2014 Jun 5.

Abstract

This paper considers the problem of recognizing eating gestures by tracking wrist motion. Eating gestures are activities commonly undertaken during the consumption of a meal, such as sipping a drink of liquid or using utensils to cut food. Each of these gestures causes a pattern of wrist motion that can be tracked to automatically identify the activity. Previous work has studied this problem at the level of a single gesture. In this paper, we demonstrate that individual gestures have sequential dependence. To study this, three types of classifiers were built: 1) a K-nearest neighbor (KNN) classifier, which uses no sequential context; 2) a hidden Markov model (HMM), which captures the sequential context of subgesture motions; and 3) HMMs that model intergesture sequential dependencies. We built first-order through sixth-order HMMs to evaluate the usefulness of increasing amounts of sequential dependence for recognition. On a dataset of 25 meals, we found that the baseline accuracies for the KNN and subgesture HMM classifiers were 75.8% and 84.3%, respectively. Using HMMs that model intergesture sequential dependencies, we were able to increase accuracy to as high as 96.5%. These results demonstrate that sequential dependencies exist between eating gestures and that they can be exploited to improve recognition accuracy.
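The intergesture-dependence idea in the abstract can be illustrated with a first-order HMM over gesture labels, decoded with the Viterbi algorithm. This is a generic sketch, not the authors' implementation: the gesture vocabulary, transition probabilities, and per-gesture classifier scores below are hypothetical, and the paper's higher-order HMMs would condition on longer label histories.

```python
import numpy as np

# Hypothetical gesture vocabulary; the paper's actual label set may differ.
GESTURES = ["bite", "drink", "utensil", "rest"]

def viterbi(log_init, log_trans, log_emit):
    """Most likely gesture sequence under a first-order HMM.

    log_init:  (S,)   log prior over gestures
    log_trans: (S, S) log P(next gesture | current gesture)
    log_emit:  (T, S) per-gesture log scores from a base classifier
    """
    T, S = log_emit.shape
    dp = np.full((T, S), -np.inf)       # best log score ending in each state
    back = np.zeros((T, S), dtype=int)  # backpointers for path recovery
    dp[0] = log_init + log_emit[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_trans  # (prev state, next state)
        back[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_emit[t]
    path = [int(dp[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [GESTURES[s] for s in reversed(path)]

# Illustrative numbers: strong self-transitions encode the prior that
# consecutive gestures tend to repeat, so a single ambiguous frame
# (slightly favoring "drink") is overridden by its sequential context.
init = np.full(4, 0.25)
trans = np.full((4, 4), 0.05)
np.fill_diagonal(trans, 0.85)
emit = np.array([
    [0.90, 0.05, 0.03, 0.02],
    [0.40, 0.45, 0.10, 0.05],  # frame-wise argmax here would say "drink"
    [0.90, 0.05, 0.03, 0.02],
])
decoded = viterbi(np.log(init), np.log(trans), np.log(emit))
# → ['bite', 'bite', 'bite']
```

Frame-by-frame classification (as in the KNN baseline) would mislabel the middle frame; the transition model corrects it, which is the kind of gain the intergesture HMMs exploit.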

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Activities of Daily Living
  • Algorithms
  • Eating / physiology*
  • Gestures*
  • Humans
  • Markov Chains
  • Movement / physiology
  • Pattern Recognition, Automated / methods*
  • Signal Processing, Computer-Assisted
  • Telemedicine
  • Wrist / physiology