Real-Time Human Activity Recognition Using Acceleration and First-Person Camera Data

Annu Int Conf IEEE Eng Med Biol Soc. 2021 Nov;2021:6966-6969. doi: 10.1109/EMBC46164.2021.9630369.

Abstract

The aim of this work is to present an automated method, operating in real time, for human activity recognition based on acceleration and first-person camera data. A Long Short-Term Memory (LSTM) model is built to recognize locomotive activities (i.e. walking, sitting, standing, going upstairs, going downstairs) from acceleration data, while a ResNet model is employed to recognize stationary activities (i.e. eating, reading, writing, watching TV, working on PC) from first-person camera data. The outputs of the two models are fused to produce the final decision on the performed activity. A publicly available dataset and an "in-house" dataset are used for training, testing and evaluation of the proposed models. The overall accuracy of the proposed algorithmic pipeline reaches 87.8%.
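The abstract describes a two-branch architecture with late fusion: an LSTM over acceleration windows for locomotive activities and a ResNet over camera frames for stationary activities. The sketch below illustrates one plausible reading of that pipeline in PyTorch; the layer sizes, window length, ResNet variant, and the confidence-based fusion rule are illustrative assumptions and are not specified in the abstract.

```python
# Minimal sketch of a two-branch activity-recognition pipeline with late fusion.
# Assumptions (not from the paper): LSTM hidden size, ResNet-18 backbone,
# 2 s / 50 Hz acceleration window, and max-confidence fusion.
import torch
import torch.nn as nn
from torchvision import models

LOCOMOTIVE = ["walking", "sitting", "standing", "upstairs", "downstairs"]
STATIONARY = ["eating", "reading", "writing", "watching_tv", "working_on_pc"]

class AccelLSTM(nn.Module):
    """LSTM branch: classifies a window of 3-axis acceleration samples."""
    def __init__(self, hidden=64, n_classes=len(LOCOMOTIVE)):
        super().__init__()
        self.lstm = nn.LSTM(input_size=3, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):            # x: (batch, time, 3)
        _, (h, _) = self.lstm(x)     # last hidden state summarizes the window
        return self.fc(h[-1])        # class logits

class CameraResNet(nn.Module):
    """ResNet branch: classifies a first-person camera frame."""
    def __init__(self, n_classes=len(STATIONARY)):
        super().__init__()
        self.backbone = models.resnet18(weights=None)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, n_classes)

    def forward(self, img):          # img: (batch, 3, 224, 224)
        return self.backbone(img)

def fuse(accel_logits, cam_logits):
    """Illustrative late fusion: report the class of the more confident branch."""
    p_acc = torch.softmax(accel_logits, dim=1)
    p_cam = torch.softmax(cam_logits, dim=1)
    if p_acc.max() >= p_cam.max():
        return LOCOMOTIVE[p_acc.argmax().item()]
    return STATIONARY[p_cam.argmax().item()]

# Example with dummy inputs: one 2 s acceleration window and one camera frame.
accel = torch.randn(1, 100, 3)
frame = torch.randn(1, 3, 224, 224)
print(fuse(AccelLSTM()(accel), CameraResNet()(frame)))
```

In a real-time setting, each incoming sensor window and frame would be pushed through its branch and fused per time step; the actual fusion strategy used by the authors is not detailed in the abstract.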

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acceleration*
  • Human Activities
  • Humans
  • Recognition, Psychology
  • Sitting Position
  • Walking*