Visualizing Inertial Data For Wearable Sensor Based Daily Life Activity Recognition Using Convolutional Neural Network

Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:2478-2481. doi: 10.1109/EMBC.2019.8857366.

Abstract

Human activity recognition (HAR) plays a crucial role in the healthcare and wellness domains; for example, it serves as a core technology in context-aware systems such as elderly home assistance and care. Although advances in machine learning for classification have yielded promising recognition accuracy, most existing HAR approaches rely on low-level handcrafted features and cannot fully handle practical, real-world activities. Therefore, in this paper, we present an efficient wearable sensor-based activity recognition method that encodes inertial data as color images so that convolutional neural networks (CNNs) can learn highly discriminative features. The proposed encoding technique converts tri-axial samples into color pixels and arranges them into an image-formed representation. Our method achieves recognition accuracy above 95% on two challenging activity datasets and outperforms other deep learning-based HAR approaches.
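The abstract describes the encoding only at a high level. As a rough illustration, the sketch below shows one plausible reading in Python: per-axis min-max normalization of a sliding window of tri-axial samples, with the x, y, z axes mapped to the R, G, B channels and the resulting pixels arranged into a square image. The normalization scheme, window length, and layout are all assumptions for illustration, not the authors' exact method.

    import numpy as np

    def encode_inertial_window(window):
        """Encode a window of tri-axial inertial samples as an RGB image.

        window: (T, 3) array of accelerometer/gyroscope samples (x, y, z).
        Returns an (H, W, 3) uint8 image. The normalization and layout
        here are illustrative assumptions, not the paper's exact scheme.
        """
        # Min-max normalize each axis to [0, 255] so x, y, z map to R, G, B.
        mins = window.min(axis=0)
        maxs = window.max(axis=0)
        pixels = ((window - mins) / (maxs - mins + 1e-8) * 255).astype(np.uint8)

        # Arrange the T color pixels row-major into a square-ish image,
        # zero-padding the remainder.
        side = int(np.ceil(np.sqrt(len(pixels))))
        padded = np.zeros((side * side, 3), dtype=np.uint8)
        padded[: len(pixels)] = pixels
        return padded.reshape(side, side, 3)

    # Example: a 2-second window at 64 Hz -> 128 samples -> 12x12 RGB image.
    rng = np.random.default_rng(0)
    image = encode_inertial_window(rng.standard_normal((128, 3)))
    print(image.shape)  # (12, 12, 3)

Mapping the three axes onto the color channels lets an ordinary 2-D CNN treat correlations between axes and across time as local color and texture structure, which is the intuition behind image-formed representations of inertial data.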

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Activities of Daily Living*
  • Humans
  • Machine Learning
  • Neural Networks, Computer*
  • Wearable Electronic Devices*