Zero-Shot Human Activity Recognition Using Non-Visual Sensors

Sensors (Basel). 2020 Feb 4;20(3):825. doi: 10.3390/s20030825.

Abstract

Due to significant advances in sensor technology, activity recognition research has gained interest and maturity in recent years. Existing machine learning algorithms have demonstrated promising results in classifying activities whose instances were seen during training. Activity recognition methods intended for real-life settings, however, must cover a growing number of activities across various domains, and a significant portion of those activities' instances will not be present in the training data set. Covering all possible activities in advance is a complex and expensive task. What is needed, therefore, is a method that can extend the learning model to detect unseen activities without prior knowledge of sensor readings for those activities. In this paper, we introduce an approach that leverages sensor data to discover new activities that were not present in the training set. We show that sensor readings can yield promising results for zero-shot learning, whereby the necessary knowledge is transferred from seen to unseen activities using semantic similarity. The evaluation, conducted on two data sets extracted from the well-known CASAS datasets, shows that the proposed zero-shot learning approach achieves high performance in recognizing new activities that were not present in the training data set.
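To make the core idea concrete, the sketch below shows one common way zero-shot transfer via semantic similarity can work: classifier scores over *seen* activity labels are projected onto *unseen* labels weighted by the cosine similarity of their label embeddings. This is an illustrative toy, not the paper's actual method; the activity names, embedding values, and the `zero_shot_predict` helper are all hypothetical (real systems would use learned word vectors for the labels).

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy semantic embeddings for activity labels (illustrative values only;
# in practice these would come from a word-embedding model).
label_embedding = {
    "cook_breakfast": [0.9, 0.1, 0.0],   # seen during training
    "wash_dishes":    [0.2, 0.8, 0.1],   # seen during training
    "cook_dinner":    [0.85, 0.15, 0.05],  # unseen at training time
    "take_medicine":  [0.1, 0.7, 0.6],     # unseen at training time
}

SEEN = ["cook_breakfast", "wash_dishes"]
UNSEEN = ["cook_dinner", "take_medicine"]

def zero_shot_predict(seen_scores):
    """Map classifier confidences over seen labels to an unseen label.

    seen_scores: dict of {seen activity -> confidence} for one sensor sample.
    Each unseen label is scored by the similarity-weighted sum of the
    seen-label confidences; the highest-scoring unseen label wins.
    """
    best_label, best_score = None, float("-inf")
    for u in UNSEEN:
        score = sum(
            seen_scores[s] * cosine(label_embedding[s], label_embedding[u])
            for s in SEEN
        )
        if score > best_score:
            best_label, best_score = u, score
    return best_label

# A sample the seen-class classifier thinks looks like cooking breakfast
# is mapped to the semantically closest unseen activity.
print(zero_shot_predict({"cook_breakfast": 0.9, "wash_dishes": 0.1}))
# → cook_dinner
```

The key design point this illustrates is that no sensor data for the unseen activities is needed: only their label embeddings, which transfer knowledge from the seen classes.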

Keywords: activity recognition; non-visual sensors; sensor data; zero-shot learning.

MeSH terms

  • Activities of Daily Living
  • Algorithms
  • Equipment Design
  • Human Activities*
  • Humans
  • Machine Learning
  • Monitoring, Ambulatory / instrumentation*
  • Monitoring, Ambulatory / methods*
  • Neural Networks, Computer
  • Pattern Recognition, Automated / methods*
  • Reproducibility of Results
  • Support Vector Machine