Exploring the Possibility of Photoplethysmography-Based Human Activity Recognition Using Convolutional Neural Networks

Sensors (Basel). 2024 Mar 1;24(5):1610. doi: 10.3390/s24051610.

Abstract

Various sensing modalities, including external and internal sensors, have been employed in research on human activity recognition (HAR). Among these, internal sensors, particularly wearable technologies, hold significant promise due to their lightweight nature and simplicity. Recently, HAR techniques leveraging wearable biometric signals, such as electrocardiography (ECG) and photoplethysmography (PPG), have been proposed using publicly available datasets. However, to facilitate broader practical applications, a more extensive analysis based on larger databases with cross-subject validation is required. In pursuit of this objective, we initially gathered PPG signals from 40 participants engaged in five common daily activities. Subsequently, we evaluated the feasibility of classifying these activities using a deep learning architecture. The model's performance was assessed in terms of accuracy, precision, recall, and F1 measure via cross-subject cross-validation. The proposed method successfully distinguished the five activities considered, with an average test accuracy of 95.14%. Furthermore, we recommend an optimal window size based on a comprehensive evaluation of performance relative to the input signal length. These findings confirm the potential for practical HAR applications based on PPG and indicate its prospective extension to various domains, such as healthcare or fitness applications, by concurrently analyzing behavioral and health data through a single biometric signal.
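The evaluation protocol described above rests on two preprocessing steps: segmenting the continuous PPG signal into fixed-length windows, and splitting participants so that no subject contributes data to both the training and test sets of a fold (cross-subject cross-validation). A minimal sketch of both steps follows; the window size, step, and fold count shown here are illustrative placeholders, not the values used in the paper:

```python
def segment_windows(signal, window_size, step):
    """Slice a 1-D PPG signal into fixed-length, possibly overlapping windows.

    Overlap is controlled by `step`: step < window_size yields overlapping
    windows, step == window_size yields non-overlapping ones.
    """
    return [signal[i:i + window_size]
            for i in range(0, len(signal) - window_size + 1, step)]


def cross_subject_folds(subject_ids, n_folds):
    """Partition subjects into disjoint folds for cross-subject CV.

    Every fold's test subjects are excluded from that fold's training set,
    so the classifier is always evaluated on unseen individuals.
    """
    ids = sorted(set(subject_ids))
    folds = [ids[i::n_folds] for i in range(n_folds)]
    splits = []
    for k in range(n_folds):
        test = set(folds[k])
        train = [s for s in ids if s not in test]
        splits.append((train, sorted(test)))
    return splits


# Illustrative usage: 40 subjects (as in the study) split into a
# hypothetical 5-fold cross-subject scheme.
splits = cross_subject_folds(range(1, 41), 5)
train_ids, test_ids = splits[0]
```

Each fold here holds out 8 of the 40 subjects for testing and trains on the remaining 32; the per-fold metrics (accuracy, precision, recall, F1) would then be averaged across folds.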

Keywords: convolutional neural networks; cross-subject validation; human activity recognition; photoplethysmography; window size.

MeSH terms

  • Electrocardiography / methods
  • Human Activities
  • Humans
  • Neural Networks, Computer*
  • Photoplethysmography* / methods
  • Prospective Studies