Human Action Recognition and Note Recognition: A Deep Learning Approach Using STA-GCN

Sensors (Basel). 2024 Apr 14;24(8):2519. doi: 10.3390/s24082519.

Abstract

Human action recognition (HAR) is a growing area of machine learning with a wide range of applications. One challenging aspect of HAR is recognizing human actions during musical performance, which is further complicated by the need to recognize the musical notes being played. This paper proposes a deep learning-based method for simultaneous HAR and musical note recognition in music performances. We conducted experiments on performances of the Morin khuur, a traditional Mongolian bowed string instrument. The proposed method consists of two stages. In the first stage, we created a new dataset of Morin khuur performances, using motion capture systems and depth sensors to collect hand keypoints, instrument segmentation information, and detailed movement information. We then analyzed RGB images, depth images, and motion data to determine which type of data provides the most valuable features for recognizing actions and notes in music performances. The second stage utilizes a Spatial Temporal Attention Graph Convolutional Network (STA-GCN) to recognize musical notes as continuous gestures. The STA-GCN model is designed to learn the relationships between hand keypoints and instrument segmentation information, which are crucial for accurate recognition. Evaluation on our dataset demonstrates that our model outperforms the traditional ST-GCN model, achieving an accuracy of 81.4%.
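To make the architecture concrete, the following is a minimal sketch of one spatial-temporal attention graph convolution layer in the common ST-GCN formulation, where a learnable attention mask is added to the skeleton adjacency before aggregation. All names, sizes, and the random adjacency stand-in are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: frames, hand keypoints, input/output channels.
T, V, C_in, C_out = 16, 21, 3, 8

# Normalized adjacency of the keypoint graph. Here we use the identity
# as a stand-in for the real hand-skeleton adjacency.
A = np.eye(V)
M = rng.standard_normal((V, V)) * 0.01    # learnable attention mask
W = rng.standard_normal((C_in, C_out)) * 0.1  # spatial mixing weights

def spatial_attention_gcn(X):
    """X: (T, V, C_in) -> (T, V, C_out).

    Aggregates each keypoint's neighbors via the attention-augmented
    adjacency (A + M), then mixes channels with W.
    """
    return np.einsum("uv,tvc,cd->tud", A + M, X, W)

def temporal_conv(X, k=3):
    """Depthwise temporal smoothing over a window of k frames."""
    pad = k // 2
    Xp = np.pad(X, ((pad, pad), (0, 0), (0, 0)), mode="edge")
    return np.stack([Xp[t:t + k].mean(axis=0) for t in range(X.shape[0])])

X = rng.standard_normal((T, V, C_in))          # one gesture clip
Y = temporal_conv(np.maximum(spatial_attention_gcn(X), 0.0))  # ReLU between
print(Y.shape)  # (16, 21, 8)
```

In a full model, several such layers are stacked and followed by global pooling and a classifier over note/gesture labels; the attention mask M lets the network up-weight edges between keypoints that matter for a given action, which is the distinction the abstract draws between STA-GCN and plain ST-GCN.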

Keywords: action recognition; deep learning; Morin khuur; musical note recognition; spatial temporal attention graph convolutional network (STA-GCN).

MeSH terms

  • Algorithms
  • Deep Learning*
  • Gestures
  • Human Activities
  • Humans
  • Movement / physiology
  • Music*
  • Neural Networks, Computer
  • Pattern Recognition, Automated / methods

Grants and funding

This research was supported by the National Science and Technology Council, Taiwan, under grant NSTC 112-2420-H-008-002.