Unveiling EMG semantics: a prototype-learning approach to generalizable gesture classification

J Neural Eng. 2024 May 16. doi: 10.1088/1741-2552/ad4c98. Online ahead of print.

Abstract

Upper limb loss can profoundly impact an individual's quality of life, posing challenges to both physical capability and emotional well-being. To restore limb function by decoding electromyography (EMG) signals, we present a novel deep prototype learning method for accurate and generalizable EMG-based gesture classification. Existing methods generalize poorly across subjects because individual muscle responses vary widely, impeding seamless applicability to broader populations. By leveraging deep prototype learning, we introduce a method that goes beyond direct output prediction: it matches new EMG inputs to a set of learned prototypes and predicts the corresponding labels. This methodology significantly enhances classification performance and generalizability by discriminating subtle differences between gestures, making the model more reliable and precise in real-world applications. Experiments on four Ninapro datasets show that our deep prototype learning classifier outperforms state-of-the-art methods in both intra-subject and inter-subject gesture classification accuracy. These results validate the effectiveness of the proposed method and pave the way for future advances in EMG-based gesture classification for upper limb prosthetics.
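To illustrate the core idea of prototype-based classification described above — matching a new input to a set of learned class prototypes rather than predicting a label directly — the following is a minimal sketch, not the authors' implementation. It uses class-mean prototypes and Euclidean distance on fixed feature vectors; the paper's deep method would instead learn both the feature embedding and the prototypes jointly, and the 8-dimensional "EMG feature" vectors here are purely hypothetical.

```python
import numpy as np

class PrototypeClassifier:
    """Nearest-prototype classifier: each class is summarized by a
    prototype vector, and a new sample receives the label of the
    closest prototype in feature space."""

    def fit(self, X, y):
        # One prototype per class: the mean of that class's feature vectors.
        self.labels_ = np.unique(y)
        self.prototypes_ = np.stack(
            [X[y == c].mean(axis=0) for c in self.labels_]
        )
        return self

    def predict(self, X):
        # Euclidean distance from each sample to each prototype,
        # then assign the label of the nearest prototype.
        dists = np.linalg.norm(
            X[:, None, :] - self.prototypes_[None, :, :], axis=-1
        )
        return self.labels_[dists.argmin(axis=1)]

# Toy demo with synthetic "EMG feature" vectors for two gesture classes.
rng = np.random.default_rng(0)
X_train = np.vstack([
    rng.normal(0.0, 0.1, size=(20, 8)),  # class 0 cluster
    rng.normal(1.0, 0.1, size=(20, 8)),  # class 1 cluster
])
y_train = np.array([0] * 20 + [1] * 20)

clf = PrototypeClassifier().fit(X_train, y_train)
print(clf.predict(np.array([[0.05] * 8, [0.95] * 8])))  # → [0 1]
```

In a deep variant, the distance would be computed in a learned embedding space, so that subtle inter-gesture differences become separable even across subjects.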

Keywords: Generalizability; Hand Gesture Classification; Prototype Learning; Surface Electromyography.