Online Learning and Classification of EMG-Based Gestures on a Parallel Ultra-Low Power Platform Using Hyperdimensional Computing

IEEE Trans Biomed Circuits Syst. 2019 Jun;13(3):516-528. doi: 10.1109/TBCAS.2019.2914476. Epub 2019 May 2.

Abstract

This paper presents a wearable electromyographic gesture recognition system based on the hyperdimensional computing paradigm, running on a programmable parallel ultra-low-power (PULP) platform. The processing chain includes efficient on-chip training, which leads to a fully embedded implementation with no need to perform any offline training on a personal computer. The proposed solution has been tested on 10 subjects in a typical gesture recognition scenario, achieving 85% average accuracy in recognizing 11 gestures, which is in line with the state of the art, with the unique capability of performing online learning. Furthermore, by virtue of the hardware-friendly algorithm and of the efficient PULP system-on-chip (Mr. Wolf) used for prototyping and evaluation, the energy required to run the learning phase with 11 gestures is 10.04 mJ, and each classification costs 83.2 μJ. The system operates at an average power consumption of 10.4 mW during classification, ensuring around 29 h of autonomy with a 100 mAh battery. Finally, the scalability of the system is explored by increasing the number of channels (up to 256 electrodes), demonstrating the suitability of our approach as a universal, energy-efficient wearable biopotential recognition framework.
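(For the battery-life figure, the reported ~29 h is consistent with the 10.4 mW average power if one assumes a nominal ~3 V cell: 100 mAh × 3 V = 300 mWh, and 300 mWh / 10.4 mW ≈ 29 h; the cell voltage is an assumption, not stated in the abstract.)

To make the hyperdimensional computing paradigm referenced above concrete, the following is a minimal NumPy sketch of a generic HD classifier with spatial encoding of EMG channels and an associative memory trained by bundling, the kind of one-pass training that enables on-device online learning. It is not the authors' fixed-point PULP implementation; the dimensionality, channel count, and quantization levels are illustrative assumptions.

```python
import numpy as np

D = 10000        # hypervector dimensionality (typical HD computing choice; assumed)
N_CHANNELS = 4   # number of EMG channels (illustrative; the paper scales up to 256)
N_LEVELS = 21    # quantization levels for EMG amplitude (assumed)

rng = np.random.default_rng(0)

# Item memory: one random bipolar hypervector per channel (spatial encoding).
channel_im = rng.choice([-1, 1], size=(N_CHANNELS, D))

# Continuous item memory: level vectors built by progressively flipping components,
# so nearby amplitude levels map to similar hypervectors.
level_im = np.empty((N_LEVELS, D), dtype=int)
level_im[0] = rng.choice([-1, 1], size=D)
flips_per_level = D // (2 * (N_LEVELS - 1))
for i in range(1, N_LEVELS):
    level_im[i] = level_im[i - 1].copy()
    idx = rng.choice(D, size=flips_per_level, replace=False)
    level_im[i, idx] *= -1

def encode(sample_levels):
    """Bind each channel vector with its quantized-amplitude vector, then bundle."""
    bound = channel_im * level_im[sample_levels]   # element-wise binding per channel
    return np.sign(bound.sum(axis=0) + 0.5)        # bundling by majority, ties -> +1

def train(samples, labels, n_classes):
    """Accumulate encoded samples per class, then bipolarize (one-pass learning)."""
    am = np.zeros((n_classes, D))
    for x, y in zip(samples, labels):
        am[y] += encode(x)
    return np.sign(am + 0.5)                       # associative memory of class prototypes

def classify(sample_levels, am):
    """Return the class whose prototype is closest in Hamming distance."""
    query = encode(sample_levels)
    dists = (am != query).sum(axis=1)
    return int(np.argmin(dists))
```

Because training reduces to componentwise accumulation and a sign, new gesture examples can be folded into the class prototypes on the device itself, which is the property the abstract highlights as fully embedded on-chip training.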

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Electromyography*
  • Gestures*
  • Humans
  • Pattern Recognition, Automated*
  • Signal Processing, Computer-Assisted*
  • Wearable Electronic Devices*