Sensoring the Neck: Classifying Movements and Actions with a Neck-Mounted Wearable Device

Sensors (Basel). 2022 Jun 7;22(12):4313. doi: 10.3390/s22124313.

Abstract

Sensor technology that captures information from a user's neck region can enable a range of new possibilities, including less intrusive mobile software interfaces. In this work, we investigate the feasibility of using a single inexpensive flex sensor mounted at the neck to capture information about head gestures, mouth movements, and the presence of audible speech. Different sensor sizes and positions on the neck are evaluated experimentally. With data collected from experiments on the finalized prototype, classification accuracies of 91% for differentiating common head gestures, 63% for differentiating mouth movements, and 83% for speech detection are achieved.
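To make the classification step more concrete, the sketch below shows one way a single flex-sensor signal could be windowed, reduced to simple time-domain features, and fed to a machine-learning classifier for head-gesture recognition. This is an illustrative assumption, not the authors' pipeline: the window length, feature set, gesture labels ("nod", "shake", "tilt"), the random-forest model, and the synthetic placeholder data are all hypothetical choices standing in for the recorded sensor data described in the paper.

```python
# Hypothetical sketch of a flex-sensor gesture classifier (not the authors' code).
# Synthetic data stands in for real neck-mounted sensor recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

WINDOW = 50  # samples per window (assumed sampling setup)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple time-domain features from one window of flex-sensor readings."""
    return np.array([
        window.mean(),                     # average bend level
        window.std(),                      # variability of bend
        window.max() - window.min(),       # range of bend
        np.abs(np.diff(window)).mean(),    # mean absolute first difference
    ])

# Placeholder signal: random values in place of recorded sensor windows.
rng = np.random.default_rng(0)
n_windows, labels = 300, ["nod", "shake", "tilt"]
X = np.stack([extract_features(rng.normal(size=WINDOW)) for _ in range(n_windows)])
y = rng.choice(labels, size=n_windows)

# Train on one split, report held-out accuracy on the other.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real labeled windows in place of the synthetic arrays, the same train/evaluate structure would yield accuracy figures comparable in kind to those reported in the abstract.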

Keywords: flex sensor; interaction design; machine learning (ML); neck-mounted interface; wearable computing.

MeSH terms

  • Gestures
  • Movement
  • Neck
  • Wearable Electronic Devices*