Visualizing Hand Force with Wearable Muscle Sensing for Enhanced Mixed Reality Remote Collaboration

IEEE Trans Vis Comput Graph. 2023 Nov;29(11):4611-4621. doi: 10.1109/TVCG.2023.3320210. Epub 2023 Nov 2.

Abstract

In this paper, we present a prototype system for sharing a user's hand force in mixed reality (MR) remote collaboration on physical tasks, where hand force is estimated using a wearable surface electromyography (sEMG) sensor. In a remote collaboration between a worker and an expert, hand activity plays a crucial role. However, the force exerted by the worker's hand has not been extensively investigated. Our sEMG-based system reliably captures the worker's hand force during physical tasks and conveys this information to the expert through hand force visualization overlaid on the worker's view or on the worker's avatar. A user study was conducted to evaluate the impact of visualizing a worker's hand force on collaboration, employing three distinct visualization methods across two view modes. Our findings demonstrate that sensing and sharing hand force in MR remote collaboration improves the expert's awareness of the worker's task, significantly enhances the expert's perception of the collaborator's hand force and of the weight of the interacting object, and promotes a heightened sense of social presence for the expert. Based on these findings, we provide design implications for future mixed reality remote collaboration systems that incorporate hand force sensing and visualization.
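The abstract states only that hand force is estimated from a wearable sEMG sensor and does not describe the estimation model. As a rough, non-authoritative sketch of one common baseline approach (not necessarily the authors' method), the snippet below band-pass filters raw sEMG, computes a moving RMS envelope, and normalizes it against a maximum-voluntary-contraction (MVC) calibration value to obtain a relative force estimate that could drive a visualization; all function names, parameters, and signal values here are illustrative assumptions.

```python
"""Minimal sketch of sEMG-based hand-force estimation (assumed baseline, not from the paper)."""
import numpy as np
from scipy.signal import butter, filtfilt


def bandpass(emg, fs, low=20.0, high=450.0, order=4):
    """Band-pass filter raw sEMG to suppress motion artifacts and high-frequency noise."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, emg)


def rms_envelope(emg, fs, window_s=0.1):
    """Moving-window RMS amplitude of the (filtered) sEMG signal."""
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(np.square(emg), kernel, mode="same"))


def estimate_force(emg_raw, fs, mvc_rms):
    """Normalized force estimate in [0, 1], relative to an MVC calibration value."""
    env = rms_envelope(bandpass(emg_raw, fs), fs)
    return np.clip(env / mvc_rms, 0.0, 1.0)


if __name__ == "__main__":
    fs = 1000.0                                  # assumed sampling rate (Hz)
    t = np.arange(0, 2.0, 1.0 / fs)
    # Synthetic sEMG: noise whose amplitude ramps up as "grip force" increases.
    emg = np.random.randn(t.size) * (0.1 + 0.9 * t / t.max())
    force = estimate_force(emg, fs, mvc_rms=1.0)
    print(f"estimated relative force at end of ramp: {force[-1]:.2f}")
```

In a system like the one described, such a per-frame force value could be mapped to a color or size change in the visualization overlaid on the worker's view or avatar.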

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Augmented Reality*
  • Computer Graphics
  • Muscles
  • Wearable Electronic Devices*