Biomechanical Fidelity of Simulated Pick-and-Place Tasks: Impact of Visual and Haptic Renderings

IEEE Trans Haptics. 2021 Jul-Sep;14(3):692-698. doi: 10.1109/TOH.2021.3052658. Epub 2021 Sep 9.

Abstract

Virtual environments (VE) and haptic interfaces (HI) are increasingly introduced as virtual prototyping tools to assess ergonomic features of workstations. These approaches are cost-effective and convenient, since working directly on the Digital Mock-Up in a VE is preferable to constructing a physical mock-up in a Real Environment (RE). However, this approach is usable only if the ergonomic conclusions drawn from the VE are similar to those that would be drawn in the real world. This article aims to evaluate the impact of visual and haptic renderings on the biomechanical fidelity of pick-and-place tasks. Fourteen subjects performed time-constrained pick-and-place tasks in RE and VE, with a real and a virtual, haptically driven object, at three different speeds. Hand motion and muscle activation of the upper limb were recorded. A questionnaire subjectively assessed discomfort and immersion. The results revealed significant differences between the indicators measured in RE and VE and between the real and virtual object. Objective and subjective measures indicated higher muscle activity and longer hand trajectories in VE and with the HI. Another important finding is that no cross-effect between the haptic and visual renderings was observed. These results confirm that such systems should be used with caution for ergonomics evaluation, especially when postural and muscle quantities are investigated as discomfort indicators. The last contribution of the paper is an easily replicable experimental setup for assessing more systematically the biomechanical fidelity of virtual environments for ergonomics purposes.
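
As an illustration of the two objective indicators mentioned above (hand trajectory length and muscle activity), a minimal sketch is given below. It assumes hand positions sampled by motion capture as an N x 3 array and a raw surface-EMG signal for one muscle; the variable names, sampling sizes, and synthetic data are illustrative assumptions, not taken from the study.

    import numpy as np

    def trajectory_length(positions):
        """Path length of a hand trajectory: sum of Euclidean distances
        between consecutive motion-capture samples (positions: N x 3 array)."""
        steps = np.diff(positions, axis=0)
        return np.linalg.norm(steps, axis=1).sum()

    def mean_rectified_emg(emg):
        """A simple muscle-activity indicator: mean of the rectified EMG signal."""
        return np.abs(emg).mean()

    # Illustrative use with synthetic placeholder data:
    rng = np.random.default_rng(0)
    hand_positions = rng.normal(size=(500, 3))   # 500 samples of (x, y, z), in metres
    emg_signal = rng.normal(size=2000)           # raw EMG samples for one muscle
    print(trajectory_length(hand_positions), mean_rectified_emg(emg_signal))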

MeSH terms

  • Ergonomics
  • Hand Strength
  • Hand*
  • Humans
  • Upper Extremity
  • User-Computer Interface*