Visually Guided Acquisition of Contact Dynamics and Case Study in Data-Driven Haptic Texture Modeling

IEEE Trans Haptics. 2020 Jul-Sep;13(3):611-627. doi: 10.1109/TOH.2020.2965449. Epub 2020 Jan 10.

Abstract

Data-driven modeling of human hand contact dynamics starts with a tedious process of data collection. Contact-dynamics data consist of an input describing an applied action and the response stimuli from the environment. The quality and stability of the model depend mainly on how well the data points cover the model space; thus, to build a reliable data-driven model, a user typically collects data dozens of times. In this article, we aim to build an interactive system that assists a user in data collection. We develop an online segmentation framework that partitions a multivariate streaming signal. Real-time segmentation makes it possible to track how the model space is being populated. We apply the proposed framework to a haptic texture modeling use case. To guide the user during data collection, we design a user interface that maps the applied input to alternative visual modalities based on the theory of direct perception. Together, the segmentation framework and the user interface implement a human-in-the-loop system, where the interface assigns a target combination of input variables and the user tries to acquire it. Experimental results show that the proposed data collection scheme considerably increases the approximation quality of the model, while the proposed user interface considerably reduces the mental workload experienced during data collection.
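The abstract describes an online segmentation framework that partitions a multivariate streaming signal in real time. The paper's actual algorithm is not given here; the following is a minimal illustrative sketch of one common approach, sliding-window mean-shift detection, in which a segment boundary is declared whenever the mean of the most recent window of samples drifts far from the mean of the preceding window. The class name, window size, and threshold are all hypothetical.

```python
import math

class OnlineSegmenter:
    """Illustrative online segmenter for a multivariate stream.

    Compares the mean of the most recent window of samples against the
    mean of the window before it; a large shift marks a segment boundary.
    This is a hypothetical sketch, not the algorithm from the paper.
    """

    def __init__(self, window=10, threshold=1.0):
        self.window = window        # samples per comparison window
        self.threshold = threshold  # mean-shift magnitude that triggers a boundary
        self.buffer = []            # samples since the last detected boundary
        self.boundaries = []        # global sample indices where new segments start
        self.t = 0                  # global sample counter

    @staticmethod
    def _mean(rows):
        n = len(rows)
        return [sum(col) / n for col in zip(*rows)]

    def push(self, sample):
        """Feed one multivariate sample; record a boundary if a shift is seen."""
        self.buffer.append(list(sample))
        w = self.window
        if len(self.buffer) >= 2 * w:
            past = self._mean(self.buffer[-2 * w:-w])
            recent = self._mean(self.buffer[-w:])
            if math.dist(past, recent) > self.threshold:
                self.boundaries.append(self.t - w + 1)  # boundary at window start
                self.buffer = self.buffer[-w:]          # start a fresh segment
        self.t += 1
```

In a setup like the one described, each incoming sample could bundle the user's input variables (e.g., applied force and scan velocity); the detected segments would then provide homogeneous chunks whose input statistics can be used to track which regions of the model space have been covered.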

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Data Collection
  • Humans
  • Models, Theoretical*
  • Physical Phenomena*
  • Touch Perception*
  • Touch*
  • User-Computer Interface*
  • Visual Perception*