Learning to Feel Textures: Predicting Perceptual Similarities From Unconstrained Finger-Surface Interactions

IEEE Trans Haptics. 2022 Oct-Dec;15(4):705-717. doi: 10.1109/TOH.2022.3212701. Epub 2022 Dec 19.

Abstract

Whenever we touch a surface with our fingers, we perceive distinct tactile properties that are based on the underlying dynamics of the interaction. However, little is known about how the brain aggregates the sensory information from these dynamics to form abstract representations of textures. Earlier studies of surface perception all used general surface descriptors measured under controlled conditions rather than considering the unique dynamics of specific interactions, reducing the comprehensiveness and interpretability of their results. Here, we present an interpretable modeling method that predicts the perceptual similarity of surfaces by comparing probability distributions of features calculated from short time windows of specific physical signals (finger motion, contact force, fingernail acceleration) elicited during unconstrained finger-surface interactions. The results show that our method can predict the similarity judgments of individual participants with a maximum Spearman's correlation of 0.7. Furthermore, we found evidence that different participants weight interaction features differently when judging surface similarity. Our findings provide new perspectives on human texture perception during active touch, and our approach could benefit haptic surface assessment, robotic tactile perception, and haptic rendering.
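To make the core idea concrete, the sketch below illustrates one way the described pipeline could look: per-window features are computed from an interaction signal, histogrammed into probability distributions, surfaces are compared by a distance between those distributions, and the resulting dissimilarities are evaluated against human judgments with Spearman's rho. This is a hypothetical illustration, not the authors' implementation; the RMS feature, Jensen-Shannon distance, 50 ms window length, and all function names are assumptions standing in for whichever features and distribution distance the paper actually uses.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import spearmanr

def window_features(signal, fs, win_s=0.05):
    """Split a 1-D signal into short windows and compute one feature per
    window (RMS amplitude here; the 50 ms window is an assumption)."""
    n = int(win_s * fs)
    n_windows = len(signal) // n
    windows = signal[: n_windows * n].reshape(n_windows, n)
    return np.sqrt(np.mean(windows ** 2, axis=1))

def feature_distribution(features, bins):
    """Histogram per-window features into a probability distribution."""
    hist, _ = np.histogram(features, bins=bins)
    return hist / max(hist.sum(), 1)

def surface_dissimilarity(sig_a, sig_b, fs, n_bins=32):
    """Distance between two surfaces' feature distributions.
    Jensen-Shannon distance is a stand-in for the paper's measure."""
    fa, fb = window_features(sig_a, fs), window_features(sig_b, fs)
    bins = np.linspace(min(fa.min(), fb.min()),
                       max(fa.max(), fb.max()), n_bins + 1)
    return jensenshannon(feature_distribution(fa, bins),
                         feature_distribution(fb, bins))

# Hypothetical usage: signals[i] stands in for one recorded channel
# (e.g., fingernail acceleration) for surface i; human_dissim stands in
# for one participant's pairwise similarity judgments.
rng = np.random.default_rng(0)
fs = 1000
signals = [rng.normal(0, 1 + 0.3 * i, fs * 2) for i in range(5)]
pairs = [(i, j) for i in range(5) for j in range(i + 1, 5)]
model = [surface_dissimilarity(signals[i], signals[j], fs) for i, j in pairs]
human_dissim = [abs(i - j) + rng.normal(0, 0.2) for i, j in pairs]
rho, _ = spearmanr(model, human_dissim)
print(f"Spearman's rho between model and judgments: {rho:.2f}")
```

Comparing whole feature distributions, rather than single summary descriptors, is what lets such a model reflect the variability of unconstrained interactions; per-participant weighting of the different signal channels could then be fit to account for the individual differences the abstract reports.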

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Fingers
  • Humans
  • Learning
  • Touch
  • Touch Perception*
  • Visual Perception