Sim-to-Real for High-Resolution Optical Tactile Sensing: From Images to Three-Dimensional Contact Force Distributions

Soft Robot. 2022 Oct;9(5):926-937. doi: 10.1089/soro.2020.0213. Epub 2021 Nov 25.

Abstract

The images captured by vision-based tactile sensors carry information about high-resolution tactile fields, such as the distribution of the contact forces applied to their soft sensing surface. However, extracting the information encoded in the images is challenging and often addressed with learning-based approaches, which generally require a large amount of training data. This article proposes a strategy to generate tactile images in simulation for a vision-based tactile sensor that uses an internal camera to track the motion of spherical particles embedded within a soft material. The deformation of the material is simulated in a finite element environment under a diverse set of contact conditions, and the spherical particles are projected onto a simulated camera image. Features extracted from the images are mapped to the three-dimensional contact force distribution by an artificial neural network, with the ground truth likewise obtained from the finite element simulations; the network is therefore trained entirely on synthetic data, avoiding the need for real-world data collection. The resulting model exhibits high accuracy when evaluated on real-world tactile images, is transferable across multiple tactile sensors without further training, and is suitable for efficient real-time inference.
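The sketch below is a minimal, hypothetical illustration of the two stages the abstract describes, not the authors' implementation: (1) projecting simulated 3-D particle positions onto a synthetic camera image via a pinhole model, and (2) a small neural network that maps per-bin particle-displacement features to a discretized 3-D contact force distribution. All names, dimensions, and camera parameters are assumptions made for the example.

```python
import numpy as np
import torch
import torch.nn as nn

def project_particles(points_3d, focal=320.0, cx=320.0, cy=240.0):
    """Pinhole projection of particle centers (N, 3), given in camera
    coordinates with z > 0, onto pixel coordinates (N, 2)."""
    z = points_3d[:, 2:3]
    uv = focal * points_3d[:, :2] / z + np.array([cx, cy])
    return uv

class ForceDistributionNet(nn.Module):
    """Maps per-bin displacement features extracted from the tactile image
    to a per-bin 3-D force vector (Fx, Fy, Fz) over the sensing surface."""
    def __init__(self, n_bins=650, feat_dim=2, hidden=128):
        super().__init__()
        self.n_bins = n_bins
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_bins * feat_dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, hidden),
            nn.Tanh(),
            nn.Linear(hidden, n_bins * 3),
        )

    def forward(self, x):                    # x: (batch, n_bins, feat_dim)
        return self.net(x).view(-1, self.n_bins, 3)

if __name__ == "__main__":
    # Synthetic training setup: particle displacements would come from the
    # FEM-simulated material deformation, and the ground-truth force
    # distribution from the same FEM solution.
    rng = np.random.default_rng(0)
    pts = rng.uniform([-5, -5, 10], [5, 5, 20], size=(100, 3))  # mm, hypothetical
    print(project_particles(pts)[:3])

    model = ForceDistributionNet()
    feats = torch.randn(8, 650, 2)           # e.g., per-bin mean image-space flow
    forces = model(feats)                     # (8, 650, 3)
    print(forces.shape)
```

Because both the input features and the target force distributions come from the same finite element simulation, such a model can be trained entirely on synthetic data and then evaluated directly on features extracted from real tactile images.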

Keywords: computer vision; machine learning; sim-to-real; tactile sensing.

MeSH terms

  • Computer Simulation
  • Neural Networks, Computer
  • Touch Perception*
  • Touch*
  • Vision, Ocular