Generation of Tactile Data From 3D Vision and Target Robotic Grasps

IEEE Trans Haptics. 2021 Jan-Mar;14(1):57-67. doi: 10.1109/TOH.2020.3011899. Epub 2021 Mar 24.

Abstract

Tactile perception is a rich source of information for robotic grasping: it allows a robot to identify a grasped object and assess the stability of a grasp, among other things. However, a tactile sensor must come into contact with the target object in order to produce readings, so tactile data can only be obtained when real contact is made. We propose to overcome this restriction with a method that models the behaviour of a tactile sensor, using 3D vision and grasp information as the stimulus. Our system regresses the quantified tactile response that would be experienced if a given grasp were performed on the object. We experiment with 16 items and 4 tactile data modalities to show that our proposal learns this task with low error.
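The abstract does not specify the regression model used. As a purely illustrative sketch of the core idea, the snippet below maps a concatenated vision-plus-grasp feature vector to a multi-modality tactile response with ridge regression; the feature dimensions, the four-modality output, and the synthetic data are all assumptions, not the paper's actual pipeline.

```python
import numpy as np

def fit_tactile_regressor(X, Y, lam=1e-2):
    """Fit a ridge regression from features X (n, d) to tactile
    responses Y (n, t). Hypothetical stand-in for the paper's model."""
    d = X.shape[1]
    # closed-form ridge solution: (X^T X + lam I) W = X^T Y
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

def predict_tactile(W, x):
    """Predict the tactile response for one or more feature vectors."""
    return x @ W

# Synthetic example: an 8-dim 3D-vision descriptor concatenated with a
# 6-dim grasp pose, regressed onto 4 tactile modalities (all invented).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 14))
true_W = rng.normal(size=(14, 4))
Y = X @ true_W + 0.01 * rng.normal(size=(200, 4))

W = fit_tactile_regressor(X, Y)
pred = predict_tactile(W, X[:1])   # predicted 4-modality response
```

The prediction step is what the paper's system provides without physical contact: given only vision and grasp features for a candidate grasp, it outputs the tactile reading that grasp would produce.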

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Hand Strength
  • Humans
  • Robotic Surgical Procedures*
  • Robotics*
  • Touch
  • Vision, Ocular