Exploring Tactile Temporal Features for Object Pose Estimation during Robotic Manipulation

Sensors (Basel). 2023 May 6;23(9):4535. doi: 10.3390/s23094535.

Abstract

Dexterous robotic manipulation tasks depend on estimating the state of in-hand objects, particularly their orientation. Although cameras have traditionally been used to estimate an object's pose, tactile sensors have recently been studied because of their robustness against occlusions. This paper explores the temporal information in tactile data for estimating the orientation of grasped objects. Data from a compliant tactile sensor were collected using different time-window sample sizes and evaluated with neural networks containing long short-term memory (LSTM) layers. Our results suggest that using a window of sensor readings improves angle estimation compared with previous work. The best window size of 40 samples achieved an average mean absolute error (MAE) of 0.0375 rad, a mean squared error (MSE) of 0.0030, a coefficient of determination (R²) of 0.9074, and an explained variance score (EXP) of 0.9094, with no improvement for larger window sizes. This work illustrates the benefits of temporal information for pose estimation and analyzes how performance varies with window size, providing a basis for future research in robotic tactile sensing. Moreover, the approach can complement underactuated hand designs and visual pose estimation methods.
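
The paper itself includes no code, but the pipeline the abstract describes (sliding windows of tactile readings fed to an LSTM regressor, scored with MAE, MSE, R², and explained variance) can be sketched as below. Everything in this sketch is an illustrative assumption rather than the authors' implementation: the channel count N_TAXELS, the single-layer LSTM with 64 hidden units, the toy data, and the brief training loop. Only the window size of 40 samples and the four evaluation metrics come from the abstract.

    # Minimal sketch of a windowed-LSTM angle regressor, per the abstract.
    # All architecture and data details below are illustrative assumptions,
    # not the authors' implementation.
    import numpy as np
    import torch
    import torch.nn as nn
    from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                                 r2_score, explained_variance_score)

    WINDOW = 40      # best-performing window size reported in the abstract
    N_TAXELS = 16    # assumed number of tactile channels per reading

    def make_windows(readings, angles, window=WINDOW):
        """Slice a tactile time series (T, N_TAXELS) into overlapping
        windows, pairing each window with the angle at its final sample."""
        X = np.stack([readings[i:i + window]
                      for i in range(len(readings) - window)])
        y = angles[window:]
        return X.astype(np.float32), y.astype(np.float32)

    class AngleLSTM(nn.Module):
        """LSTM over a window of tactile readings, regressing one angle (rad)."""
        def __init__(self, n_inputs=N_TAXELS, hidden=64):
            super().__init__()
            self.lstm = nn.LSTM(n_inputs, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):          # x: (batch, WINDOW, N_TAXELS)
            out, _ = self.lstm(x)
            return self.head(out[:, -1]).squeeze(-1)  # last time step only

    # Toy data standing in for real sensor logs.
    rng = np.random.default_rng(0)
    readings = rng.normal(size=(1000, N_TAXELS))
    angles = rng.uniform(-0.5, 0.5, size=1000)
    X, y = make_windows(readings, angles)

    model = AngleLSTM()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    xb, yb = torch.from_numpy(X), torch.from_numpy(y)
    for _ in range(5):                 # brief training loop for illustration
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()

    # The four evaluation metrics reported in the abstract.
    with torch.no_grad():
        pred = model(xb).numpy()
    print("MAE:", mean_absolute_error(y, pred))
    print("MSE:", mean_squared_error(y, pred))
    print("R2 :", r2_score(y, pred))
    print("EXP:", explained_variance_score(y, pred))

Per the abstract, windows longer than 40 samples brought no further improvement, so in practice the window length would be treated as a hyperparameter trading estimation accuracy against input latency.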

Keywords: LSTM; object manipulation; pose estimation; sliding window; tactile sensing.

MeSH terms

  • Hand
  • Neural Networks, Computer
  • Robotic Surgical Procedures*
  • Robotics*
  • Touch

Grants and funding

This research was partially funded by a SEED grant from the Faculty of Science of the Memorial University of Newfoundland.