Interactive robot teaching based on finger trajectory using multimodal RGB-D-T-data

Front Robot AI. 2023 Mar 16:10:1120357. doi: 10.3389/frobt.2023.1120357. eCollection 2023.

Abstract

The concept of Industry 4.0 is changing industrial manufacturing patterns toward greater efficiency and flexibility. In response to this trend, efficient robot teaching approaches that avoid complex programming have become a popular research direction. We therefore propose an interactive finger-touch-based robot teaching scheme using multimodal image processing of color (RGB), thermal (T), and point-cloud (3D) data. The heat trace left by the finger touching the object surface is analyzed across these modalities in order to precisely identify the true hand/object contact points, which are then used to compute the robot path directly. To optimize the identification of the contact points, we propose a calculation scheme based on a number of anchor points, which are first predicted by hand/object point-cloud segmentation. A probability density function is then defined to calculate the prior probability distribution of the true finger trace, and the temperature in the neighborhood of each anchor point is dynamically analyzed to obtain the likelihood. Experiments show that the trajectories estimated by our multimodal method are significantly more accurate and smoother than those obtained by analyzing only the point cloud and the static temperature distribution.
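The prior/likelihood fusion described above can be illustrated with a minimal sketch. This is not the authors' implementation: the Gaussian forms, the assumed ~3 °C heat-trace temperature rise, the spatial scale `sigma_d`, and the function and variable names are all illustrative assumptions; the point is only to show how a spatial prior over anchor points can be combined with a temperature-based likelihood to pick the most probable true contact point.

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian density, used here for both prior and likelihood (assumed form)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def select_contact_point(anchors, prev_point, ambient_temp, sigma_d=5.0, sigma_t=1.5):
    """Pick the most probable contact point among candidate anchor points.

    anchors: list of (x, y, z, local_temp) candidates (units assumed: mm, deg C),
             e.g. from hand/object point-cloud segmentation.
    prev_point: previously identified contact point on the finger trace.
    """
    best, best_post = None, -1.0
    for x, y, z, temp in anchors:
        # Prior: a finger trace is spatially smooth, so candidates near the
        # previous contact point are a priori more probable.
        dist = math.dist((x, y, z), prev_point)
        prior = gaussian(dist, 0.0, sigma_d)
        # Likelihood: a true touch leaves a heat trace, i.e. a local temperature
        # a few degrees above ambient (assumed ~3 deg C rise here).
        likelihood = gaussian(temp - ambient_temp, 3.0, sigma_t)
        post = prior * likelihood  # unnormalized posterior
        if post > best_post:
            best, best_post = (x, y, z), post
    return best

# Toy usage: three candidates near a previous contact point at (11, 0, 0), ambient 25 deg C.
anchors = [(10.0, 0.0, 0.0, 25.1),   # near, but cool: probably no contact
           (12.0, 1.0, 0.0, 28.2),   # near and warm: probably the true contact
           (40.0, 5.0, 0.0, 28.0)]   # warm, but far off the trace
print(select_contact_point(anchors, (11.0, 0.0, 0.0), 25.0))  # -> (12.0, 1.0, 0.0)
```

In this toy fusion, neither cue alone suffices: the nearest candidate is too cool and the warmest distant candidate violates trace smoothness, so the posterior selects the candidate that is both near the trace and warmed by touch.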

Keywords: RGB-D-T-data; finger trajectory recognition; meshless finite difference solution; multimodal image processing; point cloud processing; robot teaching.

Grants and funding

This research was funded by Group for Quality Assurance and Industrial Image Processing, Technische Universität Ilmenau.