Robot Imitation Learning of Social Gestures with Self-Collision Avoidance Using a 3D Sensor

Sensors (Basel). 2018 Jul 20;18(7):2355. doi: 10.3390/s18072355.

Abstract

To interact effectively with people, social robots need to perceive human behaviors and, in turn, display their own behaviors using social communication modes such as gestures. Modeling gestures can be difficult due to the high dimensionality of the robot configuration space. Imitation learning can be used to teach a robot to perform multi-jointed arm gestures by directly observing a human teacher's arm movements (for example, with a non-contact 3D sensor) and then mapping these movements onto the robot's arms. In this paper, we present a novel imitation learning system with robot self-collision awareness and avoidance. The proposed method uses a kinematic approach with bounding volumes to detect and avoid self-collisions while the robot performs gestures. We conducted experiments with a dual-arm social robot and a 3D sensor to evaluate the effectiveness of our imitation system in mimicking gestures while avoiding self-collisions.
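The abstract does not include code, so the following is only a minimal sketch of the general idea of bounding-volume self-collision checking, not the authors' implementation: each arm link is approximated by spheres placed along the link, and a candidate imitated pose is rejected if spheres from the two arms overlap. The link endpoints, sphere radius, and function names are illustrative assumptions; in a full system the endpoints would come from the robot's forward kinematics for the joint angles produced by mapping the observed human pose.

import numpy as np

# Sketch: sphere-based bounding volumes for self-collision checking.
# All geometry below (link endpoints, radii) is hypothetical.

def link_spheres(p_start, p_end, radius, n=4):
    """Return (center, radius) spheres sampled along a link segment."""
    ts = np.linspace(0.0, 1.0, n)
    return [(p_start + t * (p_end - p_start), radius) for t in ts]

def spheres_collide(spheres_a, spheres_b):
    """True if any sphere in one set overlaps any sphere in the other."""
    for ca, ra in spheres_a:
        for cb, rb in spheres_b:
            if np.linalg.norm(ca - cb) < ra + rb:
                return True
    return False

def self_collision(left_arm_links, right_arm_links, radius=0.05):
    """Check a candidate pose; each arm is a list of (start, end) link segments."""
    left = [s for a, b in left_arm_links for s in link_spheres(a, b, radius)]
    right = [s for a, b in right_arm_links for s in link_spheres(a, b, radius)]
    return spheres_collide(left, right)

if __name__ == "__main__":
    # Hypothetical link endpoints (in meters) for a candidate imitated pose.
    left_links = [(np.array([0.0, 0.2, 1.0]), np.array([0.3, 0.15, 1.0])),
                  (np.array([0.3, 0.15, 1.0]), np.array([0.55, 0.05, 1.0]))]
    right_links = [(np.array([0.0, -0.2, 1.0]), np.array([0.3, -0.1, 1.0])),
                   (np.array([0.3, -0.1, 1.0]), np.array([0.5, 0.02, 1.0]))]
    if self_collision(left_links, right_links):
        print("Reject or adjust pose: predicted self-collision")
    else:
        print("Pose is collision-free under the sphere approximation")

In such a scheme, the check would run on every candidate joint configuration before it is sent to the arms, so that colliding gestures can be adjusted or discarded.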

Keywords: imitation learning; multi-jointed arm gestures; self-collision avoidance; social robots.