Learning From Human Demonstrations for Wheel Mobile Manipulator: An Unscented Model Predictive Control Approach

IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):10864-10874. doi: 10.1109/TNNLS.2022.3171595. Epub 2023 Nov 30.

Abstract

Industry 4.0 requires new production models to be more flexible and efficient, which means that robots should possess flexible skills to adapt to different production and processing tasks. Learning from demonstration (LfD) is considered one of the most promising ways for robots to acquire motion and manipulation skills from humans. In this article, a framework that enables a wheel mobile manipulator to learn skills from humans and complete specified tasks in an unstructured environment is developed, comprising high-level trajectory learning and low-level trajectory tracking control. First, a modified dynamic movement primitives (DMPs) model is used to simultaneously learn the movement trajectories of a human operator's hand and body as reference trajectories for the mobile manipulator. Since the auxiliary model obtained by nonlinear feedback can hardly describe the behavior of the mobile manipulator accurately in the presence of uncertain parameters and disturbances, a novel model is established, and an unscented model predictive control (UMPC) strategy is then presented to solve the trajectory tracking control problem without violating the system constraints. Moreover, a sufficient condition guaranteeing the input-to-state practical stability (ISpS) of the system is obtained, and an upper bound on the estimation error is also given. Finally, the effectiveness of the proposed strategy is validated in three simulation experiments.
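The high-level learning stage builds on dynamic movement primitives. The abstract does not spell out the paper's modified DMP model, so the following Python sketch shows only the standard single-DoF discrete DMP it extends: a forcing term is recovered from one demonstrated trajectory and then integrated forward to reproduce the motion. The class name, gains, and basis count are illustrative assumptions, not values from the paper.

```python
import numpy as np

class MinimalDMP:
    """Minimal single-DoF discrete DMP (standard Ijspeert-style formulation).

    Illustrative sketch only; the paper's modified DMP, which learns hand
    and body trajectories jointly, is not reproduced here.
    """

    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_x=3.0):
        self.alpha, self.beta, self.alpha_x = alpha, beta, alpha_x
        # Basis centers follow the canonical variable x = exp(-alpha_x * t / tau).
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        self.h = 1.0 / (np.diff(self.c, append=self.c[-1] * 0.9) ** 2)
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        # Gaussian basis functions evaluated at phase x.
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y, dt):
        """Recover forcing-term weights from one demonstration y sampled at dt."""
        tau = len(y) * dt                         # movement duration
        yd = np.gradient(y, dt)
        ydd = np.gradient(yd, dt)
        y0, g = y[0], y[-1]
        x = np.exp(-self.alpha_x * np.arange(len(y)) * dt / tau)
        # Invert the transformation system to obtain the target forcing term.
        f_target = tau**2 * ydd - self.alpha * (self.beta * (g - y) - tau * yd)
        xi = x * (g - y0)                         # phase and goal scaling
        psi = self._psi(x[:, None])               # shape (T, n_basis)
        # Locally weighted regression: one weight per basis function.
        self.w = (psi * (xi * f_target)[:, None]).sum(0) / \
                 ((psi * (xi**2)[:, None]).sum(0) + 1e-10)
        self.y0, self.g, self.tau = y0, g, tau

    def rollout(self, dt, n_steps):
        """Integrate the DMP forward to generate a trajectory."""
        y, z, x, traj = self.y0, 0.0, 1.0, []
        for _ in range(n_steps):
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (self.g - self.y0)
            zd = (self.alpha * (self.beta * (self.g - y) - z) + f) / self.tau
            y += z / self.tau * dt
            z += zd * dt
            x += -self.alpha_x * x / self.tau * dt
            traj.append(y)
        return np.array(traj)

# Toy usage: learn a demonstrated hand trajectory and reproduce it.
t = np.linspace(0.0, 1.0, 200)
dmp = MinimalDMP()
dmp.fit(np.sin(np.pi * t), dt=1.0 / 200)
reproduction = dmp.rollout(dt=1.0 / 200, n_steps=200)
```

In the paper's framework, trajectories learned this way serve as the reference input to the low-level tracking controller.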
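The low-level UMPC strategy is likewise only summarized in the abstract. What gives such a controller its name is the unscented transform, which propagates the state distribution of an uncertain model through nonlinear dynamics so the predictive controller can account for uncertainty when enforcing constraints. The sketch below implements the standard unscented transform with common default parameters and applies it to a hypothetical differential-drive base model; the kinematics, noise values, and all names are assumptions, not the paper's formulation.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear map f via sigma points.

    Standard Julier/Uhlmann unscented transform with common default parameters.
    """
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)        # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])   # (2n+1, n) sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))      # mean weights
    wc = wm.copy()                                       # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])             # push points through f
    y_mean = wm @ Y
    d = Y - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov

# Hypothetical discrete-time kinematics of a differential-drive mobile base:
# state [x, y, theta], input [v, omega], step dt. Used here only to show how a
# predicted mean/covariance could feed an MPC cost with tightened constraints.
def step(state, u=np.array([0.5, 0.1]), dt=0.1):
    x, y, th = state
    v, w = u
    return np.array([x + v * np.cos(th) * dt,
                     y + v * np.sin(th) * dt,
                     th + w * dt])

m0 = np.array([0.0, 0.0, 0.0])
P0 = np.diag([0.01, 0.01, 0.005])                   # assumed initial uncertainty
m1, P1 = unscented_transform(m0, P0, step)          # one-step prediction
```

Repeating this prediction over the horizon yields the uncertainty-aware state forecasts that a UMPC scheme optimizes over, which is also where the paper's ISpS analysis and estimation-error bound come into play.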