Omnidirectional Walking Pattern Generator Combining Virtual Constraints and Preview Control for Humanoid Robots

Front Robot AI. 2021 Jun 1;8:660004. doi: 10.3389/frobt.2021.660004. eCollection 2021.

Abstract

This paper presents a novel omnidirectional walking pattern generator (WPG) for bipedal locomotion that combines two structurally different approaches, based on virtual constraints and preview control theory, to generate a flexible gait that can be modified on-line. The proposed strategy synchronizes the displacement of the robot along the two planes of walking: zero moment point (ZMP) based preview control is responsible for the lateral component of the gait, while the sagittal motion is generated by a more dynamical approach based on virtual constraints. The resulting algorithm is characterized by low computational complexity and high flexibility, requirements for successful deployment on humanoid robots operating in real-world scenarios. This solution is motivated by observations from biomechanics showing that, during nominal gait, the dynamic motion of human walking is generated mainly in the sagittal plane. We describe the implementation of the algorithm and detail the strategy chosen to enable omnidirectionality and on-line gait tuning. We then validate our strategy through simulation experiments using the COMAN+ platform, an adult-size humanoid robot developed at Istituto Italiano di Tecnologia. Finally, the hybrid walking pattern generator is implemented on real hardware with promising results: the WPG trajectories result in open-loop stable walking in the absence of external disturbances.
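To illustrate the ZMP-tracking idea behind the lateral component of the gait, the sketch below implements a receding-horizon (least-squares) variant of preview control on the standard cart-table model. This is not the authors' implementation; the COM height, sampling period, horizon length, weights, and the toy square-wave ZMP reference are all illustrative assumptions.

import numpy as np

# Cart-table model of the COM along one axis (here: lateral/y).
# State x = [position, velocity, acceleration], input u = COM jerk.
# ZMP output: p = c - (z_c / g) * c_ddot.
# NOTE: all numbers below (COM height, timing, horizon, weights) are
# illustrative assumptions, not values taken from the paper.
T   = 0.01          # control period [s]
z_c = 0.9           # assumed constant COM height [m]
g   = 9.81          # gravity [m/s^2]
N   = 150           # preview horizon (1.5 s of future ZMP reference)

A = np.array([[1, T, T**2 / 2],
              [0, 1, T       ],
              [0, 0, 1       ]])
B = np.array([[T**3 / 6], [T**2 / 2], [T]])
C = np.array([[1.0, 0.0, -z_c / g]])

# Prediction matrices: stacked future ZMP = Px @ x0 + Pu @ U
Px = np.zeros((N, 3))
Pu = np.zeros((N, N))
Ai = A.copy()
for i in range(N):
    Px[i, :] = (C @ Ai).ravel()
    Aj = np.eye(3)
    for j in range(i, -1, -1):
        Pu[i, j] = (C @ Aj @ B).item()   # C A^(i-j) B
        Aj = A @ Aj
    Ai = A @ Ai

def preview_step(x, zmp_ref_future, jerk_weight=1e-6):
    """One receding-horizon update: returns the COM jerk to apply now.

    Minimizes ||Px x + Pu U - zmp_ref||^2 + jerk_weight ||U||^2 over the
    horizon, a least-squares formulation of ZMP preview tracking.
    """
    H = Pu.T @ Pu + jerk_weight * np.eye(N)
    rhs = Pu.T @ (zmp_ref_future - Px @ x)
    U = np.linalg.solve(H, rhs)
    return U[0]

# Toy usage: track a square-wave lateral ZMP reference (alternating feet).
steps = 600
zmp_ref = 0.08 * np.sign(np.sin(2 * np.pi * np.arange(steps + N) * T / 0.8))
x = np.zeros(3)
for k in range(steps):
    u = preview_step(x, zmp_ref[k:k + N])
    x = A @ x + B.ravel() * u       # integrate the cart-table model

Re-solving the same small least-squares problem at every control tick is what gives the preview behavior: the COM starts shifting before each ZMP transition because the future reference is already visible inside the horizon.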

Keywords: bipedal locomotion; humanoid robots; motion control; walking pattern generation; whole-body control.