iCub3 avatar system: Enabling remote fully immersive embodiment of humanoid robots

Sci Robot. 2024 Jan 24;9(86):eadh3834. doi: 10.1126/scirobotics.adh3834. Epub 2024 Jan 24.

Abstract

We present an avatar system designed to facilitate the embodiment of humanoid robots by human operators, validated through iCub3, a humanoid developed at the Istituto Italiano di Tecnologia. More precisely, the paper makes two contributions. First, we present the humanoid iCub3 as a robotic avatar that integrates the most significant improvements from about 15 years of development of the iCub series. Second, we present a versatile avatar system that enables humans to embody humanoid robots, covering locomotion, manipulation, voice, and facial expressions, with comprehensive sensory feedback spanning the visual, auditory, haptic, weight, and touch modalities. We validated the system by implementing several instances of the avatar architecture, each tailored to specific requirements. First, we evaluated the architecture optimized for verbal, nonverbal, and physical interaction with a remote recipient. In this test, the operator was located in Genoa and the avatar at the Biennale di Venezia in Venice, about 290 kilometers away, allowing the operator to visit the Italian art exhibition remotely. Second, we evaluated the architecture optimized for physical collaboration with a recipient and for public engagement on stage, live, at the We Make Future show, a prominent digital innovation festival. In this instance, the operator was located in Genoa while the avatar operated in Rimini, about 300 kilometers away, interacting with a recipient who entrusted the avatar with a payload to carry on stage before an audience of approximately 2000 spectators. Third, we present the architecture implemented by the iCub Team for the All Nippon Airways (ANA) Avatar XPrize competition.
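
The abstract describes a bidirectional pipeline: operator motion, voice, and expressions are retargeted to the avatar, while visual, auditory, haptic, weight, and touch feedback flow back to the operator. The sketch below is purely illustrative and not taken from the paper; all class and field names (OperatorCommand, AvatarFeedback, and the placeholder retargeting and simulation functions) are hypothetical stand-ins for the kinds of command and feedback channels such a system must carry.

```python
# Minimal, illustrative sketch of a bidirectional avatar teleoperation exchange.
# All names and fields are hypothetical; the actual iCub3 system is not reproduced here.

from dataclasses import dataclass
import time


@dataclass
class OperatorCommand:
    """Retargeted operator state streamed to the avatar (hypothetical fields)."""
    timestamp: float
    base_velocity: tuple      # commanded locomotion velocity (vx, vy, yaw_rate)
    hand_poses: dict          # e.g. {"left": [...], "right": [...]} Cartesian targets
    facial_expression: str    # label driving the avatar's face (e.g. "smile")
    speaking: bool            # whether operator audio should be relayed


@dataclass
class AvatarFeedback:
    """Multimodal feedback returned to the operator station (hypothetical fields)."""
    timestamp: float
    contact_forces: dict      # per-hand force magnitudes for haptic rendering
    carried_weight: float     # estimated payload mass, rendered as weight feedback
    touch_active: bool        # tactile-contact flag for touch feedback


def retarget_operator_state(t: float) -> OperatorCommand:
    """Placeholder for motion capture and retargeting on the operator side."""
    return OperatorCommand(
        timestamp=t,
        base_velocity=(0.1, 0.0, 0.0),
        hand_poses={"left": [0.3, 0.2, 1.0], "right": [0.3, -0.2, 1.0]},
        facial_expression="neutral",
        speaking=False,
    )


def simulate_avatar_step(cmd: OperatorCommand) -> AvatarFeedback:
    """Placeholder for the avatar executing the command and sensing its environment."""
    return AvatarFeedback(
        timestamp=cmd.timestamp,
        contact_forces={"left": 0.0, "right": 2.5},
        carried_weight=0.8,
        touch_active=True,
    )


if __name__ == "__main__":
    # One iteration of the command/feedback exchange; a real system would run
    # this loop continuously over a network link between the two remote sites.
    now = time.time()
    command = retarget_operator_state(now)
    feedback = simulate_avatar_step(command)
    print("feedback to operator:", feedback)
```

In a deployed system, each iteration of this loop would traverse a network link between the operator and avatar sites (roughly 290 to 300 kilometers apart in the reported validations), so latency handling and stream synchronization would dominate the design; the sketch omits those concerns.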

MeSH terms

  • Avatar*
  • Feedback, Sensory
  • Haptic Interfaces
  • Humans
  • Locomotion
  • Robotics*