Learning-based robotic grasping: A review

Front Robot AI. 2023 Apr 4:10:1038658. doi: 10.3389/frobt.2023.1038658. eCollection 2023.

Abstract

As personalization technology increasingly orchestrates individualized shopping and marketing experiences in industries such as logistics, fast-moving consumer goods, and food delivery, these sectors require flexible solutions that can automate the grasping of unknown or unseen objects with little modification or downtime. Most solutions on the market are based on traditional object recognition and are therefore not suitable for grasping unknown objects with varying shapes and textures. Adequate learning policies enable robotic grasping to accommodate high-mix, low-volume manufacturing scenarios. In this paper, we review recent developments in learning-based robotic grasping techniques from a corpus of over 150 papers. Beyond summarizing the current achievements of researchers worldwide, we point out the gaps and challenges in AI-enabled grasping that hinder robotization in the aforementioned industries. In addition to covering 3D object segmentation and learning-based grasping benchmarks, we present a comprehensive market survey of tactile sensors and robot skin. Furthermore, we review the latest literature on how learning models can be trained on sensor feedback to provide valid inputs for assessing grasp stability. Finally, learning-based soft gripping is evaluated, as soft grippers can accommodate objects of various sizes and shapes and can even handle fragile objects. In general, robotic grasping achieves higher flexibility and adaptability when equipped with learning algorithms.

Keywords: high mix and low volume; learning policy; personalization; soft gripping; tactile sensing; versatile grasping.

Publication types

  • Review

Grants and funding

This research was supported by the Advanced Remanufacturing and Technology Center under Project No. C22-05-ARTC (AI-Enabled Versatile Grasping) and by the Robotic HTPO Seed Fund (Award C211518007).