Human-to-Robot Handover Control of an Autonomous Mobile Robot Based on Hand-Masked Object Pose Estimation

Yu Yun Huang, Kai Tai Song*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This letter presents a human-to-robot handover design for an Autonomous Mobile Robot (AMR). The developed control system enables the AMR to navigate to a specific person and grasp the object that the person wants to hand over. This letter proposes a motion planning algorithm for grasping an unseen object held in a person's hand. Through hand detection and segmentation, the hand region is masked and removed from the acquired depth image, which is then used to estimate the object pose for grasping. For grasp pose determination, we propose adding the Convolutional Block Attention Module (CBAM) to the Generative Grasping Convolutional Neural Network (GGCNN) model to enhance the recognition rate. For the object-grasp task, the AMR localizes the object in the person's hand and uses a Model Predictive Control (MPC)-based controller to simultaneously control the mobile base and the manipulator to grasp the object. A laboratory-developed mobile manipulator, equipped with a 6-DoF TM5M-900 robot arm, is used for experimental verification. The experimental results show an average handover success rate of 81% across five different objects.
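The hand-masking step described in the abstract can be sketched in a few lines: given a depth image and a binary hand mask (e.g., from a hand segmentation network), the hand pixels are invalidated so that only the held object's depth points remain for pose estimation. This is a minimal illustration of the idea, not the paper's implementation; the function name and the zero-depth convention for invalid pixels are assumptions.

```python
import numpy as np

def mask_hand_from_depth(depth, hand_mask):
    """Remove hand pixels from a depth image before object pose estimation.

    depth     : (H, W) float array, depth in meters (0.0 = invalid pixel)
    hand_mask : (H, W) bool array, True where a hand was segmented
    Returns a copy of `depth` with hand pixels set to 0.0 (invalid),
    leaving only the held object's points for grasp pose estimation.
    """
    masked = depth.copy()
    masked[hand_mask] = 0.0
    return masked

# Toy example: a 4x4 depth image with a "hand" covering the left two columns.
depth = np.full((4, 4), 0.5)
hand = np.zeros((4, 4), dtype=bool)
hand[:, :2] = True
object_depth = mask_hand_from_depth(depth, hand)
```

After masking, the remaining valid depth points can be fed to the grasp pose estimator (here, the CBAM-augmented GGCNN) without the hand geometry biasing the predicted grasp.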

Original language: English
Pages (from-to): 7851-7858
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 9
Issue number: 9
DOIs
State: Published - 2024

Keywords

  • Collaborative robots in manufacturing
  • mobile manipulation
  • pose estimation
  • task and motion planning
