Human-to-Robot Handover Control of an Autonomous Mobile Robot Based on Hand-Masked Object Pose Estimation

Yu Yun Huang, Kai Tai Song*

*Corresponding author of this work

Research output: Article › peer-reviewed

Abstract

This letter presents a human-to-robot handover design for an Autonomous Mobile Robot (AMR). The developed control system enables the AMR to navigate to a specific person and grasp the object that the person wants to hand over. This letter proposes a motion planning algorithm for grasping an unseen object held in the hand. Through hand detection and segmentation, the hand region is masked and removed from the acquired depth image, which is then used to estimate the object pose for grasping. For grasp pose determination, we propose adding the Convolutional Block Attention Module (CBAM) to the Generative Grasping Convolutional Neural Network (GGCNN) model to enhance the recognition rate. For the object-grasp task, the AMR localizes the object in the person's hand and uses a Model Predictive Control (MPC)-based controller to simultaneously control the mobile base and the manipulator to grasp the object. A laboratory-developed mobile manipulator, equipped with a 6-DoF TM5M-900 robot arm, is used for experimental verification. The experimental results show an average handover success rate of 81% for five different objects.
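As a rough illustration of the hand-masking step described in the abstract, the Python sketch below removes a segmented hand region from a depth image before object pose estimation. It is a minimal sketch under stated assumptions: the depth image is aligned to the color frame, a binary hand mask comes from some hand detection/segmentation module, and the function name, mask dilation width, and zero-as-invalid depth convention are hypothetical choices, not values taken from the paper.

```python
import numpy as np
import cv2


def mask_hand_from_depth(depth_m: np.ndarray, hand_mask: np.ndarray) -> np.ndarray:
    """Remove the segmented hand region from an aligned depth image.

    depth_m   : HxW float32 depth image in metres, aligned to the color frame
    hand_mask : HxW uint8 binary mask (nonzero where the hand was segmented)
    Returns a copy of the depth image with hand pixels set to 0 (treated as
    "no measurement"), so that only the held object remains for pose estimation.
    """
    # Slightly dilate the hand mask so finger edges do not leak into the
    # remaining object region (kernel size is an assumed tuning parameter).
    kernel = np.ones((5, 5), np.uint8)
    dilated = cv2.dilate(hand_mask, kernel, iterations=2)

    masked_depth = depth_m.copy()
    masked_depth[dilated > 0] = 0.0
    return masked_depth
```

In the system described by the letter, the hand-masked depth image would then be passed to the CBAM-augmented GGCNN to predict the grasp pose of the held object; the cropping and preprocessing details around that step are not specified here.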

Original language: English
Pages (from-to): 7851-7858
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 9
Issue number: 9
DOIs
Publication status: Published - 2024
