Application of multisensor fusion to develop a personal location and 3D mapping system

Ya Wen Hsu, Shiang Shuang Huang, Jau Woei Perng*

*Corresponding author for this work

Research output: Article › peer-review

13 Citations (Scopus)

Abstract

With the popularization of indoor pedestrian positioning systems in recent years, many 3D indoor map-building methods and stereoscopic maps for visualization have been developed. This study presents a human-portable 3D simultaneous localization and mapping (SLAM) system that adapts to various environmental scenarios. Building on RGB-D SLAM and IMU/laser SLAM, we propose a sensor fusion SLAM algorithm that combines the advantages of both and fuses a Microsoft Kinect, a Hokuyo laser range finder (LRF), and an inertial measurement unit (IMU) to implement 3D positioning and mapping. For positioning, IMU data are used to estimate the user's velocity and attitude. To correct the drift of the inertial sensor, the update step of the extended Kalman filter mainly relies on the displacement estimated from the Kinect. When the Kinect displacement estimate is judged to have failed, the walking velocity estimated from the LRF is used to update the filter instead. Finally, the colored point cloud extracted from the Kinect is used to build dense 3D environmental maps. Experimental results for three different scenarios show that the proposed algorithm outperforms the two baseline algorithms.
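The fusion scheme described in the abstract (IMU prediction, Kinect displacement update, LRF velocity fallback) can be illustrated with a minimal Kalman filter sketch. This is not the authors' implementation: the 1D state model, observation matrices, and noise values below are illustrative assumptions chosen only to show the predict/update/fallback flow.

```python
import numpy as np

def predict(x, P, a_imu, dt, Q):
    """Propagate state [position, velocity] with IMU acceleration."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    x = F @ x + B * a_imu
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def fuse_step(x, P, a_imu, dt, Q, kinect_pos=None, lrf_vel=None):
    """One fusion step: IMU prediction, then a Kinect position update
    when it is available; otherwise fall back to the LRF velocity update."""
    x, P = predict(x, P, a_imu, dt, Q)
    if kinect_pos is not None:
        H = np.array([[1.0, 0.0]])          # observe position
        x, P = update(x, P, np.array([kinect_pos]), H, np.eye(1) * 0.01)
    elif lrf_vel is not None:
        H = np.array([[0.0, 1.0]])          # observe velocity
        x, P = update(x, P, np.array([lrf_vel]), H, np.eye(1) * 0.05)
    return x, P
```

In the paper the state is the full 3D pose and the filter is an extended Kalman filter; the sketch keeps the same structure in one dimension, with the failure check on the Kinect estimate reduced to passing `kinect_pos=None`.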

Original language: English
Pages (from-to): 328-339
Number of pages: 12
Journal: Optik
Volume: 172
DOIs
Publication status: Published - November 2018
