TY - JOUR
T1 - Application of multisensor fusion to develop a personal location and 3D mapping system
AU - Hsu, Ya-Wen
AU - Huang, Shiang-Shuang
AU - Perng, Jau-Woei
N1 - Publisher Copyright:
© 2018 Elsevier GmbH
PY - 2018/11
Y1 - 2018/11
AB - With the popularization of indoor pedestrian positioning systems in recent years, many 3D indoor map-building methods and stereoscopic maps for visualization have been developed. This study presents a human-portable 3D simultaneous localization and mapping (SLAM) system that adapts to various environmental scenarios. Building on RGB-D SLAM and IMU/laser SLAM, we propose a sensor fusion SLAM algorithm that combines the advantages of these two approaches, fusing a Microsoft Kinect, a Hokuyo laser range finder (LRF), and an inertial measurement unit (IMU) to implement 3D positioning and mapping. For positioning, the IMU data are used to estimate the user's velocity and attitude. To correct drift from the inertial sensor, the update step of the extended Kalman filter depends mainly on the displacement estimated from the Kinect sensor. When the Kinect displacement estimate is judged to have failed, the walking velocity estimated from the LRF is used to update the filter instead. Finally, the colored point cloud extracted from the Kinect is used to build dense 3D environmental maps. Experimental results from three different scenarios show that the proposed fusion algorithm outperforms the other two algorithms.
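The abstract describes a Kalman-filter fusion scheme: IMU data drive the prediction step, Kinect-derived displacement drives the update step, and LRF-derived walking velocity is substituted when the Kinect estimate fails. The Python sketch below is a minimal, hypothetical 1D illustration of that predict/update fallback logic only; it is not the authors' implementation, and all noise parameters, matrices, and the validity flag are invented for illustration.

import numpy as np

# State x = [position, velocity]; a 1D toy model of the fusion in the abstract.
dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
B = np.array([[0.5 * dt**2], [dt]])     # IMU acceleration enters as a control input
Q = np.eye(2) * 1e-4                    # process noise (hypothetical value)

H_disp = np.array([[1.0, 0.0]])         # Kinect displacement, accumulated as a position fix
H_vel = np.array([[0.0, 1.0]])          # LRF fallback observes walking velocity
R_disp = np.array([[1e-3]])             # measurement noise (hypothetical values)
R_vel = np.array([[5e-3]])

def predict(x, P, accel):
    """Propagate state and covariance with the IMU acceleration measurement."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    y = z - H @ x
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

def step(x, P, accel, kinect_pos, kinect_ok, lrf_vel):
    x, P = predict(x, P, accel)
    if kinect_ok:                        # Kinect displacement judged valid
        x, P = update(x, P, np.array([[kinect_pos]]), H_disp, R_disp)
    else:                                # fall back to LRF walking velocity
        x, P = update(x, P, np.array([[lrf_vel]]), H_vel, R_vel)
    return x, P

x, P = np.zeros((2, 1)), np.eye(2)
x, P = step(x, P, accel=0.2, kinect_pos=0.0, kinect_ok=False, lrf_vel=1.1)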
KW - Extended Kalman filter
KW - Inertial measurement unit
KW - Kinect
KW - Laser range finder
KW - Simultaneous localization and mapping
UR - http://www.scopus.com/inward/record.url?scp=85049969943&partnerID=8YFLogxK
U2 - 10.1016/j.ijleo.2018.07.029
DO - 10.1016/j.ijleo.2018.07.029
M3 - Article
AN - SCOPUS:85049969943
SN - 0030-4026
VL - 172
SP - 328
EP - 339
JO - Optik
JF - Optik
ER -