Abstract
Recently, several robotic wheelchairs with autonomous functions have been proposed. An important design goal for such wheelchairs is reducing the load on the accompanist, which requires the mobile robot to recognize and track people. In this paper, we propose multisensor data fusion to track a target accompanist. First, simultaneous localization and mapping is performed recursively with an extended Kalman filter using a laser range finder (LRF) and inertial sensors. To track the target person robustly, the accompanist is tracked by fusing laser and vision data: human objects are detected by the LRF, and the accompanist's identity is recognized by a pan-tilt-zoom (PTZ) camera that matches a pre-defined visual signature using the speeded-up robust features (SURF) algorithm. The proposed system adaptively searches for the visual signature and tracks the accompanist by dynamically zooming the PTZ camera based on LRF detection results, enlarging the range of human following. Experimental results demonstrate the performance of the proposed system.
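The recursive EKF loop mentioned in the abstract can be sketched as a predict step driven by inertial/odometry input followed by a correction step driven by an LRF observation. This is a minimal illustrative sketch, not the paper's implementation: the unicycle motion model, the range-bearing landmark measurement, and all noise parameters are assumptions introduced here.

```python
import numpy as np

def ekf_step(x, P, u, z, landmark, Q, R, dt=0.1):
    """One EKF predict/update cycle for a 2D pose state x = [px, py, theta].

    u        -- (v, w): linear and angular velocity from inertial/odometry data
    z        -- [range, bearing] LRF observation of a known landmark
    landmark -- landmark position [lx, ly] in the map frame
    Q, R     -- assumed process and measurement noise covariances
    """
    v, w = u
    px, py, th = x

    # --- Predict: propagate the pose through the (assumed) unicycle model ---
    x_pred = np.array([px + v * np.cos(th) * dt,
                       py + v * np.sin(th) * dt,
                       th + w * dt])
    F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],   # Jacobian of the motion model
                  [0.0, 1.0,  v * np.cos(th) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q

    # --- Update: correct with a range-bearing observation of the landmark ---
    dx, dy = landmark[0] - x_pred[0], landmark[1] - x_pred[1]
    q = dx * dx + dy * dy
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x_pred[2]])
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [ dy / q,          -dx / q,         -1.0]])
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)               # Kalman gain
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi       # normalize bearing innovation
    x_new = x_pred + K @ y
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```

In the paper's setting this cycle would run once per sensor frame, with inertial data feeding the prediction and LRF landmark matches feeding the correction; the full SLAM state would also augment landmark positions, which this pose-only sketch omits.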
Original language | English |
---|---|
Article number | 6974238 |
Pages (from-to) | 2138-2143 |
Number of pages | 6 |
Journal | Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics |
Volume | 2014-January |
Issue number | January |
DOIs | |
State | Published - 1 Jan 2014 |
Event | 2014 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2014 - San Diego, United States (Duration: 5 Oct 2014 → 8 Oct 2014) |
Keywords
- Extended Kalman filter
- Laser range finder
- Pan-Tilt-Zoom (PTZ) camera
- SLAM