Orbeez-SLAM: A Real-time Monocular Visual SLAM with ORB Features and NeRF-realized Mapping

Chi Ming Chung, Yang Che Tseng, Ya Ching Hsu, Xiang Qian Shi, Yun Hung Hua, Jia Fong Yeh, Wen Chin Chen, Yi Ting Chen, Winston H. Hsu

Research output: Conference contribution › peer-review

9 citations (Scopus)

Abstract

A spatial AI that can perform complex tasks through visual signals and cooperate with humans is highly anticipated. To achieve this, we need a visual SLAM that easily adapts to new scenes without pre-training and generates dense maps for downstream tasks in real time. None of the previous visual SLAMs, learning-based or otherwise, satisfies all of these needs due to the intrinsic limitations of their components. In this work, we develop a visual SLAM named Orbeez-SLAM, which combines an implicit neural representation with visual odometry to achieve our goals. Moreover, Orbeez-SLAM works with a monocular camera since it only needs RGB inputs, making it widely applicable to the real world. Results show that our SLAM is up to 800x faster than the strong baseline with superior rendering outcomes. Code link: https://github.com/MarvinChung/Orbeez-SLAM.

Original language: English
Title of host publication: Proceedings - ICRA 2023
Subtitle of host publication: IEEE International Conference on Robotics and Automation
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 9400-9406
Number of pages: 7
ISBN (Electronic): 9798350323658
DOIs
Publication status: Published - 2023
Event: 2023 IEEE International Conference on Robotics and Automation, ICRA 2023 - London, United Kingdom
Duration: 29 May 2023 – 2 Jun 2023

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
Volume: 2023-May
ISSN (Print): 1050-4729

Conference

Conference: 2023 IEEE International Conference on Robotics and Automation, ICRA 2023
Country/Territory: United Kingdom
City: London
Period: 29/05/23 → 2/06/23
