V-Eye: A Vision-Based Navigation System for the Visually Impaired

Ping Jung Duh, Yu Cheng Sung, Liang Yu Fan Chiang, Yung Ju Chang, Kuan Wen Chen*

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review


    Abstract

    Numerous systems for helping visually impaired people navigate in unfamiliar places have been proposed. However, few can detect and warn about moving obstacles, provide correct orientation in real time, or support navigation between indoor and outdoor spaces. Accordingly, this paper proposes V-Eye, which fulfills these needs by utilizing a novel global localization method (VB-GPS) and image-segmentation techniques to achieve better scene understanding with a single camera. Our experiments establish that the proposed system can reliably provide precise location and orientation information (with median errors of approximately 0.27 m and 0.95°), detect unpredictable obstacles, and support navigation both within and between indoor and outdoor environments. The results of a user-experience study of V-Eye further indicate that it not only helped the participants with navigation but also improved their awareness of obstacles, enhanced their spatial awareness more generally, and led them to feel more secure and independent while walking.

    Original language: English
    Article number: 9113751
    Pages (from-to): 1567-1580
    Number of pages: 14
    Journal: IEEE Transactions on Multimedia
    Volume: 23
    DOIs
    State: Published - 2021

    Keywords

    • global localization
    • navigation system
    • scene understanding
    • user study
    • visually impaired
