Lost in style: Gaze-driven Adaptive Aid for VR Navigation

Rawan Alghofaili, Yasuhito Sawahata, Haikun Huang, Hsueh-Cheng Wang, Takaaki Shiratori, Lap Fai Yu

Research output: Paper › peer-review

27 Citations (Scopus)

Abstract

A key challenge for virtual reality level designers is striking a balance between maintaining the immersiveness of VR and providing users with on-screen aids after designing a virtual experience. These aids are often necessary for wayfinding in virtual environments with complex paths. We introduce a novel adaptive aid that maintains the effectiveness of traditional aids while equipping designers and users with control over how often help is displayed. Our adaptive aid uses gaze patterns to predict a user's need for navigation aid in VR and displays mini-maps or arrows accordingly. Using a dataset of gaze angle sequences of users navigating a VR environment, together with markers of when users requested aid, we trained an LSTM to classify a user's gaze sequences as needing navigation help and display an aid accordingly. We validated the efficacy of the adaptive aid for wayfinding compared to other commonly used wayfinding aids.
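
The following is a minimal sketch, not the authors' implementation, of the classification step the abstract describes: an LSTM that maps a window of gaze-angle samples to a binary "needs aid" decision. The framework (PyTorch), the (yaw, pitch) input encoding, the window length, the hidden size, and the 0.5 decision threshold are all illustrative assumptions; only the idea of an LSTM over gaze-angle sequences comes from the abstract.

```python
# Minimal sketch (not the authors' code): an LSTM that classifies a window of
# gaze-angle samples as "needs navigation aid" vs. "does not". All sizes and
# the threshold below are assumptions; the paper does not specify them.
import torch
import torch.nn as nn

class GazeAidClassifier(nn.Module):
    def __init__(self, input_dim=2, hidden_dim=64):
        super().__init__()
        # input_dim=2 assumes each sample is a (yaw, pitch) gaze-angle pair.
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, gaze_seq):
        # gaze_seq: (batch, time, input_dim) window of gaze angles.
        _, (h_n, _) = self.lstm(gaze_seq)
        # Probability that the user currently needs a wayfinding aid.
        return torch.sigmoid(self.head(h_n[-1]))

model = GazeAidClassifier()
window = torch.randn(1, 120, 2)   # e.g. ~2 s of gaze angles at 60 Hz (assumed)
if model(window).item() > 0.5:    # threshold is an assumption
    print("show mini-map / arrow aid")
```

In the paper's setting, a positive prediction would trigger display of a mini-map or arrow aid, keeping the aid hidden otherwise to preserve immersion.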

Original language: American English
DOIs
Publication status: Published - 2 May 2019
Event: 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019 - Glasgow, United Kingdom
Duration: 4 May 2019 → 9 May 2019

Conference

Conference: 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019
Country/Territory: United Kingdom
City: Glasgow
Period: 4/05/19 → 9/05/19
