Abstract
In this paper, we present a prototype system that associates real objects with virtual models and turns the tabletop into an imaginary virtual scene. A user can interact with these objects while immersed in the virtual environment. To accomplish this goal, we develop a vision-based system that recognizes and tracks the real objects in the scene online. The corresponding virtual models are retrieved based on their tactile shapes, then displayed and moved on a head-mounted display (HMD) according to the tracked object poses. The experiment demonstrates that our prototype system finds reasonable associations between real and virtual objects, and that users are interested in the novel interaction.
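To make the pipeline concrete, below is a minimal sketch of the pose-driven update step implied by the abstract: a tracked real object is mapped to a virtual model, and that model's transform follows the tracked 6-DoF pose before rendering on the HMD. The tracker, the shape-based retrieval table, and all names here are hypothetical placeholders for illustration, not the authors' implementation.

```python
# Hypothetical sketch: virtual models follow tracked real-object poses.
# Tracker, retrieval catalog, and renderer are placeholders, not the paper's code.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class VirtualModel:
    name: str
    transform: np.ndarray = field(default_factory=lambda: np.eye(4))

def retrieve_model(shape_id: str) -> VirtualModel:
    """Stand-in for shape-based retrieval: map a recognized shape to a model."""
    catalog = {"mug": "virtual_teapot", "box": "virtual_tower"}  # assumed mapping
    return VirtualModel(catalog.get(shape_id, "virtual_placeholder"))

def track_objects(frame) -> dict:
    """Stand-in for the vision-based tracker: returns shape_id -> 4x4 pose."""
    pose = np.eye(4)
    pose[:3, 3] = [0.1, 0.0, 0.5]  # dummy translation of the object on the tabletop
    return {"mug": pose}

def update_scene(frame, scene: dict) -> None:
    """One frame of the loop: update each associated model from its tracked pose."""
    for shape_id, pose in track_objects(frame).items():
        model = scene.setdefault(shape_id, retrieve_model(shape_id))
        model.transform = pose  # the virtual model moves with the real object

scene = {}
update_scene(frame=None, scene=scene)  # a single simulated frame
print(scene["mug"].name, scene["mug"].transform[:3, 3])
```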
Original language | English |
---|---|
Title of host publication | SIGGRAPH Asia 2021 Posters |
Pages | 1-2 |
Number of pages | 2 |
DOIs | |
State | Published - Dec 2021 |