An Empirical Evaluation of the Calibration of Auditory Distance Perception under Different Levels of Virtual Environment Visibilities

Wan Yi Lin*, Rohith Venkatakrishnan, Roshan Venkatakrishnan, Sabarish V. Babu, Christopher Pagano, Wen Chieh Lin

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The perception of distance is a complex process that often involves sensory information beyond vision alone. In this work, we investigated whether depth perception based on auditory information can be calibrated, a process by which the perceptual accuracy of depth judgments is improved by providing feedback and then performing corrective actions. We further investigated whether perceptual learning, through carryover effects of calibration, occurs under different levels of a virtual environment's visibility produced by different levels of virtual lighting. Users performed an auditory depth judgment task over several trials in which they walked to where they perceived a sound source to be, yielding absolute estimates of perceived distance. The task was performed in three sequential phases: pretest, calibration, and posttest. Feedback on the perceptual accuracy of distance estimates was provided only in the calibration phase, allowing us to study the calibration of auditory depth perception. We employed a 2 (Visibility of virtual environment) × 3 (Phase) × 5 (Target Distance) multi-factorial design, manipulating phase and target distance as within-subjects factors and the visibility of the virtual environment as a between-subjects factor. Our results revealed that users generally underestimate aurally perceived distances in VR, similar to the distance compression effects that commonly occur in visual distance perception in VR. We found that auditory depth estimates, obtained using an absolute measure, can be calibrated to become more accurate through feedback and corrective action. In terms of environment visibility, we found that environments visible enough to reveal their extent may contain visual information that users attune to when scaling aurally perceived depth.

Original language: English
Title of host publication: Proceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 690-700
Number of pages: 11
ISBN (Electronic): 9798350374025
DOIs
State: Published - 2024
Event: 31st IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024 - Orlando, United States
Duration: 16 Mar 2024 – 21 Mar 2024

Publication series

Name: Proceedings - 2024 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024

Conference

Conference: 31st IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2024
Country/Territory: United States
City: Orlando
Period: 16/03/24 – 21/03/24

Keywords

  • Auditory Distance Perception
  • Perceptual Learning and Calibration
  • Virtual Reality
