Human Perception of Aural and Visual Disparity in Virtual Environments

Date

2018-09-13

Publisher

Virginia Tech

Abstract

With the advancement of technology over the past decade and the mitigation of virtual environments' former challenges, the need for further study of human interaction with these environments has grown apparent. The visual and interaction components of virtual reality applications have been studied comprehensively, but a lack of spatial audio fidelity leaves a need to understand how well humans can localize aural cues and discern audio-visual disparity in these virtual environments. To support the development of accurate and efficient levels of audio fidelity, a human study was conducted with 18 participants to determine how far the audio and visual components of a bimodal cue must be separated before the disparity becomes noticeable. As suspected, pairing a visual component with an auditory one biased localization toward the visual component. On average, participants noticed a disparity when the audio component was 33.7° from the visual one in azimuth. There was no significant evidence that the speed or direction of the audio component's displacement led to better localization performance by participants. Presence and prior experience had no effect on localization performance; however, a larger participant base may be needed to draw further conclusions. Participants' localization ability improved within a few practice rounds. Overall, performance in virtual reality paralleled augmented reality performance when a visual source biased sound localization, and this can serve as both a tool and a design constraint for virtual environment developers.
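
For context on the 33.7° figure, the short Python sketch below shows one way to compute the azimuthal disparity between an audio cue and a visual cue relative to a listener. The coordinate convention (listener at the origin, facing -z, with +x to the right) is an assumption chosen for illustration and is not taken from the thesis.

import math

def azimuth_deg(x, z):
    # Azimuth of a source in the listener's horizontal plane, in degrees.
    # Assumes a listener at the origin facing -z with +x to the right
    # (a common VR convention; this frame is an assumption, not from the study).
    return math.degrees(math.atan2(x, -z))

def azimuthal_disparity(audio_xz, visual_xz):
    # Smallest absolute azimuth separation between the two cues.
    d = azimuth_deg(*audio_xz) - azimuth_deg(*visual_xz)
    d = (d + 180.0) % 360.0 - 180.0  # wrap into (-180, 180]
    return abs(d)

visual = (0.0, -1.0)                          # visual cue straight ahead
theta = math.radians(33.7)                    # the abstract's mean threshold
audio = (math.sin(theta), -math.cos(theta))   # audio cue offset in azimuth
print(round(azimuthal_disparity(audio, visual), 1))  # -> 33.7

In this illustration, a visual cue directly ahead paired with an audio cue offset by the abstract's mean threshold yields a disparity the average participant would just notice.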

Keywords

Virtual Reality, Spatial Audio, Localization, Ambisonics
