Navigation via Acoustical Cues through Enclosed Spaces without Visual Information
Abstract:
Awareness of space, and the subsequent orientation and navigation within rooms, is dominated by the visual system. However, humans can extract auditory information about their surroundings from the way external sounds are shaped by surfaces and objects, which generate early reflections and reverberation in enclosed spaces. To better understand spatial awareness and navigation based on acoustic cues without additional visual information, a real-time virtual acoustic environment was presented over a three-dimensional, 86-channel loudspeaker system. Listeners were asked to navigate through three virtual corridor layouts (I-, U-, and Z-shaped) using only acoustic cues while wearing a head-mounted display, which showed no visual information about the geometry (a neutral black screen), except in a control condition with vision. Subjects were seated on a rotating chair in the center of the loudspeaker array and navigated by "teleporting" in steps through the environment. Acoustic information about the environment originated from virtual sound sources placed at the collision point of a virtual ray with the boundaries. The ray was cast either in the direction of the head, in the direction of the hand, or in a rotating, "radar"-like fashion in 90° steps to all sides. Time to completion, number of collisions, and movements in the virtual environment were tracked and evaluated.
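The ray-casting scheme described above can be illustrated with a minimal sketch: a ray is cast from the listener (in the head or hand direction, or rotating in 90° steps), and a virtual sound source is placed where the ray first hits a boundary. The 2-D geometry, function names, and the simple rectangular room below are illustrative assumptions, not the authors' implementation.

```python
import math

def ray_hit(origin, angle, walls):
    """Return the nearest intersection of a ray with a set of wall segments.

    origin: (x, y) listener position
    angle:  ray direction in radians
    walls:  list of ((x1, y1), (x2, y2)) wall segments
    """
    dx, dy = math.cos(angle), math.sin(angle)
    best_t, best_point = None, None
    for (x1, y1), (x2, y2) in walls:
        ex, ey = x2 - x1, y2 - y1            # wall direction vector
        denom = dx * ey - dy * ex
        if abs(denom) < 1e-12:               # ray parallel to this wall
            continue
        # Solve origin + t*(dx, dy) = (x1, y1) + s*(ex, ey)
        t = ((x1 - origin[0]) * ey - (y1 - origin[1]) * ex) / denom
        s = ((x1 - origin[0]) * dy - (y1 - origin[1]) * dx) / denom
        if t > 1e-9 and 0.0 <= s <= 1.0 and (best_t is None or t < best_t):
            best_t = t
            best_point = (origin[0] + t * dx, origin[1] + t * dy)
    return best_point  # virtual sound source position, or None

def radar_sources(origin, heading, walls):
    """'Radar' mode: cast rays in 90-degree steps to all four sides."""
    return [ray_hit(origin, heading + k * math.pi / 2, walls)
            for k in range(4)]
```

For a listener at the center of a 4 m x 4 m room facing along the x-axis, `radar_sources` yields one virtual source on each of the four walls, 2 m away in each cardinal direction.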