Head-motion and Eye-gaze Behavior of Normal-hearing and Hearing-impaired Listeners in Complex Listening Environments
Abstract:
Listeners commonly move their head and eyes in response to an acoustic signal, either to improve spatial perception or as a social convention during conversation. However, it is unknown whether these movements follow systematic patterns and how such patterns depend on the complexity of the sound scene. Furthermore, hearing-impaired listeners might exhibit different patterns if they rely more strongly on visual information than normal-hearing listeners. Here, we present two experiments conducted in audiovisual virtual environments. In Experiment 1, the listeners’ task was to localize a speech source in a multi-talker scenario. Scene complexity was varied through the number of talkers and the amount of reverberation. In Experiment 2, listeners were asked to localize an audiovisual non-speech target in the presence of a varying number of distractors. In both experiments, response times increased with scene complexity. Analysis of the motion behavior revealed an increased use of head movements in scenes of higher complexity. Furthermore, while the number of saccades also increased with scene complexity, the saccade amplitudes remained constant. Thus, motion behavior might serve as a predictor of perceptual challenges.