Self-rotation behavior during a spatialized speech test in reverberation
Abstract:
Self-movement during a cocktail party has a strong effect on speech perception; however, speech testing has traditionally been done with static sounds and static listeners. While listening, people may also adapt their movements to understand speech better. Here, we analyze self-rotation behavior during a spatialized speech test, in which target sentences were presented randomly from the front, rear, or side of the participant, together with a frontal speech-shaped noise interferer. The stimuli were spatialized in a reverberant room created in the Simulated Open Field Environment. The task was to respond naturally with movement, as if somebody were talking to the participant from that direction. A static visual avatar could appear at the target direction. We observed that the probability of corrective movements increased towards the end of the target sentence, that rotation speed was highest during the initial part of the sentence, and that the presence of the visual cue increased the consistency of the trajectories across trials. This suggests that a typical rotation trajectory in a trial consisted of a fast initial saccade followed by corrective turns, with the visual cues aiding target localization. These observations support the hypothesis that people adapt their movement for better intelligibility.