The Influence of Lip-Movement Modelling on Cortical Speech Tracking in Virtual Environments with Mobile EEG
* Presenting author
Abstract:
Congruent lip movements facilitate speech comprehension, for instance in the presence of irrelevant but speech-related background noise. It is currently unknown whether these beneficial effects generalize to virtual environments (VEs), which offer an opportunity to combine the reproducibility of laboratory settings with the complexity of everyday communication.

We will report a mobile EEG experiment in a VE. Participants will be presented with audio-visual scenes in which a character tells stories, in four different conditions, with babble noise in the background. One condition will comprise videos of real speakers with congruent lip movements; in the remaining three conditions, the speakers will be animated avatars. In one of the animated conditions, the avatars will have their mouths covered. In the remaining two conditions, congruent lip movements will be presented, generated either with the current implementation (mixing blend shapes according to relative formant energies) or with a novel machine-learning method.

We will measure cortical tracking of speech and hypothesize a benefit for the advanced lip-movement animation over the current one. We will further explore whether cortical speech tracking is comparable between real and animated characters. Our results will shed light on the ecological validity of VE paradigms and pave the way for more interactive concepts.
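The abstract does not specify how cortical speech tracking will be quantified. A common approach in this literature is a temporal response function (TRF) estimated by ridge regression, which maps the speech envelope onto the EEG and scores tracking as the correlation between predicted and recorded signals. The sketch below is a minimal illustration of that general idea only, not the authors' pipeline; the function names, lag range, and regularization parameter are assumptions.

```python
import numpy as np

def time_lagged_design(stimulus, lags):
    """Design matrix of time-lagged copies of the stimulus envelope."""
    n = len(stimulus)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = stimulus[:n - lag]
        else:
            X[:lag, j] = stimulus[-lag:]
    return X

def fit_trf(stimulus, eeg, lags, alpha=1.0):
    """Ridge-regression TRF (forward/encoding model), one weight per lag and channel."""
    X = time_lagged_design(stimulus, lags)
    XtX = X.T @ X + alpha * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ eeg)          # shape: (n_lags, n_channels)

def tracking_score(stimulus, eeg, lags, alpha=1.0):
    """Pearson correlation between TRF-predicted and measured EEG, per channel.
    In practice this would be cross-validated (fit on training data, score on held-out data)."""
    w = fit_trf(stimulus, eeg, lags, alpha)
    pred = time_lagged_design(stimulus, lags) @ w
    pred_c = pred - pred.mean(axis=0)
    eeg_c = eeg - eeg.mean(axis=0)
    num = (pred_c * eeg_c).sum(axis=0)
    den = np.sqrt((pred_c ** 2).sum(axis=0) * (eeg_c ** 2).sum(axis=0))
    return num / den

if __name__ == "__main__":
    # Simulated example: EEG as a delayed, scaled copy of the envelope plus noise.
    rng = np.random.default_rng(0)
    fs = 64                                          # assumed sampling rate (Hz)
    n_samples, n_channels = 60 * fs, 32
    envelope = rng.standard_normal(n_samples)
    eeg = 0.5 * np.roll(envelope, 8)[:, None] + rng.standard_normal((n_samples, n_channels))
    lags = list(range(0, 16))                        # 0-250 ms at 64 Hz
    print(tracking_score(envelope, eeg, lags, alpha=10.0))
```

In such an analysis, a condition-wise comparison of the per-channel correlations (e.g., real video vs. the two animation methods vs. covered mouth) would be one way to test the hypothesized benefit of the advanced lip-movement animation.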