Effect of Simple Visual Inputs on Syllable Parsing
Abstract:
Visual signals, such as those arising from a talker's face, can aid speech comprehension. The neural mechanisms behind this audiovisual integration, however, remain poorly understood. To probe these mechanisms, we employ a computational model of a cortical microcircuit for speech processing. The model generates oscillations in the theta frequency range through the coupling of an excitatory and an inhibitory neural population. In the presence of a speech input, the theta rhythm becomes entrained to the onsets of syllables, allowing syllable onsets to be inferred from the network activity. We add visual stimuli to this model and investigate their effects on syllable-parsing scores. Specifically, the visual input currents are related to the syllable rate as well as to the speakers' mouth-opening area. We find that adding visual currents to the excitatory neuronal population influences speech comprehension, either boosting or impeding it, depending on the audiovisual time delay and on whether the currents are excitatory or inhibitory. In contrast, adding visual input currents to the inhibitory population does not affect speech comprehension. Our results therefore suggest neural mechanisms for audiovisual integration and yield predictions that can be tested experimentally.
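
The abstract does not give the model equations, so the following Python sketch only illustrates the qualitative setup: a Wilson-Cowan-style excitatory-inhibitory pair whose excitatory population receives an auditory current at syllable onsets plus a delayed visual current. All parameter values, the activation function, and the input waveforms are illustrative assumptions, not the authors' implementation.

import numpy as np

# Assumed parameters: chosen only to sketch the excitatory-inhibitory coupling,
# not taken from the model described in the abstract.
TAU_E, TAU_I = 0.01, 0.02                # membrane time constants (s)
W_EE, W_EI, W_IE, W_II = 16, 12, 15, 3   # coupling weights (arbitrary units)
DT, T_END = 1e-3, 2.0                    # Euler step and simulation length (s)

def sigmoid(x):
    """Population activation function (assumed form)."""
    return 1.0 / (1.0 + np.exp(-(x - 4.0)))

def simulate(audio_drive, visual_drive_E, av_delay=0.1):
    """Integrate the coupled excitatory (E) and inhibitory (I) populations.

    audio_drive, visual_drive_E: callables returning the input current at time t.
    av_delay: audiovisual time delay (s); the visual current is shifted by this amount.
    """
    n_steps = int(T_END / DT)
    E, I = 0.1, 0.1
    trace = np.zeros(n_steps)
    for k in range(n_steps):
        t = k * DT
        # Auditory current plus a delayed visual current, both onto the E population
        drive_E = audio_drive(t) + visual_drive_E(t - av_delay)
        dE = (-E + sigmoid(W_EE * E - W_EI * I + drive_E)) / TAU_E
        dI = (-I + sigmoid(W_IE * E - W_II * I)) / TAU_I
        E += DT * dE
        I += DT * dI
        trace[k] = E
    return trace

# Example inputs: syllable onsets every 250 ms (a 4 Hz, theta-range syllable rate)
# and a visual current following the same rhythm, standing in for mouth opening.
syllable_onsets = np.arange(0.1, T_END, 0.25)
audio = lambda t: 3.0 * np.any(np.abs(t - syllable_onsets) < 0.02)
visual = lambda t: 1.5 * np.any(np.abs(t - syllable_onsets) < 0.05)

activity = simulate(audio, visual, av_delay=0.1)
# Peaks in `activity` that align with syllable onsets would indicate successful parsing;
# varying av_delay or the sign of the visual current probes the effects described above.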