Modulation of Speech Processing using Tactile Stimuli
Abstract:
Speech is a complex signal whose processing requires segmentation of the acoustic stream into words, syllables and phonemes in order to extract semantic content. This segmentation presumably involves neural oscillations in the delta (1-4 Hz) and theta (4-8 Hz) frequency ranges that entrain to the rhythm of words and syllables. Transcranial current stimulation with such speech rhythms has indeed been found to influence speech-in-noise comprehension. Because of neural connections from the somatosensory system to the auditory cortex, such cortical oscillations may also be influenced through tactile stimuli. Here we investigated whether tactile stimuli paired with speech rhythms can influence the comprehension of speech in noise. We designed tactile stimuli that occurred at the centre of individual syllables in continuous speech, while measuring each participant's speech comprehension as well as their neural activity through electroencephalography (EEG). We found that the tactile stimuli could modulate speech comprehension, with effects that varied with the delay between the tactile signal and the speech stream. Moreover, the neuroimaging data showed that the effects on speech comprehension were correlated with electrophysiological markers of audiotactile integration.
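The stimulus design described above, a tactile pulse at the centre of each syllable shifted by a variable audio-tactile delay, can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the function name is hypothetical, and the syllable boundaries, which in practice would come from a forced alignment of the speech material, are invented for the example.

```python
# Hypothetical sketch of syllable-centred tactile pulse timing.
# Syllable boundaries are illustrative; a real experiment would derive
# them from a forced alignment of the continuous speech.

def tactile_pulse_times(syllable_boundaries, delay=0.0):
    """Return one tactile pulse time (in seconds) per syllable.

    syllable_boundaries: list of (onset, offset) tuples in seconds.
    delay: shift of each pulse relative to the syllable centre
           (positive values mean the tactile pulse lags the speech).
    """
    return [(onset + offset) / 2.0 + delay
            for onset, offset in syllable_boundaries]

# Example: three syllables, with tactile pulses lagging speech by 50 ms.
syllables = [(0.00, 0.18), (0.18, 0.42), (0.42, 0.60)]
pulses = tactile_pulse_times(syllables, delay=0.05)
```

Sweeping `delay` over a range of positive and negative values would then yield the condition set needed to test how comprehension depends on the audio-tactile lag.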