Lip-read words can be decoded by auditory regions of the brain in the same way as heard speech, according to a new University of Michigan report that examined how vision supports verbal perception.
The researchers used functional magnetic resonance imaging (fMRI) and electrodes implanted in patients’ brains to show that watching someone speak without being able to hear them (lipreading) activates auditory regions of the brain in ways similar to heard speech.
David Brang, associate professor of psychology and lead author of the study, said the perception of a person’s facial movements often begins before sounds are produced. The auditory system uses these early visual cues to prime auditory neurons before sounds are heard, he explained.
The study showed that integrating visual and auditory cues allows a person to extract more accurate speech information, significantly improving communication.
Brang and his colleagues sought to understand how visual signals during lipreading are represented in the auditory system.
They used fMRI data from healthy adults and intracranial recordings from electrodes implanted in epileptic patients during auditory and visual speech perception tasks.
The results revealed that lip-read words could be categorized earlier than heard words. This suggests that lipreading may involve a predictive mechanism that facilitates speech processing before auditory information is available, Brang said.
The results support a model in which the auditory system combines neural distributions evoked by heard and lip-read words to generate a more accurate estimate of what was said. The results are published in the journal Current Biology.
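The article does not spell out how such a combination would work mathematically, but one common way to think about merging two independent sources of evidence is Bayesian-style cue integration, in which the estimates from each modality are multiplied and renormalized. The toy sketch below illustrates that idea only; the candidate words, probabilities, and the product rule are assumptions for illustration, not the model or data from the Current Biology study.

```python
# Illustrative sketch only: a toy Bayesian-style cue combination over candidate words.
# The word list, probabilities, and the product rule are assumptions for illustration,
# not the actual model or data reported in the paper.

candidates = ["bat", "mat", "pat"]

# Hypothetical evidence from each modality: P(word | visual) and P(word | auditory).
visual = {"bat": 0.50, "mat": 0.30, "pat": 0.20}    # lipreading narrows the options early
auditory = {"bat": 0.40, "mat": 0.45, "pat": 0.15}  # noisier acoustic evidence arrives later

def combine(p, q):
    """Multiply the two distributions and renormalize (independent-cue assumption)."""
    raw = {w: p[w] * q[w] for w in p}
    total = sum(raw.values())
    return {w: v / total for w, v in raw.items()}

combined = combine(visual, auditory)
print(combined)  # the combined estimate favors "bat" more strongly than either cue alone
```

In this toy example the combined distribution is more sharply peaked on a single word than either cue by itself, mirroring the idea that visual information arriving before the sound can narrow the set of plausible words and sharpen the final estimate of what was said.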
Brang said these findings suggest that the auditory system rapidly integrates lipreading information to improve hearing abilities, especially in challenging listening environments like noisy restaurants. Observing a speaker’s lips can influence our auditory perception even before sounds are produced.
For people with hearing loss, this rapid use of information obtained through lipreading is probably even more pronounced, he added.
“As hearing abilities decline, people increasingly rely on visual cues to aid in comprehension,” Brang said. “The ability of visual speech to activate and encode information in the auditory cortex appears to be a crucial compensatory mechanism.”
This helps people maintain their hearing abilities as they age, highlighting the value of face-to-face communication in supporting listening comprehension.
The study was co-authored by Karthik Ganesan, Cody Zhewei Cao, Michael Demidenko, Andrew Jahn, William Stacey and Vibhangini Wasade.
More information:
Ganesan Karthik et al, Auditory cortex encodes lipreading information via spatially distributed activity, Current Biology (2024). DOI: 10.1016/j.cub.2024.07.073
Provided by the University of Michigan