Researchers at the University of Iowa have mapped how people recognize words. In a new study of people who use cochlear implants to hear, the researchers identified three main approaches that people with and without hearing loss use to recognize words, a critical part of understanding spoken language. The approach varies from person to person, regardless of hearing ability: some people wait a moment before identifying a word, while others weigh two or more candidate words before deciding which one they heard.
When a person hears a word, their brain briefly considers hundreds, if not thousands, of options and eliminates most of them in less than a second. For example, when a person hears “Hawkeyes,” their brain may briefly consider “hot dogs,” “hawk,” “hockey,” and other similar-sounding words before settling on the target word.
Although the brain works quickly and differences in word recognition strategies can be subtle, the results of this study are important because they could open new avenues for hearing specialists to identify word recognition difficulties in early childhood or in older adults (who are prone to hearing loss) and more effectively manage these conditions.
“What this study has shown us is that people don’t all operate the same way, even when it comes to how they recognize a word,” said Bob McMurray, the F. Wendell Miller Professor in the Department of Psychology and Brain Sciences and corresponding author of the study. “People seem to come up with their own solutions to the challenge of word recognition. There’s not just one way to use a language. It’s pretty crazy when you think about it.”
McMurray has been studying word recognition in children and older adults for three decades. His research has shown differences in how people of all ages recognize spoken language. But those differences tend to be so small that it’s hard to categorize them neatly. So McMurray and his research team turned to people who use cochlear implants, devices used by the profoundly deaf or severely hard of hearing that bypass the normal pathways through which people hear, using electrodes to deliver sound.
“It’s like replacing millions of hair cells and thousands of frequencies with 22 electrodes. It mixes everything up. But it works because the brain can adapt,” McMurray says.
The research team recruited 101 participants from the Iowa Cochlear Implant Clinical Research Center at the University of Iowa Medical Center. Participants listened to a word played over loudspeakers and then selected, from four images on a computer screen, the one that matched the word they heard. Their eye movements were recorded with eye-tracking technology, allowing the researchers to track, fraction of a second by fraction of a second, how and when each participant settled on the word they had heard.
The study is published in the journal Nature Communications.
The experiments revealed that cochlear implant users, even with different hearing abilities, used the same basic process to select spoken words as people with normal hearing.
Researchers have identified three dimensions of word recognition:
- Wait and See
- Sustained activation
- Slow activation
Most of the cochlear implant participants used some degree of the Wait and See method, meaning they waited up to a quarter of a second after hearing the word to firmly decide which word they were hearing.
Previous research in McMurray’s lab has shown that children with early hearing loss have wait-and-see tendencies, but this phenomenon has not been observed more generally.
“Maybe it’s a way for them to avoid having a bunch of other competing words in their heads,” McMurray says. “They can slow down and keep it simple.”
The researchers also found that some cochlear implant participants tended toward sustained activation, in which listeners waver between candidate words for a while before settling on the one they think they heard, or slow activation, meaning they were simply slower to recognize words. Importantly, each listener appeared to use a hybrid strategy, blending the dimensions to a different degree.
These dimensions correspond to the patterns by which people without hearing loss, from young to old, tend to recognize words, as shown in a previous study by McMurray’s team.
“Now that we have identified the dimensions of our cochlear implant population, we can look at people without hearing loss and see that the same dimensions apply,” McMurray says. “What we see very clearly in how cochlear implant users recognize words also happens in many people.”
The researchers now hope to apply these findings to develop strategies that could help people who are at the extremes of a particular dimension of word recognition. About 15 percent of adults in the United States have hearing loss, which can lead to cognitive decline, decreased social interaction and greater isolation.
“Our goal is to have a more refined method than just asking them, ‘How well are you listening? Do you have trouble perceiving speech in the real world?’” McMurray says.
More information: Underlying dimensions of real-time word recognition in cochlear implant users, Nature Communications (2024). DOI: 10.1038/s41467-024-51514-3
Provided by the University of Iowa
Citation: Cochlear implant users reveal basic approaches to word recognition (2024, August 29), retrieved August 29, 2024