Attention decoding in users of hearing aids and cochlear implants

For most of us, chatting in a busy restaurant is a manageable challenge. But for those who rely on hearing aids (HAs) or cochlear implants (CIs), the ability to focus on one person while others are talking remains a significant hurdle.

Our research explored why this is so difficult by looking directly at the brain. When we listen to someone, our brain waves actually “sync up” with the rhythm of their speech. This process, called cortical speech tracking, acts like a biological highlighter, helping the brain prioritize the voice we want to hear while ignoring the background noise.

Our study compared three groups: people with typical hearing, hearing-aid users, and cochlear-implant users. Using EEG, we measured how well each person’s brain tracked a specific speaker when two people were talking at the same time.
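The attention-decoding idea behind this kind of measurement can be illustrated with a toy backward model: a linear decoder reconstructs a speech envelope from the EEG, and the listener is judged to be attending to whichever talker’s envelope correlates best with the reconstruction. The synthetic data, the mixing model, and the ridge parameter below are illustrative assumptions for the sketch, not the study’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 16 EEG channels driven by the attended talker's speech
# envelope, plus noise (illustrative assumption, not real recordings).
n_samples, n_channels = 2000, 16
attended = rng.standard_normal(n_samples)   # envelope of the attended talker
ignored = rng.standard_normal(n_samples)    # envelope of the ignored talker
mixing = rng.standard_normal(n_channels)    # how the envelope projects to channels
eeg = np.outer(attended, mixing) + 0.5 * rng.standard_normal((n_samples, n_channels))

# Backward model: ridge-regularized least squares mapping EEG -> envelope.
lam = 1.0
w = np.linalg.solve(eeg.T @ eeg + lam * np.eye(n_channels), eeg.T @ attended)
reconstruction = eeg @ w

def pearson(a, b):
    """Pearson correlation between two 1-D signals."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Attention decoding: the reconstruction should resemble the attended
# envelope more than the ignored one.
r_att = pearson(reconstruction, attended)
r_ign = pearson(reconstruction, ignored)
decoded_attended = r_att > r_ign
```

In practice the decoder would be trained and evaluated on separate data (e.g. with cross-validation) and the envelopes would be extracted from real audio, but the decision rule, comparing correlations for the two talkers, is the core of EEG-based attention decoding.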

We found that hearing-aid users performed remarkably similarly to those with typical hearing: their neural speech tracking reliably distinguished the talker they were attending to from the talker they were ignoring.

Cochlear-implant users, on the other hand, struggled significantly to understand speech in the presence of a competing talker, and the attentional modulation of their neural speech tracking was much weaker.

Taken together, our findings highlight a specific neurological deficit in cochlear-implant users that may underlie their difficulties with speech-in-noise listening. In contrast, neural speech tracking and its attentional modulation in hearing-aid users are remarkably similar to those of typical-hearing listeners, suggesting that neurofeedback based on attention decoding for steering hearing aids may be feasible.