September 7, 2017

Study reveals how brain processes spatial hearing information

Scientists have known that the brain detects where a sound comes from based on two major cues: when the sound reaches each ear (interaural time difference) and how loud it is at each ear (interaural level difference). Less is known, however, about where and how that spatial hearing information is processed in the brain.
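
As a rough illustration of how those two cues are computed, the sketch below estimates each one in Python. The head radius, the Woodworth timing formula, and the decibel calculation are textbook simplifications chosen for this example; they are not taken from the study.

```python
import numpy as np

# Illustrative only: approximate the two binaural cues described above.
# The head radius and formulas are textbook simplifications, not values from the study.

SPEED_OF_SOUND = 343.0   # meters per second, air at room temperature
HEAD_RADIUS = 0.0875     # meters, a commonly assumed adult head radius

def interaural_time_difference(azimuth_deg):
    """Woodworth approximation for a distant source: ITD = (r / c) * (theta + sin(theta))."""
    theta = np.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + np.sin(theta))

def interaural_level_difference(left_rms, right_rms):
    """Level difference in decibels between the signals reaching the two ears."""
    return 20.0 * np.log10(right_rms / left_rms)

# A source 45 degrees to the right reaches the far ear roughly 0.4 ms later,
# and arrives there slightly quieter because the head shadows it.
print(f"ITD at 45 degrees: {interaural_time_difference(45) * 1e3:.2f} ms")
print(f"ILD example: {interaural_level_difference(0.8, 1.0):.1f} dB")
```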

A new study that includes Vanderbilt researchers used functional magnetic resonance imaging (fMRI) to image how normal-hearing listeners' brains respond to sounds, finding that responses to interaural level differences could be predicted from responses to interaural time differences, and vice versa.
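
To make that cross-cue prediction idea concrete, here is a hypothetical sketch, assuming simulated data and a simple linear model rather than the authors' actual fMRI analysis: if a brain region encodes perceived location instead of the specific acoustic cue, a model fit on responses to one cue should also predict responses to the other.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical illustration, not the authors' analysis: if the region encodes
# perceived location rather than the acoustic cue itself, a model fit on
# responses to one cue should also predict responses to the other cue.

rng = np.random.default_rng(0)
positions = np.linspace(-1.0, 1.0, 9)[:, None]   # perceived left-right locations
n_voxels = 50

# Simulated voxel responses that depend on perceived location, with the same
# spatial tuning regardless of whether the location was cued by ITD or ILD.
tuning = rng.normal(size=(1, n_voxels))
itd_responses = positions * tuning + rng.normal(scale=0.1, size=(9, n_voxels))
ild_responses = positions * tuning + rng.normal(scale=0.1, size=(9, n_voxels))

# Fit on ITD-cued responses, then test how well the fit predicts ILD-cued ones.
model = LinearRegression().fit(positions, itd_responses)
predicted = model.predict(positions)
r = np.corrcoef(predicted.ravel(), ild_responses.ravel())[0, 1]
print(f"cross-cue prediction correlation: {r:.2f}")
```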

The research, newly published in the journal Proceedings of the National Academy of Sciences, finds that the brain area studied, the posterior auditory cortex, handles the higher-order perception of where sounds are, rather than just detecting their acoustics.

“To me, that’s the biggest and most exciting finding,” said G. Christopher Stecker, Ph.D., associate professor of Hearing and Speech Sciences, one of the study’s authors. “I think this does tell us a lot more about the brain mechanisms that are responsible for the capacity for auditory spatial awareness.”

That knowledge could lead to further advances in the assessment and treatment of people with hearing disorders, Stecker said. Brain-based measurements could improve signal processing in hearing aids and cochlear implants. Improvements in spatial sound assessment could also inform advances in virtual and augmented reality devices.

“It’s not just localizing sounds; it’s really knowing where we are,” Stecker said. “And I think understanding the mechanisms of that will help us to understand what happens when auditory spatial awareness goes wrong and also how to assess it and maximize it. Right now, we just don’t have a lot of ways to look at how the brain is dealing with that spatial information, and I think this is a step in that direction.”

Nathan Higgins, Ph.D., a former postdoctoral fellow at Vanderbilt University and another of the study’s authors, said the findings were very exciting. “Fundamentally understanding how the brain works can help guide a lot of studies in the future,” said Higgins, who is now a postdoctoral fellow at University of Nevada, Las Vegas.

Other co-authors are Susan McLaughlin, Ph.D., at the University of Washington, and Teemu Rinne, Ph.D., at the University of Turku, Finland.

Anne Marie Tharpe, Ph.D., professor and chair of Hearing and Speech Sciences and associate director of the Vanderbilt Bill Wilkerson Center, remarked on the significance of the study’s findings.

“People don’t always realize that our auditory system provides us with more than basic detection of sound,” she said. “It also provides us with information about where we are relative to other objects in our environment, how fast objects might be approaching us, and other indicators of spatial awareness. This information is critical for the optimization of hearing technologies used by those with auditory disorders.”

The study was supported by National Institutes of Health Grant R01-DC011548.