A close-up of an eye dominates the computer monitor in Avinash Bala’s lab. It’s uncomfortably close, blinking like it’s irritated. A vaguely female voice is repeating the word “bah” in the background every few seconds.
“The eighth trial is coming up,” Bala says quietly. “The tenth one will be a different sound, and we should pay attention to that because I’m expecting it to be a really big dilation.”
Bala is a neuroscientist at the University of Oregon, and although he calls his workspace the “Baby Hearing Lab,” the eye on the screen is decidedly adult. It belongs to Jared Acosta-King, a graduate student who Bala flagged down to help demonstrate a new hearing test that could constitute a major step forward in diagnosing hearing loss in babies, toddlers and adults who have difficulty communicating.
“Bah.”
“This is nine. The next one will be a ‘pah,’” Bala says.
Acosta-King is in a sound booth next door. He’s sitting in front of a camera that captures his pupil in high relief.
“Pah.”
Acosta-King’s pupil grows noticeably in size in response to the new sound.
“There we go,” Bala says. “It is so reliable, and it is so predictable, and that is what makes it so eminently usable.”
Catch-22
Detecting hearing loss in babies is kind of a Catch-22. It’s important to uncover hearing issues early so babies can start therapies that help them develop language. But because babies don’t yet have language to tell you what they hear, it’s difficult to diagnose just how severe their hearing loss is.
Bala’s new test relies on an involuntary pupil response, one that is triggered when humans hear a new sound. It’s a reaction he figured out in an unlikely way – while he was studying owls.
About 20 years ago Bala was working on a research project studying how barn owls hear the world as a way to better understand how human brains process sound. He was trying to condition the owls to respond when they heard different sounds.
“We had the owl in a quiet room. We had a video camera, like a security camera, watching the owl,” he said.
While they were setting up the experiment — going in and out of the owl’s room — the odd door would slam down the hall. Or someone would drop something on a desk.
“And I realized that every time something unexpected happened, the owl’s eyes seemed to get brighter,” he said.
They appeared brighter on the video because the owl’s pupils were dilating in response to the new sounds, reflecting more light back to the camera — like a cat’s eyes in headlights.
The owl-conditioning effort itself wasn’t going well.
“Avinash was extremely frustrated,” recalled Institute of Neuroscience co-director Terry Takahashi. “He came up and said, ‘Hey, this doesn’t work. The only thing that happens when I play sound is that the pupil dilates.’ And then all of a sudden, we all stop and go, ‘Wait a minute!’”
They recognized this involuntary pupil response could be used to measure hearing in the owls.
Pretty soon thereafter, Bala figured out that humans have the same involuntary response to new sounds – any new sound, including the same word at different volumes and words with slight variations like the “bah” and “pah” used in the lab’s tests.
“It’s exciting when you can take something that’s really basic science, like how do owls figure out where sounds are coming from? What’s the computer circuitry in the brain that does that? And all of a sudden, to take a technique that was refined in owls and apply it to human hearing, it’s just something that I never had expected,” Takahashi said.
It’s not known why new sounds trigger this response. Bala says it could be a byproduct of how organisms have evolved to orient themselves in the world — like when you unconsciously turn your head toward a sound. Another possibility is that a larger pupil lets animals hone their focus on a specific object – which could be useful if a new sound signals the presence of prey or possible danger.
“What I realized was that we could also use this in people who are unable to respond for one reason or another. And the biggest such group of people is infants, because babies can’t tell us what they’re thinking,” Bala said.
Current techniques
It’s standard practice in the United States to screen babies for hearing loss within the first month of life. Infants who are flagged for potential hearing issues are sent for further testing, but the tests audiologists use aren’t the same “tell-us-if-you-hear-the-sound” tests older children and adults use.
“Research has shown that the first six months of life are really critical for a child’s brain development. And if we were to wait until babies are old enough to do a regular hearing test, that really nice window of opportunity where all that brain development is happening has passed,” said pediatric audiologist Kristy Knight.
Knight works at OHSU Doernbecher Children’s Hospital and is partnering with Bala on the project.
Knight says there are several ways audiologists screen young children for hearing loss, but they all have their limitations.
One measures the physical vibration of the cochlea but can’t detect if the brain is registering those signals.
Another uses warbling sounds at different pitches and volumes to prompt a baby to look in a certain direction when they hear something. But depending on the age of the child being tested, it can be difficult to keep them focused enough to get good results.
Still another test looks at brainwaves of infants exposed to different noises. But this test requires that the baby fall asleep for about an hour while in the office. And interpreting the results is a subjective endeavor that requires training and experience.
“One of the things that we really struggle [with when testing] young children is knowing, can they recognize the difference between sounds like ‘else’ versus ‘elf’, for example? Our regular hearing tests don’t tell us that. We have to wait till the child has some amount of language development to really measure that clinically,” Knight said.
She said a hearing test based on this pupil response would give audiologists another tool.
“What we’re proposing is a test that doesn’t require expertise, one that any audiologist can administer,” Bala said. “What it involves essentially is putting a baby in his mother’s lap or in a high chair, playing sounds… and at the end of 15 minutes, the computer just gives you a yes or no answer.”
The version of the test designed for babies will keep their attention on the camera with an animated video. A computer algorithm will measure changes in pupil size as the different sounds are played.
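In spirit, that algorithm only has to answer one question: did the pupil dilate more after the odd sound out (“pah”) than after the repeated standard (“bah”)? The sketch below, in Python, is a hypothetical illustration of that comparison, not the lab’s actual software; the frame rate, window lengths, decision threshold and synthetic data are all assumptions.

    import numpy as np

    # Hypothetical oddball analysis; parameter values are assumptions,
    # not taken from the Baby Hearing Lab's software.
    SAMPLE_RATE_HZ = 30     # assumed eye-tracker frame rate
    BASELINE_S = 0.5        # pre-sound window used as the baseline
    RESPONSE_S = 2.0        # post-sound window in which dilation peaks

    def dilation(pupil, onset):
        """Baseline-corrected peak pupil dilation for one sound onset (in samples)."""
        base = pupil[onset - int(BASELINE_S * SAMPLE_RATE_HZ):onset].mean()
        peak = pupil[onset:onset + int(RESPONSE_S * SAMPLE_RATE_HZ)].max()
        return peak - base

    def hears_difference(pupil, standard_onsets, deviant_onsets, z_threshold=2.0):
        """Decide whether the deviant sound ('pah') produced a reliably larger
        dilation than the repeated standard ('bah')."""
        std = np.array([dilation(pupil, i) for i in standard_onsets])
        dev = np.array([dilation(pupil, i) for i in deviant_onsets])
        z = (dev.mean() - std.mean()) / (std.std(ddof=1) + 1e-9)
        return z > z_threshold

    # Usage with synthetic data: a noisy pupil-diameter trace (mm) and sound onsets.
    rng = np.random.default_rng(0)
    pupil = 3.0 + 0.05 * rng.standard_normal(3000)
    standard_onsets = [300, 600, 900, 1200, 1500, 1800, 2100, 2400]
    deviant_onsets = [2700]
    for i in deviant_onsets:
        pupil[i:i + 60] += 0.4   # simulate the dilation response to the new sound
    print(hears_difference(pupil, standard_onsets, deviant_onsets))   # True

A real system would also have to cope with blinks, gaze shifts and many more trials per condition, but the standard-versus-deviant comparison is the core of the idea.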
Bala recently received a Small Business Innovation Research grant from the National Institutes of Health to further develop the new test. This fall, he plans to produce a prototype that is more streamlined than the setup currently being used at the Baby Hearing Lab.
And if COVID-19 allows, Bala says he hopes to start real-world testing of the new baby hearing test at OHSU Doernbecher Children’s Hospital in Portland early next year.