
Left And Right Ears Not Created Equal As Newborns Process Sound, Finds UCLA/UA Research

Date:
September 10, 2004
Source:
University of California, Los Angeles
Summary:
Challenging decades of scientific belief that the decoding of sound originates from a preferred side of the brain, UCLA and University of Arizona scientists have demonstrated that right-left differences for the auditory processing of sound start at the ear.

Reported in the Sept. 10 edition of Science, the new research could hold profound implications for rehabilitation of persons with hearing loss in one or both ears, and help doctors enhance speech and language development in hearing-impaired newborns.

"From birth, the ear is structured to distinguish between various types of sounds and to send them to the optimal side in the brain for processing," explained Yvonne Sininger, Ph.D., visiting professor of head and neck surgery at the David Geffen School of Medicine at UCLA. "Yet no one has looked closely at the role played by the ear in processing auditory signals."

Scientists have long understood that the auditory regions of the two halves of the brain sort out sound differently. The left side dominates in deciphering speech and other rapidly changing signals, while the right side leads in processing tones and music. Because of how the brain's neural network is organized, the left half of the brain controls the right side of the body, and the left ear is more directly connected to the right side of the brain.

Prior research had assumed that a mechanism arising from cellular properties unique to each brain hemisphere explained why the two sides of the brain process sound differently. But Sininger's findings suggest that the difference is inherent in the ear itself.

"We always assumed that our left and right ears worked exactly the same way," she said. "As a result, we tended to think it didn't matter which ear was impaired in a person. Now we see that it may have profound implications for the individual's speech and language development."

Working with co-author Barbara Cone-Wesson, Ph.D., associate professor of speech and hearing sciences at the University of Arizona, Sininger studied tiny amplifiers in the outer hair cells of the inner ear.

"When we hear a sound, tiny cells in our ear expand and contract to amplify the vibrations," explained Sininger. "The inner hair cells convert the vibrations to neural cells and send them to the brain, which decodes the input."

"These amplified vibrations also leak back out to the ear in a phenomena call otoacoustic emission (OAE)," added Sininger. "We measured the OAE by inserting a microphone in the ear canal."

In a six-year study, the UCLA/UA team evaluated more than 3,000 newborns for hearing ability before they left the hospital. Sininger and Cone-Wesson placed a tiny probe device in the baby's ear to test its hearing. The probe emitted a sound and measured the ear's OAE.
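
The article doesn't describe the team's actual probe software, but a minimal sketch of the paradigm it outlines (emit a click or a sustained tone, record the ear-canal microphone, summarize the emission's strength) might look like the following Python. All function names and parameter values here are illustrative assumptions, not the study's methods.

```python
import numpy as np

SAMPLE_RATE = 48_000  # Hz; assumed probe sampling rate

def click_stimulus(duration_ms: float = 0.1) -> np.ndarray:
    """A brief broadband click, one of the two stimulus types in the study."""
    n = max(int(SAMPLE_RATE * duration_ms / 1000), 1)
    return np.ones(n)

def tone_stimulus(freq_hz: float = 1000.0, duration_ms: float = 50.0) -> np.ndarray:
    """A sustained pure tone, the other stimulus type."""
    t = np.arange(int(SAMPLE_RATE * duration_ms / 1000)) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq_hz * t)

def oae_level_db(recording: np.ndarray) -> float:
    """Summarize an ear-canal recording as an RMS level in dB re: full scale."""
    rms = np.sqrt(np.mean(recording ** 2))
    return 20 * np.log10(rms + 1e-12)

# Example with a fake "emission"; in practice this would come from the probe mic.
fake_recording = (0.01 * tone_stimulus()
                  + 0.001 * np.random.default_rng(0).standard_normal(2400))
print(f"OAE level: {oae_level_db(fake_recording):.1f} dB")
```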

The researchers measured the babies' OAEs with two types of sound: rapid clicks, then sustained tones. They were surprised to find that the left ear provides extra amplification for tones like music, while the right ear provides extra amplification for rapid sounds timed like speech.

"We were intrigued to discover that the clicks triggered more amplification in the baby's right ear, while the tones induced more amplification in the baby's left ear," said Sininger. "This parallels how the brain processes speech and music, except the sides are reversed due to the brain's cross connections."

"Our findings demonstrate that auditory processing starts in the ear before it is ever seen in the brain," said Cone-Wesson. "Even at birth, the ear is structured to distinguish between different types of sound and to send it to the right place in the brain."

Previous research supports the team's new findings. Earlier studies show, for example, that children with hearing impairment in the right ear encounter more trouble learning in school than children with hearing loss in the left ear.

"If a person is completely deaf, our findings may offer guidelines to surgeons for placing a cochlear implant in the individual's left or right ear and influence how cochlear implants or hearing aids are programmed to process sound," explained Cone-Wesson. "Sound-processing programs for hearing devices could be individualized for each ear to provide the best conditions for hearing speech or music."

"Our next step is to explore parallel processing in brain and ear simultaneously," said Sininger. "Do the ear and brain work together or independently in dealing with stimuli? How does one-sided hearing loss affect this process? And finally, how does hearing loss compare to one-sided loss in the right or left ear?"

The National Institute on Deafness and Other Communication Disorders funded the study.


Story Source:

Materials provided by University of California, Los Angeles. Note: Content may be edited for style and length.


