MADISON -- The slightest turn of the head can significantly change the way a person or animal detects sound. A subtle tilt alters the angle at which high-frequency sound waves hit the ear, providing cues to localize the sound. To use those cues, the brain must put what it hears into the context of the position of the head. Until recently, scientists were not sure how this was done.
Now researchers at the University of Wisconsin Medical School appear to have the explanation. They have discovered that in the cochlear nucleus, the first sound-processing station in the brain, certain cells accomplish the job by integrating the two kinds of information, each of which travels along a distinct pathway.
The researchers compared activity in both pathways, examining currents running through synapses--or signal-transmitting junctions--in fusiform cells of the cochlear nucleus. To their surprise, they learned that synapses transmitting acoustic information were not influenced by previous activity--they were stable. On the other hand, synapses carrying information about head and ear position were continually strengthened or weakened depending on the amount of activity--they were plastic.
The study, by Donata Oertel of the Department of Physiology at the University of Wisconsin Medical School, and Kiyohiro Fujino, now of the Department of Otolaryngology at Kyoto University Graduate School of Medicine, appears in the December 16 Proceedings of the National Academy of Sciences.
The auditory system's main responsibilities are to locate sounds, analyze their properties, and then recognize what they mean. The initial duties take place in the cochlear nucleus.
"Sound localization is an especially important function of the auditory system because it allows us to figure out what's happening around corners, in the dark or when vision can't help," said Oertel, a UW professor of physiology who is an expert on the cochlear nucleus. For locating sounds on the horizontal plane--those coming from the left or right of the head--cues such as the relative intensity of the sound at each ear and the difference in its arrival time at the two ears are important. Cells in the ventral cochlear nucleus are responsible for pinpointing horizontal sounds.
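The arrival-time cue mentioned above can be made concrete with a back-of-the-envelope calculation. The sketch below uses Woodworth's spherical-head approximation of the interaural time difference (ITD); the head radius, speed of sound, and formula are standard textbook assumptions, not values from this study.

```python
import math

def interaural_time_difference(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Woodworth's spherical-head approximation of the interaural time
    difference (in seconds) for a distant source at a given azimuth.
    Head radius (~8.75 cm) and speed of sound (343 m/s) are typical
    assumed values, not measurements from the article."""
    theta = math.radians(azimuth_deg)
    # Path difference around the head: r * (theta + sin(theta)),
    # divided by the speed of sound to give a time delay.
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source directly to one side (90 degrees) produces a delay well
# under a millisecond; a source straight ahead produces none at all.
print(f"{interaural_time_difference(90) * 1e3:.2f} ms")
print(f"{interaural_time_difference(0) * 1e3:.2f} ms")
```

The sub-millisecond scale of these delays is what makes the cue useless in the vertical plane, as Oertel explains next: a source directly above the head is equidistant from both ears, so the delay vanishes.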
"But you don't have those cues in the vertical plane. If you're trying to distinguish sounds coming from above or below the head, or in front of or behind it, time and intensity differences at the two ears don't help at all," Oertel said. "High-frequency sound waves are distorted differently when they are heard from straight on rather than high up, and the asymmetry of our ears distorts the sound waves in another way when they come from the front or back."
Following the lead of other investigators, including some at UW, Oertel looked to the dorsal cochlear nucleus for the source of sound detection on the vertical plane. In her earlier research, she showed that fusiform cells, the principal cells of the dorsal cochlear nucleus, are activated by two sets of dendrites, or threadlike arms of nerve cells. One set detects sounds through auditory nerve fibers; the other carries information about the position of the ears, head and neck through parallel fibers.
In the current work, Fujino, who was a post-doctoral fellow with Oertel, used a technique called patch-clamping to record activity at the synapses of single fusiform cells. The experiment showed remarkable differences.
Currents evoked by activating signals through the parallel fibers were greatly strengthened with increasing use and weakened with decreasing use. This plasticity presumably aids in adapting to differing head positions, Oertel said. Signals evoked through the auditory nerve, which are involved in sound processing, were stable and not influenced by use.
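The contrast between the two pathways can be caricatured with a toy model: a "plastic" synapse whose weight drifts toward its recent activity level, and a "stable" synapse that ignores activity history. The update rule and learning rate below are illustrative assumptions for the sketch, not the biophysical mechanism Fujino and Oertel measured.

```python
def simulate_synapse(weight, activity, plastic, rate=0.05):
    """Toy model of use-dependent synaptic strength.

    If `plastic`, the weight moves a fraction `rate` toward each
    activity sample (strengthening with use, weakening with disuse);
    otherwise the weight is fixed, as with the stable auditory-nerve
    synapses described in the article. Returns the weight over time.
    """
    history = []
    for a in activity:
        if plastic:
            weight += rate * (a - weight)
        history.append(weight)
    return history

high_use = [1.0] * 50  # a sustained bout of strong activity

parallel_fiber = simulate_synapse(0.2, high_use, plastic=True)
auditory_nerve = simulate_synapse(0.2, high_use, plastic=False)

print(round(parallel_fiber[-1], 2))  # plastic synapse has strengthened toward 1.0
print(auditory_nerve[-1])            # stable synapse is unchanged at 0.2
```

Run with a low-activity sequence instead, and the plastic weight decays back down while the stable one still never moves, mirroring the strengthening-and-weakening-with-use behavior reported for the parallel-fiber synapses.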
Oertel said it is extremely rare for single cells to exhibit both the plasticity and the stability she and Fujino found.
"The observation that the strength of synapses can vary as a function of their activity has been of great interest because it underlies the brain's ability to learn and respond to the environment," she said. "However, if this part of the auditory system were plastic, it would cause what we hear now to be confused with what we heard just moments before."
Oertel's knowledge of the cochlear nucleus and its role in understanding how the brain uses sound is helping computer scientists in their efforts to develop computer speech processors.
"The engineers are making computers that work in a way that's similar to how the brain functions," she said. "The computers are being made to work well even in noisy environments."
The above post is reprinted from materials provided by the University of Wisconsin-Madison. Note: Content may be edited for style and length.