Feb. 7, 2000 CHAMPAIGN, Ill. -- Measuring hearing ability may not be as clear-cut and predictable as specialists have long thought. University of Illinois researchers are beating a new drum, saying that the responses of brain cells to single, isolated tones don't predict how sounds in the real world are processed. In the January issue of the Journal of Neurophysiology, the U. of I. team reports that little brown bats (Myotis lucifugus) were best at picking out sounds simulating sonar echoes from the fluttering wings of their favorite insect prey when those sounds were delivered in rapid chains of pulses.
"In the real world, sound rarely occurs in isolation," said Albert S. Feng, a professor of molecular and integrative physiology and a neuroscientist at the U. of I. Beckman Institute for Advanced Science and Technology. "Sound usually occurs in a continuous stream. It is never simple."
In recent years, researchers have begun to recognize that the central nervous system can actively control how neurons process sound -- or frequency -- vibrations. The paper by Feng and postdoctoral researchers Alexander V. Galazyuk and Daniel Llano is believed to be the first to show that the rate of stimulation can markedly affect the ability to perceive sound amplitude, or volume.
Bats actively adjust their sonar emission rate to "tune in" to the volume of echoes, which is known to be proportional to target size. By doing so, they obtain a higher-resolution picture of target size, which is useful for discriminating among targets.
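The idea can be illustrated with a minimal simulation. The model below is a sketch, not the researchers' method: it assumes echo amplitude scales linearly with target size plus measurement noise (the `gain` and `noise_sd` values are invented for illustration), and shows that a higher pulse rate collects more echoes in the same window, so an averaged size estimate is less noisy.

```python
import random

def echo_amplitude(target_size, gain=1.0, noise_sd=0.1):
    """Illustrative model: echo amplitude is proportional to target
    size, plus Gaussian measurement noise (assumed parameters)."""
    return gain * target_size + random.gauss(0.0, noise_sd)

def estimate_size(target_size, pulse_rate_hz, duration_s=1.0):
    """Average the echoes collected over a listening window.
    A higher pulse rate yields more samples, hence a
    lower-variance estimate of target size."""
    n_pulses = max(1, int(pulse_rate_hz * duration_s))
    echoes = [echo_amplitude(target_size) for _ in range(n_pulses)]
    return sum(echoes) / n_pulses

random.seed(0)
slow = estimate_size(2.0, pulse_rate_hz=10)   # few echoes, noisier
fast = estimate_size(2.0, pulse_rate_hz=200)  # many echoes, sharper
```

Averaging `n` noisy echoes shrinks the estimate's standard error by a factor of roughly the square root of `n`, which is one simple way a faster emission rate could buy the bat finer size resolution.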
"We are getting the idea that the hearing system is dynamic and under active control," Feng said. "With this new revelation, we no longer think that the system is passive or static. Until our paper, people thought you could characterize the basic properties of the hearing system by looking at the response to isolated sounds, or single tones, given slowly."
The research -- funded by the National Institutes of Health -- focused on neurons in the bats' inferior colliculus, a midbrain region where many auditory pathways converge. This processing center is important for sending information upstream to higher brain centers for sound perception, and also downstream for the indirect regulation of hearing sensitivity in the ear, whose role had long been thought to be passive.
The new findings, Feng said, are basic in nature. When the underlying cellular mechanisms are understood, he said, scientists will have obtained new and efficient building blocks for constructing better hearing aids.
Feng is part of a Beckman Institute team working to develop an "intelligent" hearing aid, one that will allow a person with hearing impairments to accurately extract and localize sounds in the environment.