
Bear or chipmunk? Engineer finds how brain encodes sounds

Date:
November 8, 2017
Source:
Washington University in St. Louis
Summary:
When you are out in the woods and hear a cracking sound, your brain needs to process quickly whether the sound is coming from, say, a bear or a chipmunk. A biomedical engineer now has a new interpretation for an old observation, debunking an established theory in the process.

When you are out in the woods and hear a cracking sound, your brain needs to process quickly whether the sound is coming from, say, a bear or a chipmunk. In new research published in PLOS Biology, a biomedical engineer at Washington University in St. Louis has a new interpretation for an old observation, debunking an established theory in the process.

Dennis Barbour, MD, PhD, associate professor of biomedical engineering in the School of Engineering & Applied Science, who studies neurophysiology, found in an animal model that auditory cortex neurons may encode sounds differently than previously thought. Sensory neurons, such as those in auditory cortex, on average respond relatively indiscriminately at the beginning of a new stimulus but rapidly become much more selective. The few neurons that keep responding for the duration of a stimulus were generally thought to encode the identity of the stimulus, while the many neurons responding at its beginning were thought to encode only its presence. This theory makes a prediction that had never been tested: that the indiscriminate initial responses would encode stimulus identity less accurately than the selective responses that persist over the sound's duration.

"At the beginning of a sound transition, things are diffusely encoded across the neuron population, but sound identity turns out to be more accurately encoded," Barbour said. "As a result, you can more rapidly identify sounds and act on that information. If you get about the same amount of information for each action potential spike of neural activity, as we found, then the more spikes you can put toward a problem, the faster you can decide what to do. Neural populations spike most and encode most accurately at the beginning of stimuli."

Barbour's study involved recording from individual neurons. To make similar measurements of brain activity in humans, researchers must use noninvasive techniques that average over many neurons. Event-related potential (ERP) techniques record brain signals through electrodes on the scalp and reflect neural activity synchronized to the onset of a stimulus, while functional MRI (fMRI) reflects activity averaged over several seconds. If the brain used fundamentally different encoding schemes for stimulus onsets and for sustained stimulus presence, the two methods might be expected to disagree; both, however, reveal the neural encoding of stimulus identity.

"There has been a lot of debate for a very long time, but especially in the past couple of decades, about whether information representation in the brain is distributed or local," Barbour said.

"If function is localized, with small numbers of neurons bunched together doing similar things, that's consistent with sparse coding, high selectivity, and low population spiking rates. But if you have distributed activity, or lots of neurons contributing all over the place, that's consistent with dense coding, low selectivity and high population spiking rates. Depending on how the experiment is conducted, neuroscientists see both. Our evidence suggests that it might just be both, depending on which data you look at and how you analyze it."

Barbour said that while the research is fundamental work toward a theory of how information may be encoded for sound processing, it implies a novel sensory encoding principle potentially applicable to other sensory systems, such as how smells are processed and encoded.

Earlier this year, Barbour worked with Barani Raman, associate professor of biomedical engineering, to investigate how the presence and absence of an odor or a sound is processed. While the response times of the olfactory and auditory systems differ, the neurons respond in the same ways. That research also gave strong evidence for a shared set of signal-processing motifs that may be common to different sensory systems and even different species.


Story Source:

Materials provided by Washington University in St. Louis. Original written by Beth Miller. Note: Content may be edited for style and length.


Journal Reference:

  1. Wensheng Sun, Dennis L. Barbour. Rate, not selectivity, determines neuronal population coding accuracy in auditory cortex. PLOS Biology, 2017; 15 (11): e2002459 DOI: 10.1371/journal.pbio.2002459

