
Brain picks out salient sounds from background noise by tracking frequency and time, study finds

Date:
July 23, 2013
Source:
Wellcome Trust
Summary:
New research reveals how our brains are able to pick out important sounds from the noisy world around us. The study could lead to new diagnostic tests for hearing disorders.

New research reveals how our brains are able to pick out important sounds from the noisy world around us. The findings, published online today in the journal eLife, could lead to new diagnostic tests for hearing disorders.

Our ears can effortlessly pick out the sounds we need to hear from a noisy environment -- hearing our mobile phone ringtone in the middle of the Notting Hill Carnival, for example -- but how our brains process this information (the so-called 'cocktail party problem') has been a longstanding research question in hearing science.

Previous studies have investigated this using simple sounds, such as two tones of different pitches, but researchers at UCL and Newcastle University have now used complex sounds that are more representative of those we hear in real life. The team used 'machine-like beeps' that overlap in both frequency and time to recreate a busy sound environment and gain new insights into how the brain solves this problem.
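
To make this concrete, here is a minimal, illustrative sketch in Python of the kind of stimulus described above: brief tone pips that overlap in frequency and time, with a small set of target frequencies repeating together against a constantly changing background. The code is not from the study; the sample rate, pip duration, frequency range and pip counts are all assumptions chosen purely for illustration.

```python
# Illustrative sketch (not the study's stimulus code): a sequence of "chords"
# of brief overlapping tone pips. Background pips are re-drawn at random on
# every chord, while a small set of target frequencies repeats together, so
# the target can only be picked out by tracking both frequency and time.
import numpy as np

SR = 16000          # sample rate in Hz (assumed)
PIP_DUR = 0.05      # each pip lasts 50 ms (assumed)
N_CHORDS = 40       # number of consecutive chords (assumed)

def pip(freq, dur=PIP_DUR, sr=SR):
    """A brief pure-tone beep, Hann-windowed to avoid clicks."""
    t = np.arange(int(dur * sr)) / sr
    return np.hanning(t.size) * np.sin(2 * np.pi * freq * t)

rng = np.random.default_rng(0)
freq_pool = np.geomspace(300, 4000, 40)                  # candidate frequencies (Hz)
target_freqs = rng.choice(freq_pool, 4, replace=False)   # coherent "target" set

chords = []
for _ in range(N_CHORDS):
    background = rng.choice(freq_pool, 8, replace=False)  # changes every chord
    chords.append(sum(pip(f) for f in list(background) + list(target_freqs)))

stimulus = np.concatenate(chords) / 12                    # rough peak normalisation
```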

In the study, groups of volunteers were asked to identify target sounds from within this noisy background in a series of experiments.

Sundeep Teki, a PhD student from the Wellcome Trust Centre for Neuroimaging at UCL and joint first author of the study, said: "Participants were able to detect complex target sounds from the background noise, even when the target sounds were delivered at a faster rate or there was a loud disruptive noise between them."

Dr Maria Chait, a senior lecturer at the UCL Ear Institute and joint first author of the study, added: "Previous models based on simple tones suggest that people differentiate sounds based on differences in frequency, or pitch. Our findings show that time is also an important factor, with sounds grouped as belonging to one object by virtue of being correlated in time."
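
One way to picture this grouping-by-time idea is as a correlation across frequency channels: channels whose energy rises and falls together over time belong to the same sound object. The toy sketch below is not the authors' model; it simply shows that temporally coherent channels stand out when their envelopes are correlated, with all channel numbers and thresholds chosen for illustration.

```python
# Toy illustration of grouping by temporal coherence (not the authors' model):
# frequency channels whose energy fluctuates in step over time are grouped
# together, regardless of how far apart they are in frequency.
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_frames = 20, 200
envelopes = rng.random((n_channels, n_frames))       # unrelated background channels
shared = rng.random(n_frames)                        # one temporal pattern...
envelopes[[3, 7, 12]] = shared + 0.05 * rng.random((3, n_frames))  # ...shared by 3 channels

corr = np.corrcoef(envelopes)                        # correlation of each channel pair over time
coherent_pairs = np.argwhere(np.triu(corr, k=1) > 0.8)
print(coherent_pairs)   # pairs (3, 7), (3, 12), (7, 12): the temporally coherent "object"
```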

Professor Tim Griffiths, Professor of Cognitive Neurology at Newcastle University and lead researcher on the study, said: "Many hearing disorders are characterised by the loss of ability to detect speech in noisy environments. Disorders like this that are caused by problems with how the brain interprets sound information, rather than physical damage to the ear and hearing machinery, remain poorly understood.

"These findings inform us about a fundamental brain mechanism for detecting sound patterns and identifies a process that can go wrong in hearing disorders. We now have an opportunity to create better tests for these types of hearing problems."

The research was funded by the Wellcome Trust and Deafness Research UK.


Story Source:

Materials provided by Wellcome Trust. Note: Content may be edited for style and length.


Journal Reference:

  1. Teki S et al. Segregation of complex acoustic scenes based on temporal coherence. eLife, 2013; DOI: 10.7554/eLife.00699

