Brain Section Multitasks, Handling Phonetics And Decision-making

Date:
July 1, 2009
Source:
Brown University
Summary:
Scientists have found that a portion of the brain that handles decision-making also helps decipher different sounds.

A front portion of the brain that handles tasks like decision-making also helps decipher different phonetic sounds, according to new Brown University research.

This section of the brain — the left inferior frontal sulcus — treats different pronunciations of the same speech sound (such as a ‘d’ sound) the same way.

In determining this, scientists have solved a mystery. “No two pronunciations of the same speech sound are exactly alike. Listeners have to figure out whether these two different pronunciations are the same speech sound such as a ‘d’ or two different sounds such as a ‘d’ sound and a ‘t’ sound,” said Emily Myers, assistant professor (research) of cognitive and linguistic sciences at Brown University. “No one has shown before what areas of the brain are involved in these decisions.”

Sheila Blumstein, the study’s principal investigator, said the findings provide a window into how the brain processes speech.

“As human beings we spend much of our lives categorizing the world, and it appears as though we use the same brain areas for language that we use for categorizing non-language things like objects,” said Blumstein, the Albert D. Mead Professor of Cognitive and Linguistic Sciences at Brown.

Researchers from Brown University’s Department of Neuroscience and from the Department of Psychiatry at the University of Cincinnati also took part in the study. Details will be published in the July issue of the journal Psychological Science.

To conduct the research, scientists studied 13 women and five men, ages 19 to 29, in an MRI scanner at Brown University’s Magnetic Resonance Facility. The scanner, with its powerful magnet, allows researchers to measure changes in blood flow in the brain in response to different types of stimuli.

Subjects were asked to listen to a series of repeated syllables as they lay in the scanner. The sounds were derived from recorded, synthesized speech. Initially subjects would hear identical “dah” or “tah” sounds — four in a row — which would reduce brain activity because of the repetition. The fifth sound could be the same or a different sound.

Researchers found that the brain signal in the left inferior frontal sulcus changed when the final sound was a different one. But if the final sound was only a different pronunciation of the same sound, the brain’s response remained steady.
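For readers who want a concrete picture of the design, the sketch below is a hypothetical illustration (in Python) of the kind of adaptation sequence described above; it is not the study's actual stimulus code, and the token names, categories, and the make_trial helper are assumptions made for the example. Each trial presents four identical syllables followed by a fifth that is either the same token, a different pronunciation of the same speech sound, or a token from the other category.

    import random

    # Hypothetical sketch of the adaptation (repetition suppression) design
    # described in the article -- not the study's actual stimuli or parameters.
    # Token names stand in for several pronunciations of "dah" and "tah".
    TOKENS = {
        "dah": ["dah_1", "dah_2", "dah_3"],
        "tah": ["tah_1", "tah_2", "tah_3"],
    }

    def make_trial(condition: str) -> list[str]:
        """Build one five-syllable trial for the given condition.

        "same"            -- fifth token identical to the first four
        "within_category" -- fifth token is a different pronunciation of the same sound
        "across_category" -- fifth token comes from the other category ("dah" vs. "tah")
        """
        category = random.choice(list(TOKENS))
        base = random.choice(TOKENS[category])
        trial = [base] * 4  # four identical syllables induce repetition suppression

        if condition == "same":
            fifth = base
        elif condition == "within_category":
            fifth = random.choice([t for t in TOKENS[category] if t != base])
        elif condition == "across_category":
            other = "tah" if category == "dah" else "dah"
            fifth = random.choice(TOKENS[other])
        else:
            raise ValueError(f"unknown condition: {condition}")

        return trial + [fifth]

    for cond in ("same", "within_category", "across_category"):
        print(cond, make_trial(cond))

Under the finding reported above, the response in the left inferior frontal sulcus would stay suppressed for the first two conditions and rebound only when the fifth syllable crosses the category boundary.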

Myers and Blumstein said the study advances efforts to understand language and speech, and how the brain makes sense of different sounds and pronunciations.

“What these results suggest is that [the left inferior frontal sulcus] is a shared resource used for both language and non-language categorization,” Blumstein said.

Financial support for the study came from the National Institute on Deafness and Other Communication Disorders (NIDCD), an Institute of the National Institutes of Health, and the Ittleson Foundation.


Story Source:

Materials provided by Brown University. Note: Content may be edited for style and length.

