Dec. 20, 2007 In conversation, humans recognize words primarily from the sounds they hear. However, scientists have long known that perception goes beyond the raw sights and sounds of speech: the brain constructs its own interpretation, integrating both.
For example, when combining the acoustic patterns of speech with the visual images of the speaker's mouth moving, humans sometimes reconstruct a syllable that is not physically present in either sight or sound. Although this illusion suggests spoken syllables are represented in the brain in a way that is more abstract than the physical patterns of speech, scientists haven't understood how the brain generates abstractions of this sort.
Researchers at the University of Chicago have identified brain areas responsible for this perception. One of these areas, known as Broca's region, is typically thought of as an area of the brain used for talking rather than listening.
"When the speech sounds do not correspond exactly to the words that are mouthed, the brain often conjures a third sound as an experience -- and this experience may often vary from what was actually spoken," explains Uri Hasson, lead author of the study and a post-doctoral scholar at the university's Human Neuroscience Laboratory.
"As an example, what would happen if a person's voice says 'pa,' but the person's lips mouth 'ka'? One would think you might hear 'pa,' because that is what was said. But in fact, with the conflicting auditory and visual signals, the brain is far more likely to hear 'ta,' an entirely new sound," he explains.
This demonstration is called the McGurk effect, named after Harry McGurk, a developmental psychologist from England who first noticed the phenomenon in the 1970s. In the current study, scientists used functional magnetic resonance imaging (fMRI), which maps brain activity, to demonstrate that Broca's region is responsible for the type of abstract speech processing that underlies this effect.
Although we experience speech as a series of words like print on a page, the speech signal is not as clear as print, and must be interpreted rather than simply recognized, Hasson explains.
He says this paper provides a glimpse into how such interpretations are carried out in the brain. These interpretations might be particularly important when the speech sounds are unclear, such as when conversing in a crowded bar, listening to an unfamiliar accent, or coping with hearing loss. "In all these cases, understanding what is said requires interpreting the physical speech signal. And scientists now know that Broca's region plays a major role in this process."
Details of this study were published in the December 20th issue of Neuron. Additional authors include Jeremy Skipper, Howard Nusbaum and Steven Small of the University of Chicago.
The National Institute of Mental Health supported this research.