Seeing the song: Study aims to understand how and when the auditory system registers complex auditory-visual synchrony
- Date: October 23, 2013
- Source: Northwestern University
- Summary: Imagine the brain's delight when experiencing the sounds of Beethoven's "Moonlight Sonata" while simultaneously taking in a light show produced by a visualizer. A new study did more than that. To understand how the brain responds to complex auditory-visual stimuli like music and moving images, the study tracked parts of the auditory system involved in the perceptual processing of "Moonlight Sonata" while it was synchronized with the light show made by the iTunes Jelly visualizer.
Imagine the brain's delight when experiencing the sounds of Beethoven's "Moonlight Sonata" while simultaneously taking in a light show produced by a visualizer.
A new Northwestern University study did much more than that.
To understand how the brain responds to highly complex auditory-visual stimuli like music and moving images, the study tracked parts of the auditory system involved in the perceptual processing of "Moonlight Sonata" while it was synchronized with the light show made by the iTunes Jelly visualizer.
The study shows how and when the auditory system encodes auditory-visual synchrony between complex and changing sounds and images.
Much of the related research looks at how the brain processes simple sounds and images. Locating a woodpecker in a tree, for example, is made easier when your brain combines the auditory stream (the pecking) and the visual stream (the movement of the bird) and judges that they are synchronous. If they are, the brain decides that the two sensory inputs probably came from a single source.
While that research is important, Julia Mossbridge, lead author of the study and a research associate in psychology at Northwestern, said it is also critical to expand investigations to highly complex stimuli like music and movies.
"These kinds of things are closer to what the brain actually has to manage to process in every moment of the day," she said. "Further, it's important to determine how and when sensory systems choose to combine stimuli across their boundaries.
"If someone's brain is mis-wired, sensory information could combine when it's not appropriate," she said. "For example, when that person is listening to a teacher talk while looking out a window at kids playing, and the auditory and visual streams are integrated instead of separated, this could result in confusion and misunderstanding about which sensory inputs go with what experience."
It was already known that the left auditory cortex is specialized to process sounds with precise, complex and rapid timing; this gift for auditory timing may be one reason that in most people, the left auditory cortex is used to process speech, for which timing is critical. The results of this study show that this specialization for timing applies not just to sounds, but to the timing of complex and dynamic sounds and images.
Previous research indicates that there are multi-sensory areas in the brain that link sounds and images when they change in similar ways, but much of this work has focused specifically on speech signals (e.g., lips moving as vowels and consonants are heard). Consequently, it hasn't been clear which areas of the brain process more general auditory-visual synchrony, or how this processing differs when sounds and images should not be combined.
"It appears that the brain is exploiting the left auditory cortex's gift at processing auditory timing, and is using similar mechanisms to encode auditory-visual synchrony, but only in certain situations; seemingly only when combining the sounds and images is appropriate," Mossbridge said.
Story Source:
Materials provided by Northwestern University. Original written by Hilary Hurd Anyaso. Note: Content may be edited for style and length.
Journal Reference:
- Julia A. Mossbridge, Marcia Grabowecky, Satoru Suzuki. Seeing the Song: Left Auditory Structures May Track Auditory-Visual Dynamic Alignment. PLoS ONE, 2013; 8(10): e77201. DOI: 10.1371/journal.pone.0077201