How the sounds we hear help us predict how things feel
- Date: August 24, 2022
- Source: University of East Anglia
- Summary: Researchers have made an important discovery about the way our brains process the sensations of sound and touch. The new study reveals how the brain's different sensory systems are all closely interconnected -- with regions that respond to touch also involved when we listen to sounds associated with touching objects (for example, the sound of typing on a keyboard or crushing paper). It is hoped that understanding this key area of brain function may in future help people who are neurodiverse, or with conditions such as schizophrenia or anxiety. And it could lead to developments in brain-inspired computing and AI.
Researchers at the University of East Anglia have made an important discovery about the way our brains process the sensations of sound and touch.
A new study published today shows how the brain's different sensory systems are all closely interconnected -- with regions that respond to touch also involved when we listen to specific sounds associated with touching objects.
They found that these areas of the brain can tell the difference between sounds such as a ball bouncing and typing on a keyboard.
It is hoped that understanding this key area of brain function may in future help people who are neurodiverse, or with conditions such as schizophrenia or anxiety. And it could lead to developments in brain-inspired computing and AI.
Lead researcher Dr Fraser Smith, from UEA's School of Psychology, said: "We know that when we hear a familiar sound such as a bouncing ball, this leads us to expect to see a particular object. But what we have found is that it also leads the brain to represent what it might feel like to touch and interact with that object.
"These expectations can help the brain process sensory information more efficiently."
The research team used an MRI scanner to collect brain imaging data while 10 participants listened to sounds generated by interacting with objects -- such as bouncing a ball, knocking on a door, crushing paper, or typing on a keyboard.
Using a technique called functional MRI (fMRI), they measured activity throughout the brain.
They used sophisticated machine learning analysis techniques to test whether the activity generated in the earliest touch areas of the brain (primary somatosensory cortex) could tell apart sounds generated by different types of object interaction (bouncing a ball versus typing on a keyboard).
They also ran a similar analysis on control sounds, similar to those used in hearing tests, to rule out the possibility that just any sound can be discriminated in this brain region.
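To give a sense of how this kind of decoding analysis works, here is a minimal sketch in Python. It is a hypothetical illustration rather than the study's actual pipeline: the voxel patterns are simulated, the trial counts and region size are invented, and scikit-learn's LinearSVC stands in for whatever classifier the team used. The key idea is that cross-validated classification accuracy reliably above the 50% chance level suggests the region's activity patterns distinguish the two sound categories.

```python
# Hypothetical sketch of multivoxel pattern analysis (MVPA) decoding:
# can voxel activity patterns in a region tell two sound categories apart?
# All data here are simulated; the study used real fMRI responses.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

n_trials_per_class = 40   # hypothetical trials per sound category
n_voxels = 200            # hypothetical number of voxels in the region

# Simulated single-trial patterns for two categories (e.g. "bouncing
# ball" vs. "typing"), each mixing noise with a weak class-specific signal.
signal = rng.normal(0, 1, n_voxels)
X_ball = rng.normal(0, 1, (n_trials_per_class, n_voxels)) + 0.3 * signal
X_keys = rng.normal(0, 1, (n_trials_per_class, n_voxels)) - 0.3 * signal
X = np.vstack([X_ball, X_keys])
y = np.array([0] * n_trials_per_class + [1] * n_trials_per_class)

# Cross-validated linear classifier: mean accuracy well above 0.50
# indicates the patterns carry category information.
clf = LinearSVC(dual=False)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"Decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

On this simulated data the accuracy lands well above chance; with real fMRI data the same logic, combined with proper preprocessing and statistical testing, is what supports the claim that a "touch" region carries sound-category information.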
Researcher Dr Kerri Bailey said: "Our research shows that parts of our brains, which were thought to only respond when we touch objects, are also involved when we listen to specific sounds associated with touching objects.
"This supports the idea that a key role of these brain areas is to predict what we might experience next, from whatever sensory stream is currently available.
Dr Smith added: "Our findings challenge how neuroscientists traditionally understand the workings of sensory brain areas and demonstrate that the brain's different sensory systems are actually all very interconnected.
"Our assumption is that the sounds provide predictions to help our future interaction with objects, in line with a key theory of brain function -- called Predictive Processing.
"Understanding this key mechanism of brain function may provide compelling insights into mental health conditions such as schizophrenia, autism or anxiety and in addition, lead to developments in brain-inspired computing and AI."
This study was led by UEA, in collaboration with researchers at Aix-Marseille University (France) and Maastricht University (Netherlands).
Story Source:
Materials provided by University of East Anglia. Note: Content may be edited for style and length.
Journal Reference:
- Kerri Bailey, Bruno Giordano, Amanda Kaas, Fraser Smith. Decoding sounds depicting hand-object interactions in primary somatosensory cortex. Cerebral Cortex, 2022 (accepted/in press).