Our brain's understanding of spatial awareness is not triggered by sight alone, scientists have found, in a development that could help design technology for the visually impaired.
Researchers at the University of Edinburgh have found that our brain can use other senses -- such as touch -- to help us understand spatial awareness.
Scientists took MRI brain scans of both sighted volunteers and others who had been blind since birth while they examined three-dimensional spaces.
Both groups were first asked to feel 3-D Lego models representing a geometric layout of a room and models of abstract objects containing no enclosed spaces. The sighted volunteers were then also asked to look at photographs of the same rooms and objects.
The scans showed that activity in the part of the brain that computes the spatial layout of a scene -- known as the parahippocampal place area -- was doubled for the sighted volunteers when looking at images of a room layout compared with when they looked at images of abstract objects.
The research reinforces previous findings linking the parahippocampal place area to our understanding of spatial awareness.
Crucially, this brain activity was also much stronger for rooms compared with objects when the sighted volunteers touched the models without being able to see them. Given that the non-sighted participants showed the same results, these findings cannot be explained by visual imagery but instead demonstrate that the parahippocampal place area receives spatial information from multiple senses.
The study, published in Current Biology, was partially funded by the National Institutes of Health.
Dr Thomas Wolbers, of the University of Edinburgh's Centre for Cognitive and Neural Systems, said: "Touch plays a role in our understanding of spatial awareness in the same way that we rely on our sense of sight. Feeling a three-dimensional model to comprehend the layout of a room triggers the same part of the brain that would have been activated if the room were seen. There is no reason why other senses, such as sound, would not also have the same effect."
Scientists say the findings may help in developing technologies for the visually impaired, such as sensors that measure a space and convey that information through touch, for example as vibrations.
- Thomas Wolbers, Roberta L. Klatzky, Jack M. Loomis, Magdalena G. Wutte, Nicholas A. Giudice. Modality-Independent Coding of Spatial Layout in the Human Brain. Current Biology, 26 May 2011. DOI: 10.1016/j.cub.2011.04.038