
'Seeing' faces through touch: Brain may code facial information in shared representation between vision and haptics

Date:
September 4, 2013
Source:
Association for Psychological Science
Summary:
Our sense of touch can contribute to our ability to perceive faces, according to new research. The findings suggest that facial information may be coded in a shared representation between vision and haptics in the brain.

Happy, sad or neutral? Researchers used the "face aftereffect" to test whether the visual system responds to nonvisual signals when processing faces. Credit: © HaywireMedia / Fotolia

Our sense of touch can contribute to our ability to perceive faces, according to new research published in Psychological Science, a journal of the Association for Psychological Science.

"In daily life, we usually recognize faces through sight and almost never explore them through touch," says lead researcher Kazumichi Matsumiya of Tohoku University in Japan. "But we use information from multiple sensory modalities in order to perceive many everyday non-face objects and events, such as speech perception or object recognition -- these new findings suggest that even face processing is essentially multisensory."

In a series of studies, Matsumiya took advantage of a phenomenon called the "face aftereffect" to investigate whether our visual system responds to nonvisual signals for processing faces. In the face aftereffect, we adapt to a face with a particular expression -- happiness, for example -- which causes us to perceive a subsequent neutral face as having the opposite facial expression (i.e., sadness).

Matsumiya hypothesized that if the visual system really does respond to signals from another modality, then we should see evidence for face aftereffects from one modality to the other. So, adaptation to a face that is explored by touch should produce visual face aftereffects.

To test this, Matsumiya had participants explore face masks concealed below a mirror by touching them. After this adaptation period, the participants were visually presented with a series of faces that had varying expressions and were asked to classify the faces as happy or sad. The visual faces and the masks were created from the same exemplar.

In line with his hypothesis, Matsumiya found that exploring the face masks by touch shifted participants' perception of the visually presented faces: relative to participants who had no adaptation period, they perceived the visual faces as having the opposite facial expression.

Further experiments ruled out other explanations for the results, including the possibility that the face aftereffects emerged because participants were intentionally imagining visual faces during the adaptation period.

And a fourth experiment revealed that the aftereffect also works the other way: Visual stimuli can influence how we perceive a face through touch.

According to Matsumiya, current views on face processing assume that the visual system only receives facial signals from the visual modality -- but these experiments suggest that face perception is truly crossmodal.

"These findings suggest that facial information may be coded in a shared representation between vision and haptics in the brain," Matsumiya notes. He suggests the results could inform efforts to enhance vision and telecommunication, including the development of aids for the visually impaired.


Story Source:

The above post is reprinted from materials provided by the Association for Psychological Science. Note: Materials may be edited for content and length.


Journal Reference:

  1. K. Matsumiya. Seeing a Haptically Explored Face: Visual Facial-Expression Aftereffect From Haptic Adaptation to a Face. Psychological Science, 2013; DOI: 10.1177/0956797613486981

Cite This Page:

Association for Psychological Science. "'Seeing' faces through touch: Brain may code facial information in shared representation between vision and haptics." ScienceDaily. ScienceDaily, 4 September 2013. <www.sciencedaily.com/releases/2013/09/130904105411.htm>.
