'Seeing' faces through touch: Brain may code facial information in shared representation between vision and haptics

Date: September 4, 2013
Source: Association for Psychological Science
Summary: Our sense of touch can contribute to our ability to perceive faces, according to new research. The findings suggest that facial information may be coded in a shared representation between vision and haptics in the brain.

Image: Happy, sad, or neutral? In the "face aftereffect," adapting to a face with a particular expression causes a subsequent neutral face to appear to have the opposite expression. Credit: HaywireMedia / Fotolia

Our sense of touch can contribute to our ability to perceive faces, according to new research published in Psychological Science, a journal of the Association for Psychological Science.

"In daily life, we usually recognize faces through sight and almost never explore them through touch," says lead researcher Kazumichi Matsumiya of Tohoku University in Japan. "But we use information from multiple sensory modalities in order to perceive many everyday non-face objects and events, such as speech perception or object recognition -- these new findings suggest that even face processing is essentially multisensory."

In a series of studies, Matsumiya took advantage of a phenomenon called the "face aftereffect" to investigate whether our visual system responds to nonvisual signals for processing faces. In the face aftereffect, we adapt to a face with a particular expression -- happiness, for example -- which causes us to perceive a subsequent neutral face as having the opposite facial expression (i.e., sadness).

Matsumiya hypothesized that if the visual system really does respond to signals from another modality, then we should see evidence for face aftereffects from one modality to the other. So, adaptation to a face that is explored by touch should produce visual face aftereffects.

To test this, Matsumiya had participants explore face masks concealed below a mirror by touching them. After this adaptation period, the participants were visually presented with a series of faces that had varying expressions and were asked to classify the faces as happy or sad. The visual faces and the masks were created from the same exemplar.

In line with his hypothesis, Matsumiya found that exploring the face masks by touch shifted participants' perception of the visually presented faces relative to participants who had no adaptation period: the visual faces were perceived as having the expression opposite to the one they had touched.
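
In such studies, an aftereffect is typically quantified as a shift in the point of subjective equality (PSE): the morph level along a sad-to-happy continuum that observers judge "happy" half the time. The study's analysis code is not reproduced here; the Python sketch below is only a minimal illustration of how such a shift can be estimated by fitting a logistic psychometric function, with the morph coding and response proportions invented for the example.

    # Minimal sketch: estimating a face-aftereffect shift as a change in the
    # point of subjective equality (PSE). Illustrative only -- the morph
    # coding (-1 = clearly sad ... +1 = clearly happy) and the response
    # proportions below are invented, not the study's data.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(x, pse, slope):
        # Psychometric function: probability of a 'happy' response at morph level x.
        return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

    morph_levels = np.linspace(-1.0, 1.0, 9)   # sad-to-happy morph continuum

    # Proportion of 'happy' responses at each morph level (made-up numbers).
    baseline = np.array([0.02, 0.05, 0.15, 0.35, 0.50, 0.65, 0.85, 0.95, 0.98])
    adapted  = np.array([0.01, 0.02, 0.08, 0.20, 0.33, 0.50, 0.70, 0.90, 0.97])

    for label, data in [("no adaptation", baseline),
                        ("after touching a happy mask", adapted)]:
        (pse, slope), _ = curve_fit(logistic, morph_levels, data, p0=[0.0, 4.0])
        print(f"{label}: PSE = {pse:+.3f}")

    # A rightward PSE shift after adapting to a happy face means a physically
    # neutral face (morph level 0) now tends to be judged 'sad' -- the aftereffect.

In the cross-modal version of the paradigm, the adaptation comes from touch while the test faces are visual; a reliable PSE shift is the evidence that haptic adaptation transferred to vision.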

Further experiments ruled out other explanations for the results, including the possibility that the face aftereffects emerged because participants were intentionally imagining visual faces during the adaptation period.

And a fourth experiment revealed that the aftereffect also works the other way: Visual stimuli can influence how we perceive a face through touch.

According to Matsumiya, current views on face processing assume that the visual system only receives facial signals from the visual modality -- but these experiments suggest that face perception is truly crossmodal.

"These findings suggest that facial information may be coded in a shared representation between vision and haptics in the brain," notes Matsumiya, suggesting that these findings may have implications for enhancing vision and telecommunication in the development of aids for the visually impaired.


Story Source:

The above story is based on materials provided by Association for Psychological Science. Note: Materials may be edited for content and length.


Journal Reference:

  1. K. Matsumiya. Seeing a Haptically Explored Face: Visual Facial-Expression Aftereffect From Haptic Adaptation to a Face. Psychological Science, 2013; DOI: 10.1177/0956797613486981

