Humans are incessant imitators. We unintentionally imitate subtle aspects of each other's mannerisms, postures and facial expressions. We also imitate each other's speech patterns, including inflections, talking speed and speaking style. Sometimes, we even take on the foreign accent of the person to whom we're talking, leading to embarrassing consequences.
New research by the University of California, Riverside, published in the August issue of the journal Attention, Perception, & Psychophysics, shows that unintentional speech imitation can even make us sound like people whose voices we never hear. The journal is published by The Psychonomic Society, which promotes scientific research in psychology and allied sciences.
UCR psychology professor Lawrence D. Rosenblum and graduate students Rachel M. Miller and Kauyumari Sanchez found that when people lipread from a talker and say aloud what they've lipread, their speech sounds like that of the talker.
The researchers asked hearing individuals with no formal lipreading experience to watch a silent face articulate 80 simple words, such as "tennis" and "cabbage." Those individuals were asked to identify the words by saying them out loud clearly and quickly. To make the lipreading task easier, the test subjects were given a choice of two words (e.g., "tennis" or "table"). They were never asked to imitate or repeat the talker.
Even so, the researchers found that words spoken by the test subjects sounded more like the words of the talker they lipread than did words they spoke when simply reading from a list. That finding is evidence that unintentional speech imitation extends to lipreading, even for normal-hearing individuals with no formal lipreading experience, they wrote in a paper titled "Alignment to Visual Speech Information."
"Whether we are hearing or lipreading speech articulations, a talker's speaking style has subtle influences on our own manner of speaking," Rosenblum says. "This unintentional imitation could serve as a social glue, helping us to affiliate and empathize with each other. But it also might reflect deep aspects of the language function. Specifically, it adds to evidence that the speech brain is sensitive to -- and primed by -- speech articulation, whether heard or seen. It also adds to the evidence that a familiar talker's speaking style can help us recognize words."
The research project was funded by a grant from the National Institutes of Health's National Institute on Deafness and Other Communication Disorders.
- Miller et al. Alignment to visual speech information. Attention, Perception, & Psychophysics, 2010; 72 (6): 1614. DOI: 10.3758/APP.72.6.1614