
Say ChEES(e): Software Helps UF Researchers Compare Expressions In Men And Women

Date:
February 19, 2001
Source:
University of Florida
Summary:
University of Florida researchers have turned to computer technology to quantify gender differences in one component of emotional expression -- how it is revealed by the face. They discovered that although men and women are equally expressive, men display most of their joy, disgust or other sentiments in the lower left quadrant of their face.

GAINESVILLE, Fla. -- When it comes to men, women and emotion, pet theories abound on whether one sex is more emotional or inhibited than the other.

But since such notions are rarely backed by data, University of Florida researchers turned to computer technology to quantify gender differences in one component of emotional expression -- how it is revealed by the face. They discovered that although men and women are equally expressive, men display most of their joy, disgust or other sentiments in the lower left quadrant of their face. Women, on the other hand, show their emotions across their entire countenance.

How is this significant? A leading hypothesis is that the findings reflect differences in how brains are wired, said Dawn Bowers, an associate professor in the College of Health Professions' clinical and health psychology department.

"There's been an argument that the brains of men are more compartmentalized than the brains of women," said Bowers, who is presenting the research Saturday (2/17/01) at the International Neuropsychological Society conference in Chicago. "Previous research has shown, for example, that for men, language functions are very concentrated in the left hemisphere of their brains, whereas in women they are more equally distributed across the brain.

"It's possible that is also at work in facial expressions -- that the emotional priming systems for men may be located in the right hemisphere but are more dispersed for women," said Bowers, who is affiliated with UF's Evelyn F. and William L. McKnight Brain Institute.

Sparked by thoughts that are fleeting or more lasting, the brain, nerves and muscles work together to produce revealing looks of joy, anger or sadness. Understanding this complex interaction could provide new insight into human relationships, as well as into illnesses in which facial movements can be affected. These include the flat facade of Parkinson's disease or a facial droop sometimes caused by stroke.

What's more, the UF-developed computer methodology for "digitizing the moving face" holds potential for assessing pain in patients who cannot speak. It also could be refined to enable computers to recognize and respond to human emotions, said Didem Gökçay, a UF doctoral candidate in computer and information sciences who wrote the facial analysis software program CHEES (for Computerized Human Expression Evaluation System).

"Outside the health field, automatic facial expression recognition could improve the interaction between human and machine," said Gökçay, who also is affiliated with UF's McKnight Brain Institute. "When you work on a computer now, it responds like a robot. We could move toward more friendly interaction. If the machine understands your moods by recognizing your facial expression, that would be a big step."

For the pilot study they are reporting this week, the researchers evaluated 25 male and 23 female college students. The participants were videotaped as they demonstrated looks of happiness, sadness, fear, distrust and other sentiments. The images were transferred to a computer, which captured 30 video frames for each one-second expression.

Using Gökçay's software program, the researchers, who included former UF students Ashish Desai and Charles Richardson, analyzed facial movements by quantifying changes in surface light reflection from one black-and-white frame to the next. That methodology was first conceived by UF neuroscientist Christiana M. Leonard.

"When you look at someone's face, what your brain processes are changes in light reflection across the face as it moves," Bowers said. "So that's what we were looking to do using the computer. We're measuring the face in action."

The computer program sums up differences in gray-scale intensity of the pixels that constitute each frame. The resulting scores showed that men and women moved their faces a similar amount to reveal an emotion. But when the scores were broken down by facial regions, it became clear that men's expressions were asymmetrical, with the bulk of the movement confined to the lower left quadrant.
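The release does not spell out how the regional scores were computed, so the following continuation of the sketch above is only one plausible reading: split each difference image into four quadrants and sum the gray-scale changes separately, so that an asymmetric expression shows up as one quadrant carrying most of the motion.

    def quadrant_scores(prev_frame: np.ndarray, next_frame: np.ndarray) -> dict:
        """Sum gray-scale intensity changes in each quadrant of the frame.

        Quadrants are labeled in image coordinates; in a non-mirrored
        video, the subject's lower-left quadrant appears on the
        viewer's lower right.
        """
        diff = np.abs(prev_frame.astype(np.int32) - next_frame.astype(np.int32))
        mid_y, mid_x = diff.shape[0] // 2, diff.shape[1] // 2
        return {
            "upper_left": float(diff[:mid_y, :mid_x].sum()),
            "upper_right": float(diff[:mid_y, mid_x:].sum()),
            "lower_left": float(diff[mid_y:, :mid_x].sum()),
            "lower_right": float(diff[mid_y:, mid_x:].sum()),
        }

Accumulating these scores across all 30 frames of an expression would give a per-quadrant movement total that can be compared between subjects, which is the kind of breakdown the researchers report.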

Bowers said that without the computer, she cannot consciously see these expression disparities between men and women. It will take more research to determine whether people nonetheless perceive these distinctions at some level, affecting how they interpret mood.

"There may well be a difference in what men and women pay attention to when they are communicating," Bowers said. She noted that in a previous study, she found that men and women were adept at interpreting tone of voice and content of a spoken message, but that women were slightly better at recognizing the tone and men were a bit stronger at noting the words.

Meanwhile, there is much more research to be done on the age-old question of emotional differences in men and women. Bowers said she would like to study whether the sexes differ in how they spontaneously reveal their feelings.

"We looked at posed expressions," Bowers said, "so we know that if you tell a guy to be expressive, he can be."


Story Source:

Materials provided by University of Florida. Note: Content may be edited for style and length.


