
Facial Expression Recognition Software Developed

Date:
February 25, 2008
Source:
Universidad Politécnica de Madrid
Summary:
Researchers have developed an algorithm that is capable of processing 30 images per second to recognize a person's facial expressions in real time and categorize them as one of six prototype expressions: anger, disgust, fear, happiness, sadness and surprise.

Researchers at the Department of Artificial Intelligence (DIA) of the Universidad Politécnica de Madrid’s School of Computing (FIUPM) have, in conjunction with Madrid’s Universidad Rey Juan Carlos, developed an algorithm that is capable of processing 30 images per second to recognize a person’s facial expressions in real time and categorize them as one of six prototype expressions: anger, disgust, fear, happiness, sadness and surprise.

Applying this facial expression recognition algorithm, the prototype processes a sequence of frontal images of a moving face and recognizes the person's facial expression. The software can be applied to video sequences in realistic situations and can identify the facial expression of a person seated in front of a computer screen. Although still only a prototype, the software runs on a desktop computer or even a laptop.
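
The release does not detail how the prototype is implemented. Purely as an illustration of what such a real-time loop might look like, the sketch below uses the open-source OpenCV library to capture webcam frames and detect a frontal face; the classify_expression function is a placeholder standing in for the researchers' actual classifier, which is not published here.

```python
# Hypothetical sketch of a real-time facial expression loop (not the authors' code).
# Assumes OpenCV (opencv-python) is installed; classify_expression is a placeholder.
import cv2

EXPRESSIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def classify_expression(face_img) -> str:
    """Placeholder: the actual classifier described in the article is not public."""
    return EXPRESSIONS[3]  # dummy result ("happiness"); the real system picks among the six

def main():
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            label = classify_expression(gray[y:y + h, x:x + w])
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("expressions", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # quit on 'q'
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

A loop of this kind can typically keep up with an ordinary webcam's frame rate on a desktop or laptop, which is consistent with the 30 images per second figure quoted above.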

Flexibility and adaptability

The system analyses the face of a person sitting in front of a camera connected to a computer running the prototype. It processes the person's face (up to 30 images per second) through several boxes, each "attached" to or focusing on a part of the user's face. These boxes monitor the user's facial movements until the system can determine the facial expression, by comparison with the expressions captured from different people (333 sequences) in the Cohn-Kanade database.
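
The release describes this tracking only at a high level. As a rough sketch of the general idea, and not the published algorithm, the snippet below attaches a few boxes to facial regions of a detected face and re-locates each box in subsequent frames with normalised template matching; the region layout and the matching method are assumptions.

```python
# Hypothetical sketch: "boxes" attached to facial regions, tracked by template matching.
# Region proportions and the matching method are assumptions, not the published algorithm.
import cv2

# Rough regions expressed as fractions of the detected face box: (x, y, w, h)
REGIONS = {
    "brows": (0.15, 0.20, 0.70, 0.15),
    "eyes":  (0.15, 0.35, 0.70, 0.15),
    "mouth": (0.25, 0.65, 0.50, 0.25),
}

def init_boxes(gray, face):
    """Cut out a template for each region inside the detected face rectangle."""
    fx, fy, fw, fh = face
    boxes = {}
    for name, (rx, ry, rw, rh) in REGIONS.items():
        x, y = int(fx + rx * fw), int(fy + ry * fh)
        w, h = int(rw * fw), int(rh * fh)
        boxes[name] = ((x, y, w, h), gray[y:y + h, x:x + w].copy())
    return boxes

def track_boxes(gray, boxes):
    """Re-locate each region template in the new frame (normalised cross-correlation)."""
    updated = {}
    margin = 20  # search window around the previous position, in pixels
    for name, ((x, y, w, h), tmpl) in boxes.items():
        x0, y0 = max(x - margin, 0), max(y - margin, 0)
        search = gray[y0:y + h + margin, x0:x + w + margin]
        res = cv2.matchTemplate(search, tmpl, cv2.TM_CCOEFF_NORMED)
        _, _, _, (dx, dy) = cv2.minMaxLoc(res)  # best-match location
        updated[name] = ((x0 + dx, y0 + dy, w, h), tmpl)
    return updated
```

In practice, helpers like these would slot into a capture loop such as the one sketched earlier, and the tracked regions would feed whatever features the classifier compares against the stored Cohn-Kanade sequences.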

The system’s success rate on the Cohn-Kanade database is 89%. It can work under adverse conditions where ambient lighting, frontal facial movements or camera displacements produce major changes in facial appearance.
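
A success rate of this kind is usually just the fraction of labelled test sequences whose predicted expression matches the annotation. The short sketch below computes that figure, plus a per-expression breakdown, from hypothetical predictions; it is not the authors' evaluation protocol.

```python
# Hypothetical accuracy computation over labelled test sequences (not the authors' protocol).
from collections import Counter

def accuracy_report(true_labels, predicted_labels):
    """Overall and per-expression accuracy for paired label lists."""
    assert len(true_labels) == len(predicted_labels)
    correct, total = Counter(), Counter()
    for t, p in zip(true_labels, predicted_labels):
        total[t] += 1
        correct[t] += int(t == p)
    overall = sum(correct.values()) / len(true_labels)
    per_class = {c: correct[c] / total[c] for c in total}
    return overall, per_class

# Toy example: 8 of 9 sequences classified correctly -> overall accuracy of about 0.89.
truth = ["anger", "fear", "happiness", "happiness", "sadness",
         "surprise", "disgust", "anger", "fear"]
pred  = ["anger", "fear", "happiness", "happiness", "sadness",
         "surprise", "disgust", "anger", "surprise"]
print(accuracy_report(truth, pred))
```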

This software has a range of applications: advanced human-computer interfaces, improved interaction with e-commerce consumers, and metaverse avatars with an unprecedented capability to relate to the person they represent.

Multiple applications

This software can enrich advanced human-computer interfaces because it would enable the construction of avatars that really do simulate a person’s facial expression. This is a really exciting prospect for sectors like the video games industry.

Electronic commerce could also benefit from this technology. During the e-commerce buying process, the computer would be able to identify potential buyers' gestures, determine whether or not they intend to make a purchase, and even gauge how satisfied they are with a product or service, helping to reduce the ambiguities of spoken or written language.

Applied to metaverses like Second Life, this software would also enable the avatars representing system users to act out the feelings of the user captured through facial expressions.

Innovations

Although there are some facial analysis products on the market, none specifically target the analysis of user facial expressions. Visit the Computational Perception and Robotics Research Group’s website for videos illustrating the algorithm in operation.

Additionally, while most similar systems developed by other researchers tackle only part of the expression recognition problem, the developed prototype does the whole job: 1) it locates and tracks the face in the image using an algorithm that works despite changes in illumination or user movement, and 2) it classifies the user's facial expression. Finally, it also incorporates an original algorithm that calculates the likely evolution of the analysed user's facial expressions.
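
As a final, hedged illustration of how those pieces could fit together, the skeleton below chains a tracking stage, a per-frame classification stage and a temporal step; the exponential smoothing used here to estimate the likely evolution of the expression is an assumption standing in for the original algorithm, whose details are not given in this release.

```python
# Hypothetical pipeline skeleton: track -> classify -> temporal smoothing.
# The smoothing step is an assumption; the article only says the prototype
# predicts the likely evolution of the user's expression, not how.
from dataclasses import dataclass, field
from typing import Dict

EXPRESSIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

@dataclass
class ExpressionPipeline:
    alpha: float = 0.3  # smoothing factor per frame
    belief: Dict[str, float] = field(
        default_factory=lambda: {e: 1 / len(EXPRESSIONS) for e in EXPRESSIONS})

    def track(self, frame):
        """Stage 1 (placeholder): locate and track the face despite lighting changes or movement."""
        return frame  # would return the aligned face region

    def classify(self, face) -> Dict[str, float]:
        """Stage 2 (placeholder): per-frame scores for the six prototype expressions."""
        return {e: 1 / len(EXPRESSIONS) for e in EXPRESSIONS}

    def update(self, frame) -> str:
        """Blend the new frame's scores into the running belief and return the best guess."""
        scores = self.classify(self.track(frame))
        for e in EXPRESSIONS:
            self.belief[e] = (1 - self.alpha) * self.belief[e] + self.alpha * scores[e]
        return max(self.belief, key=self.belief.get)
```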

The results of this research were published in the January issue of Pattern Analysis and Applications in an article authored by Luis Baumela and Enrique Muñoz, of the DIA, and José Miguel Buenaposada, of the Universidad Rey Juan Carlos de Madrid's College of Computer Systems Engineering. http://www.springerlink.com/content/q075h33723m475k1/?p=d626d623f4294fde878ed45706cd3971&pi=2

Video of facial expressions rated by the new software: http://www.dia.fi.upm.es/%7Epcr/videos/real_experiment_sequence.mpg


Story Source:

Materials provided by Universidad Politécnica de Madrid. Note: Content may be edited for style and length.


