Computer, read my lips
A computer is being taught to interpret human emotions based on lip patterns, according to research published in the International Journal of Artificial Intelligence and Soft Computing. The system could improve the way we interact with computers and perhaps allow disabled people to use computer-based communication devices, such as voice synthesizers, more effectively and efficiently. Karthigayan Muthukaruppan of Manipal International University in Selangor, Malaysia, and co-workers have developed a system using a genetic algorithm, which improves with each iteration, to match irregular ellipse fitting equations to the shape of the human mouth displaying different emotions. They have used photos of individuals from South-East Asia and Japan to train a computer to recognize the six commonly accepted human emotions -- happiness, sadness, fear, anger, disgust, surprise -- and a neutral expression. The upper and lower lips are analyzed by the algorithm as two separate ellipses.
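The article does not reproduce the team's irregular-ellipse equations or their genetic-algorithm settings, so the following is only a minimal sketch of the general idea: a select-crossover-mutate loop that evolves the parameters of a plain axis-aligned ellipse, (cx, cy, a, b), to fit a set of 2-D contour points. All function names, population sizes, and mutation rates here are illustrative assumptions, not the published method.

```python
import math
import random

def ellipse_error(params, points):
    """Mean squared residual of ((x-cx)/a)^2 + ((y-cy)/b)^2 = 1 over the points."""
    cx, cy, a, b = params
    total = sum((((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0) ** 2
                for x, y in points)
    return total / len(points)

def fit_ellipse_ga(points, generations=300, pop_size=60, seed=0):
    """Evolve ellipse parameters (cx, cy, a, b) toward the best fit."""
    rng = random.Random(seed)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    span_x = max(xs) - min(xs) + 1.0
    span_y = max(ys) - min(ys) + 1.0

    def random_candidate():
        # Scatter initial guesses around the point cloud's bounding box.
        return [rng.uniform(min(xs), max(xs)),   # cx
                rng.uniform(min(ys), max(ys)),   # cy
                rng.uniform(0.1, span_x),        # semi-axis a
                rng.uniform(0.1, span_y)]        # semi-axis b

    pop = [random_candidate() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: ellipse_error(p, points))
        elite = pop[: pop_size // 4]                  # selection
        next_gen = list(elite)
        while len(next_gen) < pop_size:
            mum, dad = rng.sample(elite, 2)
            child = [rng.choice(pair) for pair in zip(mum, dad)]  # crossover
            child = [g + rng.gauss(0.0, 0.05) for g in child]     # mutation
            child[2] = max(child[2], 1e-3)            # keep semi-axes positive
            child[3] = max(child[3], 1e-3)
            next_gen.append(child)
        pop = next_gen
    return min(pop, key=lambda p: ellipse_error(p, points))

# Synthetic "lip" contour sampled from the ellipse centred at (0, 0)
# with semi-axes a=2 and b=1.
arc = [(2.0 * math.cos(t), 1.0 * math.sin(t))
       for t in (i * 2 * math.pi / 40 for i in range(40))]
best = fit_ellipse_ga(arc)
print(ellipse_error(best, arc))  # residual shrinks as the population evolves
```

In practice each lip would contribute its own set of contour points (extracted from the photograph) and its own fitted ellipse, and the fitted parameters would then serve as the features describing the mouth shape.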
"In recent years, there has been a growing interest in improving all aspects of interaction between humans and computers, especially in the area of human emotion recognition by observing facial expression," the team explains. Earlier researchers developed an understanding that allows emotion to be recreated by manipulating a representation of the human face on a computer screen. Such research is currently informing the development of more realistic animated actors and even the behavior of robots. However, the inverse process, in which a computer recognizes the emotion behind a real human face, remains a difficult problem to tackle.
It is well known that many deeper emotions are betrayed by more than movements of the mouth. A genuine smile, for instance, involves flexing of the muscles around the eyes, and eyebrow movements are almost universally essential to the subconscious interpretation of a person's feelings. However, the lips remain a crucial part of the outward expression of emotion. The team's algorithm can successfully classify the six emotions and the neutral expression described above.
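The article does not report how the fitted ellipse parameters are mapped to emotion labels, nor the values the team measured. As a purely illustrative toy, one simple approach would be a nearest-centroid classifier over the semi-axes of the two lip ellipses; every number and label prototype below is invented for the example.

```python
import math

# Hypothetical per-emotion "average" features (upper_a, upper_b, lower_a,
# lower_b): the semi-axes of the two fitted lip ellipses. These values are
# invented for illustration and do not come from the study.
CENTROIDS = {
    "happiness": (2.6, 0.8, 2.6, 1.1),
    "sadness":   (2.0, 0.5, 2.0, 0.7),
    "surprise":  (1.6, 1.2, 1.6, 1.5),
    "neutral":   (2.2, 0.6, 2.2, 0.8),
}

def classify(features):
    """Return the label whose centroid is nearest in Euclidean distance."""
    def dist(centroid):
        return math.sqrt(sum((f - c) ** 2 for f, c in zip(features, centroid)))
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label]))

print(classify((2.5, 0.85, 2.55, 1.05)))  # nearest to the "happiness" prototype
```

A real system would learn these prototypes (or a more expressive decision boundary) from the labeled training photographs rather than hard-coding them.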
The researchers suggest that an initial application of such an emotion detector might be to help disabled patients who lack speech interact more effectively with computer-based communication devices.
Source: Inderscience Publishers
- Computer scans lips to identify emotions (UPI, Mon, 10 Sep 2012, 18:30:18 EDT)
- Computer, read my lips: Emotion detector developed using a genetic algorithm (Science Daily, Mon, 10 Sep 2012, 18:00:26 EDT)
- Computer, read my lips: Emotion detector developed using a genetic algorithm (Physorg, Mon, 10 Sep 2012, 10:02:55 EDT)