Authors: Khadoudja Ghanem ¹; Alice Caplier ² and Sébastien Stillittano ²

Affiliations: ¹ Mentouri University, Algeria; ² GIPSA-lab/DIS, France
Keyword(s): Facial Expression, Intensity Estimation, Belief Theory.
Related Ontology Subjects/Areas/Topics: Computer Vision, Visualization and Computer Graphics; Feature Extraction; Features Extraction; Image and Video Analysis; Informatics in Control, Automation and Robotics; Signal Processing, Sensors, Systems Modeling and Control
Abstract:
This article presents a new method to estimate the intensity of a human facial expression. Supposing an expression occurring on a face has been recognized among the six universal emotions (joy, disgust, surprise, sadness, anger, fear), the estimation of the expression's intensity is based on the determination of the degree of geometrical deformation of some facial features and on the analysis of several distances computed on skeletons of expressions. These skeletons are the result of a contour segmentation of the permanent facial features (eyes, brows, mouth). The proposed method uses belief theory for data fusion. The intensity of the recognized expression is scored on a three-point ordinal scale: "low intensity", "medium intensity" or "high intensity". Experiments on a large number of images validate our method and yield good estimates of facial expression intensity. We have implemented and tested the method on the following three expressions: joy, surprise and disgust.
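The abstract describes fusing several distance-based cues with belief theory to pick one of three intensity levels. As a rough illustration of that fusion step, the sketch below applies Dempster's rule of combination to two hypothetical mass functions over the frame {low, medium, high}; the specific masses, cue names (`m_eyes`, `m_mouth`), and decision rule are assumptions for illustration, not the paper's actual model.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination.

    m1, m2: dicts mapping frozensets (subsets of the frame of
    discernment) to mass values summing to 1. Returns the
    normalized combined mass function.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to contradictory pairs
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    # renormalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

LOW, MED, HIGH = "low", "medium", "high"

# Hypothetical masses derived from two facial distance measurements;
# sets like {MED, HIGH} encode doubt between neighboring levels.
m_eyes = {frozenset([HIGH]): 0.6,
          frozenset([MED, HIGH]): 0.3,
          frozenset([LOW, MED, HIGH]): 0.1}
m_mouth = {frozenset([MED]): 0.5,
           frozenset([MED, HIGH]): 0.4,
           frozenset([LOW, MED, HIGH]): 0.1}

fused = combine(m_eyes, m_mouth)
# simple decision: keep the subset with the largest fused mass
decision = max(fused, key=fused.get)  # → frozenset({"high"})
```

Representing hypotheses as frozensets lets a cue remain noncommittal between adjacent intensity levels instead of being forced into a single class, which is the usual appeal of belief-theoretic fusion over plain voting.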