Authors:
Amaryllis Raouzaiou, Kostas Karpouzis and Stefanos Kollias
Affiliation:
Image, Video and Multimedia Systems Laboratory, National Technical University of Athens, Greece
Keyword(s):
MPEG-4 facial animation, facial expressions, emotion synthesis
Related Ontology Subjects/Areas/Topics:
Artificial Intelligence; Enterprise Information Systems; Human-Computer Interaction; Informatics in Control, Automation and Robotics; Intelligent User Interfaces; Robotics and Automation; Virtual Environment, Virtual and Augmented Reality; Virtual Reality and Augmented Reality
Abstract:
Man-Machine Interaction (MMI) systems that utilize multimodal information about users' current emotional state are presently at the forefront of interest of the computer vision and artificial intelligence communities. Interfaces with human faces expressing emotions may help users feel at home when interacting with a computer, because faces are accepted as the most expressive means of communicating and recognizing emotions. Thus, emotion synthesis can enhance the atmosphere of a virtual environment and communicate messages far more vividly than any textual or speech information. In this paper, we present an abstract means of describing facial expressions, utilizing concepts included in the MPEG-4 standard to synthesize expressions from a reduced representation, suitable for networked and lightweight applications.
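The reduced representation mentioned in the abstract can be pictured as a sparse set of MPEG-4 Facial Animation Parameter (FAP) values per expression. The following is a minimal illustrative sketch, not the paper's actual method: the FAP indices, names, and amplitudes below are placeholder assumptions, not values taken from the MPEG-4 standard or from this work.

```python
from typing import Dict

# Hypothetical sparse expression profile: only the FAPs an expression
# actually displaces are stored, which keeps the payload small for
# networked and lightweight applications. Indices/amplitudes are
# placeholders for illustration only.
JOY_PROFILE: Dict[int, float] = {
    3: 200.0,   # placeholder FAP: jaw opening
    6: 150.0,   # placeholder FAP: left lip-corner stretch
    7: 150.0,   # placeholder FAP: right lip-corner stretch
}

def interpolate_profile(profile: Dict[int, float], t: float) -> Dict[int, float]:
    """Scale a neutral-to-expression transition by t in [0, 1].

    At t = 0 the face is neutral (all stored FAPs zero); at t = 1 the
    full expression profile is applied.
    """
    t = max(0.0, min(1.0, t))
    return {fap: amplitude * t for fap, amplitude in profile.items()}

# Halfway through the transition, every stored FAP is at half amplitude.
half = interpolate_profile(JOY_PROFILE, 0.5)
print(half[3])  # 100.0
```

A decoder holding the same FAP table can animate a face model from just these few scalars per frame, which is the appeal of a parameter-based representation over transmitting geometry.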