Authors: Yu Suk Cho; Ji He Suk and Kwang Hee Han
Affiliation: Yonsei University, Republic of Korea
Keyword(s): Embodied agents, Mismatched emotion, Emotional information.
Related Ontology Subjects/Areas/Topics: Applications; Artificial Intelligence; Computer Art; e-Business; Education/Learning; e-Learning; Enterprise Information Systems; Human-Computer Interaction; Intelligent User Interfaces; Internet HCI: Web Interfaces and Usability; Knowledge Management and Information Sharing; Knowledge-Based Systems; Multimedia Systems
Abstract: Embodied agents currently attract considerable interest because of their vital role in human-human and human-computer interactions in virtual worlds. A number of researchers have found that people can recognize and distinguish emotions expressed by an embodied agent, and many studies have shown that we respond to simulated emotions in much the same way as to human emotions. This study investigates the interpretation of mismatched emotions expressed by an embodied agent (e.g., a happy face with a sad voice). The study employed a 4 (visual: happy, sad, warm, cold) × 4 (audio: happy, sad, warm, cold) within-subjects repeated-measures design. The results suggest that people perceive emotions based not on a single channel but on both channels. Moreover, facial expression (happy face vs. sad face) moderates the relative influence of the two channels: the audio channel has more influence on the interpretation of emotion when the facial expression is happy. Participants could also perceive emotions that were expressed by neither the face nor the voice in mismatched expressions, suggesting that embodied agents may convey varied and subtle emotions using only a few basic emotional expressions.