Interpretation of Time Dependent Facial Expressions in Terms of Emotional Stimuli

Roman Gorbunov, Emilia Barakova, Matthias Rauterberg

2012

Abstract

In this paper we demonstrate how genetic programming can be used to interpret time-dependent facial expressions in terms of emotional stimuli of different types and intensities. In our analysis we used video recordings of facial expressions made during the Mars-500 experiment, in which six participants were isolated for 520 days to simulate a flight to Mars. FaceReader, commercial software developed by VicarVision and Noldus Information Technology, was used to extract seven time-dependent components of facial expressions from the video recordings. To interpret these components we propose a mathematical model of emotional stimuli, assuming that the dynamics of the facial expressions are determined by emotional stimuli of different types and intensities and by the facial expression at the moment of each stimulus. Genetic programming is used to find the locations, types and intensities of the emotional stimuli as well as the way the facial expressions respond to them.
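To illustrate the general idea, the following is a minimal, hypothetical sketch (not the authors' actual model or code, and using a simple parameter-evolving genetic algorithm rather than full genetic programming): each candidate solution is a set of stimuli, each with a time and an intensity; a stimulus adds an exponentially decaying response to the expression trace; and fitness is the mean squared error between the modeled and observed traces. All names, the decay form, and the operator settings are assumptions made for the example.

```python
import math
import random

random.seed(0)  # reproducible run for this sketch

def expression(t, stimuli, decay=1.0):
    """Expression intensity at time t: each past stimulus (ts, a)
    contributes a response a * exp(-decay * (t - ts))."""
    return sum(a * math.exp(-decay * (t - ts)) for ts, a in stimuli if ts <= t)

# Synthetic "observed" trace generated from a known ground truth,
# so we can check whether evolution recovers it.
TIMES = [0.5 * k for k in range(20)]
TRUE_STIMULI = [(1.0, 0.8), (5.0, 0.4)]
OBSERVED = [expression(t, TRUE_STIMULI) for t in TIMES]

def fitness(stimuli):
    """Mean squared error against the observed trace (lower is better)."""
    return sum((expression(t, stimuli) - o) ** 2
               for t, o in zip(TIMES, OBSERVED)) / len(TIMES)

def mutate(stimuli):
    """Perturb the time or the intensity of one randomly chosen stimulus."""
    s = list(stimuli)
    i = random.randrange(len(s))
    ts, a = s[i]
    if random.random() < 0.5:
        ts = max(0.0, ts + random.gauss(0, 0.3))
    else:
        a += random.gauss(0, 0.1)
    s[i] = (ts, a)
    return s

def evolve(pop_size=30, generations=200, n_stimuli=2):
    """Truncation-selection evolution of stimulus parameter sets."""
    pop = [[(random.uniform(0, 10), random.uniform(0, 1))
            for _ in range(n_stimuli)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]  # keep the better half
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=fitness)

best = evolve()
```

In the paper's setting the "observed" trace would be a FaceReader component rather than a synthetic signal, and the search would also evolve the number and types of stimuli and the response function itself.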

References

  1. Banzhaf, W., Nordin, P., Keller, R. E., and Francone, F. D. (1998). Genetic Programming - An Introduction: On the Automatic Evolution of Computer Programs and Its Applications. Morgan Kaufmann.
  2. Barakova, E. I. and Lourens, T. (2010). Expressing and interpreting emotional movements in social games with robots. Personal and Ubiquitous Computing, 14:457-467.
  3. Ekman, P. and Friesen, W. V. (1977). Manual for the Facial Action Coding System. Consulting Psychologists Press, Palo Alto, CA.
  4. Gouizi, K., Reguig, F. B., and Maaoui, C. (2011). Emotion recognition from physiological signals. Journal of Medical Engineering and Technology, 35:300-307.
  5. Grosz, B. J., Kraus, S., Talman, S., Stossel, B., and Havlin, M. (2004). The influence of social dependencies on decision-making: Initial investigations with a new game. Proceedings of the 3rd International Joint Conference on Autonomous Agents and Multiagent Systems, 2:782-789.
  6. Hill, R. P. and Mazis, M. B. (1986). Measuring emotional responses to advertising. Advances in Consumer Research, 2:164-169.
  7. Lourens, T., van Berkel, R., and Barakova, E. (2010). Communicating emotions and mental states to robots in a real time parallel framework using laban movement analysis. Robotics and Autonomous Systems, 58:1256-1265.
  8. Marian, D. E. and Shimamura, A. P. (2011). Emotions in context: Pictorial influences on affective attributions. Emotion, 12:371-375.
  9. Poels, K. and Dewitte, S. (2006). How to capture the heart? Reviewing 20 years of emotion measurement in advertising. KUL Working Paper No. MO 0605. Available at SSRN: http://ssrn.com/abstract=944401 or http://dx.doi.org/10.2139/ssrn.944401.
  10. Scherer, K. R. (2003). Vocal communication of emotion: A review of research paradigms. Speech Communication, 40:227-256.
  11. Segaran, T. (2008). Programming Collective Intelligence: Building Smart Web 2.0 Applications. O'Reilly Media.
  12. Terzis, V., Moridis, C., and Economides, A. (2011). Measuring instant emotions based on facial expressions during computer-based assessment. Personal and Ubiquitous Computing, pages 1-10.
  13. Uyl, M. J. D. and van Kuilenburg, H. (2005). The FaceReader: Online facial expression recognition. Measuring Behavior 2005, 5th International Conference on Methods and Techniques in Behavioral Research, pages 589-590.
  14. van den Broek, E. L., Janssen, J. H., Westerink, J. H., and Healey, J. A. (2009). Prerequisites for Affective Signal Processing (ASP). INSTICC Press, Portugal.


Paper Citation


in Harvard Style

Gorbunov R., Barakova E. and Rauterberg M. (2012). Interpretation of Time Dependent Facial Expressions in Terms of Emotional Stimuli. In Proceedings of the 4th International Joint Conference on Computational Intelligence - Volume 1: ECTA, (IJCCI 2012) ISBN 978-989-8565-33-4, pages 231-237. DOI: 10.5220/0004166302310237


in Bibtex Style

@conference{ecta12,
author={Roman Gorbunov and Emilia Barakova and Matthias Rauterberg},
title={Interpretation of Time Dependent Facial Expressions in Terms of Emotional Stimuli},
booktitle={Proceedings of the 4th International Joint Conference on Computational Intelligence - Volume 1: ECTA, (IJCCI 2012)},
year={2012},
pages={231-237},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004166302310237},
isbn={978-989-8565-33-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 4th International Joint Conference on Computational Intelligence - Volume 1: ECTA, (IJCCI 2012)
TI - Interpretation of Time Dependent Facial Expressions in Terms of Emotional Stimuli
SN - 978-989-8565-33-4
AU - Gorbunov R.
AU - Barakova E.
AU - Rauterberg M.
PY - 2012
SP - 231
EP - 237
DO - 10.5220/0004166302310237