Facial Emotion Recognition from Kinect Data – An Appraisal of Kinect Face Tracking Library

Tanwi Mallick, Palash Goyal, Partha Pratim Das, Arun Kumar Majumdar

Abstract

Facial expression classification and emotion recognition from gray-scale or colour images or videos have been extensively explored over the last two decades. In this paper, we address the emotion recognition problem using Kinect 1.0 data and the Kinect Face Tracking Library (KFTL). A generative approach based on facial muscle movements is used to classify emotions. We detect various Action Units (AUs) of the face from the feature points extracted by KFTL, and then recognize emotions from the detected AUs using Artificial Neural Networks (ANNs). We use six emotions, namely, Happiness, Sadness, Fear, Anger, Surprise and Neutral for our work, and appraise the strengths and weaknesses of KFTL in terms of feature extraction, AU computations, and emotion detection. We compare our work with earlier studies on emotion recognition from Kinect 1.0 data.
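The pipeline described above (facial feature points → AU values → ANN → emotion label) can be sketched as a small feedforward network. The sketch below is illustrative only, not the authors' implementation: the AU names, the hidden-layer size, and the random (untrained) weights are assumptions; a real system would obtain AU values from KFTL and train the weights on labelled samples.

```python
import numpy as np

# Hypothetical AU inputs: KFTL-style animation units, each a scalar
# roughly in [-1, 1]. The exact AU set used by the paper is not shown here.
AUS = ["upper_lip_raiser", "jaw_lowerer", "lip_stretcher",
       "brow_lowerer", "lip_corner_depressor", "outer_brow_raiser"]
EMOTIONS = ["Happiness", "Sadness", "Fear", "Anger", "Surprise", "Neutral"]

def softmax(z):
    """Numerically stable softmax over a score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

class TinyANN:
    """One-hidden-layer feedforward network: 6 AU values -> 6 emotion scores.
    Weights are random placeholders; a trained system would fit them
    (e.g. by backpropagation) on labelled AU vectors."""
    def __init__(self, n_in=6, n_hidden=10, n_out=6, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def predict(self, aus):
        # Forward pass: tanh hidden layer, softmax output layer.
        h = np.tanh(self.W1 @ np.asarray(aus, dtype=float) + self.b1)
        return softmax(self.W2 @ h + self.b2)

net = TinyANN()
scores = net.predict([0.8, 0.1, 0.6, -0.2, -0.5, 0.0])  # one AU vector per frame
label = EMOTIONS[int(np.argmax(scores))]
print(label, scores.round(3))
```

With random weights the predicted label is arbitrary; the point is the shape of the computation: a fixed-length AU vector in, a probability distribution over the six emotion classes out.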

References

  1. Black, M. J. and Yacoob, Y. (1995). Recognizing facial expressions in image sequences using local parameterized models of image motion. International Conference on Computer Vision, pages 374-381.
  2. Cohen, I., Sebe, N., Garg, A., Lew, M. S., and Huang, T. S. (2003). Facial expression recognition from video sequences: temporal and static modeling. Computer Vision and Image Understanding, 91.
  3. Cohn, J. F., Zlochower, A. J., Lien, J. J., and Kanade, T. (1998). Feature-point tracking by optical flow discriminates subtle differences in facial expression. International Conference on Automatic Face and Gesture Recognition, pages 396-401.
  4. Ekman, P. and Friesen, W. (1978). Facial action coding system: A technique for measurement of facial movement. Palo Alto, CA.: Consulting Psychologists Press.
  5. Essa, I. A. and Pentland, A. P. (1997). Coding, analysis, interpretation, recognition of facial expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19:757-763.
  6. Hung, D., Kim, H., and Huang, S. (1996). Modeling six Universal Emotions. In Human Facial Modeling Project at Cornell University. http://www.nbb.cornell.edu/neurobio/land/oldstudentprojects/cs490-95to96/hjkim/emotions.html. Last accessed on 27-Sep-2015.
  7. Kim, D. and Bien, Z. (2003). Fuzzy neural networks (FNN)-based approach for personalized facial expression recognition with novel feature selection method. IEEE International Conference on Fuzzy Systems, 2:908-913.
  8. Kim, M. H., Joo, Y. H., and Park, J. B. (2005). Emotion detection algorithm using frontal face image. In 12th International Conference on Computer Applications in Shipbuilding (ICCAS 2005), pages 2373-2378.
  9. Kolakowska, A., Landowska, A., Szwoch, M., Szwoch, W., and Wróbel, M. R. (2013). Emotion recognition and its application in software engineering. In 6th International Conference on Human System Interaction.
  10. Linköping University (2012). Candide3 Model. http://www.icg.isy.liu.se/candide/main.html. Last updated on 24-May-2012. Last accessed on 27-Sep-2015.
  11. Microsoft (2014). Face Tracking SDK. http://msdn.microsoft.com/en-us/library/jj130970.aspx. Last accessed on 27-Sep-2015.
  12. Nelson, A. (2013). Facial expression analysis with Kinect. http://themusegarden.wordpress.com/2013/02/02/facial-expression-analysis-with-kinect-thesis-update1/ and the linked updates. Last accessed on 27-Sep-2015.
  13. Nissen, S. (2014). Fast Artificial Neural Network. http://leenissen.dk/fann/wp/. Last accessed on 27-Sep-2015.
  14. Pantic, M. and Patras, I. (2006). Dynamics of facial expression: Recognition of facial actions and their temporal segments from face profile image sequences. IEEE Trans. Systems, Man, and Cybernetics - Part B: Cybernetics, 36:433-449.
  15. Tian, Y., Kanade, T., and Cohn, J. F. (2001). Recognizing action units for facial expression analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(2).
  16. Tsapatsoulis, N. and Piat, F. (2000). Exploring the time course of facial expressions with a fuzzy system. National Technical University of Athens.
  17. Carnegie Mellon University (2015). Facial Action Coding System. http://www.cs.cmu.edu/~face/facs.htm. Last updated on 24-Apr-2014. Last accessed on 27-Sep-2015.
  18. Wu, T., Fu, S., and Yang, G. (2012). Survey of the facial expression recognition research. Advances in Brain Inspired Cognitive Systems, Lecture Notes in Computer Science, 7366.
  19. Wyrembelski, A. (2013). Detection of the selected, basic emotions based on face expression using Kinect. Unpublished Report. http://stc.fs.cvut.cz/pdf13/2659.pdf. Last accessed on 27-Sep-2015.
  20. Yang, D., Kunihiro, T., Shimoda, H., and Yoshikawa, H. (1999). A study of realtime image processing method for treating human emotion by facial expression. International Conference on System, Man and Cybernetics (SMC99).
  21. Yoneyama, M., Iwano, Y., Ohtake, A., and Shirai, K. (1997). Facial expressions recognition using discrete Hopfield neural networks. International Conference on Information Processing, 3:117-120.
  22. Youssef, A. E., Aly, S. F., Ibrahim, A. S., and Abbott, A. L. (2013). Auto-optimized multimodal expression recognition framework using 3d kinect data for asd therapeutic aid. International Journal of Modeling and Optimization, 3.


Paper Citation


in Harvard Style

Mallick T., Goyal P., Das P. and Majumdar A. (2016). Facial Emotion Recognition from Kinect Data – An Appraisal of Kinect Face Tracking Library. In Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016), ISBN 978-989-758-175-5, pages 525-532. DOI: 10.5220/0005713405250532


in Bibtex Style

@conference{visapp16,
author={Tanwi Mallick and Palash Goyal and Partha Pratim Das and Arun Kumar Majumdar},
title={Facial Emotion Recognition from Kinect Data – An Appraisal of Kinect Face Tracking Library},
booktitle={Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016)},
year={2016},
pages={525-532},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005713405250532},
isbn={978-989-758-175-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, (VISIGRAPP 2016)
TI - Facial Emotion Recognition from Kinect Data – An Appraisal of Kinect Face Tracking Library
SN - 978-989-758-175-5
AU - Mallick T.
AU - Goyal P.
AU - Das P.
AU - Majumdar A.
PY - 2016
SP - 525
EP - 532
DO - 10.5220/0005713405250532