Facial Expression Recognition based on EOG toward Emotion Detection for Human-Robot Interaction

Aniana Cruz, Diogo Garcia, Gabriel Pires, Urbano Nunes

2015

Abstract

The ability of an intelligent system to recognize the user's emotional and mental states is of considerable interest for human-robot interaction and human-machine interfaces. This paper describes an automatic recognizer of facial expressions around the eyes and forehead based on electrooculographic (EOG) signals. Six movements of the eyes and forehead region, namely up, down, right, left, blink, and frown, are detected and reproduced in an avatar, with the aim of analyzing how they can contribute to the characterization of facial expression. The recognition algorithm extracts time- and frequency-domain features from the EOG signals, which are then classified in real time by a multiclass LDA classifier. Offline and online classification showed sensitivities of around 92% and 85%, respectively.
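The pipeline the abstract describes (time- and frequency-domain features extracted per EOG window, then a multiclass LDA decision) can be sketched as follows. This is an illustrative sketch only: the specific feature set, the 250 Hz sampling rate, and the synthetic "saccade-like" and "blink-like" signals are assumptions for demonstration, not the paper's actual implementation.

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz (not stated in the abstract)

def eog_features(window, fs=FS):
    """Example time- and frequency-domain features from one EOG window."""
    mean = window.mean()                      # baseline level
    std = window.std()                        # signal spread / energy
    peak = np.abs(window).max()               # peak deflection (blinks are large)
    spectrum = np.abs(np.fft.rfft(window))    # magnitude spectrum
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    dom = freqs[np.argmax(spectrum[1:]) + 1]  # dominant frequency, skipping DC
    return np.array([mean, std, peak, dom])

class MulticlassLDA:
    """Minimal multiclass LDA: shared pooled covariance, equal priors."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        d = X.shape[1]
        pooled = np.zeros((d, d))
        for c, m in zip(self.classes, self.means):
            Xc = X[y == c] - m
            pooled += Xc.T @ Xc
        pooled /= len(X) - len(self.classes)
        # small ridge keeps the pooled covariance invertible
        self.icov = np.linalg.inv(pooled + 1e-6 * np.eye(d))
        return self

    def predict(self, X):
        # linear discriminant: x^T S^-1 mu_c - 0.5 mu_c^T S^-1 mu_c
        lin = X @ self.icov @ self.means.T
        quad = 0.5 * np.einsum("cd,de,ce->c", self.means, self.icov, self.means)
        return self.classes[np.argmax(lin - quad, axis=1)]

# Synthetic stand-ins for two movement classes: a slow saccade-like wave
# and a larger, faster blink-like deflection.
rng = np.random.default_rng(0)

def make_window(freq, amp, n=FS):
    t = np.arange(n) / FS
    return amp * np.sin(2 * np.pi * freq * t) + 0.05 * rng.standard_normal(n)

X = np.array([eog_features(make_window(2.0, 1.0)) for _ in range(20)]
             + [eog_features(make_window(8.0, 3.0)) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)

clf = MulticlassLDA().fit(X, y)
acc = float((clf.predict(X) == y).mean())
```

Because LDA's decision function is linear in the features, prediction is a single matrix product per window, which is what makes the real-time classification reported in the paper feasible.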



Paper Citation


in Harvard Style

Cruz A., Garcia D., Pires G. and Nunes U. (2015). Facial Expression Recognition based on EOG toward Emotion Detection for Human-Robot Interaction. In Proceedings of the International Conference on Bio-inspired Systems and Signal Processing - Volume 1: BIOSIGNALS, (BIOSTEC 2015), ISBN 978-989-758-069-7, pages 31-37. DOI: 10.5220/0005187200310037


in Bibtex Style

@conference{biosignals15,
author={Aniana Cruz and Diogo Garcia and Gabriel Pires and Urbano Nunes},
title={Facial Expression Recognition based on EOG toward Emotion Detection for Human-Robot Interaction},
booktitle={Proceedings of the International Conference on Bio-inspired Systems and Signal Processing - Volume 1: BIOSIGNALS, (BIOSTEC 2015)},
year={2015},
pages={31-37},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005187200310037},
isbn={978-989-758-069-7},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Bio-inspired Systems and Signal Processing - Volume 1: BIOSIGNALS, (BIOSTEC 2015)
TI - Facial Expression Recognition based on EOG toward Emotion Detection for Human-Robot Interaction
SN - 978-989-758-069-7
AU - Cruz A.
AU - Garcia D.
AU - Pires G.
AU - Nunes U.
PY - 2015
SP - 31
EP - 37
DO - 10.5220/0005187200310037