Physiology-based Recognition of Facial Micro-expressions using EEG and Identification of the Relevant Sensors by Emotion

Mohamed S. Benlamine, Maher Chaouachi, Claude Frasson, Aude Dufresne

2016

Abstract

In this paper, we present a novel approach to predicting facial expressions from physiological signals of the brain. The main contributions of this paper are twofold: a) investigating the predictability of facial micro-expressions from EEG, and b) identifying the features relevant to this prediction. To reach these objectives, we conducted an experiment and proceeded in three steps: i) We recorded the facial expressions and corresponding EEG signals of each participant while he/she looked at picture stimuli from the IAPS (International Affective Picture System). ii) We fed machine learning algorithms with time-domain and frequency-domain features of one-second EEG segments, together with the corresponding facial expression data as ground truth for the training phase. iii) Using the trained classifiers, we predicted facial emotional reactions without the need for a camera. Our method yields very promising results, reaching high accuracy. It also provides an additional important result by locating which electrodes can be used to characterize a specific emotion. This system will be particularly useful for evaluating emotional reactions in virtual reality environments, where the user wears a VR headset that hides the face and renders traditional webcam-based facial expression detectors unusable.
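
The abstract outlines a three-step pipeline: record EEG alongside facial expressions, extract time-domain and frequency-domain features from one-second EEG windows, and train classifiers using the facial-expression labels as ground truth. The sketch below illustrates that kind of pipeline only; the sampling rate, channel count, band definitions, feature set, and use of scikit-learn (the paper used WEKA) are assumptions for illustration, not the authors' implementation.

# Minimal sketch (not the authors' implementation) of the pipeline the abstract
# describes: slice EEG into one-second windows, extract time- and frequency-domain
# (band-power) features, and train a classifier with the facial-expression label
# of each window as ground truth.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 128                     # assumed sampling rate (Hz) of the EEG headset
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def window_features(window):
    """window: (n_channels, FS) array holding one second of EEG."""
    feats = []
    for ch in window:
        # time-domain features per channel
        feats += [ch.mean(), ch.std(), np.ptp(ch)]
        # frequency-domain features: power in classical EEG bands
        freqs, psd = welch(ch, fs=FS, nperseg=FS)
        for lo, hi in BANDS.values():
            feats.append(psd[(freqs >= lo) & (freqs < hi)].sum())
    return np.array(feats)

def build_dataset(eeg, labels):
    """eeg: (n_windows, n_channels, FS); labels: facial-expression class per window."""
    X = np.stack([window_features(w) for w in eeg])
    return X, np.asarray(labels)

if __name__ == "__main__":
    # synthetic stand-in data: 200 one-second windows from a hypothetical 14-channel headset
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((200, 14, FS))
    labels = rng.integers(0, 7, size=200)        # e.g. 7 basic-emotion classes
    X, y = build_dataset(eeg, labels)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())

In practice the facial-expression labels would come from an automated facial coding tool applied to the webcam stream during training, after which the camera is no longer needed at prediction time, as the abstract states.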

References

  1. AlZoubi, Omar, Calvo, Rafael A and Stevens, Ronald H. (2009). Classification of EEG for affect recognition: an adaptive approach. In AI 2009: Advances in Artificial Intelligence (pp. 52-61). Springer.
  2. Chaouachi, Maher, Chalfoun, Pierre, Jraidi, Imène and Frasson, Claude. (2010). Affect and mental engagement: towards adaptability for intelligent systems. Paper presented at the Proceedings of the 23rd International FLAIRS Conference, Daytona Beach, FL. http://citeseerx.ist.psu.edu/viewdoc/download.
  3. Chaouachi, Maher and Frasson, Claude. (2012). Mental workload, engagement and emotions: an exploratory study for intelligent tutoring systems. Paper presented at the Intelligent Tutoring Systems.
  4. Chaouachi, Maher, Jraidi, Imène and Frasson, Claude. (2015). MENTOR: A Physiologically Controlled Tutoring System. In User Modeling, Adaptation and Personalization (pp. 56-67). Springer.
  5. Chi, Yu Mike, Wang, Yu-Te, Wang, Yijun, Maier, Christoph, Jung, Tzyy-Ping and Cauwenberghs, Gert. (2012). Dry and noncontact EEG sensors for mobile brain-computer interfaces. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 20(2), 228-235.
  6. Ekman, Paul. (2007). Emotions revealed: Recognizing faces and feelings to improve communication and emotional life: Macmillan.
  7. Emotient. (2015). Emotient Accuracy Measures and Methodology. Retrieved from Emotient.com website: http://www.emotient.com/wp-content/uploads/Emotient-Accuracy-Methods-May-2015.pdf
  8. Facet, iMotions. (2013). Attention Tool FACET Module Guide. Retrieved from imotions.com website: https://imotions.com/wp-content/uploads/2013/08/130806_FACET_Guide.pdf
  9. Fairclough, Stephen H. (2010). Physiological computing: interfacing with the human nervous system. In Sensing Emotions (pp. 1-20). Springer.
  10. Hall, Mark, Frank, Eibe, Holmes, Geoffrey, Pfahringer, Bernhard, Reutemann, Peter and Witten, Ian H. (2009). The WEKA data mining software: an update. ACM SIGKDD explorations newsletter, 11(1), 10-18.
  11. Heraz, Alicia and Frasson, Claude. (2007). Predicting the three major dimensions of the learner's emotions from brainwaves. World Academy of Science, Engineering and Technology, 31, 323-329.
  12. Jraidi, Imène, Chaouachi, Maher and Frasson, Claude. (2013). A dynamic multimodal approach for assessing learners' interaction experience. Paper presented at the Proceedings of the 15th ACM on International conference on multimodal interaction.
  13. Kassam, K. S., Markey, A. R., Cherkassky, V. L., Loewenstein, G. and Just, M. A. (2013). Identifying Emotions on the Basis of Neural Activation. Plos One, 8(6). doi: 10.1371/journal.pone.0066032
  14. Lang, Peter J, Bradley, Margaret M and Cuthbert, Bruce N. (2008). International affective picture system (IAPS): Affective ratings of pictures and instruction manual. Technical report A-8.
  15. Lewinski, Peter, den Uyl, Tim M and Butler, Crystal. (2014). Automated facial coding: Validation of basic emotions and FACS AUs in FaceReader. Journal of Neuroscience, Psychology, and Economics, 7(4), 227.
  16. Littlewort, Gwen, Whitehill, Jacob, Wu, Tingfan, Fasel, Ian, Frank, Mark, Movellan, Javier and Bartlett, Marian. (2011). The computer expression recognition toolbox (CERT). Paper presented at the Automatic Face & Gesture Recognition and Workshops (FG 2011), 2011 IEEE International Conference on.
  17. Liu, Yisi, Sourina, Olga and Nguyen, Minh Khoa. (2011). Real-time EEG-based emotion recognition and its applications. In Transactions on Computational Science XII (pp. 256-277). Springer.
  18. Lough, Sinclair, Kipps, Christopher M, Treise, Cate, Watson, Peter, Blair, James R and Hodges, John R. (2006). Social reasoning, emotion and empathy in frontotemporal dementia. Neuropsychologia, 44(6), 950-958.
  19. Lu, Yifei, Zheng, Wei-Long, Li, Binbin and Lu, Bao-Liang. (2015). Combining eye movements and EEG to enhance emotion recognition. Paper presented at the International Joint Conference on Artificial Intelligence (IJCAI).
  20. Picard, Rosalind W. (1997). Affective computing (Vol. 252): MIT press Cambridge.
  21. Samsung. (2015). Samsung prototypes brainwave-reading wearable stroke detector. Retrieved from Samsung Tomorrow website: https://news.samsung.com/global/c-lab-engineers-developing-wearable-health-sensor-for-stroke-detection
  22. Sarkheil, Pegah, Goebel, Rainer, Schneider, Frank and Mathiak, Klaus. (2013). Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions. Social cognitive and affective neuroscience, 8(8), 950-957.


Paper Citation


in Harvard Style

Benlamine M., Chaouachi M., Frasson C. and Dufresne A. (2016). Physiology-based Recognition of Facial Micro-expressions using EEG and Identification of the Relevant Sensors by Emotion. In Proceedings of the 3rd International Conference on Physiological Computing Systems - Volume 1: PhyCS, ISBN 978-989-758-197-7, pages 130-137. DOI: 10.5220/0006002701300137


in Bibtex Style

@conference{phycs16,
author={Mohamed S. Benlamine and Maher Chaouachi and Claude Frasson and Aude Dufresne},
title={Physiology-based Recognition of Facial Micro-expressions using EEG and Identification of the Relevant Sensors by Emotion},
booktitle={Proceedings of the 3rd International Conference on Physiological Computing Systems - Volume 1: PhyCS},
year={2016},
pages={130-137},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006002701300137},
isbn={978-989-758-197-7},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 3rd International Conference on Physiological Computing Systems - Volume 1: PhyCS
TI - Physiology-based Recognition of Facial Micro-expressions using EEG and Identification of the Relevant Sensors by Emotion
SN - 978-989-758-197-7
AU - Benlamine M.
AU - Chaouachi M.
AU - Frasson C.
AU - Dufresne A.
PY - 2016
SP - 130
EP - 137
DO - 10.5220/0006002701300137