Applicability of Multi-modal Electrophysiological Data Acquisition and Processing to Emotion Recognition

Filipe Canento, Hugo Silva, Ana Fred

2012

Abstract

We present an overview and study on the applicability of multimodal electrophysiological data acquisition and processing to emotion recognition. We build on previous work in the field and further explore the emotion elicitation process by using videos to stimulate emotions in several participants. Electrophysiological data from Electrocardiography (ECG), Blood Volume Pulse (BVP), Electrodermal Activity (EDA), Respiration (RESP), Electromyography (EMG), and Peripheral Temperature (SKT) sensors were acquired and used to classify negative and positive emotions. We evaluate the emotional state identification accuracy both in terms of the target emotions and those reported by the participants, obtaining recognition rates above 70% through Leave-One-Out Cross-Validation (LOOCV) with a k-NN classifier.
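As an illustration of the evaluation scheme mentioned above, the following is a minimal sketch (not the authors' implementation) of Leave-One-Out Cross-Validation with a k-NN classifier over per-trial feature vectors. The feature matrix X, the labels y, the use of scikit-learn, and the choice of k = 3 are illustrative assumptions; feature extraction from the ECG, BVP, EDA, RESP, EMG, and SKT signals is assumed to have been performed beforehand.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut
from sklearn.preprocessing import StandardScaler

def loocv_knn_accuracy(X, y, k=3):
    # X: (n_trials, n_features) per-trial features computed from the
    #    ECG/BVP/EDA/RESP/EMG/SKT recordings (hypothetical layout).
    # y: (n_trials,) emotion labels, e.g. 0 = negative, 1 = positive.
    loo = LeaveOneOut()
    correct = 0
    for train_idx, test_idx in loo.split(X):
        # Standardize using the training folds only, to avoid leaking
        # information from the held-out trial.
        scaler = StandardScaler().fit(X[train_idx])
        clf = KNeighborsClassifier(n_neighbors=k)
        clf.fit(scaler.transform(X[train_idx]), y[train_idx])
        pred = clf.predict(scaler.transform(X[test_idx]))
        correct += int(pred[0] == y[test_idx][0])
    return correct / len(y)

if __name__ == "__main__":
    # Synthetic stand-in data: 40 trials, 12 features, binary labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 12))
    y = rng.integers(0, 2, size=40)
    print("LOOCV accuracy: %.2f" % loocv_knn_accuracy(X, y))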



Paper Citation


in Harvard Style

Canento F., Silva H. and Fred A. (2012). Applicability of Multi-modal Electrophysiological Data Acquisition and Processing to Emotion Recognition. In Proceedings of the 2nd International Workshop on Computing Paradigms for Mental Health - Volume 1: MindCare, (BIOSTEC 2012), ISBN 978-989-8425-92-8, pages 59-70. DOI: 10.5220/0003891800590070


in BibTeX Style

@conference{mindcare12,
author={Filipe Canento and Hugo Silva and Ana Fred},
title={Applicability of Multi-modal Electrophysiological Data Acquisition and Processing to Emotion Recognition},
booktitle={Proceedings of the 2nd International Workshop on Computing Paradigms for Mental Health - Volume 1: MindCare, (BIOSTEC 2012)},
year={2012},
pages={59-70},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003891800590070},
isbn={978-989-8425-92-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 2nd International Workshop on Computing Paradigms for Mental Health - Volume 1: MindCare, (BIOSTEC 2012)
TI - Applicability of Multi-modal Electrophysiological Data Acquisition and Processing to Emotion Recognition
SN - 978-989-8425-92-8
AU - Canento F.
AU - Silva H.
AU - Fred A.
PY - 2012
SP - 59
EP - 70
DO - 10.5220/0003891800590070