A Multimodal Workflow for Modeling Personality and Emotions to Enable User Profiling and Personalisation

Ryan Donovan, Aoife Johnson, Aine de Roiste, Ruairi O’Reilly

2021

Abstract

The Personality Emotion Model (PEM) is a workflow for generating quantifiable and bi-directional mappings between 15 personality traits and the basic emotions. PEM utilises affective computing methodology to map this relationship across the modalities of self-report, facial expressions, semantic analysis, and affective prosody. The workflow is an end-to-end solution integrating data collection, feature extraction, data analysis, and result generation. PEM produces a real-time model that provides a high-resolution correlated mapping between personality traits and the basic emotions. The robustness of PEM's model is supported by the workflow's ability to conduct meta-analytical and multimodal analysis; each state-to-trait mapping is dynamically updated in terms of its magnitude, direction, and statistical significance as data is processed. PEM provides a methodology that can contribute to long-standing research questions in the fields of psychology and affective computing. These research questions include (i) quantifying the emotive nature of personality, (ii) minimising the effects of context variance in basic emotions research, and (iii) investigating the role of emotion sequencing effects in relation to individual differences. PEM's methodology enables direct applications in any domain that requires the provision of individualised and personalised services (e.g. advertising, clinical care, research).
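
The following is a minimal sketch of the idea of a dynamically updated state-to-trait mapping described in the abstract. It assumes a simple Pearson correlation as the statistic reporting magnitude, direction, and significance; the class, method names, and trait/emotion labels are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a dynamically updated trait-to-emotion mapping.
# All names here are illustrative, not taken from the PEM paper.
from collections import defaultdict
from scipy.stats import pearsonr


class TraitEmotionMapping:
    """Accumulates paired observations and reports the magnitude, direction,
    and statistical significance of each trait-to-emotion correlation."""

    def __init__(self):
        # (trait, emotion) -> list of (trait_score, emotion_intensity) pairs
        self._observations = defaultdict(list)

    def add_observation(self, trait, emotion, trait_score, emotion_intensity):
        """Record one paired measurement from any modality
        (self-report, facial expression, semantics, prosody)."""
        self._observations[(trait, emotion)].append((trait_score, emotion_intensity))

    def mapping(self, trait, emotion):
        """Return the correlation and p-value for the requested pair,
        recomputed over all data processed so far; the sign of r gives the
        direction of the mapping, |r| its magnitude, and p its significance."""
        pairs = self._observations[(trait, emotion)]
        if len(pairs) < 3:
            return None  # too few observations for a meaningful correlation
        scores, intensities = zip(*pairs)
        return pearsonr(scores, intensities)


# Usage: feed observations as each sample is processed, then query the mapping.
mapping = TraitEmotionMapping()
mapping.add_observation("extraversion", "joy", 4.2, 0.81)
mapping.add_observation("extraversion", "joy", 3.1, 0.64)
mapping.add_observation("extraversion", "joy", 2.5, 0.40)
print(mapping.mapping("extraversion", "joy"))  # correlation and p-value
```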

Paper Citation


in Harvard Style

Donovan R., Johnson A., de Roiste A. and O’Reilly R. (2021). A Multimodal Workflow for Modeling Personality and Emotions to Enable User Profiling and Personalisation. In Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 2: HUCAPP; ISBN 978-989-758-488-6, SciTePress, pages 145-152. DOI: 10.5220/0010203301450152


in BibTeX Style

@conference{hucapp21,
author={Ryan Donovan and Aoife Johnson and Aine de Roiste and Ruairi O’Reilly},
title={A Multimodal Workflow for Modeling Personality and Emotions to Enable User Profiling and Personalisation},
booktitle={Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 2: HUCAPP},
year={2021},
pages={145--152},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010203301450152},
isbn={978-989-758-488-6},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2021) - Volume 2: HUCAPP
TI - A Multimodal Workflow for Modeling Personality and Emotions to Enable User Profiling and Personalisation
SN - 978-989-758-488-6
AU - Donovan R.
AU - Johnson A.
AU - de Roiste A.
AU - O’Reilly R.
PY - 2021
SP - 145
EP - 152
DO - 10.5220/0010203301450152
PB - SciTePress