Personality Traits Assessment using P.A.D. Emotional Space in
Human-robot Interaction
Zuhair Zafar, Ashita Ashok and Karsten Berns
Robotics Research Lab, Department of Computer Science,
Technische Universität Kaiserslautern, Germany
Keywords:
Personality Trait Assessment, PAD Emotional Space, Robot Adaptivity, Human-Robot Interaction.
Abstract:
Cognitive social robotics is the field of research committed to building social robots that can interact with humans in a human-like manner. Humans assess the behavior and personality of their counterparts in order to adapt their own behavior and show empathy, which helps human-human interaction flourish. Similarly, the assessment of human personality is highly critical for realizing natural and intelligent human-robot interaction. Numerous personality trait assessment systems have been reported in the literature; however, most of them target only the big five personality traits. Using only visual information, this work proposes to employ the pleasure, arousal, and dominance emotional space for the assessment of personality traits, based on the work of Mehrabian. To validate the system, three different scenarios have been developed to assess 12 different personality traits on a social humanoid robot. Experimental results show that the system can assess human personality traits with 84% accuracy in real-time and, hence, the robot can adapt its behavior according to the perceived personality of the interaction partner.
1 INTRODUCTION
With constant technological advances in the field of robotics, it is now realistic to expect social robots to become part of humans' daily lives within the next decades. This necessitates and motivates the creation of robots that can perceive and learn from the world much as humans do, especially in real-world Human-Robot Interaction (HRI). Concerning HRI, the basic expectation of a social robot is to perceive words, emotions, behaviors, and so on, in order to draw conclusions and make informed decisions for realizing natural HRI. Hence, the assessment of human personality traits is essential to make the robot appealing and acceptable during the interaction.
Personality plays a vital role in Human-Human In-
teraction (HHI) as it guides the conversation towards
a level of satisfaction and comfort for humans. Ac-
cording to psychologists, human behavior is known to be a combination of verbal and nonverbal cues. Nonverbal cues such as temperamental characteristics are known to be innate in human beings, sometimes existing subtly or visibly. These temperaments aggregate into traits, e.g., extroversion or introversion, over the course of human life through daily experiences. The significance of personality in HHI can
be better exemplified by two renowned theories from
the field of human psychology, i.e., the chameleon
effect (Chartrand and Bargh, 1999) and similarity/at-
traction theory (Henderson and Furnham, 1982). The
chameleon effect explains the non-conscious human
tendency to passively mimic the behavior of one’s in-
teraction partner in a social environment.
In contrast, the similarity/attraction theory empha-
sizes that humans are generally attracted to and prefer
the company of others who maintain morals and at-
titudes similar to their own. For example, it is quite
often observed that there exists a greater sense of shared personality among friends than among random pairs of strangers. People tend to change their behavior ac-
cording to their interlocutor’s behavior. If he/she is
talkative and expressive, one also tends to be more
expressive. Therefore, assessment of human person-
ality is highly crucial for a robotic system in order to
interact and adapt naturally for intelligent HRI.
The primary goal of this work is the assessment
of human personality traits in the real-world using
the temperament framework presented by Mehra-
bian (Mehrabian, 1996). The author has exploited the
pleasure, arousal, and dominance emotional space to
describe and measure individual differences on dif-
ferent personality scales. This work formulates a
hypothesis that the P.A.D. emotional space, computed from human nonverbal cues, can provide a successful assessment of human personality traits
in the context of HRI. To realize the hypothesis,
P.A.D. dimensions are computed using nonverbal
cues, namely, human posture, head gesture, hand ges-
ture, proximity, body activity, and facial expression.
These P.A.D. values are then used in different person-
ality trait equations devised by Mehrabian (Mehra-
bian, 1996) for assessment of personality traits.
2 LITERATURE SURVEY
There are many complex models available in psy-
chology to recognize the personality traits of humans.
The most significant theory that describes personality
traits in five dimensions is commonly known as the
big five (BF) personality theory or five-factor model.
Costa and McCrae (Costa Jr and McCrae, 1976) have
introduced the BF model, which is based on five di-
mensions, namely, extroversion, agreeableness, neu-
roticism, conscientiousness, and openness to new ex-
periences. Each dimension is a continuum, and a low
value in the dimension represents the opposite char-
acteristics of that dimension.
Some researchers have tried to map the cognitive
emotional models directly from humans to a robot, for
example (Rodić et al., 2016). However, the conventional approach is to use personality models from psychol-
ogy. The BF model has been used extensively in the
literature of personality traits assessment. In the case
of automatic personality detection from nonverbal be-
havioral cues, the authors of (Batrinca et al., 2011) have applied automatic detection of the BF personality traits in scenarios of self-presentation and employment interviews. A total of 29 features, comprising 17 visual, 3 speech-time, and 9 acoustic cues, have been used for classification.
The visual nonverbal cues include eye-gaze, frowning
emotion, hand movements, head orientation, mouth
fidgeting, and posture.
In their further work, the authors (Batrinca et al., 2012) have explored detecting the BF personality traits in a human-computer interaction scenario using the map task. The work aims to recognize the BF per-
sonality traits in a collaborative task setting. Features
used for classification of personality traits are acous-
tic features, e.g., duration of speech, pitch, intensity,
and so on; visual features, e.g., motion vector mag-
nitude over skin computed by using discrete cosine
transform, and additional features such as the number
of speaking turns by the subject. Authors have used
support vector machines (SVMs) for the classification
task. The results report an accuracy of greater than or
equal to 70 percent for emotional stability, extrover-
sion, and conscientiousness scales. Although the authors claim to have detected the BF traits accurately, except for the openness-to-new-experiences trait, it remains unclear how the absence of features such as the distance between the speakers and the detection of facial expressions has been compensated for.
Interesting research on the automatic analysis
of engagement in HRI is observed in the findings
by (Salam et al., 2016). This work aims to judge the
impact of personality traits of human participants on
the engagement with robots. Among the three phases
of analysis, the first phase consisted of data collection
in the HRI triadic scenario, while the second phase in-
cluded extraction of individual and interpersonal fea-
tures based upon nonverbal cues from human par-
ticipants. Individual features include a histogram of
gradients (HOG), a histogram of optical flow (HOF),
body activity, joint speed, motion features, etc. In-
terpersonal features include the visual focus of atten-
tion, the global quantity of movement, relative orien-
tation, and distance between the participants, and rel-
ative orientation with respect to the robot. The final
phase included a prediction of the level of engage-
ment in two types of engagement, namely individ-
ual and group engagement, based upon the predicted
the BF personality traits and the features extracted in
phase 2. They have concluded that the prediction of
engagement using personality traits reports better re-
sults as compared to when personality trait informa-
tion is not used.
In the context of HRI, the authors of (Zafar et al., 2018a) have presented a humanoid robot that can recognize 3 dimensions of the BF model in real-time. The
authors have discussed the importance of nonverbal
cues in automatic personality recognition. Nonverbal
features such as human postures, facial expressions,
body activity, head gestures, and proximity have been
used. SVMs are used for the classification task. The authors have claimed a recognition rate above 90% for 3 dimensions, namely, extroversion, agreeableness, and
neuroticism.
The works mentioned above for personality assessment are based directly on visual features, linguistic features, or both. They are also focused mainly on the BF personality traits. However,
Mehrabian has presented a general framework for
describing and measuring individual temperaments
of personality that also covers more traits, such as
shyness, anxiety, and aggression (Mehrabian, 1996).
This framework is based on pleasure, arousal, and
dominance emotional space (Russell and Mehrabian,
1977). After extensive research, the authors have defined these three dimensions as follows. Pleasure is determined through cognitive judgments of evaluation, i.e., higher evaluations of stimuli are associated with greater pleasure induced by the stimuli. Arousal corresponds to judgments of high versus low stimulus activity, using a measure of the stimulus “information rate”. Dominance is defined as a judgment of stimulus potency, where more influential stimuli correspond to lower values of dominance.
To the best of our knowledge, there exists no re-
search that has implemented the framework (Mehra-
bian, 1996) on human personality traits concerning
the adaptivity and behavior of social robots in HRI.
Our work is the first to implement this psychology
framework for human personality traits assessment
using nonverbal cues in the context of HRI.
3 PERSONALITY TRAITS
ASSESSMENT
Although a limited number of technical systems have
been reported in the literature for real-time personal-
ity traits assessment, these systems at best can only
recognize the BF personality traits. They are unable
to distinguish between subtle personality traits, for
example, shyness and introversion or dominance and
aggression. According to (Watson and Clark, 1997),
extroversion can be subdivided into the more specific
facets of assertiveness, gregariousness, cheerfulness,
and energy. Similarly, neuroticism can be subdivided
into loneliness, anxiety, and sensitivity to rejection,
while shyness is the part of introversion trait. As
mentioned in the previous section, this work employs
the framework (Mehrabian, 1996) that uses pleasure,
arousal, and dominance (P.A.D.) emotional space for
personality traits assessment. In the following sub-
section, P.A.D. emotional space is defined.
3.1 P.A.D. Emotional Space
In the literature, human emotions are often defined in multidimensional spaces; however, the definition varies, with each researcher adopting one or more dimensions to define emotion. For example, according to (Wundt and Judd, 1897), the three dimensions of emotion are “pleasurable vs. unpleasurable”, “arousing vs. subduing”, and “strain vs.
relaxation”. Many emotional spaces have been pre-
sented in psychology. Among them, the prominent
ones are the circumplex model by Russell (Russell,
1980) and the Positive Activation-Negative Activa-
tion (PANA) model (Watson and Tellegen, 1985).
There exists another renowned three-dimensional
emotional space, called Pleasure-Arousal-Dominance
emotional space (Russell and Mehrabian, 1977). The
P.A.D. model aims to describe and measure emotional
traits that correspond to human personality. The three
dimensions are defined to be bipolar such that plea-
sure is described as a continuum that ranges from in-
tense pain or unhappiness on one end to intense hap-
piness or ecstasy on the other. Arousal has been re-
ported to range from sleepiness and drowsiness to a
high level of alertness and excitement. Dominance
varies from emotions of a complete absence of con-
trol or impact over events to feeling influential and in
control of the situation at the opposite extreme. In the
following sections, a methodology that uses nonver-
bal cues for the implementation of P.A.D. emotional
space is presented.
3.1.1 Pleasure
As previously mentioned, the value on the pleasure scale describes how enjoyable an event is for a person. The study on facial expressions conducted
by (Boukricha et al., 2009) shows that pleasure is di-
rectly associated with facial expressions. If a person
is happy, the value on the pleasure scale is high. Sim-
ilarly, if a person is unhappy and exhibits facial ex-
pressions such as sadness, fear, anger, or disgust, then
the value on the pleasure scale is low.
Algorithm 1: Estimation of Pleasure Value.
pleasant = 0, unpleasant = 0
n ← number of frames in 10 seconds
for i ← 0 to n do
    if Human.Face.Exist() then
        P_expression ← Current.Expression
        if P_expression = happy OR surprise then
            pleasant = pleasant + 1
        else
            unpleasant = unpleasant + 1
if pleasant ≥ unpleasant then
    P ← pleasant
else
    P ← unpleasant × (−1)
pleasure = (1/n) × P // average pleasure
return pleasure
To estimate the pleasure value from facial expressions, the system developed by (Al-Darraji et al., 2017) has been used to recognize six basic facial expressions in real-time. The facial expressions, extracted in every frame, are standardized and averaged over a 10-second period. Happiness and surprise
expressions contribute towards the positive value of
pleasure scale, while sad, fear, disgust, and anger con-
tribute towards the negative value of pleasure scale.
Algorithm 1 shows the estimation of pleasure.
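As an illustration, a minimal Python sketch of Algorithm 1, assuming the per-frame expression labels from the FER system are collected into a list (function and label names are illustrative, not the authors' implementation):

def estimate_pleasure(expressions):
    """Average pleasure over a 10-second window of per-frame expression labels."""
    POSITIVE = {"happy", "surprise"}                 # contribute positively
    NEGATIVE = {"sad", "fear", "disgust", "anger"}   # contribute negatively
    n = len(expressions)
    if n == 0:
        return 0.0                                   # no face detected: neutral
    pleasant = sum(1 for e in expressions if e in POSITIVE)
    unpleasant = sum(1 for e in expressions if e in NEGATIVE)
    # The larger tally decides the sign; its count is averaged over all frames.
    p = pleasant if pleasant >= unpleasant else -unpleasant
    return p / n                                     # value in [-1, +1]

For example, estimate_pleasure(["happy"] * 8 + ["sad"] * 2) yields 0.8.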
3.1.2 Arousal
The value on the arousal scale describes how exciting and thrilling an event is for a person.
The arousal can be assessed by the combination of
two nonverbal features, namely proximity and body
movements. According to (Nass and Lee, 2001), peo-
ple, when aroused, show frequent body movements.
Similarly, (Hirth et al., 2011) have established the re-
lationship between the proximity of a person from the
interlocutor and the arousal. Arousal of a person is
considered high if he/she moves towards or stands
close to the robot. If a person moves away or stands
farther from the robot, the arousal value goes down.
To calculate the arousal value, a weighted sum of proximity and body movements is used. In order to estimate the proximity value, the concept of interpersonal distances during human-human interaction has been used. According to (Hall, 1963), the interpersonal distances of a person can be categorized into four zones, namely intimate space, personal space, social space, and public space. If the robot is in the public space of a person, the proximity value is −1; if the robot is in the intimate space of a person, the proximity value increases up to +1.
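The exact mapping from metric distance to the continuous proximity value is not spelled out here; a minimal sketch, assuming Hall's commonly cited zone boundaries (intimate space up to 0.45 m, social space ending at 3.6 m) and linear interpolation in between, could look as follows:

def proximity_value(distance_m):
    """Map the human-robot distance (in meters) to a proximity value in [-1, +1]."""
    INTIMATE, SOCIAL_END = 0.45, 3.6   # assumed zone boundaries after (Hall, 1963)
    if distance_m <= INTIMATE:
        return 1.0                     # robot inside the intimate space
    if distance_m >= SOCIAL_END:
        return -1.0                    # robot in the public space
    # Decrease linearly from +1 to -1 across the personal and social zones.
    return 1.0 - 2.0 * (distance_m - INTIMATE) / (SOCIAL_END - INTIMATE)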
To detect human body movement during interaction, the skeleton joint positions of the upper body, provided by the NiTE middleware library, are analyzed over time. Activity is detected if the change in joint positions exceeds a threshold. The proximity of a person with regard to the robot is determined using the depth information of tracked humans. Equation 1 shows the weighted summation of activity and proximity used to estimate arousal.
$$
\text{Arousal} = \frac{1}{n} \sum_{i=0}^{n} \left( W_p \times P + W_A \times A \right),
\qquad
\begin{cases}
W_p = 0.8, \; W_A = 0.2 & \Delta P > 0.5\,\text{m} \\
W_p \in [0.6, 0.8], \; W_A \in [0.4, 0.2] & 0.2\,\text{m} < \Delta P \leq 0.5\,\text{m} \\
W_p = 0.6, \; W_A = 0.4 & \Delta P \leq 0.2\,\text{m}
\end{cases}
\tag{1}
$$
In Equation 1, $W_p$ and $W_A$ are the proximity and activity weights, respectively. The weights are dynamic and change according to the change in proximity, $\Delta P$, of a person. If a person moves more than half a meter during the interaction, $W_p$ gets the higher value to depict this sudden change on the arousal dimension. Moreover, the proximity value ($P$) is a continuous value between −1 and +1 and depends on how far the person is standing from the robot.
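A Python sketch of Equation 1, assuming per-frame proximity and binary activity values and taking $\Delta P$ as the displacement since the start of the window (the exact definition of $\Delta P$ and the interpolation of the weights across the middle band are assumptions):

def arousal_weights(delta_p):
    """Select (W_p, W_A) from the change in proximity (meters) per Equation 1."""
    if delta_p > 0.5:
        return 0.8, 0.2
    if delta_p <= 0.2:
        return 0.6, 0.4
    t = (delta_p - 0.2) / 0.3          # position inside (0.2 m, 0.5 m]
    w_p = 0.6 + 0.2 * t                # W_p interpolated in [0.6, 0.8]
    return w_p, 1.0 - w_p              # W_A correspondingly in [0.4, 0.2]

def estimate_arousal(frames):
    """frames: list of (proximity, activity, distance_m) tuples, one per frame,
    with proximity in [-1, +1] and activity 1.0 when movement was detected."""
    if not frames:
        return 0.0
    start_distance = frames[0][2]
    total = 0.0
    for proximity, activity, distance_m in frames:
        w_p, w_a = arousal_weights(abs(distance_m - start_distance))
        total += w_p * proximity + w_a * activity
    return total / len(frames)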
3.1.3 Dominance
In the HRI scenario, dominance can be estimated by
analyzing human behavior over time. As mentioned
by (Jensen, 2016), confident and dominant people generally have a wide-open trunk during interactions, which signals that they are approachable to others and reflects an open-minded attitude. Simi-
larly, threatening postures such as feet spread apart
with hands-on-hips posture and pointing postures are
correlated with aggression and dominance. Dominant
people are also physically active during interactions.
In contrast, submissive people tend to look down with
slumped body postures. Submissiveness is correlated
with self-touching postures, such as cross arms pos-
ture or thinking postures (Argyle, 1988). Submissive
people also avoid mutual eye gaze (Argyle, 1988).
They generally are passive during interactions.
The system developed in (Zafar et al., 2018b) has been used to recognize different human postures using human skeleton joint angles, e.g., pointing posture, thinking posture, crossed arms posture, open arms posture, aggressive posture, and so on. To recognize head gestures, the system developed in (Saleh and Berns, 2015) has been used. Head gestures such as head nodding, head shaking, look left, look down, look up, look right, look ahead, etc. are considered in this study. Body movements of a person are detected using the same method described in the arousal implementation. Algorithm 2 shows the estimation of the dominance dimension.

Algorithm 2: Estimation of Dominance Value.
dominant = 0, submissive = 0
n ← number of frames in 10 seconds
for i ← 0 to n do
    if Human.Body.Exist() then
        P_posture ← Current.Posture
        P_head_gesture ← Current.Head_Gesture
        P_body_movements ← Current.Body_Movement
        if (P_posture = O.P OR P.P OR A.S) AND (P_head_gesture = L.A OR L.U) AND (P_body_movements = true) then
            dominant = dominant + 1
        else if (P_posture = C.P OR T.P) AND (P_head_gesture = L.D OR L.A OR L.L OR L.R) AND (P_body_movements = false) then
            submissive = submissive + 1
        else
            do nothing
if dominant ≥ submissive then
    D ← dominant
else
    D ← submissive × (−1)
dominance = (1/n) × D // average dominance
return dominance

Figure 1: Schematic flow of personality traits assessment using P.A.D. emotional space from findings of (Mehrabian, 1996).

In Algorithm 2, O.P = open posture, P.P = pointing posture, A.S = aggressive stance posture, C.P = crossed arms posture, T.P = thinking posture, L.A = looking ahead gesture, L.U = looking up gesture, L.D = looking down gesture, L.L = looking left gesture, and L.R = looking right gesture.
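A compact Python rendering of Algorithm 2, with the posture and head-gesture abbreviations written out as strings (the label spellings are illustrative):

def estimate_dominance(frames):
    """Average dominance over a 10-second window; frames is a list of
    (posture, head_gesture, moving) tuples for frames with a detected body."""
    DOMINANT_POSTURES = {"open", "pointing", "aggressive_stance"}
    SUBMISSIVE_POSTURES = {"crossed_arms", "thinking"}
    DOMINANT_GAZE = {"ahead", "up"}
    SUBMISSIVE_GAZE = {"down", "ahead", "left", "right"}
    n = len(frames)
    if n == 0:
        return 0.0
    dominant = submissive = 0
    for posture, gaze, moving in frames:
        if posture in DOMINANT_POSTURES and gaze in DOMINANT_GAZE and moving:
            dominant += 1
        elif posture in SUBMISSIVE_POSTURES and gaze in SUBMISSIVE_GAZE and not moving:
            submissive += 1
        # any other combination is neutral: counted in n but in neither tally
    d = dominant if dominant >= submissive else -submissive
    return d / n                        # value in [-1, +1]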
3.2 Personality Trait Assessment
System
This work proposes to use the P.A.D. emotional space for the assessment of human personality traits using Mehrabian's framework (Mehrabian, 1996). Using the three dimensions of pleasure, arousal, and dominance, he has formulated 59 individual measures that correspond to human personality traits. It has been demonstrated that traits are systematically related to one another based upon the P.A.D. dimensions.
Although the formulated traits cover a wide range, only 12 of the 59 traits are realized in this work. These traits are chosen according to the experimental restrictions and based on the knowledge of the nonverbal cues associated with them. Personality traits such as mysticism, loneliness, and anorexia require verbal or contextual information, or both, for an accurate assessment. Furthermore, even humans find it challenging to assess these traits in human-human interaction. Therefore, 12 realizable traits are considered. The equations of these traits, as reported in (Mehrabian, 1996), are shown in Equation 2. Figure 1 shows the schematic flow of the approach.
$$
\begin{aligned}
\text{Intellect} &= 0.14P + 0.20A + 0.48D \\
\text{Achievement} &= 0.13P + 0.60D \\
\text{Extroversion} &= 0.21P + 0.17A + 0.50D \\
\text{Social Desirability} &= 0.34P - 0.26A + 0.17D \\
\text{Arousal Seeking} &= 0.14P + 0.26A + 0.55D \\
\text{Aggression} &= -0.36P + 0.20A + 0.28D \\
\text{Trait Dominance} &= 0.72D \\
\text{Physically Active} &= 0.26P + 0.40D \\
\text{Anxiety} &= 0.24A - 0.20D \\
\text{Shyness} &= -0.29P + 0.13A - 0.56D \\
\text{Sensitivity to Rejection} &= 0.14A - 0.71D \\
\text{Nurturance} &= 0.41P + 0.12A + 0.17D
\end{aligned}
\tag{2}
$$
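Given the P, A, and D values computed in the previous subsections, the trait scores reduce to linear combinations; a sketch using the coefficients of Equation 2 (the trait keys are illustrative):

TRAIT_COEFFICIENTS = {                     # (P, A, D) weights from Equation 2
    "intellect":                (0.14,  0.20,  0.48),
    "achievement":              (0.13,  0.00,  0.60),
    "extroversion":             (0.21,  0.17,  0.50),
    "social_desirability":      (0.34, -0.26,  0.17),
    "arousal_seeking":          (0.14,  0.26,  0.55),
    "aggression":               (-0.36, 0.20,  0.28),
    "trait_dominance":          (0.00,  0.00,  0.72),
    "physically_active":        (0.26,  0.00,  0.40),
    "anxiety":                  (0.00,  0.24, -0.20),
    "shyness":                  (-0.29, 0.13, -0.56),
    "sensitivity_to_rejection": (0.00,  0.14, -0.71),
    "nurturance":               (0.41,  0.12,  0.17),
}

def assess_traits(p, a, d):
    """Score all 12 traits from P.A.D. values in [-1, +1]; a positive score
    marks a trait as active, a negative score as inactive."""
    return {name: cp * p + ca * a + cd * d
            for name, (cp, ca, cd) in TRAIT_COEFFICIENTS.items()}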
4 EXPERIMENTATION
The main hypothesis proposed is that the use of
P.A.D. emotional space, computed through human
nonverbal cues, can provide a successful assessment
of human personality traits in the context of HRI. Due
to the unavailability of ground truth, the validation of
the hypothesis is a challenging task. In order to vali-
date the authenticity of the proposed system, written feedback has been compiled in the form of a questionnaire from psychology students. In the following subsections, the experimentation and evaluation procedure is described in detail.
4.1 Experimental Setup
For experimentation purposes, 15 university students (12 males, 3 females; age range 22-45 years) from different ethnic backgrounds have participated. The participants have been naive to the objective and nature of the experiments and provided informed consent in accordance with the guidelines of the research group. Laboratory experiments have been conducted in a closed office room environment.
Participants have been instructed to stand in the
line of sight of ROBIN, the humanoid robot of TU
Kaiserslautern, as shown in Figure 2. Three cameras
are placed at different locations to record the interac-
tion. ROBIN's perception GUI is also recorded for the duration of the interaction. Artificial lighting is used during the experiments to keep conditions consistent for all participants. An experimenter is also present in the room to monitor the procedure and intervene only if the system malfunctions because of technical issues.
4.2 Experimental Scenarios
To validate the proposed hypothesis, participants are assessed for personality traits in three different scenarios. These scenarios have been developed around a student's typical sphere of experience, leading to three relatable tasks for experimentation. Each participant has been instructed to enact the following scenarios one at a time. The experimenter also introduces the participants to ROBIN at the beginning of the experiments so that they can familiarize themselves with the robot.
The first scenario involves an interaction between
ROBIN, role-playing as the professor, and the partic-
ipant, role-playing as a researcher.

Figure 2: Humanoid robot ROBIN.

The second scenario involves an interaction between ROBIN, role-
playing as a master student, and the participant, role-
playing as a supervisor. The last scenario involves an
interaction between ROBIN, role-playing as an inter-
viewer, and the participant, role-playing as a candi-
date. For each scenario, the robot takes the lead by
asking questions and responding generically.
5 PERFORMANCE ANALYSIS
In order to evaluate the personality assessment sys-
tem, summative evaluations are used from 5 psychol-
ogy students. All the interactions are video recorded,
and these videos are used for further analysis. After
the experimentation, each evaluator is presented with
a list of 12 personality traits and their corresponding
description as follows:
1. Intellect: A person who engages in critical thinking, research, and reflection about society, and proposes solutions for its problems.
2. Achievement: Something done with effort & skill.
3. Extroversion: Extroverts are behaviorally more
dominant in face-to-face interactions with others.
4. Social Desirability: To answer questions in a man-
ner that is viewed to be favorable by others.
5. Arousal Seeking: A person who looks for excitement, change, new environments, takes risks, etc.
6. Aggression: Readiness to attack or confront.
7. Dominance: Showing power and influence.
8. Physically Active: A person who is continuously active, working, organizing activities, etc.
9. Anxiety: Feeling of worry, nervousness, or un-
easiness about something with uncertain outcome.
10. Shyness: Nervous or timid in the company of others.
11. Sensitivity to Rejection: People who are affected
easily by the negative remarks of others.
12. Nurturance: Emotional/physical care given to
someone.
Observer evaluations have been used to assess hu-
man personality traits. Evaluators assess the personal-
ity traits of all the subjects using the recorded videos
and descriptions of personality traits to establish the
ground truth. They use the provided descriptions and
their prior knowledge of human behavior to form an
informed judgment about the presence (active) or ab-
sence (inactive) of each personality trait. In order to combine the outcomes from the evaluators into a ground truth, the majority outcome is used as the final label. For example,
if two evaluators report the subject’s anxiety trait as
active and three evaluators report it as inactive, then
the ground truth for anxiety trait of the subject is de-
scribed as inactive. These assessments are compared
with the system results. However, the proposed sys-
tem reports the trait values in the −1 to +1 range.
For evaluation and validation, the trait with positive
value is considered as active, and the trait with a neg-
ative value as inactive. Table 1 shows the recognition
rates of each personality trait.
Table 1: Personality Traits and Recognition Rates.
ID Trait Dimension Accuracy (%)
1 Intellect 83.11
2 Achievement 78.22
3 Extroversion 91.11
4 Social Desirability 80.44
5 Arousal Seeking 87.55
6 Aggression 84.88
7 Trait Dominance 89.77
8 Physically Active 92.88
9 Anxiety 90.66
10 Shyness 78.66
11 Sensitivity to Rejection 73.33
12 Nurturance 77.77
Average 84.03
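The evaluation procedure amounts to a majority vote over the evaluators followed by sign thresholding of the system scores; a short sketch (the data layout is assumed):

def majority_label(votes):
    """Ground-truth label from boolean evaluator votes (True = trait active)."""
    return sum(votes) > len(votes) / 2   # five evaluators, so no ties occur

def trait_accuracy(system_scores, evaluator_votes):
    """Fraction of subjects on which the thresholded system score (> 0 means
    active) matches the evaluators' majority label for a given trait."""
    matches = sum((score > 0) == majority_label(votes)
                  for score, votes in zip(system_scores, evaluator_votes))
    return matches / len(system_scores)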
5.1 Discussion
It can be seen from Table 1 that extroversion, arousal
seeking, trait dominance, physically active, and anxi-
ety traits have higher recognition rates. This is because these traits are distinct and easily distinguishable visually. For example, the system detects the presence or absence of the extroversion trait with 91% accuracy. Most of the subjects that are identified as high
on extroversion trait are expressive and physically ac-
tive with an open body stance during interactions,
which in turn yields higher scores on the P.A.D. scale.
Therefore, the system estimates the extroversion trait
as active for these subjects. On the other hand, some of
the subjects appear to be nervous and shy during inter-
actions, which results in lower values for the P.A.D.
scale. Therefore, the system estimates the extrover-
sion trait as inactive for these subjects. Similarly,
some subjects show anxiety during interactions by ex-
hibiting dejected and self-touching postures. Subjects
are also found to be passive and restless (leaning side to side) during interactions. Analysis shows that the dominance scores for such subjects are negative, whereas their arousal scores are positive due to the movement attributed to restlessness.
Therefore, the system estimates the anxiety trait as
active for these subjects.
The wrong assessment of some personality traits stems from the inaccurate recognition of facial expressions. The facial expression recognition (FER) system works accurately when a person expresses emotions clearly. During the interaction, the facial expressions of a subject are sometimes wrongly interpreted, which affects the pleasure value. Nurturance and social desirability strongly correlate with the pleasure scale, as can be seen from Equation 2, but due to this technical limitation of the FER system, these traits achieve lower accuracy, as depicted in Table 1.
Due to the subjective nature of the task, the evaluators themselves sometimes find it difficult to reach a consensus on these traits. Since the evaluators have access to additional information, such as verbal, situational, and contextual cues, they label the ground truth of each subject's personality traits accordingly. However, the personality traits system uses only visual information to analyze human behavior and, therefore, sometimes wrongly rates a subject on a particular trait. For example, the sensitivity-to-rejection trait is highly subjective and also requires context to recognize accurately. Some subjects who exhibit submissive body postures, such as crossed arms and thinking postures, and avoid eye contact during interactions are detected as high on the sensitivity-to-rejection trait, which may not be accurate in every case; introverts, for instance, show similar body postures and head gestures but may not be sensitive to rejection.
Since contextual and situational cues are not con-
sidered in this work, the system is not able to differ-
entiate between fake personality and genuine person-
ality. The system assesses personality based on nonverbal cues, which can sometimes be expressed artificially. Although the subjects are instructed to enact genuinely, some may have faked their responses.
Therefore, the personality assessed by the system may
differ from the actual personality of the subject, which
shows the importance of contextual and situational
cues. However, the highly engaging nature of the scenarios, along with the robot's human-like gestures and expressions, encourages the subjects to respond with minimal artificiality in this work. Hence, the system can assess different personality traits with 84% accuracy.
6 CONCLUSION
This paper proposes to use the temperament frame-
work to describe and measure individual differences
on different personality scales using P.A.D. emotional
space. The underlying framework has been studied in psychology but had never been realized in HRI. In order to validate the research hypothesis, the P.A.D. emotional space has been computed using the recognition and detection of nonverbal cues, such as human postures, facial expressions, proximity, activity, head gestures, etc. Using the trait equations provided in the litera-
ture (Mehrabian, 1996), a score has been calculated
for each trait. For evaluation, three scenarios have
been developed, and 15 university students have been
invited. Due to the unavailability of the labeled data,
five psychology students have been consulted to eval-
uate the personality traits. From validation studies, it
is clear that the framework presented by Mehrabian is
also applicable in HRI by using nonverbal cues.
REFERENCES
Al-Darraji, S., Berns, K., and Rodić, A. (2017). Action unit based facial expression recognition using deep learning. In Rodić, A. and Borangiu, T., editors, Advances in Robot Design and Intelligent Control, pages 413–420, Cham. Springer International Publishing.
Argyle, M. (1988). Bodily Communication. Methuen.
Batrinca, L., Lepri, B., Mana, N., and Pianesi, F.
(2012). Multimodal recognition of personality traits in
human-computer collaborative tasks. In Proceedings
of the 14th ACM international conference on multi-
modal interaction, pages 39–46. ACM.
Batrinca, L. M., Mana, N., Lepri, B., Pianesi, F., and Sebe,
N. (2011). Please, tell me about yourself: automatic
personality assessment using short self-presentations.
In Proceedings of the 13th international conference on
multimodal interfaces, pages 255–262. ACM.
Boukricha, H., Wachsmuth, I., Hofstätter, A., and Gram-
mer, K. (2009). Pleasure-arousal-dominance driven
facial expression simulation. In 2009 3rd Interna-
tional Conference on Affective Computing and Intel-
ligent Interaction and Workshops, pages 1–7. IEEE.
Chartrand, T. L. and Bargh, J. A. (1999). The chameleon
effect: the perception–behavior link and social inter-
action. Journal of personality and social psychology,
76(6):893.
Costa Jr, P. T. and McCrae, R. R. (1976). Age differences
in personality structure: A cluster analytic approach.
Journal of gerontology, 31(5):564–570.
Hall, E. T. (1963). A system for the notation of proxemic behavior. American Anthropologist, 65(5):1003–1026.
Henderson, M. and Furnham, A. (1982). Similarity and
attraction: The relationship between personality, be-
liefs, skills, needs and friendship choice. Journal of
Adolescence, 5(2):111–123.
Hirth, J., Schmitz, N., and Berns, K. (2011). Towards social
robots: Designing an emotion-based architecture. In-
ternational Journal of Social Robotics, 3(3):273–290.
Jensen, M. (2016). Personality traits and nonverbal com-
munication patterns. Int’l J. Soc. Sci. Stud., 4:57.
Mehrabian, A. (1996). Pleasure-arousal-dominance: A gen-
eral framework for describing and measuring individ-
ual differences in temperament. Current Psychology,
14(4):261–292.
Nass, C. and Lee, K. M. (2001). Does computer-
synthesized speech manifest personality? experi-
mental tests of recognition, similarity-attraction, and
consistency-attraction. Journal of experimental psy-
chology: applied, 7(3):171.
Rodić, A., Urukalo, D., Vujović, M., Spasojević, S., Tomić,
M., Berns, K., Al-Darraji, S., and Zafar, Z. (2016).
Embodiment of human personality with ei-robots by
mapping behaviour traits from live-model. In Interna-
tional Conference on Robotics in Alpe-Adria Danube
Region, pages 438–448. Springer.
Russell, J. A. (1980). A circumplex model of affect. Journal
of personality and social psychology, 39(6):1161.
Russell, J. A. and Mehrabian, A. (1977). Evidence for a
three-factor theory of emotions. Journal of research
in Personality, 11(3):273–294.
Salam, H., Celiktutan, O., Hupont, I., Gunes, H., and
Chetouani, M. (2016). Fully automatic analysis of en-
gagement and its relationship to personality in human-
robot interactions. IEEE Access, 5:705–721.
Saleh, S. and Berns, K. (2015). Nonverbal communica-
tion with a humanoid robot via head gestures. In Pro-
ceedings of the 8th ACM International Conference on
PErvasive Technologies Related to Assistive Environ-
ments, page 15. ACM.
Watson, D. and Clark, L. A. (1997). Measurement and mis-
measurement of mood: Recurrent and emergent is-
sues. Journal of personality assessment, 68(2):267–
296.
Watson, D. and Tellegen, A. (1985). Toward a consensual
structure of mood. Psychological bulletin, 98(2):219.
Wundt, W. M. and Judd, C. H. (1897). Outlines of psychol-
ogy, volume 1. Scholarly Press.
Zafar, Z., Paplu, S. H., and Berns, K. (2018a). Au-
tomatic assessment of human personality traits: A
step towards intelligent human-robot interaction. In
2018 IEEE-RAS 18th International Conference on
Humanoid Robots (Humanoids), pages 1–9. IEEE.
Zafar, Z., Venugopal, R., and Berns, K. (2018b). Real-
time recognition of human postures for human-robot
interaction. In 11th International Conference on
Advances in Computer-Human Interactions (ACHI),
2018, pages 29–35.