The Potential of Telepresence Robots for Intergroup Contact
Avner Peled 1 (https://orcid.org/0000-0002-0525-6385), Teemu Leinonen 1 (https://orcid.org/0000-0002-6227-052X) and Béatrice Hasler 2 (https://orcid.org/0000-0002-3251-4677)
1 Department of Media, Aalto University, Espoo, Finland
2 Sammy Ofer School of Communications, Interdisciplinary Center Herzliya, Herzliya, Israel
Keywords:
Intergroup Contact, Human-Robot Interaction, Telepresence, Conflict Resolution.
Abstract:
We propose the use of telepresence robots as a medium for intergroup contact that aims at reducing prejudice
between groups in conflict. We argue for robots as a midpoint between online communication and a face-
to-face meeting, combining the flexibility of the virtual world and the depth of physical interactions. We
define the basic architecture of telepresence systems and present a conceptual framework for robot-mediated
encounters in an intergroup context. We then provide design guidelines for telepresence systems that may
guide future implementations of robotic intergroup contact.
1 INTRODUCTION
The pervasive role of technology in intensifying the
ability of humans to harm one another is well known;
the use of technology to promote peace at both the
collective and personal levels is considerably more
modest. Over the years there have been calls within
the Human-Computer Interaction research commu-
nity to promote the use of technology to support peace
in world conflicts (Hourcade and Bullock-Rest, 2011;
Eckert et al., 2019). Often when people think of a
technological contribution to conflict resolution, the
emphasis is placed on decision support and negotia-
tion for policymakers and national leaders. A different approach, taken in the current paper, is to use technology to reconcile the ‘common’ people in a situation of conflict and build more positive intergroup relations from the bottom up.
One of the most prominent models guiding this approach is the contact hypothe-
sis (Allport, 1954), which states that under the right
conditions, encounters with members of the opposing
group (i.e., the outgroup) can lead to reduced preju-
dice and more harmonious intergroup relations. We
propose using robots as a communication medium for
such contact, as they combine both the flexibility and
accessibility of online communication and the corpo-
reality of face-to-face encounters in a shared physical
space.
The hypotheses presented in this article are par-
tially based on observations from an initial test we
conducted on intergroup telepresence contact (Peled,
2019). The test system included one remotely con-
trolled telerobot that facilitated conversations be-
tween immigrants and local participants. We analyzed the results qualitatively through post-session interviews.
2 CONCEPTUAL FRAMEWORK
2.1 Intergroup Contact Hypothesis
The contact hypothesis, as formulated by Gordon All-
port in his seminal book The Nature of Prejudice
(1954), specifies four conditions that need to be ful-
filled during positive intergroup contact: equal status,
having common goals, active cooperation, and insti-
tutional support. Fifty years later, a meta-analysis
across more than 500 studies in a variety of inter-
group contexts (Pettigrew and Tropp, 2006) revealed that contact is an effective means to reduce prejudice. However, the meta-analysis also showed that the conditions are not strictly essential for a positive outcome, but rather factors among others that
facilitate it. Later research focused on expanding the
theory to include more conditions such as forming
cross-group friendships (Cook, 1962) and identifying
affective drivers, such as empathy and (reduced) anxi-
ety, that play a mediating role in contact interventions
(Pettigrew et al., 2011; Brown and Hewstone, 2005).
An additional factor that moderates the outcome of
contact is group salience, the degree to which the par-
ticipants’ group identity is evident. A high level of
group salience facilitates the generalization of atti-
tudes from the interpersonal level to the group level
(Voci and Hewstone, 2003).
Most previous intergroup contact studies were
conducted in face-to-face (FtF) settings. However,
face-to-face contact can be challenging to implement,
particularly in areas of violent conflict (Hasler and
Amichai-Hamburger, 2013). Organizers commonly
face practical issues such as gathering diverse groups,
finding a neutral, accessible location, and compensat-
ing participants for travel expenses. Therefore, recent
projects have used technology (especially online com-
munication) to facilitate intergroup encounters.
2.2 Online Contact
Communication technologies expand the models of
contact and add new modalities of interaction while
compromising on the benefits of traditional FtF en-
counters. Research on online intergroup contact has
shown its potential to reduce prejudice and aid in
conflict resolution (Amichai-Hamburger et al., 2015;
Hasler and Amichai-Hamburger, 2013; Walther et al.,
2015). However, online contact is not always con-
structive, and may result in a negative outcome and in-
creased prejudice. The remote nature of the medium
makes participants less accountable for their actions
and less engaged in the conversation (White et al.,
2015; Schumann et al., 2017). The lack of nonver-
bal cues (Burgoon and Hoobler, 1994) obstructs the
path to mutual understanding and impairs the turn-
taking process, which may evoke negative feelings
between the group members, such as anger and frus-
tration (Johnson et al., 2009).
Virtual reality (VR) has been studied as a medium that
offers an immersive communication experience that
increases the user’s sense of embodiment during com-
munication (Kilteni et al., 2012). It was positively
evaluated for use in intergroup contact, both as a
space for dialog (Hasler et al., 2014), and as a tool
that allows individuals to immerse themselves in the
perspective of the other side (Hasson et al., 2019; Ka-
biljo, 2019). However, along with its promise, VR
also raises a number of ethical and moral concerns.
While the experience of being in the virtual space
intensifies as the technology develops, our corporeal
body is left behind as we subsume an abstract rep-
resentation as our new reality (Penny, 1993). This
quintessential mind-body split may alter one’s rela-
tion to corporeality, leading to psychological deficits,
such as depersonalization and derealization or body
neglect (Spiegel, 2018). Additionally, immersive
perspective-taking risks assuming an ‘improper
distance’ (Chouliaraki, 2011; Nash, 2018) between
the viewer and the outgroup member, in which one
subordinates the other, incorporating their representa-
tion, rather than recognizing their irreducible alterity.
Prejudice can be seen as an abstraction of the
human body (Ahmed, 2000); yet despite the inher-
ent abstraction in virtual mediums and the widely
recognized role of the body in forming social cogni-
tion (Dewey, 1986; Merleau-Ponty, 2013; Gallagher,
2006; Malafouris, 2013), little attention has been
given to robots as a tool for intergroup contact. Re-
motely controlled robots (telerobots) have a lot in
common with online mediums and may carry similar
risks when used for contact. Nevertheless, telerobots
have a physical presence; we use our bodies to inter-
act with robots just as we would with a living being.
They provide corporeal depth to mediated contact, sit-
uating a midpoint between online communication and
an FtF meeting.
2.3 Telepresence and Telerobots
Originally, the term telepresence was used by Mar-
vin Minsky and Patrick Gunkel to describe a vision of
a futuristic economy in which people perform man-
ual, physical labor from remote locations (Minsky,
1980). Although the term is nowadays used to de-
scribe a human’s presence in a virtual environment
(Steuer, 1992), telepresence originally referred to the experience of being in a remote environment that is real and mediated by a physical sensing agent, that is, a telerobot (Campanella, 2000). When a telerobot
serves as a remote representation of a human oper-
ator, it is referred to as its avatar. In phenomeno-
logical terms, the experience of operating a teler-
obot is named re-embodiment (Dolezal, 2009). To-
day’s telerobots go beyond industrial use and are de-
ployed in social care (Michaud et al., 2007), education
(Tanaka et al., 2014), and interpersonal communica-
tion (Ogawa et al., 2011), utilizing the internet as the
medium for tele-operation.
2.4 A Conceptual Model for
Telepresence Contact
Based on previous models of prejudice reduction in
intergroup contact (Pettigrew, 1998; Brown and Hew-
stone, 2005), we suggest a conceptual model for
telepresence-based contact (see Fig. 1).

Figure 1: Telepresence Contact: Conceptual model.

We hypothesize that an ingroup member first develops an attitude
toward the robot before projecting it onto the outgroup
human operator. The initial attitude toward the
robot could be influenced by a previous general bias
or by characteristics of the particular robot. We then
expect the perception of the robot as a representation
of the operator’s agency to be moderated by the de-
gree of perceived co-presence. Initially formulated
by Goffman as a measure of our awareness of an-
other human being in our physical space (Goffman,
2008), the term is now used in the literature to measure the feeling of “togetherness” in mediated communication, virtual (Söeffner and Nam, 2007; Casanueva and
Blake, 2001; Bente et al., 2008), and physical (Hwang
et al., 2008; Choi and Kwak, 2017). Finally, as previ-
ous research on intergroup contact suggests (Voci and
Hewstone, 2003; Brown and Hewstone, 2005; Ken-
worthy et al., 2005), a generalized attitude toward the
outgroup is moderated by the level of group salience
apparent in the conversation.
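To make the hypothesized chain concrete, the model of Fig. 1 can also be expressed as a minimal computational sketch. The listing below is a schematic illustration under our own assumptions (the variable names, value ranges, and multiplicative moderation are ours); it is not a validated psychological model or an implementation used in the study.

from dataclasses import dataclass

@dataclass
class ContactSession:
    """Illustrative variables of the telepresence contact model (all in [0, 1])."""
    robot_bias: float           # prior general attitude toward robots
    robot_design_appeal: float  # attitude evoked by this particular robot
    co_presence: float          # perceived "togetherness" with the operator
    group_salience: float       # how evident the operator's group identity is

def attitude_toward_robot(s: ContactSession) -> float:
    # Initial attitude: prior bias blended with the impression of this robot.
    return 0.5 * s.robot_bias + 0.5 * s.robot_design_appeal

def attitude_toward_operator(s: ContactSession) -> float:
    # Projection onto the human operator, moderated by perceived co-presence:
    # with low co-presence, the attitude toward the robot barely transfers.
    return attitude_toward_robot(s) * s.co_presence

def generalized_outgroup_attitude(s: ContactSession) -> float:
    # Generalization to the outgroup, moderated by group salience:
    # low salience keeps the effect at the interpersonal level.
    return attitude_toward_operator(s) * s.group_salience

if __name__ == "__main__":
    session = ContactSession(robot_bias=0.6, robot_design_appeal=0.8,
                             co_presence=0.7, group_salience=0.5)
    print(f"Generalized attitude shift: {generalized_outgroup_attitude(session):.2f}")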
2.5 Telepresence Systems
A communication event that is mediated by telepres-
ence robots could manifest in different architectures
that we define as telepresence systems. We identify
three different types:
1. Asymmetric: Participant A (operator) is repre-
sented by a telerobot and is operating it from a re-
mote location using a computer or mobile device.
Participant B (interlocutor) is co-located with the
robot, interacting with it in a shared physical envi-
ronment. Implementations of asymmetric systems
include industrial robots, military robots, surgical
robots, office work telepresence, and social ser-
vice robots.
2. Symmetric bidirectional: Both participants are si-
multaneously interacting with a co-located robot
and operating their remote telerobot. The operators do not use a dedicated control interface as they would in a computer-based setup. In-
stead, they interact with the robot of their partner,
allowing it to capture their movements and trans-
mit them to the telerobot representing them on the
opposing end. This type of system is more chal-
lenging to implement, and only a few implemen-
tations exist as prototypes and proofs-of-concept
(Nagendran et al., 2015).
3. Symmetric unidirectional: Both participants are
operating a telerobot via a control interface, with-
out any physical human-robot interaction tak-
ing place. The two robots are co-located with
each other, while the participants are in sepa-
rate spaces. Implementations of this system in-
clude cooperative multi-robot tasks (Sirouspour
and Setoodeh, 2005) and robot combat competitions such as BattleBots (https://battlebots.com/). A minimal configuration sketch of these three system types is given below.
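The differences between the three system types can be summarized as a small configuration model. The following is a minimal sketch in Python under our own assumptions; the type names follow the list above, while the field names and derived endpoint descriptions are illustrative and do not refer to any existing system.

from dataclasses import dataclass
from enum import Enum, auto

class SystemType(Enum):
    ASYMMETRIC = auto()               # operator is remote; interlocutor is co-located with the robot
    SYMMETRIC_BIDIRECTIONAL = auto()  # each side faces a local robot and thereby operates a remote one
    SYMMETRIC_UNIDIRECTIONAL = auto() # both operate remote robots; the two robots share a space

@dataclass
class Endpoint:
    has_local_robot: bool        # is a telerobot physically present with this participant?
    has_control_interface: bool  # does this participant operate a remote telerobot?

def endpoints_for(system: SystemType) -> tuple:
    """Return the two participant endpoints implied by each architecture."""
    if system is SystemType.ASYMMETRIC:
        operator = Endpoint(has_local_robot=False, has_control_interface=True)
        interlocutor = Endpoint(has_local_robot=True, has_control_interface=False)
        return operator, interlocutor
    if system is SystemType.SYMMETRIC_BIDIRECTIONAL:
        # The co-located robot itself acts as the control interface.
        side = Endpoint(has_local_robot=True, has_control_interface=True)
        return side, side
    # SYMMETRIC_UNIDIRECTIONAL: the robots are co-located with each other, not with the participants.
    side = Endpoint(has_local_robot=False, has_control_interface=True)
    return side, side

if __name__ == "__main__":
    for system in SystemType:
        print(system.name, endpoints_for(system))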
3 TELEPRESENCE DESIGN
CONSIDERATIONS FOR
INTERGROUP CONTACT
Research in Human-Robot Interaction (HRI) over the past two decades offers insight into a wide range of
possibilities for designing social robots (robots that
conduct social interaction with humans). Key factors
that influence the attitude toward the robot include the
level of anthropomorphism of the robot’s appearance
(Hancock et al., 2011; Fink, 2012), the use of an ex-
ternal display on the body of the robot (Thrun, 2004;
Choi and Kwak, 2016), the use of affective touch and
soft materials (Kerruish, 2017; Stiehl et al., 2005; Bao
et al., 2018), and the use of nonverbal cues (Hirano
et al., 2016; Lala et al., 2019). In telepresence, research has focused on factors affecting the sense of pres-
ence and self-extension from operators toward their
robotic avatars. Key elements include the respon-
siveness and feedback level of the control interface
(Cole et al., 2000; Dolezal, 2009) and the appearance of
the robotic avatar (Lee et al., 2015a; Groom et al.,
2009). Fig. 2 depicts a variety of existing telerobot
designs. Designs vary from the commonly used video
conferencing tablet on wheels to full-body anthropo-
morphic, zoomorphic, and caricature appearances.
The above factors are all relevant for establishing
trust and positive relations between interlocutors and
robotic avatars. In this article, however, we focus on
the design and architectural elements that may be of
particular importance to intergroup contact and con-
flict resolution. The following sections hypothesize
potential pitfalls and opportunities one may encounter
when applying telepresence robots as a means to re-
duce prejudice between groups.
Figure 2: Telepresence Robots. Left to right: Double Robotics (https://www.doublerobotics.com/), Telenoid (Ogawa et al., 2011), BOCCO (https://www.bocco.me/en/), stuffed-bear robot (Kuwamura et al., 2012).
3.1 Equality
One of Allport’s conditions for positive intergroup
contact is having an equal status between group mem-
bers; for example, colleagues in a workplace context
(Allport, 1954). It was further shown that maintain-
ing symmetry and equality during communication is
beneficial for contact in groups that are in asymmet-
ric conflicts, such as the Israeli-Palestinian conflict
(Maoz, 2005).
Symmetric telepresence systems provide the hard-
ware foundation for equality in contact situations, but
asymmetric systems produce an experience that is dif-
ferent in nature for both sides. The side that is in-
teracting with the robot is less aware of the media-
tion that is taking place and may experience stronger
senses of agency (the sense that I am the initiator of an
act) and ownership (the sense that it is my body that
is moving) in the interaction (Gallagher, 2000; Cole
et al., 2000). The side that is operating the telerobot
from a distance is more aware of the control opera-
tion and may exhibit behaviors characterizing anony-
mous computer-mediated-communication (CMC), as
posited in the SIDE model (Spears et al., 2002) or the
hyperpersonal model (Walther, 1996).
In one use-case between advantaged and disad-
vantaged groups, a disadvantaged-group member may
operate a telerobot anonymously from their home,
while the advantaged-group member is interacting
with it in a public space. This scenario is likely to
reduce anxiety as the operator remains in their com-
fort zone and may feel empowered by the ability to
see through the robot’s camera while not being seen
by the interaction partner. That may not only lower
the participation threshold in an intergroup contact
project but may also encourage bringing up more dif-
ficult topics related to conflict during the conversa-
tion. However, such a reversed power asymmetry
in robotic intergroup encounters could also hinder
the experience. In an initial test case conducted in
an intercultural setting between minority and major-
ity groups in Finland, participants felt uncomfortable
with the asymmetry. One member of a minority group
noted that they felt as if they were a government offi-
cial investigating their exposed partners (Peled, 2019,
p.132).
3.2 Anthropomorphism and
Dehumanization
A pivotal discussion revolves around the question of
anthropomorphism: the degree to which a robot’s appearance and behavior resemble that of a human. Current literature paints a multifaceted picture (Fink, 2012): while anthropomorphic features may increase empathy and acceptance of the robot, the effect is context- and culture-dependent. In some cases, peo-
ple have preferred pet-shaped over human-like robots,
particularly in the realms of child therapy and elderly
care (Lorenz et al., 2016). Human-like robots may
also raise negative emotions when they appear eerily
human but are noticeably non-human (see the uncanny valley theory (Mori et al., 2012)). Addition-
ally, research by Groom et al. (2009) suggests that
robot operators have a greater sense of self-extension
(Belk, 1988) to their avatar when it is non-human.
Despite the advantages of a non-anthropomorphic
appearance, the question deepens in the context of
intergroup contact. Dehumanization (i.e., seeing an
outgroup member as non-human or less than human)
is both a marker and a driver of intergroup conflict
(Haslam, 2006; Kteily et al., 2016). Specifically, in-
dividuals involved in intergroup conflict tend to view
the outgroup as either animal-like or mechanistic au-
tomata, both common forms for robots. Current re-
search does not yet deal with the effects of avatar an-
thropomorphism on intergroup conflict, but a study
on video games points out that people find it easier to
make immoral decisions toward non-human avatars
(Lin, 2011). Additionally, the distance formed by
CMC increased dehumanization in decision making
(Lee et al., 2015b).
When using a zoomorphic, mechanistic, or caricatured image for a telerobot, measures should be
taken to mitigate dehumanization. Encouraging the
display of secondary human emotions such as affec-
tion and admiration may help group participants to
humanize one another (Leyens et al., 2000). Such
emotions could arise as a result of self-disclosure in the
conversation (Kashian et al., 2017), or by discussing
an unrelated case of group suffering (Gubler et al.,
2015). Additionally, the use of visual, auditory, and
intellectual cues that remind the interlocutors of the
human operator could mitigate the effect of a non-
anthropomorphic avatar.
3.3 Designing with Group Salience in
Mind
When participants are aware of their interaction part-
ner’s group membership and if the interaction partner
is regarded as a (typical) representative of his or her
group, positive effects of the interpersonal encounter
are more likely to generalize to the outgroup as a
whole. One approach, suggested by Pettigrew (1998),
is to expose group identities gradually, starting with
a low salience, allowing initial contact to form, and
increasing it over time as the interaction partners es-
tablish an interpersonal relationship.
Group identity can be transmitted through a variety of channels in robotic telepresence, beginning with the design of the avatar (its appearance, voice, and surroundings) and proceeding to the content of the interaction. A robotic avatar may have
a non-humanoid appearance, but still maintain group
identity through group symbols, cues, and language.
It may speak in a group-specific language or accent,
wear typical accessories or flaunt national colors. The
freedom to use material objects brings up new de-
sign possibilities that are not available in an online en-
counter. Group cues may be positioned in subtle ways
to be gradually revealed by the interlocutor. If the ini-
tial appearance and behavior of the robot are engaging
enough, an interpersonal bond may form despite the
presence of group-related cues.
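Pettigrew’s gradual-exposure strategy could be prototyped as a simple cue scheduler on the telerobot: group-related cues are ordered from subtle to explicit and revealed as the interaction progresses. The sketch below is one hypothetical realization; the cue inventory, the thresholds, and the rapport measure are our own assumptions rather than tested design parameters.

from dataclasses import dataclass

@dataclass
class GroupCue:
    description: str  # e.g., a subtle accessory, an accent, an explicit symbol
    salience: float   # 0 = barely noticeable, 1 = unmistakable group marker

# Hypothetical cue inventory, ordered from low to high salience.
CUES = [
    GroupCue("accent in synthesized or relayed voice", 0.2),
    GroupCue("typical accessory on the robot body", 0.5),
    GroupCue("national colors / explicit group symbol", 0.9),
]

def cues_to_reveal(minutes_elapsed: float, rapport: float) -> list:
    """Reveal more salient cues as time passes and interpersonal rapport grows.

    `rapport` in [0, 1] stands in for any measure of how well the
    interaction is going (e.g., rated by a facilitator).
    """
    # The salience budget grows with both elapsed time and rapport.
    budget = min(1.0, minutes_elapsed / 30.0) * rapport
    return [cue for cue in CUES if cue.salience <= budget]

if __name__ == "__main__":
    for minute in (5, 15, 30):
        revealed = cues_to_reveal(minute, rapport=0.8)
        print(minute, "min:", [c.description for c in revealed])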
3.4 Language Translation
The outstanding benefit of mediated verbal interac-
tion for intergroup contact is the ability to translate
between different languages and dialects (Amichai-
Hamburger, 2012). Often groups in conflict do not
speak a common language and are required to speak
in a third language or in the language of the advan-
taged/majority group. This situation forms an obsta-
cle to achieving equality in contact. Language trans-
lation may reinforce equality in communication, as
all participants can express themselves in their na-
tive language. Machine translation, however, may
also be destructive to cultural and political nuances
(Lehman-Wilzig, 2000; Cronin, 2012), and contem-
porary deep-learning translators exhibit stylistic and
gender bias based on their training datasets (Hovy
et al., 2020; Stanovsky et al., 2019). Models such
as Timo Honkela’s “peace machine” (Honkela, 2017; Koulu and Kontiainen, 2019) attempt to resolve this problem by preserving culture-dependent meanings
within a translation.
While no data exist on the implications of machine
translation in intergroup contact, human translators
and interpreters often suffer from a lack of trust by
the participants, who fear bias and misinterpretation (Monzó-Nebot, 2019); a machine translator may entail similar risks. In our initial test of automatic
language translation in contact between minority and
majority groups in Finland, participants enjoyed their
newly acquired ability to speak to one another in their
own language, but raised concerns about being mis-
represented by the machine (Peled, 2019). Further re-
search should focus on implementing and evaluating
feedback mechanisms within the translation process
that may reduce the fear of misinterpretation.
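One candidate feedback mechanism is a back-translation loop: before an utterance is delivered, it is translated to the partner’s language and then back to the speaker’s language, so the speaker can confirm that the meaning survived or choose to rephrase. The sketch below outlines such a loop around a generic translate function; the function signatures and the confirmation step are placeholders under our own assumptions, not a specific translation API.

from typing import Callable, Optional

# Placeholder signature for any machine translation backend:
# translate(text, source_lang, target_lang) -> translated text.
Translator = Callable[[str, str, str], str]

def deliver_with_feedback(utterance: str, src: str, dst: str,
                          translate: Translator,
                          confirm: Callable[[str, str], bool]) -> Optional[str]:
    """Translate an utterance, show the speaker a back-translation,
    and only deliver it if the speaker confirms the meaning survived."""
    forward = translate(utterance, src, dst)
    back = translate(forward, dst, src)
    # The speaker compares their original phrasing with the round-trip result.
    if confirm(utterance, back):
        return forward  # deliver to the interlocutor
    return None         # speaker rephrases instead

if __name__ == "__main__":
    # Toy stand-ins for demonstration only.
    fake_translate: Translator = lambda text, s, d: f"[{s}->{d}] {text}"
    always_confirm = lambda original, back_translated: True
    print(deliver_with_feedback("Hello neighbour", "fi", "ar",
                                fake_translate, always_confirm))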
3.5 Public Space Interventions
Robots can transcend both physical borders set by
governments and online borders set by IT corpo-
rations (Del Vicario et al., 2016). They have the
potential to reach crowds that would not normally
engage in intergroup contact. One might consider
whether the group identity of the telerobot’s opera-
tor should be widely exposed to passersby, allowing
them to make a voluntary decision to approach, or
whether they should only realize it during the con-
versation. A meta-analysis by Pettigrew et al. (2011)
concluded that negative contact outcomes were worse when the contact was initiated involuntarily. How-
ever, when the group identity is known, members may
avoid interaction (Wessel, 2009), which results in a
self-selection bias in organized interventions between
groups (Maoz, 2011). A balanced approach may work
well here: Some cues could be exposed, providing
only subtle hints about the telerobot’s identity, allow-
ing passersby to approach an intergroup encounter
voluntarily.
4 DISCUSSION
The guidelines set forth in this article provide a foun-
dation for further research and implementation of an
innovative medium for intergroup contact. Any organized form of intergroup contact should always be
questioned and scrutinized regarding its internal moti-
vation, especially in the context of violent, asymmet-
ric conflict. When involving technology, a transparent
and fully open-source strategy should be employed
to expose inherent biases and avoid concentrations of
power. The practice of Co-Design can increase the in-
volvement of minority groups in the process, dissem-
inate technological knowledge, and reduce the notion
of a higher power coming from above to restore peace without understanding the situation and its nuances.
Moreover, Groom et al. showed that operators had
a greater sense of self-extension to a robot that was as-
sembled by the operators themselves rather than by someone else (Groom et al.,
2009). Robots were also successfully co-designed
with children as the target users (Alves-Oliveira et al.,
2017; Henkemans et al., 2016), and co-design meth-
ods improved the general attitude of students toward
robots in educational settings (Reich-Stiebert et al.,
2019).
Toward the first deployment of telepresence robots in sensitive conflict situations, benefits and risks should be carefully evaluated in dedicated focus groups
and with A/B testing of particular features and inter-
actions. We look forward to unveiling the potential of
telepresence robots for intergroup contact in further
research, ultimately leading to positive social change.
REFERENCES
Ahmed, S. (2000). Strange Encounters : Embodied Others
in Post-Coloniality. Routledge, London.
Allport, G. W. (1954). The Nature of Prejudice. Addison-
Wesley, Oxford, England.
Alves-Oliveira, P., Arriaga, P., Paiva, A., and Hoffman, G.
(2017). YOLO, a Robot for Creativity: A Co-Design
Study with Children. In Proceedings of the 2017 Con-
ference on Interaction Design and Children, pages
423–429, Stanford California USA. ACM.
Amichai-Hamburger, Y. (2012). Reducing intergroup con-
flict in the digital age. The handbook of intergroup
communication, pages 181–193.
Amichai-Hamburger, Y., Hasler, B. S., and Shani-Sherman,
T. (2015). Structured and unstructured intergroup con-
tact in the digital age. Computers in Human Behavior,
52:515–522.
Bao, G., Fang, H., Chen, L., Wan, Y., Xu, F., Yang, Q., and
Zhang, L. (2018). Soft robotics: Academic insights
and perspectives through bibliometric analysis. Soft
robotics, 5(3):229–241.
Belk, R. W. (1988). Possessions and the Extended Self.
Journal of Consumer Research, 15(2):139.
Bente, G., Rüggenberg, S., Krämer, N. C., and Eschen-
burg, F. (2008). Avatar-Mediated Networking: In-
creasing Social Presence and Interpersonal Trust in
Net-Based Collaborations. Human Communication
Research, 34(2):287–318.
Brown, R. and Hewstone, M. (2005). An integrative the-
ory of intergroup contact. Advances in experimental
social psychology, 37(37):255–343.
Burgoon, J. K. and Hoobler, G. D. (1994). Nonverbal
signals. Handbook of interpersonal communication,
2:229–285.
Campanella, T. (2000). Eden by wire: Webcameras and the
telepresent landscape. pages 22–46.
Casanueva, J. and Blake, E. (2001). The effects of avatars
on co-presence in a collaborative virtual environment.
Choi, J. J. and Kwak, S. S. (2016). Can you feel me?:
How embodiment levels of telepresence systems af-
fect presence. In 2016 25th IEEE International Sym-
posium on Robot and Human Interactive Communica-
tion (RO-MAN), pages 606–611, New York, NY, USA.
IEEE.
Choi, J. J. and Kwak, S. S. (2017). Who is this?: Identity
and presence in robot-mediated communication. Cog-
nitive Systems Research, 43:174–189.
Chouliaraki, L. (2011). ‘Improper distance’: Towards a crit-
ical account of solidarity as irony. International Jour-
nal of Cultural Studies, 14(4):363–381.
Cole, J., Sacks, O., and Waterman, I. (2000). On the immu-
nity principle: A view from a robot. Trends in Cogni-
tive Sciences, 4(5):167.
Cook, S. W. (1962). The systematic analysis of socially sig-
nificant events: A strategy for social research. Journal
of Social Issues, 18(2):66–84.
Cronin, M. (2012). Translation in the Digital Age. Rout-
ledge, first edition.
Del Vicario, M., Vivaldo, G., Bessi, A., Zollo, F., Scala, A.,
Caldarelli, G., and Quattrociocchi, W. (2016). Echo
chambers: Emotional contagion and group polariza-
tion on facebook. Scientific reports, 6:37825.
Dewey, J. (1986). Experience and education. In The Edu-
cational Forum, volume 50, pages 241–252. Taylor &
Francis.
Dolezal, L. (2009). The Remote Body: The Phenomenol-
ogy of Telepresence and Re-Embodiment. Human
Technology, 5(November):208–226.
Eckert, C., Isaksson, O., Hallstedt, S., Malmqvist, J.,
Rönnbäck, A. Ö., and Panarotto, M. (2019). Indus-
try Trends to 2040. In Proceedings of the Design So-
ciety: International Conference on Engineering De-
sign, volume 1, pages 2121–2128. Cambridge Univer-
sity Press.
Fink, J. (2012). Anthropomorphism and human likeness
in the design of robots and human-robot interaction.
In International Conference on Social Robotics, pages
199–208. Springer.
Gallagher, S. (2000). Philosophical conceptions of the self:
Implications for cognitive science. Trends in Cogni-
tive Sciences, 4(1):14–21.
Gallagher, S. (2006). How the Body Shapes the Mind.
Clarendon Press.
Goffman, E. (2008). Behavior in Public Places. Simon and
Schuster.
Groom, V., Takayama, L., Ochi, P., and Nass, C. (2009).
I am my robot: The impact of robot-building and
robot form on operators. In Proceedings of the
4th ACM/IEEE International Conference on Human
Robot Interaction - HRI ’09, page 31, La Jolla, Cali-
fornia, USA. ACM Press.
Gubler, J. R., Halperin, E., and Hirschberger, G. (2015).
Humanizing the Outgroup in Contexts of Protracted
Intergroup Conflict. Journal of Experimental Political
Science, 2(1):36–46.
Hancock, P. A., Billings, D. R., Schaefer, K. E., Chen, J.
Y. C., de Visser, E. J., and Parasuraman, R. (2011). A
Meta-Analysis of Factors Affecting Trust in Human-
Robot Interaction. Human Factors: The Journal of the
Human Factors and Ergonomics Society, 53(5):517–
527.
Haslam, N. (2006). Dehumanization: An Integrative Re-
view. Personality and Social Psychology Review,
10(3):252–264.
Hasler, B. S. and Amichai-Hamburger, Y. (2013). Online
Intergroup Contact. In Amichai-Hamburger, Y., edi-
tor, The Social Net, pages 220–252. Oxford University
Press.
Hasler, B. S., Hirschberger, G., Shani-Sherman, T., and
Friedman, D. A. (2014). Virtual Peacemakers:
Mimicry Increases Empathy in Simulated Contact
with Virtual Outgroup Members. Cyberpsychology,
Behavior, and Social Networking, 17(12):766–771.
Hasson, Y., Schori-Eyal, N., Landau, D., Hasler, B. S.,
Levy, J., Friedman, D., and Halperin, E. (2019).
The enemy’s gaze: Immersive virtual environ-
ments enhance peace promoting attitudes and emo-
tions in violent intergroup conflicts. PLOS ONE,
14(9):e0222342.
Henkemans, O. B., Neerincx, M., Pal, S., Van Dam, R.,
Hong, J. S., Oleari, E., Pozzi, C., Sardu, F., and Sac-
chitelli, F. (2016). Co-Design of the Pal Robot and
Avatar That Perform Joint Activities with Children
for Improved Diabetes Self-Management. New York:
IEEE Press.
Hirano, T., Shiomi, M., Iio, T., Kimoto, M., Nagashio, T.,
Tanev, I., Shimohara, K., and Hagita, N. (2016). Com-
munication Cues in a Human-Robot Touch Interac-
tion. In Proceedings of the Fourth International Con-
ference on Human Agent Interaction - HAI ’16, pages
201–206, Biopolis, Singapore. ACM Press.
Honkela, T. (2017). Rauhankone: Tekoälytutkijan testamentti.
Hourcade, J. P. and Bullock-Rest, N. E. (2011). HCI for
peace: A call for constructive action. In Proceedings
of the 2011 Annual Conference on Human Factors in
Computing Systems - CHI ’11, page 443, Vancouver,
BC, Canada. ACM Press.
Hovy, D., Bianchi, F., and Fornaciari, T. (2020). Can You
Translate that into Man? Commercial Machine Trans-
lation Systems Include Stylistic Biases. In Proceed-
ings of the 58th Annual Meeting of the Association for
Computational Linguistics.
Hwang, J., Lee, S., Ahn, S. C., and Kim, H.-G. (2008). Augmented robot agent: Enhanc-
ing co-presence of the remote participant. In 2008 7th
IEEE/ACM International Symposium on Mixed and
Augmented Reality, pages 161–162, Cambridge, UK.
IEEE.
Johnson, N. A., Cooper, R. B., and Chin, W. W. (2009).
Anger and flaming in computer-mediated negotia-
tion among strangers. Decision Support Systems,
46(3):660–672.
Kabiljo, L. (2019). Virtual Reality Fostering Empathy:
Meet the Enemy. Studies in Art Education, 60(4):317–
320.
Kashian, N., Jang, J.-w., Shin, S. Y., Dai, Y., and Walther,
J. B. (2017). Self-disclosure and liking in computer-
mediated communication. Computers in Human Be-
havior, 71:275–283.
Kenworthy, J. B., Turner, R. N., Hewstone, M., and Voci,
A. (2005). Intergroup contact: When does it work,
and why. On the nature of prejudice: Fifty years after
Allport, pages 278–292.
Kerruish, E. (2017). Affective Touch in Social Robots.
Transformations (14443775), (29).
Kilteni, K., Groten, R., and Slater, M. (2012). The Sense of
Embodiment in Virtual Reality. Presence: Teleopera-
tors and Virtual Environments, 21(4):373–387.
Koulu, R. and Kontiainen, L. E. (2019). How Will AI Shape
the Future of Law?
Kteily, N., Hodson, G., and Bruneau, E. (2016). They
see us as less than human: Metadehumanization pre-
dicts intergroup conflict via reciprocal dehumaniza-
tion. Journal of Personality and Social Psychology,
110(3):343–370.
Kuwamura, K., Minato, T., Nishio, S., and Ishiguro,
H. (2012). Personality distortion in communication
through teleoperated robots. In 2012 IEEE RO-MAN:
The 21st IEEE International Symposium on Robot
and Human Interactive Communication, pages 49–54,
Paris, France. IEEE.
Lala, D., Inoue, K., and Kawahara, T. (2019). Smooth
Turn-taking by a Robot Using an Online Continuous
Model to Generate Turn-taking Cues. In 2019 Inter-
national Conference on Multimodal Interaction, ICMI
’19, pages 226–234, Suzhou, China. Association for
Computing Machinery.
Lee, H., Kim, Y.-H., Lee, K.-k., Yoon, D.-K., and You, B.-J.
(2015a). Designing the appearance of a telepresence
robot, M4K: A case study. In International Workshop
on Cultural Robotics, pages 33–43. Springer.
Lee, M. K., Fruchter, N., and Dabbish, L. (2015b). Making
Decisions From a Distance: The Impact of Techno-
logical Mediation on Riskiness and Dehumanization.
In Proceedings of the 18th ACM Conference on Com-
puter Supported Cooperative Work & Social Comput-
ing - CSCW ’15, pages 1576–1589, Vancouver, BC,
Canada. ACM Press.
Lehman-Wilzig, S. (2000). The Tower of Babel vs the
power of babble: Future political, economic and cul-
tural consequences of synchronous, automated trans-
lation systems (SATS). New Media & Society,
2(4):467–494.
Leyens, J.-P., Paladino, P. M., Rodriguez-Torres, R., Vaes,
J., Demoulin, S., Rodriguez-Perez, A., and Gaunt, R.
(2000). The Emotional Side of Prejudice: The Attri-
bution of Secondary Emotions to Ingroups and Out-
groups. Personality and Social Psychology Review,
4(2):186–197.
Lin, S.-F. (2011). Effect of Opponent Type on Moral
Emotions and Responses to Video Game Play. Cy-
berpsychology, Behavior, and Social Networking,
14(11):695–698.
Lorenz, T., Weiss, A., and Hirche, S. (2016). Synchrony and
Reciprocity: Key Mechanisms for Social Companion
Robots in Therapy and Care. International Journal of
Social Robotics, 8(1):125–143.
Malafouris, L. (2013). How Things Shape the Mind. MIT
Press.
Maoz, I. (2005). Evaluating the Communication between
Groups in Dispute: Equality in Contact Interventions
between Jews and Arabs in Israel. Negotiation Jour-
nal, 21(1):131–146.
Maoz, I. (2011). Does contact work in protracted asymmet-
rical conflict? Appraising 20 years of reconciliation-
aimed encounters between Israeli Jews and Palestini-
ans. Journal of Peace Research, 48(1):115–125.
Merleau-Ponty, M. (2013). Phenomenology of Perception.
Routledge.
Michaud, F., Boissy, P., Labonte, D., Corriveau, H., Grant,
A., Lauria, M., Cloutier, R., Roux, M.-A., Iannuzzi,
D., and Royer, M.-P. (2007). Telepresence Robot for
Home Care Assistance. In AAAI Spring Symposium:
Multidisciplinary Collaboration for Socially Assistive
Robotics, pages 50–55. California, USA.
Minsky, M. (1980). Telepresence.
Monzó-Nebot, E. (2019). Translators and interpreters as
agents of diversity. Managing myths and pursuing jus-
tice in postmonolingual societies. Translating and In-
terpreting Justice in a Postmonolingual Age, page 9.
Mori, M., MacDorman, K. F., and Kageki, N. (2012). The
Uncanny Valley [From the Field]. IEEE Robotics Au-
tomation Magazine, 19(2):98–100.
Nagendran, A., Steed, A., Kelly, B., and Pan, Y. (2015).
Symmetric telepresence using robotic humanoid sur-
rogates: Robotic symmetric telepresence. Computer
Animation and Virtual Worlds, 26(3-4):271–280.
Nash, K. (2018). Virtual reality witness: Exploring the
ethics of mediated presence. Studies in Documentary
Film, 12(2):119–131.
Ogawa, K., Nishio, S., Koda, K., Taura, K., Minato, T.,
Ishii, C. T., and Ishiguro, H. (2011). Telenoid:
Tele-presence android for communication. In ACM
SIGGRAPH 2011 Emerging Technologies on - SIG-
GRAPH ’11, pages 1–1, Vancouver, British Columbia,
Canada. ACM Press.
Peled, A. (2019). Soft Robotic Incarnation. PhD thesis,
Aalto University.
Penny, S. (1993). Virtual Bodybuilding. Media Information
Australia, 69(1):17–22.
Pettigrew, T. F. (1998). Intergroup contact theory. Annual
review of psychology, 49(1):65–85.
Pettigrew, T. F. and Tropp, L. R. (2006). A meta-analytic
test of intergroup contact theory. Journal of Personal-
ity and Social Psychology, 90(5):751–783.
Pettigrew, T. F., Tropp, L. R., Wagner, U., and Christ, O.
(2011). Recent advances in intergroup contact the-
ory. International Journal of Intercultural Relations,
35(3):271–280.
Reich-Stiebert, N., Eyssel, F., and Hohnemann, C. (2019).
Involve the user! Changing attitudes toward robots
by user participation in a robot prototyping process.
Computers in Human Behavior, 91:290–296.
Schumann, S., Klein, O., Douglas, K., and Hewstone, M.
(2017). When is computer-mediated intergroup con-
tact most promising? Examining the effect of out-
group members’ anonymity on prejudice. Computers
in Human Behavior, 77:198–210.
Sirouspour, S. and Setoodeh, P. (2005). Multi-
operator/multi-robot teleoperation: An adaptive non-
linear control approach. In 2005 IEEE/RSJ Interna-
tional Conference on Intelligent Robots and Systems,
pages 1576–1581. IEEE.
Söeffner, J. and Nam, C. S. (2007). Co-presence in shared
virtual environments: Avatars beyond the opposition
of presence and representation. In International Con-
ference on Human-Computer Interaction, pages 949–
958. Springer.
Spears, R., Postmes, T., Lea, M., and Wolbert, A. (2002).
When are net effects gross products? Communication.
Journal of Social Issues, 58(1):91–107.
Spiegel, J. S. (2018). The Ethics of Virtual Reality Technol-
ogy: Social Hazards and Public Policy Recommenda-
tions. Science and Engineering Ethics, 24(5):1537–
1550.
Stanovsky, G., Smith, N. A., and Zettlemoyer, L. (2019).
Evaluating gender bias in machine translation. arXiv
preprint arXiv:1906.00591.
Steuer, J. (1992). Defining Virtual Reality: Dimensions De-
termining Telepresence. Journal of Communication,
42(4):73–93.
Stiehl, W., Lieberman, J., Breazeal, C., Basel, L., Lalla, L.,
and Wolf, M. (2005). Design of a therapeutic robotic
companion for relational, affective touch. In ROMAN
2005. IEEE International Workshop on Robot and Hu-
man Interactive Communication, 2005., pages 408–
415, Nashville, TN, USA. IEEE.
Tanaka, F., Takahashi, T., Matsuzoe, S., Tazawa, N., and
Morita, M. (2014). Telepresence robot helps children
in communicating with teachers who speak a different
language. In Proceedings of the 2014 ACM/IEEE In-
ternational Conference on Human-Robot Interaction
- HRI ’14, pages 399–406, Bielefeld, Germany. ACM
Press.
Thrun, S. (2004). Toward a Framework for Human-Robot
Interaction. Human–Computer Interaction, 19(1-
2):9–24.
Voci, A. and Hewstone, M. (2003). Intergroup Contact and
Prejudice Toward Immigrants in Italy: The Media-
tional Role of Anxiety and the Moderational Role of
Group Salience. Group Processes & Intergroup Rela-
tions, 6(1):37–54.
Walther, J. B. (1996). Computer-mediated communication:
Impersonal, interpersonal, and hyperpersonal interac-
tion. Communication research, 23(1):3–43.
Walther, J. B., Hoter, E., Ganayem, A., and Shonfeld, M.
(2015). Computer-mediated communication and the
reduction of prejudice: A controlled longitudinal field
experiment among Jews and Arabs in Israel. Comput-
ers in Human Behavior, 52:550–558.
Wessel, T. (2009). Does Diversity in Urban Space Enhance
Intergroup Contact and Tolerance? Geografiska An-
naler: Series B, Human Geography, 91(1):5–17.
White, F. A., Harvey, L. J., and Abu-Rayya, H. M. (2015).
Improving Intergroup Relations in the Internet Age:
A Critical Review. Review of General Psychology,
19(2):129–139.