Tangible Interaction for Simple 3D Interaction Tasks:
Comparing Device-In-Hand and Hand-In-Device Scenarios
Mirjam Augstein¹, Thomas Neumayr², Stephan Vrecer², Werner Kurschl³ and Josef Altmann¹

¹Communication and Knowledge Media, University of Applied Sciences Upper Austria, Hagenberg, Austria
²Research & Development, University of Applied Sciences Upper Austria, Hagenberg, Austria
³Human-Centered Computing, University of Applied Sciences Upper Austria, Hagenberg, Austria
Keywords: Tangible Interaction, Device-in-Hand Interaction, Hand-in-Device Interaction, Interaction Directions.
Abstract: Although touch-based input with a reduced amount of haptic guidance has recently gained popularity, traditional tangible input devices like mice or joysticks are still indispensable in everyday human-computer interaction. Most traditional tangible devices are implemented in so-called device-in-hand settings. There, the user's hand grabs the device and the device-hand system is then moved to trigger an input activity. Thus, the user's hand posture usually stays relatively stable. Hand-in-device approaches are an alternative form of tangible interaction setting where the user's hand moves within a tangible device. Such a setting differs considerably as i) the user's hand posture is more flexible, and ii) the device itself is stable while the interacting hand moves. This paper describes a comparative study on users' interaction performance with device-in-hand and hand-in-device settings for simple 3D interaction tasks. Further, it contributes to the body of knowledge on favourable interaction directions (left or right).
1 INTRODUCTION
Most traditional input devices like mice or joysticks strongly rely on a haptic experience during the interaction process. In recent years, the advent of smartphone and tablet technology has brought about a sharp increase in the use of touch screen interfaces, with ever less haptics involved in the interaction process. Most such devices do not (any more) provide physical buttons or other tangible hardware elements (other than the touch screen itself).

A lot of effort has recently been put into re-introducing a haptic experience for touch screen interfaces (see, e.g., (Ciesla et al., 2013; Kincaid, 2012; Bau et al., 2010) for work on tactile touch screens and haptic guidance for touch screens). Guidance through haptics is especially important when interactive systems should be well controllable without permanent visual observation. This is most relevant in the automotive or industrial sectors but also for traditional, physically split interactive settings (such as a mouse-monitor combination) where the user's eyes usually rest on the screen most of the time and the mouse handling is controlled via the visual output (e.g., the mouse cursor's movement on the screen) only. For everyday human-computer interaction settings, particularly those aside from embedded input/visualization systems like most touch screens that generally allow for a more direct way of interaction, tangible input devices are still indispensable.
The focus of this paper lies on tangible interaction for simple 3D interaction tasks. By tangible interaction we understand interaction that relies on permanent physical contact between the user's interacting hand and the input device. This kind of scenario is most common, e.g., in the gaming sector or for the operation of control panels in the industrial area. For both, precision and thus also guidance is decisive. The most popular form of tangible input device is one held in the user's interacting hand (e.g., a mouse or joystick). In this paper we will refer to such settings as Device-In-Hand (DIH) scenarios, whereas in so-called Hand-In-Device (HID) settings, the user's interacting hand is moved within the device (which also defines physical boundaries) while the hand posture itself remains flexible.

Both types of settings involve advantages. In DIH settings, we expect the user to be better able to control the input process as most related devices offer high hand-device stability. In HID settings, the user is provided with higher flexibility and can choose a hand posture according to individual prerequisites.
This paper reports a study on users' interaction performance with representative DIH and HID input devices. Further, it analyzes differences in performance related to the interaction direction (left or right, i.e., towards or away from the body respectively).
2 RELATED WORK
This section describes related work on DIH and HID
approaches and preferred interaction directions.
2.1 Interaction Approaches
(Shen et al., 2011) distinguish between two forms of interaction that are possible with hands: i) device-assisted hand interaction and ii) bare-hand interaction. In some aspects, this distinction is similar to ours. In both HID and bare-hand interaction, the user's interacting hand does not grab and hold on to a physical device and the hand posture is flexible. Device-assisted and DIH interaction both involve permanent physical contact between the user's hand and a device. However, the definitions differ in other aspects, the most important being that we focus on tangible HID interaction, which involves permanent physical contact between user and device and thus also haptic guidance. The latter is an aspect that is usually missing with touchless input (although there have already been endeavours to re-introduce haptic feedback for touchless settings, see, e.g., (Pfeiffer et al., 2014; Carter et al., 2013)).
In general, a lot of work can be found on DIH interaction for 3D navigation tasks. In most cases these approaches use 3D mice. We picked out a few we consider most relevant within the scope of this paper and describe them as follows. (Stannus et al., 2011) compared touchless gestural interaction to two DIH interaction techniques for navigation tasks in Google Earth. For DIH interaction they used a 3D mouse (the concrete model is not named in their paper) and a conventional mouse. More recently, a similar study was conducted by (Tscharn et al., 2016), who also analyzed users' performance at 3D navigation tasks with SpaceNavigator (a DIH input device) and the Leap Motion controller (for bare-hand input). In both cases the task-based study design, the interaction tasks (of varying complexity) and the metrics analyzed (e.g., their accuracy closely matches our Regularity) are similar to ours.
In addition to the parallels between HID and touchless interaction with the Leap Motion controller, there are many similarities between the DIH device used in our study and the ones of (Stannus et al., 2011; Tscharn et al., 2016). SpaceNavigator as well as the joystick we use (a Thrustmaster T.Flight Stick X) are devices that are held in hand (i.e., grabbed once and then held on to during the further process). Additionally, the physical appearance of SpaceNavigator is reminiscent of a small joystick, and both use, e.g., push/pull activities for forwards/backwards movement. They differ in some other aspects like the movement range of the physical device or the number of supported DoF.
A different comparison of DIH to bare-hand in-air interaction is described by (Dangeti et al., 2016), who focus on navigation of 3D objects in modeling environments. They propose a prototype utilizing Microsoft Kinect for gesture recognition and an iPhone or Play-Doh/Lego augmented with an accelerometer as a tangible solution. While their work is still preliminary, the scope of interest is similar to ours.
Tangible HID is less popular than tangible DIH interaction as most common HID approaches use touchless input. Yet, (Sato et al., 2012) discuss novel application areas for capacitive sensing capabilities, amongst them a fish tank filled with water. Although the research aims of (Sato et al., 2012) are not directly related to ours, their approach is highly interesting for us as their interactive water tank constitutes a HID approach according to our definition: there is permanent physical contact between user and “device” and resistance provided by the water. Further, the tank borders physically limit the interaction space.
A related study of (Augstein et al., 2018) compared a tangible HID approach to touchless input (without provision of physical borders, thus not a HID setting according to our definition) and an input technique that can be classified as somewhere in between tangible HID and fully touchless input without physical constraints: it uses physical boundaries in the form of a box and touchless input (however also allowing for the walls of the box to be touched, interpreting also the amount of pressure). The study used SpongeBox, the device that is also used for the study described in this paper as an example for HID (see Section 3.2), the Leap Motion controller for touchless input and another device prototype called SquareSense for the third kind of interaction just explained.

Although the methods for gathering data related to users' interaction performance are similar to what is described in this paper (see Section 4), the study of (Augstein et al., 2018) differed drastically regarding its research questions. It was intended to compare interaction with three devices involving a different amount of haptics along several interaction performance as well as User Experience criteria.
2.2 Favorable Hand-Arm Movements
Prior work in the fields of ergonomics and workplace safety dealt with the question of which movements of the hand-arm system are favorable for humans. (Strasser et al., 1989) and (Strasser and Müller, 1999) conducted experiments throughout a decade using electromyography to detect strain for different arm movements on a horizontal plane and found out that angles (to the body plane) of about 30° were optimal whereas 150° were most suboptimal, i.e., the areas towards or closer to the body were preferred.

(McDonald et al., 2012) investigated shoulder muscle demands in horizontal pulling and pushing activities and found that there is “a potential increase in intramuscular pressure (IMP) that occurs in abducted postures”, which is relevant for the interaction direction right (if the right hand is used).

Further, according to (Rinck and Becker, 2007), humans consider a movement of the arm away from the body an action of avoidance (i.e., unpleasant) and a movement towards the body an action of approach (i.e., pleasant). These findings are also backed by our results: especially for complex tasks, interaction in the direction left worked significantly better than right for right-handed users.
3 INPUT TECHNIQUES
This section describes the input techniques and devices used for the study reported in this paper. The 3D interaction tasks users had to perform during the study (see Section 4), the interaction space, user interface and reactivity of the digital object moved in this space were almost identical for both settings in order to allow for comparison (there is one exception we could not avoid due to the nature of the devices, which is explained in Section 4). Also, we configured the two input devices we used to both function as isometric input devices (i.e., input devices that connect the human limb and the device through force (Zhai, 2008)). The input methods, however, are fundamentally different regarding the interplay of the user's hand and the device. The most important difference can be seen in stability (DIH) vs. flexibility (HID) regarding the hand-device system.

During DIH interaction, the user's hand grabs the device and holds on to it during interaction. This can lead to better controllability, and the hand-device posture usually remains unchanged (if an input activity requires movement, e.g., for a navigation task, the hand-device system is moved altogether). During HID interaction, the device is steady while the user's hand moves. Thus, the hand-device posture constantly changes (but without losing physical contact). This can decrease controllability but it allows for more freedom regarding the hand posture itself, which is especially beneficial when the hand's mobility is limited. The participants of our study had no known motor impairments; however, we plan a second study with people with motor impairments (as discussed later) and expect substantial differences between the two groups.
3.1 DIH Interaction with a Joystick
A typical example of a DIH setting is interaction using a joystick. A joystick is usually grabbed once and then held in the hand during the interaction without major changes of hand-device posture. Many joysticks directly support this stability through their physical appearance. For instance, the Thrustmaster T.Flight Stick X USB we used for our study is designed according to the physiology of the human hand and provides a wide hand rest which prevents slipping.

Figure 1 shows a pre-test user getting familiar with the device. In general, most commercially available joysticks enable two or three Degrees of Freedom (DoF). The joystick we used in the study supports movement along three axes. Two axes are operable by pressing the stick forward, back, to the left and to the right, while movement along the z-axis can be done by stick rotation. For reasons of better comparability regarding the input process, we used a slightly different configuration during the study (relying on movement forwards, back, left and right), as explained in Section 4. Further, the joystick was configured to require force application instead of just movement (thus constituting an isometric device).
Figure 1: User interacting DIH, using a joystick.
3.2 HID Interaction with SpongeBox
While DIH interaction is widely used, HID input is less popular, at least for a larger target group and for tangible settings. As described earlier, there are some parallels between HID and touchless interaction (e.g., flexible hand movement while the device remains stable) but also differences, the most important being that tangible HID settings provide permanent physical contact and borders during the interaction process.

Thus, a user is not able to leave the device's physical interaction space (i.e., the space sensitive to user input). Further, a user receives direct or indirect haptic feedback and guidance (e.g., via the physical resistance of the device that needs to be overcome to trigger an action). While HID settings might cause users to feel less in control of the process compared to DIH interaction, they offer other potentials like, e.g., reaching additional target groups due to the hand-device posture, which can be individualized more easily (e.g., for users who have difficulties grabbing a device but also for users who prefer an alternative hand posture for other reasons).
For our study, we used the SpongeBox device prototype (Augstein et al., 2017a), which has been developed specifically for comparative studies on tangible interaction. As mentioned earlier, it has, e.g., been used for a comparative study on input devices that provide a different amount of haptics.

SpongeBox is a box with open upper and back walls while the left, right, front and bottom walls are covered with sponges. The user interacts by placing the hand inside the box (see Figure 2) and pressing it slightly against the sponges. The hand posture is thus flexible; a user can freely choose to, e.g., make a fist or keep the hand open.
Technologically, SpongeBox consists of an Arduino Uno and several pressure sensors placed under the sponges. It supports the interaction directions left, right, forward, back, down and up, and three DoF (the directions back and up are restricted to moving back to the initial position after having moved forward or down, due to the missing upper and back walls). This was intended, however: a back wall would make it impossible to place the hand comfortably inside the box and an upper wall would inhibit occasional visual control of the interacting hand.
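As a rough illustration of how such a sensor arrangement could be polled from a host computer, the following Python sketch reads one line of comma-separated pressure readings per sample over a serial connection and maps the dominant reading to a direction event once it exceeds a calibrated threshold. The serial protocol, port name, sensor order and threshold value are our assumptions for illustration; the actual SpongeBox firmware and wiring are not detailed in this paper.

    # Illustrative host-side polling loop for a SpongeBox-like device.
    # Assumptions (not from the paper): the Arduino streams one line per
    # sample, e.g. "512,3,870,14", with one value per pressure sensor.
    import serial  # pyserial

    SENSOR_ORDER = ["left", "right", "forward", "down"]  # assumed layout
    THRESHOLD = 600  # calibrated per device so light touches are ignored

    def read_directions(port="/dev/ttyACM0", baud=9600):
        """Yield the direction whose sensor currently exceeds the threshold."""
        with serial.Serial(port, baud, timeout=1) as conn:
            while True:
                line = conn.readline().decode("ascii", errors="ignore").strip()
                if not line:
                    continue
                values = [int(v) for v in line.split(",") if v.isdigit()]
                if len(values) != len(SENSOR_ORDER):
                    continue  # skip malformed samples
                pressed = max(range(len(values)), key=values.__getitem__)
                if values[pressed] >= THRESHOLD:
                    yield SENSOR_ORDER[pressed]

Calibrating THRESHOLD corresponds to the device reactivity tuning mentioned in Section 4.1.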
4 TASKS AND METRICS
Our study focuses on simple 3D interaction tasks and considers three basic metrics indicative of performance at these tasks. We rely on tasks and metrics that have been tested earlier in a study with a similar procedure but a different purpose, on the effect of haptics on interaction performance and user experience (see a more detailed general description of the metrics in (Augstein et al., 2018)).

Figure 2: User interacting HID, using SpongeBox.

Figure 3: Visualization users see during the tests (Augstein et al., 2018) (the red cube is the user's “cursor” in the interaction space, the green cube is used to indicate target areas).
To be able to compute concrete values for the metrics, the participants of our study performed identical so-called “interaction tests” with both settings. First, users see a UI with a predefined 3D interaction space and an interactive digital object acting as their “cursor”, visualized as the red cube depicted in Figure 3. The interactive object can be moved along two dimensions and in several directions, starting from the initial position in the center of the space.

The setting in our study allows for movement of the interactive object in the directions left (and back to the initial position), right (and back to the initial position), forward (and back to the initial position) and down (and back up to the initial position). Some tasks require a user to perform simple, one-directional movements while others require more complex movements involving several directions and dimensions.
Figure 4 shows the two input settings HID (a.)
and DIH (b.) and the DoF as relevant for the study.
Figure 4: Input devices and methods and the two to three
DoF used by the interaction tasks during the study.
Further, the figure shows the effects of input activities with the two settings on the interactive object (upper part of the figure). E.g., the horizontal movement between left and right is denoted as the first DoF in the figure, and both moving the hand within the SpongeBox device to the left or right and moving the joystick to the left or right will result in a position change of the digital object (moving it to the left or right).

Regarding the second DoF, moving the hand/joystick forward and back to the start position will cause the digital object to move forward in the interaction space and back to the start position. These movements are well comparable among the settings and cause identical effects regarding the change in the digital space.
The down-movement of the interactive digital object is, however, triggered differently with the two settings due to the nature of the devices. The related results are thus not as well comparable as the rest and will only be reported for purposes of completeness in Section 5.4. With the HID setting, the user's physical activity (pressing down) results in the matching reaction in the digital environment (the cube moves down). With the DIH setting this was not possible as the z-axis is addressed by rotation of the stick. This is a common solution with joysticks; however, it differs significantly from the input activity that addresses the z-axis with the HID setting.

In order not to introduce a completely different input activity (rotation) to trigger the down-movement, we decided to implement it by moving the joystick backwards (from the default position) and then forwards to reach the start position again.
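A minimal sketch of this remapping, under the assumption that the joystick delivers a normalized y-axis value in [-1, 1] with forward being positive (for the isometric configuration this value reflects applied force rather than travel); the axis convention and dead zone are ours, not taken from the study software:

    # Hypothetical mapping of a normalized joystick y-axis (-1..1) to the
    # cube's movement, mirroring the study configuration: pushing the stick
    # forward moves the cube forward, pulling it back (instead of rotating
    # the stick) moves the cube down.
    DEADZONE = 0.1  # assumed threshold against unintended drift

    def map_y_axis(y: float) -> tuple[str, float]:
        """Return (direction, magnitude) for a y-axis value."""
        if y > DEADZONE:
            return ("forward", y)
        if y < -DEADZONE:
            return ("down", -y)  # backwards deflection replaces stick rotation
        return ("idle", 0.0)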
The following sections describe in more detail the
concrete 3D interaction tasks the study participants
were asked to do and the related metrics indicative
of interaction performance.
4.1 Reach
The first task requires a user to move the interactive cube to the personal maximum comfortable position in the directions left, right, forward and down. The related Reach metric thus describes the maximum distance from the starting point that can be comfortably reached by a user. The directions are tested separately, each starting from the initial position in the center of the interaction space. The related metrics are stored as ReachLeft, ReachRight, ReachForward and ReachDown. In addition to the individual values for the relevant directions, we compute an aggregated Reach result which averages over all directions.

The metric was chosen because the mobility and strength of a user's dominant hand and comfort during movement are highly individual. They could be significantly reduced for people with motor impairments but also differ due to personal preferences. We however expect the results to be similarly good for all participants here (all without known impairments).
In the HID setting, users are required to press the hand against the sponges in all directions to move the cube in the respective direction. In the DIH setting, the joystick has to be moved forwards, left, right and back (for the down movement of the interactive object), to the respective personal maximum comfortable position. The joystick we used for the study can be configured regarding its physical resistance, which is why we did a thorough pre-test before the actual user study to find a configuration where resistance is neither perceived as too high (i.e., physically demanding) nor as too low (i.e., prone to unintended interaction). Likewise, we also calibrated SpongeBox regarding the amount of physical pressure needed to trigger an input activity and device reactivity. Thus, for the study, the amount of physical effort demanded by the tasks was about equal for both settings.
The values for Reach are stored in percent of the system's global maximum (the interactive cube cannot be moved out of the interaction space; once it has reached the maximum position, the related Reach is 100% and the object stops there). The task is similar to others used in related studies comparing different interaction settings (e.g., the one described in (Tscharn et al., 2016)).
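A compact sketch of how the per-direction Reach values and the aggregated Reach could be derived from logged cursor positions; the log format and the size of the interaction space are assumptions, while the percent-of-global-maximum scaling and the clamping at 100% follow the description above.

    # Illustrative Reach computation. Assumption: each trial log is a list
    # of signed positions along the tested axis, in interaction-space units,
    # with AXIS_MAX the distance from the center to the space boundary.
    AXIS_MAX = 100.0  # assumed extent of the interaction space per direction

    def reach(trial_positions: list[float]) -> float:
        """Maximum comfortable distance as percent of the global maximum."""
        farthest = max(abs(p) for p in trial_positions)
        return min(farthest / AXIS_MAX, 1.0) * 100.0  # cube stops at 100%

    def aggregated_reach(per_direction: dict[str, float]) -> float:
        """Average Reach over the tested directions (left/right/forward/down)."""
        return sum(per_direction.values()) / len(per_direction)

    # Example: ReachLeft from one trial and the aggregate over all directions.
    print(reach([0.0, -12.5, -64.0, -98.0]))          # -> 98.0
    print(aggregated_reach({"left": 98.0, "right": 100.0,
                            "forward": 100.0, "down": 96.0}))  # -> 98.5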
4.2 Regularity
The logs of the Reach tasks are used to compute an additional metric called Regularity. Regularity describes how straight and direct a user's path is between start and end point. Regularity is again stored
for all interaction directions: RegularityLeft, RegularityRight, RegularityForward, RegularityDown and Regularity (i.e., an aggregated value averaging over these directions).

To compute the results, the system analyzes deviations from the straightest path between the initial position and the user's maximum position in each direction. The metric is again measured in percent; a straight path would result in a Regularity of 100%. The path is analyzed at every time stamp between initial and end position; the deviation from the straight path is then averaged over all time stamps and subtracted from an initial value of 100%.
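The description above leaves the exact normalization open, so the following Python sketch is one plausible reading of the metric: at every logged time stamp, the perpendicular deviation of the cursor from the straight line between start and end point is taken relative to the length of that line, the relative deviations are averaged over all time stamps, and the mean is subtracted from 100%.

    import math

    def regularity(path: list[tuple[float, float]]) -> float:
        """Regularity in percent: 100 minus the mean relative deviation of the
        logged path from the straight line between its start and end point.
        Normalizing by the line length is our assumption; the paper only
        states that deviations are averaged and subtracted from 100%."""
        (x0, y0), (x1, y1) = path[0], path[-1]
        length = math.hypot(x1 - x0, y1 - y0)
        deviations = []
        for (px, py) in path[1:-1]:  # interior time stamps
            # perpendicular distance from (px, py) to the start-end line
            d = abs((x1 - x0) * (y0 - py) - (x0 - px) * (y1 - y0)) / length
            deviations.append(d / length)  # relative to the straight path
        mean_dev = sum(deviations) / len(deviations) if deviations else 0.0
        return max(0.0, 100.0 * (1.0 - mean_dev))

    # A perfectly straight path scores 100%:
    print(regularity([(0, 0), (1, 0), (2, 0), (3, 0)]))  # -> 100.0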
4.3 ContinuousRegularity
The ContinuousRegularity task, which shares some similarities with the “rotation navigation task” of (Tscharn et al., 2016), requires the user to follow a green target cube (as depicted in Figure 3) over a path that reaches all relevant areas of the 3D interaction space and that includes movement in all directions described earlier. The path starts at the initial position in the center of the interaction space.

ContinuousRegularity measures how straight and interruption-free a complex path (covering several dimensions and directions) is. The computation matches the one of Regularity algorithmically. However, here the user is required to perform a continuous movement that covers all relevant directions, whereas Regularity is tested for every direction individually (with a break between the tests for the individual directions).
Based on the ContinuousRegularity task, we compute the following metrics: ContinuousRegularity, which is an aggregated result for the related task, ContinuousRegularityLeftDown, ContinuousRegularityLeftForward, ContinuousRegularityRightDown, ContinuousRegularityRightForward, ContinuousRegularityLeft (which aggregates the metrics that involve the direction left) and ContinuousRegularityRight (which aggregates the metrics involving right). The latter two metrics are stored individually to be better able to analyze preferred interaction directions (as also targeted by our hypothesis H5, see Section 5.1). We did not compare the directions forward and down as these are not fully comparable due to differing input activities, as explained earlier and depicted in Figure 4.
All metrics just described focus on the “quality” of a user's interaction related to simple 3D interaction tasks. To gain additional information, we also measured the average time needed for each task with the HID and DIH settings but did not find any notable differences, which is why we do not report it in detail in Section 5.4.
5 USER STUDY
This section describes the research questions, methodology, participants and results of our comparative user study on HID and DIH settings. As described above, we conducted a pre-test before the actual study in order to configure the devices and ensure comparability.
5.1 Research Questions
The user study aimed at analyzing two aspects related to i) the users' interaction performance with DIH vs. HID interaction settings and ii) the users' better-performing interaction direction (left or right). The first aspect contributes to better understanding tangible interaction and related supporting and inhibiting factors; the second aspect contributes to better understanding users' non-individual interaction prerequisites and preferences.

Accordingly, we formulated hypotheses in order to investigate the aforementioned aspects of our research questions. They are listed below, followed by a discussion of how we arrived at them.
H1: We expect users' Reach to be about equal with DIH and HID interaction.

H2: We expect users' Reach to be high (≥ 95%) for both settings and for all directions.

H3: We expect users' Regularity to be better with DIH interaction, as we believe controllability to be higher when the user is able to grab the device and hold on to it.

H4: We expect users' ContinuousRegularity to be better with DIH interaction, again for reasons of better controllability when holding on to the device.

H5: We expect users to be better able to control their movement in the direction towards the body than away from the body (in our study, all participants were right-handed and used their dominant hand, thus the direction left is towards the body while right is away from the body for all users).
Regarding H1 and H2, we believe that for users without motor impairments (such as the participants of our study), the interaction space used for the study is well coverable. We included the related metrics even though we did not expect significant insights here, as we plan to do a subsequent analysis with people with motor impairments and aim at comparing the results.
Regarding H3 and H4, we believe that the stable hand-device posture and related better controllability lead to better regularity performance for both single-direction tasks (Regularity) and continuous movement tasks (ContinuousRegularity).

Regarding H5, we found two interesting aspects in related literature that led to this hypothesis. First, (Rinck and Becker, 2007), e.g., describe a movement of an arm away from oneself as an action of “avoidance” (pushing unpleasant objects away) and a movement towards the body as an action of “approach” (pulling pleasant objects closer). Second, (Strasser and Müller, 1999) analyzed favorable movements of the hand-arm system (physiologically assessed by electromyographic investigation and additionally rated subjectively by the participants). They found out that the participants (who had to handle light weights) could perform a movement towards the body best, while it became more uncomfortable the further it got away from the body. These findings are also confirmed by (McDonald et al., 2012), who analyzed muscle demands in horizontal pushing (away from the body) and pulling (towards the body).
5.2 Procedure and Methodology
The study followed a within-subjects design and took place in a controlled lab setting. The participants did all tasks described earlier with both settings. We counterbalanced the order in which the settings were presented to the participants to prevent a bias related to practicing effects. Further, participants could try the devices and their handling for as long as they wished, so that the actual tests did not start before users felt ready. Also, all participants received a short introduction by a test supervisor who explained the input devices and techniques to them. The test supervisor was present during the full duration of the tests to be able to help in case of technical problems, to switch between the tasks and to set up the input devices for the users. The results of the tests were automatically recorded and analyzed using the framework described in (Augstein et al., 2017b).
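For two settings, such counterbalancing can be as simple as alternating the presentation order across consecutive participants; a minimal sketch of this assignment scheme (our illustration, not the study's actual tooling):

    # Alternate the setting order so half of the participants start with
    # HID and the other half with DIH.
    def setting_order(participant_index: int) -> list[str]:
        return ["HID", "DIH"] if participant_index % 2 == 0 else ["DIH", "HID"]

    print([setting_order(i) for i in range(4)])
    # [['HID', 'DIH'], ['DIH', 'HID'], ['HID', 'DIH'], ['DIH', 'HID']]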
5.3 Participants
We recruited 24 volunteers aged between 20 and 49 (M=26.75, SD=8.38), 14 of them female. None of the participants had previous experience with the interaction tasks or the concrete devices used (although some had generally used a joystick before). They were recruited via email as well as direct invitations. All were university students or staff and generally had high media skills related to input techniques and devices.
5.4 Results
This section discusses the results of the study based on the metrics related to Reach, Regularity, and ContinuousRegularity. The results for the metrics for all directions are listed in Table 1; Table 2 shows the aggregated results. To statistically analyze the respective differences between the HID and DIH settings, we conducted a T-Test for related samples (see Tables 1 and 2). The T-Test assumes data to be normally distributed, which is violated by a small subset of our metrics. Thus, we additionally conducted a Wilcoxon test for two related samples, which does not presume normal distribution. It confirmed the results of the T-Test in all cases and suggested statistical significance for the same comparisons. As the T-Test is generally more reliable in the identification of statistical significance, we report the results of the T-Test here and used the Wilcoxon results for confirmation.
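This two-step procedure (a paired T-Test confirmed by a Wilcoxon signed-rank test) can be reproduced with standard tooling; below is a sketch using SciPy on made-up paired samples, as the study's raw data are not published in the paper:

    # Paired comparison of one metric between the HID and DIH settings:
    # a paired t-test, confirmed by a Wilcoxon signed-rank test that does
    # not assume normality. The sample values are invented for illustration.
    from scipy import stats

    hid = [93.6, 88.2, 95.1, 71.4, 90.3, 84.7, 92.0, 86.5]
    dih = [73.6, 79.9, 81.2, 70.1, 75.4, 77.8, 80.3, 74.9]

    t, p_t = stats.ttest_rel(hid, dih)
    w, p_w = stats.wilcoxon(hid, dih)
    print(f"paired t-test:  t = {t:.3f}, p = {p_t:.3f}")
    print(f"Wilcoxon test:  W = {w:.1f}, p = {p_w:.3f}")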
Additionally, we conducted a comparison of users' results with the same setting related to the directions left and right. This comparison was mainly aimed at the confirmation of hypothesis H5. The results of this direction-related comparison are reported in Table 3, where we list the outcome of a T-Test. We again subsequently conducted a Wilcoxon test which confirmed all occurrences of statistical significance.
5.5 Reach
As expected, the results show little variance for all Reach-related metrics and were about equally high (i.e., close to 100%) with HID and DIH interaction. As described earlier, we believe Reach to be of high relevance especially in cases where users' hand mobility is limited. The results confirm that there is no general barrier limiting the user's personal interaction space with either of the settings (which would lower the quality of the user's interaction with the system and also negatively affect the results for other metrics).
5.6 Regularity
For Regularity we found significantly different results in two cases (see Table 1). Users gained significantly better results for RegularityDown with the DIH (i.e., joystick) setting. This finding is of limited reliability, however, as the direction down is not fully comparable for the two settings, as described earlier.

Further, users gained significantly better results for RegularityLeft with the HID (i.e., SpongeBox) setting. For the other comparisons (i.e., RegularityForward and RegularityRight), the differences were not statistically significant.
Table 1: Metrics and the computed values (all in percent) averaged for the 23 participants. The results are compared for the hand-in-device and device-in-hand settings. Values in the sig column denote statistical significance on the 0.05 (*) or 0.001 (**) level.

    Metric                             Setting    mean   stdev        t   df    sig
    ReachDown                          HID       100.0    0.00    1.282   22   .213
    ReachDown                          DIH       98.55    5.42
    ReachForward                       HID       100.0    0.00    . . .
    ReachForward                       DIH       100.0    0.00
    ReachLeft                          HID       100.0    0.00    1.367   22   .186
    ReachLeft                          DIH       97.83    7.63
    ReachRight                         HID       98.19    4.99    -.463   22   .648
    ReachRight                         DIH       98.19    5.21
    RegularityDown                     HID       53.22   43.21   -2.770   22   .011*
    RegularityDown                     DIH       75.32   20.88
    RegularityForward                  HID       82.11   33.65    1.848   22   .078
    RegularityForward                  DIH       68.33   25.90
    RegularityLeft                     HID       93.64   14.83    3.328   22   .003*
    RegularityLeft                     DIH       73.56   24.51
    RegularityRight                    HID       70.93   35.32    -.167   22   .869
    RegularityRight                    DIH       72.30   26.83
    ContinuousRegularityLeftForward    HID       81.96    7.56   -2.969   22   .007*
    ContinuousRegularityLeftForward    DIH       87.14    3.21
    ContinuousRegularityLeftDown       HID       87.23    5.38   -1.105   22   .281
    ContinuousRegularityLeftDown       DIH       89.00    5.50
    ContinuousRegularityRightForward   HID       74.72    5.01   -4.955   22   .000**
    ContinuousRegularityRightForward   DIH       81.05    4.25
    ContinuousRegularityRightDown      HID       83.05   12.17   -1.635   22   .116
    ContinuousRegularityRightDown      DIH       87.69    5.63
Table 2: Aggregated metrics and the computed values (all in percent) averaged for the 23 participants. The results are compared for the hand-in-device and device-in-hand settings. Values in the sig column marked with (*) denote statistical significance on the 0.05 level.

    Metric                      Setting    mean   stdev        t   df    sig
    Reach                       HID       99.55     .26     .914   22   .370
    Reach                       DIH       98.82     .72
    Regularity                  HID       74.97   21.57     .566   22   .577
    Regularity                  DIH       72.38   18.50
    ContinuousRegularityLeft    HID       84.59    6.33   -2.309   22   .031*
    ContinuousRegularityLeft    DIH       88.07    3.49
    ContinuousRegularityRight   HID       78.88    7.36   -3.203   22   .004*
    ContinuousRegularityRight   DIH       84.37    3.95
    ContinuousRegularity        HID       81.74    6.66   -2.897   22   .008*
    ContinuousRegularity        DIH       86.22    3.39
Table 3: Comparison between the interaction directions left and right including all relevant metrics (all values in percent). The results are listed for the hand-in-device and device-in-hand settings. Values in the sig column denote statistical significance on the 0.05 (*) or 0.001 (**) level.

    Metric                             Setting    mean   stdev        t   df    sig
    ReachLeft                          HID       100.0     .00    1.738   22   .096
    ReachRight                         HID       98.19    5.00
    ReachLeft                          DIH       97.83    7.63   -1.367   22   .186
    ReachRight                         DIH       98.91    5.21
    RegularityLeft                     HID       93.64   14.83    3.101   22   .005*
    RegularityRight                    HID       70.93   35.32
    RegularityLeft                     DIH       73.56   24.51     .200   22   .844
    RegularityRight                    DIH       72.30   26.83
    ContinuousRegularityLeftForward    HID       81.96    7.56    5.626   22   .000**
    ContinuousRegularityRightForward   HID       74.72    5.01
    ContinuousRegularityLeftForward    DIH       87.14    3.21    6.069   22   .000**
    ContinuousRegularityRightForward   DIH       81.05    4.25
    ContinuousRegularityLeftDown       HID       87.23    5.38    2.516   22   .02*
    ContinuousRegularityRightDown      HID       83.05   12.17
    ContinuousRegularityLeftDown       DIH       89.00    5.50    2.580   22   .017*
    ContinuousRegularityRightDown      DIH       87.69    5.63
    ContinuousRegularityLeft           HID       84.59    6.33    8.241   22   .000**
    ContinuousRegularityRight          HID       78.88    7.36
    ContinuousRegularityLeft           DIH       88.07    3.49    5.760   22   .000**
    ContinuousRegularityRight          DIH       84.37    3.95
Regarding the aggregated Regularity metric as reported in Table 2, the differences were not statistically significant, which generally was a bit surprising for us as we had expected users to be better able to control their movement with the DIH (i.e., joystick) setting. Regarding the comparison of the interaction directions left and right, however, the tests did reveal significantly better results for the direction left with the HID setting, compared to right (for the DIH setting the difference was not significant).
5.7 ContinuousRegularity
Regarding ContinuousRegularity, the results were generally more conclusive (see Table 1). We found significant differences in the results for the metrics ContinuousRegularityLeftForward and ContinuousRegularityRightForward (for the latter, the differences were statistically highly significant). In both cases the users performed better with the DIH setting (joystick), which confirmed our expectations.

The results for ContinuousRegularityLeftDown and ContinuousRegularityRightDown were not significantly different, as shown in Table 1 (however, in this case the comparability is generally limited anyway due to the strong influence of the down direction). Regarding the aggregated results, users gained significantly better results with the DIH setting (joystick).
Regarding the interaction directions left and right, we found statistically significant differences for all comparisons (see Table 3). For ContinuousRegularityLeftForward and ContinuousRegularityRightForward we found the direction left to be statistically highly significantly better than right for both settings. Also for the comparison of ContinuousRegularityLeftDown and ContinuousRegularityRightDown, left was significantly better than right for both settings.

Here, the results are reliable although the direction down is involved, as the device (and thus also the input activity) remained the same within a comparison. The comparison of the aggregated ContinuousRegularityLeft and ContinuousRegularityRight results shows a statistically highly significant difference for the HID as well as the DIH setting (with left outperforming right in all cases).
5.8 Findings Related to the Hypotheses
We summarize the results regarding our hypotheses
as follows. The study confirmed H1 and H2 related
to users' Reach, as the results were not notably different for HID and DIH and about equally high (≥ 95%) for all interaction directions with both settings. Inspecting the raw data, we found that over all users, settings and directions, only 8 of 184 (23*2*4) reported values were slightly lower.
H3 had to be rejected as we did not find significant differences between HID and DIH for Regularity. In fact, we found significantly better results for DIH interaction only for RegularityDown, which is however of limited reliability. Unexpectedly, we found significantly better results with HID interaction for RegularityLeft, compared to DIH. For the rest of the comparisons there were no significant differences.
We could confirm H4, as all highly reliable results (i.e., for the comparison of ContinuousRegularityLeftForward and ContinuousRegularityRightForward), as well as those for the aggregated metrics ContinuousRegularityLeft, ContinuousRegularityRight and ContinuousRegularity, indicated a significantly better performance of users with the DIH setting, compared to the HID setting. The results for ContinuousRegularityLeftDown and ContinuousRegularityRightDown (which might be biased due to the involvement of the direction down) were better with the DIH setting as well for the majority of the users; however, the difference was not statistically significant.
H5 was confirmed, as all comparisons except for the RegularityLeft metric in the DIH setting revealed statistically significantly or even highly significantly better results for the direction left.
6 DISCUSSION
In this paper we have compared users' interaction performance with isometric HID and DIH settings for simple 3D interaction tasks. Besides analyzing differences in interaction performance, we aimed at investigating differences between the interaction directions left and right for both settings, HID and DIH.

The study revealed that there are no significant differences between HID and DIH settings regarding users' Reach. This might lead to the assumption that the Reach metric is of limited relevance generally, which we however want to argue against.
As mentioned earlier, we believe that information about users' Reach is of utmost importance, as a limited Reach can lead to a barrier that might inhibit users from interacting with predefined settings at all. We thus believe that especially when designing accessible UIs (including input devices and methods), a user's Reach should be analyzed and used to individually configure the UI in case it is considerably reduced for whatever reason. This was also confirmed by a user test with a small number of people with motor impairments we conducted earlier (Augstein et al., 2017a). Except for SpongeBox, this test used different devices and was intended to reveal insights regarding the importance of haptics for the target group. Thus the results are not comparable to those presented in this paper, but they generally suggest that Reach differs more considerably among people with impairments.
Our study further revealed that regarding Regularity, the differences between HID and DIH settings were less conclusive than expected. For most related metrics, the differences between the results with the two settings were either not statistically significant or even significantly better with the HID setting (RegularityLeft). From these findings we derive that for simple one-directional interaction tasks, users' Regularity is less dependent on a stable hand-device posture than expected.
Regarding ContinuousRegularity, where users had to perform a more complex interaction task covering several interaction dimensions and directions, the study has shown that a stable hand-device posture actually had a positive impact on the users' performance. This was confirmed by the statistically significantly better results for all reliable and all aggregated ContinuousRegularity metrics with the DIH setting. We thus conclude that the positive influence of a certain hand-device stability is strongly dependent on the complexity of the interaction task (increasing with increasing task complexity).
Regarding the comparison of performance in the interaction directions left and right, we derived our expectations from related research on favorable movements due to positive and negative associations as well as physiological prerequisites.
Our literature research led to the assumption that for right-handed users, left is the better-performing interaction direction. The results of our study confirm this assumption, especially with increasing task complexity. Similarly, we assume that right is the better-performing direction for left-hand interaction, which we however could not confirm as all of our participants were right-handed and used their dominant hand for interaction.
Our study did not reveal cases where the HID setting seemed to be generally preferable over the DIH setting. We found little evidence for an actual benefit people without known impairments might gain from HID interaction. Thus we believe that most users without impairments will prefer DIH over HID.
Limitations of our work reported in this paper are
discussed as follows. First, our findings are restricted
to relatively simple 3D interaction tasks and to isometric input devices. Our results have shown that task complexity influences several relevant aspects of interaction performance; thus, for even more complex tasks (especially those involving more DoF), our findings might be less reliable. We hence recommend analyzing interaction performance anew in case the findings should be generalized to more than three DoF.
Further, as explained earlier, the results related to the interaction direction down are less reliable due to the limited comparability of the related input activities. We reported these results for reasons of completeness in spite of this limitation but recommend relying only on those we suggested as conclusive.
Another limitation is that in our user study, the interaction directions up and back were restricted to moving back to the initial position after having moved forwards or down.
Future work could include repeating the study with a different target group. We believe that the HID setting bears higher potential for people with limited hand mobility than for users without motor impairments affecting the interacting hand. Further, we believe that a similar study with devices offering more DoF will lead to interesting insights and expect it to confirm the trend regarding task complexity.
REFERENCES
Augstein, M., Neumayr, T., and Burger, T. (2017a). The Role of Haptics in User Input for People with Motor and Cognitive Impairments. In Proceedings of the 2017 AAATE Conference, Sheffield, UK.

Augstein, M., Neumayr, T., Kern, D., Kurschl, W., Altmann, J., and Burger, T. (2017b). An Analysis and Modeling Framework for Personalized Interaction. In IUI 2017 Companion: Proceedings of the 22nd International Conference on Intelligent User Interfaces, Limassol, Cyprus.

Augstein, M., Neumayr, T., Vrecer, S., Kurschl, W., and Altmann, J. (2018). The role of haptics in user input for simple 3D interaction tasks: an analysis of interaction performance and user experience. In Proceedings of the 2nd International Conference on Human Computer Interaction Theory and Applications, Funchal, Madeira, Portugal.

Bau, O., Poupyrev, I., Israr, A., and Harrison, C. (2010). TeslaTouch: electrovibration for touch surfaces. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology, pages 283–292. ACM.

Carter, T., Seah, S. A., Long, B., Drinkwater, B., and Subramanian, S. (2013). UltraHaptics: multi-point mid-air haptic feedback for touch surfaces. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, pages 505–514. ACM.

Ciesla, C., Yairi, M., and Saal, N. (2013). User interface system. US Patent 8,456,438.

Dangeti, S., Chen, Y. V., and Zheng, C. (2016). Comparing bare-hand-in-air gesture and object-in-hand tangible user interaction for navigation of 3D objects in modeling. In Proceedings of TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction, pages 417–421. ACM.

Kincaid, R. (2012). Tactile guides for touch screen controls. In Proceedings of the 26th Annual BCS Interaction Specialist Group Conference on People and Computers, BCS-HCI '12, pages 339–344, Swinton, UK. British Computer Society.

McDonald, A., Picco, B. R., Belbeck, A. L., Chow, A. Y., and Dickerson, C. R. (2012). Spatial dependency of shoulder muscle demands in horizontal pushing and pulling. Applied Ergonomics, 43(6):971–978.

Pfeiffer, M., Schneegass, S., Alt, F., and Rohs, M. (2014). Let me grab this: a comparison of EMS and vibration for haptic feedback in free-hand interaction. In Proceedings of the 5th Augmented Human International Conference. ACM.

Rinck, M. and Becker, E. S. (2007). Approach and avoidance in fear of spiders. Journal of Behavior Therapy and Experimental Psychiatry, 38(2):105–120.

Sato, M., Poupyrev, I., and Harrison, C. (2012). Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 483–492. ACM.

Shen, Y., Ong, S.-K., and Nee, A. Y. (2011). Vision-based hand interaction in augmented reality environment. International Journal of Human–Computer Interaction, 27(6):523–544.

Stannus, S., Rolf, D., Lucieer, A., and Chinthammit, W. (2011). Gestural navigation in Google Earth. In Proceedings of the 23rd Australian Computer-Human Interaction Conference, pages 269–272. ACM.

Strasser, H., Keller, E., Müller, K.-W., and Ernst, J. (1989). Local muscular strain dependent on the direction of horizontal arm movements. Ergonomics, 32(7):899–910.

Strasser, H. and Müller, K.-W. (1999). Favorable movements of the hand-arm system in the horizontal plane assessed by electromyographic investigations and subjective rating. International Journal of Industrial Ergonomics, 23(4):339–347.

Tscharn, R., Schaper, P., Sauerstein, J., Steinke, S., Stiersdorfer, S., Scheller, C., and Huynh, H. T. (2016). User Experience of 3D Map Navigation – Bare-Hand Interaction or Touchable Device? In Mensch und Computer 2016. GI.

Zhai, S. (2008). Human Performance in Six Degree of Freedom Input Control. PhD thesis, University of Toronto.