A User-centred AI-based Assistance System to Encounter Pandemics in
Clinical Environments: A Concept Overview

Christian Wiede (1), Roman Seidel (2), Carolin Wuerich (1), Damir Haskovic (3), Gangolf Hirtz (2) and Anton Grabmaier (1)

(1) Fraunhofer IMS, Finkenstrasse 61, Duisburg, Germany
(2) Chemnitz University of Technology, Reichenhainer Strasse 70, Chemnitz, Germany
(3) MINDS & SPARKS GmbH, Gumpendorfer Strasse 73/17, Vienna, Austria
Keywords: Artificial Intelligence, Assistance Systems, Clinical Environments, Robotics, Optical Sensors.

Abstract: The current coronavirus pandemic has highlighted the need for enhanced digital technologies to provide high-quality care to patients in hospitals while protecting the health and safety of the medical staff. Further waves of the coronavirus pandemic can be expected, and preparations for future pandemics must be made. In order to close this emerging gap, we propose a concept aimed at boosting the adoption of AI- and robotics-related technologies to ensure sustainable, patient-centred care in hospitals. The planned assistance system will provide continuous and safe monitoring of patients in the whole hospital environment from the entrance to the ward, including data security and protection. The benefits consist of the fast detection of possibly infected persons, the continuous monitoring of patients, support by robots to reduce physical contact during epidemics, and automatic disinfection by robots. In addition to the technical challenges, the medical, social and economic challenges for such an assistance system are discussed.
1 INTRODUCTION
The COVID-19 pandemic is affecting all continents, including Europe. Globalisation and increasingly interconnected economies mean that most countries are, and will be, affected by COVID-19 and its consequences. A global effort is therefore required to effectively break the chains of virus transmission in the population, while protecting the health of frontline workers, particularly health professionals, from infection.
Robotics, sensor technology and AI assistance systems can significantly reduce the risk of infectious disease transmission to frontline healthcare workers by making it possible to triage, evaluate, monitor and treat patients from a safe distance, without the risk of contagion. In addition, robots, sensor technologies and AI assistance systems have the potential to be deployed for disinfection, delivering medications and food, measuring vital signs, and supporting health professionals in various tasks.
As the epidemic has escalated and a great number of health professionals have been infected by the virus during their work, the benefits of robotics have become increasingly clear.
We intend to develop an assistance system in or-
der to improve healthcare uptake and treatment of pa-
tients during pandemics (e.g. COVID-19). The sys-
tem enables an increased speed and effectiveness of
diagnoses, treatment and monitoring of patients, and
thus protects patients, caregivers and healthcare pro-
fessionals in hospital settings. The objectives com-
prise a continuous, contactless monitoring of patients
and medical staff through optical sensors, a secure
hospital infrastructure and assistance robots to sup-
port caregivers. To achieve our ambitious objectives,
we present each step required to integrate such an as-
sistance system within a hospital ecosystem, whereby
an interdisciplinary team with partners from industry,
science and healthcare is necessary.
In order to outline our findings, this paper is struc-
tured as follows: Sect. 2 provides a system overview
of all aspects of the proposed solution. In Sect. 3 the
secure infrastructure in the hospital is described. This
is followed by the description of the tracking of patients and medical staff by means of smart sensors in Sect. 4. Sect. 5 focuses on the contactless extraction of patients’ vital parameters. Sect. 6 covers the use of telepresence robots. All technical details and other factors are discussed in Sect. 7. Conclusions and further actions are presented in Sect. 8.
2 SYSTEM OVERVIEW
The global objective of the assistance system is to im-
prove healthcare uptake and treatment of patients dur-
ing pandemics (e.g., COVID-19) and beyond in hos-
pital settings for patients, caregivers and healthcare
professionals by increasing the speed and effective-
ness of diagnoses, treatment and monitoring of pa-
tients. This concept is visualised in Figure 1 and rep-
resents an assistance system in the clinical environ-
ment to improve treatment and care for patients, en-
hance the work balance of medical staff and improve
logistics while reducing costs.
2.1 Objectives of System Concept
Our system concept is divided into two technical ob-
jectives, which are:
1. Fast Triage and Continuous Monitoring: of potentially infected patients in hospitals during pandemics, using optical sensors to measure vital signs automatically in the entrance and emergency rooms for the fast detection of infected people, and the contactless continuous monitoring of patients in patient rooms and corridors for automatic health status determination and pathway tracking.
2. Protection of Patients, Caregivers and Infrastructure in Hospitals: during pandemics by deploying a contact map that detects patients and clinical staff, in order to track and trace contacts of persons in the hospital using optical sensors, to isolate infected persons and to contain superspreading events.
Furthermore, the system concept envisages the deployment of assistance robots for automatic disinfection and for taking over work tasks of the clinical staff, thereby reducing contact between patients and caregivers, as well as the assurance of the system’s inviolability against cyberattacks through redundancy in the systems and stable firewalls.
This is accompanied by objectives for rapid and
effective conversion of hospital processes and struc-
tures in the case of epidemics and the establishment of
a reliable, quantifiable and large-scale AI assistance
platform for deployment in European hospitals.
2.2 Related Technologies
Autonomous monitoring of the health status is a crucial task during pandemics. We aim at the fast detection of possibly infected persons at the hospital entrance for isolation, and at the continuous monitoring of those already infected. The measurement will be automatic, contactless to prevent contamination, and fast for high throughput. Both tasks can be realised by monitoring vital parameters by means of optical sensors. The literature shows that there is research on determining the heart rate (Poh et al., 2010; Verkruysse et al., 2008) and the respiration rate (Wiede et al., 2017; Lukac et al., 2014) remotely in various application fields. However, none of these works targets patient recovery monitoring or the triage of patients, which place different demands on the algorithms (see Sect. 5.1 and 5.2). Monitoring the respiration rate has turned out to be particularly advantageous due to the fact that COVID-19 patients suffer from shortness of
breath. To date, the remote, contactless determination
of patients’ vital parameters has not yet been part of a
large-scale clinical survey for pandemics comparable
to the one planned in this project concept. Therefore,
we have chosen this as a significant focus area of this project, where vital parameter monitoring will be performed on 2D images captured by 3D smart sensors.
Furthermore, emergency situations such as falls
and circulatory collapses have to be detected imme-
diately. If such emergency cases are not immedi-
ately recognised, it is conceivable that the patients’
health would suffer and that the care costs would in-
crease. In contrast to the approach using wearable
sensors for fall detection by (Ferrari et al., 2012), the
authors of (Rantz et al., 2013) investigated an image-
based method that detects human falls through anal-
ysis of a depth map. This depth map can be par-
tially rendered as a 3D image without the full rendi-
tion (colour / brightness) of the scene. The proposed
method was tested successfully in multiple hospitals
(TRL 7) (Rantz et al., 2013). A fall detection method adapted to fisheye images (Seidel et al., 2020) is integrated into our system concept and works without physical contact. Furthermore, accurate tracking of persons is guaranteed while respecting privacy.
Another aspect of our system is the consideration
to use robots. Robotic solutions for hospitals and in
the healthcare ecosystem can be organised into cate-
gories depending on their application and use cases
(Bodenhagen et al., 2019). Telepresence robots are
Figure 1: Overview of the technical assistance system with 3D sensors, a robotic platform and a data integration unit. The
system can be integrated into a hospital ward. As an example, different use cases are shown in the patient rooms and in the corridor.
remote-controlled robotic devices that enable a per-
son - such as a health professional - to be virtually
present, to interact and socially engage with patients
remotely and to physically manipulate the robot and
its immediate surroundings. This way, the user is able
to maintain a large degree of control over their vir-
tual presence and interactions with people and objects
in that physical space. Potential use cases envisaged
within the project include reducing physical interac-
tions between patients and caregivers.
There are existing studies of telepresence robots
supporting elderly people and patients that demonstrate a strong, positive influence on the interactions, with a manageable overall level of care delivered to patients (Cesta et al., 2016; Gonzalez-Jimenez et al., 2013; Lee et al., 2017; Niemelä et al., 2019).
Especially in the COVID-19 pandemic, patients
suffered from isolation effects. Therefore, existing
robots will be integrated for telepresence purposes, to
guide and assist patients in hospitals. Telepresence
robots effectively and accurately monitor the health
status of the patients, can alert carers where specific
problems are arising and are able to intervene where
possible and necessary.
They are also able to actively remind or “nudge”
patients to keep an appointment or to take their re-
quired medicine. Moreover, the robots will be capa-
ble of physical transportation tasks such as movement
of physical objects (meals, medical devices) or physically guiding patients and their families from one location to another within the hospital environment. Furthermore, the robots can perform automatic disinfection of areas in the hospital by using ultraviolet (UV) light. Data from robots, 3D
sensors and other Hospital Information System (HIS)
related sources will be fused within a data integration
unit, analysed and interpreted by the hospital staff.
The data is stored in dedicated servers hosted in the
hospital and analysed by an AI module designed to
identify further emergency situations requiring deci-
sions and intervention.
3 SECURE INFRASTRUCTURE
Since the collected data contains a huge amount of
sensitive information from different sensors, the data
integration into the HIS needs to meet certain require-
ments regarding the data management infrastructure,
privacy and security. Secure communication, computation, data management and standardisation are essential for a reliable AI assistance platform in hospitals. A distributed system design is required to support all hospital AI components. For this, multiple
means of visualisation, computing, communication
architectures as well as database management sys-
tems for real-time operation and security will be con-
sidered. Data specifications are defined beforehand,
especially the outputs of the AI methods employed
by the robots and smart sensors. Additionally, regulatory requirements such as necessary certifications, compliance, and data anonymisation need to be considered.
3.1 Data Management and Interfaces
The data management comprises services for system-
atically collecting data from the registered sensors,
robots and other HIS-related sources, as well as the
management for further processing and distributing
data in a secure way. It provides the basis for ensur-
ing interoperability between data sources, providing a
common operational picture in real-time during hos-
pital emergency situations. Events, such as incoming
messages from the HIS and health portals (HL7, CDA, FHIR standards), may trigger further actions such as email notifications, or the tracing and tracking of medical personnel, patients, mobile equipment and devices across the hospital premises.
Due to the heterogeneous data sources (e.g. hospi-
tal robot) and streams (e.g. smart sensors), a harmon-
isation of the data is required. The algorithms employed here could thus include tools for descriptive analytics (i.e. statistical summaries of integrated robot
and sensory data), time-series analysis of critical pa-
rameters (e.g. symptoms like changes in tempera-
ture, respiration rate, and pulse), and anomaly detec-
tion (e.g. using auto-regression). This shall provide
medical staff with holistic descriptions of the patients’
states, while also feeding key features for diagnostic
detection of critical events into the event bus. In order
to allow medical staff to interpret and use informa-
tion from these heterogeneous data sources, data har-
monisation and visualised analyses will be provided.
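As an illustration of the anomaly detection mentioned above, the following minimal Python sketch flags vital-sign samples that deviate strongly from an auto-regressive one-step prediction; the function name, window size and threshold are illustrative choices and not part of the planned platform.

import numpy as np

def ar_anomaly_flags(series, order=5, window=120, k=4.0):
    """Flag samples whose value deviates strongly from an auto-regressive
    one-step prediction fitted on the preceding window (illustrative sketch)."""
    x = np.asarray(series, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for t in range(window, len(x)):
        hist = x[t - window:t]
        # Build a lagged design matrix for an AR(order) least-squares fit
        X = np.column_stack([hist[i:len(hist) - order + i] for i in range(order)])
        y = hist[order:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        pred = x[t - order:t] @ coef
        resid_std = np.std(y - X @ coef) + 1e-9
        flags[t] = abs(x[t] - pred) > k * resid_std
    return flags

# Example: a resting heart-rate stream with a sudden jump
hr = np.concatenate([70 + np.random.randn(300), 115 + np.random.randn(20)])
print(np.where(ar_anomaly_flags(hr))[0][:5])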
Finally, a set of visual components should be inte-
grated into the hospital system by providing front-end
visual analytics integrated into the end-use environ-
ments. Hence, medical staff can monitor vital param-
eters and alarms on a generic hospital control display
component. Customised visual components will be
prepared for displaying status of selected components
e.g. current vital parameters and the patient’s medical
record from the HIS.
3.2 Data Privacy and Security
To provide transparency, data privacy and security
(legal and ethical) restrictions and policies with an
elaborated roles and rights concept will be enforced
and monitored. Especially during public health crises, hospitals have been targets of cyberattacks in the
past. Thus, the inviolability of the system against
such attacks needs to be assured by redundancy in
the system and stable firewalls. IT security and user
management policies ensure that only eligible person-
nel can access the systems’ sensitive data. The work
of (Dašić et al., 2017) explores solutions for clinical monitoring via state-of-the-art video surveillance
clouds. Another approach is to process images and
videos either locally by a 3D sensor and a robot it-
self or remotely via a secure and powerful processing
node that routes generated meta-data to a data inte-
gration unit. No video data itself is processed by the
integration unit and only necessary data is transmit-
ted through the hospital network, thus improving data
security. In addition, for data protection reasons all
data is processed directly in the hospital and no data is
transferred to a cloud. Moreover, network monitoring across all layers of the ISO/OSI model and forensic analysis for the investigation of security or operational incidents should be provided.
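To illustrate the metadata-only routing described above, the following sketch shows what an event payload sent from a sensor node to the data integration unit could look like; all field names are hypothetical and do not correspond to a defined HIS or HL7/FHIR schema.

import json
from dataclasses import dataclass, asdict

@dataclass
class SensorEvent:
    """Hypothetical metadata-only message sent to the data integration unit;
    no image or video data leaves the sensor node."""
    sensor_id: str
    timestamp: str          # ISO 8601
    person_pseudonym: str   # pseudonymised ID, resolvable only inside the hospital
    event_type: str         # e.g. "fall", "vital_update", "contact"
    payload: dict           # e.g. {"heart_rate_bpm": 82, "respiration_rpm": 17}

event = SensorEvent("ward3_room12_cam1", "2021-02-08T14:31:07Z",
                    "p-4711", "vital_update",
                    {"heart_rate_bpm": 82, "respiration_rpm": 17})
print(json.dumps(asdict(event)))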
4 RE-IDENTIFICATION AND
TRACKING
In order to determine user-group-specific informa-
tion it is necessary to distinguish between the user
groups: clinical staff, visitors and patients. The re-
identification of users in the clinical environment en-
sures that information retrieved from the sensors, e.g. for tracking or vital parameter monitoring, is assigned to the right person. Persons will be re-identified by
analysing the 2D images from the smart sensors. In
contrast to body mounted sensor technology such as
radio-frequency identification (RFID) technology im-
age sensors have advantages in terms of number of
required sensors and are not object-related. To reduce
the number of necessary 2D image sensors per room,
fisheye lenses with a complete 180
top view (cover-
ing the entire scene below the ceiling) are used. Deep
learning approaches can be adapted to re-identify per-
sons in fisheye images. Focusing on facial informa-
tion as performed by algorithms based on front-view
images is error-prone for top-view image analyses
since the face itself covers only a small portion of the image. Therefore, features incorporating the en-
tire appearance of the patient (like AMOC (Liu et al.,
2017)) and the training of a detector with synthetic
data (Scheck et al., 2020) will be used. The goal of
this system concept is to achieve a high (rank-1) iden-
tification rate while providing a confidence value in-
dicating the certainty of the identification. For this,
existing re-identification methods have to be retrained
and fine-tuned on top-view fisheye images.
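A minimal sketch of how a rank-1 decision with a confidence value could be derived from appearance embeddings, assuming the retrained re-identification network already provides such embeddings; the margin-based confidence is only one possible choice, not the project's defined metric.

import numpy as np

def rank1_identify(query_emb, gallery_embs, gallery_ids):
    """Return the rank-1 identity and a confidence score from cosine similarity.
    gallery_embs: (N, D) appearance embeddings of enrolled persons (assumed given)."""
    q = query_emb / np.linalg.norm(query_emb)
    g = gallery_embs / np.linalg.norm(gallery_embs, axis=1, keepdims=True)
    sims = g @ q                      # cosine similarities in [-1, 1]
    order = np.argsort(sims)[::-1]
    best, second = sims[order[0]], sims[order[1]]
    # Confidence: margin between best and second-best match, mapped to [0, 1]
    confidence = float(np.clip((best - second) / 2.0 + 0.5, 0.0, 1.0))
    return gallery_ids[order[0]], confidence

identity, conf = rank1_identify(np.random.randn(128),
                                np.random.randn(10, 128),
                                ["person_%d" % i for i in range(10)])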
A continuous movement tracking in the hospi-
tal will retrieve and analyse the position of all per-
sons within the hospital based on the person re-
identification. In case someone is reported to be in-
fected, the previous pathways of this person in the
hospital can be traced back, relevant persons are
informed and automated disinfection processes can
VISAPP 2021 - 16th International Conference on Computer Vision Theory and Applications
696
start. For this, smart 3D sensors are mounted on the ceiling of each hospital room to track only the identified patients in the room. The 3D sensors consist of multiple 2D fisheye image sensors and capture images for anonymously analysing persons’ locations in the hospital. Distance maps between people, used to identify contact chains and potential infection areas, will be measured by the smart sensors directly in 3D, after the tracking of identified persons
is completed. Furthermore, emergency cases such as
falls can be detected by means of Support Vector Ma-
chine (SVM) classifiers. These classifiers operate on features of the 3D pose obtained from the 3D skeleton extraction. If a patient falls onto the floor, the SVM detects it and the smart sensor sends an alarm signal to the data fusion unit, which immediately informs medical professionals or the robots.
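A minimal sketch of such an SVM-based fall classifier, assuming the 3D skeleton is already extracted; the two hand-crafted features (head height above the floor, torso inclination) and the toy training data are purely illustrative and not the features used in the cited work.

import numpy as np
from sklearn.svm import SVC

def pose_features(skeleton):
    """Toy features from a 3D skeleton given as joint name -> (x, y, z) in metres,
    with z measured upwards from the floor (assumed convention)."""
    head_height = skeleton["head"][2]
    torso = np.array(skeleton["neck"]) - np.array(skeleton["pelvis"])
    torso_incl = np.degrees(np.arccos(abs(torso[2]) / (np.linalg.norm(torso) + 1e-9)))
    return [head_height, torso_incl]

# Illustrative training data: upright postures vs. lying-on-the-floor postures
X = [[1.7, 5.0], [1.6, 10.0], [1.8, 3.0],     # standing / sitting upright
     [0.2, 85.0], [0.3, 80.0], [0.25, 88.0]]  # fallen
y = [0, 0, 0, 1, 1, 1]
clf = SVC(kernel="rbf").fit(X, y)

sample = {"head": (0.1, 0.4, 0.22), "neck": (0.1, 0.3, 0.3), "pelvis": (0.9, 0.3, 0.25)}
print(clf.predict([pose_features(sample)]))  # -> [1] would trigger an alarm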
5 VITAL PARAMETERS
The contactless measurement of vital parameters is useful at the hospital entrance to isolate potentially infected persons and for the continuous monitoring of those already infected in the wards. To date, the re-
mote, contactless determination of patients’ vital pa-
rameters has not been part of a large-scale clinical sur-
vey for pandemics as planned in our assistance sys-
tem.
5.1 Triage in Hospital Entrances
In order to establish a fast detection of potentially COVID-19-infected people, an access lock is to be installed in the entrance area of the emergency room. Symptomatic COVID-19 cases are characterised in particular by shortness of breath and fever. In the case of a combination of such symptoms, these patients have to be isolated from others immediately in order to prevent contagion. The measurement will be
carried out using thermal imaging cameras to measure
temperature and RGB cameras to determine the respi-
ration rate. The measurements will be fully automatic
and without contact with the patient. This saves time
and reduces the possible sources of infection.
For the respiratory rate measurement, the patient
needs to sit in front of a camera for 30 seconds with-
out moving. With an RGB camera it is possible to de-
tect and track the torso motions (Wiede et al., 2017).
This can be realised by determining a region of interest (ROI) on the upper body with a detector (Liao et al., 2016), then detecting minimum eigenvalue features in this region (Shi and Tomasi, 1993) and tracking them by means of optical flow (Tomasi and Kanade,
1991). A time channel can be extracted from the tra-
jectories of the tracked points. The signal quality is improved by applying bandpass filtering and a principal component analysis (PCA). Finally, the res-
piration rate is obtained from the signal spectrum’s
highest peak. This value is one indicator for the diag-
noses made by the clinical personnel.
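The following Python sketch outlines this processing chain, assuming a fixed upper-body ROI instead of the detector of (Liao et al., 2016) and using OpenCV, SciPy and scikit-learn; it is a simplified illustration rather than the implementation of (Wiede et al., 2017).

import cv2
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import PCA

def respiration_rate(frames, fps, roi):
    """Estimate the respiration rate (breaths/min) from a list of 8-bit grayscale
    frames. roi = (x, y, w, h) upper-body region, assumed fixed for simplicity."""
    x, y, w, h = roi
    prev = frames[0]
    pts = cv2.goodFeaturesToTrack(prev[y:y+h, x:x+w], maxCorners=50,
                                  qualityLevel=0.01, minDistance=5)
    pts = pts + np.array([[x, y]], dtype=np.float32)   # back to full-image coordinates
    traj = []
    for frame in frames[1:]:
        pts, status, _ = cv2.calcOpticalFlowPyrLK(prev, frame, pts, None)
        traj.append(pts[:, 0, 1])                      # vertical coordinates only
        prev = frame
    sig = np.array(traj)                               # shape (time, n_points)
    b, a = butter(3, [0.1, 0.7], btype="band", fs=fps) # 6-42 breaths per minute
    sig = filtfilt(b, a, sig - sig.mean(axis=0), axis=0)
    comp = PCA(n_components=1).fit_transform(sig)[:, 0]
    spec = np.abs(np.fft.rfft(comp))
    freqs = np.fft.rfftfreq(len(comp), d=1.0 / fps)
    return 60.0 * freqs[np.argmax(spec[1:]) + 1]       # skip the DC bin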
First tests in a hospital during the first wave
of the COVID-19 crisis have proven successful (Wiede
et al., 2020).
5.2 Recovery Monitoring
Furthermore, the embedded omnidirectional sensors
in the patient rooms enable the remote monitoring of
the vital parameters such as heart and respiratory rate.
To account for data privacy, the person identification
assures on the one hand that the data is retrieved only
for the user groups patients and medical staff (not vis-
itors), and on the other hand that the retrieved data is
assigned to the right person. In case the health status
changes drastically or emergencies occur, an alarm is
triggered and either a robot or a caregiver will come
and assist.
Unlike in the case of triage, an omnidirectional
image will be processed. This has the advantage that
the whole room can be monitored, but at the expense
of strong distortions. One solution lies in the use of a virtual perspective camera as proposed by Meinel et al. (Meinel et al., 2014). Thereby, the detected per-
son in the omnidirectional image will be projected to
an artificial perspective image.
First, the heart rate can be estimated by detect-
ing the face of a person by the method of Zhu and
Ramanan (Zhu and Ramanan, 2012), determining
a person-dependent skin colour model and tracking
the features with optical flow (Tomasi and Kanade,
1991). Subsequently, all pixels in the face matching
the skin colour model criteria are extracted and aver-
aged in every time step to obtain three colour chan-
nels. These channels are normalised, bandpass filtered and processed by an independent component analysis (ICA). Finally, the heart rate is determined by the dominant frequency within the Fast Fourier Transform (FFT) spectrum. The determination of the respiration rate is analogous to the procedure described in Sect. 5.1.
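A sketch of the spectral part of this heart-rate estimation, assuming the per-frame skin-pixel colour averages are already provided by the face and skin tracking stage; FastICA is used here as a stand-in for the ICA variant of the original pipeline.

import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

def heart_rate_from_rgb_means(rgb_means, fps):
    """rgb_means: (T, 3) array of spatially averaged skin-pixel colour values
    per frame (assumed to be provided by the face/skin tracking stage)."""
    x = np.asarray(rgb_means, dtype=float)
    x = (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-9)   # normalise each channel
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)  # 42-240 beats per minute
    x = filtfilt(b, a, x, axis=0)
    sources = FastICA(n_components=3, random_state=0).fit_transform(x)
    # Pick the independent component with the most pronounced spectral peak
    best_bpm, best_peak = 0.0, -1.0
    freqs = np.fft.rfftfreq(len(sources), d=1.0 / fps)
    for s in sources.T:
        spec = np.abs(np.fft.rfft(s))
        k = np.argmax(spec[1:]) + 1                     # skip the DC bin
        if spec[k] > best_peak:
            best_peak, best_bpm = spec[k], 60.0 * freqs[k]
    return best_bpm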
The use of omnidirectional sensors for the deter-
mination of heart and respiration rate has already been
investigated for ambient assisted living (AAL) appli-
cations (Wiede et al., 2019), whereby a transfer into
the clinical area is generally possible. Moreover, the mobile robot platforms are able to detect vital parameters in the same fashion, in addition to their other tasks in the hospital.
6 ROBOTS
In the context of pandemics, the field of robotics in
combination with intelligent sensors offers enormous
potential to meet some of the challenges ahead, i.e.
automation of processes in hospitals, reduction of
physical contact between people and better monitor-
ing of patients’ health state. We will focus on telep-
resence robots and personal assistants, since these are
robot types that seem to be particularly well suited
to meet rapidly expanding healthcare demands in epi-
demics. In order to be really useful, these robots need to be aware of their current position at all times. We will use algorithms such as semantic SLAM
(Bowman et al., 2017), long term SLAM in dynamic
environments (Biber and Duckett, 2009; Pomerleau
et al., 2014), and strategies for switching between
SLAM (when exploring and updating the map of the
world) and localisation (when exploiting the current
map information). It will be necessary that the robot moves at the same pace as patients or visitors while it is guiding a person to a specific destination or when a telepresence visit is taking place. The distance between the robot and the person should be set according to the circumstances. During telepresence
meetings, the robot should maintain a close distance
to allow a private conversation, but a longer distance
for a chat with a group of people.
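As a simple illustration of this pace- and distance-keeping behaviour, the following sketch adapts the robot's forward speed based on the measured distance to the accompanied person; the target distances and gains are placeholders, not values defined in the project.

def guiding_speed(distance_to_person, mode, cruise=0.8, gain=0.6):
    """Adapt the robot's forward speed so the accompanied person can keep up
    (distances in metres, speeds in m/s); target distances are placeholders."""
    targets = {"guiding": 1.2, "private_telepresence": 0.8, "group_chat": 2.0}
    error = distance_to_person - targets[mode]
    # Slow down when the person falls behind the target distance
    speed = cruise - gain * error
    return max(0.0, min(cruise, speed))

print(guiding_speed(2.2, "guiding"))   # person 1 m behind target -> robot slows to 0.2 m/s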
6.1 Telepresence Functionality and User
Interfaces
Especially in the COVID-19 pandemic, patients suf-
fer from isolation effects. Telepresence robots en-
able health professionals and relatives to be virtually
present, to interact and socially engage with patients
remotely, and to physically manipulate the robot and
its immediate surroundings. Associated benefits also include the reduction of physical contact between patients and caregivers. The human-
robot interaction will be based on gesture recognition,
as gestures are natural communication codes, and
complementary to a graphical user interface (GUI).
The information being displayed on the GUI mon-
itor will depend on the user group and on the task
being carried out. Another option can be voice in-
teraction, especially for patients who have difficulties
moving. However, in noisy environments, achieving acceptable performance can be very challenging.
6.2 Robot for Assistance, Transport and
Disinfection
Furthermore, assistant robots as shown in Figure 2
can take responsibility for simple tasks and routines,
allowing nurses to focus on more complex and press-
ing patient needs and reducing physical contact to
prevent infections. These simple tasks will include performing regular check-ups on patients in a critical state, reminding patients to take their medication, effectively and accurately monitoring the health status of the patients, alerting carers where specific problems arise and intervening where possible and necessary.
In this context, enabling robots with the skill of ver-
ifying and re-identifying people is necessary to pro-
vide services to the right person. Perceiving people
is needed in order to navigate efficiently and safely,
to approach people in an appropriate manner, to ini-
tiate and maintain social interaction, and to recover
the contact with people. Together with the ceiling-
mounted smart sensors, robots will jointly reduce the
amount of physical contact between healthcare pro-
fessionals and patients and ensure that patients do not
get lost and do not leave their ward or room unat-
tended, informing care staff when necessary. This
type of support allows for an immediate and auto-
mated retrieval of medical findings and issuing alerts
in pre-defined emergency situations such as falls, res-
piratory distress and circulatory collapse. Moreover,
robots are capable of physical transportation tasks
such as the movement of physical objects (meals, medical devices) or physically guiding patients and their families from one location to another within the hospital environment while avoiding crowded places. Furthermore, the robots will perform automatic disinfection of areas in the hospital by using UV light, which kills the pathogens. For this purpose, path planning is implemented so that all areas are disinfected and no corner is forgotten.
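One simple way to plan such a disinfection path is boustrophedon (lawn-mower) coverage over an occupancy grid, sketched below; the grid, resolution and obstacle handling are illustrative simplifications rather than the planner foreseen for the robots.

import numpy as np

def coverage_path(occupancy_grid):
    """Return a lawn-mower (boustrophedon) visiting order of all free cells.
    occupancy_grid: 2D array, 0 = free, 1 = occupied. A real planner would also
    insert detours around obstacles; here blocked cells are simply skipped."""
    rows, cols = occupancy_grid.shape
    path = []
    for r in range(rows):
        col_order = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in col_order:
            if occupancy_grid[r, c] == 0:
                path.append((r, c))
    return path

room = np.zeros((4, 6), dtype=int)
room[1, 2] = 1                      # e.g. a bed blocks one cell
print(coverage_path(room)[:8])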
7 DISCUSSION AND
CHALLENGES
To separate infected from non-infected persons as early as possible, it is unavoidable to identify the user group of each person in the hospital and to track their pathway through the hospital. With the help of multi-object (person) tracking (MOT), each person is automatically assigned an ID. Challenges in MOT are ID switches, which occur mostly when persons cross paths or during occlusions.
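The following sketch shows the frame-to-frame association step of such a tracker, matching track boxes to detections by IoU with the Hungarian algorithm; it is exactly in this step that ID switches can arise when the boxes of crossing persons overlap. The function is an illustrative simplification, not the tracker planned for the system.

import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def associate(tracks, detections, min_iou=0.3):
    """Match existing track boxes to new detections by maximising total IoU.
    Unmatched detections would start new IDs; low-IoU pairs are rejected."""
    cost = np.array([[1.0 - iou(t, d) for d in detections] for t in tracks])
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if 1.0 - cost[r, c] >= min_iou]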
Figure 2: Overview of the different functionalities of the robots. Besides the transportation of different containers and goods, the robot can provide mobility assistance for patients or disinfect hospital wards.
The challenges for remote vital parameter determination consist of high accuracy requirements, the low resolution of omnidirectional sensors, and robustness against illumination and motion artefacts. Furthermore, a night mode operating in the near infrared has to be implemented. It is crucial that all remote vital parameter algorithms are validated in the hospital against gold-standard measurement methods.
In addition to the optical sensors, robots can be
employed to automate processes in the hospital. As-
sistance robots can perform time-consuming transportation tasks, as well as provide mobility support for patients or UV-disinfect hospital wards. This allows nurses to focus on more complex patient needs and reduces physical contact, preventing infections. For this purpose, path planning and the perception of people in the dynamic environment are crucial for safe navigation and appropriate human-robot interaction.
During the COVID-19 pandemic, many healthcare-related institutions have been targeted by cyberattacks, which shows the need to identify data leaks in the digital infrastructure. This fact leads to two recommendations for action: first, the improvement of the existing infrastructure, especially in hospitals, and second, the consideration of these aspects during the development of new technical systems.
Besides data security, one of the major challenges is the protection of the user data and the user’s identity. For this, the data captured by a fisheye camera in a 3D smart sensor device needs to be anonymised. To overcome the trade-off between privacy and video analytics, each user group’s data (patient, visitor, staff) is anonymised through face anonymisation and the addition of depth maps.
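A minimal sketch of such a face anonymisation step, blurring detected face regions before any image leaves the sensor node; OpenCV's stock Haar cascade is used here only as a stand-in for the project's own detection pipeline.

import cv2

def anonymise_faces(frame_bgr):
    """Blur all detected faces in a BGR frame before further processing
    (the Haar cascade is only a placeholder for the project's own detector)."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        frame_bgr[y:y+h, x:x+w] = cv2.GaussianBlur(
            frame_bgr[y:y+h, x:x+w], (51, 51), 0)
    return frame_bgr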
In the near future, the assistance system is planned for large-scale piloting in several hospitals in the EU. A secure infrastructure for sensor communication and data storage needs to be integrated into the telematics infrastructure of the hospital. The major challenges here are the design of the interfaces, due to the different HIS providers on the market, and the lack of technical and administrative staff at the hospitals. To increase user acceptance in hospitals, it is necessary to create an AI guidebook with a fast-response reaction plan for switching the hospital to crisis mode.
8 CONCLUSION
In our work, we proposed a concept for an assistance system in hospitals to encounter the effects of pandemics such as COVID-19. The components of such a system consist of the contactless measurement of the health state of patients and medical staff, of robots with several assistance and guiding functions, and of the implementation in a secure hospital infrastructure. Whereas single components are on the market nowadays, a combination of these tools has not been implemented in clinical environments. Therefore, the next step is the implementation in a hospital, embedded in a clinical study, in order to measure the technical, medical, social and economic effects. We expect significantly fewer infections in the hospital, a reduced workload for caregivers and a cost reduction due to digitalisation.
REFERENCES
Biber, P. and Duckett, T. (2009). Experimental analysis of
sample-based maps for long-term slam. The Interna-
tional Journal of Robotics Research, 28(1):20–33.
Bodenhagen, L., Suvei, S.-D., Juel, W. K., Brander, E., and Krüger, N. (2019). Robot technology for future welfare: meeting upcoming societal challenges - an outlook with offset in the development in Scandinavia. Health and Technology, 9(3):197–218.
Bowman, S. L., Atanasov, N., Daniilidis, K., and Pappas,
G. J. (2017). Probabilistic data association for seman-
tic slam. In 2017 IEEE international conference on
robotics and automation (ICRA), pages 1722–1729.
IEEE.
Cesta, A., Cortellessa, G., Orlandini, A., and Tiberio,
L. (2016). Long-term evaluation of a telepresence
robot for the elderly: methodology and ecological
case study. International Journal of Social Robotics,
8(3):421–441.
Dašić, P., Dašić, J., and Crvenković, B. (2017). Improving patient safety in hospitals through usage of cloud supported video surveillance. Open Access Macedonian Journal of Medical Sciences, 5(2):101.
Ferrari, M., Harrison, B., Rawashdeh, O., Hammond, R.,
Avery, Y., Rawashdeh, M., Sadeh, W., and Maddens,
M. (2012). Clinical feasibility trial of a motion de-
tection system for fall prevention in hospitalized older
adult patients. Geriatric Nursing, 33(3):177–183.
Gonzalez-Jimenez, J., Galindo, C., and Gutierrez-
Castaneda, C. (2013). Evaluation of a telepresence
robot for the elderly: a spanish experience. In Inter-
national Work-Conference on the Interplay Between
Natural and Artificial Computation, pages 141–150.
Springer.
Lee, H., Kim, Y., and Bianchi, A. (2017). A survey on
medical robotic telepresence design from the perspec-
tive of medical staff. Archives of Design Research,
30(1):61–71.
Liao, S., Jain, A. K., and Li, S. Z. (2016). A Fast and Ac-
curate Unconstrained Face Detector. IEEE Transac-
tions on Pattern Analysis and Machine Intelligence,
38(2):211–223.
Liu, H., Jie, Z., Jayashree, K., Qi, M., Jiang, J., Yan, S., and
Feng, J. (2017). Video-based person re-identification
with accumulative motion context. IEEE transac-
tions on circuits and systems for video technology,
28(10):2788–2802.
Lukac, T., Pucik, J., and Chrenko, L. (2014). Contactless
recognition of respiration phases using web camera.
In Radioelektronika (RADIOELEKTRONIKA), 2014
24th International Conference, pages 1–4. IEEE.
Meinel, L., Wiede, C., Findeisen, M., Apitzsch, A., and
Hirtz, G. (2014). Virtual perspective views for real-
time people detection using an omnidirectional cam-
era. In Imaging Systems and Techniques (IST), 2014
IEEE International Conference on, pages 312–315.
IEEE.
Niemelä, M., Van Aerschot, L., Tammela, A., Aaltonen, I., and Lammi, H. (2019). Towards ethical guidelines of using telepresence robots in residential care. International Journal of Social Robotics, pages 1–9.
Poh, M.-Z., McDuff, D. J., and Picard, R. W. (2010). Non-
contact, automated cardiac pulse measurements using
video imaging and blind source separation. Optics ex-
press, 18(10):10762–10774.
Pomerleau, F., Krüsi, P., Colas, F., Furgale, P., and Siegwart, R. (2014). Long-term 3D map maintenance in dynamic environments. In 2014 IEEE International Conference on Robotics and Automation (ICRA), pages 3712–3719. IEEE.
Rantz, M. J., Banerjee, T. S., Cattoor, E., Scott, S. D.,
Skubic, M., and Popescu, M. (2013). Automated
fall detection with quality improvement rewind to re-
duce falls in hospital rooms. Journal of gerontological
nursing, 40(1):13–17.
Scheck, T., Seidel, R., and Hirtz, G. (2020). Learning from
theodore: A synthetic omnidirectional top-view in-
door dataset for deep transfer learning. In The IEEE
Winter Conference on Applications of Computer Vi-
sion, pages 943–952.
Seidel, R., Scheck, T., Grassi, A. C. P., Seuffert, J. B.,
Apitzsch, A., Yu, J., Nestler, N., Heinz, D., Lehmann,
L., Goy, A., and Hirtz, G. (2020). Contactless inter-
active fall detection and sleep quality estimation for
supporting elderly with incipient dementia. In BMT
2020 Conference (in-press). VDE.
Shi, J. and Tomasi, C. (1993). Good Features to Track.
Technical report, Cornell University, Ithaca, NY,
USA.
Tomasi, C. and Kanade, T. (1991). Detection and Tracking
of Point Features. Technical report, Carnegie Mellon
University.
Verkruysse, W., Svaasand, L. O., and Nelson, J. S. (2008).
Remote plethysmographic imaging using ambient
light. Optics express, 16(26):21434–21445.
Wiede, C., Grundmann, K., Wuerich, C., Rademacher, R.,
Heidemann, B., and Grabmaier, A. (2020). Fast triage
of covid-19 patients in hospitals by means of remote
respiration rate determination. In BMT 2020 Confer-
ence (in-press). VDE.
Wiede, C., Richter, J., and Hirtz, G. (2019). Contact-less vi-
tal parameter determination: An e-health solution for
elderly care. In VISIGRAPP (5: VISAPP), pages 908–
915.
Wiede, C., Richter, J., Manuel, M., and Hirtz, G. (2017).
Remote respiration rate determination in video data-
vital parameter extraction based on optical flow and
principal component analysis. In International Con-
ference on Computer Vision Theory and Applications,
volume 5, pages 326–333. SCITEPRESS.
Zhu, X. and Ramanan, D. (2012). Face detection, pose esti-
mation, and landmark localization in the wild. In Pro-
ceedings of the IEEE Computer Society Conference
on Computer Vision and Pattern Recognition, pages
2879–2886.