Authors: Christoph Kalkbrenner; Steffen Hacker; Maria-Elena Algorri and Ronald Blechschmidt-Trapp
Affiliation: University of Applied Science Ulm, Germany
Keyword(s): Motion Analysis, Inertial Measurement Unit, IMU, Motion Tracking, Kinect, Data Fusion, Kalman Filter.
Related Ontology Subjects/Areas/Topics: Biomedical Engineering; Biomedical Instruments and Devices; Biomedical Sensors
Abstract:
This paper presents an approach for tracking limb movements using orientation information acquired from Inertial Measurement Units (IMUs) and optical information from a Kinect sensor. A new algorithm that uses a Kalman filter to fuse the Kinect and IMU data is presented. By fusing optical and orientation information we are able to track the movement of limb joints precisely and almost drift-free. First, the IMU data is processed using the gradient descent algorithm proposed in (Madgwick et al., 2011), which computes the orientation of the IMU from acceleration and angular velocity data. Measurements made with IMUs tend to drift over time, so in a second stage we compensate for the drift using absolute position information obtained from a Microsoft Kinect sensor. The fusion of sensor data also allows us to compensate for faulty or missing measurements. We have carried out initial experiments on arm tracking. The first results show that our technique for data fusion has the potential to be used to record common medical exercises for clinical movement analysis.
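The drift-correction idea in the abstract can be illustrated with a minimal scalar Kalman filter: positions integrated from drift-prone IMU data serve as the prediction, and absolute (but noisy) Kinect positions serve as the measurement update. This is only an illustrative sketch, not the paper's algorithm; the function name, the scalar state, and the noise parameters `q` and `r` are all assumptions for demonstration. It also shows how missing Kinect frames (here `None`) can simply be skipped, analogous to the abstract's handling of faulty or missing measurements.

```python
import numpy as np

def fuse_positions(imu_deltas, kinect_meas, q=1e-4, r=0.01):
    """Illustrative scalar Kalman filter (not the paper's implementation).

    imu_deltas  -- per-step position increments integrated from IMU data
                   (subject to drift, modelled by process noise q)
    kinect_meas -- absolute Kinect positions, with None for missing frames
                   (measurement noise variance r)
    Returns the fused position estimate at every step.
    """
    x = kinect_meas[0]  # initialise state from the first Kinect fix
    p = r               # initial estimate variance
    fused = []
    for dx, z in zip(imu_deltas, kinect_meas):
        # Predict: integrate the IMU increment; uncertainty grows by q,
        # which is where unmodelled IMU drift accumulates.
        x = x + dx
        p = p + q
        # Update: correct toward the absolute Kinect position when one
        # is available; skipping None frames tolerates missing data.
        if z is not None:
            k = p / (p + r)       # Kalman gain
            x = x + k * (z - x)
            p = (1.0 - k) * p
        fused.append(x)
    return fused
```

With a simulated sinusoidal trajectory, a constant IMU bias, and Gaussian Kinect noise, the pure IMU integration drifts linearly away from the truth while the fused estimate stays close to it, mirroring the "almost drift-free" behaviour described above.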