Title: Body Location Independent Activity Monitoring
Authors: Carina Figueira 1; Ricardo Matias 2 and Hugo Gamboa 1

Affiliations: 1 Universidade Nova de Lisboa and Associação Fraunhofer Portugal Research, Portugal; 2 Instituto Politécnico de Setúbal, Portugal

ISBN: 978-989-758-170-0

Keyword(s): Human Activity Recognition, Signal Processing, Feature Extraction, Feature Selection, Machine Learning.

Related Ontology Subjects/Areas/Topics: Animation and Simulation ; Artificial Intelligence ; Biomedical Engineering ; Biomedical Signal Processing ; Computer Vision, Visualization and Computer Graphics ; Data Manipulation ; Detection and Identification ; Health Engineering and Technology Applications ; Human-Computer Interaction ; Methodologies and Methods ; Motion Control ; Neurocomputing ; Neurotechnology, Electronics and Informatics ; Pattern Recognition ; Physiological Computing Systems ; Sensor Networks ; Soft Computing

Abstract: Human Activity Recognition (HAR) is increasingly common in people’s daily lives, being applied in health, sports and safety. Because of their high computational power, small size and low cost, smartphones and wearable sensors are suitable for monitoring users’ daily living activities. However, almost all existing systems require devices to be worn in certain positions, making them impractical for long-term activity monitoring, where a change in position can lead to less accurate results. This work describes a novel algorithm to detect human activity independently of sensor placement. Taking battery consumption into account, only two sensors were considered: the accelerometer (ACC) and the barometer (BAR), with sampling frequencies of 30 Hz and 5 Hz, respectively. The signals obtained were divided into 5-second windows. The dataset used is composed of 25 subjects, with more than 7 hours of recording. Daily living activities were performed with the smartphone worn in 12 different positions. From each window a set of statistical, temporal and spectral features was extracted and selected. During the classification process, a decision tree was trained and evaluated using a leave-one-user-out cross-validation. The developed framework achieved an accuracy of 94.5 ± 6.8%, regardless of the subject and the device’s position. This solution may be applied to elderly monitoring, as a rehabilitation tool in physiotherapy, and also by ordinary users who just want to check their daily level of physical activity.
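The pipeline the abstract describes (fixed-length windowing, statistical/temporal/spectral feature extraction, a decision tree evaluated with leave-one-user-out cross-validation) can be sketched as below. This is not the authors' code: the feature set is an illustrative subset (the paper's full list is not given on this page), the data is synthetic, and only the accelerometer branch is shown. scikit-learn's `LeaveOneGroupOut` implements leave-one-user-out CV when each subject is a group.

```python
# Sketch of the described HAR pipeline on synthetic data.
# Assumptions (not from the paper): feature choices, 3 mock activity
# classes, and the synthetic signal generator.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

ACC_FS = 30              # accelerometer sampling rate (Hz), per the abstract
WIN = ACC_FS * 5         # 5-second windows, per the abstract

def extract_features(window):
    """Statistical, temporal and spectral features from one tri-axial
    ACC window (illustrative subset, not the paper's full feature set)."""
    mag = np.linalg.norm(window, axis=1)               # acceleration magnitude
    centered = mag - mag.mean()
    spectrum = np.abs(np.fft.rfft(centered))           # spectral content
    freqs = np.fft.rfftfreq(len(mag), d=1.0 / ACC_FS)
    dom_freq = freqs[np.argmax(spectrum)]              # dominant frequency
    zero_cross = np.sum(np.diff(np.sign(centered)) != 0)  # temporal feature
    return [mag.mean(), mag.std(), mag.min(), mag.max(), dom_freq, zero_cross]

# Build a synthetic dataset: 25 subjects, each contributing windows,
# mirroring the paper's 25-subject setup (the signals themselves are fake).
rng = np.random.default_rng(0)
n_subjects, windows_per_subject = 25, 20
X, y, groups = [], [], []
for subj in range(n_subjects):
    for _ in range(windows_per_subject):
        label = int(rng.integers(0, 3))                       # 3 mock activities
        raw = rng.normal(scale=label + 1, size=(WIN, 3))      # fake tri-axial ACC
        X.append(extract_features(raw))
        y.append(label)
        groups.append(subj)                                   # subject id = group

# Decision tree + leave-one-user-out CV: each fold holds out one subject.
clf = DecisionTreeClassifier(random_state=0)
scores = cross_val_score(clf, np.array(X), np.array(y),
                         groups=groups, cv=LeaveOneGroupOut())
print(f"LOUO accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Grouping by subject is what makes the evaluation placement- and person-independent in spirit: no window from the held-out subject ever appears in training, so the reported score reflects generalisation to unseen users rather than memorisation of per-subject signal quirks.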

License: CC BY-NC-ND 4.0


Paper citation in several formats:
Figueira, C.; Matias, R. and Gamboa, H. (2016). Body Location Independent Activity Monitoring. In Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies - Volume 2: BIOSIGNALS (BIOSTEC 2016), ISBN 978-989-758-170-0, pages 190-197. DOI: 10.5220/0005699601900197

@conference{biosignals16,
author={Carina Figueira and Ricardo Matias and Hugo Gamboa},
title={Body Location Independent Activity Monitoring},
booktitle={Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies - Volume 2: BIOSIGNALS, (BIOSTEC 2016)},
year={2016},
pages={190-197},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005699601900197},
isbn={978-989-758-170-0},
}

TY - CONF

JO - Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies - Volume 2: BIOSIGNALS, (BIOSTEC 2016)
TI - Body Location Independent Activity Monitoring
SN - 978-989-758-170-0
AU - Figueira, C.
AU - Matias, R.
AU - Gamboa, H.
PY - 2016
SP - 190
EP - 197
DO - 10.5220/0005699601900197
ER -
