A Human Activity Recognition Framework for Healthcare Applications: Ontology, Labelling Strategies, and Best Practice

Przemyslaw Woznowski, Rachel King, William Harwin, Ian Craddock

2016

Abstract

Human Activity Recognition (AR) is an area of great importance for health and well-being applications, including Ambient Intelligence (AmI) spaces, Ambient Assisted Living (AAL) environments, and wearable healthcare systems. Such intelligent systems reason over large amounts of sensor-derived data in order to recognise users’ actions. The design of AR algorithms relies on ground-truth data of sufficient quality and quantity to enable rigorous training and validation. Ground truth is often acquired from video recordings, which can produce detailed results given appropriate labels. However, video annotation is not a trivial task and is, by definition, subjective. In addition, the sensitive nature of the recordings has to be foremost in the minds of researchers in order to protect the identity and privacy of participants. In this paper, a hierarchical ontology for the annotation of human activities in the home is proposed. Labelling strategies that support different levels of granularity are presented, enabling consistent and repeatable annotations for training and validating activity recognition algorithms. Best practice regarding the handling of this type of sensitive data is also discussed.
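To make the idea of annotation at different levels of granularity concrete, the sketch below shows a toy hierarchical activity label tree and a helper that rolls a fine-grained label up to a coarser level. It is a minimal illustration only: the category names (e.g. "domestic", "kitchen_activity") and the coarsen function are assumptions made for this example, not the ontology or labelling scheme proposed in the paper.

```python
# Illustrative sketch of hierarchical activity labels with adjustable
# granularity. Category names are hypothetical, not the paper's ontology.

ACTIVITY_TREE = {
    "domestic": {
        "kitchen_activity": ["prepare_meal", "wash_dishes"],
        "cleaning": ["vacuum", "dust"],
    },
    "personal": {
        "hygiene": ["shower", "brush_teeth"],
        "rest": ["sleep", "nap"],
    },
}

def label_path(tree, leaf, path=()):
    """Return the path from the root category down to a leaf label, or None."""
    for key, value in tree.items():
        if isinstance(value, dict):
            found = label_path(value, leaf, path + (key,))
            if found:
                return found
        elif leaf in value:
            return path + (key, leaf)
    return None

def coarsen(leaf, level):
    """Map a fine-grained label to the requested annotation level (0 = coarsest)."""
    path = label_path(ACTIVITY_TREE, leaf)
    if path is None:
        raise KeyError(f"unknown label: {leaf}")
    return path[min(level, len(path) - 1)]

if __name__ == "__main__":
    print(coarsen("wash_dishes", 0))  # domestic
    print(coarsen("wash_dishes", 1))  # kitchen_activity
    print(coarsen("wash_dishes", 2))  # wash_dishes
```

Annotating against such a tree means that two annotators who disagree at the finest level can still agree at a coarser one, which is one way consistent, repeatable labels for training and validation can be obtained.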



Paper Citation


in Harvard Style

Woznowski P., King R., Harwin W. and Craddock I. (2016). A Human Activity Recognition Framework for Healthcare Applications: Ontology, Labelling Strategies, and Best Practice. In Proceedings of the International Conference on Internet of Things and Big Data - Volume 1: IoTBD, ISBN 978-989-758-183-0, pages 369-377. DOI: 10.5220/0005932503690377


in Bibtex Style

@conference{iotbd16,
author={Przemyslaw Woznowski and Rachel King and William Harwin and Ian Craddock},
title={A Human Activity Recognition Framework for Healthcare Applications: Ontology, Labelling Strategies, and Best Practice},
booktitle={Proceedings of the International Conference on Internet of Things and Big Data - Volume 1: IoTBD},
year={2016},
pages={369-377},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005932503690377},
isbn={978-989-758-183-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Internet of Things and Big Data - Volume 1: IoTBD
TI - A Human Activity Recognition Framework for Healthcare Applications: Ontology, Labelling Strategies, and Best Practice
SN - 978-989-758-183-0
AU - Woznowski P.
AU - King R.
AU - Harwin W.
AU - Craddock I.
PY - 2016
SP - 369
EP - 377
DO - 10.5220/0005932503690377