Collaborative Vision Network for Personalized Office Ergonomics

Tommi Määttä, Aki Härmä, Chih-Wei Chen, Hamid Aghajan

2014

Abstract

This paper proposes a collaborative vision network that leverages a personal webcam and ambient cameras in the workplace to provide feedback on an office worker’s adherence to ergonomic guidelines. This can lead to increased well-being for the individual and better productivity in their work. The proposed system is evaluated on a recorded multi-camera dataset from a regular office environment. First, analysis results on various ergonomic issues are presented based on the personal webcams of two workers. Second, both personal and ambient cameras are combined through sensor fusion to infer the mobility state of one of the workers. Results for various fusion approaches are shown and their impact on vision network design is briefly discussed.
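The abstract does not detail how the per-camera observations are combined. As a rough illustration only, the sketch below shows one possible decision-level fusion of per-camera mobility estimates; the camera labels, reliability weights, threshold, and the `fuse_mobility` helper are assumptions for illustration, not the fusion approaches evaluated in the paper.

```python
# Illustrative sketch: decision-level fusion of per-camera mobility estimates.
# All names, weights, and the threshold are assumptions, not the authors' method.

def fuse_mobility(estimates, weights, threshold=0.5):
    """Combine per-camera P(worker is mobile) into a single decision.

    estimates: dict mapping camera id -> probability in [0, 1]
    weights:   dict mapping camera id -> relative reliability weight
    Returns (is_mobile, fused_score).
    """
    total = sum(weights[cam] for cam in estimates)
    score = sum(weights[cam] * p for cam, p in estimates.items()) / total
    return score >= threshold, score


if __name__ == "__main__":
    # Hypothetical frame: the personal webcam sees the worker seated at the
    # desk, while an ambient camera reports some motion in the room.
    per_camera = {"webcam": 0.2, "ambient_1": 0.8}
    reliability = {"webcam": 0.7, "ambient_1": 0.3}
    mobile, score = fuse_mobility(per_camera, reliability)
    print(f"mobile={mobile}, fused score={score:.2f}")
```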

References

  1. Andriluka, M., Roth, S., and Schiele, B. (2008). People-tracking-by-detection and people-detection-by-tracking. In IEEE Conference on Computer Vision and Pattern Recognition, pages 1-8.
  2. Chau, M. and Betke, M. (2005). Real time eye tracking and blink detection with USB cameras. Technical report, Boston University.
  3. Chen, C.-W. and Aghajan, H. (2011). Multiview social behavior analysis in work environments. In ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC), Ghent, Belgium.
  4. Chen, C.-W., Aztiria, A., Ben Allouch, S., and Aghajan, H. (2011). Understanding the influence of social interactions on individual's behavior pattern in a work environment. In Proceedings of the Second International Conference on Human Behavior Understanding, pages 146-157, Berlin, Heidelberg. Springer-Verlag.
  5. Dalal, N. and Triggs, B. (2005). Histograms of oriented gradients for human detection. In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), volume 1, pages 886-893.
  6. Dasarathy, B. V. (1997). Sensor fusion potential exploitation - innovative architectures and illustrative applications. Proceedings of the IEEE, 85:24-38.
  7. Hansen, D. and Ji, Q. (2010). In the eye of the beholder: A survey of models for eyes and gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3).
  8. Jaimes, A. (2005). Sit straight (and tell me what I did today): a human posture alarm and activity summarization system. In Proceedings of the 2nd ACM Workshop on Continuous Archival and Retrieval of Personal Experiences, CARPE '05, pages 23-34, New York, NY, USA. ACM.
  9. Lalonde, M., Byrns, D., Gagnon, L., Teasdale, N., and Laurendeau, D. (2007). Real-time eye blink detection with GPU-based SIFT tracking. In Proceedings of the Fourth Canadian Conference on Computer and Robot Vision, pages 481-487, Washington, DC, USA. IEEE Computer Society.
  10. Li, I., Dey, A., and Forlizzi, J. (2010). A stage-based model of personal informatics systems. In Proceedings of the 28th International Conference on Human Factors in Computing Systems, CHI '10, pages 557-566, New York, NY, USA. ACM.
  11. Määttä, T. T. (2013). Sensor fusion in smart camera networks for ambient intelligence. PhD thesis, Technische Universiteit Eindhoven.
  12. OSHA (2012). Laboratory safety ergonomics for the prevention of musculoskeletal disorders in laboratories. www.osha.gov/Publications/laboratory/OSHAfactsheet-laboratory-safety-ergonomics.pdf.
  13. Sanderson, C. and Paliwal, K. K. (2004). Identity verification using speech and face information. Digital Signal Processing, pages 449-480.
  14. Seeing Machines (2013). www.seeingmachines.com/product/faceapi/.
  15. Shirom, A., Toker, S., Alkaly, Y., Jacobson, O., and Balicer, R. (2011). Work-based predictors of mortality: A 20-year follow-up of healthy employees. Health Psychology, 30(3):268-275.
  16. Valenti, R., Sebe, N., and Gevers, T. (2012). What are you looking at? - improving visual gaze estimation by saliency. International Journal of Computer Vision, 98(3):324-334.


Paper Citation


in Harvard Style

Määttä T., Härmä A., Chen C. and Aghajan H. (2014). Collaborative Vision Network for Personalized Office Ergonomics. In Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2014) ISBN 978-989-758-004-8, pages 403-410. DOI: 10.5220/0004683604030410


in Bibtex Style

@conference{visapp14,
author={Tommi Määttä and Aki Härmä and Chih-Wei Chen and Hamid Aghajan},
title={Collaborative Vision Network for Personalized Office Ergonomics},
booktitle={Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2014)},
year={2014},
pages={403-410},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004683604030410},
isbn={978-989-758-004-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2014)
TI - Collaborative Vision Network for Personalized Office Ergonomics
SN - 978-989-758-004-8
AU - Määttä T.
AU - Härmä A.
AU - Chen C.
AU - Aghajan H.
PY - 2014
SP - 403
EP - 410
DO - 10.5220/0004683604030410