VISUAL SCENE AUGMENTATION FOR ENHANCED HUMAN PERCEPTION

Daniel Hahn, Frederik Beutler, Uwe D. Hanebeck

Abstract

In this paper, we present an assistive system for hearing-impaired people that consists of a wearable microphone array and an Augmented Reality (AR) system. The system helps the user in communication situations where many speakers or sources of background noise are present. To restore the “cocktail party” effect, multiple microphones are used to estimate the positions of individual sound sources. To allow the user to interact in complex situations with many speakers, an algorithm for estimating the user’s attention is developed. This algorithm determines the sound sources that are in the user’s focus of attention. It allows the system to discard irrelevant information and enables the user to concentrate on certain aspects of the surroundings. Based on the user’s hearing impairment, the perception of the speaker in the focus of attention can be enhanced, e.g., by amplification or by speech-to-text conversion. A prototype has been built to evaluate this approach. Currently, the prototype is able to locate sound beacons in three-dimensional space, to perform a simple focus estimation, and to present floating captions in the AR view. The prototype uses an intentionally simple user interface in order to minimize distractions.
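The focus estimation mentioned in the abstract could, in principle, be reduced to angular gating between the user's viewing direction and the estimated source directions. The sketch below is purely illustrative: the function name, the cone half-angle, and the geometric model are assumptions for exposition, not the paper's actual algorithm.

```python
import math

def focus_of_attention(head_dir, sources, cone_deg=30.0):
    """Return indices of sound sources inside the attention cone.

    head_dir -- unit vector of the user's viewing direction (x, y, z)
    sources  -- estimated source positions relative to the user's head
    cone_deg -- half-angle of the attention cone in degrees (assumed value)
    """
    in_focus = []
    for i, pos in enumerate(sources):
        norm = math.sqrt(sum(c * c for c in pos))
        if norm == 0.0:
            continue  # source coincides with the head; direction undefined
        # Angle between the viewing direction and the direction to the source.
        cos_angle = sum(h * c for h, c in zip(head_dir, pos)) / norm
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
        if angle <= cone_deg:
            in_focus.append(i)
    return in_focus

# A speaker straight ahead and one slightly off-axis are in focus;
# a source behind the user is discarded.
sources = [(0.0, 0.0, 2.0), (0.0, 0.0, -2.0), (0.5, 0.0, 2.0)]
print(focus_of_attention((0.0, 0.0, 1.0), sources))  # → [0, 2]
```

Such a gating step would let the system discard sources outside the cone before any enhancement (amplification or speech-to-text) is applied to the speaker in focus.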

References

  1. Azuma, R. T. (1997). A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments, 6(4):355-385.
  2. Backer, G. and Mertsching, B. (2003). Two Selection Stages Provide Efficient Object-Based Attentional Control for Dynamic Vision. In International Workshop on Attention and Performance in Computer Vision, pages 9-16, Graz, Austria.
  3. Billinghurst, M. et al. (2004). ARToolKit Augmented Reality Toolkit. http://www.hitl.washington.edu/research/shared_space/download/.
  4. Broadbent, D. (1958). Perception and Communication. Pergamon Press, London.
  5. Chun, M. M. and Wolfe, J. M. (2001). Visual Attention. In Blackwell's Handbook of Perception, chapter 9, pages 272-310. Blackwell.
  6. Cohen, A. (2003). Selective Attention. In Encyclopedia of Cognitive Science. Nature Publishing Group (Macmillan).
  7. Draper, B. A. and Lionelle, A. (2003). Evaluation of Selective Attention under Similarity Transforms. In International Workshop on Attention and Performance in Computer Vision, pages 31-38, Graz, Austria.
  8. Pentland, A. (2000). Perceptual User Interfaces: Perceptual Intelligence. Communications of the ACM, 43(3):35-44.
  9. Schmalstieg, D., Fuhrmann, A., Hesina, G., Szalavari, Z., Encarnacao, L. M., Gervautz, M., and Purgathofer, W. (2002). The Studierstube Augmented Reality Project. Technical Report TR-188-2-2002-05, Interactive Media Systems Group, Institute for Software Technology and Interactive Systems, Vienna University of Technology.
  10. Shell, J. S., Selker, T., and Vertegaal, R. (2003). Interacting with Groups of Computers. Communications of the ACM, 46(3):40-46.
  11. Sibert, L. E. and Jacob, R. J. K. (2000). Evaluation of Eye Gaze Interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 281-288, The Hague, The Netherlands. ACM Press.
  12. Stiefelhagen, R. (2002). Tracking Focus of Attention in Meetings. In International Conference on Multimodal Interfaces, page 273, Washington, DC, USA.
  13. Tan, D. S., Poupyrev, I., Billinghurst, M., Kato, H., Regenbrecht, H., and Tetsutani, N. (2001). On-demand, In-place Help for Augmented Reality Environments. In Ubicomp 2001, Atlanta, GA, USA.
  14. Treisman, A. and Gelade, G. (1980). A Feature-Integration Theory of Attention. Cognitive Psychology, 12:97-137.
  15. Vertegaal, R. (2002a). Designing Attentive Interfaces. In Proceedings of the Symposium on Eye Tracking Research & Applications, pages 23-30, New Orleans, La, USA. ACM Press.
  16. Vertegaal, R. (2002b). What Do the Eyes Behold for Human-Computer Interaction? In Proceedings of the Symposium on Eye Tracking Research & Applications, pages 59-60. ACM Press.
  17. Vertegaal, R. (2003). Attentive User Interfaces. Communications of the ACM, 46(3):30-33.
  18. Vertegaal, R., Dickie, C., Sohn, C., and Flickner, M. (2002). Designing Attentive Cell Phone Using Wearable Eyecontact Sensors. In CHI '02 Extended Abstracts on Human Factors in Computing Systems, pages 646-647. ACM Press.
  19. Yarbus, A. L. (1967). Eye Movements During Perception of Complex Objects. In Eye Movements and Vision, pages 171-196. Plenum Press, New York, NY, USA.


Paper Citation


in Harvard Style

Hahn D., Beutler F. and Hanebeck U. D. (2005). VISUAL SCENE AUGMENTATION FOR ENHANCED HUMAN PERCEPTION. In Proceedings of the Second International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO, ISBN 972-8865-30-9, pages 146-153. DOI: 10.5220/0001173701460153


in Bibtex Style

@conference{icinco05,
author={Daniel Hahn and Frederik Beutler and Uwe D. Hanebeck},
title={VISUAL SCENE AUGMENTATION FOR ENHANCED HUMAN PERCEPTION},
booktitle={Proceedings of the Second International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO},
year={2005},
pages={146-153},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001173701460153},
isbn={972-8865-30-9},
}


in EndNote Style

TY - CONF
JO - Proceedings of the Second International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO
TI - VISUAL SCENE AUGMENTATION FOR ENHANCED HUMAN PERCEPTION
SN - 972-8865-30-9
AU - Hahn D.
AU - Beutler F.
AU - Hanebeck U. D.
PY - 2005
SP - 146
EP - 153
DO - 10.5220/0001173701460153