MODEL-FREE MARKERLESS TRACKING FOR REMOTE SUPPORT IN UNKNOWN ENVIRONMENTS

Alexander Ladikos, Selim Benhimane, Nassir Navab, Mirko Appel

2008

Abstract

We propose a complete system that performs real-time markerless tracking for Augmented Reality-based remote user support in a priori unknown environments. In contrast to existing systems, which require a prior setup and/or knowledge about the scene, our system can be used without preparation. This is made possible by our tracking algorithm, which requires neither a 3D model of the scene nor a learning phase for initialization. This allows us to perform fast and robust markerless tracking of the objects to be augmented. The proposed solution requires neither artificial markers nor special lighting conditions. The only requirement is the presence of locally planar objects in the scene, which holds for almost every man-made structure and in particular for technical installations. The augmentations are chosen by a remote expert who is connected to the user over a network and receives a live stream of the scene.
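To illustrate the kind of planar tracking the abstract describes, the following is a minimal sketch, not the authors' implementation: it tracks an assumed planar region of the first camera frame across a live video stream by estimating a homography with OpenCV. The camera index, the hard-coded region of interest, and the use of ORB features with RANSAC-based findHomography are illustrative assumptions standing in for the paper's markerless tracker.

    # Hedged sketch: homography-based tracking of a planar region in a live stream.
    # This is NOT the authors' system; ORB features and cv2.findHomography are used
    # here only as a stand-in for markerless planar tracking.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)                      # live stream from the user's camera (assumed index 0)
    orb = cv2.ORB_create(1000)                     # feature detector/descriptor
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    ok, reference = cap.read()                     # first frame: the region to augment is taken from here
    if not ok:
        raise RuntimeError("could not read from camera")

    h, w = reference.shape[:2]
    x0, y0, x1, y1 = w // 4, h // 4, 3 * w // 4, 3 * h // 4   # assumed planar region (frame center)
    template = reference[y0:y1, x0:x1]
    kp_ref, des_ref = orb.detectAndCompute(template, None)
    corners_ref = np.float32([[0, 0], [x1 - x0, 0],
                              [x1 - x0, y1 - y0], [0, y1 - y0]]).reshape(-1, 1, 2)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        kp, des = orb.detectAndCompute(frame, None)
        if des is not None and des_ref is not None:
            matches = matcher.match(des_ref, des)
            if len(matches) >= 10:                 # need enough correspondences for a stable homography
                src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
                dst = np.float32([kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
                H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
                if H is not None:
                    # Project the template outline into the current frame; this is where
                    # an annotation chosen by the remote expert would be rendered.
                    outline = cv2.perspectiveTransform(corners_ref, H)
                    cv2.polylines(frame, [np.int32(outline)], True, (0, 255, 0), 2)
        cv2.imshow("tracked planar region", frame)
        if cv2.waitKey(1) & 0xFF == 27:            # ESC to quit
            break

    cap.release()
    cv2.destroyAllWindows()

In an actual remote-support setting, the rectangle drawn on the tracked region would be replaced by the augmentation selected by the remote expert, warped into the live view with the same homography.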



Paper Citation


in Harvard Style

Ladikos A., Benhimane S., Navab N. and Appel M. (2008). MODEL-FREE MARKERLESS TRACKING FOR REMOTE SUPPORT IN UNKNOWN ENVIRONMENTS. In Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP (VISIGRAPP 2008), ISBN 978-989-8111-21-0, pages 627-630. DOI: 10.5220/0001071206270630.


in BibTeX Style

@conference{visapp08,
author={Alexander Ladikos and Selim Benhimane and Nassir Navab and Mirko Appel},
title={MODEL-FREE MARKERLESS TRACKING FOR REMOTE SUPPORT IN UNKNOWN ENVIRONMENTS},
booktitle={Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2008)},
year={2008},
pages={627-630},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001071206270630},
isbn={978-989-8111-21-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the Third International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2008)
TI - MODEL-FREE MARKERLESS TRACKING FOR REMOTE SUPPORT IN UNKNOWN ENVIRONMENTS
SN - 978-989-8111-21-0
AU - Ladikos A.
AU - Benhimane S.
AU - Navab N.
AU - Appel M.
PY - 2008
SP - 627
EP - 630
DO - 10.5220/0001071206270630