EYE AND GAZE TRACKING ALGORITHM FOR COLLABORATIVE LEARNING SYSTEM

Djamel Merad, Stephanie Metz, Serge Miguet

2006

Abstract

Our work focuses on the interdisciplinary field of detailed analysis of behaviors exhibited by individuals during sessions of distributed collaboration. With a particular focus on ergonomics, we propose new mechanisms to be integrated into existing tools to increase productivity in distributed learning and working. Our technique records ocular movements (eye tracking) to analyze various scenarios of distributed collaboration in the context of computer-based training. In this article, we present a low-cost oculometric device capable of making ocular measurements without interfering with the natural behavior of the subject. We expect that this device could be employed anywhere a natural, non-intrusive method of observation is required, and its low cost permits it to be readily integrated into existing popular tools, particularly e-learning campuses.
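The abstract names the technique (recording ocular movements with a low-cost device) but does not detail the detection algorithm itself. As a purely illustrative sketch, and not the authors' method, the following Python/OpenCV snippet shows one common way such a device can locate the pupil in a cropped grayscale eye image: dark-blob thresholding followed by a centroid computation. The function name locate_pupil, the fixed threshold value of 40, the blur kernel size, and the input file name eye.png are assumptions made for the example.

import cv2

def locate_pupil(eye_gray):
    # Smooth the eye region to suppress sensor noise and eyelash edges.
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)

    # The pupil is usually the darkest, roughly circular blob in the
    # eye region: keep only pixels below an (assumed) intensity of 40.
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)

    # Take the largest dark connected component as the pupil candidate.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)

    # Return the blob centroid as a sub-pixel pupil-centre estimate.
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

if __name__ == "__main__":
    # "eye.png" is a hypothetical cropped eye image for the example.
    eye = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
    print(locate_pupil(eye))

In a complete gaze tracker, the recovered pupil centre would then be mapped to screen or scene coordinates through a per-user calibration step.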



Paper Citation


in Harvard Style

Merad D., Metz S. and Miguet S. (2006). EYE AND GAZE TRACKING ALGORITHM FOR COLLABORATIVE LEARNING SYSTEM. In Proceedings of the Third International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO, ISBN 978-972-8865-60-3, pages 326-333. DOI: 10.5220/0001207503260333


in BibTeX Style

@conference{icinco06,
author={Djamel Merad and Stephanie Metz and Serge Miguet},
title={EYE AND GAZE TRACKING ALGORITHM FOR COLLABORATIVE LEARNING SYSTEM},
booktitle={Proceedings of the Third International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO},
year={2006},
pages={326-333},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001207503260333},
isbn={978-972-8865-60-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the Third International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO
TI - EYE AND GAZE TRACKING ALGORITHM FOR COLLABORATIVE LEARNING SYSTEM
SN - 978-972-8865-60-3
AU - Merad D.
AU - Metz S.
AU - Miguet S.
PY - 2006
SP - 326
EP - 333
DO - 10.5220/0001207503260333