3D Gaze Estimation using Eye Vergence

Esteban Gutierrez Mlot, Hamed Bahmani, Siegfried Wahl, Enkelejda Kasneci


We propose a fast and robust method to estimate the 3D gaze position based on the eye vergence information extracted from eye-tracking data. This method is specifically designed for Point-of-Regard (PoR) estimation in non-virtual environments, with the aim of making it applicable to the study of human visual attention deployment in natural scenarios. Our approach starts with a calibration step at different depth distances in order to achieve the best depth approximation. In addition, we investigate the distance range for which state-of-the-art eye-tracking technology allows 3D gaze estimation based on eye vergence. Our method provides a mean accuracy of 1.2° at a working distance between 200 mm and 400 mm from the user, without requiring calibrated lights or cameras.
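The core idea of vergence-based PoR estimation can be illustrated with a minimal sketch (this is not the authors' implementation): given each eye's 3D position and gaze direction from a binocular eye tracker, the two gaze rays generally do not intersect exactly, so the PoR is approximated as the midpoint of the shortest segment between them, using the standard closest-points construction for two 3D lines. The function name, argument conventions, and millimetre units below are assumptions for illustration.

```python
import numpy as np

def vergence_por(origin_l, dir_l, origin_r, dir_r, eps=1e-9):
    """Approximate the 3D Point-of-Regard as the midpoint of the
    shortest segment between the left and right gaze rays.
    Origins and directions are 3-vectors (e.g. in mm); directions
    need not be unit length. Returns None if the rays are (nearly)
    parallel, i.e. no vergence depth can be recovered."""
    u = np.asarray(dir_l, float)
    v = np.asarray(dir_r, float)
    p0 = np.asarray(origin_l, float)
    q0 = np.asarray(origin_r, float)
    w0 = p0 - q0
    a, b, c = u @ u, u @ v, v @ v
    d, e = u @ w0, v @ w0
    denom = a * c - b * b           # zero when the rays are parallel
    if abs(denom) < eps:
        return None
    s = (b * e - c * d) / denom     # parameter along the left gaze ray
    t = (a * e - b * d) / denom     # parameter along the right gaze ray
    p_l = p0 + s * u                # closest point on the left ray
    p_r = q0 + t * v                # closest point on the right ray
    return 0.5 * (p_l + p_r)       # midpoint = estimated PoR
```

For example, with eye centres 64 mm apart and both gaze rays aimed at a point 300 mm ahead, the estimate recovers that point exactly; in practice the rays are skew and the midpoint gives a robust compromise.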



Paper Citation

in Harvard Style

Gutierrez Mlot E., Bahmani H., Wahl S. and Kasneci E. (2016). 3D Gaze Estimation using Eye Vergence. In Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies - Volume 5: HEALTHINF, (BIOSTEC 2016), ISBN 978-989-758-170-0, pages 125-131. DOI: 10.5220/0005821201250131

in Bibtex Style

@conference{gutierrezmlot2016gaze,
author={Esteban Gutierrez Mlot and Hamed Bahmani and Siegfried Wahl and Enkelejda Kasneci},
title={3D Gaze Estimation using Eye Vergence},
booktitle={Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies - Volume 5: HEALTHINF, (BIOSTEC 2016)},
year={2016},
pages={125-131},
doi={10.5220/0005821201250131},
isbn={978-989-758-170-0},
}

in EndNote Style

TY - CONF
JO - Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies - Volume 5: HEALTHINF, (BIOSTEC 2016)
TI - 3D Gaze Estimation using Eye Vergence
SN - 978-989-758-170-0
AU - Gutierrez Mlot E.
AU - Bahmani H.
AU - Wahl S.
AU - Kasneci E.
PY - 2016
SP - 125
EP - 131
DO - 10.5220/0005821201250131
ER -