Biologically Inspired Robot Navigation by Exploiting Optical Flow Patterns

Sotirios Ch. Diamantas, Anastasios Oikonomidis, Richard M. Crowder


In this paper, a novel biologically inspired method is presented for the robot homing problem, in which a robot returns to its home position after exploring an a priori unknown environment. The method exploits the optical flow patterns of landmarks: based on a training data set, a probability is inferred between the current snapshot and the snapshots stored in memory. Optical flow, which is a property of camera motion rather than of landmark attributes such as color, shape, and size, is used to navigate the robot back to its home position. Moreover, optical flow is the only information provided to the system; parameters such as the position and velocity of the robot remain unknown. Our method proves effective even when the snapshots of the landmarks have been taken from varying distances and velocities.
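Flow fields of the kind the abstract relies on are typically obtained with a differential method such as Lucas-Kanade. The following NumPy-only estimator is a minimal illustrative sketch of that technique on two synthetic frames, not a reproduction of the authors' actual pipeline; the function name, window size, and test images are assumptions made here for illustration.

```python
import numpy as np

def lucas_kanade_flow(img0, img1, x, y, win=7):
    """Estimate the optical flow (u, v) at pixel (x, y) between two
    grayscale frames using the Lucas-Kanade least-squares method."""
    Iy, Ix = np.gradient(img0.astype(float))       # spatial gradients
    It = img1.astype(float) - img0.astype(float)   # temporal gradient
    r = win // 2
    sl = (slice(y - r, y + r + 1), slice(x - r, x + r + 1))
    # Stack the brightness-constancy equations Ix*u + Iy*v = -It for
    # every pixel in the window and solve in the least-squares sense.
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic test: a Gaussian blob shifted one pixel to the right,
# so the recovered flow at the blob centre should be roughly (1, 0).
yy, xx = np.mgrid[0:64, 0:64]
img0 = np.exp(-((xx - 30) ** 2 + (yy - 32) ** 2) / 50.0)
img1 = np.exp(-((xx - 31) ** 2 + (yy - 32) ** 2) / 50.0)
u, v = lucas_kanade_flow(img0, img1, 30, 32)
```

Note that the estimated flow depends only on the image pair, not on any known robot pose or velocity, which is consistent with the paper's premise that optical flow is the sole input to the homing system.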



Paper Citation

in Harvard Style

Diamantas, S. C., Oikonomidis, A. and Crowder, R. M. (2011). Biologically Inspired Robot Navigation by Exploiting Optical Flow Patterns. In Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP (VISIGRAPP 2011), ISBN 978-989-8425-47-8, pages 645-652. DOI: 10.5220/0003377706450652
