STUDIES ON VISUAL PERCEPTION FOR PERCEPTUAL ROBOTICS

Özer Ciftcioglu, Michael S. Bittermann, I. Sevil Sariyildiz

2006

Abstract

Studies on human visual perception measurement for perceptual robotics are described. Visual perception is mathematically modelled as a probabilistic process that obtains and interprets visual data from an environment. The measurement involves visual openness perception in virtual reality, which has direct implications for the navigation of actual autonomous robots. Perception is quantified by means of a mapping function that converts a distance into an elemental perception estimate. The measurement is carried out by averaging the elemental perceptions in real time, which is accomplished by means of exponential averaging. The mapping function parameters are optimized uniquely by means of a genetic algorithm, where the data set for model development consists of a number of perception data samples. These are obtained from individuals who are confronted with a number of scenes and asked for their perceptual openness statements. Based on these data, a perception model is developed for a virtual robot, where the simulated vision interaction of the robot with the environment is converted into a visual openness estimate through the model output. The model outcome constitutes essential visual information for the navigation of an autonomous perceptual robot.
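To give a concrete picture of the computation described above, the sketch below (plain Python, not taken from the paper) maps simulated distance readings to elemental openness estimates through a sigmoid-shaped mapping function and smooths them with exponential averaging. The function names, the sigmoid form, and the parameters a, b and alpha are illustrative assumptions, not the paper's actual model; in the paper, the mapping function parameters are the quantities fitted by the genetic algorithm against the human perception statements.

import math
import random

# Hypothetical mapping from a measured distance to an elemental
# openness estimate in [0, 1]. Parameters a (slope) and b (offset)
# stand in for the mapping-function parameters that the paper
# optimizes with a genetic algorithm.
def elemental_openness(distance, a=0.5, b=5.0):
    return 1.0 / (1.0 + math.exp(-a * (distance - b)))

# Real-time measurement: exponential averaging of the elemental
# perceptions obtained along successive vision rays / time steps.
def exponential_average(estimates, alpha=0.1):
    avg = None
    for e in estimates:
        avg = e if avg is None else alpha * e + (1.0 - alpha) * avg
    return avg

if __name__ == "__main__":
    # Simulated distances returned by the virtual robot's vision rays.
    distances = [random.uniform(1.0, 20.0) for _ in range(100)]
    openness = exponential_average(elemental_openness(d) for d in distances)
    print(f"visual openness estimate: {openness:.3f}")

A genetic algorithm would then search over the mapping parameters (here a and b) so that the averaged estimates best match the openness statements collected from the human subjects.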

References

  1. Beetz, M. et al., 2001. Integrated, plan-based control of autonomous robots in human environments. In IEEE Intelligent Systems. September-October, pp. 2-11.
  2. Ciftcioglu, Ö., Bittermann, M.S. and Sariyildiz, I.S., 2006. Application of a visual perception model in virtual reality. In Proc. APGV06, Symposium on Applied Perception in Graphics and Visualization, ACM SIGGRAPH. July 28-30, Boston, USA.
  3. Florczyk, S., 2005. Robot Vision: Video-based Indoor Exploration with Autonomous and Mobile Robot, Wiley.
  4. Oriolo, G., Ulivi, G. and Vendittelli, M., 1998. Real-time map building and navigation for autonomous robots in unknown environments. In IEEE Trans. Syst., Man, Cybern. - Part B: Cybernetics. 28:3, pp. 316-333.
  5. Song, G.B., Cho, S.B., 2000. Combining incrementally evolved neural networks based on cellular automata for complex adaptive behaviours. ECNN2000, IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks. May 11-13, San Antonio, TX, USA, pp. 121-129.
  6. Surmann, H., Lingemann, K., Nüchter, A. and Hertzberg, J., 2001. A 3D laser range finder for autonomous mobile robots. In Proc. 32nd Intl. Symp. on Robotics (ISR2001). April 19-21, Seoul, Korea, pp. 153-158.
  7. Wang, M., Liu, J. N. K., 2004. On-line path searching for autonomous robot navigation. In Proc. IEEE Conf. on Robotics, Automation and Mechatronics, Singapore, December 1-2, pp. 746-751.


Paper Citation


in Harvard Style

Ciftcioglu Ö., Bittermann M. and Sariyildiz I. (2006). STUDIES ON VISUAL PERCEPTION FOR PERCEPTUAL ROBOTICS. In Proceedings of the Third International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO, ISBN 978-972-8865-60-3, pages 352-359. DOI: 10.5220/0001221603520359


in BibTeX Style

@conference{icinco06,
author={Özer Ciftcioglu and Michael S. Bittermann and I. Sevil Sariyildiz},
title={STUDIES ON VISUAL PERCEPTION FOR PERCEPTUAL ROBOTICS},
booktitle={Proceedings of the Third International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO},
year={2006},
pages={352-359},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001221603520359},
isbn={978-972-8865-60-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the Third International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO
TI - STUDIES ON VISUAL PERCEPTION FOR PERCEPTUAL ROBOTICS
SN - 978-972-8865-60-3
AU - Ciftcioglu Ö.
AU - Bittermann M.
AU - Sariyildiz I.
PY - 2006
SP - 352
EP - 359
DO - 10.5220/0001221603520359