A MULTI-CAMERA FRAMEWORK FOR INTERACTIVE VIDEO GAMES

Tom Cuypers, Cedric Vanaken, Yannick Francken, Frank Van Reeth, Philippe Bekaert

2008

Abstract

We present a framework that allows for straightforward development of multi-camera controlled interactive video games. Compared to traditional gaming input devices, cameras provide players with many degrees of freedom and a natural form of interaction. The use of cameras can even eliminate the need for special clothing or other tracking devices, which partly accounts for the success of currently popular single-camera video games such as the Sony EyeToy. However, these games are fairly limited in their use of 3D scene information. Multi-camera setups can overcome this limitation, but they typically involve many different image processing and computer vision techniques. Our framework decomposes multi-camera based games into basic algorithms that are easily combined into several sequentially executed stages, so the effort required to develop new games can be significantly reduced. The capabilities of our framework are demonstrated with a number of conceptual games, proving that multi-camera controlled video games can be created with off-the-shelf hardware.
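The abstract's central idea, decomposing a camera-driven game into basic vision algorithms chained as sequentially executed stages, can be illustrated with a minimal sketch. The names below (`run_pipeline`, the toy `background_subtraction` and `silhouette_count` stages, the dict-based frame data) are hypothetical stand-ins, not the paper's actual API; real stages would wrap algorithms such as background subtraction, silhouette extraction, or visual hull reconstruction.

```python
from typing import Callable, Dict, List, Any

# A stage is any function that transforms the shared per-frame data.
Stage = Callable[[Dict[str, Any]], Dict[str, Any]]

def run_pipeline(stages: List[Stage], frame_data: Dict[str, Any]) -> Dict[str, Any]:
    """Apply each processing stage in order, passing the result onward."""
    for stage in stages:
        frame_data = stage(frame_data)
    return frame_data

# Toy stages standing in for real computer vision algorithms.
def background_subtraction(d: Dict[str, Any]) -> Dict[str, Any]:
    # Keep only pixels brighter than a fixed threshold (crude foreground mask).
    d["foreground"] = [p for p in d["pixels"] if p > d["threshold"]]
    return d

def silhouette_count(d: Dict[str, Any]) -> Dict[str, Any]:
    # Summarize the foreground mask, e.g. as input to game logic.
    d["silhouette_size"] = len(d["foreground"])
    return d

result = run_pipeline(
    [background_subtraction, silhouette_count],
    {"pixels": [0, 10, 200, 255], "threshold": 128},
)
```

The point of such a structure is that new games only need to recombine existing stages rather than reimplement the underlying vision algorithms.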

References

  1. Balcisoy, S. and Thalmann, D. (1997). Interaction between real and virtual humans in augmented reality. In CA '97: Proceedings of the Computer Animation, page 31, Washington, DC, USA. IEEE Computer Society.
  2. Betke, M., Gips, J., and Fleming, P. (2002). The Camera Mouse: visual tracking of body features to provide computer access for people with severe disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 10(1):1-10.
  3. Chen, Y., Su, C., Chen, J., Chen, C., Hung, Y., and Fuh, C. (2001). Video-based eye tracking for autostereoscopic displays. Optical Engineering, 40:2726.
  4. Cheung, S. S. and Kamath, C. (2004). Robust techniques for background subtraction in urban traffic video. Proceedings of Video Communications and Image Processing, 5308:881-892.
  5. Crowley, J. L. (1997). Vision for man machine interaction. Robotics and Autonomous Systems, 19(3-4):347-359.
  6. Crowley, J. L., Coutaz, J., and Bérard, F. (2000). Things that see: Machine perception for human computer interaction. Communications of the ACM, 43(3):55-64.
  7. De Decker, B., Mertens, T., and Bekaert, P. (2007). Interactive collision detection for free-viewpoint video. In GRAPP 2007: Proceedings of the Second International Conference on Computer Graphics Theory and Applications, pages 114-120.
  8. Dubois, E. and Sabri, S. (1984). Noise reduction in image sequences using motion-compensated temporal filtering. IEEE Transactions on Communications, 32(7):826-831.
  9. Erol, A., Bebis, G., Boyle, R. D., and Nicolescu, M. (2005). Visual hull construction using adaptive sampling. In WACV/MOTION '05: Proceedings of the Seventh IEEE Workshops on Application of Computer Vision - Volume 1, pages 234-241, Washington, DC, USA. IEEE Computer Society.
  10. Eyetoy Sony (2003). http://www.eyetoy.com.
  11. Freeman, W. T., Anderson, D. B., Beardsley, P. A., Dodge, C. N., Roth, M., Weissman, C. D., Yerazunis, W. S., Kage, H., Kyuma, K., Miyake, Y., and Tanaka, K. (1998). Computer vision for interactive computer graphics. IEEE Comput. Graph. Appl., 18(3):42-53.
  12. Freeman, W. T., Tanaka, K., Ohta, J., and Kyuma, K. (1996). Computer vision for computer games. In FG '96: Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition, page 100, Washington, DC, USA. IEEE Computer Society.
  13. Gonzalez, R. C. and Woods, R. E. (1992). Digital Image Processing. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA.
  14. Gorodnichy, D., Malik, S., and Roth, G. (2002). Nouse "use your nose as a mouse": a new technology for hands-free games and interfaces. Proceedings of the International Conference on Vision Interface (VI2002), pages 354-361.
  15. Guitar Hero Sony (2005). http://www.guitarherogame.com.
  16. Isard, M. and Blake, A. (1998). Condensation - conditional density propagation for visual tracking. International Journal of Computer Vision, 29(1):5-28.
  17. Jaume, A., Varona, J., Gonzalez, M., Mas, R., and Perales, F. J. (2006). Automatic human body modeling for vision-based motion capture. WSCG.
  18. Kalman, R. (1960). A new approach to linear filtering and prediction problems. Transactions of the ASME, Journal of Basic Engineering, 82(Series D):35-45.
  19. Konami (1981). http://www.konami.com.
  20. Kyuma, K., Lange, E., Ohta, J., Hermanns, A., Banish, B., and Oita, M. (1994). Artificial retinas-fast, versatile image processors. Nature, 372(6502):197-198.
  21. Laurentini, A. (1994). The visual hull concept for silhouette-based image understanding. IEEE Trans. Pattern Anal. Mach. Intell., 16(2):150-162.
  22. Li, P., Zhang, T., and Pece, A. (2003). Visual contour tracking based on particle filters. Image Vision Comput., 21(1):111-123.
  23. Magnor, M. A. (2005). Video-Based Rendering. AK Peters Ltd.
  24. Martin, W. and Aggarwal, J. K. (1983). Volumetric descriptions of objects from multiple views. IEEE Transactions on Pattern Analysis and Machine Intelligence, 5(2):150-158.
  25. McIvor, A. M. (2000). Background subtraction techniques.
  26. Metaxas, D. and Terzopoulos, D. (1993). Shape and nonrigid motion estimation through physics-based synthesis. IEEE Trans. Pattern Anal. Mach. Intell., 15(6):580-591.
  27. Mishima, Y. (1993). Soft edge chroma-key generation based upon hexoctahedral color space. U.S. Patent 5,355,174.
  28. Mühlmann, K., Maier, D., Hesser, J., and Männer, R. (2002). Calculating dense disparity maps from color stereo images, an efficient implementation. Int. J. Comput. Vision, 47(1-3):79-88.
  29. Nintendo Gameboy (1998). http://www.gameboy.com.
  30. Nintendo Wii (2006). http://wii.com.
  31. Open Dynamics Engine (2001). http://www.ode.org/.
  32. Parent, R. (2001). Computer animation: algorithms and techniques. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA.
  33. Point Grey Research (2007). http://www.ptgrey.com.
  34. Rubio, J. M. B., López, F. J. P., Hidalgo, M. G., and Varona, X. (2006). Upper body tracking for interactive applications. AMDO.
  35. Scharstein, D. and Szeliski, R. (2002). A taxonomy and evaluation of dense two-frame stereo correspondence algorithms. Int. J. Comput. Vision, 47(1-3):7-42.
  36. Sega Saturn (1994). http://www.sega.com.
  37. Singstar Sony (2004). http://www.singstargame.com.
  38. Smith, A. R. and Blinn, J. F. (1996). Blue screen matting. In SIGGRAPH '96: Proceedings of the 23rd annual conference on Computer graphics and interactive techniques, pages 259-268, New York, NY, USA. ACM Press.
  39. Spacek, L. A. (1986). Edge detection and motion detection. Image Vision Computing, 4(1):43-56.
  40. Vezhnevets, V., Sazonov, V., and Andreeva, A. (2003). A survey on pixel-based skin color detection techniques. Graphics and Media Laboratory, Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow, Russia.
  41. Welman, C. (1993). Inverse kinematics and geometric constraints for articulated figure manipulation. Master's thesis, Simon Fraser University.
  42. Willey, M. (1997). Design and implementation of a stroke interface library.
  43. Woetzel, J. and Koch, R. (2004). Real-time multi-stereo depth estimation on gpu with approximative discontinuity. 1st European Conference on Visual Media Production (CVMP 2004), pages 245-254.
  44. Wren, C. R., Azarbayejani, A., Darrell, T., and Pentland, A. P. (1997). Pfinder: Real-time tracking of the human body. IEEE Trans. Pattern Anal. Mach. Intell., 19(7):780-785.
  45. Yang, R. and Pollefeys, M. (2003). Multi-resolution real-time stereo on commodity graphics hardware. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), volume 1, page 211.


Paper Citation


in Harvard Style

Cuypers T., Vanaken C., Francken Y., Van Reeth F. and Bekaert P. (2008). A MULTI-CAMERA FRAMEWORK FOR INTERACTIVE VIDEO GAMES. In Proceedings of the Third International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2008) ISBN 978-989-8111-20-3, pages 443-449. DOI: 10.5220/0001096904430449


in Bibtex Style

@conference{grapp08,
author={Tom Cuypers and Cedric Vanaken and Yannick Francken and Frank Van Reeth and Philippe Bekaert},
title={A MULTI-CAMERA FRAMEWORK FOR INTERACTIVE VIDEO GAMES},
booktitle={Proceedings of the Third International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2008)},
year={2008},
pages={443-449},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001096904430449},
isbn={978-989-8111-20-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the Third International Conference on Computer Graphics Theory and Applications - Volume 1: GRAPP, (VISIGRAPP 2008)
TI - A MULTI-CAMERA FRAMEWORK FOR INTERACTIVE VIDEO GAMES
SN - 978-989-8111-20-3
AU - Cuypers T.
AU - Vanaken C.
AU - Francken Y.
AU - Van Reeth F.
AU - Bekaert P.
PY - 2008
SP - 443
EP - 449
DO - 10.5220/0001096904430449