Virtual Touch Screen “VIRTOS” - Implementing Virtual Touch Buttons and Virtual Sliders using a Projector and Camera

Takashi Homma, Katsuto Nakajima

2014

Abstract

We propose a large interactive display with virtual touch buttons and sliders on a pale-colored flat wall. Our easy-to-install system consists of a front projector and a single commodity camera. A button touch is detected based on the area of the shadow cast by the user's hand; this shadow becomes very small when the button is touched. The shadow area is segmented by briefly changing the button to a different color when a large foreground (i.e., the hand and its shadow) covers the button region. Therefore, no time-consuming operations, such as morphing or shape analysis, are required. Background subtraction is used to extract the foreground region. The reference image for the background is continuously adjusted to match the ambient light. Our virtual slider is based on this touch-button mechanism. When tested, our scheme proved robust to differences in illumination. The response time for touch detection was about 150 ms. Our virtual slider has a quick response and proved suitable as a controller for a Breakout-style game.
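The core idea of the abstract can be sketched in a few lines: background subtraction against an adaptive reference isolates the foreground over the button region, and the size of the shadow portion of that foreground decides between hovering and touching. The sketch below is a simplified illustration, not the paper's implementation: the thresholds (`fg_thresh`, `cover_ratio`, `shadow_ratio`) are hypothetical, and shadow pixels are approximated as "foreground darker than the reference" rather than segmented via the paper's brief button-color change.

```python
import numpy as np

def touch_state(frame, reference, fg_thresh=30, cover_ratio=0.5, shadow_ratio=0.05):
    """Classify a virtual button region as 'idle', 'hover', or 'touched'.

    frame, reference: grayscale uint8 arrays covering the button region only.
    All thresholds are illustrative assumptions, not values from the paper.
    """
    f = frame.astype(int)
    r = reference.astype(int)
    # Background subtraction: foreground pixels differ strongly from the reference.
    fg = np.abs(f - r) > fg_thresh
    # Crude shadow test: foreground pixels that are darker than the reference.
    shadow = fg & (f < r - fg_thresh)
    area = frame.size
    if fg.sum() < cover_ratio * area:
        return "idle"      # no large foreground over the button yet
    # A hand pressed against the wall casts almost no shadow inside the region.
    return "touched" if shadow.sum() < shadow_ratio * area else "hover"
```

In practice the reference image would be updated continuously (e.g., a running average over frames where the button is idle) so the subtraction stays valid as ambient light drifts, mirroring the adaptive reference described in the abstract.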



Paper Citation


in Harvard Style

Homma, T. and Nakajima, K. (2014). Virtual Touch Screen "VIRTOS" - Implementing Virtual Touch Buttons and Virtual Sliders using a Projector and Camera. In Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2014), ISBN 978-989-758-009-3, pages 34-43. DOI: 10.5220/0004728600340043


in Bibtex Style

@conference{visapp14,
author={Takashi Homma and Katsuto Nakajima},
title={Virtual Touch Screen “VIRTOS” - Implementing Virtual Touch Buttons and Virtual Sliders using a Projector and Camera},
booktitle={Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2014)},
year={2014},
pages={34-43},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004728600340043},
isbn={978-989-758-009-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 3: VISAPP, (VISIGRAPP 2014)
TI - Virtual Touch Screen “VIRTOS” - Implementing Virtual Touch Buttons and Virtual Sliders using a Projector and Camera
SN - 978-989-758-009-3
AU - Homma T.
AU - Nakajima K.
PY - 2014
SP - 34
EP - 43
DO - 10.5220/0004728600340043