VISUAL HAND MOVEMENTS FOR MACHINE CONTROL

Sanjay Kumar, Dinesh Kant Kumar, Arun Sharma

2004

Abstract

A new technique for automated classification of human hand gestures for robotics and computer control applications is presented. It uses a view-based approach for representation and a statistical technique for classification. The approach uses a cumulative image-difference technique in which the timing between successive images is implicitly captured in the representation of the action, resulting in the construction of Temporal History Templates (THTs). These THTs are used to compute the seven Hu image moments, which are invariant to scale, rotation and translation. The recognition criterion is established using the K-nearest neighbor (K-NN) Mahalanobis distance. Preliminary experiments show that such a system can classify human hand gestures with an accuracy of 92%. Our research has been carried out within a robotics framework. The overall goal is to test the accuracy of hand-gesture recognition using this computationally inexpensive, dimensionality-reduced representation of gestures and to assess its suitability for robotics.
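The pipeline described above (cumulative frame differencing into a Temporal History Template, the seven Hu moments as a compact descriptor, and K-NN classification with the Mahalanobis distance) can be sketched as follows. This is a minimal illustration assuming grayscale frames and OpenCV/NumPy; the difference threshold, the recency weighting, the log-scaling of the Hu moments and all function names are assumptions made for illustration, not the authors' implementation.

```python
# Sketch of the THT + Hu-moment + K-NN/Mahalanobis pipeline (illustrative only).
import numpy as np
import cv2


def temporal_history_template(frames, diff_threshold=30):
    """Accumulate thresholded frame differences so that more recent motion
    receives a higher intensity, giving a simple temporal-history image."""
    tht = np.zeros(frames[0].shape, dtype=np.float32)
    for t in range(1, len(frames)):
        diff = cv2.absdiff(frames[t], frames[t - 1])
        moving = diff > diff_threshold
        tht[moving] = t          # weight by frame index: recency is encoded implicitly
    return tht / max(len(frames) - 1, 1)   # normalise to [0, 1]


def hu_features(tht):
    """Seven Hu moments of the template; log-scaling (an assumption here)
    is a common way to tame their dynamic range."""
    hu = cv2.HuMoments(cv2.moments(tht)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)


def knn_mahalanobis(train_X, train_y, x, k=3):
    """K-nearest-neighbor vote using the Mahalanobis distance computed from
    the covariance of the training feature vectors."""
    train_y = np.asarray(train_y)
    cov = np.cov(train_X, rowvar=False) + 1e-6 * np.eye(train_X.shape[1])
    inv_cov = np.linalg.inv(cov)
    d = train_X - x
    dists = np.sqrt(np.einsum('ij,jk,ik->i', d, inv_cov, d))
    nearest = train_y[np.argsort(dists)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]
```

In use, each training gesture sequence would be reduced to a single Hu-moment vector of its THT, and a new sequence would be classified by passing its vector to the K-NN routine above.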



Paper Citation


in Harvard Style

Kumar S., Kumar D. and Sharma A. (2004). VISUAL HAND MOVEMENTS FOR MACHINE CONTROL. In Proceedings of the First International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO, ISBN 972-8865-12-0, pages 215-220. DOI: 10.5220/0001147102150220


in Bibtex Style

@conference{icinco04,
author={Sanjay Kumar and Dinesh Kant Kumar and Arun Sharma},
title={VISUAL HAND MOVEMENTS FOR MACHINE CONTROL},
booktitle={Proceedings of the First International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO},
year={2004},
pages={215-220},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001147102150220},
isbn={972-8865-12-0},
}


in EndNote Style

TY - CONF
JO - Proceedings of the First International Conference on Informatics in Control, Automation and Robotics - Volume 2: ICINCO
TI - VISUAL HAND MOVEMENTS FOR MACHINE CONTROL
SN - 972-8865-12-0
AU - Kumar S.
AU - Kumar D.
AU - Sharma A.
PY - 2004
SP - 215
EP - 220
DO - 10.5220/0001147102150220