Authors: Hiromasa Yoshimoto and Yuichi Nakamura
Affiliation: Kyoto University, Japan
Keyword(s): Cooperative Recognition, Gesture Interface, Usability
Related Ontology Subjects/Areas/Topics: Applications; Computer Vision, Visualization and Computer Graphics; Image Understanding; Pattern Recognition; Software Engineering; Virtual Environments
Abstract:
This paper introduces a novel gesture-interface scheme that guides the user toward better recognition performance and usability. The accuracy of gesture recognition is heavily affected by how the user makes postures and movements, as well as by environmental conditions such as lighting. The usability of a gesture interface can therefore be improved by notifying the user of when and how better accuracy can be obtained. For this purpose, we propose a method for estimating the performance of gesture recognition under the current conditions, and a method for suggesting to the user possible ways to improve performance. In performance estimation, accuracy under the current conditions is estimated by supervised learning from a large number of samples and corresponding ground truths. If the estimated accuracy is insufficient, the module searches for better conditions that can be reached with the user's cooperation. If a worthwhile improvement is possible, the way to achieve it is communicated to the user through visual feedback, which shows how to avoid or recover from the undesirable condition. In this way, users benefit, i.e., obtain better accuracy and usability, by cooperating with the gesture interface.
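The estimate-then-suggest loop in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the estimator here is a toy linear score standing in for the supervised model trained on samples and ground truths, and the condition features, threshold, and candidate adjustments are all hypothetical names introduced for the example.

```python
# Hedged sketch of the cooperative-recognition loop: estimate accuracy
# under current conditions, and if it is insufficient, search for a
# user-reachable adjustment to suggest as visual feedback.

ACCURACY_THRESHOLD = 0.9  # illustrative sufficiency threshold


def estimate_accuracy(condition):
    """Stand-in for the learned performance estimator.

    A toy linear score over lighting and hand visibility replaces the
    paper's supervised model learned from samples with ground truths.
    """
    return 0.5 * condition["lighting"] + 0.5 * condition["hand_visibility"]


# Candidate adjustments reachable with the user's cooperation (illustrative).
CANDIDATE_ADJUSTMENTS = {
    "move toward the light": {"lighting": 0.3},
    "raise your hand": {"hand_visibility": 0.3},
}


def suggest_improvement(condition):
    """Return feedback advice if accuracy is insufficient, else None."""
    current = estimate_accuracy(condition)
    if current >= ACCURACY_THRESHOLD:
        return None  # recognition already reliable; no feedback needed
    best_advice, best_gain = None, 0.0
    for advice, delta in CANDIDATE_ADJUSTMENTS.items():
        # Simulate the condition after the user follows this advice.
        adjusted = {k: min(1.0, v + delta.get(k, 0.0))
                    for k, v in condition.items()}
        gain = estimate_accuracy(adjusted) - current
        if gain > best_gain:
            best_advice, best_gain = advice, gain
    return best_advice


# Example: poor lighting but good hand visibility.
print(suggest_improvement({"lighting": 0.4, "hand_visibility": 0.9}))
# → move toward the light
```

In a real system, the feedback would be rendered visually (e.g., an on-screen arrow or silhouette) rather than returned as a string, and the search would cover a learned space of reachable conditions rather than a fixed list.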