
Paper

Authors: Sergey Lobov 1 ; Nadia Krilova 1 ; Innokentiy Kastalskiy 1 ; Victor Kazantsev 1 and Valeri Makarov 2

Affiliations: 1 Lobachevsky State University of Nizhny Novgorod, Russian Federation ; 2 Lobachevsky State University of Nizhny Novgorod, Russian Federation, and Instituto de Matemática Interdisciplinar, Applied Mathematics Dept., Universidad Complutense de Madrid, Spain

ISBN: 978-989-758-204-2

Keyword(s): Electromyography, Human-Computer Interface, Pattern Classification, Artificial Neural Networks.

Related Ontology Subjects/Areas/Topics: Artificial Intelligence ; Biomedical Engineering ; Biomedical Signal Processing ; Data Manipulation ; EMG Signal Processing and Applications ; Health Engineering and Technology Applications ; Human-Computer Interaction ; Methodologies and Methods ; Neural Rehabilitation ; Neurocomputing ; NeuroSensing and Diagnosis ; Neurotechnology, Electronics and Informatics ; Pattern Recognition ; Physiological Computing Systems ; Sensor Networks ; Soft Computing ; Users’ Perception and Experience on Technologies

Abstract: Surface electromyographic (sEMG) signals represent a superposition of the motor unit action potentials that can be recorded by electrodes placed on the skin. Here we explore the use of an easily wearable sEMG bracelet for remote interaction with a computer by means of hand gestures. We propose a human-computer interface that allows simulating "mouse" clicks by separate gestures and provides proportional control with two degrees of freedom for flexible movement of a cursor on a computer screen. We use an artificial neural network (ANN) for processing sEMG signals and gesture recognition, both for mouse clicks and gradual cursor movements. At the beginning the ANN goes through an optimized supervised learning using either rigid or fuzzy class separation. In both cases the learning is fast enough and requires neither special measurement devices nor specific knowledge from the end-user. Thus, the approach enables the building of low-budget, user-friendly sEMG solutions. The interface was tested on twelve healthy subjects. All of them were able to control the cursor and simulate mouse clicks. The collected data show that at the beginning users may have difficulties that are reduced with experience, and cursor movement by hand gestures becomes smoother, similar to manipulation with a computer mouse.
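The pipeline the abstract describes — windowed sEMG from a multi-channel bracelet, amplitude features, an ANN classifier for gestures, and a cursor speed proportional to muscle activation — can be sketched as below. This is a minimal illustration, not the authors' implementation: the channel count, window length, network sizes, and the RMS feature choice are all assumptions, and the network here is untrained.

```python
import numpy as np

def rms_features(window):
    """Root-mean-square amplitude per channel for one sEMG window.
    window: (n_samples, n_channels) array of raw sEMG samples."""
    return np.sqrt(np.mean(window ** 2, axis=0))

class GestureANN:
    """Hypothetical one-hidden-layer perceptron standing in for the paper's
    ANN (architecture and sizes are assumptions, not from the paper)."""
    def __init__(self, n_in=8, n_hidden=16, n_out=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def predict_proba(self, x):
        # Forward pass: tanh hidden layer, softmax output over gesture classes.
        h = np.tanh(x @ self.W1 + self.b1)
        z = h @ self.W2 + self.b2
        e = np.exp(z - z.max())
        return e / e.sum()

# Simulated 200-sample window from a hypothetical 8-channel bracelet.
rng = np.random.default_rng(1)
window = rng.normal(0.0, 0.5, (200, 8))

feats = rms_features(window)           # one feature per channel
probs = GestureANN().predict_proba(feats)
gesture = int(np.argmax(probs))        # discrete command (e.g. a "click")

# Proportional control: cursor speed scales with overall activation level,
# so a stronger contraction moves the cursor faster in the decoded direction.
speed = feats.mean()
```

In the paper's scheme, discrete gestures trigger click commands while the graded sEMG amplitude drives the two-degree-of-freedom cursor velocity; in this sketch those two roles correspond to `gesture` and `speed`.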


Paper citation in several formats:
Lobov S., Krilova N., Kastalskiy I., Kazantsev V. and Makarov V. (2016). A Human-Computer Interface based on Electromyography Command-Proportional Control. In Proceedings of the 4th International Congress on Neurotechnology, Electronics and Informatics - Volume 1: NEUROTECHNIX, ISBN 978-989-758-204-2, pages 57-64. DOI: 10.5220/0006033300570064

@conference{neurotechnix16,
author={Sergey Lobov and Nadia Krilova and Innokentiy Kastalskiy and Victor Kazantsev and Valeri Makarov},
title={A Human-Computer Interface based on Electromyography Command-Proportional Control},
booktitle={Proceedings of the 4th International Congress on Neurotechnology, Electronics and Informatics - Volume 1: NEUROTECHNIX},
year={2016},
pages={57-64},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006033300570064},
isbn={978-989-758-204-2},
}

TY - CONF

JO - Proceedings of the 4th International Congress on Neurotechnology, Electronics and Informatics - Volume 1: NEUROTECHNIX
TI - A Human-Computer Interface based on Electromyography Command-Proportional Control
SN - 978-989-758-204-2
AU - Lobov S.
AU - Krilova N.
AU - Kastalskiy I.
AU - Kazantsev V.
AU - Makarov V.
PY - 2016
SP - 57
EP - 64
DO - 10.5220/0006033300570064
