
 
 
Figure 5: The SUS score reported by the users. 
quickly learn how to activate the new interface and
efficiently use it to operate a PC. Our future research
goals include: (1) extending the usability study to
severely disabled users; (2) testing MindDesktop’s
performance and user satisfaction against competing
solutions; and (3) exploring different keyboard layouts
to increase user satisfaction and maximize efficiency.
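The SUS score reported in Figure 5 follows the standard System Usability Scale computation (Brooke, 1996): ten 5-point Likert items, with odd-numbered (positively worded) items contributing their rating minus 1 and even-numbered (negatively worded) items contributing 5 minus their rating, the sum then scaled by 2.5 to a 0–100 range. A minimal sketch of that scoring rule (the function name and input layout here are illustrative, not part of MindDesktop):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (Brooke, 1996).

    responses: ten Likert ratings, each 1-5, in questionnaire order.
    Returns a score between 0 and 100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten ratings in the range 1-5")
    # Items 1, 3, 5, 7, 9 (index 0, 2, ...) are positively worded: rating - 1.
    # Items 2, 4, 6, 8, 10 (index 1, 3, ...) are negatively worded: 5 - rating.
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5  # scale the 0-40 raw sum to 0-100


# Example: a maximally positive response pattern yields the top score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```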
REFERENCES 
Barreto, A. B., Scargle, S. D. and Adjouadi, M., 1999. A
real-time assistive computer interface for users with 
motor disabilities. ACM SIGCAPH, 64, 6–16. 
Betke, M., Gips, J. and Fleming, P., 2002. The camera
mouse: visual tracking of body features to provide 
computer access for people with severe disabilities. 
IEEE TNSRE, 10, 1–10. 
Bradski, G. R., 1998. Computer vision face tracking as a
component of a perceptual user interface. IEEE 
WACV, 214–219. 
Brooke, J., 1996. SUS: a "quick and dirty" usability scale.
In P. W. Jordan, B. Thomas, B. A. Weerdmeester, and 
A. L. McClelland, editors, Usability Evaluation in 
Industry. Taylor and Francis, London. 
Doherty, E., Stephenson, G. and Engel, W., 2000. Using a 
cyberlink mental interface for relaxation and 
controlling a robot. ACM SIGCAPH, 68, 4–9. 
Emotiv. Brain computer interface technology. http://www.emotiv.com/.
Felzer, T. and Freisleben, B., 2002. BrainLink: A software
tool supporting the development of an EEG-based 
brain-computer interface. METMBS ’03, 329–335. 
Goncharova, I., McFarland, D., Vaughan, T. and Wolpaw, 
J., 2003. EMG contamination of EEG: spectral and 
topographical characteristics. Clinical
Neurophysiology, 114, 9, 1580–1593.
Jacob, R. J. K., 1991. The use of eye movements in 
human-computer interaction techniques: what you 
look at is what you get. ACM TOIS, 9, 2, 152–169. 
Lileg, E., Wiesspeiner, G. and Hutten, H., 1999. 
Evaluation of the EOG for communication through 
eye movements. ECEM 10. 
Markand, O. N., 1976. Electroencephalogram in locked-in
syndrome. Electroencephalography and Clinical
Neurophysiology, 40, 5, 529–534. 
Matsumoto, Y., Ino, T. and Ogasawara, T., 2001.
Development of intelligent wheelchair system with 
face and gaze based interface. In Proceedings of the 
10th IEEE International Workshop on Robot-Human 
Interactive Communication, 262–267. 
Mauri, C., Granollers, T., Lores, J. M. and Garcia, M.,
2006. Computer vision interaction for people with 
severe movement restrictions. Human Technology, 2, 
1, 38–54. 
Neuper, C., Müller, G. R., Kübler, A., Birbaumer, N. and
Pfurtscheller, G., 2003. Clinical application of an 
EEG-based brain-computer interface: a case study in a 
patient with severe motor impairment. Clinical 
Neurophysiology, 114, 3, 399–409. 
Obrenovic, Z., Abascal, J. and Starcevic, D., 2007. 
Universal accessibility as a multimodal design issue. 
Communications of the ACM, 50, 5, 83–88.
http://www.ise.bgu.ac.il/faculty/liorr/MindDesktop1.zip 
(password: bgulab). 
Majaranta, P. and Räihä, K.-J., 2002. Twenty years of eye
typing: systems and design issues. ACM ETRA ’02,
15–22. 
Plotkin, A., Sela, L., Weissbrod, A., Kahana, R., Haviv, 
L., Yeshurun, Y., Soroker, N. and Sobel, N., 2010. 
Sniffing enables communication and environmental 
control for the severely disabled. PNAS, 107, 32, 
14413–14418. 
Smith, E. and Delargy, M., 2005. Locked-in syndrome. 
BMJ, 330, 406–409. 
Varona, J., Manresa-Yee, M. and Perales, F. J., 2008. 
Hands-free vision-based interface for computer 
accessibility. JNCA, 31, 4, 357–374. 
Wolpaw, J. R., Birbaumer, N., McFarland, D. J.,
Pfurtscheller, G. and Vaughan, T. M., 2002. Brain-computer
interfaces for communication and control. Clinical 
Neurophysiology, 113, 767–791. 
ICEIS 2011 - 13th International Conference on Enterprise Information Systems