Braille Vision Using Braille Display and Bio-inspired Camera

Roman Graf, Ross King, Ahmed Nabil Belbachir

2014

Abstract

This paper presents a system for Braille learning support using real-time panoramic views generated by the novel smart panorama camera 360SCAN. The system makes use of modern image processing libraries and state-of-the-art feature extraction and clustering methods. We compare real-time frames recorded by the bio-inspired camera with reference images in order to identify particular figures. One contribution of the proposed method is that image edges can be transferred directly to the Braille display representation without any additional image processing; this is possible due to the bio-inspired construction of the camera sensor. Another contribution is that our approach provides Braille users with images recorded from natural scenes. We conducted several experiments to verify the methods, demonstrating the recognition of learning figures captured by the smart camera. Our goal is to process such images and present them on the Braille display in a form appropriate for visually impaired people. All evaluations were performed in a natural environment with ambient illumination of 200 lux, which demonstrates high camera reliability under difficult lighting conditions. The system can be optimized by applying additional filters and feature algorithms and by decreasing the rotational speed of the camera. The presented Braille learning support system is a building block for a rich, high-quality educational system for efficient information transfer to visually impaired people.
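The abstract describes matching real-time camera frames against reference images via feature extraction in order to recognize figures. The paper does not publish its implementation; the sketch below is a minimal illustration in Python, assuming OpenCV with SIFT descriptors and a nearest-neighbour ratio test as one possible realization. The function names (count_good_matches, recognize_figure), the file names, and the ratio threshold are illustrative assumptions, not the authors' code.

# Sketch 1: recognizing a figure by matching a frame against reference images.
import cv2

def count_good_matches(frame_gray, ref_gray, ratio=0.75):
    # Extract SIFT keypoints and descriptors from both grayscale images.
    sift = cv2.SIFT_create()
    _, des_frame = sift.detectAndCompute(frame_gray, None)
    _, des_ref = sift.detectAndCompute(ref_gray, None)
    if des_frame is None or des_ref is None:
        return 0
    # Brute-force k-NN matching followed by a distance-ratio test to keep
    # only distinctive correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(des_frame, des_ref, k=2)
    return sum(1 for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance)

def recognize_figure(frame_gray, references):
    # references: dict mapping figure name -> grayscale reference image.
    scores = {name: count_good_matches(frame_gray, ref) for name, ref in references.items()}
    if not scores:
        return None, 0
    best = max(scores, key=scores.get)
    return best, scores[best]

# Hypothetical usage with illustrative file names:
frame = cv2.imread("panorama_frame.png", cv2.IMREAD_GRAYSCALE)
references = {
    "circle": cv2.imread("ref_circle.png", cv2.IMREAD_GRAYSCALE),
    "square": cv2.imread("ref_square.png", cv2.IMREAD_GRAYSCALE),
}
print(recognize_figure(frame, references))

The claim that image edges can be transferred directly to the Braille display can likewise be illustrated by downsampling a binary edge map onto a grid of tactile pins. The pin-matrix dimensions (60 x 120) and the coverage threshold below are assumed values, not the specification of the display used in the paper.

# Sketch 2: mapping a binary edge map onto an assumed pin-matrix tactile display.
import numpy as np

def edges_to_pin_matrix(edge_map, pin_rows=60, pin_cols=120, threshold=0.25):
    # edge_map: 2-D binary array (1 = edge pixel). A pin is raised when the
    # corresponding image block contains a sufficient fraction of edge pixels.
    rows = np.array_split(np.arange(edge_map.shape[0]), pin_rows)
    cols = np.array_split(np.arange(edge_map.shape[1]), pin_cols)
    pins = np.zeros((pin_rows, pin_cols), dtype=np.uint8)
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            block = edge_map[np.ix_(r, c)]
            if block.size and block.mean() >= threshold:
                pins[i, j] = 1
    return pins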



Paper Citation


in Harvard Style

Graf R., King R. and Belbachir A. (2014). Braille Vision Using Braille Display and Bio-inspired Camera. In Proceedings of the 6th International Conference on Computer Supported Education - Volume 3: CSEDU, ISBN 978-989-758-022-2, pages 214-219. DOI: 10.5220/0004949302140219


in Bibtex Style

@conference{csedu14,
author={Roman Graf and Ross King and Ahmed Nabil Belbachir},
title={Braille Vision Using Braille Display and Bio-inspired Camera},
booktitle={Proceedings of the 6th International Conference on Computer Supported Education - Volume 3: CSEDU},
year={2014},
pages={214-219},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004949302140219},
isbn={978-989-758-022-2},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 6th International Conference on Computer Supported Education - Volume 3: CSEDU
TI - Braille Vision Using Braille Display and Bio-inspired Camera
SN - 978-989-758-022-2
AU - Graf R.
AU - King R.
AU - Belbachir A.
PY - 2014
SP - 214
EP - 219
DO - 10.5220/0004949302140219