PERCEPT Indoor Wayfinding for Blind and Visually Impaired Users: Navigation Instructions Algorithm and Validation Framework

Yang Tao, Linlin Ding, Sili Wang, Aura Ganz

Abstract

This paper introduces an algorithm that generates indoor navigation instructions for Blind and Visually Impaired (BVI) users, as well as a framework that validates these instructions. The validation framework, which utilizes the Unity game engine, incorporates: a) a virtual environment that mirrors the physical environment, and b) a game avatar that traverses this virtual environment following action code sequences corresponding to the navigation instructions. A navigation instruction from a source to a destination is correct if the game avatar, following this instruction, successfully travels from the source to the destination in the virtual environment. We successfully tested the navigation instruction generation algorithm and the validation framework in a large two-story building with 66 landmarks and 1500 navigation instructions. To the best of our knowledge, this is the first automated navigation instruction generation algorithm for BVI users in indoor environments, and the first validation framework for indoor navigation instructions. This paper is a significant step towards the development of a cost-effective indoor wayfinding solution for BVI users.
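The validation idea described above can be sketched in a few lines: an avatar replays an action-code sequence in a model of the environment, and the instruction is deemed correct if and only if the avatar ends at the intended destination. The grid map, the action codes (`FORWARD`, `TURN_LEFT`, `TURN_RIGHT`), and the step size below are illustrative assumptions, not the paper's actual Unity implementation.

```python
# Headings as (dx, dy) unit vectors: north, east, south, west.
HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]

def validate_instruction(start, heading_idx, actions, destination, walls):
    """Replay an action-code sequence from `start`; return True if the
    avatar reaches `destination` without walking into a wall cell."""
    x, y = start
    h = heading_idx
    for action in actions:
        if action == "TURN_LEFT":
            h = (h - 1) % 4
        elif action == "TURN_RIGHT":
            h = (h + 1) % 4
        elif action == "FORWARD":
            dx, dy = HEADINGS[h]
            nxt = (x + dx, y + dy)
            if nxt in walls:   # avatar is blocked: the instruction fails
                return False
            x, y = nxt
    return (x, y) == destination

# Example: two cells north, turn right, one cell east.
actions = ["FORWARD", "FORWARD", "TURN_RIGHT", "FORWARD"]
print(validate_instruction((0, 0), 0, actions, (1, 2), walls=set()))  # True
```

The Unity framework replaces this toy grid with a 3D mirror of the building, but the acceptance criterion is the same: replay the action codes and check arrival at the destination landmark.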



Paper Citation


in Harvard Style

Tao Y., Ding L., Wang S. and Ganz A. (2017). PERCEPT Indoor Wayfinding for Blind and Visually Impaired Users: Navigation Instructions Algorithm and Validation Framework. In Proceedings of the 3rd International Conference on Information and Communication Technologies for Ageing Well and e-Health - Volume 1: ICT4AWE, ISBN 978-989-758-251-6, pages 143-149. DOI: 10.5220/0006312001430149


in Bibtex Style

@conference{ict4awe17,
author={Yang Tao and Linlin Ding and Sili Wang and Aura Ganz},
title={PERCEPT Indoor Wayfinding for Blind and Visually Impaired Users: Navigation Instructions Algorithm and Validation Framework},
booktitle={Proceedings of the 3rd International Conference on Information and Communication Technologies for Ageing Well and e-Health - Volume 1: ICT4AWE},
year={2017},
pages={143-149},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0006312001430149},
isbn={978-989-758-251-6},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 3rd International Conference on Information and Communication Technologies for Ageing Well and e-Health - Volume 1: ICT4AWE
TI - PERCEPT Indoor Wayfinding for Blind and Visually Impaired Users: Navigation Instructions Algorithm and Validation Framework
SN - 978-989-758-251-6
AU - Tao Y.
AU - Ding L.
AU - Wang S.
AU - Ganz A.
PY - 2017
SP - 143
EP - 149
DO - 10.5220/0006312001430149