Increasing accessibility to the blind of virtual environments, using a virtual mobility aid based on the "EyeCane": Feasibility study

Shachar Maidenbaum, Shelly Levy-Tzedek, Daniel Robert Chebat, Amir Amedi

Research output: Contribution to journal › Article › peer-review

65 Scopus citations

Abstract

Virtual worlds and environments are becoming an increasingly central part of our lives, yet they remain largely inaccessible to the blind. This is especially unfortunate because such environments hold great potential for blind users, for purposes such as social interaction and online education, and in particular for becoming familiar with a real environment virtually, from the comfort and safety of home, before visiting it in the real world. We have implemented a simple algorithm to improve this situation using single-point depth information, enabling blind users to operate a virtual cane, modeled on the "EyeCane" electronic travel aid, within any virtual environment with minimal preprocessing. Because the Virtual-EyeCane produces stimuli identical to those of the physical device, experience gained in the virtual environment can potentially transfer to later use in real-world environments. We show that users quickly learn to use this algorithm for practical navigation in simple environments.
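
To illustrate the idea described above, the sketch below shows one plausible way a single-point depth reading (a ray cast from the user's virtual hand into the scene) could be mapped to an auditory cue rate, mirroring the distance-to-beep-rate mapping of the physical EyeCane. All names, parameters, and the fixed-range values are illustrative assumptions, not the paper's actual implementation; in a real integration, `sample_depth` would call the host engine's physics raycast.

```python
# Hypothetical Virtual-EyeCane feedback loop (illustrative sketch only).

MAX_RANGE_M = 5.0        # assumed maximum sensing range of the virtual cane
MIN_INTERVAL_S = 0.05    # fastest beep rate (obstacle very close)
MAX_INTERVAL_S = 1.0     # slowest beep rate (obstacle at or beyond max range)


def depth_to_beep_interval(depth_m: float) -> float:
    """Map a single-point depth reading to a time interval between beeps.

    Closer obstacles produce faster beeping, loosely following the
    distance-to-auditory-rate mapping of the EyeCane travel aid.
    """
    clamped = max(0.0, min(depth_m, MAX_RANGE_M))
    fraction = clamped / MAX_RANGE_M
    return MIN_INTERVAL_S + fraction * (MAX_INTERVAL_S - MIN_INTERVAL_S)


def sample_depth(position, direction) -> float:
    """Stand-in for a single raycast into the virtual environment.

    A real integration would query the host engine (e.g. a physics raycast)
    and return the distance to the first object hit along `direction`.
    Returns a fixed value here so the sketch runs on its own.
    """
    return 2.5


if __name__ == "__main__":
    user_position = (0.0, 1.2, 0.0)       # hypothetical hand position, metres
    pointing_direction = (0.0, 0.0, 1.0)  # pointing straight ahead
    depth = sample_depth(user_position, pointing_direction)
    interval = depth_to_beep_interval(depth)
    print(f"obstacle at {depth:.2f} m -> beep every {interval:.2f} s")
```

Because only a single depth value per frame is needed, this kind of mapping can be layered onto essentially any virtual environment that exposes a ray or collision query, which is consistent with the abstract's claim of minimal preprocessing.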

Original language: English
Article number: 0072555
Journal: PLoS ONE
Volume: 8
Issue number: 8
DOIs
State: Published - 19 Aug 2013
Externally published: Yes

ASJC Scopus subject areas

  • General Biochemistry, Genetics and Molecular Biology
  • General Agricultural and Biological Sciences
  • General
