TY - GEN
T1 - The transfer of non-visual spatial knowledge between real and virtual mazes via sensory substitution
AU - Chebat, Daniel Robert
AU - Maidenbaum, Shachar
AU - Amedi, Amir
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2017/8/10
Y1 - 2017/8/10
N2 - Many attempts are being made to ease spatial learning and navigation for people who are blind. One promising approach is the use of virtual environments for safe and versatile training. While it is known that humans can transfer non-visual spatial knowledge between real and virtual environments, previous studies have typically been limited to simple environments, blindfolded sighted participants, and different methods of sensory input for the real and virtual environments. In this study, participants with a wide range of visual experience use the EyeCane and Virtual EyeCane to solve complex Hebb-Williams mazes in real and virtual environments. The EyeCane and its virtual counterpart are minimalistic sensory substitution devices that code single-point distance information into sound. We test whether participants who first solve a real maze subsequently improve their performance in the corresponding virtual maze (the real-to-virtual sequence), and whether participants who first solve a virtual maze subsequently improve their performance in the corresponding real maze (the virtual-to-real sequence). We find that participants can use sensory-substitution-guided navigation to extract spatial information from the virtual world and apply it to significantly improve their behavioral performance in the real world, and vice versa. Our results demonstrate transfer in both directions, strengthening and extending the existing literature in terms of complexity, parameters, input-matching and varying levels of visual experience.
AB - Many attempts are being made to ease spatial learning and navigation for people who are blind. One promising approach is the use of virtual environments for safe and versatile training. While it is known that humans can transfer non-visual spatial knowledge between real and virtual environments, previous studies have typically been limited to simple environments, blindfolded sighted participants, and different methods of sensory input for the real and virtual environments. In this study, participants with a wide range of visual experience use the EyeCane and Virtual EyeCane to solve complex Hebb-Williams mazes in real and virtual environments. The EyeCane and its virtual counterpart are minimalistic sensory substitution devices that code single-point distance information into sound. We test whether participants who first solve a real maze subsequently improve their performance in the corresponding virtual maze (the real-to-virtual sequence), and whether participants who first solve a virtual maze subsequently improve their performance in the corresponding real maze (the virtual-to-real sequence). We find that participants can use sensory-substitution-guided navigation to extract spatial information from the virtual world and apply it to significantly improve their behavioral performance in the real world, and vice versa. Our results demonstrate transfer in both directions, strengthening and extending the existing literature in terms of complexity, parameters, input-matching and varying levels of visual experience.
KW - Acquired Blindness
KW - Congenital Blindness
KW - Environmental Rehabilitation
KW - Low Vision
KW - Maze Learning
KW - Perceptual Learning
KW - Sensory Substitution
KW - Spatial Knowledge
KW - Virtual Reality
KW - Visual Rehabilitation
KW - Assistive Technology
KW - Blind
UR - http://www.scopus.com/inward/record.url?scp=85034243902&partnerID=8YFLogxK
U2 - 10.1109/ICVR.2017.8007542
DO - 10.1109/ICVR.2017.8007542
M3 - Conference contribution
AN - SCOPUS:85034243902
T3 - International Conference on Virtual Rehabilitation, ICVR
BT - 2017 International Conference on Virtual Rehabilitation, ICVR 2017
PB - Institute of Electrical and Electronics Engineers
T2 - 2017 International Conference on Virtual Rehabilitation, ICVR 2017
Y2 - 19 June 2017 through 22 June 2017
ER -