Visual Echolocation Concept for the Colorophone Sensory Substitution Device Using Virtual Reality

Sensors (Basel). 2021 Jan 1;21(1):237. doi: 10.3390/s21010237.

Abstract

Perceiving the characteristics of 3D scenes is considered one of the biggest challenges for visually impaired people, yet this ability is crucial for orientation and navigation in the natural environment. Although there are several Electronic Travel Aids aiming at enhancing orientation and mobility for the blind, only a few of them convey both 2D and 3D information, including colour. Moreover, existing devices either focus on a small part of an image or allow interpretation of only a few points in the field of view. Here, we propose a concept of visual echolocation with integrated colour sonification as an extension of Colorophone, an assistive device for visually impaired people. The concept aims at mimicking the process of echolocation and thus provides 2D, 3D and, additionally, colour information about the whole scene. Even though the final implementation will be realised with a 3D camera, it is first simulated, as a proof of concept, using VIRCO, a Virtual Reality training and evaluation system for Colorophone. The first experiments showed that it is possible to sonify the colour and distance of the whole scene, which opens up the possibility of implementing the developed algorithm on a hardware-based stereo camera platform. An introductory user evaluation of the system was conducted to assess the effectiveness of the proposed solution for perceiving the distance, position and colour of objects placed in Virtual Reality.
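To make the echolocation-style, whole-scene sonification described above more concrete, the sketch below illustrates one possible mapping, assuming a per-pixel depth map and an RGB image of the scene are available: distance is converted into echo delay and loudness, horizontal position into stereo panning, and hue into pitch. This is a hypothetical illustration only, not the published Colorophone/VIRCO algorithm; the specific constants and the hue-to-frequency mapping are arbitrary choices for demonstration.

```python
# Illustrative sketch only: a hypothetical whole-scene sonification mapping,
# not the algorithm published for Colorophone/VIRCO.
import colorsys
import numpy as np

SAMPLE_RATE = 44100   # audio sample rate in Hz
ECHO_SPEED = 343.0    # speed of sound, used to turn distance into echo delay
MAX_DISTANCE = 5.0    # distances beyond this are treated as silent

def sonify_strip(rgb, depth, pan, duration=0.15):
    """Render one vertical image strip as a short stereo 'echo'.

    rgb   -- mean (r, g, b) of the strip, values in [0, 1]
    depth -- mean distance of the strip in metres
    pan   -- horizontal position in [-1 (left), +1 (right)]
    """
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    hue, _, value = colorsys.rgb_to_hsv(*rgb)
    freq = 220.0 * 2 ** (2 * hue)                        # hue -> pitch (arbitrary mapping)
    amp = value * max(0.0, 1.0 - depth / MAX_DISTANCE)   # nearer and brighter -> louder
    tone = amp * np.sin(2 * np.pi * freq * t) * np.hanning(t.size)
    delay = int(SAMPLE_RATE * 2 * depth / ECHO_SPEED)    # round-trip echo delay in samples
    left = np.concatenate([np.zeros(delay), tone * (1 - pan) / 2])
    right = np.concatenate([np.zeros(delay), tone * (1 + pan) / 2])
    return np.stack([left, right], axis=1)

def sonify_scene(rgb_image, depth_map, n_strips=16):
    """Sweep the scene from left to right, emitting one echo per vertical strip."""
    h, w, _ = rgb_image.shape
    chunks = []
    for i in range(n_strips):
        sl = slice(i * w // n_strips, (i + 1) * w // n_strips)
        rgb = tuple(rgb_image[:, sl].reshape(-1, 3).mean(axis=0))
        depth = float(depth_map[:, sl].mean())
        pan = 2 * (i + 0.5) / n_strips - 1
        chunks.append(sonify_strip(rgb, depth, pan))
    # pad all echoes to equal length, then concatenate into one left-to-right sweep
    n = max(c.shape[0] for c in chunks)
    padded = [np.vstack([c, np.zeros((n - c.shape[0], 2))]) for c in chunks]
    return np.concatenate(padded, axis=0)
```

In such a scheme, the depth map would come from the VR renderer during VIRCO-based simulation and, in a later hardware implementation, from the 3D or stereo camera; the sonification stage itself would remain unchanged.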

Keywords: 3D camera; 3D scene sonification; auditory SSD; colour sonification; distance sonification; stereo vision; virtual reality.

MeSH terms

  • Animals
  • Blindness
  • Echolocation*
  • Humans
  • Male
  • Self-Help Devices
  • Virtual Reality*
  • Visually Impaired Persons