Colorophone 2.0: A Wearable Color Sonification Device Generating Live Stereo-Soundscapes – Design, Implementation, and Usability Audit

Sensors (Basel). 2021 Nov 5;21(21):7351. doi: 10.3390/s21217351.

Abstract

The successful development of a system realizing color sonification would enable auditory representation of the visual environment. The primary beneficiaries of such a system would be people who cannot directly access visual information: the visually impaired community. Despite the plethora of sensory substitution devices, developing systems that provide intuitive color sonification remains a challenge. This paper presents the design considerations, development, and usability audit of a sensory substitution device that converts spatial color information into soundscapes. The implemented wearable system uses a dedicated color space and continuously generates natural, spatialized sounds based on information acquired from a camera. We developed two head-mounted prototype devices and two versions of the graphical user interface (GUI). The first GUI is dedicated to researchers, and the second was designed to be easily accessible to visually impaired persons. Finally, we ran fundamental usability tests to evaluate the new spatial color sonification algorithm and to compare the two prototypes. Furthermore, we propose recommendations for the development of the next iteration of the system.
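To make the camera-to-soundscape idea concrete, the sketch below (in Python) maps the dominant color of the left and right halves of a camera frame to two tones panned to the corresponding stereo channels. This is not the Colorophone algorithm; the abstract does not specify the dedicated color space or sound synthesis, so the hue-to-frequency mapping, the half-frame spatialization, and all constants here are illustrative assumptions only.

import colorsys
import numpy as np

SAMPLE_RATE = 44100      # audio sample rate in Hz (assumed)
FRAME_DURATION = 0.1     # seconds of audio generated per camera frame (assumed)

def hue_to_frequency(rgb):
    """Map an average RGB color to a tone frequency (illustrative mapping)."""
    r, g, b = rgb
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return 220.0 + h * 660.0  # 220-880 Hz range, purely illustrative

def sonify_frame(frame):
    """Convert an HxWx3 uint8 camera frame into a short stereo buffer."""
    h, w, _ = frame.shape
    left_rgb = frame[:, : w // 2].reshape(-1, 3).mean(axis=0)
    right_rgb = frame[:, w // 2 :].reshape(-1, 3).mean(axis=0)

    t = np.arange(int(SAMPLE_RATE * FRAME_DURATION)) / SAMPLE_RATE
    left = 0.5 * np.sin(2 * np.pi * hue_to_frequency(left_rgb) * t)
    right = 0.5 * np.sin(2 * np.pi * hue_to_frequency(right_rgb) * t)
    return np.stack([left, right], axis=1)  # shape: (samples, 2)

if __name__ == "__main__":
    # Synthetic test frame: red on the left half, blue on the right half.
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    frame[:, :320, 0] = 255
    frame[:, 320:, 2] = 255
    buffer = sonify_frame(frame)
    print(buffer.shape)  # (4410, 2) stereo samples for one frame

In a live system, sonify_frame would run continuously on frames from the head-mounted camera and the resulting buffers would be streamed to headphones; the actual device described in the paper uses a dedicated color space and natural, spatialized sounds rather than the simple sine tones assumed here.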

Keywords: Colorophone; assistive device; color sonification; human–computer interaction; multimodal perception; sensory substitution; wearable device.

MeSH terms

  • Algorithms
  • Humans
  • Sound
  • Visually Impaired Persons*
  • Wearable Electronic Devices*