Perceptual learning evidence for supramodal representation of stimulus orientation at a conceptual level

Vision Res. 2021 Oct;187:120-128. doi: 10.1016/j.visres.2021.06.010. Epub 2021 Jul 9.

Abstract

When stimulus inputs from different senses are integrated to form a coherent percept, inputs from a more precise sense typically dominate those from a less precise sense. However, we hypothesized that some basic stimulus features, such as orientation, can be represented supramodally at a conceptual level, independent of the precision of the original modality. This hypothesis was tested with perceptual learning experiments. Specifically, participants practiced a coarser tactile orientation discrimination task, which initially had little impact on finer visual orientation discrimination (tactile vs. visual orientation thresholds = 3:1). However, when participants also practiced a functionally orthogonal visual contrast discrimination task in a double-training design, their visual orientation performance improved at both tactile-trained and untrained orientations, as much as it would have through direct visual orientation training. This complete tactile-to-visual learning transfer is consistent with a conceptual supramodal representation of orientation that is unconstrained by the precision of the original modality, likely achieved through some form of input standardization. Moreover, when this conceptual supramodal representation is improved through perceptual learning in one sense, it can in turn facilitate orientation discrimination in an untrained sense.
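For readers unfamiliar with how discrimination thresholds such as the 3:1 tactile-to-visual ratio above are typically estimated, the following is a minimal sketch of a standard 3-down-1-up adaptive staircase, which converges on roughly the 79%-correct point of the psychometric function. The observer model, noise parameters (sigma), and step sizes are illustrative assumptions for demonstration only, not values or methods taken from the study.

```python
import math
import random
import statistics

def simulate_observer(delta_theta, sigma):
    """Hypothetical 2AFC observer: probability correct rises from chance
    (0.5) toward 1 as the orientation difference delta_theta grows
    relative to the internal noise sigma."""
    p_correct = 1.0 - 0.5 * math.exp(-(delta_theta / sigma) ** 2)
    return random.random() < p_correct

def staircase_threshold(sigma, start=10.0, step=0.5, n_reversals=12):
    """3-down-1-up staircase; threshold = mean of the later reversals."""
    delta, correct_run, direction = start, 0, 0
    reversals = []
    while len(reversals) < n_reversals:
        if simulate_observer(delta, sigma):
            correct_run += 1
            if correct_run == 3:           # 3 correct in a row -> harder
                correct_run = 0
                if direction == +1:        # up-to-down turn = reversal
                    reversals.append(delta)
                direction = -1
                delta = max(delta - step, step)
        else:                              # any error -> easier
            correct_run = 0
            if direction == -1:            # down-to-up turn = reversal
                reversals.append(delta)
            direction = +1
            delta += step
    return statistics.mean(reversals[len(reversals) // 2:])

random.seed(1)
visual = staircase_threshold(sigma=2.0)   # assumed finer visual noise
tactile = staircase_threshold(sigma=6.0)  # assumed ~3x coarser tactile noise
print(f"visual ~{visual:.1f} deg, tactile ~{tactile:.1f} deg, "
      f"ratio ~{tactile / visual:.1f}:1")
```

With these assumed noise levels, the simulated tactile threshold comes out roughly three times the visual threshold, mirroring the ratio reported in the abstract.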

Keywords: Double training; Multisensory perception; Orientation; Perceptual learning; Supramodal concept.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Humans
  • Learning*
  • Orientation*
  • Touch
  • Transfer, Psychology
  • Visual Perception