The specificity of feature-based attentional guidance is equivalent under single- and dual-target search

J Exp Psychol Hum Percept Perform. 2023 Nov;49(11):1430-1446. doi: 10.1037/xhp0001157.

Abstract

Individuals actively maintain attentional templates to prioritize target-matching inputs. While previous work has established that multiple templates can be held simultaneously, current understanding of the representational quality of such templates remains limited. We therefore investigated: (a) whether the maintenance of two templates is limited to broad, coarse-grained representations, and if not, (b) whether the achievable level of specificity nonetheless declines when multiple attentional templates are held simultaneously. Using a spatial cueing procedure, we probed the breadth of attentional templates while participants maintained either one (Experiment 1) or two target colors (Experiment 2) under conditions of low- or high-similarity search, and found specific template maintenance during high-similarity search in both single- and dual-target conditions. We then directly compared template specificity during single- and dual-target maintenance in Experiment 3, probing at the point of differentiation between target and nontarget feature values observed during single-target search. Here we found no difference in the selectivity of cue validity effects between single- and dual-target search, suggesting equivalent template specificity regardless of whether one or two features are relevant to search. Lastly, in Experiment 4, we established that such template specificity depends on access to visual working memory.

MeSH terms

  • Attention*
  • Cues
  • Humans
  • Memory, Short-Term*
  • Reaction Time
  • Visual Perception