Sequential biases on subjective judgments: Evidence from face attractiveness and ringtone agreeableness judgment

PLoS One. 2018 Jun 11;13(6):e0198723. doi: 10.1371/journal.pone.0198723. eCollection 2018.

Abstract

When people make decisions about sequentially presented items in psychophysical experiments, their decisions are systematically biased by their preceding decisions and the preceding items, either by assimilation (a shift towards the previous decision or item) or by contrast (a shift away from it). Such sequential biases also occur in naturalistic, real-world judgments such as facial attractiveness judgments. In this article, we aimed to cast light on the causes of these sequential biases. We first found significant assimilative and contrastive effects in a visual face attractiveness judgment task and in an auditory ringtone agreeableness judgment task, indicating that sequential effects are not limited to the visual modality. We then found that providing trial-by-trial feedback on the preceding stimulus value eliminated the contrastive effect but only weakened the assimilative effect. When participants reported their judgments orally rather than indicating them via a keyboard button press, the assimilative effect was significantly diminished, suggesting that motor response repetition strengthens the assimilation bias. Finally, when visual and auditory stimuli were alternated, there was no longer a contrastive effect from the immediately preceding trial, but there was an assimilative effect from both the preceding trial (cross-modal) and the 2-back trial (same stimulus modality). These findings suggest that the contrastive effect results from perceptual processing, whereas the assimilative effect results from anchoring on the previous judgment and is strengthened by response repetition and numerical priming.
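
The record does not describe the paper's exact statistical models, but assimilative and contrastive sequential effects of this kind are commonly quantified by regressing the current rating on the previous stimulus and the previous response: a negative weight on the previous stimulus indicates contrast, and a positive weight on the previous response indicates assimilation. The minimal sketch below illustrates this on simulated data; the variable names, scale, and injected effect sizes are hypothetical and not taken from the study.

# Minimal sketch (not the paper's analysis pipeline): regress the current
# rating on the previous stimulus and the previous rating. A negative weight
# on the previous stimulus signals contrast; a positive weight on the
# previous rating signals assimilation. All values here are simulated.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 500

stimulus = rng.uniform(1, 7, size=n_trials)            # e.g. attractiveness on a 1-7 scale
rating = stimulus + rng.normal(0, 0.5, size=n_trials)  # noisy judgments

# Inject a known contrastive pull (away from the previous stimulus) and an
# assimilative pull (towards the previous rating) so the recovery is visible.
for t in range(1, n_trials):
    rating[t] += -0.15 * (stimulus[t - 1] - stimulus.mean())  # contrast
    rating[t] += 0.25 * (rating[t - 1] - rating.mean())       # assimilation

# Design matrix: current stimulus, previous stimulus, previous rating, intercept
X = np.column_stack([
    stimulus[1:],
    stimulus[:-1],
    rating[:-1],
    np.ones(n_trials - 1),
])
y = rating[1:]

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"previous-stimulus weight (contrast if < 0):   {beta[1]:+.3f}")
print(f"previous-rating weight (assimilation if > 0): {beta[2]:+.3f}")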

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adolescent
  • Adult
  • Face*
  • Female
  • Humans
  • Judgment*
  • Music*
  • Photic Stimulation
  • Visual Perception
  • Young Adult

Grants and funding

This research was supported by the National Natural Science Foundation of China (31671132, J1210024, J1310031), the Innovation Fund for College Students (20160110), and the MOE Project of Key Research Institute of Humanities and Social Sciences in Universities (15JJD190005).