The nature of anchor-biased estimates and its application to the wisdom of crowds

Cognition. 2024 May:246:105758. doi: 10.1016/j.cognition.2024.105758. Epub 2024 Mar 4.

Abstract

We propose a method to achieve better wisdom of crowds by utilizing anchoring effects. In this method, people first make a comparative judgment such as "Is the number of new COVID-19 infections one month later more or less than 10 (or 200,000)?" As in this example, two sufficiently different anchors (e.g., "10" and "200,000") are used in the comparative judgment. After this comparative judgment, people make their own estimates, which are then aggregated. We hypothesized that estimates aggregated with this method would be more accurate than those made without anchor presentation. To examine the method's effectiveness, we conducted three studies: a computer simulation and two behavioral experiments (numerical estimation of perceptual stimuli and physicians' estimation of new COVID-19 infections). Through the computer simulation, we identified situations in which the proposed method is effective. Although it is not always effective (e.g., when a group can already make fairly accurate estimates), on average it is more likely to achieve better wisdom of crowds. In particular, when a group cannot estimate accurately (i.e., shows biases such as systematic overestimation or underestimation), the method improves the wisdom of crowds. The behavioral experiments were consistent with the simulation findings: the proposed method achieved better wisdom of crowds. We discuss new insights into anchoring effects and methods for inducing diverse opinions from group members.
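The intuition behind the simulation result can be sketched in a few lines of code. The model below is illustrative only: the anchoring weight, the crowd's bias factor, and the lognormal noise are assumptions for this sketch, not the paper's actual parameters. Half of a biased crowd is pulled toward a low anchor and half toward a high anchor, and the aggregated (mean) estimate is compared with and without anchoring:

```python
import random
import statistics

random.seed(0)

TRUE_VALUE = 100_000                   # hypothetical quantity to be estimated
LOW_ANCHOR, HIGH_ANCHOR = 10, 200_000  # the two anchors from the example question
ANCHOR_WEIGHT = 0.3                    # assumed strength of the anchoring pull
N = 1_000                              # crowd size

# Assume the group systematically underestimates (bias factor 0.5)
# with multiplicative lognormal noise around the biased value.
base = [TRUE_VALUE * 0.5 * random.lognormvariate(0, 0.4) for _ in range(N)]

# Half the crowd sees the low anchor, half the high anchor; each estimate
# is modeled as a linear pull of the unanchored estimate toward its anchor.
anchored = [
    (1 - ANCHOR_WEIGHT) * b
    + ANCHOR_WEIGHT * (LOW_ANCHOR if i % 2 == 0 else HIGH_ANCHOR)
    for i, b in enumerate(base)
]

err_plain = abs(statistics.mean(base) - TRUE_VALUE)
err_anchored = abs(statistics.mean(anchored) - TRUE_VALUE)
print(f"error without anchors: {err_plain:,.0f}")
print(f"error with anchors:    {err_anchored:,.0f}")
```

Under these assumptions, the midpoint of the two anchors lies near the true value, so pulling a biased crowd toward the anchors moves the aggregate closer to the truth; with an unbiased crowd the same pull can hurt, matching the abstract's caveat that the method is not always effective.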

Keywords: Anchoring effect; Diverse response; Numerical estimation; Rationality; Wisdom of crowds.

MeSH terms

  • COVID-19*
  • Computer Simulation
  • Crowding
  • Humans
  • Judgment*