Exploring the artificial intelligence "Trust paradox": Evidence from a survey experiment in the United States

PLoS One. 2023 Jul 18;18(7):e0288109. doi: 10.1371/journal.pone.0288109. eCollection 2023.

Abstract

Advances in Artificial Intelligence (AI) are poised to transform society, national defense, and the economy by increasing efficiency, precision, and safety. Yet widespread adoption within society depends on public trust and willingness to use AI-enabled technologies. In this study, we propose the possibility of an AI "trust paradox," in which individuals' willingness to use AI-enabled technologies exceeds their level of trust in these capabilities. We conduct a two-part study to explore the trust paradox. First, we conduct a conjoint analysis, varying different attributes of AI-enabled technologies across several domains, including armed drones, general surgery, police surveillance, self-driving cars, and social media content moderation, to evaluate whether and under what conditions a trust paradox may exist. Second, we use causal mediation analysis in the context of a second survey experiment to help explain why individuals use AI-enabled technologies that they do not trust. We find strong support for the trust paradox, particularly in the area of AI-enabled police surveillance, where support for its use is both higher than in other domains and significantly exceeds trust. We unpack these findings to show that several underlying beliefs help account for public attitudes of support, including the fear of missing out, optimism that future versions of the technology will be more trustworthy, a belief that the benefits of AI-enabled technologies outweigh the risks, and a calculation that AI-enabled technologies yield efficiency gains. Our findings have important implications for the integration of AI-enabled technologies in multiple settings.

MeSH terms

  • Artificial Intelligence*
  • Autonomous Vehicles
  • Fear
  • Humans
  • Mediation Analysis
  • Trust*
  • United States

Grants and funding

The author(s) received no specific funding for this work.