Are Videos or Text Better for Describing Attributes in Stated-Preference Surveys?

Patient. 2020 Aug;13(4):401-408. doi: 10.1007/s40271-020-00416-9.

Abstract

Background: In stated-preference research, the conventional approach to describing study attributes is through text, often supplemented with easy-to-understand graphics. More recently, researchers have begun to present attribute descriptions and content in video format. Some experts have expressed concern regarding the internalization and retention of information conveyed via video.

Objective: Our study aimed to compare respondents' understanding of attribute information provided via text versus video.

Methods: Potential respondents were randomized to receive either a text or a video version of the survey. In the text version, all content was provided as text accompanied by still graphics. In the video version, the text content was interspersed with four video clips that conveyed the same information as the text version. In both versions, 10 questions were embedded to assess respondents' understanding of the information presented about ovarian cancer treatments. Half of the questions concerned treatment benefits and half concerned treatment-related risks; some asked about the decision context and the definitions of treatment features, while others asked about the graphic presentation of those features. Preferences for ovarian cancer treatments were also compared between respondents who received the text and video versions.

Results: Overall, 150 respondents were recruited. Of the 95 who were eligible and completed the survey, 54 received the text version and 41 received the video version. The median time to completion was 24 min in the video arm and 30 min in the text arm (p < 0.01). On the first comprehension question, 43% of respondents in the text arm and 61% in the video arm answered correctly (p = 0.08). Although the mean number of correct responses was significantly higher in the video arm than in the text arm (9.1 vs. 8.6, p = 0.02), there were no systematic differences in preferences between arms.
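
As a purely illustrative aid to the between-arm comparisons above, the sketch below shows how such comparisons might be computed with standard tests in Python. The per-respondent scores are hypothetical placeholders rather than study data, the correct-response counts are approximate back-calculations from the reported percentages, and the tests shown are common choices, not necessarily those specified in the study's analysis plan.

    # Illustrative only: hypothetical scores stand in for the study's per-respondent
    # data, and these tests are common choices, not necessarily the authors' methods.
    from scipy import stats

    # Hypothetical numbers of correct answers (out of 10) per respondent, by arm.
    text_scores = [8, 9, 7, 10, 8, 9, 8, 9, 10, 8]
    video_scores = [9, 10, 9, 8, 10, 9, 9, 10, 9, 10]

    # Compare mean comprehension scores between arms (Welch two-sample t-test),
    # analogous to the 9.1 vs. 8.6 comparison reported in the abstract.
    t_stat, p_scores = stats.ttest_ind(video_scores, text_scores, equal_var=False)
    print(f"mean score comparison: t = {t_stat:.2f}, p = {p_scores:.3f}")

    # Compare the proportions answering the first comprehension question correctly
    # (about 61% of 41 vs. 43% of 54); counts are approximate reconstructions.
    video_correct, video_n = 25, 41
    text_correct, text_n = 23, 54
    table = [[video_correct, video_n - video_correct],
             [text_correct, text_n - text_correct]]
    odds_ratio, p_first_q = stats.fisher_exact(table)
    print(f"first-question comparison: OR = {odds_ratio:.2f}, p = {p_first_q:.3f}")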

Conclusions: The quality of stated-preference data relies on respondents' understanding of study content. Information provided via video may better engage survey participants and improve their retention of content.

Publication types

  • Randomized Controlled Trial
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Aged
  • Decision Making
  • Female
  • Humans
  • Middle Aged
  • Ovarian Neoplasms / therapy
  • Patient Education as Topic / methods*
  • Patient Preference / psychology*
  • Risk Assessment
  • Socioeconomic Factors
  • Surveys and Questionnaires
  • Text Messaging*
  • Time Factors
  • Video Recording*