Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI): A methodology for assessing complex intervention implementability

Implement Res Pract. 2020 Sep 21;1:2633489520932924. doi: 10.1177/2633489520932924. eCollection 2020 Jan-Dec.

Abstract

Background: Most evidence-based practices in mental health are complex psychosocial interventions, but little research has focused on assessing and addressing the characteristics of these interventions, such as design quality and packaging, that serve as intra-intervention determinants (i.e., barriers and facilitators) of implementation outcomes. Usability, the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction, is a key indicator of design quality. Drawing from the field of human-centered design, this article presents a novel methodology for evaluating the usability of complex psychosocial interventions and describes an example "use case" application to an exposure protocol for the treatment of anxiety disorders with one user group.

Method: The Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI) methodology comprises four steps: (1) identify users for testing; (2) define and prioritize EBPI components (i.e., tasks and packaging); (3) plan and conduct the evaluation; and (4) organize and prioritize usability issues. In the example, clinicians were selected for testing from among the identified user groups of the exposure protocol (e.g., clients, system administrators). Clinicians with differing levels of experience with exposure therapies (novice, n = 3; intermediate, n = 4; advanced, n = 3) were sampled. The usability evaluation included Intervention Usability Scale (IUS) ratings and individual user testing sessions with clinicians, as well as heuristic evaluations conducted by design experts. After testing, discrete usability issues were organized within the User Action Framework (UAF) and prioritized via independent ratings (1-3 scale) by members of the research team.
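To make Step 4 concrete, the minimal Python sketch below shows one way discrete usability issues could be tagged with a UAF phase and ranked by the mean of independent 1-3 priority ratings. It is a hypothetical illustration, not part of the published protocol; the issue records, phase labels, and rater scores are assumptions invented for the example.

```python
from statistics import mean

# Hypothetical sketch of USE-EBPI Step 4: each usability issue is tagged with a
# UAF interaction-cycle phase and rated 1-3 for priority by independent raters.
# Issue IDs, phase labels, and ratings below are illustrative, not study data.
issues = [
    {"id": "U01", "uaf_phase": "planning",    "ratings": [3, 3, 2]},
    {"id": "U02", "uaf_phase": "translation", "ratings": [1, 2, 1]},
    {"id": "U03", "uaf_phase": "assessment",  "ratings": [2, 3, 3]},
]

# Priority = mean of the independent 1-3 ratings; sort so the highest-priority
# issues surface first for redesign discussion.
for issue in sorted(issues, key=lambda i: mean(i["ratings"]), reverse=True):
    print(issue["id"], issue["uaf_phase"], round(mean(issue["ratings"]), 2))
```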

Results: The mean IUS rating (80.5, SD = 9.56, on a 100-point scale) indicated good usability, with room for improvement. Mean ratings for novice and intermediate participants were comparable (77.5), with higher ratings among advanced users (87.5). Heuristic evaluations suggested similar usability (mean overall rating = 7.33, SD = 0.58, on a 10-point scale). Testing with individual users revealed 13 distinct usability issues, which spanned all four phases of the UAF and a range of priority levels.

Conclusion: Findings from the current study suggest that USE-EBPI is useful for evaluating the usability of complex psychosocial interventions and for informing subsequent intervention redesign (in the context of broader development frameworks) to enhance implementation. Future research goals are discussed, including applying USE-EBPI to a broader range of interventions and user groups (e.g., clients).

Plain language abstract: Characteristics of evidence-based psychosocial interventions (EBPIs) that affect the extent to which they can be implemented in real-world mental health service settings have received far less attention than the characteristics of the individuals (e.g., clinicians) or settings (e.g., community mental health centers) involved in EBPI implementation. No methods exist to evaluate the usability of EBPIs, which can be a critical barrier to, or facilitator of, implementation success. The current article describes a new method, the Usability Evaluation for Evidence-Based Psychosocial Interventions (USE-EBPI), which uses techniques drawn from the field of human-centered design to evaluate EBPI usability. An example application to an intervention protocol for anxiety problems among adults illustrates the value of the new approach.

Keywords: Human-centered design; complex psychosocial interventions; evidence-based psychosocial interventions; usability; user-centered design.