Effects of a demand-led evidence briefing service on the uptake and use of research evidence by commissioners of health services: a controlled before-and-after study

Review
Southampton (UK): NIHR Journals Library; 2017 Feb.

Excerpt

Background: The Health and Social Care Act 2012 (Great Britain. Health and Social Care Act 2012. London: The Stationery Office; 2012) has mandated research use as a core consideration of health service commissioning arrangements. We evaluated whether or not access to a demand-led evidence briefing service improved the use of research evidence by commissioners, compared with less intensive and less targeted alternatives.

Design: Controlled before-and-after study.

Setting: Clinical Commissioning Groups (CCGs) in the north of England.

Main outcome measures: Change at 12 months from baseline of a CCG’s ability to acquire, assess, adapt and apply research evidence to support decision-making. Secondary outcomes measured individual clinical leads’ and managers’ intentions to use research evidence in decision-making.

Methods: Nine CCGs received one of three interventions: (1) access to an evidence briefing service; (2) contact plus an unsolicited push of non-tailored evidence; or (3) an unsolicited push of non-tailored evidence. Data for the primary outcome measure were collected at baseline and 12 months post intervention, using a survey instrument devised to assess an organisation’s ability to acquire, assess, adapt and apply research evidence to support decision-making. In addition, documentary and observational evidence of the use of the outputs of the service was sought and interviews with CCG participants were undertaken.

Results: Most of the requests were conceptual; they were not directly linked to discrete decisions or actions but were intended to provide knowledge about possible options for future actions. Symbolic use to justify existing decisions and actions was less frequent; examples included justifying a decision to close a walk-in centre and lending weight to a major initiative to promote self-care that was already under way. The opportunity to impact directly on decision-making processes was limited to work to establish disinvestment policies. In terms of overall impact, the evidence briefing service was not associated with increases in CCGs’ capacity to acquire, assess, adapt and apply research evidence to support decision-making, in individual intentions to use research findings, or in perceptions of CCGs’ relationships with researchers. Regardless of the intervention received, participating CCGs reported at baseline that their research-seeking behaviours were inconsistent, and their capacity to acquire research remained so at follow-up. The informal nature of decision-making processes meant that there was little or no traceability of the use of evidence.

Limitations: Low response rates at baseline and follow-up (68% and 44%, respectively) and missing data limit the reliability of these findings.

Conclusions: Access to a demand-led evidence briefing service did not improve the uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives. Commissioners appear to be well intentioned but ad hoc users of research.

Future work: Further research is required on the effects of interventions and strategies to build individual and organisational capacity to use research. Resource-intensive approaches to providing evidence may best be employed to support instrumental decision-making. Comparative evaluation of the impact of less intensive but targeted strategies on the uptake and use of research by commissioners is warranted.

Funding: The National Institute for Health Research Health Services and Delivery Research programme.
