Evaluating change in a pressured healthcare system: a cross-sectional study of implementation outcomes using routine data indicators and proxies

Implement Sci Commun. 2023 Aug 16;4(1):96. doi: 10.1186/s43058-023-00471-x.

Abstract

Background: Implementation evaluation should focus on implementation success, guided by theories and frameworks. With high staff vacancies in the health services, it is important to consider pragmatic methods of data collection for implementation evaluation. This paper presents a cross-sectional rapid evaluation of a handheld medical device designed for remote examinations, piloted in Northern England. By using downloaded device data and administrative records mapped to domains from the implementation outcomes framework, this evaluation offers a pragmatic example of assessing implementation success.

Methods: The pilot design was pragmatic: sites volunteered, decided which services to use the device in, and launched when ready. The pilot and evaluation together lasted 1 year. Data were downloaded from the devices, and administrative records for the pilot were accessed. Variables were mapped to five of the implementation outcomes after review with the device manufacturer and pilot team to assess their robustness.

Results: A total of 352 care episodes were recorded with the device, involving 223 patients. Of the 19 sites that signed up to the pilot, 5 launched and delivered 10 of the 35 proposed projects: site and project adoption rates of 26% and 29%, respectively. Six sites signed up to an extension period, three of which had launched and three of which had not during the original timelines, indicating some sustainability. Feasibility was high: only one in seven care episodes needed to be repeated due to poor device quality or error (sound/audio/internet). Fidelity of device usage was low for two of the eight available device examinations. Device and staffing costs were high, but potential cost savings were attributable to fewer in-person appointments.

Conclusions: By using device and administrative data, this evaluation minimised the burden on busy healthcare staff while remaining guided by an evaluation framework. Five of the eight implementation outcomes were measured, including sustainability and costs. The findings give insight into implementation challenges, particularly around adoption. Future research should engage with staff to prioritise outcome measurements and focus on the meaningful interpretation of indicators.

Keywords: Administrative data; Adoption; Cost; Downloaded data; Feasibility; Fidelity; Implementation evaluation; Implementation outcomes framework; Indicators; Routine data; Sustainability.