We Know What You Agreed To, Don't We? Evaluating the Quality of Paper-Based Consent Forms and Their Digitalized Equivalent Using the Example of the Baltic Fracture Competence Centre Project

Methods Inf Med. 2023 Jun;62(S 01):e10-e18. doi: 10.1055/s-0042-1760249. Epub 2023 Jan 9.

Abstract

Introduction: Informed consent is the legal basis for research with human subjects. The consent form (CF), as a legally binding document, must therefore be valid, that is, completely filled in, stating the person's decision clearly, and signed by the respective person. However, paper-based CFs in particular may have quality issues, and the transformation into machine-readable information could introduce further errors. This paper evaluates the quality of paper-based CFs and the quality issues that arise, using the example of the Baltic Fracture Competence Centre (BFCC) fracture registry. It also evaluates the impact of quality assurance (QA) measures, including site-specific feedback. Finally, it answers the question of whether manual data entry of patients' decisions by clinical staff leads to a significant error rate in digitalized paper-based CFs.

Methods: Based on defined quality criteria, monthly QA including source data verification was conducted by two independent reviewers from the start of recruitment in December 2017. The analyses are based on the CFs collected from December 2017 until February 2019 (the first recruitment period).

Results: After QA had initially been conducted internally, a sudden increase in quality issues in May 2018 led to site-specific feedback reports and follow-up training regarding CF quality, starting in June 2018. Specific criteria and descriptions of how to correct the CFs helped increase quality in a timely manner. The most common issues were missing pages, missing decisions regarding optional modules, and missing signature(s). Since patients' datasets without valid CFs must be deleted, QA helped retain 65 datasets for research, so that the final data pool consisted of 840 (99.29%) patients' datasets.

Conclusion: All quality issues could be assigned to one of the predefined criteria. Using the example of the BFCC fracture registry, CF QA proved to significantly increase CF quality and helped retain the number of datasets available for research. Consequently, the described quality indicators, criteria, and QA processes can be seen as a best-practice approach.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Consent Forms*
  • Humans
  • Informed Consent*

Grants and funding

Funding: The BFCC project was funded by the EU Interreg Baltic Sea Programme 2014-2020 (grant number #R001). gICS was developed as part of the research grant program "Information infrastructure for research data" (grant number HO 1937/2-1) funded by the German Research Foundation (DFG).