An explanatory mixed methods study on the validity and validation of students' assessment results in the undergraduate surgery course

Med Teach. 2018 Sep;40(sup1):S56-S67. doi: 10.1080/0142159X.2018.1465181. Epub 2018 May 2.

Abstract

Background/purpose: Reported validity evidence for the results of instruments used to assess clinical competence is inadequate. This study aimed to combine multiple lines of quantitative and qualitative evidence to support the interpretation and use of assessment results.

Method: This was an explanatory mixed methods study conducted in two stages of data collection and analysis (QUAN : qual). Guided by Messick's conceptual model, quantitative evidence for various validity components, such as reliability and correlation coefficients, was calculated from the scores, grades, and success rates of the entire student population in 2012/2013 and 2013/2014 (n = 383 and 326, respectively). The underlying values that scaffold validity evidence were identified through focus group discussions (FGDs) with purposively sampled faculty and students, and the transcripts were analyzed using content analysis.
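As a minimal illustration of the kind of quantitative evidence described above (not the authors' actual analysis code), the sketch below computes Cronbach's alpha as an internal-consistency estimate for a set of item scores and the Pearson correlation between mid and final examination scores. All variable names and data are hypothetical.

    import numpy as np

    def cronbach_alpha(item_scores: np.ndarray) -> float:
        """Internal-consistency reliability for an (examinees x items) score matrix."""
        n_items = item_scores.shape[1]
        item_variances = item_scores.var(axis=0, ddof=1)      # variance of each item
        total_variance = item_scores.sum(axis=1).var(ddof=1)  # variance of total scores
        return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical data: 5 examinees x 4 dichotomously scored MCQ items,
    # plus mid and final examination totals for the same examinees.
    mcq_items = np.array([
        [1, 1, 0, 1],
        [0, 1, 0, 0],
        [1, 1, 1, 1],
        [0, 0, 0, 1],
        [1, 1, 1, 0],
    ], dtype=float)
    mid_exam = np.array([55.0, 62.0, 70.0, 48.0, 66.0])
    final_exam = np.array([58.0, 65.0, 74.0, 50.0, 69.0])

    alpha = cronbach_alpha(mcq_items)            # analogous to the reported 0.81 / 0.80
    r = np.corrcoef(mid_exam, final_exam)[0, 1]  # analogous to the reported r = 0.672
    print(f"Cronbach's alpha = {alpha:.2f}, Pearson r = {r:.2f}")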

Results: (1) Themes derived from the content analysis aligned with the quantitative evidence. (2) Assessment results showed: (a) content validity (table of specifications and blueprinting, reported in another study); (b) consequential validity (positive unintended consequences of the new assessment approach); (c) relationships to other variables [statistically significant correlations between the various assessment methods and the combined score (r = 0.64-0.86) and between mid and final examination results (r = 0.672)]; (d) internal consistency (high reliability of the MCQ and OSCE: 0.81 and 0.80, respectively). (3) Success rates and grade distributions alone could not provide evidence to support an argument for the validity of the results.

Conclusion: The unified approach pursued in this study provided a strong evidential basis for the meaningful interpretation of assessment scores and could be applied to other clinical assessments.

Publication types

  • Validation Study

MeSH terms

  • Clinical Competence / standards*
  • Curriculum / standards*
  • Education, Medical, Undergraduate / statistics & numerical data
  • Educational Measurement / standards*
  • General Surgery / education*
  • Humans
  • Reproducibility of Results
  • Students, Medical