Evaluating validity evidence for 2 instruments developed to assess students' surgical skills in a simulated environment

Vet Surg. 2022 Jul;51(5):788-800. doi: 10.1111/vsu.13791. Epub 2022 Mar 8.

Abstract

Objective: To gather and evaluate validity evidence, in the form of content validity and score reliability, for 2 surgical skills assessment instruments: (1) a checklist and (2) a modified form of the Objective Structured Assessment of Technical Skills (OSATS) global rating scale (GRS).

Study design: Prospective randomized blinded study.

Sample population: Veterinary surgical skills educators (n = 10) evaluated content validity. Scores from students in their third preclinical year of veterinary school (n = 16) were used to assess reliability.

Methods: Content validity was assessed using Lawshe's method to calculate the Content Validity Index (CVI) for the checklist and modified OSATS GRS. The importance and relevance of each item were determined in relation to the skills needed to successfully perform supervised surgical procedures. The reliability of scores produced by both instruments was determined using generalizability (G) theory.
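For context, a minimal sketch of Lawshe's method in Python follows. The panel size matches this study (10 raters), but the per-item "essential" counts are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of Lawshe's content validity method (hypothetical ratings,
# not the study's data). Each panelist rates an item as "essential" or not.

def content_validity_ratio(n_essential: int, n_panelists: int) -> float:
    """Lawshe's CVR = (n_e - N/2) / (N/2), ranging from -1 to +1."""
    half = n_panelists / 2
    return (n_essential - half) / half

def content_validity_index(cvrs: list[float]) -> float:
    """CVI = mean CVR across the items retained in the instrument."""
    return sum(cvrs) / len(cvrs)

# Hypothetical example: 10 panelists rating 4 items.
essential_counts = [10, 9, 9, 8]  # panelists rating each item "essential"
cvrs = [content_validity_ratio(n, 10) for n in essential_counts]
print([round(c, 2) for c in cvrs])             # [1.0, 0.8, 0.8, 0.6]
print(round(content_validity_index(cvrs), 2))  # 0.8
```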

Results: Based on the content validation, 39 of the 40 checklist items were retained; the CVI of the 39-item checklist was 0.81. One of the 6 OSATS GRS items was retained; the CVI of the 1-item GRS was 0.80. The G coefficients for the full 40-item checklist and 6-item GRS were 0.85 and 0.79, respectively.
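For readers unfamiliar with G theory, the sketch below shows how a relative G coefficient is commonly computed for a fully crossed persons x raters design; the variance components used here are hypothetical and are not those estimated in the study.

```python
# Minimal sketch of a relative G coefficient for a crossed persons x raters
# (p x r) design; the variance components are hypothetical, not the study's.

def g_coefficient(var_persons: float, var_pr_error: float, n_raters: int) -> float:
    """Relative G coefficient: person (true-score) variance divided by
    itself plus the person-by-rater error variance averaged over raters."""
    relative_error = var_pr_error / n_raters
    return var_persons / (var_persons + relative_error)

# Hypothetical variance components from an ANOVA-based G study.
print(round(g_coefficient(var_persons=4.0, var_pr_error=1.5, n_raters=2), 2))  # 0.84
```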

Conclusion: Content validity was very good for the 39-item checklist and good for the 1-item OSATS GRS. The reliability of scores from both instruments was acceptable for a moderate-stakes examination.

Impact: These results support the use of the checklist described here, and of a modified 1-item OSATS GRS, in moderate-stakes examinations evaluating preclinical third-year veterinary students' technical surgical skills on low-fidelity models.

MeSH terms

  • Animals
  • Checklist
  • Clinical Competence*
  • Humans
  • Internship and Residency*
  • Prospective Studies
  • Reproducibility of Results
  • Students