Context: One goal of undergraduate assessment is to measure students' current and future performance. In the area of skills testing, the objective structured clinical examination (OSCE) has proved valuable as a tool for testing a number of skills in a limited time, with reduced bias and improved reliability. But can OSCEs measure the expertise in basic clinical skills that undergraduates acquire during internships?
Methods: Undergraduate students (n = 32) were given a questionnaire listing 182 basic clinical skills. We asked them to record the number of times they had performed each skill during their internships (a 12-month period in Year 6). We assessed the students at the end of Year 5 (before the start of their internships) and again at the start of Year 7 (undergraduate training takes 7 years in Belgium, with internships during Year 6), using a 14-station OSCE of basic clinical skills. Global ratings were used to score performance. The relationship between internship experience and the Year 7 OSCE score was analysed using a linear regression model, controlling for Year 5 OSCE scores. A multi-level analysis was performed with students as level-1 units and stations as level-2 units.
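The regression described above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's actual analysis, model specification, or dataset; the sample size, score distributions, and variable names are all assumptions made for the example, and the data are constructed so that practice has no true effect, mirroring the reported null result.

```python
# Illustrative sketch (NOT the study's data): regress Year 7 OSCE scores on
# internship practice counts while controlling for Year 5 OSCE scores.
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical number of observations

year5 = rng.normal(70, 5, n)                  # baseline OSCE score (Year 5)
practice = rng.poisson(10, n).astype(float)   # times a skill was practised
year7 = 0.8 * year5 + rng.normal(0, 3, n)     # Year 7 score: no practice effect

# Design matrix: intercept, practice count, Year 5 score (the covariate)
X = np.column_stack([np.ones(n), practice, year5])
beta, *_ = np.linalg.lstsq(X, year7, rcond=None)

print(f"practice coefficient: {beta[1]:.3f}")  # near zero by construction
print(f"Year 5 coefficient:   {beta[2]:.3f}")  # near the simulated 0.8
```

A multi-level extension, with scores nested by student and station as in the abstract, would typically be fitted with a mixed-effects package rather than plain least squares.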
Results: Year 7 OSCE scores (post-internship) were not associated with the number of times that students had practised basic clinical skills during their internships.
Discussion: Scores on OSCEs do not seem to reflect the clinical expertise acquired during internships. Other, more integrated assessment methods may prove more valid for testing final undergraduate skill levels.