Effects of response format on achievement and aptitude assessment results: multi-level random effects meta-analyses

R Soc Open Sci. 2023 May 3;10(5):220456. doi: 10.1098/rsos.220456. eCollection 2023 May.

Abstract

Psychological achievement and aptitude tests are fundamental elements of the everyday school, academic and professional lives of students, instructors, job applicants, researchers and policymakers. In line with growing demands for fair psychological assessment tools, we aimed to identify psychometric features of tests, test situations and test-taker characteristics that may contribute to the emergence of test bias. Multi-level random effects meta-analyses were conducted to estimate mean effect sizes for differences and relations between scores from achievement or aptitude measures with open-ended (OE) versus closed-ended (CE) response formats. Results from 102 primary studies with 392 effect sizes revealed positive relations between CE and OE assessments (mean r = 0.67, 95% CI [0.57; 0.76]) and a negative pooled effect size for the difference between the two response formats (mean d_av = -0.65, 95% CI [-0.78; -0.53]), indicating that significantly higher scores were obtained on CE exams. Stem-equivalency of items, low-stakes test situations, written short-answer OE question types, studies conducted outside the United States and before the year 2000, and test-takers' achievement motivation and sex were at least partially associated with smaller differences and/or larger relations between scores from OE and CE formats. Limitations and the results' implications for practitioners in achievement and aptitude testing are discussed.
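For readers unfamiliar with how pooled estimates such as the reported mean d_av and its 95% CI arise, the Python sketch below illustrates a standard single-level random-effects pooling step using the DerSimonian-Laird estimator of between-study variance. This is a simplified illustration only: the study itself fits multi-level random effects models that additionally account for dependencies among effect sizes, and all input numbers below are hypothetical, not data from the study.

import math

def pool_random_effects(effects, variances):
    """Pool per-study effect sizes (e.g., d values) with
    DerSimonian-Laird random-effects weights."""
    k = len(effects)
    w_fixed = [1.0 / v for v in variances]  # inverse-variance weights
    mean_fixed = sum(w * y for w, y in zip(w_fixed, effects)) / sum(w_fixed)
    # Cochran's Q and the DL estimate of between-study variance tau^2
    q = sum(w * (y - mean_fixed) ** 2 for w, y in zip(w_fixed, effects))
    c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Random-effects weights add tau^2 to each study's sampling variance
    w_re = [1.0 / (v + tau2) for v in variances]
    mean_re = sum(w * y for w, y in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    ci = (mean_re - 1.96 * se, mean_re + 1.96 * se)  # normal-theory 95% CI
    return mean_re, ci, tau2

# Hypothetical per-study standardized mean differences and variances
d_values = [-0.80, -0.55, -0.70, -0.40, -0.90]
d_vars = [0.02, 0.03, 0.015, 0.04, 0.025]
mean_d, (lo, hi), tau2 = pool_random_effects(d_values, d_vars)
print(f"pooled d = {mean_d:.2f}, 95% CI [{lo:.2f}; {hi:.2f}], tau^2 = {tau2:.3f}")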

Keywords: achievement and aptitude assessment; closed-ended; meta-analysis; open-ended; response format.

Associated data

  • figshare: https://doi.org/10.6084/m9.figshare.c.6571642