Student Learning with Generated and Answered Peer-written Questions

Am J Pharm Educ. 2018 Mar;82(2):6315. doi: 10.5688/ajpe6315.

Abstract

Objective. To investigate the degree to which generating or answering student-written multiple-choice questions predicts course performance in medicinal chemistry.

Methods. Students enrolled in Medicinal Chemistry III over a 3-year period were asked to create at least one question per exam period using PeerWise; within the software, they were also asked to answer and rate one peer question per class session. Students' total reputation scores and their components (question authoring, answering, and rating), along with total answer scores (the correctness of answers submitted, indicated by agreement with the author's chosen answer), were analyzed relative to final course grades.

Results. Students at the non-satellite campus and those who generated more highly rated questions performed better overall in the course, with these factors accounting for 12% of the variability in course grades. The most notable differences were between the top-third and bottom-third performing students in the course. The number of questions a student answered was not a significant predictor of course performance.

Conclusion. Generating more highly rated questions (described as more thoughtful in nature by the software program) is predictive of course performance, but it explained only a small portion of the variability in course grades. The correctness of answers submitted did not relate to student performance.

Keywords: course performance; medicinal chemistry; multiple choice questions; peer review; question generation.

MeSH terms

  • Adult
  • Chemistry, Pharmaceutical / education
  • Education, Pharmacy / methods*
  • Educational Measurement
  • Female
  • Humans
  • Learning*
  • Male
  • Middle Aged
  • Peer Group*
  • Students, Pharmacy*
  • Teaching
  • Young Adult