Quality control of epidemiological lectures online: scientific evaluation of peer review

Croat Med J. 2007 Apr;48(2):249-55.

Abstract

Aim: To examine the feasibility of using peer review for the quality control of online materials.

Methods: We analyzed the inter-rater agreement on the quality of epidemiological lectures online, based on the Global Health Network Supercourse lecture library. We examined the agreement among reviewers by looking at kappa statistics and intraclass correlations. Seven expert reviewers examined and rated a random sample of 100 Supercourse lectures. Their reviews were compared with the reviews of the lay Supercourse reviewers.

Results: Both expert and non-expert reviewers rated lectures very highly, with a mean overall score of 4 out of 5. The kappa (κ) statistic and intraclass correlations indicated that inter-rater agreement for both experts and non-experts was surprisingly low (below 0.4).
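The abstract does not show how agreement was computed. As a minimal illustration of the kind of measure reported, the sketch below implements Cohen's kappa for two raters in plain Python (the study itself used kappa statistics and intraclass correlations across seven reviewers; the two-rater form, the function name, and the sample ratings here are illustrative assumptions, not the authors' code):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e the agreement expected by chance
    from each rater's marginal category frequencies.
    """
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from the raters' marginal distributions.
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(ratings_a) | set(ratings_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in categories) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters scoring five lectures on a 1-5 scale.
kappa = cohens_kappa([4, 5, 4, 3, 5], [4, 5, 3, 3, 4])
```

Values below 0.4 are conventionally read as poor-to-fair agreement, which is why the result in this study was notable despite the uniformly high mean scores.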

Conclusions: To our knowledge, this is the first time that poor inter-rater agreement has been demonstrated for Internet lectures. Future studies need to evaluate alternatives to the peer review system, especially for online materials.

Publication types

  • Evaluation Study
  • Research Support, N.I.H., Extramural

MeSH terms

  • Computer-Assisted Instruction / standards
  • Computer-Assisted Instruction / statistics & numerical data*
  • Education, Medical, Continuing / standards
  • Education, Medical, Continuing / statistics & numerical data*
  • Epidemiology / education*
  • Humans
  • Internet*
  • Observer Variation
  • Peer Review / methods*
  • Quality Control