A strategy for monitoring and evaluating massive open online courses

Eval Program Plann. 2016 Aug;57:55-63. doi: 10.1016/j.evalprogplan.2016.04.006. Epub 2016 May 7.

Abstract

We argue that the complex, innovative and adaptive nature of Massive Open Online Course (MOOC) initiatives poses particular challenges to monitoring and evaluation, in that any evaluation strategy will need to follow a systems approach. This article aims to guide organizations implementing MOOCs through a series of steps that assist them in developing a strategy to monitor, improve, and judge the merit of their initiatives. We describe how we operationalise our strategy by first defining the different layers of interacting agents in a given MOOC system and then tailoring our approach to these layers. Specifically, we developed a two-pronged approach in which individual projects are assessed through performance monitoring, with assessment criteria defined at the outset to include coverage, participation, quality, and student achievement. In contrast, the success of an overall initiative should be considered within a more adaptive, emergent evaluation inquiry framework. We present the inquiry framework we developed for MOOC initiatives and show how it might be used to develop evaluation questions and an assessment methodology. We also define the more fixed indicators and measures for project performance monitoring. Our strategy is described as it was developed to inform the evaluation of a MOOC initiative at the University of Cape Town (UCT), South Africa.
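As a minimal sketch of what the fixed project-level indicators named above might look like in practice, the following code computes simple ratios for coverage, participation, quality, and student achievement from a hypothetical per-run record. All field names, data values, and the normalisation choices are illustrative assumptions, not measures taken from the article.

```python
# Illustrative sketch only: field names, data, and thresholds are assumptions,
# not the indicators defined in the published evaluation strategy.
from dataclasses import dataclass


@dataclass
class CourseRun:
    """Hypothetical record of a single MOOC run used for performance monitoring."""
    enrolled: int          # learners who registered for the run
    active: int            # learners who engaged with any course activity
    completed: int         # learners who met the completion criteria
    satisfaction: float    # mean end-of-course rating on a 1-5 scale
    target_audience: int   # estimated size of the intended audience


def performance_indicators(run: CourseRun) -> dict[str, float]:
    """Compute simple ratios for the four indicator areas named in the abstract:
    coverage, participation, quality, and student achievement."""
    return {
        "coverage": run.enrolled / run.target_audience,
        "participation": run.active / run.enrolled if run.enrolled else 0.0,
        "quality": run.satisfaction / 5.0,           # normalised to a 0-1 scale
        "achievement": run.completed / run.active if run.active else 0.0,
    }


if __name__ == "__main__":
    run = CourseRun(enrolled=12000, active=4500, completed=900,
                    satisfaction=4.2, target_audience=50000)
    for name, value in performance_indicators(run).items():
        print(f"{name}: {value:.2%}")
```

Such per-project ratios would sit on the performance-monitoring side of the two-pronged approach; judging the overall initiative would still rely on the more adaptive, emergent inquiry framework described in the abstract.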

Keywords: Developmental evaluation; Evaluation; MOOC; Monitoring; Online course; Performance monitoring.

MeSH terms

  • Education, Distance / organization & administration
  • Education, Distance / standards*
  • Educational Measurement / methods
  • Educational Measurement / standards*
  • Humans
  • Internet
  • Program Evaluation / methods
  • Program Evaluation / standards*
  • South Africa
  • Universities