Prevention validation and accounting platform: a framework for establishing accountability and performance measures of substance abuse prevention programs

J Drug Educ. 2000;30(1):1-143. doi: 10.2190/6WVH-KXAV-6H54-777E.

Abstract

The field of substance abuse prevention has neither an overarching conceptual framework nor a set of shared terminologies for establishing the accountability and performance outcome measures of substance abuse prevention services rendered. Hence, there is a wide gap between the data we currently have on one hand and the information required to meet the performance goals and accountability measures set by the Government Performance and Results Act of 1993 on the other. The task before us is: how can we establish the accountability and performance measures of substance abuse prevention programs and transform the field of prevention into prevention science? The intent of this volume is to serve that purpose and to accelerate this transformation by identifying its requisite components (i.e., theory, methodology, convention on terms, and data) and by introducing an open forum called the Prevention Validation and Accounting (PREVA) Platform.
The entire PREVA Platform (for short, the Platform) is designed as an analytic framework formulated from a collectivity of common concepts, terminologies, accounting units, protocols for counting those units, data elements, operationalizations of various constructs, and other summary measures intended to bring about efficient and effective measurement of the process input, program capacity, process output, performance outcome, and societal impact of substance abuse prevention programs. The measurement units and summary data elements are designed to be measured across time and across jurisdictions, i.e., from local to regional to state to national levels.
In the Platform, process input is captured by two dimensions: time and capital. Time is conceptualized in terms of service delivery time and time spent on research and development. Capital is measured by the monies expended for the delivery of program activities during a fiscal or reporting period. Program capacity is captured by fourteen measurement units tapping into the dimensions of staff resources and community assets. Staff resources are, in turn, operationalized in terms of staff size, staff certification status, staff turnover rate, and the accreditation status of the provider agency. Community assets are operationalized by the number of community centers accessible to the funded agency, the number of formalized teams or antidrug coalitions active in the catchment area, and the number of other social/human services providers with whom the prevention agency has formalized networks.
The totality of process output from all sources of program activities is reduced to eighteen classes of measures, operationalized by thirty-three summary measures. Some of these include: total count of events facilitated; total number of clients served; average number of clients served per event; clients served by single and multiple program sessions; classification of the target population by severity of risk as defined by the Institute of Medicine; age groups and race/ethnicity of clients served; number of program participants retained by recurring programs; number of clients who have completed the program; penetration rates into the target population; client attrition rates; average referral rates per provider per time interval; referral success rates; and so on. All process output measures specified in the Platform are derived from two broad classes of events classified as either products or services.
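To make the accounting units above concrete, the following Python sketch organizes the input, capacity, and output dimensions described in the abstract as simple data classes. The field names are illustrative assumptions; the Platform's actual data elements, counting protocols, and operationalizations are defined in the full monograph, not reproduced here.

from dataclasses import dataclass

@dataclass
class ProcessInput:
    """Process input for one fiscal or reporting period: time and capital."""
    service_delivery_hours: float       # time spent delivering program activities
    research_development_hours: float   # time spent on research and development
    capital_expended: float             # monies expended for program delivery

@dataclass
class ProgramCapacity:
    """Program capacity: staff resources and community assets."""
    staff_size: int
    certified_staff_count: int          # staff certification status
    staff_turnover_rate: float
    agency_accredited: bool             # accreditation status of the provider agency
    community_centers_accessible: int
    active_antidrug_coalitions: int     # formalized teams/coalitions in the catchment area
    networked_service_providers: int    # providers with formalized networks

@dataclass
class ProcessOutput:
    """A subset of the thirty-three process output summary measures."""
    events_facilitated: int
    clients_served: int
    clients_completing_program: int
    referrals_made: int
    referrals_successful: int
    target_population_size: int         # denominator for penetration rate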
The collectivity of these measures is expected to present a cost-effective, parsimonious, yet comprehensive picture of the entire spectrum of the process output, i.e., "what came out of the program as program activities". For the measurement of performance outcomes, two types of data are incorporated into the Platform: outcome data from individuals and the behavior (or performance) of social indicators drawn from aggregated databases. Individual data are used to evaluate the outcome of substance abuse prevention programs.
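The abstract names several rate-style summary measures (penetration, attrition, referral success) without stating their formulas. The definitions below are conventional ones, offered as an assumption rather than as the authors' operationalizations.

def penetration_rate(clients_served: int, target_population_size: int) -> float:
    """Proportion of the target population reached by the program."""
    return clients_served / target_population_size

def attrition_rate(clients_served: int, clients_completing: int) -> float:
    """Proportion of clients served who did not complete the program."""
    return (clients_served - clients_completing) / clients_served

def referral_success_rate(referrals_made: int, referrals_successful: int) -> float:
    """Proportion of referrals that resulted in successful linkage to services."""
    return referrals_successful / referrals_made

def average_clients_per_event(clients_served: int, events_facilitated: int) -> float:
    """Average number of clients served per facilitated event."""
    return clients_served / events_facilitated

# Example with hypothetical counts for one reporting period:
print(penetration_rate(clients_served=480, target_population_size=2400))  # 0.2
print(attrition_rate(clients_served=480, clients_completing=360))         # 0.25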

MeSH terms

  • Accounting / methods*
  • Accounting / trends
  • Data Collection / methods
  • Humans
  • Models, Theoretical
  • Outcome and Process Assessment, Health Care / methods
  • Patient Satisfaction
  • Program Evaluation / methods*
  • Program Evaluation / trends
  • Reproducibility of Results
  • Risk Factors
  • Social Problems / prevention & control
  • Social Responsibility*
  • Substance-Related Disorders / economics
  • Substance-Related Disorders / prevention & control*
  • Terminology as Topic
  • United States