Accommodating uncertainty in comparative risk

Risk Anal. 2004 Oct;24(5):1323-35. doi: 10.1111/j.0272-4332.2004.00529.x.

Abstract

Comparative risk projects can provide broad policy guidance, but they rarely have adequate scientific foundations to support precise risk rankings. Many extant projects report rankings anyway, with limited attention to uncertainty. Stochastic uncertainty, structural uncertainty, and ignorance are types of incertitude that afflict risk comparisons. The recently completed New Jersey Comparative Risk Project was innovative in trying to acknowledge and accommodate some historically ignored uncertainties in a substantive manner. This article examines the methods used and lessons learned from the New Jersey project. Monte Carlo techniques were used to characterize stochastic uncertainty, and sensitivity analysis helped to manage structural uncertainty. A deliberative process and a sorting technique helped manage ignorance. Key findings are that stochastic rankings can be calculated, but they reveal such an alarming degree of imprecision that the rankings are no longer useful, whereas sorting techniques remain helpful in spite of uncertainty. A deliberative process is helpful to counter analytical overreaching.
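
The abstract describes the methods only at a high level. As an illustration, the sketch below shows the kind of Monte Carlo rank analysis and threshold-based sorting it alludes to: issue names, score distributions, the number of draws, and the sorting threshold are all made-up assumptions for demonstration and do not come from the New Jersey project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical risk issues with uncertain scores, expressed as lognormal
# distributions (median, geometric standard deviation). Values are illustrative only.
issues = {
    "issue A": (50, 2.0),
    "issue B": (40, 3.0),
    "issue C": (35, 1.5),
    "issue D": (20, 4.0),
}

N = 10_000  # number of Monte Carlo draws (arbitrary choice)

# Sample a score for every issue in each draw.
names = list(issues)
samples = np.column_stack([
    rng.lognormal(mean=np.log(median), sigma=np.log(gsd), size=N)
    for median, gsd in issues.values()
])

# Rank issues within each draw (1 = highest risk) and summarize how often each
# issue lands in each rank; a wide spread is the ranking imprecision at issue.
order = np.argsort(-samples, axis=1)
ranks = np.empty_like(order)
ranks[np.arange(N)[:, None], order] = np.arange(len(names)) + 1

for i, name in enumerate(names):
    counts = np.bincount(ranks[:, i], minlength=len(names) + 1)[1:]
    print(f"{name:8s} rank probabilities:", np.round(counts / N, 2))

# Sorting into broad categories is more robust than exact ranking: classify by
# the probability that a score exceeds a (hypothetical) policy threshold.
threshold = 30
for i, name in enumerate(names):
    p_high = (samples[:, i] > threshold).mean()
    category = "higher concern" if p_high > 0.5 else "lower concern"
    print(f"{name:8s} P(score > {threshold}) = {p_high:.2f} -> {category}")
```

Under these assumptions the rank probabilities for the middle issues spread across several positions, while the coarse high/low sort is comparatively stable, which mirrors the abstract's finding that rankings are imprecise but sorting remains usable.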