Quality assessment of external data: a further means of reducing animal use for toxicity testing--a case study

Qual Assur. 1992 Jun;1(3):207-12.

Abstract

One overlooked area of quality assurance (QA) is the critical, in-depth reassessment of toxicity data from secondary compilations. Such retrospective QA may play a role in avoiding needless additional or repeated animal testing, as this case study shows. Initially, the task was simply to carry out toxicity testing of a chemical to determine its LD50 for regulatory purposes. The impetus for this proposed (re-)testing was an erroneously low LD50 value reported for just one species and one route of administration. Examination of the original literature cited as the source of the seemingly anomalous LD50 value revealed that a combination of conceptual and transcriptional errors had been made when the results were translated from the original German research paper and entered into two widely used secondary compilations, RTECS and HSDB. Correcting these errors yielded the true LD50 value, which was no longer out of step with values for other species, nor was it sufficiently low to cause any concern in the workplace. The critical reassessment removed the need for any further animal studies to assess the situation. It is concluded that, in some cases, "reassessing" existing data can be added to the established list of "refining, reducing, and replacing" as a means of decreasing animal use in toxicological evaluation.

MeSH terms

  • Animals
  • Bias
  • Canada
  • Data Interpretation, Statistical*
  • Databases, Factual / standards*
  • Drug Evaluation, Preclinical*
  • Guinea Pigs
  • Lethal Dose 50
  • Maximum Allowable Concentration
  • Mice
  • Occupational Exposure* / legislation & jurisprudence
  • Potassium Chloride / toxicity*
  • Quality Control
  • Rats

Substances

  • Potassium Chloride