Prediction of the Acute or Late Radiation Toxicity Effects in Radiotherapy Patients Using Ex Vivo Induced Biodosimetric Markers: A Review

J Pers Med. 2020 Dec 16;10(4):285. doi: 10.3390/jpm10040285.

Abstract

Finding effective methods to assess patients' individual responses to radiation is one of the important tasks of clinical radiobiology. This review summarizes available data on the use of ex vivo cytogenetic markers, typically employed for biodosimetry, to predict individual clinical radiosensitivity (normal tissue toxicity, NTT) in cells of cancer patients undergoing therapeutic irradiation. In approximately 50% of the relevant reports selected for analysis from peer-reviewed international journals, the average ex vivo induced yield of these biodosimetric markers was higher in patients with severe reactions than in patients with a lower grade of NTT. A significant correlation was also sometimes found between the biodosimetric marker yield and the severity of acute or late NTT reactions at the individual level, but this observation was not unequivocally proven. Published results were similarly contradictory regarding attempts to apply the G2 and γH2AX foci assays for NTT prediction. A correlation between ex vivo cytogenetic biomarker yields and NTT occurred most frequently when chromosome aberrations (rather than micronuclei) were measured in lymphocytes (rather than fibroblasts) irradiated to relatively high doses (4-6 Gy rather than 2 Gy) in patients with various grades of late (rather than early) radiotherapy (RT) morbidity. The limitations of existing approaches are discussed, and recommendations for improving ex vivo cytogenetic testing for NTT prediction are provided. However, the efficacy of these methods still needs to be validated in properly organized clinical trials involving large and verified patient cohorts.

Keywords: biodosimetry; chromosome aberrations; micronuclei; normal tissue toxicity; predictive tests; radiosensitivity; radiotherapy.

Publication types

  • Review