Investigation of the self-absorption effect using time-resolved laser-induced breakdown spectroscopy

Opt Express. 2019 Feb 18;27(4):4261-4270. doi: 10.1364/OE.27.004261.

Abstract

Self-absorption seriously affects the accuracy and stability of quantitative analysis in laser-induced breakdown spectroscopy (LIBS). To reduce the effect of self-absorption, we investigated its temporal evolution by establishing exponential calibration curves, and we also examined the mechanism underlying this temporal evolution. The results indicated that self-absorption was weak at the early stage of plasma expansion. Taking the determination of manganese (Mn) in steel as an example, the upper concentration bound of the linear range (Cint) was 2.000 wt. % at the early stage of plasma expansion (in a time window of 0.2-0.4 μs), much higher than the 0.363 wt. % obtained at a traditionally optimized time window (2-3 μs). The accuracy and stability of quantitative analysis in the 0.2-0.4 μs time window were also much better than in the 2-3 μs window. This work provides a simple method for improving quantitative analysis performance and mitigating the self-absorption effect in LIBS.
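As an illustration of the kind of analysis the abstract describes, the following is a minimal sketch (not taken from the paper) of fitting an exponential calibration curve of the form I = a(1 - exp(-bC)), a model often used in LIBS to describe self-absorption-induced saturation of line intensity with analyte concentration. The concentrations, intensities, and the 10 % deviation criterion for estimating the upper bound of linearity are all hypothetical assumptions for demonstration only.

```python
# Hedged sketch: exponential calibration curve and onset of self-absorption.
# All data and thresholds below are hypothetical, not values from the paper.
import numpy as np
from scipy.optimize import curve_fit

def exp_calibration(conc, a, b):
    """Exponential calibration model: intensity saturates as concentration rises."""
    return a * (1.0 - np.exp(-b * conc))

# Hypothetical Mn concentrations (wt. %) and measured line intensities (a.u.)
conc = np.array([0.1, 0.3, 0.5, 1.0, 1.5, 2.0, 3.0])
intensity = np.array([950.0, 2700.0, 4200.0, 7400.0, 9500.0, 11000.0, 12800.0])

params, _ = curve_fit(exp_calibration, conc, intensity, p0=(15000.0, 0.5))
a_fit, b_fit = params

# The slope of the model as C -> 0 is a*b; the measured curve falling below this
# extrapolated linear response signals the onset of self-absorption. A simple
# proxy for the upper bound of linearity is the concentration at which the
# fitted curve drops, say, 10% below the linear extrapolation a*b*C.
linear_pred = a_fit * b_fit * conc
rel_dev = 1.0 - exp_calibration(conc, a_fit, b_fit) / linear_pred
print(f"fit: a = {a_fit:.1f}, b = {b_fit:.3f}")
print("relative deviation from linearity:", np.round(rel_dev, 3))
```

In this framing, time windows where the fitted curve stays close to its initial linear trend over a wider concentration range (as reported for the early 0.2-0.4 μs window) correspond to weaker self-absorption.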