Study on the radiation and self-absorption characteristics of plasma under various background gases

Opt Express. 2023 May 8;31(10):16423-16433. doi: 10.1364/OE.489720.

Abstract

The self-absorption effect is a primary factor behind the decline in precision of quantitative analysis techniques based on plasma emission spectroscopy, such as laser-induced breakdown spectroscopy (LIBS). In this study, the radiation characteristics and self-absorption of laser-induced plasmas under different background gases were theoretically simulated with thermal ablation and hydrodynamic models and experimentally verified, in order to investigate ways of weakening the self-absorption effect in plasma. The results reveal that the plasma temperature and density increase with the molecular weight and pressure of the background gas, leading to stronger emission line intensities of the species. To reduce the self-absorption effect in the later stages of plasma evolution, one can decrease the gas pressure or replace the background gas with one of lower molecular weight. As the excitation energy of the species increases, the influence of the background gas type on the spectral line intensity becomes more pronounced. Moreover, we accurately calculated the optically thin moments under various conditions using the theoretical models; the calculated moments are consistent with the experimental results. From the temporal evolution of the doublet intensity ratio of the species, it is deduced that the optically thin moment appears later for higher molecular weight and pressure of the background gas and for lower upper-level energy of the species. This theoretical work is essential for selecting an appropriate background gas type, gas pressure, and doublet in self-absorption-free LIBS (SAF-LIBS) experiments so as to weaken the self-absorption effect.
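For context, the sensitivity of high-excitation lines to the background gas can be understood from the standard Boltzmann expression for an emission line under local thermodynamic equilibrium (a textbook relation, not quoted from the paper itself):

\[
I_{ki} = \frac{hc}{4\pi\lambda_{ki}}\, A_{ki}\, N\, \frac{g_k}{U(T)} \exp\!\left(-\frac{E_k}{k_B T}\right),
\]

where \(A_{ki}\) is the transition probability, \(g_k\) and \(E_k\) are the degeneracy and energy of the upper level, \(N\) is the species number density, and \(U(T)\) is the partition function. Because the exponential factor varies more steeply with temperature for larger \(E_k\), lines with higher excitation energies respond more strongly to the temperature changes induced by the background gas type and pressure.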
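Likewise, the deduction of the optically thin moment from the doublet intensity ratio follows the usual SAF-LIBS criterion (sketched here under the standard assumption that the two lines share nearly the same upper-level energy, so the Boltzmann factors cancel):

\[
\frac{I_1}{I_2} \approx \frac{A_1 g_1 \lambda_2}{A_2 g_2 \lambda_1}.
\]

Self-absorption attenuates the stronger line more than the weaker one, pulling the measured ratio below this theoretical constant; the optically thin moment is therefore taken as the time at which the measured ratio first matches the theoretical value.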