Intrinsic Performance of Monte Carlo Calibration-Free Algorithm for Laser-Induced Breakdown Spectroscopy

Sensors (Basel). 2022 Sep 21;22(19):7149. doi: 10.3390/s22197149.

Abstract

The performance of the Monte Carlo (MC) algorithm for calibration-free LIBS was studied using a simulated spectrum that mimics a metallurgical slag sample. The underlying model is that of a uniform, isothermal, and stationary plasma in local thermodynamic equilibrium. Based on this model, the algorithm generates from hundreds of thousands to several million configurations of plasma parameters simultaneously, together with the corresponding synthetic spectra. The parameters are temperature, plasma size, and the concentrations of the species. They are iterated until a cost function, which quantifies the difference between a synthetic spectrum and the simulated slag spectrum, reaches its minimum. Once the minimum is found, the concentrations of the species are read from the model and compared to the certified values. The algorithm is parallelized on a graphics processing unit (GPU) to reduce the computational time. Minimization of the cost function takes several minutes on an NVIDIA Tesla K40 GPU, depending on the number of elements to be iterated. The intrinsic accuracy of the MC calibration-free method is found to be around 1% for the eight elements tested. For a real experimental spectrum, however, the performance may be worse because of the idealized nature of the model as well as improperly chosen experimental conditions. Factors influencing the performance of the method are discussed.
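To illustrate the forward-model-plus-cost-function idea summarized above, the following is a minimal Python sketch, not the authors' GPU implementation. The spectral model, line data, element set, and parameter ranges are invented placeholders; the actual algorithm uses a full LTE radiation model and iterates its configurations in parallel on the GPU, whereas this sketch only shows a plain random search over temperature, plasma size, and concentrations against a reference spectrum.

```python
import numpy as np

# Hypothetical sketch of calibration-free fitting by random search.
# Toy spectral model: one Gaussian line per element, with an amplitude that
# scales with concentration, plasma size, and a Boltzmann-like factor.
# Line positions (nm) and upper-level energies (eV) below are placeholders.

rng = np.random.default_rng(0)
wavelengths = np.linspace(350.0, 450.0, 1000)  # nm, arbitrary window

LINES = {
    "Fe": (371.99, 3.33),
    "Ca": (393.37, 3.15),
    "Si": (390.55, 5.08),
}

def synthetic_spectrum(temperature_ev, size, conc):
    """Toy forward model: sum of Gaussian lines, one per element."""
    spectrum = np.zeros_like(wavelengths)
    for element, (center, e_upper) in LINES.items():
        amplitude = size * conc[element] * np.exp(-e_upper / temperature_ev)
        spectrum += amplitude * np.exp(-0.5 * ((wavelengths - center) / 0.05) ** 2)
    return spectrum

def cost(candidate, reference):
    """Sum of squared residuals between candidate and reference spectra."""
    return float(np.sum((candidate - reference) ** 2))

# Reference spectrum generated from known parameters, so that the recovered
# concentrations can be checked (mimicking the simulated slag spectrum).
true_conc = {"Fe": 0.40, "Ca": 0.35, "Si": 0.25}
reference = synthetic_spectrum(temperature_ev=0.9, size=1.0, conc=true_conc)

best_cost, best_params = np.inf, None
for _ in range(50_000):  # many random configurations of plasma parameters
    temperature_ev = rng.uniform(0.5, 1.5)
    size = rng.uniform(0.5, 1.5)
    raw = rng.random(len(LINES))
    conc = dict(zip(LINES, raw / raw.sum()))  # concentrations sum to 1
    trial = synthetic_spectrum(temperature_ev, size, conc)
    c = cost(trial, reference)
    if c < best_cost:
        best_cost, best_params = c, (temperature_ev, size, conc)

print("best cost:", best_cost)
print("recovered concentrations (approximate):", best_params[2])
```

With this toy model, a plain random search only roughly recovers the true concentrations; the evaluation of many independent configurations is what maps naturally onto GPU threads, which is the parallelization strategy mentioned in the abstract.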

Keywords: Monte Carlo algorithm; calibration-free analysis; laser-induced breakdown spectroscopy.

Grants and funding

This research was funded by BAM Federal Institute for Materials Research and Testing.