Image Segmentation of Fiducial Marks with Complex Backgrounds Based on the mARU-Net

Sensors (Basel). 2023 Nov 23;23(23):9347. doi: 10.3390/s23239347.

Abstract

Circuits on different layers of a printed circuit board (PCB) must be aligned according to high-precision fiducial mark images during exposure processing, and processing quality depends on the detection accuracy of the fiducial marks. Precise segmentation of fiducial marks from images can significantly improve detection accuracy; however, the complex backgrounds of PCB images pose significant challenges for the segmentation and detection of fiducial mark images. In this paper, the mARU-Net is proposed for the segmentation of fiducial mark images with complex backgrounds to improve detection accuracy. Compared with several typical segmentation methods on a customized fiducial mark dataset, the mARU-Net demonstrates good segmentation accuracy. Experiments show that, compared with the original U-Net, the segmentation accuracy of the mARU-Net is improved by 3.015%, while the number of parameters and the training time are not increased significantly. Furthermore, the centroid method is used to detect circles in the segmentation results; the deviation is kept within 30 microns, with higher detection efficiency. The detection accuracy of fiducial mark images meets the accuracy requirements of PCB production.
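As a concrete illustration of the centroid-based circle detection mentioned in the abstract, the following is a minimal sketch, not the authors' implementation: it assumes the network outputs a binary segmentation mask, and `mask_centroid` is a hypothetical helper that takes the mean of the foreground pixel coordinates to recover a circular mark's centre.

```python
import numpy as np

def mask_centroid(mask):
    """Return the (row, col) centroid of the foreground pixels in a binary mask.

    For a well-segmented circular fiducial mark, the centroid of the
    foreground region coincides with the circle's centre.
    """
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        raise ValueError("mask contains no foreground pixels")
    return rows.mean(), cols.mean()

# Hypothetical example: a synthetic 64x64 mask with a filled circle of
# radius 5 centred at (20, 30); the centroid recovers that centre.
yy, xx = np.mgrid[0:64, 0:64]
mask = ((yy - 20) ** 2 + (xx - 30) ** 2) <= 5 ** 2
cy, cx = mask_centroid(mask)
print(cy, cx)  # → 20.0 30.0
```

Because the centroid is a single vectorized mean over foreground pixels, it is cheap to compute per mask, which is consistent with the higher detection efficiency the abstract reports relative to iterative circle-fitting approaches.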

Keywords: CBAM; U-Net; fiducial mark; mARU-Net; residual block.

Grants and funding

This research was funded by the Research Program of Science and Technology at Universities of Inner Mongolia Autonomous Region (NJZY21308); the Key Research Projects of Military-Civilian Integration of Inner Mongolia Autonomous Region (JMZD202203); the Key Technology Research Program of Inner Mongolia (2021GG0258); the Natural Science Foundation of Inner Mongolia (2021MS05005); the Program for Innovative Research Team in Universities of Inner Mongolia Autonomous Region (NMGIRT2213); and the Fundamental Research Funds for the Directly Affiliated Universities of Inner Mongolia Autonomous Region (JY20220046).