Deep Learning-Based 3D Measurements with Near-Infrared Fringe Projection

Sensors (Basel). 2022 Aug 27;22(17):6469. doi: 10.3390/s22176469.

Abstract

Fringe projection profilometry (FPP) is widely applied to 3D measurements owing to its high accuracy, non-contact operation, and full-field scanning. Whereas most FPP systems project visible patterns, near-infrared fringe patterns are invisible to the human eye, making them less intrusive and better suited to scenes where bright illumination must be avoided. However, these invisible patterns, generated by a near-infrared laser, are typically captured with severe speckle noise, which limits the quality of the resulting 3D reconstructions. To address this issue, we propose a deep learning-based framework that removes the effect of the speckle noise and improves the precision of the 3D reconstruction. The framework consists of two deep neural networks: one learns to produce a clean fringe pattern, and the other retrieves an accurate phase from that pattern. Compared with traditional denoising methods that depend on complex physical models, the proposed learning-based method is much faster. The experimental results show that the presented method effectively increases measurement accuracy.
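For context, conventional FPP recovers the phase from several phase-shifted fringe images via an arctangent formula; the paper's second network learns this phase-retrieval step. A minimal sketch of the classical 4-step phase-shifting computation at a single pixel, assuming fringe shifts of 0, π/2, π, and 3π/2 (the function name and simulated values are illustrative, not from the paper):

```python
import math

def wrapped_phase(i1, i2, i3, i4):
    """Classical 4-step phase-shifting phase retrieval at one pixel.

    Assumes the four intensities follow I_k = a + b*cos(phi + k*pi/2),
    k = 0..3; then I4 - I2 = 2b*sin(phi) and I1 - I3 = 2b*cos(phi),
    so atan2 recovers the wrapped phase phi in (-pi, pi].
    """
    return math.atan2(i4 - i2, i1 - i3)

# Simulate one pixel with background a, modulation b, and true phase phi.
a, b, phi = 0.5, 0.4, 1.2
intensities = [a + b * math.cos(phi + k * math.pi / 2) for k in range(4)]

recovered = wrapped_phase(*intensities)  # ≈ 1.2 (noise-free case)
```

With real near-infrared captures, speckle noise perturbs the four intensities, which is why the framework denoises the fringe pattern before the phase is computed.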

Keywords: deep learning; denoising; fringe projection; phase retrieval; speckle noise.

MeSH terms

  • Algorithms*
  • Deep Learning*
  • Humans
  • Imaging, Three-Dimensional / methods
  • Neural Networks, Computer

Grants and funding

This research was funded by the National Natural Science Foundation of China (62075096, 62005121, U21B2033), the Leading Technology of Jiangsu Basic Research Plan (BK20192003), the “333 Engineering” Research Project of Jiangsu Province (BRA2016407), the Jiangsu Provincial “One Belt and One Road” Innovation Cooperation Project (BZ2020007), the Fundamental Research Funds for the Central Universities (30921011208, 30919011222, 30920032101), and the Open Research Fund of Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense (JSGP202105, JSGP202201).