An optimized GAN method based on the Que-Attn and contrastive learning for underwater image enhancement

PLoS One. 2023 Jan 6;18(1):e0279945. doi: 10.1371/journal.pone.0279945. eCollection 2023.

Abstract

Research on underwater image processing has increased significantly in the past decade due to the precious resources that exist underwater. However, restoring degraded underwater images remains a challenging problem. Existing prior-based methods show limited performance in many cases due to their reliance on hand-crafted features. Therefore, in this paper, we propose an effective unsupervised generative adversarial network (GAN) for underwater image restoration. Specifically, we embed the idea of contrastive learning into the model. The method encourages corresponding patches to map to nearby points in the learned feature space, relative to other patches in the dataset, and maximizes the mutual information between input and output through a PatchNCE loss. We design a query attention (Que-Attn) module, which compares feature distances in the source domain and computes an attention matrix with a probability distribution over each row. We then select queries based on an importance measure calculated from that distribution. We also verify the model's generalization performance on several benchmark datasets. Experiments and comparisons with state-of-the-art methods show that our model outperforms them.
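The two ideas in the abstract can be sketched in code. Below is a minimal, hypothetical NumPy illustration: a PatchNCE-style loss, where each output-patch feature is pulled toward its corresponding input-patch feature and pushed away from all others, and a Que-Attn-style query selection that scores each row of a self-attention map. The importance measure used here (low row entropy, i.e. focused attention) is an assumption for illustration; the paper's exact criterion may differ.

```python
import numpy as np

def patch_nce_loss(queries, keys, tau=0.07):
    """Sketch of a PatchNCE-style contrastive loss (assumption, not the
    paper's exact code). queries: output-patch features, keys: input-patch
    features; keys[i] is the positive for queries[i], all other keys act
    as negatives. Shapes: (N, D)."""
    # L2-normalize so dot products are cosine similarities
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    k = keys / np.linalg.norm(keys, axis=1, keepdims=True)
    logits = q @ k.T / tau  # (N, N) similarity matrix, positives on diagonal
    # cross-entropy with labels = arange(N), in log-space for stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))

def select_queries(features, k=4):
    """Sketch of Que-Attn-style query selection (importance measure is an
    assumption). Builds a row-softmaxed attention matrix over the patch
    features and keeps the k queries whose attention is most concentrated
    (lowest entropy)."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    scores = f @ f.T                                        # raw attention
    attn = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    entropy = -(attn * np.log(attn + 1e-12)).sum(axis=1)    # per-row entropy
    return np.argsort(entropy)[:k]   # indices of the k most focused queries
```

As a sanity check, matched query/key pairs should yield a lower loss than mismatched ones, since the positive similarity dominates the softmax denominator only in the matched case.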

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Benchmarking
  • Generalization, Psychological
  • Image Enhancement*
  • Image Processing, Computer-Assisted
  • Learning*

Grants and funding

This work was supported by the Natural Science Foundation of Shandong Province under Grant ZR2021MF031 and ZR2020MF147. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.