ZeRGAN: Zero-Reference GAN for Fusion of Multispectral and Panchromatic Images

IEEE Trans Neural Netw Learn Syst. 2023 Nov;34(11):8195-8209. doi: 10.1109/TNNLS.2021.3137373. Epub 2023 Oct 27.

Abstract

In this article, we present a new pansharpening method, a zero-reference generative adversarial network (ZeRGAN), which fuses low spatial resolution multispectral (LR MS) and high spatial resolution panchromatic (PAN) images. In the proposed method, zero-reference means that training requires neither paired reduced-scale images nor unpaired full-scale images. To obtain accurate fusion results, we establish an adversarial game between a set of multiscale generators and their corresponding discriminators. Through the multiscale generators, the fused high spatial resolution MS (HR MS) images are progressively produced from the LR MS and PAN images, while the discriminators aim to distinguish differences in spatial information between the HR MS and PAN images. In other words, once ZeRGAN is optimized, the HR MS images are generated directly from the LR MS and PAN images. Furthermore, we construct a nonreference loss function comprising an adversarial loss, spatial and spectral reconstruction losses, a spatial enhancement loss, and an average constancy loss. By minimizing the total loss, the spatial details in the HR MS images can be enhanced efficiently. Extensive experiments are conducted on datasets acquired by different satellites. The results demonstrate the effectiveness of the proposed method compared with state-of-the-art methods. The source code is publicly available at https://github.com/RSMagneto/ZeRGAN.
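To make the loss composition concrete, the following is a minimal PyTorch sketch of how the five nonreference terms might be combined. The specific definitions are assumptions chosen for illustration, not the paper's actual formulation: the least-squares adversarial form, gradient matching for the spatial enhancement term, per-band mean matching for the average constancy term, and the weights w are all hypothetical; the authors' implementation is in the linked repository.

import torch
import torch.nn.functional as F

def gradients(x):
    # Horizontal and vertical finite differences as a proxy for spatial detail.
    return x[..., :, 1:] - x[..., :, :-1], x[..., 1:, :] - x[..., :-1, :]

def zergan_total_loss(hr_ms, lr_ms, pan, d_fake, w=(1.0, 1.0, 1.0, 1.0, 0.1)):
    # hr_ms:  fused HR MS output of the generator, shape (B, C, H, W)
    # lr_ms:  input LR MS image, shape (B, C, h, w)
    # pan:    input PAN image, shape (B, 1, H, W)
    # d_fake: discriminator scores for the fused image's spatial content
    # w:      illustrative weights, one per loss term (assumed values)
    w_adv, w_spec, w_spat, w_enh, w_avg = w

    # Adversarial term (least-squares GAN form assumed): push the fused
    # image's spatial content toward PAN-like statistics.
    l_adv = torch.mean((d_fake - 1.0) ** 2)

    # Spectral reconstruction: downsampling the fused image should recover
    # the original LR MS image.
    down = F.interpolate(hr_ms, size=lr_ms.shape[-2:], mode='bicubic',
                         align_corners=False)
    l_spec = F.l1_loss(down, lr_ms)

    # Spatial reconstruction: the band-averaged intensity of the fusion
    # should stay close to the PAN image.
    intensity = hr_ms.mean(dim=1, keepdim=True)
    l_spat = F.l1_loss(intensity, pan)

    # Spatial enhancement (assumed form): match image gradients so that
    # PAN details transfer to the fusion.
    (ih, iv), (ph, pv) = gradients(intensity), gradients(pan)
    l_enh = F.l1_loss(ih, ph) + F.l1_loss(iv, pv)

    # Average constancy (assumed form): per-band means of the fusion should
    # match the LR MS means, stabilizing overall radiometry.
    l_avg = F.l1_loss(hr_ms.mean(dim=(2, 3)), lr_ms.mean(dim=(2, 3)))

    return (w_adv * l_adv + w_spec * l_spec + w_spat * l_spat
            + w_enh * l_enh + w_avg * l_avg)

Under this reading, the spectral and average constancy terms anchor the fusion to the LR MS radiometry, while the adversarial, spatial, and enhancement terms inject PAN detail; balancing the two groups is what makes a reference image unnecessary.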