Optimizing Latent Distributions for Non-Adversarial Generative Networks

IEEE Trans Pattern Anal Mach Intell. 2022 May;44(5):2657-2672. doi: 10.1109/TPAMI.2020.3043745. Epub 2022 Apr 1.

Abstract

The generator in generative adversarial networks (GANs) is driven by a discriminator to produce high-quality images through an adversarial game. At the same time, this adversarial game makes it harder to train a stable generator. This paper focuses on non-adversarial generative networks that are trained in a plain manner, without an adversarial loss. The given limited number of real images could be insufficient to fully represent the real data distribution. We therefore investigate a set of distributions in a Wasserstein ball centered on the distribution induced by the training data and propose to optimize the generator over this Wasserstein ball. We theoretically discuss the solvability of the newly defined objective function and develop a tractable reformulation for learning the generator. The connections and differences between the proposed non-adversarial generative networks and GANs are analyzed. Experimental results on real-world datasets demonstrate that the proposed algorithm can effectively learn image generators in a non-adversarial manner, and the generated images are of comparable quality to those from GANs.
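The core idea of the abstract, training a generator by directly minimizing a transport distance to the empirical data distribution rather than playing an adversarial game, can be illustrated with a minimal 1-D sketch. This is not the paper's algorithm (which optimizes over a Wasserstein ball around the empirical distribution and uses a tractable reformulation); it only shows the non-adversarial principle. All names and the toy generator `g(z) = a*z + b` are illustrative assumptions.

```python
import numpy as np

def w1_empirical(x, y):
    """1-D empirical 1-Wasserstein distance between equal-size samples:
    sort both samples and average the absolute differences
    (sorting gives the optimal coupling in one dimension)."""
    assert len(x) == len(y)
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

rng = np.random.default_rng(0)
real = rng.normal(loc=2.0, scale=1.0, size=2000)   # stand-in for "training data"
noise = rng.normal(size=2000)                       # latent samples z

# Toy "generator" g(z) = a*z + b: choose (a, b) by grid search to minimize
# the sample Wasserstein distance to the data -- no discriminator involved.
dist, a, b = min(
    (w1_empirical(a * noise + b, real), a, b)
    for a in np.linspace(0.5, 1.5, 11)
    for b in np.linspace(0.0, 4.0, 21)
)
```

Because the data were drawn from N(2, 1) and the noise from N(0, 1), the selected parameters land near a = 1, b = 2, i.e., the generator's output distribution matches the data distribution. The paper's robust objective would additionally guard against the empirical distribution misrepresenting the true one, by minimizing the worst-case loss over all distributions within a Wasserstein radius of it.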

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Image Processing, Computer-Assisted* / methods
  • Neural Networks, Computer*