Clustering Analysis via Deep Generative Models With Mixture Models

IEEE Trans Neural Netw Learn Syst. 2022 Jan;33(1):340-350. doi: 10.1109/TNNLS.2020.3027761. Epub 2022 Jan 5.

Abstract

Clustering is a fundamental problem that frequently arises in many fields, such as pattern recognition, data mining, and machine learning. Although various clustering algorithms have been developed in the past, traditional clustering algorithms with shallow structures cannot capture the interdependence of complex data features in latent space. Recently, deep generative models, such as the autoencoder (AE), variational AE (VAE), and generative adversarial network (GAN), have achieved remarkable success in many unsupervised applications thanks to their ability to learn informative latent representations from raw data. In this work, we first propose a novel clustering approach based on the Wasserstein GAN with gradient penalty (WGAN-GP) and a VAE with a Gaussian mixture prior. In the combined model, the generator of the WGAN-GP is formed by drawing samples from the probabilistic decoder of the VAE. Moreover, to provide more robust clustering and generation performance when the data contain outliers, a variant of the proposed deep generative model is developed based on a Student's-t mixture prior. The effectiveness of our deep generative models is validated through experiments on both clustering analysis and sample generation. Compared with other state-of-the-art clustering approaches based on deep generative models, the proposed approach provides more stable training, higher clustering accuracy, and more realistic generated samples.
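
To make the described architecture concrete, the following is a minimal PyTorch sketch of its main ingredients: a VAE whose latent prior is a learnable Gaussian mixture, with the probabilistic decoder reused as the generator of a WGAN-GP, plus the standard gradient-penalty term for the critic. All layer sizes, dimensions, and the penalty weight are illustrative assumptions, not the authors' reported settings; the Student's-t mixture variant would replace the Gaussian components of the prior.

import torch
import torch.nn as nn

class GMMPriorVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=10, n_components=10, h_dim=256):
        super().__init__()
        # Encoder q(z|x): outputs mean and log-variance of the latent code.
        self.encoder = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        # Probabilistic decoder p(x|z); it also serves as the WGAN-GP generator.
        self.decoder = nn.Sequential(
            nn.Linear(z_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, x_dim), nn.Sigmoid()
        )
        # Learnable Gaussian-mixture prior over z (one mean/log-variance per cluster).
        self.pi_logits = nn.Parameter(torch.zeros(n_components))
        self.prior_mu = nn.Parameter(torch.randn(n_components, z_dim))
        self.prior_logvar = nn.Parameter(torch.zeros(n_components, z_dim))

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps, with eps ~ N(0, I).
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = self.reparameterize(mu, logvar)
        return self.decoder(z), mu, logvar

    def sample(self, n):
        # Draw latent codes from the mixture prior and decode them; these
        # decoded samples are what the WGAN-GP critic sees as "fake" data.
        k = torch.multinomial(torch.softmax(self.pi_logits, 0), n, replacement=True)
        z = self.reparameterize(self.prior_mu[k], self.prior_logvar[k])
        return self.decoder(z)

def gradient_penalty(critic, real, fake, lam=10.0):
    # Standard WGAN-GP penalty on interpolates between real and generated data.
    eps = torch.rand(real.size(0), 1, device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    return lam * ((grad.norm(2, dim=1) - 1) ** 2).mean()

In a training loop of this kind, the critic would be updated on real data versus GMMPriorVAE.sample() outputs plus the gradient penalty, while the encoder/decoder would be updated with the VAE evidence lower bound (under the mixture prior) together with the adversarial signal; cluster assignments follow from the posterior responsibilities of the mixture components.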