Sparse Count Data Clustering Using an Exponential Approximation to Generalized Dirichlet Multinomial Distributions

IEEE Trans Neural Netw Learn Syst. 2022 Jan;33(1):89-102. doi: 10.1109/TNNLS.2020.3027539. Epub 2022 Jan 5.

Abstract

Clustering frequency vectors on large data sets is a challenging task because of their high dimensionality and sparse nature. The generalized Dirichlet multinomial (GDM) distribution is a competitive generative model for count data in terms of accuracy, yet its parameter estimation process is slow. The exponential-family approximation of the multivariate Polya distribution has been shown to be efficient to train and to cluster data directly, without dimensionality reduction. In this article, we derive an exponential-family approximation to the GDM distribution, which we call EGDM. A mixture model is developed based on this new member of the exponential family of distributions, and its parameters are learned through the deterministic annealing expectation-maximization (DAEM) approach, yielding a new clustering algorithm for count data. Moreover, we propose to estimate the optimal number of EGDM mixture components based on the minimum message length (MML) criterion. We have conducted a set of empirical experiments on text, image, and video clustering to evaluate the performance of the proposed approach. Results show that the new model attains superior performance and is considerably faster than the corresponding method based on GDM distributions.
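
The following is a minimal, illustrative sketch (not the authors' implementation) of the overall pipeline the abstract describes: fit a finite mixture of count-data components with deterministic annealing EM (DAEM), then select the number of components with a message-length-style penalized score. Because the exact EGDM density is derived in the paper itself, a plain multinomial component is used here as a hypothetical stand-in, and a BIC-like penalty stands in for the full MML expression.

```python
# Hypothetical sketch: DAEM for a count-data mixture plus penalized model selection.
# The multinomial component and the BIC-like penalty are simplifying assumptions,
# not the EGDM density or the exact MML criterion from the paper.
import numpy as np

def fit_daem_mixture(X, K, betas=(0.2, 0.5, 0.8, 1.0), n_iter=50, seed=0):
    """Fit a K-component multinomial mixture to count rows X with DAEM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                       # mixing weights
    theta = rng.dirichlet(np.ones(d), size=K)      # per-component category probabilities
    for beta in betas:                             # anneal inverse temperature toward 1
        for _ in range(n_iter):
            # E-step: tempered responsibilities r_ik proportional to (pi_k * p(x_i | theta_k))^beta
            log_p = X @ np.log(theta).T + np.log(pi)   # (n, K), multinomial coefficient omitted
            log_r = beta * log_p
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: closed-form updates for the multinomial stand-in
            pi = r.sum(axis=0) / n
            theta = (r.T @ X) + 1e-9
            theta /= theta.sum(axis=1, keepdims=True)
    log_p = X @ np.log(theta).T + np.log(pi)
    loglik = np.logaddexp.reduce(log_p, axis=1).sum()
    return pi, theta, loglik

def message_length_proxy(loglik, K, d, n):
    """BIC-like penalty standing in for the MML criterion: smaller is better."""
    n_params = K * (d - 1) + (K - 1)
    return -loglik + 0.5 * n_params * np.log(n)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Sparse synthetic count data drawn from two distinct multinomials.
    X = np.vstack([rng.multinomial(20, [0.7, 0.1, 0.1, 0.1], size=100),
                   rng.multinomial(20, [0.1, 0.1, 0.1, 0.7], size=100)]).astype(float)
    scores = {}
    for K in range(1, 5):
        _, _, ll = fit_daem_mixture(X, K)
        scores[K] = message_length_proxy(ll, K, X.shape[1], X.shape[0])
    print("selected number of components:", min(scores, key=scores.get))
```

In this sketch the annealing schedule tempers the E-step responsibilities, which is the standard DAEM device for avoiding poor local optima; the paper's actual method applies the same scheme to EGDM components and scores candidate model sizes with the full MML criterion rather than the simplified penalty shown here.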