Cross-collection latent Beta-Liouville allocation model training with privacy protection and applications

Appl Intell (Dordr). 2023 Jan 13:1-25. doi: 10.1007/s10489-022-04378-3. Online ahead of print.

Abstract

Cross-collection topic models extend single-collection topic models, such as Latent Dirichlet Allocation (LDA), to multiple document collections. The goal of cross-collection topic modeling is to learn document-topic representations that reveal, for each topic, both the commonalities shared across collections and the differences among them. However, the restrictiveness of the Dirichlet prior and significant privacy risks have limited these models' performance and utility; in particular, training a cross-collection topic model may leak sensitive information from the training dataset. To address these two issues, we propose a novel model, cross-collection latent Beta-Liouville allocation (ccLBLA), which employs a more flexible prior, the Beta-Liouville distribution, whose more general covariance structure enhances topic-correlation analysis. To provide privacy protection for the ccLBLA model, we leverage the inherent differential-privacy guarantee of the Collapsed Gibbs Sampling (CGS) inference scheme and propose a hybrid privacy-protection algorithm for ccLBLA (HPP-ccLBLA) that prevents inference of training data from intermediate statistics during CGS training without sacrificing utility. More notably, our work is the first to apply a cross-collection topic model to image classification, investigating the capabilities of cross-collection topic models beyond text analysis. Experimental results on comparative text mining and image classification demonstrate the merits of the proposed approach.
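To make the training scheme concrete, the sketch below shows a minimal collapsed Gibbs sampler for plain LDA (the single-collection baseline the abstract extends), followed by a Laplace-mechanism release of the topic-word counts. This is only an illustration of the general ideas (CGS inference; perturbing intermediate statistics for differential privacy), not the paper's ccLBLA or HPP-ccLBLA algorithms, whose details are not given here; all function names and parameters are our own assumptions.

```python
import numpy as np

def collapsed_gibbs_lda(docs, n_topics, n_vocab, alpha=0.1, beta=0.01,
                        n_iters=50, seed=0):
    """Minimal collapsed Gibbs sampler for vanilla LDA (Dirichlet priors).

    docs: list of documents, each a list of integer token ids.
    Returns the document-topic and topic-word count matrices.
    """
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), n_topics))   # document-topic counts
    nkw = np.zeros((n_topics, n_vocab))     # topic-word counts
    nk = np.zeros(n_topics)                 # per-topic token totals
    # Random initial topic assignment for every token.
    z = [rng.integers(n_topics, size=len(d)) for d in docs]
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the token's current assignment from the counts.
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Full conditional p(z = k | rest), up to a constant.
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + n_vocab * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw

def release_noisy_counts(nkw, epsilon=1.0, seed=0):
    """Laplace-mechanism release of topic-word counts (illustrative only).

    One token changes one count by 1, so the L1 sensitivity is taken as 1;
    noise scale 1/epsilon then gives epsilon-DP for this single release.
    """
    rng = np.random.default_rng(seed)
    return nkw + rng.laplace(scale=1.0 / epsilon, size=nkw.shape)
```

For example, on a toy corpus `docs = [[0, 0, 1], [2, 3, 3]]` with `n_vocab=4`, the sampler returns count matrices whose totals equal the number of tokens, and `release_noisy_counts` perturbs every cell before the statistics leave the trusted trainer.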

Keywords: Beta-Liouville prior; Comparative text mining; Cross-collection model; Differential privacy; Image classification; Topic correlation.