Probabilistic Distance for Mixtures of Independent Component Analyzers

IEEE Trans Neural Netw Learn Syst. 2018 Apr;29(4):1161-1173. doi: 10.1109/TNNLS.2017.2663843. Epub 2017 Feb 24.

Abstract

Independent component analysis (ICA) is a blind source separation technique in which data are modeled as linear combinations of several independent non-Gaussian sources. The independence and linearity restrictions are relaxed by using several ICA mixture models (ICAMMs), yielding a two-layer artificial neural network structure. This allows for dependence between sources of different classes, and thus a myriad of multidimensional probability density functions can be accurately modeled. This paper proposes a new probabilistic distance (PDI) between the parameters learned for two ICAMMs. The PDI is computed explicitly, unlike the popular Kullback-Leibler divergence (KLD) and other similar metrics, removing the need for numerical integration. Furthermore, the PDI is symmetric and bounded between 0 and 1, which enables its use as a posterior probability in fusion approaches. In this paper, the PDI is employed for change detection by measuring the distance between two ICAMMs learned in consecutive time windows. The changes may be associated with relevant states of the process under analysis that are explicitly reflected in the learned ICAMM parameters. The proposed distance was tested in two challenging applications using simulated and real data: 1) detecting flaws in materials using ultrasound and 2) detecting changes in electroencephalography signals from humans performing neuropsychological tests. The results demonstrate that the PDI outperforms the KLD in change-detection capability.
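The change-detection workflow described in the abstract (fit a density model per time window, compare consecutive models with a symmetric distance bounded in [0, 1], and flag a change when the distance is large) can be sketched in a few lines. The sketch below is only illustrative and makes several substitutions: scikit-learn's GaussianMixture stands in for ICAMM learning, a Monte Carlo symmetrized KLD stands in for a model distance (it is exactly the kind of numerical estimate the closed-form PDI is meant to avoid), and the 1 - exp(-D) mapping to [0, 1] is an assumed device, not the PDI definition from the paper.

```python
# Minimal sliding-window change-detection sketch, assuming GaussianMixture
# as a stand-in for ICAMM learning and a Monte Carlo symmetrized KLD mapped
# to [0, 1] as a stand-in for the paper's closed-form PDI.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

def symmetric_kld(p, q, n_samples=5000):
    """Monte Carlo estimate of the symmetrized KL divergence between two
    fitted GaussianMixture models (a numerical-integration-style estimate,
    unlike the explicitly computed PDI)."""
    xp, _ = p.sample(n_samples)
    xq, _ = q.sample(n_samples)
    kl_pq = np.mean(p.score_samples(xp) - q.score_samples(xp))
    kl_qp = np.mean(q.score_samples(xq) - p.score_samples(xq))
    return kl_pq + kl_qp

def bounded_distance(p, q):
    """Map the non-negative, unbounded divergence into [0, 1] so it can be
    treated like a posterior probability of change -- an illustrative
    choice, not the PDI formula."""
    return 1.0 - np.exp(-symmetric_kld(p, q))

# Synthetic 2-D signal whose generating distribution shifts halfway through.
window = 500
data = np.concatenate([rng.normal(0.0, 1.0, (1000, 2)),
                       rng.normal(3.0, 1.0, (1000, 2))])

prev_model = None
for start in range(0, len(data) - window + 1, window):
    model = GaussianMixture(n_components=2, random_state=0)
    model.fit(data[start:start + window])
    if prev_model is not None:
        d = bounded_distance(prev_model, model)
        flag = "-> change detected" if d > 0.5 else ""
        print(f"window ending at {start + window}: distance = {d:.3f} {flag}")
    prev_model = model
```

In this toy run, the distance between the two windows drawn from the same regime stays small, while the distance across the simulated shift approaches 1, which is the behavior the bounded, symmetric PDI is designed to expose between consecutive ICAMMs.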

Publication types

  • Research Support, Non-U.S. Gov't