A mutual information criterion with applications to canonical correlation analysis and graphical models

Stat (Int Stat Inst). 2021 Dec;10(1):e385. doi: 10.1002/sta4.385. Epub 2021 Sep 7.

Abstract

This paper derives a criterion for deciding conditional independence that is consistent with small-sample corrections of Akaike's information criterion but is easier to apply to such problems as selecting variables in canonical correlation analysis and selecting graphical models. The criterion reduces to mutual information when the assumed distribution equals the true distribution; hence, it is called the mutual information criterion (MIC). Although small-sample Kullback-Leibler criteria for these selection problems have been proposed previously, some of which are not widely known, MIC is strikingly more direct to derive and apply.
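For reference, the standard definitions of mutual information and conditional mutual information that the abstract alludes to are (the paper's exact small-sample MIC formula is not given in this abstract):

$$
I(X;Y) = \mathbb{E}_{p(x,y)}\!\left[\log \frac{p(x,y)}{p(x)\,p(y)}\right],
\qquad
I(X;Y \mid Z) = \mathbb{E}_{p(x,y,z)}\!\left[\log \frac{p(x,y \mid z)}{p(x \mid z)\,p(y \mid z)}\right].
$$

Since $X \perp Y \mid Z$ holds exactly when $I(X;Y \mid Z) = 0$, an estimate of (conditional) mutual information can serve as a criterion for conditional independence, which is the role MIC plays here.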

Keywords: Akaike's information criterion; CCA; model selection; mutual information.