An empirical comparison of information-theoretic selection criteria for multivariate behavior genetic models

Behav Genet. 2004 Nov;34(6):593-610. doi: 10.1007/s10519-004-5587-0.

Abstract

Information theory provides an attractive basis for statistical inference and model selection. However, little is known about the relative performance of different information-theoretic criteria in covariance structure modeling, especially in behavioral genetic contexts. To explore these issues, information-theoretic fit criteria were compared with regard to their ability to discriminate between multivariate behavioral genetic models under various model, distribution, and sample size conditions. Results indicate that performance depends on sample size, model complexity, and distributional specification. The Bayesian Information Criterion (BIC) is more robust to distributional misspecification than Akaike's Information Criterion (AIC) under certain conditions, and outperforms AIC in larger samples and when more complex models are compared. An approximation to the Minimum Description Length criterion (MDL; Rissanen, 1996, IEEE Transactions on Information Theory 42:40-47; Rissanen, 2001, IEEE Transactions on Information Theory 47:1712-1717) involving the empirical Fisher information matrix exhibits variable performance, reflecting the difficulty of estimating Fisher information matrices. A relatively new criterion, Draper's Information Criterion (DIC; Draper, 1995), which shares features of the Bayesian and MDL criteria, performs similarly to or better than BIC. These findings underscore the need for further research into the theory and computation of information-theoretic criteria.
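
For orientation, the standard textbook forms of the criteria being compared are sketched below, for a model with maximized log-likelihood $\ln L(\hat{\theta})$, $k$ free parameters, and sample size $n$. These are the conventional expressions, with the MDL line following Rissanen's (1996) Fisher-information approximation; they may differ in detail from the exact formulations used in the study.

    $\mathrm{AIC} = -2 \ln L(\hat{\theta}) + 2k$
    $\mathrm{BIC} = -2 \ln L(\hat{\theta}) + k \ln n$
    $\mathrm{MDL} \approx -\ln L(\hat{\theta}) + \frac{k}{2}\ln\frac{n}{2\pi} + \ln \int \sqrt{\det I(\theta)}\, d\theta$

Here $I(\theta)$ denotes the per-observation Fisher information matrix; the integral term is the component the study approximates using the empirical Fisher information matrix.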

Publication types

  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Bayes Theorem
  • Computer Simulation
  • Genetics, Behavioral*
  • Humans
  • Models, Genetic*
  • Models, Psychological*
  • Monte Carlo Method
  • Normal Distribution
  • Patient Selection
  • Reproducibility of Results