Family-Based Benchmarking of Copy Number Variation Detection Software

PLoS One. 2015 Jul 21;10(7):e0133465. doi: 10.1371/journal.pone.0133465. eCollection 2015.

Abstract

The analysis of structural variants, in particular of copy number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pairwise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using hidden Markov models (HMMs) showed a trend toward predicting fewer CNVs than segmentation-based algorithms, albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.
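The intra-familial validation described above can be illustrated with a minimal sketch: a CNV predicted in a child is counted as validated if a CNV of the same type (deletion or duplication) is also predicted in at least one parent at an overlapping position. The class names, the reciprocal-overlap criterion, and the 50% threshold below are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CNV:
    chrom: str
    start: int   # 0-based, inclusive
    end: int     # exclusive
    kind: str    # "del" or "dup"

def same_type_overlap(a: CNV, b: CNV, min_frac: float = 0.5) -> bool:
    """Reciprocal-overlap match (the 50% threshold is an assumption)."""
    if a.chrom != b.chrom or a.kind != b.kind:
        return False
    inter = min(a.end, b.end) - max(a.start, b.start)
    if inter <= 0:
        return False
    # require the overlap to cover min_frac of BOTH calls
    return (inter / (a.end - a.start) >= min_frac
            and inter / (b.end - b.start) >= min_frac)

def validation_rate(child_cnvs: list[CNV], parent_cnvs: list[CNV]) -> float:
    """Fraction of the child's CNV calls matched by a same-type parental call."""
    if not child_cnvs:
        return 0.0
    hits = sum(any(same_type_overlap(c, p) for p in parent_cnvs)
               for c in child_cnvs)
    return hits / len(child_cnvs)
```

For example, a child deletion on chromosome 1 matched by an overlapping parental deletion counts as validated, while a child duplication with no parental counterpart does not; the rate over those two calls would be 0.5. Note that, as the abstract points out, such in silico validation can succeed by chance alone, which is why the paper recommends laboratory confirmation (e.g., qPCR).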

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Benchmarking / methods*
  • Computational Biology / methods*
  • Computer Simulation
  • DNA Copy Number Variations / genetics*
  • Genome, Human / genetics
  • Genome-Wide Association Study / methods
  • Genotype
  • Humans
  • Polymerase Chain Reaction / methods
  • Polymorphism, Single Nucleotide*
  • Reproducibility of Results
  • Software Validation
  • Software*

Grants and funding

This study was supported by the German Ministry of Education and Research (BMBF) through the DFG Cluster of Excellence EXC 306 “Inflammation at Interfaces”. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.