What is Normalization? The Strategies Employed in Top-Down and Bottom-Up Proteome Analysis Workflows

Proteomes. 2019 Aug 22;7(3):29. doi: 10.3390/proteomes7030029.

Abstract

The accurate quantification of changes in protein abundance is one of the main applications of proteomics. Accuracy can be compromised by bias and error introduced at many points in the experimental process, and normalization strategies are crucial for overcoming this bias and returning the sample to its regular biological condition, or normal state. Much work has been published on normalizing data post-acquisition, with many algorithms and statistical processes available. However, many other sources of bias arising during experimental design and sample handling remain unaddressed. This article aims to cast light on these potential sources of bias and on where normalization could be applied to return the sample to its normal state. Throughout, we suggest solutions where possible, but in some cases none are yet available. We therefore see this article as a starting point for discussing how normalization should be defined, and the issues surrounding the concept, as it applies to the proteomic analysis of biological samples. Specifically, we discuss a wide range of normalization techniques that can be applied at each stage of the sample preparation and analysis process.
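As a concrete illustration of the post-acquisition normalization the abstract refers to, the sketch below implements median normalization, one of the simpler algorithms in common use: each sample's intensities are rescaled so that all sample medians match a common reference level, correcting for global differences such as unequal sample loading. The matrix layout, function name, and use of NumPy are illustrative assumptions, not details taken from the article.

```python
import numpy as np

def median_normalize(intensities: np.ndarray) -> np.ndarray:
    """Scale each sample (column) so its median matches the global median.

    Assumes `intensities` is a proteins x samples matrix of positive,
    non-missing abundance values; real proteomic data would require
    missing-value handling before this step.
    """
    sample_medians = np.median(intensities, axis=0)  # per-sample median
    target = np.median(sample_medians)               # common reference level
    return intensities * (target / sample_medians)   # rescale each column

# Hypothetical example: sample 2 was loaded at roughly 2x concentration.
data = np.array([
    [10.0, 21.0, 9.5],
    [50.0, 98.0, 52.0],
    [30.0, 61.0, 29.0],
])
normalized = median_normalize(data)
print(np.median(normalized, axis=0))  # all sample medians are now equal
```

Note that a global rescaling of this kind can only correct sample-wide bias; per-protein or stage-specific biases of the sort the review surveys require other strategies.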

Keywords: 2D-PAGE; LC-MS/MS; normalization; top-down proteomics.

Publication types

  • Review