The curse of normalization

Olaf Wolkenhauer*, Carla Möller-Levet, Fatima Sanchez-Cabo

*Corresponding author for this work

Research output: Contribution to journal › Review article / Perspectives › peer-review

Abstract

Despite its enormous promise to further our understanding of the cellular processes involved in the regulation of gene expression, microarray technology generates data for which statistical pre-processing has become a necessity before any interpretation can begin. The process by which we distinguish non-biological variation from biological variation, and remove the former, is called normalization. With a multitude of experimental designs, techniques and technologies influencing the acquisition of data, numerous approaches to normalization have been proposed in the literature. The purpose of this short review is not to add to the many suggestions that have been made, but to discuss some of the difficulties we encounter when analysing microarray data.
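
The keywords below point to singular value decomposition as one of the tools the review touches on. As a purely illustrative sketch, not a method taken from the review itself, the following Python snippet shows how zeroing the leading singular component of a gene-by-array matrix can strip a dominant, presumably non-biological pattern; the gene count, noise levels and variable names are all hypothetical.

    import numpy as np

    # Hypothetical data: 100 genes x 8 arrays of log-ratios, where a
    # shared per-array offset stands in for non-biological variation.
    rng = np.random.default_rng(0)
    biology = rng.normal(0.0, 0.5, size=(100, 8))
    array_bias = rng.normal(0.0, 1.0, size=(1, 8))   # systematic offset
    X = biology + array_bias

    # Centre each gene, then decompose: X_c = U @ diag(s) @ Vt.
    X_c = X - X.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(X_c, full_matrices=False)

    # Zeroing the first singular value removes the strongest pattern
    # shared across arrays. Whether that pattern really is non-biological
    # is precisely the judgement normalization forces on the analyst.
    s_filtered = s.copy()
    s_filtered[0] = 0.0
    X_norm = U @ np.diag(s_filtered) @ Vt

    print("variance before:", round(X_c.var(), 3),
          "after:", round(X_norm.var(), 3))

On synthetic data like this the first component absorbs almost all of the injected array bias; on real arrays it may also absorb genuine biology, which is exactly the difficulty the title alludes to.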

Original language: English
Pages (from-to): 375-379
Number of pages: 5
Journal: Comparative and Functional Genomics
Volume: 3
Issue number: 4
DOIs
State: Published - Aug 2002
Externally published: Yes

Keywords

  • Bioinformatics
  • Microarrays
  • Normalization
  • Singular value decomposition
