The Fundamental Theorem of Natural Selection

Entropy (Basel). 2021 Oct 30;23(11):1436. doi: 10.3390/e23111436.

Abstract

Suppose we have $n$ different types of self-replicating entity, with the population $P_i$ of the $i$th type changing at a rate equal to $P_i$ times the fitness $f_i$ of that type. Suppose the fitness $f_i$ is any continuous function of all the populations $P_1, \dots, P_n$. Let $p_i$ be the fraction of replicators that are of the $i$th type. Then $p = (p_1, \dots, p_n)$ is a time-dependent probability distribution, and we prove that its speed as measured by the Fisher information metric equals the variance in fitness. In rough terms, this says that the speed at which information is updated through natural selection equals the variance in fitness. This result can be seen as a modified version of Fisher's fundamental theorem of natural selection. We compare it to Fisher's original result as interpreted by Price, Ewens and Edwards.

Keywords: Fisher information metric; Lotka–Volterra equation; natural selection; population biology; replicator equation.
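
As a numerical sanity check of the statement in the abstract, the sketch below (assuming NumPy) simulates the population dynamics $\dot P_i = f_i(P)\,P_i$ for an illustrative Lotka–Volterra-style choice of fitness functions and compares the Fisher-metric quantity $\sum_i \dot p_i^2/p_i$ with the variance of fitness under $p$. The specific fitness form, the parameter values, and helper names such as `fitness` and `Pdot` are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

# Illustrative Lotka-Volterra-style fitness: f_i(P) = r_i + (A P)_i.
# The abstract only requires each f_i to be continuous in P; this choice,
# the parameter values, and the helper names are illustrative assumptions.
rng = np.random.default_rng(0)
n = 4
r = rng.normal(size=n)                   # illustrative intrinsic growth rates
A = rng.normal(scale=0.1, size=(n, n))   # illustrative interaction matrix

def fitness(P):
    """Fitness f_i(P) of each replicator type."""
    return r + A @ P

def Pdot(P):
    """Population dynamics from the abstract: dP_i/dt = f_i(P) * P_i."""
    return fitness(P) * P

# Pick an arbitrary population state.
P = rng.uniform(1.0, 5.0, size=n)
p = P / P.sum()                          # fractions p_i form a probability distribution
f = fitness(P)
fbar = p @ f                             # mean fitness under p

# Estimate dp/dt by a small explicit Euler step of the full population dynamics.
dt = 1e-6
P_next = P + dt * Pdot(P)
p_next = P_next / P_next.sum()
pdot = (p_next - p) / dt

fisher_speed_sq = np.sum(pdot**2 / p)         # squared Fisher-metric speed of p
fitness_variance = np.sum(p * (f - fbar)**2)  # variance of fitness under p

print(fisher_speed_sq, fitness_variance)
assert np.isclose(fisher_speed_sq, fitness_variance, rtol=1e-4)
```

The check works because $\dot P_i = f_i P_i$ implies the replicator equation $\dot p_i = p_i (f_i - \bar f)$, so the squared Fisher speed $\sum_i \dot p_i^2 / p_i$ reduces to $\sum_i p_i (f_i - \bar f)^2$, which is exactly the variance in fitness.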