User:Gregor


Gaussian adaptation as a model of evolution

According to a certain blog (see the references below), a pocketful of theorems makes it plausible to use Gaussian adaptation as a simple second-order statistical model of the evolution of quantitative traits, provided that those traits are Gaussian distributed, or nearly so. The scientific community does not accept this opinion, but nobody has thus far shown that any one of the theorems referred to is wrong, or that it cannot be applied to evolution.

Together, those theorems show a duality between mean fitness and average information (phenotypic disorder, diversity), and suggest that evolution may carry out a simultaneous maximization of mean fitness and average information. This also means that the process gains more information in the art of survival.


As shown earlier (see references), Gaussian adaptation (GA) may be used for the maximization of manufacturing yield. The biological analogue of technical manufacturing yield is mean fitness, and a plausible definition of mean fitness, P, as a mean of probabilities is

  P = integral s(x) N(m – x) dx 

where s(x) is the probability of survival of an individual having the array of n quantitative (Gaussian distributed) traits x(i), i = 1, 2, …, n, and N is the Gaussian probability density function, p.d.f., with mean m. This definition may not be very suitable for breeding programs. Nevertheless, it seems very useful in many philosophical discussions.
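As a numerical illustration, P can be estimated by Monte Carlo sampling: draw phenotypes x from N(m, M) and average s(x). A minimal sketch in Python, where the Gaussian-shaped s(x), the centre m and the moment matrix M are all hypothetical examples rather than anything taken from the references:

   import numpy as np

   rng = np.random.default_rng(0)

   def s(x):
       # Hypothetical survival probability, largest at the optimum x = 0.
       return np.exp(-0.5 * np.sum(x**2, axis=-1))

   m = np.array([0.5, -0.3])   # centre of gravity of the parent phenotypes
   M = 0.2 * np.eye(2)         # moment (covariance) matrix of the traits

   x = rng.multivariate_normal(m, M, size=100_000)
   P = s(x).mean()             # estimates integral s(x) N(m - x) dx
   print(P)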

According to point 7 below, there must also be a balance between order and disorder, obtained by a heritable mutation rate such that P is kept at a suitable level. In such a case evolution may maximize average information while keeping mean fitness constant.

1. The central limit theorem: Sums of a large number of random steps tend to become Gaussian distributed.

Since the development from fertilized egg to adult individual may be seen as a modified recapitulation of the stepwise evolution leading up to that individual, morphological characters (parameters x) tend to become Gaussian distributed. Examples of such parameters are the length of a bone, the distance between the pupils, or even the IQ.
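This is easy to check by simulation, assuming only the theorem itself: a trait built up as the sum of many independent random steps comes out approximately Gaussian. A small sketch (the step distribution and sizes are arbitrary choices):

   import numpy as np

   rng = np.random.default_rng(0)

   # Each simulated "trait" is the sum of 500 small random steps.
   steps = rng.uniform(-1.0, 1.0, size=(10_000, 500))
   traits = steps.sum(axis=1)

   # For a Gaussian, skewness and excess kurtosis are both 0.
   z = (traits - traits.mean()) / traits.std()
   print((z**3).mean(), (z**4).mean() - 3.0)   # both close to 0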

2. The Hardy-Weinberg law: If mating takes place at random, then the allele frequencies in the next generation are the same as they were for the parents. Thus, the centre of gravity of the phenotypes of the offspring coincides with that of the parents.
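The first half of the law can be verified directly; a minimal sketch for one biallelic locus, where the parental frequency 0.3 is an arbitrary example:

   import numpy as np

   rng = np.random.default_rng(0)

   p = 0.3                                  # frequency of allele A among the parents
   # Random mating: every offspring draws two gametes independently.
   gametes = rng.random((100_000, 2)) < p   # True means allele A
   print(p, gametes.mean())                 # offspring frequency agrees up to noise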

3. The definitions of average information and phenotypic disorder (diversity), H, are equivalent and are valid for all statistical frequency functions p(i), i = 1, 2, …, n, with Sum{ p(i) } = 1.

     H = – sum p(i) log[ p(i) ].
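For a concrete feel for H, a short sketch computing the average information of two example frequency functions; the uniform one comes out as the most disordered:

   import numpy as np

   def H(p):
       # Average information / phenotypic disorder: H = -sum p(i) log p(i)
       p = np.asarray(p, dtype=float)
       return -np.sum(p * np.log(p))

   print(H([0.25, 0.25, 0.25, 0.25]))   # uniform, maximal disorder: log 4 ~ 1.386
   print(H([0.70, 0.10, 0.10, 0.10]))   # more ordered, lower H: ~ 0.940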

4. The second law of thermodynamics (the entropy law): The disorder will always increase in all isolated systems.

But in order to avoid considering isolated systems, I prefer an alternative formulation: a system attains its possible macrostates in proportion to their probability of occurrence. The most probable states are then the most disordered.

5. A theorem about disorder: The normal distribution is the most disordered distribution among all statistical distributions having the same moment matrix, M.
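Point 5 can be illustrated in one dimension with known closed-form differential entropies: among distributions sharing the variance sigma^2, the Gaussian entropy, (1/2) log(2 pi e sigma^2), is the largest. A quick check against the uniform and Laplace distributions:

   import numpy as np

   sigma2 = 1.0                             # common variance

   h_gauss = 0.5 * np.log(2 * np.pi * np.e * sigma2)
   h_uniform = 0.5 * np.log(12 * sigma2)    # uniform interval with variance sigma2
   b = np.sqrt(sigma2 / 2)                  # Laplace scale giving variance sigma2
   h_laplace = 1 + np.log(2 * b)

   print(h_gauss, h_uniform, h_laplace)     # ~1.42 exceeds ~1.24 and ~1.35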

6. A more general formulation of the theorem of Gaussian adaptation: (a) The gradient of the mean fitness of a normal p.d.f. with respect to m is equal to

  grad P(m) = P inverse(M) ( m* – m). 

The necessary condition for maximum mean fitness is m* = m (at selective equilibrium), where m* is the centre of gravity of the phenotypes of the parents.

(b) The gradient of phenotypic disorder (entropy, average information, diversity) with respect to m, assuming P constant, points in the same direction as grad P(m).

(c) A normal p.d.f. may be adapted for maximum average information to any s(x) at any given value of P. The necessary conditions for the maximum are

  m* = m   and M* proportional to M

When m* = m at selective equilibrium, as achieved according to point 2, the gradient is 0, and mean fitness and average information (phenotypic disorder, diversity) may be simultaneously maximal. For the proof see Kjellström & Taxén, 1981.
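A minimal sketch of the resulting adaptation loop, under the same hypothetical s(x) as in the sketches above: sample phenotypes from N(m, M), let each survive with probability s(x), and move m to the survivors' centre of gravity m*.

   import numpy as np

   rng = np.random.default_rng(0)

   def s(x):
       # Hypothetical survival probability with its optimum at the origin.
       return np.exp(-0.5 * np.sum(x**2, axis=-1))

   m = np.array([2.0, -1.5])     # initial centre of gravity
   M = 0.5 * np.eye(2)           # initial moment matrix

   for generation in range(50):
       x = rng.multivariate_normal(m, M, size=2000)
       parents = x[rng.random(len(x)) < s(x)]   # natural selection
       m = parents.mean(axis=0)                 # m*: centre of gravity of parents
       M = np.cov(parents.T)                    # adapt M to the survivors' spread
       # (A full GA would also re-inflate M to hold P at a set level; see point 7.)

   print(m)   # m approaches the optimum; m* = m at selective equilibrium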

7. The theorem of efficiency: All measures of efficiency satisfying certain simple, relevant postulates are asymptotically proportional to -P log(P) as the number of statistically independent parameters tends towards infinity.
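This maximum is easy to locate, taking log as the natural logarithm:

   dE/dP = d/dP [ -P log(P) ] = -( log(P) + 1 ) = 0   =>   P = 1/e

which is the maximum-efficiency level quoted at the end of this section.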

The most important difference between natural evolution and the simulated evolution in my PC is that the natural one is able to test millions of individuals in parallel, while my PC has to test one at a time. This means that when evolution replaces one generation of a population of one million individuals with a new one in one year, the same operation will take one million years in my PC. In spite of this I find the simulated evolution very efficient.

As shown earlier, maximum efficiency is achieved when P = 1/e ≈ 0.37. For the proof see Kjellström, 1991, in the reference list. --Gregor 10:08, 29 December 2007 (EST)

References

Bergström, R. M., 1969. An Entropy Model of the Developing Brain. Developmental Psychobiology, 2(3): 139-152.

Bergström, M. Hjärnans resurser (The Resources of the Brain). Brain Books, ISBN 91-88410-07-2, Jönköping, 1992. (Swedish).

Bergström, M. Neuropedagogik. En skola för hela hjärnan (Neuropedagogy. A School for the Whole Brain). Wahlström & Widstrand, 1995. (Swedish).

Cramér, H. Mathematical Methods of Statistics. Princeton, Princeton University Press, 1961.

Dawkins, R. The Selfish Gene. Oxford University Press, 1976.

Eigen, M. Steps towards life. Oxford University Press, 1992.

Gaines, Brian R. Knowledge Management in Societies of Intelligent Adaptive Agents. Journal of Intelligent Information Systems 9, 277-298 (1997).

Goldberg, D. E. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, New York, 1989.

Hartl, D. L. A Primer of Population Genetics. Sinauer, Sunderland, Massachusetts, 1981.

Kandel, E. R., Schwartz, J. H., Jessel, T. M. Essentials of Neural Science and Behavior. Prentice Hall International, London, 1995.

Kjellström, G. Network Optimization by Random Variation of Component Values. Ericsson Technics, vol. 25, no. 3, pp. 133-151, 1969.

Kjellström, G. Optimization of Electrical Networks with Respect to Tolerance Costs. Ericsson Technics, no. 3, pp. 157-175, 1970.

Kjellström, G. & Taxén, L. Stochastic Optimization in System Design. IEEE Trans. on Circ. and Syst., vol. CAS-28, no. 7, July 1981.

Kjellström, G. On the Efficiency of Gaussian Adaptation. Journal of Optimization Theory and Applications, vol. 71, no. 3, Dec. 1991.

Kjellström, G. & Taxén, L. Gaussian Adaptation, an evolution-based efficient global optimizer. In C. Brezinski & U. Kulisch (Editors), Computational and Applied Mathematics, Elsevier Science Publishers B. V., pp. 267-276, 1992.

Kjellström, G. Evolution as a statistical optimization algorithm. Evolutionary Theory 11:105-117 (January, 1996).

Kjellström, G. The evolution in the brain. Applied Mathematics and Computation, 98(2-3):293-300, February, 1999.

Kjellström, G. Evolution in a nutshell and some consequences concerning valuations. EVOLVE, ISBN 91-972936-1-X, Stockholm, 2002.

Levine, D. S. Introduction to Neural & Cognitive Modeling. Lawrence Erlbaum Associates, Inc., Publishers, 1991.

MacLean, P. D. A Triune Concept of the Brain and Behavior. Toronto, Univ. Toronto Press, 1973.

Maynard Smith, J. Evolutionary Genetics. Oxford University Press, 1998.

Mayr, E. What Evolution is. Basic Books, New York, 2001.

Middleton, D. An Introduction to Statistical Communication Theory. McGraw-Hill, 1960.

Rechenberg, I. Evolutionsstrategie. Frommann-Holzboog, Stuttgart, 1973.

Reif, F. Fundamentals of Statistical and Thermal Physics. McGraw-Hill, 1985.

Ridley, M. Evolution. Blackwell Science, 1996.

Shannon, C. E. A Mathematical Theory of Communication, Bell Syst. Techn. J., Vol. 27, pp 379-423, (Part I), 1948.

Stehr, G. On the Performance Space Exploration of Analog Integrated Circuits. Dissertation, Technische Universität München, 2005.

Taxén, L. A Framework for the Coordination of Complex Systems’ Development. Institute of Technology, Linköping University, 2003.

Zohar, D. The Quantum Self: A Revolutionary View of Human Nature and Consciousness Rooted in the New Physics. Bloomsbury, London, 1990.

Åslund, N. The fundamental theorems of information theory (Swedish). Nordisk Matematisk Tidskrift, Band 9, Oslo 1961.