Maximum entropy probability distributions, from Wikipedia.
For example, the normal distribution has maximal entropy among all distributions with a given mean and variance; the exponential distribution has maximal entropy among all distributions with positive support and a given mean; the uniform distribution has maximal entropy among all distributions supported on an interval.
These are all special cases of a theorem due to Boltzmann of which I was surprisingly unaware.
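A quick numerical illustration of the first claim: a sketch (using only the standard library, with closed-form entropy formulas) comparing the differential entropy of three zero-mean, unit-variance distributions. The theorem predicts the normal comes out on top.

```python
import math

# Differential entropies (in nats) of three distributions, each with
# mean 0 and variance 1. The maximum-entropy theorem says the normal
# should have the largest entropy of the three.

# Normal N(0, 1): h = (1/2) ln(2*pi*e)
h_normal = 0.5 * math.log(2 * math.pi * math.e)

# Uniform on [-sqrt(3), sqrt(3)] (variance 1): h = ln(2*sqrt(3))
h_uniform = math.log(2 * math.sqrt(3))

# Laplace with scale b = 1/sqrt(2) (variance 2*b^2 = 1): h = 1 + ln(2b)
h_laplace = 1 + math.log(2 / math.sqrt(2))

print(f"normal : {h_normal:.4f}")
print(f"uniform: {h_uniform:.4f}")
print(f"laplace: {h_laplace:.4f}")
```

As expected, the normal's entropy (about 1.419 nats) exceeds both the uniform's and the Laplace's.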
05 February 2008
1 comment:
This beautiful formula by Boltzmann brings together statistical physics (energy distribution in the Gibbs ensembles), information theory (entropy), and classical probability (Gauss etc.). It looks like one of the great unifying ideas of science. George W. Mackey used it in his book "The Mathematical Foundations of Quantum Mechanics" to treat statistical mechanics. It also explains the prevalence of the bell curve in statistics.