Entropy of a probability distribution

{{subpages}}
The '''entropy''' of a [[probability distribution]] is a number that describes the degree of uncertainty or disorder the distribution represents.


==Examples==


Assume we have a set of two mutually exclusive propositions (or equivalently, a [[random experiment]] with two possible outcomes). Assume both possibilities are equally likely.


Then our advance uncertainty about the eventual outcome is rather small: we know in advance that it will be one of exactly two known alternatives.
Assume now we have a set of a million alternatives - all of them equally likely - rather than two.

It seems clear that our uncertainty about the eventual outcome will now be much bigger.
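To put numbers on this comparison (anticipating the formal definition below, applied to the uniform case): with <math>n</math> equally likely alternatives the entropy works out to <math>\log_2 n</math> bits, so two alternatives carry <math>\log_2 2 = 1</math> bit of uncertainty, while a million alternatives carry <math>\log_2 10^6 \approx 19.93</math> bits.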


==Formal definitions==
#Given a [[discrete probability distribution]] function f, the entropy H of the distribution (measured in [[bits]]) is given by <math>H=-\sum_{i\,:\,f(x_i) \ne 0} f(x_i) \log_2 f(x_i)</math>
#Given a [[continuous probability distribution]] function f, the entropy H of the distribution (again measured in bits) is given by <math>H=-\int_{x\,:\,f(x) \ne 0} f(x) \log_2 f(x)\, dx</math>

The sum and the integral are restricted to points where f is nonzero, since <math>\log_2 0</math> is undefined. Note that some authors prefer units other than the bit to measure entropy; the formulas then use a different base for the logarithm (for example, the natural logarithm rather than base two, giving the entropy in nats). Also, the symbol S is sometimes used rather than H.
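As a concrete illustration of the discrete formula, here is a minimal sketch in Python; the function name <code>entropy_bits</code> is our own choice for this example and is not part of the article.

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete distribution given as probabilities."""
    # Sum only over outcomes with nonzero probability, matching the
    # restriction i : f(x_i) != 0 in the formula above.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equally likely outcomes: exactly 1 bit of uncertainty.
print(entropy_bits([0.5, 0.5]))        # 1.0

# A million equally likely outcomes: about 19.93 bits.
n = 10**6
print(entropy_bits([1.0 / n] * n))     # approximately 19.93
```

The two calls reproduce the values quoted in the Examples section: more equally likely alternatives means a larger entropy, i.e. more advance uncertainty about the outcome.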


== See also ==
*[[Entropy in thermodynamics and information theory]]
*[[Discrete probability distribution]]
*[[Continuous probability distribution]]
== References ==


== External links ==
[[Category:Mathematics Workgroup]]
[[Category:CZ Live]]
