Uniformity increases entropy

by John D. Cook

Suppose you have a system with n possible states. The entropy of the system is maximized when all states are equally likely to occur. The entropy is minimized when one outcome is certain to occur.

You can say more. Starting from any set of probabilities, as you move in the direction of more uniformity, you increase entropy. And as you move in the direction of less uniformity, you decrease entropy.

These statements can be quantified and stated more precisely. That's what the rest of this post will do.

***

Let p_i be the probability of the ith state and let p be the vector of the p_i:

p = (p_1, p_2, \ldots, p_n)

Then the entropy of p is defined as

H(p) = -\sum_{i=1}^n p_i \log_2 p_i

If one of the p_i is 1 and the rest are zero, then H(p) = 0. (In the definition of entropy, 0 log_2 0 is taken to be 0. You could justify this as the limit of x log_2 x as x goes to zero.)
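That convention carries straight into code. Here is a minimal sketch in Python (the function name entropy and the use of NumPy are my choices, not from the post) that drops the zero terms and checks the certain-outcome case:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, with the convention 0 log2 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                             # dropping zero entries implements the convention
    return -np.sum(nz * np.log2(nz)) + 0.0    # + 0.0 normalizes -0.0 to 0.0

print(entropy([1, 0, 0, 0]))              # 0.0: one certain outcome
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 = log2 4: uniform over 4 states
```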

If each of the p_i is equal, p_i = 1/n, then H(p) = log_2 n. The fact that this is the maximum entropy, and that compromises between the two extremes always decrease entropy, comes from the fact that the entropy function H is concave (proof). That is, if p_1 is one list of probabilities and p_2 another, then

H((1 - \lambda) p_1 + \lambda p_2) \geq (1 - \lambda) H(p_1) + \lambda H(p_2), \qquad 0 \leq \lambda \leq 1

When we speak informally of moving from p_1 in the direction of p_2, we mean we increase the parameter λ from 0 to some positive amount no more than 1.
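The inequality is easy to spot-check numerically. A short sketch, repeating the entropy function above so the snippet runs on its own; the two probability vectors are arbitrary examples:

```python
import numpy as np

def entropy(p):
    # Shannon entropy in bits, as defined in the earlier sketch
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz)) + 0.0

p1 = np.array([0.7, 0.2, 0.1])   # two arbitrary probability vectors
p2 = np.array([0.1, 0.3, 0.6])

for lam in [0.0, 0.25, 0.5, 0.75, 1.0]:
    mix = (1 - lam) * p1 + lam * p2           # a convex combination still sums to 1
    lhs = entropy(mix)
    rhs = (1 - lam) * entropy(p1) + lam * entropy(p2)
    print(f"lambda = {lam:.2f}: H(mix) = {lhs:.4f} >= {rhs:.4f}")
```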

Because entropy is concave, there are no local maxima other than the global maximum. As you approach the point of maximum entropy, i.e. the uniform distribution, from any direction, entropy increases monotonically.
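To see the monotone increase concretely, here is a sketch that walks along the straight line from an arbitrary starting distribution (my example, not from the post) to the uniform distribution and watches the entropy climb:

```python
import numpy as np

def entropy(p):
    # Shannon entropy in bits, as defined in the earlier sketch
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz)) + 0.0

p = np.array([0.6, 0.25, 0.1, 0.05])  # arbitrary starting distribution
u = np.full(4, 0.25)                  # uniform distribution: the entropy maximizer

# Entropy along the segment from p (t = 0) to u (t = 1), sampled at 11 points.
hs = [entropy((1 - t) * p + t * u) for t in np.linspace(0, 1, 11)]
print(np.round(hs, 4))
print(all(b > a for a, b in zip(hs, hs[1:])))  # True: strictly increasing toward uniform
```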
