Solving for probability given entropy

by
John
from John D. Cook on (#494T2)

If a coin comes up heads with probability p and tails with probability 1-p, the entropy in the coin flip is

S = -p log2(p) - (1-p) log2(1-p).
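As a minimal sketch, the entropy formula above in Python (the function name `entropy` is my own, not from the post):

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a coin that lands heads with probability p."""
    if p == 0 or p == 1:
        return 0.0  # by convention, 0 * log2(0) = 0 (the limit)
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
```

A fair coin gives `entropy(0.5) == 1.0`, the maximum.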

It's common to start with p and compute entropy, but recently I had to go the other way around: given entropy, solve for p. It's easy to come up with an approximate solution.

[Figure entropy_approx.svg: entropy S(p) compared with its quadratic approximation]

Entropy in this case is approximately quadratic

S ≈ 4p(1-p)

and so

p ≈ (1 ± √(1-S))/2.

This is a good approximation if S is near 0 or 1 but mediocre in the middle. You could solve for p numerically, say with Newton's method, to get more accuracy if needed.
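A sketch of both steps: start from the quadratic approximation above, then refine with Newton's method applied to f(p) = S(p) - S, whose derivative is log2((1-p)/p). The function names and the `branch` parameter (choosing the root below or above 1/2) are my own:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a coin with heads-probability p."""
    if p == 0 or p == 1:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def p_from_entropy(S, branch=-1, iters=25):
    """Solve entropy(p) = S for p.

    branch=-1 returns the root with p <= 1/2; branch=+1 the root with p >= 1/2.
    Starts from the quadratic approximation p ~ (1 +/- sqrt(1 - S))/2,
    then refines with Newton's method.
    """
    p = (1 + branch * math.sqrt(1 - S)) / 2
    for _ in range(iters):
        if p <= 0 or p >= 1:
            break
        deriv = math.log2((1 - p) / p)  # d/dp of entropy(p)
        if deriv == 0:
            break  # p = 1/2, the maximum; nothing to refine
        p -= (entropy(p) - S) / deriv
    return p
```

For example, `p_from_entropy(0.5)` refines the approximate value (1 - √0.5)/2 ≈ 0.146 down to the true root near 0.11.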
