Ok, firstly: I am new to statistics but have a layman's interest in entropy.
On Wikipedia I came across this article on the binary entropy function (http://en.wikipedia.org/wiki/Binary_entropy_function).
It says... If Pr(X = 1) = p, then Pr(X = 0) = 1 − p and the entropy of X is given by
H(X) = -p log p - (1-p) log(1-p)
For a fair coin p = 1/2, so if I plug that into the equation I get
H(X) = -0.5 log 0.5 - 0.5 log 0.5
When I do this on my calculator I come up with H(X) = 0.301
The actual answer should be H(X) = 1, since a fair coin is the situation of maximum uncertainty (maximum entropy here): it is hardest to predict the outcome of the next toss, and each toss delivers a full 1 bit of information.
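To make the arithmetic concrete, here is a minimal Python sketch (my own check, not something from the article) that evaluates the formula twice: once with the base-10 logarithm that a typical calculator's log key uses, and once with a base-2 logarithm.

```python
import math

p = 0.5  # probability of heads for a fair coin

# The "log" key on most calculators is a base-10 logarithm
h_base10 = -p * math.log10(p) - (1 - p) * math.log10(1 - p)

# Information entropy measured in bits uses a base-2 logarithm
h_base2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(h_base10)  # 0.30102999566398114
print(h_base2)   # 1.0
```

The base-10 version reproduces the 0.301 I got on my calculator; the base-2 version gives 1.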
What am I doing wrong? Am I not putting it into my calculator correctly?