Trying to understand binary entropy function

  • #1
joebloggs
OK, firstly: I am new to statistics but have a layman's interest in entropy.

On Wikipedia I came across this article on the binary entropy function (http://en.wikipedia.org/wiki/Binary_entropy_function).

It says... If Pr(X = 1) = p, then Pr(X = 0) = 1 − p and the entropy of X is given by
H(X) = -p log p - (1-p) log (1-p)

For a fair coin, p = 1/2, so if I plug that into the equation I get

H(X) = -0.5 log(0.5) - 0.5 log(0.5)

When I do this on my calculator I come up with H(X) = 0.301

The actual answer should be H(X) = 1,
since this is the situation of maximum uncertainty (entropy, in this case): it is most difficult to predict the outcome of the next toss, and the result of each toss of the coin delivers a full 1 bit of information.

What am I doing wrong? Am I not putting it into my calculator correctly?
 
  • #2
According to the article, you're supposed to use binary logarithms in the formula. You've entered plain old base-10 logarithms in your calculator.
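
To see the difference numerically, here is a minimal Python sketch (assuming Python 3's math module) that evaluates the formula for a fair coin with both bases:

```python
import math

p = 0.5  # probability of heads for a fair coin

# Base-10 logarithms -- what the calculator's "log" button computes
h_base10 = -p * math.log10(p) - (1 - p) * math.log10(1 - p)

# Binary (base-2) logarithms -- what the entropy formula calls for
h_base2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(h_base10)  # ~0.301, the value the calculator gave
print(h_base2)   # 1.0, the expected answer of one full bit
```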
 
  • #3
Oh, OK. How do I enter binary logarithms on my calculator?

Sorry about this, I've just realized that I should have posted this thread in the coursework forum.
 
  • #4
Oh - coursework: I'll get in trouble if I answer it. However, if you just think about the meaning of the binary logarithm, you should see immediately what the binary log of 0.5 is, and then see the answer to your equation without using a calculator.
 
  • #5
OK, I think I see what you are saying. 2 raised to the power of -1 is 0.5, so log2(0.5) = -1, and there is the 1 bit. This is not a problem I have to do for any paper or course (I study environmental planning). This is just personal study; I'm interested in getting a more thorough understanding of information entropy and how it relates to thermodynamic entropy.

The sticky thread said that any textbook-style questions had to go to the coursework forum. But I don't know how to shift it. I've emailed one of the moderators about it.

On my calculator I have a log (to the base 10) button and a natural log button. The only way I know to get the log to base 2 of something is to compute log(x)/log(2).

Is there a better way to get the log to base 2 of a number on the calculator?

Cheers for your help though :)
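
As a side note on the change-of-base trick: log(x)/log(2) works with either the log or the ln button, since the base cancels out, and most programming languages also expose the base-2 logarithm directly. A minimal Python sketch (assuming Python 3's math module) checking both routes:

```python
import math

x = 0.5

# Change-of-base formula: log2(x) = log(x) / log(2), with any common base
via_log10 = math.log10(x) / math.log10(2)
via_ln = math.log(x) / math.log(2)

# Direct base-2 logarithm
direct = math.log2(x)

print(via_log10, via_ln, direct)  # all three give -1.0 (up to rounding)
```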
 

FAQ: Trying to understand binary entropy function

What is the binary entropy function?

The binary entropy function is a mathematical function used to measure the uncertainty or randomness of a binary random variable. It gives the amount of information contained in a system with two possible outcomes.

How is the binary entropy function calculated?

The binary entropy function is calculated as H(p) = -p log2(p) - (1-p) log2(1-p), where p is the probability of one of the two outcomes occurring.
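
A minimal Python sketch of this formula (with the usual convention that 0 log 0 = 0, so that H(0) = H(1) = 0):

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits, using base-2 logarithms."""
    if p == 0 or p == 1:
        return 0.0  # by convention 0 * log2(0) = 0: the outcome is certain
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0    -> fair coin, maximum uncertainty
print(binary_entropy(0.9))  # ~0.469 -> a biased coin is easier to predict
print(binary_entropy(1.0))  # 0.0    -> no uncertainty at all
```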

What is the significance of the binary entropy function?

The binary entropy function is commonly used in information theory, coding theory, and statistical mechanics. It measures the uncertainty or randomness of a binary system and is an important tool for understanding and analyzing such systems.

Can the binary entropy function be negative?

No, the binary entropy function cannot be negative. For any probability p between 0 and 1 it is bounded between 0 and 1: a value of 0 means the outcome is certain (no randomness at all), while a value of 1, reached at p = 1/2, indicates maximum randomness or uncertainty.

How is the binary entropy function related to Shannon entropy?

The binary entropy function is a special case of Shannon entropy, which is a more general measure of uncertainty or randomness in a system with more than two outcomes. Shannon entropy takes into account the probabilities of all possible outcomes, while the binary entropy function only considers the probabilities of two outcomes.
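
To make the relationship concrete, here is a minimal Python sketch of the general Shannon entropy (in bits); applied to the two-outcome distribution [p, 1-p] it reproduces the binary entropy function:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution, given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two outcomes: reduces to the binary entropy function
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469

# More than two outcomes, e.g. a fair six-sided die
print(shannon_entropy([1/6] * 6))   # ~2.585 bits (log2 of 6)
```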
