What Is the Total Probability of Rolling a 6 with Maximum Shannon Entropy?

In summary, the conversation works through finding the maximum Shannon entropy of a loaded 6-sided die for which a 6 is twice as likely as a 1. It covers the normalization constraint, the partial derivatives, and a subtle mistake: treating ##p_6## as independent of the other probabilities. The correct solution has ##p_2=p_3=p_4=p_5##, with the maximum entropy occurring at ##p_6 = \frac{2}{3+2^{8/3}} \approx 0.214##.
  • #1
ZetaOfThree

Homework Statement


A 6-sided die is loaded such that a 6 occurs twice as often as a 1. What is the total probability of rolling a 6 if the Shannon entropy is a maximum?

Homework Equations


Shannon Entropy:
$$S=-\sum_i p_i \ln{p_i}$$
where ##p_i## is the probability that we roll ##i##.

The Attempt at a Solution


We know that ##\sum_i p_i = 1## and we are given that ##p_1=p_6/2##. So $$p_2+p_3+p_4+p_5+\frac{3}{2} p_6 =1 \Rightarrow p_5 = 1-p_2-p_3-p_4-\frac{3}{2} p_6$$ Then we can write the Shannon entropy as $$S=-\left( \frac{p_6}{2} \ln{\frac{p_6}{2}} + p_2 \ln{p_2}+p_3 \ln{p_3}+p_4 \ln{p_4}+(1-p_2-p_3-p_4-\frac{3}{2} p_6) \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6) + p_6 \ln{p_6}\right)$$
$$=-\left( \frac{3 p_6}{2} \ln{p_6} + p_2 \ln{p_2}+p_3 \ln{p_3}+p_4 \ln{p_4}+(1-p_2-p_3-p_4-\frac{3}{2} p_6) \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6) - \frac{p_6}{2} \ln{2}\right)$$
To find an extremum, we take the partial derivatives and set them equal to zero:
$$\frac{\partial S}{\partial p_2} = - \left(\ln{p_2} +1 - \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6) -1 \right) = -\left(\ln{p_2} - \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6) \right) =0$$
$$\Rightarrow \ln p_2 = \ln(1-p_2-p_3-p_4-\frac{3}{2} p_6)$$
Differentiating with respect to ##p_3## and ##p_4##, we find the same condition, so we conclude that ##p_2=p_3=p_4=p_5##. We now write the Shannon entropy as $$S=- \left(\frac{3 p_6}{2} \ln{p_6}+4p_2 \ln{p_2} - \frac{p_6}{2} \ln{2}\right)$$ So $$\frac{\partial S}{\partial p_6}= -\left(\frac{3}{2} \ln{p_6}+\frac{3}{2} - \frac{1}{2} \ln{2}\right) = 0$$
Therefore we find that ##p_6 = \frac{2^{1/3}}{e}##. But this is wrong, because if we plug the resulting probabilities into the Shannon entropy formula, we do not get a maximum. For example, we get a higher Shannon entropy with ##p_1=p_2=p_3=p_4=p_5=1/7## and ##p_6=2/7##. Where did I go wrong? Maybe I found a minimum or something? If so, how do I get the maximum?
Thanks for any help.
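A quick numerical check makes the discrepancy concrete — a minimal Python sketch (the helper name shannon_entropy is illustrative) comparing the stationary point above with the ##1/7,\dots,1/7,2/7## guess:

```python
import math

def shannon_entropy(p):
    """Shannon entropy S = -sum_i p_i ln p_i of a probability vector."""
    return -sum(q * math.log(q) for q in p)

# Stationary point found above: p6 = 2**(1/3)/e, p1 = p6/2,
# and p2..p5 fixed by the normalization 4*p2 + (3/2)*p6 = 1.
p6 = 2 ** (1 / 3) / math.e
p2 = (1 - 1.5 * p6) / 4
stationary = [p6 / 2] + [p2] * 4 + [p6]

# The comparison distribution from the post: p1..p5 = 1/7, p6 = 2/7.
guess = [1 / 7] * 5 + [2 / 7]

print(shannon_entropy(stationary))  # ~1.480
print(shannon_entropy(guess))       # ~1.748 -- higher, so the stationary point is not the maximum
```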
 
  • #2
BvU
It looks as if you put ##\displaystyle {d p_2\over d p_6}=0##. But isn't there a constraint that ##4p_2+{3\over 2}p_6 = 1 \ \Rightarrow \ 4d p_2 + {3\over 2}dp_6 = 0## ?
 
  • #3
ZetaOfThree
BvU said:
It looks as if you put ##\displaystyle {d p_2\over d p_6}=0##.
What step are you referring to? I don't think I used this.
BvU said:
But isn't there a constraint that ##4p_2+{3\over 2}p_6 = 1 \ \Rightarrow \ 4d p_2 + {3\over 2}dp_6 = 0 ## ?
Yes, you can use that condition to find ##p_2## after ##p_6## is found. So in the case above, we have ##p_1 = \frac{1}{2^{2/3}e}##, ##p_2=p_3=p_4=p_5= \frac{1}{8} \left(2- \frac{3 \sqrt [3] {2}}{e} \right)## and ##p_6 = \frac{\sqrt[3]{2}}{e}##.
 
  • #4
mfb
The partial derivative gives you an optimum only if ##p_6## is independent of the other probabilities. It is not. You should get the correct result if you express ##p_2## as a function of ##p_6## and then calculate the derivative.
 
  • #5
Dick
ZetaOfThree said:
What step are you referring to? I don't think I used this.

Yes, you can use that condition to find ##p_2## after ##p_6## is found. So in the case above, we have ##p_1 = \frac{1}{2^{2/3}e}##, ##p_2=p_3=p_4=p_5= \frac{1}{8} \left(2- \frac{3 \sqrt [3] {2}}{e} \right)## and ##p_6 = \frac{\sqrt[3]{2}}{e}##.

Given that you know ##p_2=p_3=p_4=p_5##, and given the two other constraints, you can write the entropy as a function of a single variable like ##p_6## and maximize it as a single-variable problem. That should be straightforward.
 
  • #6
ZetaOfThree
mfb said:
The partial derivative gives you an optimum only if ##p_6## is independent of the other probabilities. It is not. You should get the correct result if you express ##p_2## as a function of ##p_6## and then calculate the derivative.

Dick said:
Given that you know ##p_2=p_3=p_4=p_5##, and given the two other constraints, you can write the entropy as a function of a single variable like ##p_6## and maximize it as a single-variable problem. That should be straightforward.

Sounds about right. I did that, and got an answer I'm pretty sure is correct. Thank you all for the help!
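
For completeness, here is a sketch of the single-variable maximization suggested in posts #4 and #5 (the final numbers are not stated in the thread itself). Substituting ##p_2 = \frac{1}{4}\left(1-\frac{3}{2}p_6\right)## from normalization, the entropy becomes a function of ##p_6## alone: $$S(p_6) = -\left(\frac{p_6}{2}\ln{\frac{p_6}{2}} + 4 p_2 \ln{p_2} + p_6 \ln{p_6}\right)$$ Using ##\frac{dp_2}{dp_6} = -\frac{3}{8}##, $$\frac{dS}{dp_6} = -\frac{3}{2}\ln{p_6} + \frac{1}{2}\ln{2} + \frac{3}{2}\ln{p_2} = 0 \Rightarrow p_2 = 2^{-1/3}\, p_6$$ The normalization ##4p_2 + \frac{3}{2}p_6 = 1## then gives $$p_6 = \frac{2}{3+2^{8/3}} \approx 0.214, \quad p_1 \approx 0.107, \quad p_2=p_3=p_4=p_5 \approx 0.170,$$ with ##S \approx 1.773##, larger than both values from post #1.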
 

FAQ: What Is the Total Probability of Rolling a 6 with Maximum Shannon Entropy?

1. What is entropy in the context of a loaded die throw?

Entropy is a measure of the randomness or uncertainty in a system. In the context of a loaded die, the Shannon entropy ##S=-\sum_i p_i \ln{p_i}## quantifies how unpredictable the outcome of a throw is; the more the loading favors certain faces, the more predictable the outcome.

2. How is entropy related to the probability of a loaded die throw?

Entropy is determined by the face probabilities. The more uniform the probabilities, the higher the entropy and the less predictable the outcome; the more the probability is concentrated on a few faces, the lower the entropy. A heavily loaded die therefore has low entropy, while a nearly fair die has entropy close to the maximum.
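For instance, a fair die (##p_i = 1/6## for every face) attains the maximum possible entropy for six outcomes, $$S = -\sum_{i=1}^{6} \frac{1}{6}\ln{\frac{1}{6}} = \ln{6} \approx 1.79,$$ while a die that always lands on the same face has ##S = 0##.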

3. Can the entropy of a loaded die throw be manipulated?

Yes. The entropy of a loaded die throw can be manipulated by altering the weight distribution of the die or the surface it is rolled on. This changes the probabilities of the faces, which in turn increases or decreases the entropy of the outcome.

4. What factors can influence the entropy of a loaded die throw?

The entropy of a loaded die throw can be influenced by factors such as the weight distribution of the die, the surface it is rolled on, and the force and angle of the roll. Other external factors such as air resistance and temperature can also affect the outcome probabilities.

5. Is it possible to calculate the exact entropy of a loaded die throw?

Yes, provided the face probabilities are known: the entropy is exactly ##S=-\sum_i p_i \ln{p_i}##. For a physical loaded die, however, the probabilities are generally not known in advance and must be estimated, for example from the observed frequencies in a long series of throws, so the computed entropy is only as accurate as those estimates.
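
As an illustration of such an estimate, here is a minimal Python sketch (the loading weights below are hypothetical, chosen to roughly match the thread's problem) that estimates the entropy from observed roll frequencies:

```python
import math
import random

# Hypothetical loading, roughly matching the thread's problem (6 twice as likely as 1).
true_probs = [0.107, 0.17, 0.17, 0.17, 0.17, 0.213]

# Simulate many throws and estimate the face probabilities from frequencies.
n_throws = 100_000
counts = [0] * 6
for _ in range(n_throws):
    face = random.choices(range(6), weights=true_probs)[0]
    counts[face] += 1

est_probs = [c / n_throws for c in counts]
est_entropy = -sum(p * math.log(p) for p in est_probs if p > 0)
print(est_entropy)  # ~1.77 for these weights
```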
