How Much Information Can Nods and Shakes Convey in a Bridge Game?

In summary, to convey a 13-card hand from a 52-card deck by nods and shakes, you need log2 C(52, 13) ≈ 39.21 bits of entropy, assuming all hands are equiprobable; since each nod or shake carries at most one bit, at least 40 gestures are required.
  • #1
szpengchao
Consider a pack of 52 cards in a bridge game. A player tries to convey 13 cards to his partner by nods or shakes of the head. Find the Shannon entropy.
 
  • #2
You need [itex]\log_2 52 \approx 5.7[/itex] bits/card to specify a single card (assuming they are all equiprobable). For 13 independent cards, you'll need [itex]13 \times \log_2 52 \approx 74.1[/itex] bits.
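A quick numerical check (a minimal sketch in Python, standard library only):

[code]
from math import log2

bits_per_card = log2(52)           # ~5.70 bits to name one of 52 equiprobable cards
total_bits = 13 * bits_per_card    # ~74.11 bits for 13 independently named cards
print(f"{bits_per_card:.2f} bits/card, {total_bits:.2f} bits total")
[/code]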
 
  • #3
JSuarez said:
You need [itex]\log_2 52 \approx 5.7[/itex] bits/card to specify a single card (assuming they are all equiprobable). For 13 independent cards, you'll need [itex]13 \times \log_2 52 \approx 74.1[/itex] bits.

But the question says the answer is 40, and it asks to find a coding function with entropy 50.
 
  • #4
Well, then the question is asking for the amount of information needed to transmit the 13-card hand as a whole, not card by card; that was not clear from the question.

There are [itex]\binom{52}{13}[/itex] possible hands, and this gives an entropy of [itex]\log_2\binom{52}{13} \approx 39.21[/itex] bits. This is far less than [itex]13 \times \log_2 52[/itex] because the hand is unordered and the cards are drawn without replacement.
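As a quick check (a minimal Python sketch; math.comb needs Python 3.8+):

[code]
from math import comb, log2

hands = comb(52, 13)                      # 635,013,559,600 possible 13-card hands
print(hands, f"H = {log2(hands):.2f}")    # H = log2(635013559600) ~ 39.21 bits
[/code]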
 
  • #5


The Shannon entropy problem in this scenario asks how much information the player's nods and shakes must convey. In other words, how much uncertainty about the hand is removed by the player's actions?

To find the Shannon entropy, we first need the probability of each possible message. If all deals are equally likely, each of the [itex]\binom{52}{13}[/itex] possible 13-card hands has probability [itex]p = 1/\binom{52}{13}[/itex].

Next, we use the formula for Shannon entropy, [itex]H = -\sum_x p(x)\log_2 p(x)[/itex], where p(x) is the probability of each possible outcome (here, each possible hand).

So the Shannon entropy for this scenario is:
[itex]H = -\binom{52}{13} \cdot \frac{1}{\binom{52}{13}}\log_2\frac{1}{\binom{52}{13}} = \log_2\binom{52}{13} \approx 39.21[/itex] bits.

This means the nods and shakes must convey about 39.21 bits of information. Since each nod or shake is a binary symbol carrying at most one bit, the player needs at least [itex]\lceil 39.21 \rceil = 40[/itex] head gestures, which matches the answer of 40 given in the problem; a fixed-length 40-bit code (say, nod = 1, shake = 0) achieves this.
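One concrete coding function is to rank the hand with the combinatorial number system and send the rank as 40 binary gestures. A minimal Python sketch of this idea; the names hand_rank and encode are illustrative, not from the problem:

[code]
from math import comb

def hand_rank(hand):
    # Colexicographic rank: maps each 13-card hand (cards numbered 0..51)
    # to a unique integer in [0, C(52, 13)).
    return sum(comb(c, i + 1) for i, c in enumerate(sorted(hand)))

def encode(hand, width=40):
    # Fixed-length binary code: '1' = nod, '0' = shake, 40 gestures per hand.
    return format(hand_rank(hand), f"0{width}b")

print(encode(range(13)))      # lowest hand -> rank 0 -> forty '0's (all shakes)
print(encode(range(39, 52)))  # highest hand -> rank C(52, 13) - 1, fits in 40 bits
[/code]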
 

FAQ: How Much Information Can Nods and Shakes Convey in a Bridge Game?

What is the Shannon entropy problem?

The Shannon entropy problem, also known as the information entropy problem, is a mathematical concept named after Claude Shannon that measures the uncertainty or randomness of a system. It is commonly used in information theory and communication to quantify the amount of information contained in a message or data set.

How is Shannon entropy calculated?

Shannon entropy is calculated using the formula H = -Σ P(x) log2(P(x)), where P(x) is the probability of a particular event or symbol occurring in the system (base 2 gives the result in bits). The formula accounts for the likelihood of different outcomes and assigns higher entropy to systems with more randomness and uncertainty.
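For illustration, a minimal Python sketch of the formula (the example distributions are made up):

[code]
from math import log2

def shannon_entropy(probs):
    # H = -sum p * log2(p); zero-probability outcomes contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin flip
print(shannon_entropy([0.25] * 4))   # 2.0 bits: four equally likely outcomes
[/code]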

What is the significance of Shannon entropy?

Shannon entropy is significant because it provides a measure of the amount of information contained in a system. It is used in various fields such as cryptography, data compression, and biology to analyze and understand the complexity and randomness of systems.

Can Shannon entropy be negative?

No, Shannon entropy cannot be negative. Each probability P(x) lies between 0 and 1, so log2(P(x)) is zero or negative, and the leading minus sign makes every term -P(x) log2(P(x)) zero or positive. The sum is therefore always non-negative.

What are some applications of Shannon entropy?

Shannon entropy has various applications in different fields, such as in cryptography to measure the strength of a cipher, in data compression to reduce the size of a file, in bioinformatics to analyze DNA sequences, and in machine learning to assess the complexity of a data set.
