szpengchao said: Consider a pack of 52 cards in a bridge game. A player tries to convey his 13 cards to his partner by nods or shakes of the head. Find the Shannon entropy.
JSuarez said: You need [itex]\log_{2}52 \approx 5.7[/itex] bits to specify a single card (assuming they are all equiprobable). For 13 independent cards, you'll need [itex]13 \times \log_{2}52 \approx 74[/itex] bits.
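A minimal sketch of the arithmetic above, using Python's `math` module. It computes JSuarez's estimate for 13 independent, equiprobable cards, and, for comparison, the entropy of an unordered 13-card hand drawn without replacement (counting the C(52, 13) equally likely hands), which is lower because the cards of a real bridge hand are not independent:

```python
from math import log2, comb

# Bits to specify one card out of 52, all equiprobable
bits_per_card = log2(52)                 # ~5.70 bits

# JSuarez's estimate: 13 independent, equiprobable cards
independent_bits = 13 * bits_per_card    # ~74.1 bits

# Unordered hand drawn without replacement: C(52, 13) equally likely hands
hand_bits = log2(comb(52, 13))           # ~39.2 bits

print(bits_per_card, independent_bits, hand_bits)
```

The gap between the two totals shows how much the "independent cards" simplification overstates the information content of an actual hand.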
Shannon entropy, also known as information entropy, is a mathematical concept named after Claude Shannon that measures the uncertainty or randomness of a system. It is commonly used in information theory and communication to quantify the amount of information contained in a message or data set.
Shannon entropy is calculated using the formula H = -Σ P(x) log₂ P(x), where P(x) is the probability of a particular event or symbol x occurring in the system; with the base-2 logarithm, the result is measured in bits. This formula accounts for the likelihood of different outcomes and assigns a higher entropy value to systems with more randomness and uncertainty.
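The formula above can be sketched directly in Python; `shannon_entropy` is an illustrative helper name, not a library function:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)); zero-probability symbols contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Fair coin: maximum uncertainty for two outcomes
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit

# Biased coin: less uncertainty, lower entropy (~0.469 bits)
print(shannon_entropy([0.9, 0.1]))
```

Skipping zero-probability terms matches the usual convention 0 · log 0 = 0, which follows from the limit p log p → 0 as p → 0.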
Shannon entropy is significant because it provides a measure of the amount of information contained in a system. It is used in various fields such as cryptography, data compression, and biology to analyze and understand the complexity and randomness of systems.
No, Shannon entropy cannot be negative. Each probability P(x) lies between 0 and 1, so log(P(x)) ≤ 0 and each term -P(x) log(P(x)) is non-negative; a sum of non-negative terms is always non-negative. The minimum value, H = 0, occurs when one outcome is certain.
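A quick check of both boundary cases, reusing the same illustrative `shannon_entropy` helper: a certain outcome gives the minimum H = 0, and a uniform distribution over n outcomes gives the maximum, log₂ n:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)); zero-probability symbols contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A certain outcome carries no information: H = 0, the minimum
print(shannon_entropy([1.0]))                      # 0.0

# Uniform over 4 outcomes: H = log2(4) = 2 bits, the maximum for 4 outcomes
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
```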
Shannon entropy has applications across many fields: in cryptography to measure the strength of a cipher or key, in data compression to bound how far a file can be shrunk, in bioinformatics to analyze DNA sequences, and in machine learning (e.g., as the basis of information gain in decision trees) to assess the complexity of a data set.