Is Shannon Entropy Dependent on Perspective?

In summary, the Shannon entropy of a system depends on the probability distribution over its outcomes, and arguably on the predictive ability of whoever makes the observation, so the answer may vary depending on who is asking, such as a Bayesian or a frequentist. The ordinary Shannon entropy of a discrete-state system gives an idea of its information density: the more random a system is, the more information is needed to describe it, and acquiring that information decreases the uncertainty about the system. Entropy also has a basis, and there are various entropies corresponding to different distributions, including conditional ones. These entropy measures are distribution invariant and satisfy algebraic constraints that are consistent both mathematically and probabilistically.
  • #1
T S Bailey
If a system has multiple possible states, then its Shannon entropy depends on the probability assigned to each outcome (it is maximized when all outcomes are equally likely); a predictable outcome isn't very informative, after all. But this seems to rely on the predictive ability of the system making the observation/measurement, which suggests that the Shannon entropy of a system depends on who you ask.
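As a quick illustration (a minimal sketch in plain Python; the coin distributions are made up for the example), entropy drops as the outcome becomes more predictable:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), taken over outcomes with nonzero probability."""
    return -sum(p * log2(p) for p in probs if p > 0)

fair_coin   = [0.5, 0.5]   # maximally uncertain
biased_coin = [0.9, 0.1]   # fairly predictable
sure_thing  = [1.0, 0.0]   # fully predictable

for name, dist in [("fair", fair_coin), ("biased", biased_coin), ("sure", sure_thing)]:
    print(f"{name}: H = {shannon_entropy(dist):.3f} bits")
```

The fair coin carries a full bit of uncertainty, the biased coin only about 0.47 bits, and the sure thing none at all. Note, though, that the probabilities fed into the formula are whatever the observer assigns, which is where the who-is-asking question enters.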
 
  • #2
Especially if you ask a Bayesian or a frequentist.
 
  • #3
The ordinary Shannon entropy of a discrete-state system gives an idea of the information density or content of that system.

The more random a system is, the more information is needed to describe it. If a system contains a given amount of entropy, then acquiring that much information will decrease the uncertainty about the system itself.

Also realize that entropy, like anything else in mathematics (and language in particular), has a basis.

You can define all sorts of entropies corresponding to the distribution you are looking at, and that includes all possible conditional distributions.

These entropy measures are distribution invariant (as long as they have a finite sample space, for the proper entropies) and satisfy a lot of algebraic constraints that are consistent both mathematically and probabilistically.
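As a minimal sketch of the point about conditional distributions (the joint distribution below is invented purely for illustration), conditioning on a correlated variable never increases entropy, i.e. H(X|Y) ≤ H(X), which is one of those probabilistically consistent constraints:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two correlated binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y).
p_x = [sum(p for (x, _), p in joint.items() if x == xv) for xv in (0, 1)]
p_y = [sum(p for (_, y), p in joint.items() if y == yv) for yv in (0, 1)]

# Conditional entropy H(X|Y) = sum_y p(y) * H(X | Y = y).
h_x_given_y = sum(
    p_y[yv] * entropy([joint[(xv, yv)] / p_y[yv] for xv in (0, 1)])
    for yv in (0, 1)
)

print(f"H(X)   = {entropy(p_x):.3f} bits")   # 1.000
print(f"H(X|Y) = {h_x_given_y:.3f} bits")    # ~0.722
```

Learning Y leaves less uncertainty about X than ignoring it, which matches the idea that information decreases the uncertainty of the system.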
 

FAQ: Is Shannon Entropy Dependent on Perspective?

1. What is Shannon Entropy?

Shannon Entropy, also known as information entropy, is a measure of the uncertainty or randomness of a system. It was introduced by Claude Shannon in 1948 to quantify the amount of information in a message or signal.

2. Is Shannon Entropy objective or subjective?

There is some debate about whether Shannon Entropy is objective or subjective. Some argue that it is objective, since it is a well-defined mathematical quantity that can be calculated from a probability distribution. Others argue that it is subjective, since that distribution can depend on the observer's knowledge and interpretation of the information being measured.

3. How is Shannon Entropy calculated?

Shannon Entropy is calculated using the formula H = -Σ p(x) log(p(x)), where the sum runs over all possible outcomes x and p(x) is the probability of outcome x; the base of the logarithm sets the unit (base 2 gives bits). The formula weights each outcome's surprisal by its probability and so assigns a numerical value to the average information content of the system.
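For example, a fair coin has H = -(0.5 log₂ 0.5 + 0.5 log₂ 0.5) = 1 bit, while a coin that always lands heads has H = -(1 · log₂ 1) = 0 bits: certainty carries no information.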

4. Can Shannon Entropy be used in any field of study?

Yes, Shannon Entropy has applications in various fields such as information theory, communication systems, data analysis, and even biology. It is a universal concept that can be applied to any system that involves information and uncertainty.

5. Is Shannon Entropy a reliable measure of information?

Shannon Entropy is a widely accepted measure of information and is used in many fields to quantify the amount of information in a system. However, it is important to note that it is not a perfect measure and may not accurately capture the complexity of certain systems. It should be used with caution and in conjunction with other measures to fully understand a system.
