Max Entropy of 16-Symbol Source

In summary, the maximum entropy of a 16-symbol source is log₂ 16 = 4 bits per symbol, achieved when there is a complete lack of prior information and all 16 symbols are equally probable; the minimum entropy is 0, reached when one symbol is certain. Both results follow from the fundamental formula for entropy, which incorporates a convention for handling probabilities of 0.
  • #1
jNull
Hi, I am studying entropy and I am new to the concept. I don't know where to start with this question:
State the maximum entropy of a 16-symbol source.

thank you
 
  • #2
jNull said:
Hi, I am studying entropy and I am new to the concept. I don't know where to start with this question:
State the maximum entropy of a 16-symbol source.

thank you

Welcome to MHB, jNull!... In information theory, the entropy of a random variable X that can take n possible symbols is defined as...

$\displaystyle H(X) = - \sum_{k=1}^{n} P(x_{k})\ \log_{2} P(x_{k})\qquad (1)$

... where $P(x_{k})$ is the probability $P \{X=x_{k}\}$ ...

In your case, n = 16...

Kind regards

$\chi$ $\sigma$
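
A minimal Python sketch of formula (1), added here for illustration (not part of the original reply; the function name `entropy_bits` and the example probabilities are assumptions):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = -sum_k P(x_k) * log2 P(x_k), in bits.

    Terms with probability 0 are skipped, matching the convention 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 16 equally likely symbols: entropy is log2(16) = 4 bits (the maximum).
print(entropy_bits([1 / 16] * 16))        # 4.0

# One certain symbol: entropy is 0 bits (the minimum).
print(entropy_bits([1.0] + [0.0] * 15))   # -0.0, i.e. 0 bits
```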
 
  • #3
jNull said:
Hi, I am studying entropy and I am new to the concept. I don't know where to start with this question:
State the maximum entropy of a 16-symbol source.

thank you
Intuitively, entropy is a measurement of randomness or lack of information. In the case of the 16 symbols, you might think of them as being 16 doors. Behind one of the doors is a brand new S-Class Mercedes, yours to drive off with if you choose the right door. If you have some inside information telling you for certain that the car is behind a particular door, say door number 7, then you would assign the probability $1$ to door 7 and probability $0$ to each of the other 15 doors. There would then be no uncertainty about the situation, and the entropy of the system would be $0$. At the opposite extreme, if you had no prior information about the situation then you would have to assign the probability $1/16$ to each of the doors, and the entropy ("lack of information") of the system would be maximised.

Coming back to the mathematics of the situation, the fundamental formula for entropy is the one given by chisigma, \(\displaystyle H(X) = -\sum_{k=1}^nP(x_k)\log_2(P(x_k))\) (with the convention that $0\times (-\infty) = 0$, so that if a probability $P(x_k)$ is $0$ then $P(x_k)\log_2(P(x_k))$ is taken to be $0$). For the 16-symbol source, the entropy is minimised when one probability is $1$ and the others are all $0$. That gives the minimum entropy as $0$. The entropy is maximised in the situation where there is a complete lack of information, namely when $P(x_k) = 1/16$ for $1\leqslant k\leqslant 16$.
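
For completeness, the maximal case can be evaluated explicitly (a short worked step added here): with $P(x_k) = 1/16$ for every $k$,

$\displaystyle H_{\max} = -\sum_{k=1}^{16} \frac{1}{16}\,\log_{2}\frac{1}{16} = 16 \cdot \frac{1}{16} \cdot 4 = \log_{2} 16 = 4\ \text{bits per symbol.}$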
 
  • #4
Opalg said:
... coming back to the mathematics of the situation, the fundamental formula for entropy is the one given by chisigma, \(\displaystyle H(X) = -\sum_{k=1}^nP(x_k)\log_2(P(x_k))\) (with the convention that $0\times (-\infty) = 0$, so that if a probability $P(x_k)$ is $0$ then $P(x_k)\log_2(P(x_k))$ is taken to be $0$)...

A rigorous proof of the fact that, for the function $\displaystyle f(x) = x\ \ln x$, the limit as $x \to 0^{+}$ is $0$ (so that one may take $f(0)=0$) has been given in...

http://mathhelpboards.com/analysis-50/never-ending-dispute-2060.html?highlight=ending+dispute

Having studied information theory for decades, I would be very concerned if a fundamental result rested on a mere 'convention' that some 'imaginative mind' could one day decide to change...


Merry Christmas from Serbia

$\chi$ $\sigma$
 
  • #5
chisigma said:
A rigorous proof of the fact that, for the function $\displaystyle f(x) = x\ \ln x$, the limit as $x \to 0^{+}$ is $0$ (so that one may take $f(0)=0$) has been given in...

http://mathhelpboards.com/analysis-50/never-ending-dispute-2060.html?highlight=ending+dispute

Having studied information theory for decades, I would be very concerned if a fundamental result rested on a mere 'convention' that some 'imaginative mind' could one day decide to change...
The "convention" is of course completely well-founded in the context of the entropy function, and the use of the word does not in any way imply that there is something arbitrary or negotiable about it. But in the absence of some such context, the expression $0\times \infty$ is not well-defined. That is why I wanted to emphasise the need to define $f(0) = 0$ for the function $f(x) = x\log_2(x).$
 

FAQ: Max Entropy of 16-Symbol Source

What is the concept of "Max Entropy of 16-Symbol Source"?

The Max Entropy of 16-Symbol Source refers to the maximum amount of uncertainty or randomness that can be present in a source that produces 16 different symbols. It is a measure of the unpredictability of a system.

How is the Max Entropy of 16-Symbol Source calculated?

The Max Entropy of 16-Symbol Source can be calculated using the Shannon entropy formula, which takes into account the probabilities of each symbol occurring in the source. The formula is H = -∑p(x)log₂p(x), where p(x) represents the probability of symbol x occurring. Entropy is maximised when all 16 symbols are equally likely (p(x) = 1/16), giving H = log₂ 16 = 4 bits per symbol.

What is the significance of the Max Entropy of 16-Symbol Source?

The Max Entropy of 16-Symbol Source is significant because it represents the theoretical limit of how much information can be transmitted per symbol in a source with 16 symbols. It also helps to determine the efficiency of coding and compression algorithms.

Can the Max Entropy of 16-Symbol Source be exceeded?

No. For a source with 16 symbols, the entropy of any probability distribution satisfies H ≤ log₂ 16 = 4 bits, with equality only for the uniform distribution, so the maximum cannot be exceeded.
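
A small numerical check of this bound, as a sketch assuming NumPy is available (Dirichlet sampling is just one convenient way to generate random probability vectors over 16 symbols):

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy_bits(p):
    # Shannon entropy in bits; zero-probability terms contribute 0 by convention.
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Sample many random probability vectors over 16 symbols: H never reaches log2(16) = 4.
max_seen = max(entropy_bits(rng.dirichlet(np.ones(16))) for _ in range(10_000))
print(max_seen)                            # strictly below 4.0
print(entropy_bits(np.full(16, 1 / 16)))   # exactly 4.0 for the uniform distribution
```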

How is the concept of Max Entropy of 16-Symbol Source applied in real-world scenarios?

The Max Entropy of 16-Symbol Source is used in various fields, such as information theory, data compression, and machine learning. It helps in understanding the limits of communication systems and designing efficient coding schemes for data transmission.
