rohanprabhu
I want to find the information entropy (Shannon entropy) of a given image. Basically, I want to study the effect of image compression on the Shannon entropy of an image, and I am not sure how to go about it.
To do this, the message space is the image itself. Entropy is the uncertainty associated with a random variable; in my case, the random variable is a quantitative measure of a pixel. The measure could be:
1. Colour (either the true colour or any of its components)
2. Luminance
3. Saturation
4. Location (each location occurs exactly once, so this gives the maximum possible entropy and is basically useless)
etc..
For now, let us consider luminance. I plan to write a small program for this and need help with the logic. First, I make a list of all the luminance values the pixels take and associate each value with the number of times it occurs in the image. The luminance is then a random variable X. Let's say the list is something like this (a rough code sketch follows the table):
Code:
+-----------------+-------------+-------------+
| Luminance value | Occurrences | Probability |
+-----------------+-------------+-------------+
| 128             | 8           | 0.444       |
| 50              | 3           | 0.167       |
| 48              | 6           | 0.333       |
| 98              | 1           | 0.056       |
+-----------------+-------------+-------------+
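This is roughly the program I have in mind for that step (just a sketch in Python, assuming the Pillow library is available; "input.png" is only a placeholder name):
Code:
from collections import Counter
from PIL import Image  # assumes the Pillow library is installed

# Convert to 8-bit luminance; mode "L" uses the ITU-R 601 luma
# transform L = 0.299 R + 0.587 G + 0.114 B.
img = Image.open("input.png").convert("L")
pixels = list(img.getdata())

# Occurrences of each luminance value, then probabilities.
counts = Counter(pixels)
total = len(pixels)
probs = {value: n / total for value, n in counts.items()}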
Once I've done that, I find the self-information set for X. The set is the negative log (to some base) of the probability of each item in X. For the table above, the base-2 self-information set would be:
[itex]
L_{base=2} = \{1.170, 2.585, 1.585, 4.170\}
[/itex]
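In code, continuing the sketch above (here I just hard-code the example table's probabilities to sanity-check the numbers):
Code:
import math

# Probabilities from the example table (8, 3, 6 and 1 occurrences out of 18).
probs = {128: 8/18, 50: 3/18, 48: 6/18, 98: 1/18}

# Self-information of each luminance value, in bits (base 2).
self_info = {value: -math.log2(p) for value, p in probs.items()}
# -> {128: 1.170, 50: 2.585, 48: 1.585, 98: 4.170}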
Then, the information entropy is given by:
[tex]
H = -\sum_{i=1}^n {p(x_i) \log_b p(x_i)}
[/tex]
where [itex]x_i[/itex] are the elements of the set X and [itex]-\log_b p(x_i)[/itex] are the elements of the set L described above, so H is just the probability-weighted sum of the self-information values. This should give me H, the information entropy associated with this image. Am I doing this right? Are there any other suggestions you might like to give?
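Putting it together for the example table, the last step would be something like (again, only a sketch):
Code:
import math

# Probabilities from the example table above.
probs = [8/18, 3/18, 6/18, 1/18]

# H = -sum p(x) * log2 p(x); zero-probability values contribute nothing.
H = -sum(p * math.log2(p) for p in probs if p > 0)
print(H)  # about 1.711 bits per pixel for the example table
For an 8-bit luminance image, the maximum possible H is 8 bits per pixel (all 256 values equally likely), which I suppose gives a scale for comparing an image before and after compression.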
Thanks a lot