Information Entropy for an Image

In summary: multiplying the per-pixel entropy by the number of pixels estimates the total number of bits a per-pixel code needs to represent the image losslessly.
  • #1
rohanprabhu
I wanted to find the Information Entropy (Shannon Entropy) for a given image. I basically wanted to study the effects of image compression on the Shannon entropy for a given image. I am not sure as to how to go about it.

To do this, the Message Space is the image itself. Entropy is given as the uncertainty associated with a random variable. The random variable in my case would be a quantitative measure of the pixel. The measure could be:

1. Colour (either the true colour or any of its components)
2. Luminance
3. Saturation
4. Location (every pixel's location is unique, so this gives the maximum possible entropy, making it basically useless)
etc..

For now, let us consider luminance. I plan to write a small program for this and need help with the logic. First, I make a list of all the luminance values the pixels take, associating each value with the number of times it occurs in the image. Basically, the luminance is a random variable X. Let's say the list is something like:

Code:
+-----------------+-------------+-------------+
| Luminance value | Occurrences | Probability |
+-----------------+-------------+-------------+
| 128             | 8           | 0.444       |
| 50              | 3           | 0.167       |
| 48              | 6           | 0.333       |
| 98              | 1           | 0.056       |
+-----------------+-------------+-------------+

Once I've done that, I find the self-information set for X. Each element is the negative log (to some base) of the probability of the corresponding value. In base 2, the set would look like:

[itex]
L_{base=2} = \{1.170, 2.585, 1.585, 4.170\}
[/itex]

Then, the information entropy is given by:

[tex]
H = -\sum_{i=1}^n {p(x_i) \log_b p(x_i)}
[/tex]

where [itex]x_i[/itex] are the elements of the set X and [itex]-\log_b p(x_i)[/itex] are the elements of the set L described above. This should give me H, the information entropy associated with the image. Am I doing this right? Are there any other suggestions you might like to give?
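
In code, what I have in mind is something like this minimal sketch (assuming Pillow and NumPy are available; the 8-bit grayscale conversion stands in for luminance, and the names are just illustrative):

Code:
import numpy as np
from PIL import Image

def luminance_entropy(path):
    # Load the image and convert to 8-bit grayscale as a stand-in for luminance.
    pixels = np.asarray(Image.open(path).convert("L"))
    # Occurrences of each luminance value (the table above, in array form).
    counts = np.bincount(pixels.ravel(), minlength=256)
    # Probabilities, keeping only values that actually occur.
    p = counts[counts > 0] / counts.sum()
    # H = -sum p(x) log2 p(x), in bits per pixel.
    return -np.sum(p * np.log2(p))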

Thanks a lot
 
  • #2
What you've written seems correct for estimating the entropy of the luminance of each pixel. But I think that if you multiply this by the number of pixels in the image, the total entropy you get will be quite large compared to the sizes of modern image formats. This is for two reasons: one is that efficient image coders do not code the luminances individually, but in blocks (in entropy terms, this means that it's not the entropy of individual pixels that counts, but of whole collections of luminances in a nearby region).
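
To illustrate the block-coding point, here's a minimal sketch for the smallest possible block, a pair of horizontally adjacent pixels (assuming an 8-bit grayscale image stored as a 2-D NumPy array; the name is just illustrative). By subadditivity the result can't exceed the average single-pixel entropy, and for natural images it typically comes out noticeably lower:

Code:
import numpy as np

def pair_entropy_per_pixel(pixels):
    # Encode each horizontal neighbor pair as one symbol in {0, ..., 256^2 - 1}.
    left = pixels[:, :-1].ravel().astype(np.int64)
    right = pixels[:, 1:].ravel().astype(np.int64)
    codes = left * 256 + right
    counts = np.bincount(codes, minlength=256 * 256)
    p = counts[counts > 0] / counts.sum()
    # Joint entropy of the pair, halved to give bits per pixel.
    return -np.sum(p * np.log2(p)) / 2.0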

The other reason is that entropy measures how many bits it takes to describe the image losslessly. Using perceptual methods, however, it is possible to build lower-rate lossy image coders where the loss is not perceptible. There should exist some underlying "perceptual entropy" of a given image, but unfortunately it is not the sort of thing you can calculate directly from the data statistics.

Also, I'm assuming that you use more than 4 luminance values in your experiments, and just shortened the list for the example you included? Because if you're quantizing luminance that coarsely before entropy coding, the result is going to look pretty bad.
 
  • #3
Firstly, I thank you for replying to my post.

quadraphonics said:
Also, I'm assuming that you use more than 4 luminance values in your experiments, and just shortened the list for the example you included? Because if you're quantizing luminance that coarsely before entropy coding, the result is going to look pretty bad.

I will be using full-colour photographs, which should span essentially the full range of luminance values on the Lab scale. I am using a scale of 1000 luminance values [from 0.000 to 1.000], and a 1024x768 photograph has 786,432 pixels, so I think I'll have a huge set of luminance values.

Also, could you please explain why you said "if you multiply this by the number of pixels in the image"? Could you elaborate on the significance of multiplying the entropy by the number of pixels?

Also, could you tell me what the 'entropy rate' is in information theory?

Thanks a lot once again.
 
  • #4
rohanprabhu said:
Also, could you please explain why you said "if you multiply this by the number of pixels in the image"? Could you elaborate on the significance of multiplying the entropy by the number of pixels?

Well, the entropy calculation that you've done gives the average number of bits required to represent the luminance value of a single pixel. If you want to encode an entire image, however, you need to encode ALL of the pixels. So, if you design, say, a Huffman code that represents each pixel's luminance, the total number of bits will be the single-pixel entropy times the number of pixels. Schemes that encode multiple pixels together need fewer total bits, but require more complicated estimation and code design.
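
To make that concrete with the toy table from post #1 (18 pixels in total):

[tex]
H \approx 1.711 \text{ bits/pixel}, \qquad 18 \times 1.711 \approx 31 \text{ bits for the whole image.}
[/tex]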

rohanprabhu said:
Also, could you tell me what the 'entropy rate' is in information theory?

The entropy rate is an extension of the concept of entropy from a random variable to a random process. The regular entropy is defined for a single random variable, and tells you the number of bits needed, on average, to represent the value of the r.v. The entropy rate refers to situations where you have an entire random process (i.e., a sequence of r.v.'s that goes on forever). In this case, the entropy rate is the number of bits you need to encode each new element in the sequence, given all the previous ones. Another way to think of it is as the amount of new information (on average) contained in each new element of the random process.
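
In symbols, for a random process [itex]X_1, X_2, \ldots[/itex] the entropy rate is

[tex]
H(\mathcal{X}) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)
[/tex]

which, for stationary processes, equals [itex]\lim_{n \to \infty} H(X_n \mid X_{n-1}, \ldots, X_1)[/itex], i.e. exactly the "bits per new element given all the previous ones" described above.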
 
  • #5
quadraphonics said:
Another way to think of this is that it's the amount of new information (on average) contained in each new element in the random process.

So, you mean to say, calculating the entropy rate would be an ideal measure for, say, a video... as in, I calculate the Shannon entropy for each frame, and then the instantaneous entropy rate would be [itex]ER = H_i \times \frac{1}{fps}[/itex], where [itex]H_i[/itex] is the entropy of the frame and fps is the frames per second of the video.

However, I am not able to understand a particular application of the entropy rate. From Wikipedia:

The entropy rate of English text is between 1.0 and 1.5 bits per letter

How could this entropy rate have been calculated?
 
  • #6
rohanprabhu said:
So, you mean to say, calculating the entropy rate would be an ideal measure for, say, a video...

Yeah, any signal that goes on for a long (or even indefinite) time: video, radio broadcast, whatever. Even finite sequences can be thought of this way, if they're really long; for example, you could view an image as one really long (one-dimensional) array of values.

rohanprabhu said:
as in, I calculate the Shannon entropy for each frame, and then the instantaneous entropy rate would be [itex]ER = H_i \times \frac{1}{fps}[/itex], where [itex]H_i[/itex] is the entropy of the frame and fps is the frames per second of the video.

Not quite. First, you don't normally bother normalizing the entropy rate to be in units of seconds, so you wouldn't include the fps term. Second, the entropy rate is a property of the process as a whole: you don't normally consider the "instantaneous" rate, but rather the average over the entire process.

rohanprabhu said:
How could this entropy rate have been calculated?

They take long strings of English text and build probabilistic models of them, and then use those to estimate the entropy rate (just like you're doing with images). In the simplest case, they model the text as an i.i.d. process (like you're doing with luminance values) and just estimate the probability of each letter occurring. Then the entropy rate is just equal to the entropy of a single letter (which is substantially higher than the 1-1.5 bit figure). More complicated methods model the text as a process with stronger dependencies, and so estimate things like the probability of pairs of letters, or of entire words. The entropy rate corresponding to these more accurate models is lower. As your model gets more and more complicated, it also gets more and more accurate, and the associated entropy rate estimate decreases toward the "true" entropy rate of the underlying process.
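
A minimal sketch of that simplest (i.i.d. unigram) case in Python; on typical English text it gives a little over 4 bits per letter, well above the 1.0-1.5 bits/letter that richer models arrive at:

Code:
from collections import Counter
from math import log2

def unigram_entropy(text):
    # Crude i.i.d. model of English: keep letters only, ignore case.
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    n = len(letters)
    # H = -sum p log2 p over the observed letter frequencies (bits per letter).
    return -sum((k / n) * log2(k / n) for k in counts.values())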
 
  • #7
Well, that cleared up quite a lot. Thanks :D
 

FAQ: Information Entropy for an Image

What is information entropy for an image?

Information entropy for an image is a measure of the amount of information the image contains. It is a statistical measure of the randomness or unpredictability of the pixel values.

How is information entropy calculated for an image?

Information entropy for an image is calculated from the statistics of its pixel values: build a histogram of a chosen quantity (such as luminance or a colour component), convert the counts to probabilities, and apply the Shannon formula. This first-order calculation ignores spatial relationships between pixels; accounting for those requires modeling blocks of pixels jointly.

Why is information entropy important for images?

Information entropy is important for images because it provides a way to quantify the amount of information and complexity present in an image. It can be used to analyze and compare images, as well as to optimize image compression and storage techniques.

How does information entropy relate to file size?

Information entropy and compressed file size have a direct relationship: entropy measures how many bits are needed to describe the image losslessly, so images with higher entropy have less redundancy, are harder to compress, and produce larger files. Low-entropy images contain more predictable structure and compress to smaller files.

Can information entropy be used to measure image quality?

While information entropy can provide some indication of image quality, it is not a comprehensive measure. Other factors such as resolution, color accuracy, and visual appeal also play a role in determining the overall quality of an image.
