PhillipKP
Hi
I'm trying to convince myself that the conditional entropy of a function of random variable X given X is 0.
H(g(X)|X)=0
The book I'm reading (Elements of Information Theory) says this holds because, for any particular value of X, g(X) is fixed. But I don't understand why that makes the conditional entropy 0.
Obviously I don't understand conceptually what conditional entropy really is...
Can anyone please provide some "gentle" insight into this?
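One way to see the book's point is numerically: once X = x is known, g(X) takes the single value g(x) with probability 1, and a distribution concentrated on a single point has entropy 0, so every term in H(g(X)|X) = Σₓ p(x) H(g(X)|X=x) vanishes. A minimal sketch of that computation (the particular distribution and the choice of g below are arbitrary toy examples):

```python
import math
from collections import defaultdict

# Toy distribution: X uniform on {1, 2, 3, 4}; g is any deterministic function of X.
p_x = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
g = lambda x: x % 2

# Joint distribution of (X, Y) where Y = g(X).
p_xy = defaultdict(float)
for x, px in p_x.items():
    p_xy[(x, g(x))] += px

# H(Y|X) = sum over x of p(x) * H(Y | X = x)
h_cond = 0.0
for x, px in p_x.items():
    # Conditional distribution of Y given X = x: all mass sits on the single value g(x).
    cond = {y: pxy / px for (xx, y), pxy in p_xy.items() if xx == x}
    h_x = -sum(q * math.log2(q) for q in cond.values() if q > 0)
    h_cond += px * h_x

print(h_cond)  # 0.0 — each conditional distribution is a point mass, so each H(Y|X=x) is 0
```

Replacing g with any other deterministic function (or changing p_x) leaves the result at 0; the conditional entropy only becomes positive when, for some x, the conditional distribution of Y spreads mass over more than one value.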