Help with entropy calculation (binary erasure channel)

In summary: the entropy H(Y) of the output of a binary erasure channel splits, via the grouping property of entropy, into the uncertainty of whether an erasure occurred plus the remaining uncertainty about the input symbol when no erasure occurred.
  • #1
Simfish
So in a binary erasure channel, the book says that the entropy of the output is H(Y) = H((1-p)(1-a), a, p(1-a)) = H(a) + (1-a)H(p), where p = Pr(X = 1) (rather than X = 0) and a is the probability that any input X is erased. The problem is: how do you get from H((1-p)(1-a), a, p(1-a)) to H(a) + (1-a)H(p)? There are three arguments to the entropy function and they're not independent (I think), so you can't just brute-force the entropy formula. The main thing I'm confused about is how you get (1-a)H(p), with the (1-a) factor outside the entropy term.
 
  • #2
The step you're missing is the grouping (chain rule) property of entropy. Think of the output Y as generated in two stages: first the channel decides whether to erase the symbol (erasure with probability a, no erasure with probability 1-a); then, given no erasure, the output equals the input, which is 1 with probability p and 0 with probability 1-p. The grouping property says the total entropy is the entropy of the first decision plus the entropy of the second decision weighted by the probability that the second stage is reached: H((1-p)(1-a), a, p(1-a)) = H(a) + (1-a)H(p). Given an erasure (probability a) there is no remaining uncertainty about Y, and given no erasure (probability 1-a) the remaining uncertainty is exactly H(p); that is where the (1-a) factor comes from. You can also verify it by expanding -Σ pᵢ log pᵢ directly: using log(xy) = log x + log y, the terms -(1-p)(1-a) log[(1-p)(1-a)] and -p(1-a) log[p(1-a)] split into -(1-a) log(1-a) (since (1-p) + p = 1) plus (1-a)[-p log p - (1-p) log(1-p)], and together with the -a log a term this gives H(a) + (1-a)H(p).
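The identity can be checked numerically. This is a quick sketch (the function name `H` and the example values of p and a are my own choices, not from the book):

```python
import math

def H(*probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

p, a = 0.3, 0.25  # example values: Pr(X=1) = p, erasure probability a

# Full output distribution of Y: (Y=0, Y=erased, Y=1)
lhs = H((1 - p) * (1 - a), a, p * (1 - a))

# Grouped form: H(a) + (1-a)H(p), with H(x) the binary entropy H(x, 1-x)
rhs = H(a, 1 - a) + (1 - a) * H(p, 1 - p)

print(abs(lhs - rhs))  # agrees to floating-point precision
```

Trying a few other (p, a) pairs shows the two expressions always agree, which is what the grouping property guarantees.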
 

FAQ: Help with entropy calculation (binary erasure channel)

What is entropy?

Entropy is a measure of the amount of uncertainty or randomness in a system. In information theory, it is often used to quantify the amount of information contained in a message or data set.

How is entropy calculated?

Entropy is typically calculated using Shannon's formula H = -Σᵢ pᵢ log₂ pᵢ, summed over all possible outcomes. For a source with two outcomes, this reduces to the binary entropy function H(p) = -p log₂(p) - (1-p) log₂(1-p), where p is the probability of one of the two symbols.
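As a minimal illustration of the binary entropy function (the helper name `binary_entropy` is just for this example):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p), in bits."""
    if p in (0, 1):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # a fair coin carries exactly 1 bit
```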

What is a binary erasure channel?

A binary erasure channel is a communication channel in which each transmitted bit is either received correctly or erased (lost), and the receiver knows when an erasure has occurred. It is often used to model lossy communication links, such as packet loss in wireless networks.

How is entropy used in the context of a binary erasure channel?

In the context of a binary erasure channel, entropy can be used to measure the uncertainty of the transmitted signal. A higher entropy value indicates a more uncertain or unpredictable signal, while a lower entropy value indicates a more predictable signal.

What is the significance of entropy in information theory?

Entropy is a fundamental concept in information theory, as it allows us to quantify the amount of information contained in a message or data set. It is also closely related to other important concepts such as information compression and channel capacity.
