A problem in Information Theory: Shannon Entropy equation, Markov Chains

Homework Statement
I am attaching the problem with the questions in it.
Relevant Equations
Shannon Entropy equation, Markov Chain.
[Problem statement and questions attached as two images.]

Here is my solution to this problem:
2.1 ##\log_2(10{,}000)\approx 13.2877## [bits/sec].
2.2 ##\log_2(10{,}000/2)\approx 12.2877## [bits/sec].
2.3 ##2\log_2(10{,}000)\approx 26.5754## [bits/sec]. (I sanity-check these values numerically below.)
3. For this, I said that ##P(Y_n|X_n)=\frac{7}{8}P(X_n|X_n)+\frac{1}{8}P(X_n+0.5|X_n)=\frac{7}{8}\cdot 0+\frac{1}{8}\cdot 0.5+\frac{1}{8}\cdot 0##, where I assumed that ##P(X_n|X_n)=0##; I am not sure this is correct. In any case, I get ##P(Y_n|X_n)=1/16##. To show that the process is Markovian, I argued that ##Y_n## equals ##X_n## w.p. 7/8 and ##X_n+0.5## w.p. 1/8, and the sequence ##X_n+0.5## is still Markovian, since adding a constant to a Markov chain does not change the fact that it is Markovian. Is this reasoning correct? (I also include a small simulation of this mixing below.)
4. My answer was to multiply all of the results from section 2 by 1/16.
5. I am not sure how to answer.
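As a quick sanity check of the arithmetic in 2.1-2.3, here is a short Python snippet that just evaluates the logs (it assumes, as my answers above do, a source emitting one of 10,000 equally likely values per second):

```python
import math

# 2.1: one of 10,000 equally likely symbols per second
h1 = math.log2(10_000)        # ~ 13.2877 bits/sec

# 2.2: half as many equally likely symbols
h2 = math.log2(10_000 / 2)    # ~ 12.2877 bits/sec

# 2.3: twice the information per second
h3 = 2 * math.log2(10_000)    # ~ 26.5754 bits/sec

print(h1, h2, h3)
```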
Any hints or advice on how to answer these questions would be appreciated!
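Here is also the small simulation of the mixing in part 3 that I mentioned above. Since the actual definition of ##X_n## is in the attached images, the ##X_n## below is only a hypothetical stand-in (i.i.d. uniform draws); the point is just to check empirically that ##Y_n=X_n## about 7/8 of the time:

```python
import random

random.seed(0)
N = 100_000

# Hypothetical stand-in for X_n -- the real process is defined in the
# attached images; i.i.d. uniforms are used here only to exercise the mixing.
X = [random.random() for _ in range(N)]

# Y_n = X_n with probability 7/8, and X_n + 0.5 with probability 1/8
Y = [x if random.random() < 7 / 8 else x + 0.5 for x in X]

# Empirical fraction of steps where Y_n equals X_n
frac = sum(1 for x, y in zip(X, Y) if y == x) / N
print(frac)  # should be close to 7/8 = 0.875
```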

Thanks in advance!
 
