Entropy (Shannon) - Channel Capacity

AI Thread Summary
The discussion focuses on calculating channel capacity using Shannon's formula. A symbol representing 10 bits of information transmitted at 10 symbols per second suggests a capacity of 100 bits per second in a noiseless scenario. However, the presence of noise complicates the calculation, as Shannon's formula incorporates signal-to-noise ratio and probability. Without a statistical model of the channel, it's challenging to determine the true capacity beyond the initial assumption. Ultimately, the conversation highlights the importance of understanding entropy and noise in accurately assessing channel capacity.
frozz
Hi,

I am not sure how to calculate the channel capacity.

If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
second, what is the capacity of the channel in bits per second?

C = 1 - H[x]

How to go from there?

Thanks!
 
frozz said:
If a symbol represents 10 bits of information, and a channel can transmit 10 symbols per
second, what is the capacity of the channel in bits per second?

Err... 100 bits per second?

frozz said:
C = 1 - H[x]

How to go from there?

Well, how's your understanding of (Shannon) Entropy in the first place?
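As a concrete check on the entropy side of this question, here is a minimal sketch (not from the thread itself) of how "a symbol represents 10 bits" falls out of the Shannon entropy formula H(X) = −Σ p(x) log₂ p(x) when the symbol is drawn uniformly from 1024 values:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A symbol drawn uniformly from 2**10 = 1024 equally likely values
# carries H = log2(1024) = 10 bits, matching "a symbol represents
# 10 bits of information" in the problem statement.
h = shannon_entropy([1 / 1024] * 1024)
print(h)       # 10.0

# At 10 such symbols per second, the noiseless rate is 10 * h.
print(10 * h)  # 100.0 bits per second
```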
 
quadraphonics said:
Err... 100 bits per second?



Well, how's your understanding of (Shannon) Entropy in the first place?

Yeah, logically it's 100 bits per second if the channel is noiseless. But Shannon's formula involves a signal-to-noise ratio, or probabilities; that's why I'm not sure.

Thank you!
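The "signal-to-noise ratio" formula being alluded to is presumably the Shannon–Hartley theorem, C = B log₂(1 + S/N), which applies to a band-limited channel with additive white Gaussian noise; the bandwidth and SNR values below are illustrative, not from the thread:

```python
import math

def shannon_hartley(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), bits per second.

    snr_linear is the linear power ratio S/N, not decibels.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz channel at 30 dB SNR (S/N = 1000).
print(shannon_hartley(3000, 1000))  # about 29,900 bits per second
```

Note that as S/N grows without bound (the "noiseless" limit), this capacity diverges, which is one way to see why a finite 100 bits/s cannot be a Shannon capacity without some noise model attached.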
 
frozz said:
Ya, logically it's 100 bits per second if the channel is noiseless. But, shannon's formula has signal noise ratio or probability.. That's why I'm not sure.

Well, to calculate the capacity, you first need a statistical model of the channel. Then you'd use that to look at how much mutual information there can possibly be between the inputs and outputs of the channel. But there is no such model presented here, only the statement that "the channel can transmit 10 symbols per second." So, there doesn't seem to be much to do here except to assume that this figure is the capacity. If the channel were truly noiseless, the capacity would be infinite, not 10 symbols per second.
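To illustrate what a statistical channel model buys you: the formula the original poster wrote, C = 1 − H[x], is the capacity of one specific model, the binary symmetric channel with crossover probability p, where C = 1 − H(p) and H is the binary entropy function. A minimal sketch, assuming that model:

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p, in bits per channel use: C = 1 - H(p)."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0 -> no bit flips, full bit per use
print(bsc_capacity(0.11))  # ~0.5 -> noise halves the usable rate
print(bsc_capacity(0.5))   # 0.0 -> output independent of input
```

Multiplying such a per-use capacity by the symbol rate (10 per second here) converts it to bits per second, which is exactly why the answer depends on the noise model and not just on "10 symbols per second."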
 