- #1
SW VandeCarr
Consider a character string of length L generated at random from the alphabet {T,H}, where T and H each occur with probability 0.5. For any finite L, the probability of a given string is p = (0.5)^L.
The probabilities are the sole determinant of Shannon entropy (S). Therefore I'm claiming that such character strings have Shannon entropy which, given a uniform distribution over the 2^L possible strings, would be S = -log2(p) = -log2((0.5)^L) = L bits.
This is my reasoning for claiming that such character strings have entropy. I've been challenged on this with the argument that each element of the string is a random variable but the entire string is a "constant". In fact, there is no requirement that the string be generated sequentially. A string as defined above with L = 10 has 2^10 = 1024 possible outcomes or states. Is this not an example of entropy?
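As a numerical check, here is a minimal Python sketch (the function name string_entropy is mine, not from the thread): it computes the Shannon entropy of the uniform distribution over the 2^L equally likely strings and confirms it matches the self-information -log2(p) of any single string.

```python
import math

def string_entropy(L, p_symbol=0.5):
    """Shannon entropy (bits) of the uniform distribution over
    length-L strings drawn from the two-symbol alphabet {T, H}."""
    p_string = p_symbol ** L      # probability of any one string
    n_states = 2 ** L             # number of possible strings
    # S = -sum over all states of p*log2(p); all terms are equal here
    return -n_states * p_string * math.log2(p_string)

L = 10
print(2 ** L)                  # 1024 possible outcomes
print(string_entropy(L))       # 10.0 bits
print(-math.log2(0.5 ** L))    # self-information of one string: also 10.0
```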
EDIT: In addition, I'm claiming that if L were an RV and the symbol probabilities P(T) = P(H) = 0.5 were fixed, then S would be a random variable with a known distribution.
(see also LuculentCabal:logarithm of discrete RV Jul 12)
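A small simulation sketch of the EDIT claim (the PMF chosen for L below is purely illustrative, not from the thread): with P(T) = P(H) = 0.5 fixed, S = L bits exactly, so S inherits whatever distribution L has.

```python
import math
import random
from collections import Counter

random.seed(42)

# Illustrative PMF for the random length L (any discrete PMF would do)
pmf_L = {1: 0.5, 2: 0.3, 3: 0.2}

def entropy_bits(L, p_symbol=0.5):
    # With P(T) = P(H) = 0.5, S = -log2(0.5**L) = L bits exactly
    return -math.log2(p_symbol ** L)

lengths = random.choices(list(pmf_L), weights=list(pmf_L.values()), k=100_000)
counts = Counter(entropy_bits(L) for L in lengths)
for s in sorted(counts):
    # empirical P(S = s) should approach pmf_L[s]
    print(s, counts[s] / 100_000)
```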