Information Theory and Mathematics

AI Thread Summary
In information theory, the information content of a statement like "1+1=2" is zero because its probability of being true is 1.0, and the logarithm of 1 is zero. The discussion highlights that you must define the set of possible messages and their probabilities before assessing information content. It raises the question of whether probability is being used in a relative sense for mathematical truths, and whether mathematical structure is what gives such statements probability 1. The conversation notes, however, that vague philosophical interpretations of "information" lack specificity and such threads are often redirected to other sections of the forum. Ultimately, without a defined context, the concept of "information" remains ambiguous in mathematics.
wheelersbit
In (Shannon) information theory, information is said to be the log of the inverse of the probability. What, then, is the information content of 1+1=2, or for that matter of any fundamental axiom?
 
The information content (measured in Shannon's sense) of the message "1+1=2", or of any other proven theorem, is 0.

You have to define what all of the possible messages are first, then figure out the probability of each message; only then can you start asking what the information content is.
 
That follows very simply from the definition you give. The probability that "1+1=2" (or any axiom) is true is 1.0, and its logarithm is 0.
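The point above can be sketched numerically. This is a minimal illustration (not from the thread) of Shannon self-information, I(p) = -log2(p), showing that a certain event carries zero bits:

```python
import math

def information_content(p):
    """Shannon self-information, in bits, of an event with probability p."""
    return -math.log2(p)

# A statement that is certainly true (p = 1.0) carries no information:
print(information_content(1.0))  # 0.0 bits
# A fair coin flip (p = 0.5) carries exactly one bit:
print(information_content(0.5))  # 1.0 bit
```

Note that the function is undefined at p = 0: an impossible event would carry infinite information, which is the other extreme of the same formula.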
 
Doesn't this seem to use a relative sense of probability? For example, could we have said at one time that this was in fact a very improbable event, or do we rely on mathematical realism to tell us it was never improbable?

To give another example: can we say that structure in mathematics is what gives this a probability of 1?

I really appreciate your responses.
 
wheelersbit said:
To give another example: can we say that structure in mathematics is what gives this a probability of 1?

You could say that, but it doesn't have a specific meaning, and all the vague philosophical threads get locked or moved to the PF Lounge section.

If you are looking for some universal interpretation of "information" in the physical world, you'll have to go to the physics sections. In mathematics, the concept of "information" is no more universal than the concept of "rate of change". If you define a specific variable (such as the position of an object, or the price of a stock), then you can talk about a specific "rate of change". If you define a specific probability distribution, then you can talk about its information. If you don't get specific, then "information" isn't a specific thing.
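To make the analogy concrete, here is a small sketch (an illustration, not from the thread) of what "define a distribution first" means: once a probability distribution over messages is specified, its Shannon entropy, H = -Σ p·log2(p), is a well-defined quantity, and a distribution concentrated on one certain outcome has entropy 0:

```python
import math

def entropy(dist):
    """Shannon entropy, in bits, of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Only after defining a distribution can we speak of its information:
fair_coin = {"heads": 0.5, "tails": 0.5}
certainty = {"1+1=2 is true": 1.0}

print(entropy(fair_coin))  # 1.0 bit per message
print(entropy(certainty))  # 0.0 bits: a certain outcome is uninformative
```

The `if p > 0` guard follows the usual convention that 0·log(0) is taken to be 0.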
 
I understand thank you.
 