What is information? Or how is information connected with change?

In summary, information theory is a branch of mathematics that deals with quantifying the amount of information in a given system or situation. It is based on the concept of entropy, which measures the uncertainty involved in predicting the value of a random variable. Shannon's definition of the information carried by an outcome is -log2 of the a priori probability of that outcome, and it applies to situations where the outcome is not known in advance. Continuous functions, which are idealizations used in modelling real processes, formally carry infinite information, so it makes no sense to discuss them in the context of information theory. In reality, however, measurements have limited accuracy and outcomes are usually correlated, resulting in a finite amount of information described by the Shannon-Hartley theorem, one of the foundations of telecommunications.
  • #1
nouveau_riche
What is information?
or
How is information connected with change and dependency?
Take an example:
y = f(x)
If y is independent of x (say y = 3), will there be information?

How does the change df(x)/dx affect the information content?
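
One way to make this example concrete, added here only as an illustrative sketch: treat x as a uniformly random input and ask how much information about x an observation of y carries, i.e. the mutual information I(X;Y). The helper mutual_information_bits and the sample values below are hypothetical, not from the thread.

```python
import math
from collections import Counter

def mutual_information_bits(pairs):
    """Estimate I(X;Y) = sum p(x,y) * log2( p(x,y) / (p(x)*p(y)) ) from (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

xs = [0, 1, 2, 3, 4, 5, 6, 7]  # x uniform over 8 values: 3 bits of uncertainty

# y independent of x (y = 3 always): observing y tells you nothing about x.
print(mutual_information_bits([(x, 3) for x in xs]))      # 0.0 bits

# y = f(x) = 2*x: observing y pins x down exactly, the full 3 bits.
print(mutual_information_bits([(x, 2 * x) for x in xs]))  # 3.0 bits
```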
 
  • #2


Shannon's classical definition of the information carried by a measurement giving the result x is I(x) = -log2 p(x), where p(x) is the a priori probability of that outcome. In particular, if the measurement may give only two equiprobable results (like a tossed coin), the information carried by every measurement is 1 bit.

It makes no sense to speak about the information carried by real functions (except for a very limited subset of them), as such functions are never realized. Formally, they carry infinite information. But in reality your measurements always have limited accuracy, and the outcomes of consecutive measurements are usually correlated.
The amount of information carried by a signal of limited frequency bandwidth, measured with limited accuracy (or blurred by noise), is finite and described by the Shannon-Hartley theorem, which is one of the foundations of telecommunications.

See: http://en.wikipedia.org/wiki/Information_theory
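
A minimal numerical sketch of both statements, with illustrative numbers and helper names that are my own, not part of the post:

```python
import math

def information_bits(p):
    """Shannon self-information I = -log2(p) of an outcome with a priori probability p."""
    return -math.log2(p)

print(information_bits(0.5))    # fair coin toss: 1.0 bit
print(information_bits(1 / 6))  # fair die roll: ~2.585 bits
print(information_bits(1.0))    # outcome certain in advance: 0 bits (shown as -0.0)

def shannon_hartley_capacity(bandwidth_hz, snr):
    """Shannon-Hartley channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr)

# Illustrative values: a 3 kHz channel with a signal-to-noise ratio of 1000 (~30 dB)
print(shannon_hartley_capacity(3000, 1000))  # roughly 29,900 bits per second
```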
 
  • #3


xts said:
Shannon's classical definition of the information carried by a measurement giving the result x is I(x) = -log2 p(x), where p(x) is the a priori probability of that outcome. In particular, if the measurement may give only two equiprobable results (like a tossed coin), the information carried by every measurement is 1 bit.

It makes no sense to speak about the information carried by real functions (except for a very limited subset of them), as such functions are never realized. Formally, they carry infinite information. But in reality your measurements always have limited accuracy, and the outcomes of consecutive measurements are usually correlated.
The amount of information carried by a signal of limited frequency bandwidth, measured with limited accuracy (or blurred by noise), is finite and described by the Shannon-Hartley theorem, which is one of the foundations of telecommunications.

See: http://en.wikipedia.org/wiki/Information_theory
It would be better if you could clarify things with my example.
 
  • #4


It would be better if you could clarify things with my example.

XTS did give you an answer -

If df(x)/dx exists, the function f(x) is continuous and thus carries infinite information.
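
A toy way to see the "infinite information" point, assuming purely for illustration that a single value of f(x) is recorded to k decimal digits:

```python
import math

# Distinguishing one of 10**k equally likely quantisation levels of a value in [0, 1)
# costs log2(10**k) = k * log2(10) bits; the cost has no upper limit as k grows.
for k in (1, 3, 6, 12):
    print(f"{k} decimal digits -> {k * math.log2(10):.1f} bits")
```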
 
  • #5


nouveau_riche said:
It would be better if you could clarify things with my example.
Either I can't (maybe I didn't get your point in a not quite clear example), or I did it already (by telling you that real functions, being only idealisations used in modelling real processes, carry infinite information, so it makes no sense to discuss them in the context of information theory).
 
  • #6


Studiot said:
XTS did give you an answer -

If df(x)/dx exists, the function f(x) is continuous and thus carries infinite information.

To my knowledge, according to Shannon's theory nothing carries information if there isn't uncertainty at the receiver's end.
In the above case, knowing the function in advance reveals all the information in advance.
 
  • #7


To my knowledge, according to Shannon's theory nothing carries information if there isn't uncertainty at the receiver's end.
In the above case, knowing the function in advance reveals all the information in advance.

Pardon?
 
  • #8


Studiot said:
Pardon?

A key measure of information is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).

Those are lines from the Wikipedia article on information theory.

If you still don't get it, then this one is interesting:

Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each is equally and independently likely to be 0 or 1, 1000 bits (in the information theoretic sense) have been transmitted.
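
A short numerical sketch of the two quoted examples, assuming fair and independent outcomes (the helper entropy_bits is my own naming, not from the article):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum p * log2(p), in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))   # fair coin flip: 1.0 bit
print(entropy_bits([1 / 6] * 6))  # fair die roll: ~2.585 bits

# 1000 independent, equiprobable bits: 1000 * 1 bit of information transmitted.
print(1000 * entropy_bits([0.5, 0.5]))  # 1000.0
# The same 1000 bits known ahead of transmission (probability 1 each): nothing transmitted.
print(1000 * entropy_bits([1.0]))       # 0.0
```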
 
  • #9


nouveau_riche said:
Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each is equally and independently likely to be 0 or 1, 1000 bits (in the information theoretic sense) have been transmitted.
That's exactly what I gave you as an explanation in my first response: the information is equal to -log2 of the a priori probability of the outcome. If the outcome is known in advance, then its probability is 1, thus the information is 0.

But how is it related to your original question about continuous functions and their derivatives in the context of information theory? I still can't get what you were asking about.
 
  • #10


xts said:
That's exactly what I gave you as an explanation in my first response: the information is equal to -log2 of the a priori probability of the outcome. If the outcome is known in advance, then its probability is 1, thus the information is 0.

But how is it related to your original question about continuous functions and their derivatives in the context of information theory? I still can't get what you were asking about.

Okay, let's begin from the start.
"Suppose I gave you an equation describing an event."
Now you know in advance about the behavior of the event, so was there any information in the event you were observing?
 
  • #11


nouveau_riche said:
"suppose i gave you an equation describing an event"
now you know in advance about the behavior of event,so was there any information in the event you were observing?
If you give me not only the equation but also the starting parameters (not only the pendulum equation, but also its initial angle and the time you released it), then I can get no information from watching the experiment. I've already got all possible knowledge about it, and there is nothing more to learn from it.
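
A rough sketch of that pendulum point, assuming purely for illustration that the unknown initial angle is one of 8 equally likely values (the numbers are hypothetical, not from the post):

```python
import math

angles = [5, 10, 15, 20, 25, 30, 35, 40]  # hypothetical initial angles, degrees

# Equation known but initial angle not: watching the pendulum reveals which of the
# 8 equally likely motions occurs, an outcome of a priori probability 1/8.
print(-math.log2(1 / len(angles)))  # 3.0 bits learned from the observation

# Equation, initial angle and release time all given in advance: the observed motion
# was certain (a priori probability 1), so watching it teaches nothing new.
print(-math.log2(1.0))              # 0 bits (shown as -0.0)
```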
 
  • #12


xts said:
If you give me not only the equation but also the starting parameters (not only the pendulum equation, but also its initial angle and the time you released it), then I can get no information from watching the experiment. I've already got all possible knowledge about it, and there is nothing more to learn from it.

So, as per your lines, there is a difference in information between the following cases:
Case 1: forming a graph of an event from the equation.
Case 2: forming an equation from the observation.
 
  • #13


The thread remains unresponsive, and I cannot accept that there is not much left in this thread to discuss.
 
  • #14


I didn't respond, as I just didn't understand your question. Would you ask it more clearly?
 
  • #15


xts said:
I didn't respond, as I just didn't understand your question. Would you ask it more clearly?

Is there a difference between observing events after already having an equation, compared with making an observation first and then checking its validity?
 
  • #16


There is no difference in observing.
But there is a difference in information you obtain by this observation.

Take the example of looking out the window and seeing the Sun in the sky.
If you just woke up with a terrible hangover, it brings you some information (don't ask how many bits): that it is already about 11 AM.
If you were just woken by an alarm clock programmed for 7:15 AM and you see the Sun in the sky, it brings you no information about the time (though it may still bring some about the weather): you already knew what time it was, so, being fresh and sober, you could easily predict where the Sun should be seen. Confirmation of something certain brings no information.
 
  • #17


xts said:
There is no difference in observing.
But there is a difference in information you obtain by this observation.

Take the example of looking out the window and seeing the Sun in the sky.
If you just woke up with a terrible hangover, it brings you some information (don't ask how many bits): that it is already about 11 AM.
If you were just woken by an alarm clock programmed for 7:15 AM and you see the Sun in the sky, it brings you no information about the time (though it may still bring some about the weather): you already knew what time it was, so, being fresh and sober, you could easily predict where the Sun should be seen. Confirmation of something certain brings no information.

If there is a difference in information, then the more laws we discover in this universe, the more we lose of the net information in the universe?
 
  • #18


nouveau_riche said:
If there is a difference in information, then the more laws we discover in this universe, the more we lose of the net information in the universe?
We don't lose information. We just sometimes receive the same information twice. Information is not an additive property. If you know the laws ruling the experiment, you can better predict its results, so its actual outcome gives you less (or no) information.
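
A minimal sketch of that point, assuming for illustration a two-outcome experiment and a perfect prediction derived from known laws:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum p * log2(p), in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Outcome X of the experiment: two equally likely results, so H(X) = 1 bit.
print(entropy_bits([0.5, 0.5]))  # 1.0

# Knowing the laws (plus initial conditions) gives a prediction Y that fixes X,
# so given Y the outcome is certain: H(X | Y) = 0 bits remain in the observation.
print(entropy_bits([1.0]))       # 0.0

# Receiving the prediction and then the outcome therefore yields 1 bit in total,
# not 1 + 1 = 2: the second message repeats information already held.
```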
 

FAQ: What is information? Or how is information connected with change?

What is information?

Information can be defined as data or knowledge that is communicated or received through various mediums, such as language, symbols, or technology. It can also refer to the organized and meaningful interpretation of data.

How is information connected with change?

Information plays a crucial role in driving change and progress. It enables individuals and organizations to make informed decisions, adapt to new situations, and improve processes. The availability and exchange of information can lead to the creation of new ideas, advancements in technology, and societal changes.

Can information cause change?

While information itself does not cause change, it can serve as a catalyst for change. When individuals or organizations receive new information, it can lead to a change in perspective, behavior, or decisions, which can ultimately result in change.

How does the concept of information relate to science?

In science, information is a crucial component in the process of understanding the natural world. Scientists gather and analyze data to generate information, which is then used to form theories and hypotheses. The exchange of information between scientists also allows for collaboration and advancement in the field.

How has technology impacted the way we access and share information?

Technology has greatly enhanced our ability to access and share information. The internet, smartphones, and other devices have made it easier to access a vast amount of information from anywhere at any time. Social media and other digital platforms have also made it easier to share and exchange information with a large audience quickly.
