Just what is information, and what is its place in Nature?

In summary: "information" is not a single well-defined concept in physics; the closest well-defined physical notion is entropy, in its thermodynamic and information-theoretic forms.
  • #1
plschwartzx
Just what is information? That is, in physics, how is it conceived? Is it a particle, a wave, a field, or is it another thing in nature entirely? When info is exchanged, what is being exchanged? Or is it just a maths concept that has no physical reality? Et cetera et cetera.
I have looked at other posts and they quickly become discussions of its attributes. But little head-on confrontation with its nature.
 
  • #2
plschwartzx said:
Just what is information? That is, in physics, how is it conceived?

"Information" in general is not a well-defined physics concept. There are various ways of trying to define a technical concept that has features that make the word "information" a useful way to refer to it (for example, ways involving entropy), but no single concept that is labeled by that word.

plschwartzx said:
I have looked at other posts

Are there particular ones that you are confused about?
 
  • #3
PD
I don't think I am confused. I agree exactly that it is not a well-defined concept. But it is passing strange that it has been allowed to remain ill-defined, and that does interest me. It seems that somehow physicists are having difficulty integrating a definition of information into their general understanding. What is stopping a good definition? Et cetera et cetera.
 
  • #4
What did you find as you read the Wikipedia entry on it, the various versions of entropy, or other sources? Without a specific question there are simply too many areas in which corresponding definitions exist. A computer scientist will define it as a known bit; I would say the known difference between two states; and philosophers have certainly filled entire books with it. So at the least you should provide a context and possibly a definition. If you specifically ask for a physical meaning, then thermodynamic entropy is probably the answer. Mathematicians might prefer the Shannon entropy.
 
  • #5
plschwartzx said:
What is stopping a good definition

Perhaps the fact that there is no need for one in order for our physical theories to make good predictions?
 
  • #6
plschwartzx said:
But it is passing strange that it has been allowed to remain ill-defined
It's not ill-defined; rather, there are several reasonable definitions. That's not a problem with the physics; it's a problem with the English language being less than precise and attaching the same word to different things. Thus there is no single satisfactory answer to your question, but as long as you know which of the possible meanings is being used, it doesn't matter.

Serious scientific papers get past this problem by stating clearly what they mean by the word "information", so it's not ill-defined when they use it. Unfortunately many less serious sources are not so scrupulous; it may be better to avoid these than to try to guess what they mean.
 
  • #7
Perhaps I am asking a meta-question. Why is it that "there are simply too many areas in which corresponding definitions exist"?
That just shouldn't happen in a field that is looking for the grand unifying theory of everything.
Is the correspondence, say, between "information" as a concept in thermodynamic entropy and "information" as a concept in Shannon entropy a unified entity?
I am trying to shake the tree a little, because the Cartesian split seems to raise its ugly head here. IIRC, Shannon is basically presenting a psychological theory nicely fitted into a mathematical box. Thermodynamics is about physical entities. The information concept was seen as bridging dualism. But is the use of the same term "information" in so many different contexts a fudge? An example of the equivocation fallacy?
Maxwell was perfectly happy to have his demon have a mind which did the work. Szilard, if I understand him, denies that the demon could operate in a separate part of Nature and insists that processing information requires physical work. Others are still choking on exactly how the demon could work without working. (I am otherwise not interested in the Maxwell demon problem.)
I think that if the apparent dual nature of the term "information", with its physical and psychological aspects, is to be transcended, a lot more weight will need to be given to the psychological aspects. Szilard-like attempts at physical reductionism will fail.
 
  • #8
plschwartzx said:
Why is it that "there are simply too many areas in which corresponding definitions exist"?
That just shouldn't happen in a field that is looking for the grand unifying theory of everything.

Maybe because a single definition of "information" is not necessary for a grand unifying theory of everything?

We don't know in advance which concepts are going to be fundamental. We find that out as we go. There is no reason why any particular concept, whether it's "information" or anything else, must have a place in a fundamental theory.
 
  • #9
PD
OK, let us settle on classical physical information, where its preservation at least seems to be pretty fundamental.
But what is it? Is it a fundamental part of matter? Is it as elemental as, say, space, time, or matter/energy? How does it exist? Is it a field like the Higgs? It seems that it can be transformed. How? Are there particles ("information-ons"?) involved in its transformation? Et cetera et cetera.
 
  • #10
plschwartzx said:
let us settle on classical physical information, where its preservation at least seems to be pretty fundamental

Why?

plschwartzx said:
what is it?

Depends on who you ask.

plschwartzx said:
It seems that it can be transformed.

Why?

Basically, you're making a lot of assertions about "information" when you don't even know what it is, or if it's a well-defined concept at all. That seems backwards to me. First you need to go look at the physics literature to see what, if any, concepts of information are used to make actual physical predictions. Then you can try to make sense of those concepts. But you haven't even done step 1.
 
  • #11
plschwartzx said:
OK, let us settle on classical physical information.
Every scientific term is defined either to describe something measurable (a comparison between the outcome of an observation and a scale) or to abbreviate a more complex situation, whether a summary of conditions, an underlying model, or even a theory. It would not make sense at all to speak of, e.g., force if it weren't meant to describe something measurable and to serve as an abbreviation of a principle. Thus there has to be some advantage to using a term. Information has none in classical physics.

Information is a philosophical term in the first place. It has no direct application to physics. One might talk about the information content of a system, in which case it is called entropy as a terminus technicus; entropy is precisely defined in each of the various contexts (mathematics, thermodynamics, coding and information theory) where it is used.

"Information is a widely used and difficult-to-delimit term. Different sciences (structural and human sciences) regard information as their field of work, in particular computer science, information theory, information science, information technology, information economics, and semiotics; it can be a mathematical, philosophical, or empirical (e.g. sociological) concept.

It is only recently that there have been efforts to link the individual approaches and to arrive at a general concept of information. Corresponding literature is currently mostly found under the heading of philosophy (e.g. in the area of epistemology). For the time being, one cannot yet speak of a unified, generally accepted theory of information."
(Wikipedia, https://de.wikipedia.org/wiki/Information#Definitionen, Google translation)

This summarizes it quite well. Different fields have different concepts of information. Classical physics has none. The closest you can get there is thermodynamic entropy.

If you insist on a definition, we have to look at the science that deals with information. Here is the definition:
The information content of a sign ##x## with probability ##p_x## of appearing is defined as
$$
I(x) = \log_a \left( \frac{1}{p_x} \right) = - \log_a(p_x)
$$
where ##a## is the size of the alphabet, i.e. the number of possible states of an information source (message).
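A minimal numerical sketch of this definition (plain Python; the choice of base 2, i.e. measuring in bits, and the example probabilities are illustrative assumptions, not from the thread):

```python
import math

def self_information(p):
    """Information content in bits: I(x) = -log2(p_x) for a sign with probability p."""
    return -math.log2(p)

# A fair coin flip: each outcome carries exactly 1 bit.
print(self_information(0.5))     # 1.0
# A rarer sign carries more information when it does appear.
print(self_information(1 / 8))   # 3.0
```

Note how the definition makes improbable signs more informative: halving a sign's probability adds one bit to its information content.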
 
  • #12
Information is a noun. Unlike airplanes, which you can see in the sky, the only place information has popped up so far in this universe is in consciousness, so I don't think it's much of a physics subject.
 
  • #13
I'd say information theory is a mathematical subject, and it's very important in its application to statistical physics. The information-theoretical understanding of entropy is, in my opinion, the most general and correct one. Just recently, quantum Maxwell demons were realized, and the 2nd law is obeyed precisely as predicted by the information-theoretical approach first worked out by Landauer and Szilard. Just yesterday there was an interesting experimental (!) paper realizing a cavity Maxwell demon:

http://www.pnas.org/content/early/2017/06/30/1704827114
https://arxiv.org/abs/1702.05161

Although I've not understood the details of the measurement (yet?), I think it's a clear demonstration for the validity of the information-theoretical approach to entropy and thus statistical thermodynamics.
 
  • #14
plschwartzx said:
Just what is information? That is, in physics, how is it conceived? Is it a particle, a wave, a field, or is it another thing in nature entirely? When info is exchanged, what is being exchanged? Or is it just a maths concept that has no physical reality? Et cetera et cetera.

I have looked at other posts and they quickly become discussions of its attributes. But little head-on confrontation with its nature.
One area in physics where the concept of information has been useful is the idea that the flow of information determines the structure of the multiverse (i.e., the Everettian/quantum multiverse). In this approach, computation and information play key roles in determining the structure of the multiverse. If one subscribes to scientific realism and takes the wavefunction as something objective, then the many-worlds interpretation is most likely the best candidate out there. David Deutsch wrote the paper on the flow of information as determining the structure of the multiverse (link: https://arxiv.org/ftp/quant-ph/papers/0104/0104033.pdf).

He has more recently come up with something called constructor theory and written an interesting paper, the constructor theory of information, which claims to present, for the first time, a non-circular definition of information. Instead of saying more about his work, I'll share the link if you are interested in more. I myself have been quite interested in the concept – one reason being that I am really interested in the topic of what "identity" is and what it means to be "distinguishable". I feel that the theory of computation and information may have a lot to offer down the line. Here's the link on constructor theory, where you will find the paper I mentioned and a lot more: http://constructortheory.org/research/

One thing I will mention is that Deutsch says information is not to be regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone (in other words, it's physical, not mathematical). He approaches it from the theory of computation, which he deems physical (contrary to what many say – he has presented his reasoning in his book "The Fabric of Reality" – an AWESOME read). The links to his work and his books present a lot of homework, but I feel that his work and writing have brought me the closest to seeing information as a useful physical concept; in time we might find it to be an essential concept, similar to the concept of energy.
 
  • #15
Information is also used in relativity as "what is known" about a system. If a system changes, what is known about the change can only be known at a distance from the system after a period of time, because the propagation of the information about the change is limited by the speed of light.
 
  • #16
Information seems to me a physics concept, in that the most basic requirement one can assume for having information (by almost any definition) is some degree of order. Thermodynamics in general, and entropy in particular, are very concerned with tendencies toward disorder. While the word may not yet have a formal definition within physics, it doesn't seem fair to label information solely the domain of philosophy.
 
  • #17
There is a great deal of discussion of information in the context of black holes and whether it is destroyed when it passes the event horizon. Whole books have been written about this, but some folks still claim the problem is not resolved. How is this information defined? It seems to be that which makes the system reversible. If you don't know where you came from, you ain't going to get back there. Seems like a rather simple definition.
 
  • #18
joneall said:
How is this information defined?

In the context of the black hole information paradox, "information" really means "whatever is preserved by unitary quantum evolution". In other words, the key issue is that the singularity inside a black hole appears to violate unitarity: for example, if two particles are entangled, and one falls into the hole while the other stays outside, there is no unitary transformation that takes the state before the particle that fell in hits the singularity, to the state after it hits the singularity. When physicists talk about "loss of information" in this particular context, that is what they mean.
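A toy numerical illustration of this meaning of "information": a unitary map preserves the overlap between any two states, so no distinguishability is ever lost. This sketch uses NumPy; the particular states and the randomly generated unitary are arbitrary choices for the demo, not anything from the black hole literature:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 2x2 unitary: the Q factor of a complex QR decomposition is unitary.
u, _ = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))

# Two distinct normalized states.
psi = np.array([1.0, 0.0], dtype=complex)
phi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

# Unitary evolution preserves overlaps: |<U psi|U phi>| equals |<psi|phi>|.
before = abs(np.vdot(psi, phi))
after = abs(np.vdot(u @ psi, u @ phi))
print(np.isclose(before, after))  # True
```

The paradox, in these terms, is that evolution into and through a black hole singularity appears to have no such overlap-preserving map.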
 

FAQ: Just what is information, and what is its place in Nature

What is information?

Information is a broad term that refers to knowledge, data, or facts that are communicated, received, or processed. It can take many forms, such as words, numbers, images, or symbols, and can be transmitted through various means, including language, technology, and sensory perception.

How is information created?

Information is created through the process of encoding, which involves transforming raw data into a meaningful form that can be understood and interpreted by a receiver. This can be done through various methods, including language, mathematics, and technology.

What is the role of information in nature?

Information plays a crucial role in nature as it allows organisms to communicate, learn, and adapt to their environment. It is also essential for the functioning of complex systems, such as ecosystems and the human brain.

Is information a physical entity?

This is a debated question in the field of information theory. Some argue that information is a physical entity that can be measured and quantified, while others believe that it is an abstract concept and cannot be treated as a physical object.

How does information relate to entropy?

Entropy is a measure of disorder or randomness within a system. Information can be seen as the opposite of entropy, as it reduces uncertainty and increases order. In other words, the more information that is present, the lower the entropy and vice versa.
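In Shannon's terms this relationship can be made quantitative: gaining information reduces entropy by exactly the number of bits gained. A small sketch (plain Python; the eight-state example is an illustrative assumption):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Eight equally likely states: maximal uncertainty, H = 3 bits.
h_before = entropy_bits([1 / 8] * 8)

# Learning one bit ("the state is in the first half") halves the
# possibilities and lowers the entropy by exactly that one bit.
h_after = entropy_bits([1 / 4] * 4)

print(h_before, h_after)  # 3.0 2.0
```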
