What are the fundamental information-processes in physics?

In summary: the thread discusses the concept of fundamental information-processes in physics, which has long been a topic of debate and speculation. There are multiple theories and approaches, such as Carlo Rovelli's "Relational QM", but the issue is complex and often counter-intuitive. The opening poster makes three main points: 1. physical observation is not just recording data; 2. the actual information-processes in physics are complex; and 3. observation in physics could involve a recursive process. The thread also highlights the contextual nature of observation and the fact that nothing in physics is observed simply and directly.
  • #1
ConradDJ
Gold Member
I’ve been thinking about comments made by Fra in a number of threads, where he raises questions like – what does an observer “see” at the sub-atomic scale? We could make a long list of more and less reputable ideas about the fundamental information-processes in physics, going back to Wheeler’s slogan “It from Bit”. My own favorite approach is Carlo Rovelli’s “Relational QM”, which tries to derive the quantum formalism from information theory – but there are many variations on this theme.

There’s also a lot of well-established theory directly relevant to the question of physical information-processes. The problem is, this is exactly where theories tend to become counter-intuitive and even contradictory. A couple of obvious examples – the issue in QM about when a physical interaction constitutes a “measurement”, or the problem of how “non-local” quantum correlations can occur, since Relativity limits physical communication to the speed of light.

So despite all the relevant theory and experiment, we have no clear picture of what actually happens with information, in the physical world. As a result, the ideas about fundamental information-processes in physics tend just to be guesses, sometimes almost unrelated to all that’s known about QM and particle physics.

There are three main points I want to make about this line of thought, which I’m putting in separate posts below.

1. Physical observation (communication) is not just recording (transmitting) data. This may be obvious, but it’s important because information-theory was to begin with a theory about data-transmission.

2. The actual information-processes in physics are anything but simple. Logically simple processes (like duplicating data) almost don’t occur in physics, while what does happen in physical interactions has an informational structure that’s profoundly complex, in more than one respect.

3. Observation (communication) in physics could be a process that defines itself recursively. This goes back to Rovelli’s interpretation of quantum measurement, and suggests a way that inherently complex information-processes might nevertheless be fundamental.
 
  • #2
1. Physical observation (communication) is not just recording (transmitting) data.

Before QM, it was a nearly unquestioned assumption that information resides in the real, objective properties and states of physical entities. To “observe” something meant to make a more or less accurate copy of the data given in the thing itself – to make a mental or physical record of some kind. To “measure” something meant to copy the given data into a standard numeric format, as when we measure the length of a stick with a ruler.

Likewise standard information-theory was based on the idea that data given at one location can be copied through some data-transmission channel to another location. Since the data was assumed to be inherently determinate, the fundamental issue was only the accuracy of the copying process through a given channel.
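(To make this classical picture concrete, here is a minimal Python sketch, purely illustrative, of the channel view just described. The binary symmetric channel and its capacity C = 1 - H(p) are textbook Shannon theory; the flip probabilities are arbitrary example values.)

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p), in bits, of a coin with bias p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob: float) -> float:
    """Capacity C = 1 - H(p) of a binary symmetric channel.

    The data is assumed determinate in advance; the only question
    is how faithfully it can be copied through the channel."""
    return 1.0 - binary_entropy(flip_prob)

# A noiseless channel copies perfectly (C = 1 bit per use);
# a channel that flips half the bits conveys nothing (C = 0).
for p in (0.0, 0.1, 0.5):
    print(f"flip prob {p:.1f}: capacity {bsc_capacity(p):.3f} bits/use")
```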

But in QM, we no longer have inherently determinate data to begin with. It’s not clear how there get to be definite facts in the world, but all the evidence indicates that this occurs only to the extent that information about things is actually observed, i.e. communicated between one physical system and another.

So however we conceive the fundamental information-processes, it seems that they involve more than duplicating or processing logical data that's "just given" in advance -- they have to do with how information gets defined (determined) in the first place, through physical interaction. And physical communication is more than just transmitting logical data that’s assumed to be well-defined in and of itself.
 
  • #3
2. The actual information-processes in physics are anything but simple.

If we ask, how is information determined (observed) through physical interaction? – the problem isn’t that we have no answer. There are way too many answers, since there are multiple ways of measuring any physical parameter. And, measuring any parameter always requires that several other parameters be “known” – so any way of observing anything assumes we have information gained from other observations of other kinds of things. The bottom line is that in physics, nothing is ever observed simply and “directly”.

For example, nearly all observations refer to relative locations in space and time. This seems simple, but if we ask what’s physically required to define a distance in space or an interval of time, we have to go back to atomic structure, since without atoms there would be nothing physically usable as “clocks and rods”. This bothered Einstein quite a bit, that the simple logical elements of Relativity could only be defined, operationally, by reference to the complex structure of atomic matter.

The issue isn’t just that observation is complex – it has a kind of complexity we don’t know very well how to deal with, because it’s essentially contextual. Somehow or other, this web of many different kinds of physical interaction provides contexts in which each kind of interaction can convey definite, observable information. But each type of interaction requires an appropriate type of context, in which other types of interaction are involved.

Fra suggests that to get to the base-level information-processes, we need to “scale down” to the level of very simple observers. That makes sense – but what would be the simplest level at which each item of information can have a context of other information that makes it observationally meaningful?

Now even the simplest interactions – apart from virtual ones – convey more than one distinct type of information. E.g. photons transfer momentum (which is “analog” data) and also spin-orientation (which is “digital” data, with only two possible values in any reference-frame). And in neither case does the data just get copied from one system to another – instead, a change in momentum of the charged particle that emits a photon is balanced by an opposite change in the momentum of the receiving particle.

We know that the interaction-structure of physics – taken as a whole – supports the communication of observable information. This is in a sense the most basic and most obvious fact about our world. But each type of information gets communicated along with other types, and gets defined in terms of other types. If we “scale down” to the simplest, most fundamental information-structures, how much of this complexity do we have to preserve, to maintain this sort of functionality?
 
  • #4
Hello Conrad,

I largely agree with what you say. From what I recall of previous discussions, we are reasonably close in reasoning, but if I remember your viewpoint correctly, you - like Rovelli - don't seek to change QM as such, just reinterpret it?

This is consistent with your being led to your key question...

"but what would be the simplest level at which each item of information can have a context of other information that makes it observationally meaningful?"

ConradDJ said:
My own favorite approach is Carlo Rovelli’s “Relational QM”, which tries to derive the quantum formalism from information theory – but there are many variations on this theme.

A note: I'm not sure which of his writings you specifically refer to, but my impression is that Rovelli is not so ambitious as to actually try to "derive the formalism" from a deeper principle. This would IMHO at least imply different turns than those taken by Rovelli, but it might well be done reasonably in the spirit of the RQM paper. I think that paper is yet to be written.

ConradDJ said:
1. Physical observation (communication) is not just recording (transmitting) data. This may be obvious, but it’s important because information-theory was to begin with a theory about data-transmission.

Indeed. The information view I'm consistently referring to cannot be treated by standard Shannon-type information theory. There ARE books on "quantum information theory" that apply to STANDARD QM and more or less use standard information theory, but that is of course not what I am talking about. The intrinsic information view is something deeper, not yet existing.

We actually also need a new development of mathematical information theory that fits the intrinsic, evolving idea here.

I'm not sure how to put it, but maybe one could say that Shannon's theory relates to what we are looking for a bit like SR relates to GR.

ConradDJ said:
but what would be the simplest level at which each item of information can have a context of other information that makes it observationally meaningful?

Assuming we more or less are on the same page about the general ideas, it's true that the biggest problems to solve are

i. How can these ideas be turned into something that can make predictions and thus tell us something about the structure of physical law?

ii. Where do we start?

(i) The basic conjecture is that the laws of physics are inferrable constraints on actions, and that physical interactions, with actions and reactions from the environment, have the traits of an inference process. The self-reference suggests that the inference processes are also constrained by the current context, and in turn the current context is also affected by the inference processes (see the analogy with GR? dynamics in spacetime vs dynamics of spacetime).

Now this suggests that the inference processes and the context are evolving. If we can describe this process, and find a reasonable "initial condition", then predictions of the processes that are most likely to emerge might be possible. The population of inference systems would correspond to the population of laws and properties of matter (standard model).

(ii) Where to start, the initial condition? I've given this some thought, and the current starting point I consider is simply the notion of distinguishability, which is basically a bit if you like. But since there is no certainty, you can say that the starting point is an unreliable bit.

So the process is pictured like a game. The action is based on the premise at hand, reliable or not, because it's the only choice.

The other major issue is to describe how one can play with one uncertain bit and, by means of compression, get say TWO unreliable bits, and thus grow memory (mass). I picture the origin of mass like a game; the attraction of bits is only possible if the action system has the right compression.

Note that the entire view here is from the point of view of the simplest possible observer.

The point is that when you start at this point, you can use combinatorics to build expectations and possibilities. There is no such thing as the continuum at this point.

It might be tempting to think that an unreliable bit is better described by a continuum probability, but unfortunately I see no a priori physical basis for this continuum. Instead, as the complexity of the observer grows, time-history sequences can be stored and discrete probability spaces can be built, so that continuum probability and continuum information models are emergent only in the "large mass/complexity limit".

But if you were to start with a limit case in a fundamental construction you have totally lost track of things. The limit is a limit, not a starting point.
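(As a toy illustration of this - my own Python sketch, not Fra's construction: an observer whose memory holds N past outcomes can only distinguish relative frequencies k/N, so its probability space is discrete, and something like a continuum of probability values appears only as N grows.)

```python
import random
from fractions import Fraction

def estimate(memory_size: int, bias: float, rng: random.Random) -> Fraction:
    """Estimate the bias of an 'unreliable bit' from a finite history.
    The only values an observer with N stored outcomes can report
    are the N+1 fractions k/N: a discrete probability space."""
    hits = sum(rng.random() < bias for _ in range(memory_size))
    return Fraction(hits, memory_size)

rng = random.Random(0)
for n in (2, 8, 64, 1024):
    est = estimate(n, bias=0.3, rng=rng)
    # Resolution of the discrete space is 1/n; only in the large-memory
    # ("large mass/complexity") limit does it approximate a continuum.
    print(f"memory {n:5d}: estimate {float(est):.4f} ({est}), resolution 1/{n}")
```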

ConradDJ said:
3. Observation (communication) in physics could be a process that defines itself recursively. This goes back to Rovelli’s interpretation of quantum measurement, and suggests a way that inherently complex information-processes might nevertheless be fundamental.

Recursion or evolution is the only solution I have to these ideas. Non-recursive static solutions are not consistent with my reasoning, as far as I can see. Again, compare with Einstein's expanding universe. But recursion is not a problem. It's only a problem for the mind if you insist on nailing down a static realist view of the world, when in fact there is none.

Instead I try to understand the effective laws of evolution; that's as close to the truth as I think we'll come.

/Fredrik
 
  • #5
There would be two ways of looking at information here.

One assumes information to exist at a locale. The other that information is created at a locale. And these are two different views (though in both cases you would appear to find information at a locale).

The idea that bits actually could just exist would be the standard view - the information theoretic approach. The static view.

The idea that information gets created, gets shaped up, would be the semiotic approach.
The active or process view.
 
  • #6
Thanks for your comments... I'll consider and respond below. Meanwhile here's my 3rd installment, trying to make the connection with biological evolution.


3. Observation (communication) in physics could be a process that defines itself recursively.
ConradDJ said:
If we ask, how is information determined (observed) through physical interaction? – the problem isn’t that we have no answer. There are way too many answers, since there are multiple ways of measuring any physical parameter. And, measuring any parameter always requires that several other parameters be “known” – so any way of observing anything assumes we have information gained from other observations of other kinds of things.


Since we want the foundations of physics to be ultimately simple, it hardly seems promising to struggle with untangling this complicated “semantic web” of different types of physical information, communicated by many different kinds of interaction in different physical contexts. In what way could that lead to any fundamental insight?

Let’s go back to the thought that in QM, information is only well-defined to the extent that it gets physically communicated from one system to another. First of all, that’s not really right – it glosses over the whole issue of entanglement. What QM actually says is that when two systems interact and exchange information, their states get correlated. But these “states” are still superpositions of all their possible correlations.

The big issue in QM is that information-transfer as such doesn’t “collapse” the wave-function to give a definite fact about a system’s state. It only creates an entangled superposition of the states of the two systems. And yet, any time an actual observer (whatever that is) looks at a system, it never sees a superposition – it always sees definite information.

Rovelli’s solution to this paradox in his “Relational QM” is simple: everything counts as an “observer” and every interaction produces definite information (collapsing the superposition) – but only for the two systems that interact. For any other system, the systems remain in an entangled superposition of their possible states – until they interact with that third system. That interaction also conveys definite information... but all three systems are still entangled in superposition so far as any other system is concerned, until a further interaction happens ... and so on. In other words, definite information only exists in the web of communications between individual systems, not in any overall “objective reality” common to all systems at once.

What I want to take from this is that the fundamental information-process in physics may be recursive – i.e. defined in terms of itself. That is, information about a system becomes a determinate fact insofar as it’s successfully “observed” by another system – which means that the information gets passed on and is “observed” by another system, and so on. Information is “communicated” insofar as it results in information that’s communicated.
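(A toy Python sketch of this recursive criterion – my own illustration, not a physical model: "communicated" is defined in terms of further communication, with the current frontier of interactions as the base case. The network of transmissions is invented for the example.)

```python
# Toy sketch: facts are relative to observers, and "communicated" is
# defined recursively, like reproductive success: a transmission
# succeeds insofar as it leads to further transmissions.

transmissions = {
    # system -> the systems it passed the fact on to
    "A": ["B", "C"],
    "B": ["D"],
    "C": [],          # C received the fact but never passed it on
    "D": ["E"],
    "E": [],
}

FRONTIER = {"E"}      # systems holding the fact "now" (base case)

def communicated(system: str) -> bool:
    """A system's copy of the fact counts as communicated if it was
    passed to some receiver whose copy is in turn communicated,
    or if the receiver sits at the current frontier."""
    for receiver in transmissions.get(system, []):
        if receiver in FRONTIER or communicated(receiver):
            return True
    return False

for s in transmissions:
    print(s, communicated(s))
# A, B, D succeed (their copies propagate to the frontier); C does not,
# and E's status awaits further interactions.
```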

This is interesting because of the parallel with “reproduction” in biology, which is also defined recursively. I’ll explain that statement in a moment. But first – reproduction is clearly the fundamental information-process underlying biological evolution. And, it’s an inherently complex process, because molecules don’t just make copies of themselves. Every complex organic molecule gets created in a catalytic process that requires other types of molecules, which require other types of molecules to make them. It’s complex in many ways – including that to reproduce any organism, many distinct replication processes are involved that all have to be as exact and reliable as possible – and yet the failure of exact replication (mutation) is also critical to the evolutionary process.

So biological reproduction does not mean the exact duplication of an organism’s structure. An organism reproduces itself successfully insofar as it produces offspring that reproduce successfully, which means their offspring also reproduce, and so on. At any given point in evolutionary history, all the ancestors of all living organisms have actually met this criterion of continuous successful reproduction – so this is more than a theoretical definition.

Similarly, in Rovelli’s Relational QM, from the standpoint of any observer, all the information about systems in the past that’s relevant to currently available information is factually determinate for that observer... even though this “determining” happens only for one observer at a time, in each interaction.

Now I realize Relational QM has its own unresolved issues – I’m not trying to propose any specific answer here. My point is that what’s most paradoxical about the issue of quantum measurement – the lack of any clear difference between interactions that “entangle” the wave-functions of systems and interactions that “collapse” them – could well be pointing to a recursive information-process in physics. And we know of at least one other field in which this kind of process plays a fundamental role, despite its inherent complexity.


Is there any conclusion from all this? First, I think there are strong reasons for trying to sort out what’s actually involved in the many different ways that information gets physically defined and communicated – even though in no case is this a logically simple process. But even how to approach this kind of analysis is a big open question.

Second, identifying basic information-processes in physics may not mean hunting for a single logical or mathematical principle from which we can derive the whole variety of observable phenomena. It could mean getting a clearer notion of the functionality involved in a system where information isn’t just “given” in the reality of “the things themselves,” independent of any context, but instead actually has to be defined in the context of other information in the web of real-time interaction.
 
  • #7
Fra said:
I'm not sure what of his writings you specifically refer to but my impression is that Rovelli is not as ambitious to actually try to "derive the formalism" from a deeper principle.


Actually his original Relational QM paper did try this... based on two main postulates. He needed a third to complete the derivation, but the meaning of that one wasn't very clear.

1. There's a maximum amount of information that can be extracted from a system.
2. It's always possible to acquire new information about a system.

What's interesting is that these postulates seem to be directly contradictory... if we’ve gotten the maximum information from a system, how can we then get more?

Well, at any given time and place, a system carries finite information. (This distinguishes QM from classical physics, where the location, momentum etc. of any particle is supposed to be precisely determinate, and so requires an infinite quantity of information to specify it.)

But if we then interact with the system, putting it in a new physical context – as Rovelli says, “ask a new question” of the system – we get new information. The “answer” is inherently unpredictable, because this information wasn’t already there in advance. In QM answers come into being only in response to physical questions.
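(This can be illustrated with the standard single-qubit example – a minimal numpy sketch of my own choosing, not anything from Rovelli's paper: a qubit carries at most one bit, repeating the same question yields nothing new, but a complementary question always yields a fresh, unpredictable bit.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Spin-z basis states, and the complementary spin-x ("new question") basis.
up_z, down_z = np.array([1.0, 0.0]), np.array([0.0, 1.0])
up_x = (up_z + down_z) / np.sqrt(2)
down_x = (up_z - down_z) / np.sqrt(2)

def measure(state, basis):
    """Projective measurement: returns (outcome index, collapsed state)."""
    probs = np.array([abs(np.dot(b.conj(), state)) ** 2 for b in basis])
    k = rng.choice(len(basis), p=probs / probs.sum())
    return k, basis[k]

state = up_z                                  # maximal info: one bit (spin-z)
print(measure(state, [up_z, down_z])[0])      # always 0: no new information
k, state = measure(state, [up_x, down_x])     # new question: fresh random bit
print(k)                                      # 0 or 1, probability 1/2 each
print(measure(state, [up_z, down_z])[0])      # and the old answer is now reset
```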

Now this is just the dichotomy apeiron pointed out above:
apeiron said:
There would be two ways of looking at information here. One assumes information to exist at a locale. The other that information is created at a locale.


Rovelli says that in the quantum world we have both finite pre-existing information (defined by past contexts/questions) and the creation of new information (defined in the context of a new question). Apparently just putting these two principles together goes a long way toward giving us the structure of QM.

I don’t know whether this has anything at all to do with Rovelli’s quantum gravity work. He’s recently been emphasizing the timelessness of fundamental physics, which seems to me to go in a different direction.

But I think the RQM approach definitely relates to the issue we’re thinking about. Instead of conceiving the universe as a body of given fact – that could in principle be described by an omniscient observer outside the universe, outside of time – Rovelli seemed to be imagining it as a kind of information-process that creates new facts “in real time,” based on a selected set of past facts carried forward through the web of interaction.

So there's a kind of "Q&A" model of physics here. I'm envisioning it as something like a web of "measurement events" that define new information in a context given by past events, and pass that information on to other events, as part of their context.

If anyone wants the link to Rovelli's RQM paper -- the postulates are on pg 11-12:

http://arxiv.org/abs/quant-ph/9609002v2
 
  • #8
Fra said:
Where to start, the initial condition? I've given this some thought and the current startingpoint I consider is simply the notion of distinguishability, which is basically a bit if you like. But since there is no certainty, you can say that the starting point is a unreliable bit.

So the processes is pictured like a game. The action is based on the premise at hand, reliable or not, because it's the only choice.

Hi Fredrik –

I tried to describe possible “initial conditions” in this recent thread –

https://www.physicsforums.com/showthread.php?t=331008

I do think we’re working in a very similar direction. And I think that at some point, the basic evolutionary mechanism we’re looking for will become clear, though it certainly isn't yet.

Before Darwin, it was obvious to everyone that living organisms reproduce themselves. And in pointing out that “natural selection” would inevitably effect the development of species, just the same way that human breeders develop new strains of plants and animals through selective reproduction, he was pointing out something also quite obvious. Even so, it took a long time before people’s mental focus could adjust to the evolutionary concept. Even today it’s remarkable how people can completely misunderstand how biological evolution works.

The thing is, with biology the information that’s evolving is objectively there, built into the structure of molecules and organisms. We can deal with it entirely in terms of “classical” determinate facts about things-in-themselves.

But in physics, the information we’re talking about isn’t just there as a set of given, context-independent facts about real objects. What “evolves” is information being communicated through the web of real-time interaction between things. And even though this kind of information is actually all anyone ever experiences of the physical world, as we interact with things in the moment, we’re so used to thinking of the world in terms of given facts about real objects that it may take a long time to make the mental adjustment.

But like reproduction in biology, what we’re trying to describe would be something that happens everywhere in physics, in every interaction that communicates information. So it's whatever makes the difference between "actual" measurement-events and "virtual" events that don't convey definite information.

As to “initial conditions” and the “starting-point”... If we have an “unreliable bit” we also need a way for that to make a difference to something else, which could then make a difference to something else, and so on. So it seems we need to be thinking of something like an interaction-web from the start. It would have to support some kind of random decision or agreement – I think this is what you must mean by your process of “inference” – but something much more “primitive” than the precise causal determining of one event by another, in classical physics.

It’s hard to imagine what very primitive measurement-events might be like – just as it’s hard to imagine what the earliest self-replicating systems may have been like in biology. But for me the key point is to keep in mind – there’s no such thing as information without a context to which it makes a difference in some way – where “making a difference” means passing on information that makes a further difference, in some way... The "game" has to reproduce itself.

Thanks again to you and apeiron... -- Conrad
 
  • #9
ConradDJ said:
Actually his original Relational QM paper did try this... based on two main postulates. He needed a third to complete the derivation, but the meaning of that one wasn't very clear.

1. There's a maximum amount of information that can be extracted from a system.
2. It's always possible to acquire new information about a system.

Ok I see what you mean. I have read his paper before.

Several people, including Ariel Caticha in other papers, have tried to derive QM from plausible axioms. But I have not yet found one that doesn't add nontrivial axioms. They are just alternative ways to axiomatically introduce standard QM.

I really do think that you can do away with some of the postulates, but it will also deform QM.

Postulate 3 is more or less the key to the QM logic, which he just postulates. I didn't like that at all.

Other than that, I've said before that parts of the RQM paper are excellent. So it's worth reading at any rate.

I might comment more later.

/Fredrik
 
  • #10
Conrad, it seems you have some issues with trying to make sense of Rovelli's reasoning in the light of these radical ideas we are discussing. If that's the point, I agree, and I think it's because Rovelli's ambition is not (IMHO at least) really to implement the full version of the reasoning we discuss here.

As I see it, Rovelli doesn't acknowledge that the framework for the information about the state of a system, which can only be communicated, cannot be distinguished from the framework for the information about the evolutionary laws (laws of physics, symmetries etc.) that a system obeys.

Rovelli in other papers makes clear that he believes in fundamental and objective symmetries, like diffeomorphism symmetry in GR. I think that consistency of reasoning here, if we really take what we are discussing seriously, doesn't allow any fundamentally objective (non-inferred) laws, simply because there is no perfect inference/measurement process, and thus there can be no perfectly certain laws.

This last thing makes everything much more complex indeed, but it also opens up more possibilities of unification of spacetime AND matter that IMO Rovelli's reasoning won't easily allow.

About the recursive nature of nature, I also agree. The reason for the recursion, as I see it, lies in the fact that it's not possible, by physical inference, to arrive at a static conclusion. Every conclusion you make is constantly evolving due to a self-reference. But this is not circular reasoning, because the self-reference is sufficiently constrained to warrant stability. Which is pretty much exactly like in biology: controlled imperfections to allow diversity, combined with a selection principle.

Lee Smolin is closer to this reasoning than Rovelli is, if you read any of his books - for example "The Life of the Cosmos", which is a popular-style book without math. This is a book where he suggests his Cosmological Natural Selection: that there is diversity and reproduction of universes with different laws. The diversity is small to ensure stability. His main idea with CNS is that this takes place in black holes (a black hole gives rise to a new universe with slightly varied laws).
(https://www.amazon.com/dp/0195126645/?tag=pfamazon01-20)

I think there are different ways to picture this though, so I am not tuned in specifically on Smolin's CNS, but from a general standpoint Smolin has written more in line with this than Rovelli seems to have done.

Smolin also has a talk about the "reality of law"
(http://pirsa.org/08100049/)

/Fredrik
 
  • #11
Fra said:
Conrad, it seems you have some issues with trying to make sense out of rovelli's reasoning, in the light of these radical ideas we are discussing. If that's the point, I agree, and I think it's because Rovelli's ambition is not (IMHO at least) really to implement the full version of the reasoning we discuss here.

See page 16 in the RQM paper.

"My effort here is not to modify quantum mechanics to make it consistent with my view of the world, but to modify my view of the world to make it consistent with quantum mechanics"

Thus, he really does not even question QM. His derivation is "invented/constructed" to fit precisely.

I think Rovelli's paper would be great to read in your first QM course, because it lines out a "proper way" to understand quantum theory, as opposed to classical thinking. But I long since left realist thinking; that is not at all the problem I have with QM. I fully accept the lack of realism and determinism of outcomes. My objections are deeper, and these objections are not addressed by Rovelli at all.

So I think Rovelli is not the answer to the evolving constraint/symmetry and physical inside inference views we were discussing.

/Fredrik
 
  • #12
I looked up Ariel Caticha on arXiv and found a bunch of interesting papers more recent than the one I’d seen before. Have you looked at this? What do you think?

From Inference to Physics -- http://xxx.tau.ac.il/abs/0808.1260

As to Rovelli --
Fra said:
See page 16 in the RQM paper.

"My effort here is not to modify quantum mechanics to make it consistent with my view of the world, but to modify my view of the world to make it consistent with quantum mechanics"

Thus, he really does not even question QM... So I think Rovelli is not the answer to the evolving constraint/symmetry and physical inside inference views we were discussing.


I agree, he does not provide an answer. He doesn’t try to reach any conclusion about what happens in the measurement process. And I agree that the 3rd “supplementary” postulate he uses to derive QM isn’t trivial, so we don’t know what it might be telling us about QM. Most important, Rovelli doesn’t try to explain how it is that different observers end up agreeing on a common, “objective reality” – given that there are only “facts” about things for each observer, not in an absolute sense. He just says, this is the way QM describes the world, and experiments show that’s correct.

And I agree that in his other work he seems to be on a different track – though I don’t pretend to understand what’s going on with Quantum Gravity theories. But I think both Rovelli and Smolin still tend to think about the world as a structure of relationships “seen from outside” rather than only from a point of view inside the interaction-web.

The reason I keep coming back to the Relational QM paper is that it puts forward this basic idea so clearly – that whatever a measurement is, it happens for one observer at a time, not as a once-and-for-all, objective “collapse” of a quantum superposition. To me this is key to seeing the web of one-on-one relationships “from inside”. I think that if more people took this point of view in thinking about QM, we’d start to make real progress at last with the measurement issue.

But I don’t really get why you want to go beyond QM, or to modify it in some way. Maybe that will ultimately be necessary. But we’re still only part-way to understanding what the basic weirdness and simplicity of QM are telling us about the world.

So I like your quote from Rovelli: Let’s not try to change QM to make it look “sensible” in terms of an old view of things that’s clearly inadequate – let’s find a new way of thinking about the world that makes QM make sense, as it is.

That isn’t to say we shouldn’t try to derive QM from more basic principles. Rovelli’s paper begins by saying we’ll only know we have a good understanding of QM when we can derive the formalism from clearly stated principles – the way Relativity is derived from the postulate about the speed of light.

Rovelli's first two postulates do seem pretty clear to me... so if you have ideas on what the third one says, I’m definitely interested. Does this relate to Caticha’s work in any way? In the earlier paper I read of his, he thought that the linearity of QM was essentially required by consistency conditions -- so I don't think he was proposing any change to QM.
 
  • #13
ConradDJ said:
I looked up Ariel Caticha on arXiv and found a bunch of interesting papers more recent than the one I’d seen before. Have you looked at this? What do you think?

From Inference to Physics -- http://xxx.tau.ac.il/abs/0808.1260

I think I've seen that. I like and share a good part of his overall spirit and reasoning. Relative to some other programs I'm fairly close to his reasoning, but I object to a few quite important fundamental things in it; I think I've elaborated on that before. Ariel continues along a line of reasoning from E.T. Jaynes, who wrote a nice book, "Probability Theory: The Logic of Science"

https://www.amazon.com/dp/0521592712/?tag=pfamazon01-20

E.T. Jaynes even has a line of reasoning early in that book where he kind of "derives" the formalism of standard probability theory from a sequence of plausible reasoning and plausible postulates.

One key postulate is to associate degree of belief with a real number.

For reasons that are implicit in the way I've reasoned several times here, this does not quite make sense and is not satisfactory.

Ariel builds onto Jaynes' foundation. I am suggesting that the very reconstruction (which Jaynes did by reconstructing probability theory as an extension of logic, as a type of reasoning) should be done differently. The introduction of the concept of the continuum must not be made so easily.

What I want to do is similar to what Ariel and Jaynes do, but with at least two key differences. First, in the reconstruction of probability as an extension of logic, and thus as an abstraction of inference, I reject the real numbers as physically distinguishable; instead I am working on a discrete reconstruction, so we get a discretized version of probability theory. Second, I don't like the way Ariel reasons about which entropy measure is "right". In my view, there is no universal entropy measure. Instead the "right" measure is a result of evolving the reasoning sufficiently many times.

This is basically the objection that some have about maxent methods - that the choice of entropy is ambiguous, because the entropy measures in use result from requiring (for ambiguous reasons) that the entropy measure "must have" certain properties. This is how Shannon entropy is often "derived" - sure, it's a deduction, but one made from ambiguous premises. That's somehow what I'm trying to cure.
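(To illustrate the ambiguity: a small Python comparison - my own toy example, not Fra's construction - of Shannon entropy against the Rényi family, which satisfies a slightly different axiom set and assigns different values to the same distribution. The distribution itself is arbitrary.)

```python
import math

def shannon(p):
    """Shannon entropy in bits: H = -sum p_i log2 p_i."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha (alpha != 1); alpha -> 1 recovers Shannon."""
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

dist = [0.7, 0.2, 0.1]
print(f"Shannon:        {shannon(dist):.4f} bits")
for a in (0.5, 2.0, 5.0):
    print(f"Renyi a={a:3.1f}:  {renyi(dist, a):.4f} bits")
# Each measure satisfies a slightly different axiom set; nothing in the
# distribution itself picks out one of them as "the" entropy.
```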

There are some key components IMO that are not present at all in their reasoning. Other than that, I love their reasoning.

/Fredrik
 
  • #14
ConradDJ said:
So I like your quote from Rovelli: Let’s not try to change QM to make it look “sensible” in terms of an old view of things that’s clearly inadequate – let’s find a new way of thinking about the world that makes QM make sense, as it is.

This way you gave the statement an extra twist; it sounds like Rovelli is not representing an old view, but he does to me :)

He is not old relative to pre-QM thinking, but hey, QM has been around for 100 years now, so by the old thinking I'm referring to the mainstream post-QM thinking. Note that rejecting QM can be for two reasons - you want to go back to deterministic realism and pre-QM reasoning (which I don't), or you want to push the current framework into the next one and get a unified understanding that includes gravity.

The view of things from which I think QM is objectionable is definitely not the pre-QM old view, which I agree is inadequate (Einstein etc.).

/Fredrik
 
  • #15
Fra said:
One key postulate is to associate degree of belief with a real number.

For reasons that are implicit in the way I've reasoned several times here, this does not quite make sense and is not satisfactory... The introduction of the concept of the continuum must not be made so easily.


Fredrik --

I'm certainly inclined to agree with you on this. One of the most basic facts in physics seems to be that all interaction that communicates information between things takes the form of discrete, momentary one-on-one connection-events. This seems to be the meaning of Planck's "quantum of action" and to underlie the uncertainty principle. So if the world is fundamentally a structure of information, the continuum must be an approximation that evolves in the interaction-web, not a structural a priori.
 
  • #16
ConradDJ said:
So if the world is fundamentally a structure of information, the continuum must be an approximation that evolves in the interaction-web, not a structural a priori.

Yes, I don't think it's a structural a priori either.

ConradDJ said:
This seems to be the meaning of Planck's "quantum of action" and to underlie the uncertainty principle.

In my view it's clear that action and entropy are related measures: an action measure is a measure of how probable a given change is, and entropy is a measure of how likely a given state is a priori.

As I see it, a generally uncertain structure - given that we have an information measure - implicitly defines a space of differential changes, on which you can define a related natural information-divergence measure that is closely related to an action measure. If we are only talking about one single microstructure, then what we get is something simple such as classical thermodynamics, where the actions are basically of diffusion type.

But if, as I am trying to work out, you think that the microstructure of an observer is more like several different microstructures related to one another by transformations, a more non-trivial dynamics appears - in particular, more persistent cyclic phenomena that aren't the simple dissipative types that end up in heat death.

In this case the relations will define a new logic, in the sense that when you try to compute the transition probability, and the x OR y and x AND y operators act on x and y from different sample spaces, new stuff happens. In particular, if one transformation is the Fourier transform, we get position-momentum spaces. But that is merely a special case in this view. I'm exploring the case of general transformations, and how the transformations themselves are subject to evolution.

So the answer to why the Fourier transform and the special relation between position and momentum seem to be around is probably that this transform, in this context, has special desirable properties. But I'm still trying to figure some things out.
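(The position-momentum relation via the Fourier transform can at least be shown numerically. A generic numpy sketch, not specific to Fra's framework: the more concentrated a distribution is in one sample space, the more spread out its transform is in the conjugate space.)

```python
import numpy as np

N = 256
x = np.arange(N)

def spread(prob):
    """Crude width measure: participation ratio of a distribution."""
    return 1.0 / np.sum(prob ** 2)

for width in (2.0, 8.0, 32.0):
    psi = np.exp(-((x - N / 2) ** 2) / (4 * width ** 2))  # Gaussian amplitude
    psi /= np.linalg.norm(psi)
    phi = np.fft.fft(psi) / np.sqrt(N)     # unitary DFT: the conjugate space
    p_x = np.abs(psi) ** 2
    p_k = np.abs(phi) ** 2
    # Narrow in x <-> broad in k: the product of widths stays roughly fixed.
    print(f"width {width:5.1f}: spread_x {spread(p_x):8.1f}, "
          f"spread_k {spread(p_k):8.1f}")
```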

The trick seems to be to find out what the structures are and how they are related. Such a system of a set of microstructures, with defined transformations, is in my view a kind of data compression. Decomposition and spawning of new structures and transformations are driven by evolution for best fitness.

So IMO there are indeed deep relations between states and actions, information and information divergences.

But this view still contains a lot of unresolved issues. A standard formalism such as QM is IMHO simply too static to harbour this vision.

/Fredrik
 
  • #17
A source of inspiration: ponder how the brain handles the analogous tasks. Certainly the human brain does not store raw time-history data at infinite sampling rate for all sensory input, for very good reasons! First of all it would run out of memory quite soon, and the processing capacity needed to make use of the stored info would stall daily business.

It's thought from brain research that the brain actually recodes (transforms) and stores the information in a way and form that the brain itself judges to be most useful for the future. So in short the brain doesn't seem designed to accurately remember the past as a time history; rather it remembers the past in the way/form that is likely to be of highest utility for the future. The things/characteristics of the original state of information that are expected to be useless are discarded by lossy compression.
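(A minimal sketch of this kind of utility-driven lossy compression - a generic keep-the-largest-components scheme of my own choosing, in Python, not a model of the brain: the encoder keeps only the components judged most useful and discards the rest, so the stored record is a recoding, not a raw time history.)

```python
import numpy as np

def lossy_compress(signal: np.ndarray, keep: int) -> np.ndarray:
    """Keep only the `keep` largest-magnitude Fourier components
    (a stand-in for 'expected utility'); discard everything else."""
    coeffs = np.fft.rfft(signal)
    order = np.argsort(np.abs(coeffs))[::-1]
    compressed = np.zeros_like(coeffs)
    compressed[order[:keep]] = coeffs[order[:keep]]
    return np.fft.irfft(compressed, n=len(signal))

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512, endpoint=False)
history = np.sin(2 * np.pi * 3 * t) + 0.2 * rng.standard_normal(t.size)

for keep in (2, 8, 32):
    err = np.mean((history - lossy_compress(history, keep)) ** 2)
    print(f"keep {keep:3d} components: reconstruction error {err:.4f}")
# Most of the "useful" structure survives with a few components;
# what is discarded is judged (here, by magnitude) to be noise.
```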

There is a very strong analogy between the probable function of the human brain and the way I think physical law and nature are organised. Nature learns, and only the fittest persists. This logic can be applied down to the notion of physical law.

/Fredrik
 
  • #18
Fra said:
But if, as I am trying to work out, you think that the microstructure of an observer is more like several different microstructures related to one another by transformations, a more non-trivial dynamics appears.

The trick seems to be to find out what the structures are and how they are related. Such a system of a set of microstructures, with defined transformations, is in my view a kind of data compression.


Fredrik -- So much in your point of view makes sense to me. But "data compression" I don't get, as a fundamental process... because it seems to assume the data is just "out there" and doesn't need to be defined or determined through interaction.

Fra said:
Certainly the human brain does not store raw time history data with infinite sampling rate of all sensory input, for very good reasons! First of all it would run out of memory quite soon, and the processing capacity to make use of the stored info would stall daily business.

So in short the brain doesn't seem designed to accurately remember the past as a time history; rather it remembers the past in the way/form that is likely to be of highest utility for the future. The things/characteristics of the original state of information that are expected to be useless are discarded by lossy compression.


I get the relevance of this, but I don't see the brain as recording the time-history of sensations and compressing it. Rather it seems as though there are many levels of neural networks sensitive to certain kinds of patterns in the data, and then to patterns in those patterns, etc. -- filtering the data up to a level where the organism can respond. Seems more like selection than compression.

But the issue I'm raising is really -- what is the "data" in the first place, and how is it physically defined? Your point of view seems so right to me, though I'm over my head with probability theory, entropy measures and so on. But unless you have an idea what form the information takes -- specifically, in physical interaction -- how do you find a starting-point for the mathematical theory?
 
  • #19
Very interesting discussion on information...thanks guys...

One question: What do you guys think about information redundancy??

Charles Seife has some interesting views on information in his 2006 book DECODING THE UNIVERSE. His book is interesting because it looks at things like QM and relativity from an information-based perspective.

He says in the introduction:
Information theory is so powerful because information is physical. It is a concrete property of matter and energy that is quantifiable and measurable... just like mass and energy, information is subject to a set of physical laws that dictate how it can behave... how it can be manipulated, transferred, created, erased, or destroyed.

...no one really knows what entropy is... information is intimately related to entropy and energy... in a sense thermodynamics is just a special case of information theory.

A measurement is an extraction of information from a particle. The information does not come for free... Something about that information - either extracting it or processing it - would increase the entropy of the universe.

He also discusses how information theory vanquished Maxwell's Demon (from his 1871 Theory of Heat) - first by Leon Brillouin around 1951, and finally by an IBM researcher (Charles Bennett) in 1982 - and Landauer's Principle (bit erasure is the one activity that necessarily costs energy and increases entropy): irreversible operations increase entropy...
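(Landauer's principle comes with a concrete number: erasing one bit must dissipate at least kT ln 2 of energy. A quick back-of-the-envelope Python computation, with room temperature assumed:)

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # assumed room temperature, kelvin

# Landauer bound: minimum energy dissipated when one bit is erased.
e_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T} K: {e_bit:.3e} J per bit")
# ~2.9e-21 J: tiny per bit, but a hard lower bound, and the cost of
# erasing the demon's memory is what rescues the second law.
```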

There is a recent reference cited in another thread, one by Marcus I believe, that says essentially that discrete sampling is identical to continuous information... I forgot to bookmark it... if anyone can find it, I'd like to read it from the view of quantum discreteness vs continuous relativity... I could not find it just now... will try again...

Seife also points out that Erwin Schrödinger noticed living things fight off decay... and likely sensed that "the essential function of living beings is the consumption, processing, preservation and duplication of information..." and also notes that "the principle of superposition... is information based..."
 
  • #20
Naty1 said:
One question: What do you guys think about information redundancy??

Zurek took the idea of decoherence in an interesting direction that relates to redundancy of information... he calls it "Quantum Darwinism". You might be interested -- this is one of several papers.

http://arxiv.org/PS_cache/quant-ph/pdf/0308/0308163v1.pdf

I like this work, to the extent I understand it -- but it seems to me that it doesn't address the key issue, i.e. how physical information gets defined / determined / measured / communicated in the first place.

By the way, I'm starting a separate thread in the Philosophy forum having nothing to do with physics, but focusing from another point of view on the question of the structure of communication-systems -- "What do human beings do when they communicate?"

https://www.physicsforums.com/showthread.php?t=334249

Thanks -- Conrad
 
  • #21
ConradDJ said:
Fredrik -- So much in your point of view makes sense to me. But "data compression" I don't get, as a fundamental process... because it seems to assume the data is just "out there" and doesn't need to be defined or determined through interaction.

No, that's not what I mean, but I understand your objection. It's just problematic to give an accurate description of this.

Information requires a context, yes. And there is a context. However, it's not a fundamental fixed context; it's a dynamical context. So there IS a self-reference between information relative to a context and the structure of the context itself, because the context itself also contains information, in the sense that it comes with an ergodic hypothesis etc.

This means that all there is, in my view, is a hierarchy of information. One state of information can define a context for a more finely structured level of information.

The solution I envision is that a given observer only sees a "window" in this hierarchy of information structures - this effectively truncates the world of all possibilities to a window of inside-DISTINGUISHABLE possibilities. At one end the structure is indistinguishable from a fixed context (but note that indistinguishability is a relative notion :) and at the other end we have unpredictability. The size of the window, so to speak, depends on the observer's information capacity.

But this is still under reflection, while I have a reasonably clear vision and idea here, the details are under evolution.

ConradDJ said:
I get the relevance of this, but I don't see the brain as recording the time-history of sensations and compressing it. Rather it seems as though there are many levels of neural networks sensitive to certain kinds of patterns in the data, and then to patterns in those patterns, etc. -- filtering the data up to a level where the organism can respond. Seems more like selection than compression.

In my view, the point is that there is a strong analogy between the CHOICE of WHICH lossy compression to implement and selection. The neural networks and the actual biological makeup of the brain ARE of course one level of context for the information of the brain.

ConradDJ said:
But the issue I'm raising is really -- what is the "data" in the first place, and how is it physically defined? Your point of view seems so right to me, though I'm over my head with probability theory, entropy measures and so on. But unless you have an idea what form the information takes -- specifically, in physical interaction -- how do you find a starting-point for the mathematical theory?

If you are asking for an explicit explanation, that will have to wait. I'm still working slowly on this; nothing is published. Also, since this is quite radical, it does not make sense to publish partial results that make sense only within my program. I need to motivate the program itself, and to do that I need to make a lot of progress.

My starting point is an abstraction in terms of a microstructure of distinguishable states, constructed in such a way that there is an inside and an interface (communication channel). The communication channel has certain distinguishable states (like the surface of a sphere) that define the distinguishable events; inside the sphere, then, the events are processed.

Conceptually my starting point is an imagined "inside view"; this inside view constrains the possibilities of both states and actions (this is the basis of the reconstruction of the "probability theory"). It's a reconstruction of the information hierarchy I mentioned, starting from the smallest complexity. So the mathematical starting points are combinatorics and various dynamical sets, which correspond to sets of related discrete probability spaces. The fact that there are relations between the possibility spaces is the key to nontrivial actions.

These structures are then assumed to respond rationally to disturbances. The rational response predicts an action, and also a selection for the perturbed action itself. So the starting point is basically combinatorics; the continuum is emergent as the sets grow large, but since it's a hierarchy, the continuum hypothesis is not valid at all levels. So there might be strongly quantised, or strongly continuous, phenomena depending on the level.

/Fredrik
 
  • #22
Fra said:
Information requires a context, yes. And there is a context. However, it's not a fundamental fixed context; it's a dynamical context. So there IS a self-reference between information relative to a context and the structure of the context itself.

This means that all there is, in my view, is a hierarchy of information. One state of information can define a context for a more finely structured level of information.

My starting point is an abstraction in terms of a microstructure of distinguishable states, constructed in such a way that there is an inside and an interface (communication channel).

Conceptually my starting point is an imagined "inside view", this inside view constrains the possibilities of both states and actions... The fact that there are relations between the possibility spaces is the key to nontrivial actions.


This all makes sense to me, and seems very interesting. Is your sense of the basic information-process that an "observing" system has at any given point a "space" of possible outcomes, in relation to another "observed" system? And the "action" then would consist of choosing a particular outcome as what gets communicated to another system -- i.e. as a constraint on its set of possible outcomes?

I'm wondering whether you envision something like my suggestion in post #6 above --
ConradDJ said:
What I want to take from this is that the fundamental information-process in physics may be recursive – i.e. defined in terms of itself. That is, information about a system becomes a determinate fact insofar as it’s successfully “observed” by another system – which means that the information gets passed on and is “observed” by another system, and so on. Information is “communicated” insofar as it results in information that’s communicated.

... identifying basic information-processes in physics ...could mean getting a clearer notion of the functionality involved in a system where information isn’t just “given” in the reality of “the things themselves,” independent of any context, but instead actually has to be defined in the context of other information in the web of real-time interaction.


You commented on this --
Fra said:
About the recursive nature of nature, I also agree. The reason for the recursion, as I see it, lies in the fact that it's not possible, by physical inference, to arrive at a static conclusion. Every conclusion you make is constantly evolving due to a self-reference. But this is not circular reasoning, because the self-reference is sufficiently constrained to warrant stability. Which is pretty much exactly like in biology: controlled imperfections to allow diversity, combined with a selection principle.


It seems that you focus on modeling an "inference" process internal to the observing system, and that the "evolution" takes place in the observer's possibility-space, while I tend to think of it taking place in the structure of the communications channels and how they provide a context for each other. But it does seem that we may be looking at the same "game" from different viewpoints.
ConradDJ said:
It’s hard to imagine what very primitive measurement-events might be like – just as it’s hard to imagine what the earliest self-replicating systems may have been like in biology. But for me the key point is to keep in mind – there’s no such thing as information without a context to which it makes a difference in some way – where “making a difference” means passing on information that makes a further difference, in some way... The "game" has to reproduce itself.
 
  • #23
As a general comment on my impression of your last post, I would like to say that on the philosophical side we seem quite close in our reasoning, even if there might be slight differences in the details.

Note that I said this too, I think, in your very first post here. The similarity in our reasoning was clear already in your first post. :)

But at another level, we are also discussing problems that are not yet solved, so there are unavoidably uncertain things there, which might be attributed to the fact that these are still open issues rather than clean disagreements, although the symptom is sometimes disagreement, due to the fuzzy nature of this. This is the kind of disagreement you can easily achieve even when discussing with yourself ;-)

So relative to a lot of the other major views represented here, I think we are fairly well tuned. Perfect tuning might be impossible simply because we're discussing open issues.

ConradDJ said:
This all makes sense to me, and seems very interesting. Is your sense of the basic information-process that an "observing" system has at any given point a "space" of possible outcomes, in relation to another "observed" system? And the "action" then would consist of choosing a particular outcome as what gets communicated to another system -- i.e. as a constraint on its set of possible outcomes?

With the reservation for the difficulty of being precise: yes, that sounds reasonably close!

In relation to the ENVIRONMENT. I.e. from my strict point of view, to take a part of the environment and think that I am observing that is ambiguous and an idealisation. More properly, I think that every observation/interaction is simply with your environment.

ConradDJ said:
I'm wondering whether you envision something like my suggestion in post #6 above --

Yes, it makes quite good sense to me; probably because our reasoning is so close, your words actually make sense to me :)

ConradDJ said:
It seems that you focus on modeling an "inference" process internal to the observing system, and that the "evolution" takes place in the observer's possibility-space, while I tend to think of it taking place in the structure of the communications channels and how they provide a context for each other. But it does seem that we may be looking at the same "game" from different viewpoints.

Actually, I see no contradiction here. I see what you mean, and indeed the communication channels are, in my view, identified in a certain sense with the boundary of the observer's state space. I.e., there is a relation between the "possibility-space" and the communication channel even in my view! The capacity of the communication channel can NEVER exceed the complexity of the possibility space. This is why remodulation of the inside also influences the communication channels.

What you say makes sense to me, and the two views are not, I think, in contradiction.

The state space and the communication channels support each other. One without the other makes no sense, and it's mutual; by the same token, you can't have an ordinary object with non-zero volume but zero boundary area.
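A minimal sketch of that capacity bound in standard Shannon terms - an illustrative four-state "possibility space" and an arbitrary noisy channel, nothing specific to my framework: the information the channel conveys (the mutual information) can never exceed the entropy of the state space behind it.

```python
# Minimal sketch: mutual information I(X;Y) through a noisy channel is
# bounded by the entropy H(X) of the system's own state space.
# The distributions below are arbitrary illustrative choices.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A system with 4 possible states (the "possibility space").
p_x = np.array([0.4, 0.3, 0.2, 0.1])

# A noisy channel: row i gives P(y | x = i).
p_y_given_x = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.7, 0.1, 0.1],
    [0.1, 0.1, 0.7, 0.1],
    [0.1, 0.1, 0.1, 0.7],
])

p_xy = p_x[:, None] * p_y_given_x   # joint distribution P(x, y)
p_y = p_xy.sum(axis=0)              # output marginal P(y)

mutual_info = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

print(f"H(X)   = {entropy(p_x):.3f} bits  (state-space entropy)")
print(f"I(X;Y) = {mutual_info:.3f} bits  (what the channel conveys)")
assert mutual_info <= entropy(p_x) + 1e-9   # the bound in question
```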

/Fredrik
 
  • #24
Fra said:
This means that all there is, in my view, is an hierarchy of information. One state of information can define a context for a more fine structured level of information.


Have you tried thinking about this as an issue of thermodynamic-like equilibration? So observation is, as you seem to be describing, a semiotic process. A relationship between locales and contexts, figures and their grounds.

In terms of "where is the information?", I would suggest this then leads to a view that the "data-compression" is about the minimisation of information represented in this dynamic interaction between locales and contexts. There is a least-mean-path, sum-over-histories story going on. So locales and contexts become equilibrated as far as possible. Some information may be left - an irreducible configuration energy, if locales are knotted up in some particle- or soliton-like fashion. But generally the dynamic is about minimising all visible differences. And that is how data gets compressed: by the flattening effect of interactions.

In QM, this would be the decoherence approach. The observer exists evenly over all scales. Hence QM collapse follows a power law over physical spatiotemporal scale - a Poisson distribution for "QM events" like atomic decay.
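A minimal sketch of the standard Poisson picture being invoked here (an arbitrary decay rate, nothing specific to the scale-free proposal): events arrive as a Poisson process, so the waiting times between them are exponentially distributed and the counts per unit time are Poisson.

```python
# Minimal sketch: "QM events" such as atomic decays as a Poisson process.
# Waiting times are exponential; counts per unit window are Poisson
# (variance equal to mean). The rate is an arbitrary illustrative value.
import numpy as np

rng = np.random.default_rng(seed=1)

rate = 0.5            # decay events per unit time (hypothetical)
n_events = 100_000

# Waiting times of a Poisson process are i.i.d. exponential(1/rate).
waits = rng.exponential(scale=1.0 / rate, size=n_events)

# Count events in unit-time windows; counts should be Poisson(rate).
times = np.cumsum(waits)
counts = np.bincount(times.astype(int))[:-1]   # drop the partial last window

print(f"mean waiting time  : {waits.mean():.3f}  (theory: {1 / rate:.3f})")
print(f"mean counts/window : {counts.mean():.3f}  (theory: {rate:.3f})")
print(f"var  counts/window : {counts.var():.3f}  (Poisson: var = mean)")
```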

It can also be likened to a tensegrity approach - http://en.wikipedia.org/wiki/Tensegrity

So a system is disordered and then becomes ordered along an axis of fractal scale symmetry as "observerhood" - interaction measured in some information-theoretic coin - is equilibrated over all possible scales.

You really ought to check out hierarchy theory - Salthe's scalar hierarchy in particular - for good insights on this. I would also link it to the "pansemiotic" approach that derives from Peirce.

The question I am unsure of is whether you are thinking "information" has some particular scale for the universe as a system - especially, for instance, being a Planck-scale phenomenon, so that the bit is fundamentally the size of the Planck limit.

In the scalefree description of information I have given - and which you seem to be alluding to - information would exist for observers over all scales. So it would be holographic in this way. A whole stack of horizons stretching out to "infinity". There is a local Planck scale limit of course that anchors things. But also the global context of the visible cosmic horizon. And the two complementary limits are what are in dynamic interaction, equilibrating their mutual information to produce the final "maximally compressed" - or better yet, dissipated - state of irreducible information. Flat and even observerhood.

The notion of observerhood - a located point of view - has become an issue because of GR and, even more so, QM. But this is because, in the mathematical models, information has become so divorced from meaning - the local bits so divorced from their interactions with a contextual frame. Thermodynamic approaches - of the dissipative-structure kind - now seem a very natural framework for going back and making some sense of this. Of reuniting what has been broken apart - the idea of the observer and the observed.

Fra said:
These structures are then assumed to respond rationally to disturbances. The rational response predicts an action, and also a selection for the perturbed action itself. So the starting point is basically combinatorics; the continuum is emergent as the sets grow large, but since it's a hierarchy, the continuum hypothesis is not valid at all levels. So there might be strongly quantised, or strongly continuous, phenomena depending on the level.

/Fredrik


As far as I can follow your ideas, you are indeed making the mistake (sorry) of thinking of information as hard, located bits, which then must create ontological issues with the emergence of the global GR continuum. But if you instead think of "soft bits" that are formed as the local limit, then your modelling begins to look like it is heading towards the same hierarchical outcome.

Salthe, for example, started his scale hierarchy by assuming the existence of "entities" over all spatiotemporal scales and then pointing out how QM-like discreteness would appear holographic-like at the smallest scales, and GR-like continuity would appear at the largest scales. As an observer-based effect. So there are models out there that may help shape your thinking, even if just because you disagree with them.

The key question here is: should information have a fundamental scale? (Or equivalently, should an observer?) Then what happens to your modelling if you instead assume that information is free to arise, to exist, over all scales? This existence must then equilibrate, and some kind of holographic, self-organised limits will be observable. There will be a smallest scale and a largest scale as emergent effects.

Information theory is based on the atomistic notion that bits are fundamentally small. So the local scale is fixed and not free-floating - self-organised through the equilibration of "observation" - dynamic interaction. This is one way of looking at things for sure. But then there is a second completely different approach which abandons the fixed atoms and starts instead with free (vague) possibility. Recovering the local atoms, along with the global continuum, in the holographic event horizon limits.
 
  • #25
apeiron said:
Have you tried thinking about this as an issue of thermodynamic-like equilibration?

Yes, except of course that there are only equilibria at each level; there is no fixed microstructure to which equilibrium refers.

apeiron said:
As far as I can follow your ideas, you are indeed making the mistake (sorry) of thinking of information as hard located bits. Which then must create ontological issues with the emergence of the global GR continuum. But if you instead think of "soft bits" that are formed as the local limit, then your modelling begins to look like heading towards the same hierarchical outcome.

Nooo, definitely not hard objective bits! :cry: I think your impression arises because this is hard to describe.

That's indeed the whole point: there are bits in each view, but these bits or degrees of freedom are subject to evolution, and can thus be said to be soft.

This is completely analogous to the concept of background independence, the special meaning of metrics, or the general meaning of any context. There HAS to be a background and a context; this is why it makes no sense to talk of complete background independence. It's just that the background (the background here is the microstructure defining the degrees of freedom) is subject to change; it is uncertain! To suggest objective hard bits would be in complete contradiction with my main point! This is not what I think, even if you interpret fragments of what I tried to convey in that sense.

But the difficulty here is how to turn a framework of soft bits into quantitative mathematical predictions. This is still an open technical problem, I think. The problem for me at this point is not the conceptual part; I have a pretty good vision. I am working on making this vision translate into quantitative predictions. All attempts so far indicate, without exception, that this process itself is part of the point, and that the quantitative prediction IS evolving or iterative by nature. It's important, then, to capture the mathematical nature of this evolution process.

Apeiron, we are also in reasonable tune, I think. Your views and Conrad's have a lot in common, and I share large parts of them.

I think the recurring confusions, like the hard bits, are due to the difficulty of describing this. There ARE open problems here - work in progress - which is why I think the main discussion is about the general spirit that sets the direction of research. Here I perceive that we are quite close.

Now that we are converging on the same spirit, the discussion must shift from the general spirit to the real open questions that seek quantification here (which we understand more or less, due to tuning), and this of course gets harder. I don't have all the answers. What I do have is too immature and is not ready to be published out of context.

But it's indeed a generalisation of statistical mechanics and probabilistic inference, based on a new measure-of-information construct. Words can't nail this.

/Fredrik
 
  • #26
Fra said:
Yes, except of course that there are only equilibria at each level; there is no fixed microstructure to which equilibrium refers.


/Fredrik

Glad it is soft bits after all! And soft contexts too! Actually, I should call the difference vague-crisp rather than soft-hard as this would be the more appropriate jargon.

And while I agree we would be talking about equilibria at every level, I think the next crucial point is that there is then only one value for the resulting overall system's equilibrium - its Lyapunov exponent, so to speak.

So across all scales, the universe's information/observerhood must be thermalised. A single temperature rules.

But where this gets tricky is that the universe is of course an expanding space. It is not a static system, closed in scale, but a dynamic system, open in scale.

So the right statistics is not the usual Gaussian model of a closed system but the power-law or fractal statistics of a scale-free system. The "temperature" is not damped around a single mean value but is expressed flatly - log/log fashion - over all scales. Hence the Lyapunov exponent analogy.

Basically, this all needs a modern open-systems model of entropy, perhaps like the Rényi or Tsallis non-extensive models. A fractal equilibrium model with an emergent axis of scale, rather than a damped Gaussian equilibrium such as is only possible inside a closed box.
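A minimal sketch of the entropy measures just named - Shannon as the baseline, plus the Rényi and Tsallis families - evaluated on an illustrative power-law ("scale-free") distribution; the exponent and the alpha/q values are arbitrary choices:

```python
# Minimal sketch: Shannon, Rényi and Tsallis (non-extensive) entropies
# of a discrete power-law distribution. Both families reduce to the
# Shannon entropy in the limit alpha -> 1 (resp. q -> 1).
import numpy as np

def shannon(p):
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    # H_alpha = log(sum p_i^alpha) / (1 - alpha)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, q):
    # S_q = (1 - sum p_i^q) / (q - 1)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Power-law ("scale-free") distribution over n states: p_i ~ i^(-2).
n = 1000
p = np.arange(1, n + 1, dtype=float) ** -2.0
p /= p.sum()

print(f"Shannon          : {shannon(p):.4f} nats")
print(f"Renyi  (a = 0.5) : {renyi(p, 0.5):.4f}")
print(f"Renyi  (a = 2.0) : {renyi(p, 2.0):.4f}")
print(f"Tsallis(q = 0.5) : {tsallis(p, 0.5):.4f}")
print(f"Tsallis(q = 2.0) : {tsallis(p, 2.0):.4f}")
```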
 
  • #27
apeiron said:
So across all scales, the universe's information/observerhood must be thermalised. A single temperature rules.

This would be exact only at global equilibrium, if it exists, yes. But global equilibrium cannot be known to be exact. In my view, it takes more the form of a rational expectation, on which actions are based.

apeiron said:
But where this gets tricky is that the universe is of course an expanding space. It is not a static system, closed in scale, but a dynamic system, open in scale.

Exactly my point above. In general the spaces of possibilities are changing - not always expanding, sometimes contracting too (at least in principle).

So this is a challenge.

The point of course is that, if we take this reasoning seriously, the escape here can NOT be as simple as inventing an ad hoc structure - an external container state space in which to describe the expanding state space. This external container doesn't exist. The fact that the inside is the only physical container is what yields the evolving nature of this.

Here more work is needed.

I think the KEY to making predictions here is that the ACTION of a system, which is in a certain sense the container for its own environment, should follow a rational scheme of inference - this is my key conjecture that makes this predictive; it's not just philosophy.

In current models, the actions of a system follow from the initial state AND the laws of physics. In my suggested reasoning here, state and law are treated on the same principal footing; the difference is that they sit at different levels in this hierarchy window. This also implies that the laws of physics, usually put in via the Lagrangian or Hamiltonian, actually follow from the evolution. Not only does this provide a framework for understanding the state spaces, Hilbert spaces etc. - it also provides potential insight into the ORIGIN of the symmetries/laws encoded in the Hamiltonians or Lagrangians.

apeiron said:
Basically, this all needs a modern open systems model of entropy,

I agree. In general we have an open system. So the problem is how, instead of basing actions on the fixed constraints of a closed system, to construct actions based on the softer constraints of an open system - and then, more importantly, what happens when two such systems interact.

/Fredrik
 
Last edited:
  • #28
Fra said:
In general the spaces of possibilities are changing... In general we have an open system.

So the problem is how, instead of basing actions on the fixed constraints of a closed system, to construct actions based on the softer constraints of an open system - and then, more importantly, what happens when two such systems interact.


One way to approach this is to think about what's involved in an evolutionary process. When I look at physics from that standpoint, it seems as though the fundamental processes may necessarily involve more than two interacting systems. Here's what I have in mind --


Evolution pertains to a process (like biological reproduction) that happens over and over again, where each iteration depends on the success of previous iterations, and inherits something from them. The process has to preserve certain information from the past, so that it doesn’t have to begin from the same baseline again and again each time.

It seems that each iteration must be able to give rise to more than one further iteration, so there can be a proliferation of variant forms, subject to natural selection -- some variants passing on information better than others. And to allow for variation, the passing on of information from one generation to the next has to be reliable, but not complete and exact.

As to the nature of this information, the one thing that must be passed on, in every generation, is the ability to pass that information on to another generation. In biological evolution, this key functionality is the ability of organisms to reproduce organisms that can reproduce.
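A toy sketch of just these requirements - imperfect inheritance, proliferation of variants, and selection on how well information is passed on - with a bit-string standing in for the heritable information (all names and parameters here are arbitrary illustrative choices, not a model of physics):

```python
# Toy sketch of an evolutionary process: replicators inherit a bit-string
# with occasional copying errors, and variants that preserve the target
# information better leave more offspring. All parameters are arbitrary.
import random

random.seed(0)

TARGET = [1] * 16        # the information to be passed on (hypothetical)
MUTATION_RATE = 0.05     # inheritance is reliable, but not exact
POP_SIZE = 50

def fitness(genome):
    # How much of the heritable information survives in this variant.
    return sum(g == t for g, t in zip(genome, TARGET))

def reproduce(genome):
    # Copy with occasional errors -- the source of variation.
    return [g if random.random() > MUTATION_RATE else 1 - g for g in genome]

# Start from random genomes: no information preserved yet.
pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]

for generation in range(30):
    # Selection: fitter genomes get more offspring (fitness-weighted).
    weights = [fitness(g) + 1 for g in pop]
    parents = random.choices(pop, weights=weights, k=POP_SIZE)
    pop = [reproduce(p) for p in parents]

best = max(fitness(g) for g in pop)
print(f"best genome preserves {best}/{len(TARGET)} target bits")
```

After a few dozen generations the population reliably carries most of the target bits, even though no single copy operation is exact - which is the point of the requirements above.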

I'm thinking that a "measurement event" may represent this kind of process. It involves the gathering of several kinds of information determined in other measurement-events, some of which is local – i.e. information preserved in the state of the "observer" as determined in its prior interactions – and some of which comes from distant events on the observer's past light-cone.

The sum of these moments constitutes a "measurement set-up" in which an interaction with some other system can happen in a number of ways -- the possibility-space -- and which way it happens makes a difference. The difference it makes gets stored as new information in the observer's state, which will sooner or later be communicated back out into the world in another interaction.

The idea is that what's essentially being passed on, from one measurement-event to the next – not through any one interaction, but in the sum of many interactions that constitutes the "entire measurement situation" – is the functionality of measurement itself, i.e. the ability to create new measurement-situations. Essentially what each measurement does is to make other measurements possible, by preserving past information along with newly determined information in the current state of the observer, and then by contributing new information to other observers.

If such a process could somehow get itself started, then it's reasonable to suppose it could evolve ways to define more and more specific information, both more precisely and reliably (less "guessing"), and yet with more variation. The more that gets determined, the richer the basis is for determining new information.

I'm thinking this kind of evolution might eventually explain why the universe we actually observe looks so much like a "deterministic" system, even though the fundamental information-processes appear to be essentially random.

Fra said:
But it's indeed a generalisation of statistical mechanics and probabilistic inference, but based on a new measure of information construct. Words can't nail this.


I think this must be right. I'm hoping that thinking about what's required to make evolution work might help indicate what kinds of probabilities we're dealing with, at a basic level.
 
  • #29
ConradDJ said:
Here's what I have in mind --

Evolution pertains to a process (like biological reproduction) that happens over and over again, where each iteration depends on the success of previous iterations, and inherits something from them. The process has to preserve certain information from the past, so that it doesn’t have to begin from the same baseline again and again each time.

It seems that each iteration must be able to give rise to more than one further iteration, so there can be a proliferation of variant forms, subject to natural selection -- some variants passing on information better than others. And to allow for variation, the passing on of information from one generation to the next has to be reliable, but not complete and exact.
...
I'm thinking that a "measurement event" may represent this kind of process. It involves the gathering of several kinds of information determined in other measurement-events, some of which is local – i.e. information preserved in the state of the "observer" as determined in its prior interactions – and some of which comes from distant events on the observer's past light-cone.

Yes, once we have agreed on the general direction here, the next question is how to satisfy the basic requirements of evolution, i.e. "diversity", "reproduction", "selection" etc.

Smolin's idea in his CNS (cosmological natural selection) is that black holes produce offspring by producing new universes that produce new black holes. This is why, in his reasoning, he is led to suggest that a typical universe should be optimized to produce black holes, and that with each offspring small variations in physical laws appear - enough to get diversity, but not large enough to destroy stability.

My current idea has similarities to yours. In my view the "DNA", if we put it like that, of physical law is the action which is implicit in the measurement complex that constitutes an observer. Or rather, the DNA is a certain trait of such a complex. And when an observer interacts with its environment, not only does it spread information, it also spreads its way of reasoning - and THIS is the DNA of physical law.

So in my view, a viable system is able to convey its action to the environment, and this ultimately makes the environment gradually more friendly to the emergence of offspring consistent with the same DNA (or action).

I think this is similar to what you describe.

This means that such an observer is both self-preserving and produces offspring INDIRECTLY (not in the direct sense we know from biology, like cellular division and copying of DNA etc.) by means of simple interactions. So the offspring is pretty much produced by "induction", so to speak.

For selection, I see it as negotiation. Those who manage to negotiate with their environment and still maintain a coherent structure are selected. I see this as more or less rational inference. Rational inference is selected over random inference.

/Fredrik
 
  • #30
Fra said:
Yes, ...the next question is how to satisfy the basic requirements of evolution, i.e. "diversity", "reproduction", "selection" etc.

... when an observer interacts with its environment, not only does it spread information, it also spreads its way of reasoning - and THIS is the DNA of physical law.

... So the offspring is pretty much produced by "induction" so to speak.

For selection, I see it as negotiation. Those who manage to negotiate with their environment and still maintain a coherent structure are selected. I see this as more or less rational inference. Rational inference is selected over random inference.


Fredrik -- I'm thinking about this... trying to see how to envision it in terms of familiar physical processes. Say two particles interact and there is a transfer of linear / angular momentum... is "induction" involved here? Or are you thinking of a different sort of process?

Note that "selection" is already built into the basic reproductive process, in biology -- that is, reproduction can succeed or fail, in any instance -- where "success" means that the organism succeeds in producing offspring that also succeed, etc...

"Negotiation with the environment" certainly comes into play here -- you could say, an organism or a species has to "maintain a coherent structure" while coping with its environment. This is true, but the selective "criterion" of success is in a sense already there prior to any issue of "adaptation".

This was an important issue in biology, where Lamarck saw the process of adjusting to the environment as something that happens at the level of individual organisms... in his theory of the "inheritance of acquired characteristics". That blurred the picture, by focusing on adaptation as the basic process rather than differential reproduction of fixed genetic information. (There may not be any analogy here to the evolutionary process in physics, where we're not dealing with literal reproduction. So far we don't have anything like the clarity of Darwin's basic insight.)

Anyhow if the basic process is "inference" or "reasoning" in some physical sense, I'm thinking about what might constitute "success", in terms of what gets passed on. I understand this is all a work-in-progress!...
 
  • #31
One assumes information to exist at a locale. The other that information is created at a locale. And these are two different views (though in both cases you would appear to find information at a locale).

I have my doubts about those assumptions... here's why. I have yet to read through all that is posted above here, but before that, I wanted to post a dramatically different idea from that expressed in the above quote, from Leonard Susskind, The Black Hole War, 2008.

I can't find the exact paragraph I want: the essential idea is the holographic principle (conjecture), that information in a region of space resides on the enclosing surface... In the case of a black hole, for example, Bekenstein's and Hawking's work shows that if you add a bit to the black hole, the horizon increases by one Planck area... but more generally, every time you describe a volume of space you can pick an ever larger "horizon", a larger enclosing surface, even to the edge of our universe if one exists, and the information content of the original volume is included on the surface area... but each time it resides at a different location... a different horizon! Information about a location in spacetime appears to have no definite location itself!
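A back-of-envelope sketch of the Bekenstein-Hawking relation behind that statement: horizon entropy S = kA/(4 l_p^2), so one bit costs 4 ln 2 ≈ 2.8 Planck areas in this convention ("one Planck area" in round numbers). The numbers below, for a solar-mass black hole, use standard values of the constants:

```python
# Back-of-envelope: Bekenstein-Hawking entropy of a solar-mass black hole,
# expressed in bits. One bit corresponds to 4*ln(2) Planck areas here.
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s
M_sun = 1.989e30    # solar mass, kg

planck_area = hbar * G / c**3            # l_p^2, m^2
r_s = 2 * G * M_sun / c**2               # Schwarzschild radius, m
A = 4 * math.pi * r_s**2                 # horizon area, m^2

bits = A / (4 * planck_area * math.log(2))

print(f"Schwarzschild radius : {r_s:.3e} m")
print(f"Horizon area         : {A:.3e} m^2")
print(f"Information content  : {bits:.3e} bits")   # roughly 1e77 bits
```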

Susskind goes on to discuss that the information in a finite region (or, equivalently, surface area) of space is itself finite... hence it appears space is discrete... this has been discussed in at least one other thread recently... and so it appears to conflict with quantum field theory, which is continuous...

and if that were not enough to support Conrad's assertion that information in the world is not very clear, you can also consider the horizon of a black hole and its information content. Susskind points out:
...the experimenter is faced with a choice: remain outside the black hole and record data from the safe side of the horizon, or jump into the hole and do observations from the inside... "You can't do both"
he claims.

So it sounds like information resides in different places and your location may determine what information is accessible...

And as a reminder, I want to see what's been made in this thread of information loss in black holes ...

and just for fun, here's the vote taken in 1993 at the theoretical physics conference in Santa Barbara, California:

WHAT HAPPENS TO INFORMATION THAT FALLS INTO A BLACK HOLE? (votes cast)
1. It's lost: 25
2. It comes out with Hawking radiation: 39
3. It remains (accessible) in a black hole remnant: 7
4. Something else: 6

I wonder how such a vote would go today??
 
  • #32
Conrad, post # 7 says:

But first – reproduction is clearly the fundamental information-process underlying biological evolution.

Via Charles Seife, Decoding the Universe, Chapter 4, "Life":

It is not the individual that is driving reproduction; it is the information in the individual. The information in an organism has a goal of replicating itself. While the organism's body is a by-product, a tool for attaining that goal, it is just the vehicle for carrying that information around, sheltering it, and helping the information reproduce.

(I know that's weird punctuation, but I quoted it as published.)
 
  • #33
Naty1 said:
In the case of a black hole, for example, Bekenstein's and Hawking's work shows that if you add a bit to the black hole, the horizon increases by one Planck area...

And so what about reversing the argument? If you instead keep subtracting away bits from the event horizon around a locale, eventually you would get down to some minimal amount of information. With QM saying you can never get down to just nothing at a locale.

Discrete points in spacetime would thus be seen as a limit on observation. All about getting down to the least amount of bits that can be seen. And so that becomes the event horizon which defines something as a location.
 
  • #34
And so what about reversing the argument?

No problem... I agree... it's Hawking radiation...

Discrete points in spacetime would thus be seen as a limit on observation

I'd say it differently: below the Planck scale, nothing exists as we know it... there might be no information...

... when an observer interacts with its environment, not only does it spread information,

or perhaps the information is already everywhere...encapsulated in a boundary/surface condition...

The process has to preserve certain information from the past, so that it doesn’t have to begin from the same baseline again and again each time.

This seems to be different from what Rovelli in RQM says... removing information via a new question (postulate #2) eliminates some prior information... to maintain his postulate #1.

I'd be interested if you guys who have been in the thick of the discussion could agree on a list of issues/uncertainties... I suspect that list would be incredibly long when you got done. Then it would be interesting to pare the list down to a manageable number, to try to tie together in a coherent theory... Given the relatively narrow scope of Rovelli's RQM paper, and all the things it touches even so, a tough road lies ahead... good luck...
 
  • #35
Naty1 said:
This seems to be different from what Rovelli in RQM says... removing information via a new question (postulate #2) eliminates some prior information... to maintain his postulate #1.


Well, certainly much prior information is lost. This is also true in biological evolution, of course. To recap from my post above –
ConradDJ said:
I'm thinking that a "measurement event" may represent this kind of [evolutionary] process. It involves the gathering of several kinds of information determined in other measurement-events...


Not all past information needs to be preserved, only what’s “relevant” (Rovelli’s term) to determining what can happen in future.
ConradDJ said:
The idea is that what's essentially being passed on, from one measurement-event to the next – not through any one interaction, but in the sum of many interactions that constitutes the "entire measurement situation" – is the functionality of measurement itself, i.e. the ability to create new measurement-situations.


As to “tough road” – right. But to me, it’s not so much that there’s a long list of issues... it’s that there are so many ways in which information gets physically determined / communicated – all of physics is involved.

As in the black hole issue you raised – we know how to discuss information as if it’s “just there” in the world... we can quantify it, we can break it down into information about particles, information about fields... But to approach information from the standpoint of how it gets to be physically observable, in each case... is like heading into an unexplored jungle.

For his limited purposes, Rovelli could avoid all that. But it also means that he offers no answer to basic questions like – how do all these different observers actually end up agreeing on what’s going on in the world? As he shows, the QM formalism says that indeed they all do... but we get no insight into what makes this work.

Again, I think the reason it’s hard to understand the basic information-processes is that there are quite a few of them, and none are simple, and they’re all interdependent. This is what we would expect, as the result of an evolutionary process... but that doesn’t make this kind of approach less daunting. So "good luck" is needed...thanks.
 

FAQ: What are the fundamental information-processes in physics?

What is the definition of information-processes in physics?

The fundamental information-processes in physics refer to the ways in which information is transmitted, stored, and manipulated within the physical world. This includes the transfer of energy, the movement of particles, and the interactions between different systems.

How do information-processes impact our understanding of the physical world?

Information-processes are essential for understanding the behavior of matter and energy in the universe. They allow us to explain and predict the movements and interactions of objects, from the smallest subatomic particles to the largest galaxies.

What are some examples of information-processes in physics?

Some examples of information-processes in physics include electromagnetic waves, quantum entanglement, and the laws of thermodynamics. These processes play a crucial role in fields such as mechanics, thermodynamics, and electromagnetism.

How do scientists study information-processes in physics?

Scientists use a combination of theoretical models, mathematical equations, and experimental data to study information-processes in physics. They also use advanced technologies such as particle accelerators and telescopes to observe and measure these processes in action.

What are the implications of understanding fundamental information-processes in physics?

Understanding the fundamental information-processes in physics has numerous implications for our daily lives and the advancement of technology. It allows us to develop new technologies, improve existing ones, and gain a deeper understanding of the natural world around us.
