What is the mechanism behind Quantum Entanglement?

In summary: Locality means that the effect and the cause have to be within the same vicinity. Both of these assumptions hold true for all other aspects of physics. Yet at least one of them must not be universally true, or quantum entanglement would not give rise to the phenomena that we observe. There are a variety of speculative hypotheses for the mechanism of quantum entanglement, but none of them can be singled out as correct with existing experiments.
  • #176
gentzen said:
A different suggestion (SCNR): Fra, your views seem to be sufficiently evolved and detailed that it would make sense to write them down in a more coherent form than just as comments on other people's questions and answers. Maybe as an FQXi essay, maybe as a paper of some form, maybe as a series of blog posts, or... I am not suggesting that you should link your PF account to those "external activities" and give away more of your identity than you want. But I do suggest that you should do some activity in that direction. Otherwise you risk kidding yourself with respect to your views and their impact.
Thanks for your concern! But I have no illusions of anything here. I've been struggling with this for 25 years by now, and the a priori chance of working this out is of course nil. I decided long ago not to officially publish any vague ideas anywhere, but to have the ambition to work the theory out and publish something iff it solves some of the major problems. Before then, no one will care about this any more than I care about string theory. Whoever has an idea bears the responsibility to realize it. Everything else along the way is, for me, informal discussion that is often interesting for several reasons.

/Fredrik
 
  • #177
vanhees71 said:
How do you come to that conclusion? To the contrary, with more and more advanced technology we are more and more able to observe the "quantum world" (as if there were any other world than the "quantum world"). Handling generic quantum systems is nowadays becoming more and more a matter of applied science, and more and more universities of applied sciences develop curricula for the development of "quantum technology".

I think at the moment, direct observation of the quantum world, such as with the scanning tunnelling microscope, requires the use of a macro object. But I take your point: technology is progressing rapidly, and that may (perhaps even likely) not hold in the future. So I stand corrected. It is a leftover from the early days of QM and is only an intuitive starting point for a theory based on observables. Even then, there are several QM formulations, all equivalent, some of which do not require observation:
http://math.bu.edu/people/mak/papers/Styer Am J Phys 2002.pdf

So I retract my statement, except as a starting point to a deeper understanding of QM.

Thanks
Bill
 
  • Like
Likes vanhees71
  • #178
vanhees71 said:
How do you come to that conclusion? To the contrary, with more and more advanced technology we are more and more able to observe the "quantum world"
This is what I consider to be the meaning of the indirect contact. I.e., the increasing complexity of both the preparation and the postprocessing of large amounts of information needed for the image to emerge is what creates a sort of distance in the inference chain. But yes, technology makes us reach further.

/Fredrik
 
  • #179
Sure, without measurement devices and other technology there'd not be much of physics and the other natural sciences as we know them today!
 
  • Like
Likes bhobba
  • #180
It is just impossible to go much below a scale of several nm, as the familiar building blocks of matter turn to quantumness (unpredictability). Sure, single atoms can still be manipulated, but on a very different set of terms. I'd love to be able to dive into that sea of unpredictability.
 
  • Like
Likes Fra
  • #181
vanhees71 said:
Causality means that the state of a (quantum) system can be influenced only by the past and not the future. In relativistic models of spacetime this implies that there cannot be causal influences from space-like separated events. In both classical and quantum relativistic theories this has been realized by a strict use of the paradigm of local field theories. In quantum field theory it is realized by a formal mathematical demand called the "microcausality principle": the quantum fields, which are the building blocks for all the operators that describe observables at a point in spacetime (usually densities like charge density, energy-momentum density, etc.), must commute with the Hamiltonian density for space-like separated space-time arguments. This rules out any "spooky actions at a distance", i.e., causal effects can only be due to signals that propagate with a speed less than or equal to the speed of light in vacuum.
Why is it that signals in QFT can't propagate backwards in time at the speed of light in a vacuum (or less), thereby violating causality?

The only interactions in the SM that have a preferred direction of time on their face are those involving the W boson, and even then, CP violation is well quantified in the CKM/PMNS matrices and CPT symmetry still holds to limit the way that time asymmetry can change the relevant laws. Moreover, observations of entanglement do not generically involve interactions that include W bosons (except, perhaps, virtual W bosons in higher-order loops).

For example, suppose that two particles are entangled and one of them is measured sometime later.

Why can't information regarding the resolution of that measurement travel back in time to the point of entanglement, along the path that we perceive the particle took to get there, and then be transmitted to the other particle from the point of entanglement forward to the point in time where the second entangled particle is measured?

The two particles are connected by an unbroken chain within space-time to each other in the same light cone, so that wouldn't violate locality, only causality. (It isn't even obvious to me in that case that "reality", which I agree is poorly named, would be broken.)

Doesn't the observation that a Feynman diagram can be rotated in any way desired and still hold true imply that the SM does not require causality, in the sense of there being a preferred direction of time?
 
  • #182
ohwilleke said:
Why is it that signals in QFT can't propagate backwards in time at the speed of light in a vacuum (or less), thereby violating causality?
This looks like the transactional interpretation of QM; discussion of that belongs in a separate thread in the interpretations subforum.
 
  • Like
Likes bhobba
  • #183
PeterDonis said:
This looks like the transactional interpretation of QM; discussion of that belongs in a separate thread in the interpretations subforum.
I'm sure you are right. But could you help me understand what makes this a transactional interpretation of QM as opposed to what was already being discussed here? I'm not sure I grasp what the distinction is.
 
  • #184
ohwilleke said:
could you help me understand what makes this a transactional interpretation of QM as opposed to what was already being discussed here?
Actually, I just noticed that this thread is in the interpretations subforum. So the transactional interpretation can be discussed here, at least as far as how it would account for entanglement and the associated correlations that violate the Bell inequalities, and doesn't require a separate thread.
 
  • Like
Likes bhobba
  • #185
ohwilleke said:
could you help me understand what makes this a transactional interpretation of QM
Look up the transactional interpretation, and you will see that it says what you were saying in the post I responded to.
 
  • Like
Likes bhobba
  • #186
PeterDonis said:
Look up the transactional interpretation, and you will see that it says what you were saying in the post I responded to.
O.K., I had mostly been wondering why this wasn't an interpretation of quantum mechanics, so your previous post actually answered that question, although I appreciate knowing what this interpretation is called.

The transactional interpretation of quantum mechanics (TIQM) takes the wave function of the standard quantum formalism, and its complex conjugate, to be retarded (forward in time) and advanced (backward in time) waves that form a quantum interaction as a Wheeler–Feynman handshake or transaction. It was first proposed in 1986 by John G. Cramer, who argues that it helps in developing intuition for quantum processes.
"Cramer claims it avoids the philosophical problems with the Copenhagen interpretation and the role of the observer, and resolves various quantum paradoxes, such as quantum nonlocality, quantum entanglement and retrocausality." (from the article linked below).

and relatedly:

The Wheeler–Feynman absorber theory (also called the Wheeler–Feynman time-symmetric theory), named after its originators, the physicists Richard Feynman and John Archibald Wheeler, is an interpretation of electrodynamics derived from the assumption that the solutions of the electromagnetic field equations must be invariant under time-reversal transformation, as are the field equations themselves. Indeed, there is no apparent reason for the time-reversal symmetry breaking, which singles out a preferential time direction and thus makes a distinction between past and future. A time-reversal invariant theory is more logical and elegant. Another key principle, resulting from this interpretation and reminiscent of Mach's principle due to Tetrode, is that elementary particles are not self-interacting. This immediately removes the problem of self-energies.

It makes sense that I'm inclined towards this approach, since most of what I initially learned about quantum mechanics I learned from reading things written by Feynman.

The link calls this interpretation explicitly non-local, however, and I'm still not clear on why it is non-local instead of acausal. Apparently, it defines causality differently than @vanhees71 does. But I suppose that is really just splitting hairs.
 
  • #187
ohwilleke said:
Why is it that signals in QFT can't propagate backwards in time at the speed of light in a vacuum (or less), thereby violating causality?
This is an assumption we make in all of physics.

In classical electrodynamics we choose the retarded solutions connecting the sources (charge and current densities) with the em. field. Of course there are infinitely many Green's functions that formally also solve the Maxwell equations (among them the advanced Green's function). The reason is that classical electrodynamics is time-reversal invariant, and we need the causality assumption in addition to the Maxwell equations to impose the corresponding boundary conditions and select the retarded solution as the one describing the emission of em. waves from their sources. Of course the theory must be formulated such that such a "causality choice" is possible; the wave equation is such an equation, and it is closely related to the relativistic spacetime model, Minkowski space, which admits such a "causal order".
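As a concrete illustration (standard textbook material; sign and normalization conventions vary, here ##c=1## and ##\Box G = \delta^4(x)##), the retarded Green's function of the wave equation is
$$G_{\text{ret}}(t,\vec{x}) = \frac{\delta(t - |\vec{x}|)}{4\pi |\vec{x}|},$$
which vanishes for ##t<0##, so the field at an event depends only on sources on its past light cone. The advanced function ##G_{\text{adv}}(t,\vec{x}) = \delta(t + |\vec{x}|)/(4\pi |\vec{x}|)## solves the same equation but is discarded by the causality boundary condition.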

In relativistic QFT the way to enable the "causal order" is the microcausality constraint, i.e., that local observables commute at space-like separated arguments. Among other things that makes the time-ordering in the perturbative evaluation of S-matrix elements frame-independent and the S-matrix elements Poincare covariant. Also it ensures the cluster-decomposition principle.
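Spelled out, the microcausality constraint referred to here is, for local observable operators ##\hat{A}(x)## and ##\hat{B}(y)## built from the field operators,
$$\left[\hat{A}(x), \hat{B}(y)\right] = 0 \quad \text{for space-like separated } x, y,$$
and it is this condition that makes the time-ordering in the Dyson series frame-independent.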
ohwilleke said:
The only interactions in the SM that have a preferred direction of time on their face are those involving the W boson, and even then, CP violation is well quantified in the CKM/PMNS matrices and CPT symmetry still holds to limit the way that time asymmetry can change the relevant laws. Moreover, observations of entanglement do not generically involve interactions that include W bosons (except, perhaps, virtual W bosons in higher-order loops).
Sure, among other things the weak interaction breaks time-reversal invariance. This, however, just says that for some processes the time-reversed process does not exist in nature.
ohwilleke said:
For example, suppose that two particles are entangled and one of them is measured sometime later.

Why can't information regarding the resolution of that measurement travel back in time to the point of entanglement, along the path that we perceive the particle took to get there, and then be transmitted to the other particle from the point of entanglement forward to the point in time where the second entangled particle is measured?

The two particles are connected by an unbroken chain within space-time to each other in the same light cone, so that wouldn't violate locality, only causality. (It isn't even obvious to me in that case that "reality", which I agree is poorly named, would be broken.)

Doesn't the observation that a Feynman diagram can be rotated in any way desired and still hold true imply that the SM does not require causality, in the sense of there being a preferred direction of time?
The preferred direction of time in the sense of a "causal time arrow" is, indeed, an assumption you impose on any physical theory. If a model (like electrodynamics or quantum chromodynamics) is time-reversal invariant, then for any process the time-reversed process is also possible in Nature (according to this theory).

In our classical example suppose you have a point-like source of radiation (i.e., time-dependent charges and currents within some small region). The usual situation of course is that electromagnetic waves are propagating out from this source, described by the retarded solution of the Maxwell equations. As an example take the Hertzian dipole radiation treated in any textbook on electrodynamics.

The time reversed situation is, in principle, possible. It describes a wave propagating towards the source in such a way that it is completely absorbed by this source. This situation indeed does not violate any laws of physics (including causality!). It is, however, very difficult to realize in practice. You'd have to somehow arrange the sources of this incoming wave precisely such that it gets completely absorbed when it arrives at the localized charge distribution, and you'd have to do this very precisely over a wide area far away from it, which is practically impossible.

Of course, it's not impossible for microscopic situations, where you often can prepare the "time-reversed process".
 
  • Like
Likes bhobba
  • #188
vanhees71 said:
The time reversed situation is, in principle, possible.
Don't you agree that a time-symmetric picture is more natural? Certainly for microscopic processes.

The virtue of the transactional interpretation (TI) is that it incorporates the Born rule, which in most other interpretations is an incongruent addition to unitary evolution, leading to the infamous measurement problem. Every physicist should know that a ket by itself is meaningless, and measurable quantities arise only when it is combined with a bra. And those have opposite time-dependencies. (At least in the Schrödinger and interaction pictures.)
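To spell out the "opposite time-dependencies" in standard notation (Schrödinger picture, time-independent ##\hat{H}##):
$$|\psi(t)\rangle = e^{-i\hat{H}t/\hbar}\,|\psi(0)\rangle, \qquad \langle\psi(t)| = \langle\psi(0)|\,e^{+i\hat{H}t/\hbar},$$
and only combinations of the two, such as the Born-rule probability ##P(a) = |\langle a|\psi(t)\rangle|^2##, correspond to anything measurable.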

The deficiency of TI is that it doesn't explain how forward and backward traveling waves ("offer" and "confirmation" waves) give rise to transactions ("handshakes"). Moreover, these waves cannot be waves in real space. They are merely mathematical devices describing correlations between events (Green functions), and the formalism was worked out long before Cramer introduced TI: the Schwinger/Keldysh closed time-path formalism.

To give the argument another twist, one could rephrase TI in terms of particles: a transaction could be seen as (for example) the exchange of a photon traveling forwards in time and an anti-photon traveling backwards in time. Of course we are habituated to think of objects moving forward in time, but the formalism doesn't dictate this. In fact I think it's futile to try to explain quantum processes in terms of objects (be they waves or particles), because statements about the properties of those objects are inevitably contradictory. The formalism doesn't require the existence of such objects in the classical sense at all; electrons and photons enter QED only as correlation functions (Green functions) describing correlations between events.

A typical Bell-type experiment involves some "wiggling" of electrons in a Ca-atom, followed by similar "wiggling" of electrons in the detectors a few meters away, a few nanoseconds later. On what happens in between theory remains silent. We should be happy to have a theory that predicts the statistical regularities (non-local correlations) of those short-lived, localized current fluctuations. We shouldn't ask for more. :smile:
 
  • Like
Likes ohwilleke
  • #189
WernerQH said:
Don't you agree that a time-symmetric picture is more natural?
I don't, because our experience is not time symmetric. So our physical model should not be time symmetric either.

Note that time symmetry of laws is not the same as time symmetry of a model. A particular model is based on a particular solution of the laws, not just the laws themselves. Time symmetric laws can have time asymmetric solutions; such solutions just have to come in pairs, each one the time reverse of the other. So there is no problem in building time asymmetric models using time symmetric laws.
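A minimal numerical sketch of this point, using a hypothetical toy model (free particles bouncing between reflecting walls; the script and all names in it are my own illustration, not anything from the thread): the update rule below is time-reversal invariant to good approximation, yet the particular solution that starts with all particles in one corner is manifestly time-asymmetric, and its time reverse, obtained by flipping all velocities, is an equally valid solution of the same law.

```python
import numpy as np

# Hypothetical toy model (illustration only): N free particles in a unit box
# with elastic wall reflections. The law is time-reversal invariant, but the
# particular solution below (all particles starting bunched in one corner and
# spreading out) is time-asymmetric; its time reverse is also a solution.
rng = np.random.default_rng(0)
N, L, dt, steps = 500, 1.0, 0.01, 200
pos = rng.uniform(0.0, 0.1, size=(N, 2))   # start bunched in one corner
vel = rng.normal(0.0, 1.0, size=(N, 2))    # random velocities

def step(pos, vel, dt):
    """Free flight for time dt, then elastic reflection at the walls 0 and L."""
    pos = pos + vel * dt
    over, under = pos > L, pos < 0.0
    pos = np.where(over, 2.0 * L - pos, pos)
    pos = np.where(under, -pos, pos)
    vel = np.where(over | under, -vel, vel)
    return pos, vel

def spread(pos):
    """Crude coarse-grained measure of how spread out the particle cloud is."""
    return float(pos.std())

for _ in range(steps):
    pos, vel = step(pos, vel, dt)
print("spread after forward run :", spread(pos))   # large: the gas has spread out

vel = -vel                                         # time reversal of the state
for _ in range(steps):
    pos, vel = step(pos, vel, dt)
print("spread after reversed run:", spread(pos))   # small: back near the corner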
 
  • Like
Likes Dragrath, bhobba and vanhees71
  • #190
PeterDonis said:
So there is no problem in building time asymmetric models using time symmetric laws.
I don't think anyone would argue otherwise. But this is the interpretation section. Doesn't this ad hoc requirement bother you?
 
  • #191
WernerQH said:
Don't you agree that a time-symmetric picture is more natural? Certainly for microscopic processes.
I don't know. I don't consider the "discrete spacetime symmetries" very intuitive. Why should nature be invariant under time-reversal? Our everyday experience is also such that there's a clear direction of time.

In physics the most fundamental "arrow of time" is simply the postulated causal ordering, i.e., that the cause of an event must be temporally before this event.

Then one can show that various other "arrows of time" follow from this fundamental "causal arrow of time". One is the "thermodynamic arrow of time", which is defined as that direction of time for which the entropy of a coarse-grained description of the dynamics is non-decreasing (staying constant then defines a thermal-equilibrium state). The usual way to derive it is to use the Boltzmann equation, which follows from the microscopic dynamics by neglecting correlations at the two-body level, i.e., the two-body phase-space distribution function in the collision term is assumed to be well approximated by the corresponding product of one-body distribution functions. The H-theorem then follows from the unitarity of the S-matrix (note that you don't need the assumption of time-reversal invariance, as is often claimed in the textbook literature; see Landau & Lifshitz vol. X for this important point). Looking closely at the derivation of the Boltzmann equation, you see that this thermodynamic direction of time comes just from the assumption of the causal direction of time.
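In symbols, the key step is the molecular-chaos assumption (Stosszahlansatz) in the collision term,
$$f_2(\vec{x}, \vec{p}_1, \vec{p}_2, t) \approx f(\vec{x}, \vec{p}_1, t)\, f(\vec{x}, \vec{p}_2, t),$$
applied to the particles going into each collision; with it, Boltzmann's functional ##H = \int \mathrm{d}^3 x\, \mathrm{d}^3 p \; f \ln f## obeys ##\mathrm{d}H/\mathrm{d}t \le 0##, i.e., the coarse-grained entropy is non-decreasing in the chosen causal direction of time (notation as in the standard kinetic-theory literature).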
WernerQH said:
The virtue of the transactional interpretation (TI) is that it incorporates the Born rule, which in most other interpretations is an incongruent addition to unitary evolution, leading to the infamous measurement problem. Every physicist should know that a ket by itself is meaningless, and measurable quantities arise only when it is combined with a bra. And those have opposite time-dependencies. (At least in the Schrödinger and interaction pictures.)
I'm not familiar with the various interpretations. For me Born's rule is just another postulate of QT, which connects the abstract formalism to real-world observations, providing the minimal interpretation by just saying how to get the probabilities for measurement outcomes given the initial (pure or mixed) state and the Hamiltonian of the system, which provides the dynamics.
WernerQH said:
The deficiency of TI is that it doesn't explain how forward and backward traveling waves ("offer" and "confirmation" waves) give rise to transactions ("handshakes"). Moreover, these waves cannot be waves in real space. They are merely mathematical devices describing correlations between events (Green functions), and the formalism was worked out long before Cramer introduced TI: the Schwinger/Keldysh closed time-path formalism.
The Schwinger-Keldysh formalism is based on the usual time evolution of quantum theory, combining the time-ordered ##\hat{U}(t,t_0)## and anti-time-ordered ##\hat{U}^{\dagger}(t,t_0)## when calculating the time evolution of the statistical operator. It's a calculational tool. I don't understand what this should have to do with forward or backward traveling waves.
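For reference, the evolution in question is simply
$$\hat{\rho}(t) = \hat{U}(t,t_0)\, \hat{\rho}(t_0)\, \hat{U}^{\dagger}(t,t_0), \qquad \langle \hat{A} \rangle_t = \mathrm{Tr}\left[ \hat{\rho}(t_0)\, \hat{U}^{\dagger}(t,t_0)\, \hat{A}\, \hat{U}(t,t_0) \right],$$
so every expectation value pairs one time-ordered with one anti-time-ordered evolution operator; the closed time path is bookkeeping for exactly this pairing.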
WernerQH said:
To give the argument another twist, one could rephrase TI in terms of particles: a transaction could be seen as (for example) the exchange of a photon traveling forwards in time and an anti-photon traveling backwards in time. Of course we are habituated to think of objects moving forward in time, but the formalism doesn't dictate this. In fact I think it's futile to try to explain quantum processes in terms of objects (be they waves or particles), because statements about the properties of those objects are inevitably contradictory. The formalism doesn't require the existence of such objects in the classical sense at all; electrons and photons enter QED only as correlation functions (Green functions) describing correlations between events.
In the standard formalism of QFT nothing travels backwards in time, and photons are anyway identical with anti-photons, because photons are strictly neutral. The claim that antiparticles are particles moving backward in time is ironically exactly the wrong interpretation of the formalism. Microcausality demands that the free-field operators must always contain both positive- and negative-frequency modes, but one writes a creation operator in front of the negative-frequency modes and a destruction operator in front of the positive-frequency modes, leading to particles and antiparticles both having positive energy and both moving forward in time. In fact this "Feynman-Stueckelberg trick" saves causality by implementing the microcausality constraint.
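For a free charged scalar field, for example, the expansion referred to reads (normalization conventions vary)
$$\hat{\phi}(x) = \int \frac{\mathrm{d}^3 p}{(2\pi)^3\, 2 E_{\vec{p}}} \left[ \hat{a}(\vec{p})\, e^{-i p \cdot x} + \hat{b}^{\dagger}(\vec{p})\, e^{+i p \cdot x} \right]_{p^0 = E_{\vec{p}}},$$
with ##\hat{a}## annihilating particles and ##\hat{b}^{\dagger}## creating antiparticles, both with positive energy ##E_{\vec{p}}## and both propagating forward in time.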
WernerQH said:
A typical Bell-type experiment involves some "wiggling" of electrons in a Ca-atom, followed by similar "wiggling" of electrons in the detectors a few meters away, a few nanoseconds later. On what happens in between theory remains silent. We should be happy to have a theory that predicts the statistical regularities (non-local correlations) of those short-lived, localized current fluctuations. We shouldn't ask for more. :smile:
 
  • Like
Likes LittleSchwinger, physika and ohwilleke
  • #192
hutchphd said:
Doesn't this ad hoc requirement bother you?
Why is it an "ad hoc" requirement that our models should match what we actually observe?
 
  • Like
Likes Dragrath and vanhees71
  • #193
PeterDonis said:
Why is it an "ad hoc" requirement that our models should match what we actually observe?
My interpretation is that the apparent/effective timelessness of microphysics is plausibly due to how the laws and states are decomposed as we infer them, and I do not worry about it. This asymmetry should go away if we treat information about states and information about laws on a similar footing.

I interpret it to be related to the asymmetry between the way we infer the laws and the way we infer the initial state (preparation). This asymmetry holds most clearly for small subsystems; for cosmological-scale observations the inference of states and laws blurs more, because we (the observer) cannot observe these phenomena sufficiently many times with many different initial conditions.

So what I find to be "ad hoc" is the artificial separation of "knowledge about laws" and "knowledge about initial conditions"; from the view of inference this sticks out as an inconsistency. Some sort of understanding of a unification or "relation" between states and laws is missing.

If this didn't make much sense, there is a whole book-length attempt to explain it (Time Reborn, by Lee Smolin).

/Fredrik
 
  • #194
vanhees71 said:
The Schwinger-Keldysh formalism is based on the usual time evolution of quantum theory, combining the time-ordered ##\hat{U}(t,t_0)## and anti-time-ordered ##\hat{U}^{\dagger}(t,t_0)## when calculating the time evolution of the statistical operator. It's a calculational tool. I don't understand what this should have to do with forward or backward traveling waves.
When you apply contractions of operators, anti-time-ordering implies that you have propagators going backwards in time. I understand that you prefer classical habits of thought and choose to simply redefine what "propagates".
 
  • #195
PeterDonis said:
Why is it an "ad hoc" requirement that our models should match what we actually observe?
It is "ad hoc" because that is what it is (it flows from no more fundamental consideration)
It clearly doesn't bother you so you have answered my question I believe.. I guess part of me thinks the arrow of time shoukd somehow appear on a celestial billboard !
 
  • #196
hutchphd said:
It is "ad hoc" because that is what it is (it flows from no more fundamental consideration)
By this definition, every requirement is ultimately "ad hoc" because it ultimately rests on some proposition that just "is what it is" and doesn't flow from a "more fundamental consideration". Ultimately there must always be some set of propositions that are that way; otherwise we have an infinite regress of "more fundamental considerations" that never bottom out in anything.
 
  • Like
Likes Dragrath, vanhees71 and hutchphd
  • #197
Absolutely true. But we do not know when we have reached some minimum number of "fundamental" truths (nor perhaps can we know). A phenomenological theory with 100 adjustable parameters is not equivalent to QFT even though each is rooted in "what it is". The fact that all theories are "ad hoc" does not make them equally interesting.
 
  • Like
Likes bhobba
  • #198
hutchphd said:
we do not know when we have reached some minimum number of "fundamental" truths
Yes, but the rule you raised a question about was "our models need to match our actual observations". Wouldn't this end up being part of that minimum number of fundamental truths no matter what else happens?
 
  • Like
Likes bhobba
  • #199
But my point was that the phenomenological theory with 100 parameters was more "ad hoc" and therefore less interesting. It is Feynman's argument that the fundamental physical law is U = 0, where U is the "unworldliness".
 
  • Like
Likes bhobba
  • #200
hutchphd said:
my point was that the phenomenological theory with 100 parameters was more "ad hoc" and therefore less interesting.
But your original question to me in this subthread was why the "ad hoc" requirement for models to match our actual observations doesn't bother me. And my answer is simply that, whether you want to label the requirement as "ad hoc" or not, it seems to me like a requirement that's going to be there regardless of anything else, so why should it bother me? It shouldn't bother anyone. It's necessary to build models at all.
 
  • Like
Likes bhobba
  • #201
But then why not use 100 parameter theories for each small subset of physics?
 
  • #202
hutchphd said:
But then why not use 100 parameter theories for each small subset of physics?
I never said models matching observations was the only requirement, just that it is a requirement and I can't see why anyone would be bothered by it. Your original question to me seemed to indicate that you are bothered by it, so I'm trying to understand why. Nothing you have said addresses that at all.
 
  • #203
I guess I have never truly been satisfied with the "arrow of time" arguments, which are based on statistical inference (to my understanding) being applied to microscopic events. It bothers me because of the arbitrary asymmetry, I guess. I know that likely sounds foolish.
 
  • #204
hutchphd said:
the "arrow of time" arguments, which are based on statistical inference (to my undertstanding) being applied to microscopic events
Not really. The basic argument is that we live in a time asymmetric solution to the underlying laws, which (with a few exceptions that don't seem like they should matter for most of what we observe) are time symmetric. There's no need for statistical arguments to show that; we just need to show that there is a time asymmetric solution to the laws that matches what we observe.

Some authors do present the argument as though it were statistical, for example in many accounts of the second law of thermodynamics. But those arguments have an additional assumption that is often unstated, namely, that the initial conditions from which the statistical arguments are made satisfy particular constraints (for example, low entropy). That unstated assumption is equivalent to the assumption that we live in a time asymmetric solution to the underlying laws; and once you assume that, as above, you don't need statistics to explain why there is time asymmetry. Even if we had infinitely precise knowledge of the initial conditions and there were no statistical uncertainty at all, the solution would still be time asymmetric and we would still observe the kinds of things we observe.

hutchphd said:
It bothers me because of the arbitrary asymmetry I guess.
But if the asymmetry matches our observations, why would it be "arbitrary"? It's there in our models because it's there in our observations.
 
  • Like
Likes Dragrath
  • #205
PeterDonis said:
But your original question to me in this subthread was why the "ad hoc" requirement for models to match our actual observations doesn't bother me. And my answer is simply that, whether you want to label the requirement as "ad hoc" or not, it seems to me like a requirement that's going to be there regardless of anything else, so why should it bother me? It shouldn't bother anyone. It's necessary to build models at all.
I think it's a matter of the ambition of explanatory value that is bothering? A theory that needs 1000 parameters empirically fixed adds less explanatory value than one that is corroborated after tuning only 100 parameters. Similarly, a theory that requires a priori improbable (ad hoc ~ low entropy) initial conditions to fit current observations comes with a bothersome loss of explanatory value compared to a model that works with less fine tuning.

/Fredrik
 
  • Like
Likes Dragrath
  • #206
Fra said:
A theory that needs 1000 parameters empirically fixed adds less explanatory value than one that is corroborated after tuning only 100 parameters.
I understand this, but it's a different requirement than the requirement that models should match observations. The requirement you describe here is basically Occam's Razor: given two models that both match the observations equally well, the simpler of the two (which in this case means the one with fewer adjustable parameters) should be preferred.
 
  • Love
  • Like
Likes Dragrath and bhobba
  • #207
WernerQH said:
Don't you agree that a time-symmetric picture is more natural? Certainly for microscopic processes.

It is easy to fall into this kind of trap. It may help to look into the history of the Wheeler-Feynman absorber theory and why it never caught on. That is not to say it is wrong, and I think it was one of the inspirations of the transactional interpretation. For ideas differing from the norm, I think most physicists want corroborating evidence.

Thanks
Bill
 
  • Like
Likes Dragrath
  • #208
hutchphd said:
But then why not use 100 parameter theories for each small subset of physics?
It's the same reason SR was preferred to Aether theories like LET, and why, these days, many (probably most) presentations of SR are based on symmetry rather than on discussions of simultaneity as Einstein originally gave. There seems to be an aesthetic filter most physicists/mathematicians have to home in on the 'simplest' theory. One of the problems I find with discussing physics with lay people is that they find it difficult to grasp that the 'aesthetic' filter is based on mathematical formulation.

It's part of why I believe our education system has taken a backward step (at least in Australia), with most students not doing calculus in HS. When I went to HS, everyone chose to do it. Surprisingly, the reason guidance counsellors gave to encourage students to take it wasn't science but economics. Back in those days, economics was taught using calculus, and in some places like Caltech it still is, but that seems to have fallen out of favour although, IMHO, it is easier to understand that way. Of course, every informed citizen needs to know the basics of economics.

At a practical level, it would be great if we could refer those interested in QM to Susskind's excellent book, knowing they have done the required calculus. He wrote it for such an audience, i.e., those with dim memories of calculus from school.

Thanks
Bill
 
  • Like
Likes Dragrath
  • #209
hutchphd said:
I guess I have never truly been satisfied with the "arrow of time" arguments, which are based on statistical inference (to my understanding) being applied to microscopic events. It bothers me because of the arbitrary asymmetry, I guess. I know that likely sounds foolish.
The most fundamental arrow-of-time argument is at the very foundations of all of physics. It assumes that there is causality, i.e., that it makes sense to look for regular patterns in the phenomena in nature to begin with, and the success of physics in describing these phenomena indicates that this is a pretty justified assumption.

As I tried to say in some postings above, the other "arrows of time" like the "thermodynamic" or the "electromagnetic" arrows of time follow from this most fundamental "causal arrow of time".
 
  • Like
Likes Dragrath, LittleSchwinger, bhobba and 2 others
  • #210
PeterDonis said:
I understand this, but it's a different requirement than the requirement that models should match observations.
Yes I agree they are different as such.

But my view is that the two things are still coupled, in that they compete for common inference resources, so what matters is efficiency of progress rather than a subjective measure of "simplicity". I.e., does the observer/agent invest in fine-tuning a fixed parameter set to improve the fit, or is the payoff better from evolving the theory itself?

I find a fixed split between the parameter set and the theory that defines the parameters quite disturbing and ad hoc, begging for something more.

/Fredrik
 
  • Like
Likes Dragrath
