ddd123 said:As you know, though, the thermodynamic limit has in some cases been shown to make the spectral gap noncomputable in quantum many-body theory, which is even worse than nondeterminism, so it's a double-edged sword.
stevendaryl said:If you treat Brownian motion using statistical mechanics, then it's deterministic. If you analyze a dust particle suspended in a liquid, your statistical mechanics will give a probability distribution for the location of the particle as a function of time.

That makes it nondeterministic. Once probabilities are the basic quantities, one has a stochastic system. Note that in any classical stochastic system, probabilities have a deterministic dynamics, but they nevertheless describe stochastic, nondeterministic processes.
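A minimal sketch of this point, using a hypothetical two-state Markov chain: the probability vector evolves deterministically, while any single realization stays random.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state Markov chain; row i holds the jump probabilities from state i.
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# The probabilities follow a deterministic dynamics: p(t+1) = p(t) @ T.
p = np.array([1.0, 0.0])
for _ in range(50):
    p = p @ T
print("deterministic p(t) for large t:", p)   # converges to the stationary distribution

# Yet a single trajectory of the process is stochastic, nondeterministic.
state, path = 0, [0]
for _ in range(20):
    state = int(rng.choice(2, p=T[state]))
    path.append(state)
print("one random sample path:", path)
```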
ddd123 said:shown noncomputability [...] which is even worse than nondeterminism

?
A. Neumaier said:That makes it nondeterministic. Once probabilities are the basic quantities, one has a stochastic system. Note that in any classical stochastic system, probabilities have a deterministic dynamics, but they nevertheless describe stochastic, nondeterministic processes.
To go from the probabilities to the actual events is the classical version of collapse; cf. the companion thread. But nobody working on stochastic processes uses that weird language for it.
On the other hand, for a system in equilibrium (which involves a thermodynamic limit), quantum statistical mechanics produces the deterministic equations of equilibrium thermodynamics, where no trace is left of anything probabilistic or stochastic.
C Davidson said:A. Neumaier claims that quantum mechanics has no weirdness, despite demonstrations that objects as small as photons can share properties over more than a kilometer in Bell theorem tests. This sort of fuzzyheaded thinking has led to a "mass boson" called the Higgs which is so massive it cannot exist for a fraction of a second, despite the evidence the Universe has existed for 13 billion years. So the physicists "cook the books" with "virtual particles", and where the claims of "magic" cannot be refuted (as in entanglement), they simply demand it be accepted without explanation. No mechanism, nothing to see here, move along now.
Quantum mechanics isn't weird, but the explanations we have historically accepted are wrong. We will discover better ones.
vanhees71 said:In quantum theory the probabilities are also deterministic in the sense that the statistical operator and the operators representing observables follow deterministic equations of motion. That doesn't make quantum theory a deterministic theory in the usually understood sense. Determinism means that, as within classical physics, all observables at each time have a determined value and these values change via an equation of motion which lets you know any value at any time ##t>t_0##, if you know these values at a time ##t_0##.
stevendaryl said:That's because it's pretty clear what the relationship is between the actual events and the statistical model: The actual case is one element of an ensemble of cases with the same macroscopic description. The collapse is just a matter of updating knowledge about which case we are in.

Yes, and in the quantum case it is the same, if you drop the word ''macroscopic''.
stevendaryl said:I wouldn't say that. Equilibrium thermodynamics can be interpreted probabilistically: the actual system has a probability of [itex]e^{- \beta E_j}/Z[/itex] of being in state [itex]j[/itex], where [itex]E_j[/itex] is the energy of state [itex]j[/itex], [itex]\beta = \frac{1}{kT}[/itex], and [itex]Z[/itex] is the partition function. (Something more complicated has to be done to take into account continuum-many states in classical thermodynamics...)

Equilibrium thermodynamics doesn't have the concept of a partition function. One needs statistical mechanics to relate the former to a probabilistic view of matter.
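For concreteness, a small numerical sketch of the quoted formula [itex]p_j = e^{-\beta E_j}/Z[/itex], with made-up energy levels:

```python
import numpy as np

# Hypothetical discrete energy levels E_j and inverse temperature beta = 1/(kT).
E = np.array([0.0, 1.0, 2.0, 5.0])
beta = 1.0

weights = np.exp(-beta * E)   # unnormalized Boltzmann factors e^{-beta E_j}
Z = weights.sum()             # partition function Z = sum_j e^{-beta E_j}
p = weights / Z               # probability of the system being in state j

print(p, p.sum())             # the p_j sum to 1
```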
stevendaryl said:You can use the equilibrium thermodynamics to compute distributions on particle velocities, and thus to analyze the stochastic behavior of a dust particle suspended in a fluid.

You can use statistical mechanics to do that, but not equilibrium thermodynamics, which is a 19th century classical theory that doesn't have a notion of particles. Statistical mechanics is much more versatile than thermodynamics, as one isn't limited to locally homogeneous substances.
A. Neumaier said:Yes, and in the quantum case it is the same, if you drop the word ''macroscopic''.
Equilibrium thermodynamics doesn't have the concept of a partition function. One needs statistical mechanics to relate the former to a probabilistic view of matter.
stevendaryl said:So in what sense is the thermodynamic limit of QFT deterministic?

In the sense that it results in 19th century classical thermodynamics. In the latter theory there are known, exact, nonrandom relations between the thermodynamic quantities, and one can predict (from a thermodynamic potential and the values of a few state variables) the results of all reversible changes with certainty. No thermodynamics textbook mentions randomness (unless it refers to an underlying microscopic picture, i.e., to statistical mechanics).
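A toy instance of such an exact, nonrandom relation (my own example, not from the thread): from the volume-dependent part of the ideal-gas free energy, the pressure follows with certainty.

```python
import sympy as sp

T, V, N, k = sp.symbols('T V N k', positive=True)

# Volume-dependent part of the Helmholtz free energy of an ideal gas.
F = -N * k * T * sp.log(V)

# Exact thermodynamic relation: P = -dF/dV, with no randomness anywhere.
P = -sp.diff(F, V)
print(P)   # N*k*T/V, the ideal gas law
```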
stevendaryl said:that sounds like a hidden-variables theory of the type that is supposed to not exist.

Well, I argued that it might be nonlocal hidden variables - namely all those that describe the neglected environment. No Bell-type theorem excludes this possibility, and statistical mechanics demands that these variables must be taken into account. The only open question is whether these abundant hidden variables are enough to explain everything random. My strong suspicion is that they do.
A. Neumaier said:Well, I argued that it might be nonlocal hidden variables - namely all those that describe the neglected environment. No Bell-type theorem excludes this possibility, and statistical mechanics demands that these variables must be taken into account. The only open question is whether these abundant hidden variables are enough to explain everything random. My strong suspicion is that they do.
stevendaryl said:it seems to me that taking into account the environment can't possibly resolve the nondeterminism using only unitary evolution. My argument is pretty simple:

Let [itex]|\psi_U\rangle[/itex] be a state (including an electron, a Stern-Gerlach device, and the environment) which leads to measurement outcome "spin-up" for a spin measurement. Let [itex]|\psi_D\rangle[/itex] be a state which leads to measurement outcome "spin-down". Then the state [itex]|\psi_?\rangle = \alpha |\psi_U\rangle + \beta |\psi_D\rangle[/itex] would be a state that would lead to an undetermined outcome of the measurement. Maybe you can argue that there is no way to produce state [itex]|\psi_?\rangle[/itex], but it certainly exists in the Hilbert space, and it's not at all obvious to me that it would be unachievable.

This is a well-known argument, used already long ago by Wigner, I believe.
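A toy illustration of stevendaryl's construction, with a two-dimensional stand-in for the full Hilbert space (the real [itex]|\psi_U\rangle, |\psi_D\rangle[/itex] live in an enormous one):

```python
import numpy as np

# Stand-ins for the "spin-up outcome" and "spin-down outcome" states.
psi_U = np.array([1.0, 0.0], dtype=complex)
psi_D = np.array([0.0, 1.0], dtype=complex)

# Any alpha, beta with |alpha|^2 + |beta|^2 = 1 give a legitimate vector
# of the same Hilbert space, whether or not it is physically preparable.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
psi_q = alpha * psi_U + beta * psi_D

print(np.vdot(psi_q, psi_q).real)        # norm 1: a valid state
print(abs(np.vdot(psi_U, psi_q)) ** 2)   # weight |alpha|^2 = 0.5 on "spin-up"
```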
A. Neumaier said:This is a well-known argument, used already long ago by Wigner, I believe.
But it is not valid in my setting, where, in some huge, universal Hilbert space, there is a unique density matrix of the universe that describes reality, and all systems that are observable are projections of this universal density matrix to the tiny Hilbert space describing the microscopic system under investigation.
Most of the superpositions, while they exist in the tiny Hilbert space, have no relation to the universal density matrix, hence cannot be used to make an argument.
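The post does not spell out what "projection" means here; assuming it is the usual reduction to a subsystem (the partial trace), a toy sketch with one system qubit and one environment qubit:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "universal" density matrix on a 4-dimensional space (system x environment):
# Hermitian, positive, trace 1 by construction.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho_universe = A @ A.conj().T
rho_universe /= np.trace(rho_universe).real

# Reduction to the tiny system Hilbert space via the partial trace
# over the environment (my reading of "projection").
rho4 = rho_universe.reshape(2, 2, 2, 2)   # axes (s, e, s', e')
rho_system = np.einsum('iaja->ij', rho4)  # sum over the environment indices

print(np.trace(rho_system).real)          # still 1: a valid state of the system
```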
stevendaryl said:Why does the universal density matrix necessarily lead to definite outcomes to all possible experiments? Is there a way to prove this for a typical density matrix, or are you assuming some kind of "fine-tuning" of the initial density matrix to ensure that it's true?

I only need to assume that the observed part of the universe is approximately in local equilibrium. This is amply corroborated by experiment, and provides a very strong constraint on the universal density matrix. Indeed, local equilibrium is just the assumption needed to derive fluid mechanics or elasticity theory from quantum field theory, and for more than a century we have described every macroscopic object in these terms. Thus only those density matrices qualify as typical that satisfy this experimental constraint.
stevendaryl said:Let [itex]\rho[/itex] be the density matrix of the universe at some time (let's pick a frame/coordinate system so that we can talk about the state at one time). Then the claim might be that there is a decomposition of [itex]\rho[/itex] into the form [itex]\rho = \sum_j p_j |\psi_j\rangle \langle \psi_j |[/itex], where the [itex]\psi_j[/itex] form an orthonormal basis such that for each [itex]j[/itex], all macroscopic quantities (such as the outcomes of measurements) have definite values. I don't see why that should be the case.

This is obviously not the case, but this was not my claim. We do not need definite values, but only values accurate enough to match experimental practice. This is a much less severe condition.
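The mathematical half of the quoted claim is uncontroversial and easy to check numerically: every density matrix has a spectral decomposition [itex]\rho = \sum_j p_j |\psi_j\rangle \langle \psi_j |[/itex]; whether the [itex]\psi_j[/itex] assign definite values to macroscopic quantities is the contested part. A sketch with a random toy matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# An arbitrary valid density matrix (Hermitian, positive, trace 1).
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# Spectral decomposition: eigenvalues p_j and orthonormal eigenvectors psi[:, j].
p, psi = np.linalg.eigh(rho)
rebuilt = (psi * p) @ psi.conj().T   # sum_j p_j |psi_j><psi_j|

print(np.allclose(rebuilt, rho))     # True: the decomposition always exists
```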
A. Neumaier said:I only need to assume that the observed part of the universe is approximately in local equilibrium. This is amply corroborated by experiment, and provides a very strong constraint on the universal density matrix. Indeed, local equilibrium is just the assumption needed to derive fluid mechanics or elasticity theory from quantum field theory, and for more than a century we describe every macroscopic object in these terms. Thus only those density matrices qualify as typical that satisfy this experimental constraint.
In my book (see post #2 of this thread), I call the corresponding states Gibbs states.
stevendaryl said:So in what sense is the thermodynamic limit of QFT deterministic?

Don't ask me. I don't understand this claim at all.
A. Neumaier said:This is obviously not the case but this was not my claim. We do not need definite values but only values accurate enough to match experimental practice. This is a much less severe condition.
stevendaryl said:I think that's just a clarification of what I mean by "macroscopic quantities". I like your suggestion of giving coarse-grained descriptions of the mass-energy density, and field values. If the description is coarse enough, then the uncertainty principle doesn't get in the way of knowing the "macroscopic state of the universe" to that level of accuracy.
stevendaryl said:But to me, the question is about quantum theory, not empirical observations. Does QM predict those observations?

Quantum theory is derived from empirical observations and organizes these into a coherent whole. Quantum field theory predicts - under the usual assumptions of statistical mechanics, which include local equilibrium - hydrodynamics and elasticity theory, and hence everything computable from it.
stevendaryl said:The question is: Can the universe be in a superposition of states that have different macroscopic states? If not, why not?

In the view I outlined above, the universe is not in a pure state but in a Gibbs state in which local equilibrium holds to a good approximation. This is a mixture ##\rho=e^{-S/k}##, where ##S## is an entropy operator and ##k## is the Boltzmann constant.
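A toy consistency check of the form ##\rho=e^{-S/k}## (hypothetical three-level Hamiltonian; the canonical Gibbs state ##e^{-\beta H}/Z## is the simplest case of such a state):

```python
import numpy as np
from scipy.linalg import expm, logm

H = np.diag([0.0, 1.0, 3.0])   # hypothetical Hamiltonian
beta, k = 0.7, 1.0             # units with Boltzmann constant k = 1

# The canonical Gibbs state e^{-beta H}/Z ...
rho = expm(-beta * H)
rho /= np.trace(rho)

# ... is of the form rho = e^{-S/k}, with entropy operator S = -k log(rho).
S = -k * logm(rho)
print(np.allclose(expm(-S / k), rho))   # True
```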
A. Neumaier said:Well, I argued that it might be nonlocal hidden variables - namely all those that describe the neglected environment. No Bell-type theorem excludes this possibility, and statistical mechanics demands that these variables must be taken into account. The only open question is whether these abundant hidden variables are enough to explain everything random. My strong suspicion is that they do.

Interesting how you imagine these "non-local hidden variables" and their effects... In particular, are they actual variables, i.e. do they get changed by some processes? I think this is critical for distinguishing them from LHV models - because constant "variables", even if called "non-local" in some sense, can in my opinion always be modeled by local copies. Only their non-local change, or in other words spooky action at a distance, is what sets a model apart from LHV models and allows Bell violations.
A. Neumaier said:This is a well-known argument, used already long ago by Wigner, I believe.
But it is not valid in my setting: Here, in the algebra of linear operators of some huge, universal Hilbert space, there is a unique density matrix of the universe that describes reality, and all systems that are observable are described by the projections of this universal density matrix to the algebra of linear operators of the tiny Hilbert space describing the observable system under investigation.
Most of the superpositions, while they exist in the tiny Hilbert space, have no relation to the universal density matrix, hence cannot be used to make an argument.
georgir said:are they actual variables, i.e. do they get changed by some processes?

They change according to the Schroedinger equation of the universe, which determines how ##\rho(t)## depends on time. The Hamiltonian would be known if we had a common generalization of the standard model and gravitation.
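A sketch of that deterministic dynamics, with a hypothetical two-level stand-in for the universal Hamiltonian and state: ##\rho(t)=e^{-iHt/\hbar}\,\rho(0)\,e^{iHt/\hbar}## is fixed for all times by ##\rho(0)##.

```python
import numpy as np
from scipy.linalg import expm

H = np.array([[0.0, 0.5],     # hypothetical Hamiltonian of the "universe"
              [0.5, 1.0]])
rho0 = np.array([[0.7, 0.2],  # hypothetical initial density matrix (trace 1)
                 [0.2, 0.3]])

def rho(t, hbar=1.0):
    """Deterministic unitary evolution rho(t) = U(t) rho(0) U(t)^dagger."""
    U = expm(-1j * H * t / hbar)
    return U @ rho0 @ U.conj().T

print(np.trace(rho(2.5)).real)   # the trace stays 1 for all t
```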
Hornbein said:I'd like to have your expert opinion on the Conway-Kochen theorem. http://arxiv.org/pdf/quant-ph/0604079.pdf and http://arxiv.org/pdf/0807.3286.pdf

I don't think the paper has any relevance. The will of the experimenter is not relevant for Bell-type experiments, as all choices can be made by automatic devices. (See https://www.physicsforums.com/posts/5347224/ , especially point 9.)
A. Neumaier said:I don't think the paper has any relevance. The will of the experimenter is not relevant for Bell-type experiments, as all choices can be made by automatic devices. (See https://www.physicsforums.com/posts/5347224/ , especially point 9.)

Aha. So you are a superdeterminist, like 't Hooft? You are correct: the theorem does not exclude this possibility.
In particular, the assumption made in their theorem is highly unrealistic. The choices made by an automatic device always depend on its internal state and its input, hence are in some sense determined by the information available to the device.
There is also no reason to believe that things would be different with humans, although here the definition of ''free will'' is beset with philosophical difficulties.
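A toy version of that remark: a pseudorandom setting chooser, as could drive a Bell test, is a deterministic function of its internal state (the seed) and its input (the trial number), even though its output looks random.

```python
import numpy as np

def device_choice(seed: int, trial: int) -> int:
    """Measurement setting 0 or 1, fully determined by seed and trial."""
    rng = np.random.default_rng(seed + trial)
    return int(rng.integers(2))

# Rerunning reproduces exactly the same "random" choices.
print([device_choice(42, t) for t in range(10)])
```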
rkastner said:I got a pingback on my blog from someone with a question/comment about my blog post concerning 'Wrong idea...' but I can't find the post and I don't know who asked the question.

Off topic, but I wouldn't post emails on a public forum; it's inviting spam doomsday. Today's services are filtered, but you increase the risk tenfold if not more. I may be wrong.
ddd123 said:Off topic, but I wouldn't post emails on a public forum; it's inviting spam doomsday. Today's services are filtered, but you increase the risk tenfold if not more. I may be wrong.

Thanks, fixed it.
A. Neumaier said:Here, in the algebra of linear operators of some huge, universal Hilbert space, there is a unique density matrix of the universe that describes reality, and all systems that are observable are described by the projections of this universal density matrix to the algebra of linear operators of the tiny Hilbert space describing the observable system under investigation.

Most of the superpositions, while they exist in the tiny Hilbert space, have no relation to the universal density matrix, hence cannot be used to make an argument.

Further discussion of this part (concerning reality described by a universal density matrix), if any, please in this new thread!
A. Neumaier said:One offshoot of this discussion (and the twin discussion of an associated experimental setting) is that I arrived at a new, improved understanding of relativistic causality. This settles (for me) all problems with causality in Bell-type theorems, and reduces the weirdness of nonlocality experiments to a problem in the psychology of knowledge. The residual weirdness is only of the same kind as the weirdness of being able to know what happens if some object falls into a classical black hole and when it will hit the singularity, although no information can escape from a black hole.