Quantum mechanics and the macroscopic universe

In summary, there is ongoing debate about whether quantum mechanics can fully explain phenomena at the macroscopic level. While classical mechanics cannot fully explain some macroscopic phenomena, such as ferromagnetism, quantum mechanics is able to explain properties of materials and even behaviors of large objects under certain conditions. However, there are interpretational issues surrounding the idea of superposition of states at the macroscopic level, and various solutions have been proposed, including the Copenhagen interpretation and other views that try to reconcile quantum mechanics with classical observations.
  • #36
Appreciate all the comments. I'm new to these forums. Looking forward to lurking around. Path integrals anyone? Joking...
 
  • #37
Sample1 said:
Over the past 20 years or so many systems have been shown to fulfill them, i.e. quantum superpositions of macroscopic states DO exist.

The full quote is: "One can therefore assert that a quantum superposition of macroscopic states is never produced in reality. Decoherence is waiting to destroy them before they can occur."

But really, it depends on what you mean by "macroscopic". Do 10^11 particles making up the "object" count as "macroscopic"? If so, then it has already been done in the Delft/Stony Brook experiments. In fact, the most recent proposal, from Penrose and company, would go much larger than that using a set of mirrors.

Zz.
 
  • #38
Sample1 said:
Is it incorrect to state that the Schroedinger's Cat thought experiment is a metaphor? My understanding is that "One can therefore assert that a quantum superposition of macroscopic states is never produced in reality" (Roland Omnès, Understanding Quantum Mechanics, in his discussion of decoherence).

Well, there are those who claim that the macroscopic world is NOT ruled by quantum mechanics (that is, that the superposition principle is not valid there), and those who claim that the macroscopic and microscopic worlds are ruled by the same physical theory.

The first group has to explain where quantum mechanics stops being valid, and how and why it links to the macroscopic theory, etc.

The second group has to explain why we don't OBSERVE obvious superpositions. One can find such an explanation, and such a view is called a "many worlds" view. Indeed, the misunderstanding of Schroedinger with his cat, and of others, is to think that *within the same environment* one would see some kind of ghostly mixture of a dead and a live cat. But this is not what would happen if quantum mechanics were true at the macroscopic level: one macroscopic state (say, live cat) would quickly entangle with its environment, including the "observer", and produce ONE consistent set of states, while the other macroscopic state (dead cat) would entangle DIFFERENTLY with the environment, producing an entirely different but consistent set of states, ALSO including the "observer" (but now in a different state).

So each individual "observer state" would see only ONE thing: the first observer state would be consistent with having seen a live cat, and the second with having seen a dead cat. NO observer state would be present that "sees both at the same time". So, no, quantum mechanics does NOT predict, even at the macroscopic level, that an observer would SEE "a cat both alive and dead at the same time".
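
To make that concrete, here is a minimal numerical toy (my own sketch, not anything from the posts above): a two-level "cat" in an equal superposition entangles with N environment qubits, each branch imprinting a slightly different state on every environment qubit. The qubit count N and per-qubit angle theta are illustrative assumptions.

```python
# Toy decoherence model: a "cat" qubit entangled with N environment qubits.
# Each branch (alive/dead) imprints a different single-qubit state on the
# environment; the branch overlap, and hence the interference term in the
# cat's reduced density matrix, shrinks as cos(theta)**N.
import numpy as np

N = 10        # number of environment qubits (assumed, illustrative)
theta = 0.3   # per-qubit distinguishability angle (assumed, illustrative)

# |e_alive> = |0>^N and |e_dead> = (cos(theta)|0> + sin(theta)|1>)^N
# have overlap <e_alive|e_dead> = cos(theta)**N.
overlap = np.cos(theta) ** N

# Reduced density matrix of the cat after tracing out the environment:
rho_cat = 0.5 * np.array([[1.0, overlap],
                          [overlap, 1.0]])
print(rho_cat)   # off-diagonal (interference) terms -> 0 exponentially in N
```

For N of macroscopic size the off-diagonal terms are utterly negligible, which is the quantitative version of "each observer state sees only ONE outcome".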
 
  • #39
vanesch said:
Well, there are those who claim that the macroscopic world is NOT ruled by quantum mechanics (that is, that the superposition principle is not valid there), and those who claim that the macroscopic and microscopic worlds are ruled by the same physical theory.

The first group has to explain where quantum mechanics stops being valid, and how and why it links to the macroscopic theory, etc.

The second group has to explain why we don't OBSERVE obvious superpositions. One can find such an explanation, and such a view is called a "many worlds" view. Indeed, the misunderstanding of Schroedinger with his cat, and of others, is to think that *within the same environment* one would see some kind of ghostly mixture of a dead and a live cat. But this is not what would happen if quantum mechanics were true at the macroscopic level: one macroscopic state (say, live cat) would quickly entangle with its environment, including the "observer", and produce ONE consistent set of states, while the other macroscopic state (dead cat) would entangle DIFFERENTLY with the environment, producing an entirely different but consistent set of states, ALSO including the "observer" (but now in a different state).

So each individual "observer state" would see only ONE thing: the first observer state would be consistent with having seen a live cat, and the second with having seen a dead cat. NO observer state would be present that "sees both at the same time". So, no, quantum mechanics does NOT predict, even at the macroscopic level, that an observer would SEE "a cat both alive and dead at the same time".


First, at the macroscopic level quantum superposition states are very close in energy -- per the usual assumptions about reservoirs. Further, the environment will interact with the object and create thermal fluctuations. Whether one takes a classical statistical approach or a quantum one, the upshot is that the actual state "seen" will be an average one. How so? I'll assume (1) that the important states -- the ones to be seen -- are essentially degenerate, and (2) that the thermal fluctuations can be modeled as a boson field, which drives a random walk of the macroscopic object in momentum space. Treating that random-walk interaction with degenerate perturbation theory gives a single state with non-zero energy -- the average state, in fact -- while the other states correspond to quasi-particles with zero energy. So we see the average. We do the same thing when we ascribe a continuous nature to electric current.
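
A crude classical caricature of that random-walk picture (my own sketch with made-up parameters, not the degenerate-perturbation-theory calculation itself): a momentum variable kicked by a thermal bath and damped toward equilibrium, whose time average pins down the ensemble mean far more sharply than any single kick.

```python
# Langevin-style caricature of a macroscopic object's momentum under random
# thermal kicks (an Ornstein-Uhlenbeck process): the time average of p sits
# at the ensemble mean -- "we see the average".
import numpy as np

rng = np.random.default_rng(0)
gamma, dt, m, kT = 0.1, 1.0, 1.0, 1.0   # assumed, illustrative units
n_steps = 50_000

p = np.empty(n_steps)
p[0] = 0.0
for i in range(1, n_steps):
    kick = np.sqrt(2 * gamma * m * kT * dt) * rng.normal()
    p[i] = p[i - 1] - gamma * p[i - 1] * dt + kick

print(p.std())    # single-shot thermal spread ~ sqrt(m*kT)
print(p.mean())   # time average ~ 0, the ensemble mean
```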

In fact, given the way our visual system works, we actually see an average of an average. It seems to me that for macroscopic objects, the quantum fluctuations are dwarfed by the thermal ones -- something like a quantum fluctuation of 0.01 eV vs. a thermal fluctuation of maybe 100 eV from a speeding molecule. Again, it seems to me that a complete analysis of seeing macroscopic objects will show clearly why we don't see macroscopic superpositions of them: we simply don't have the resolution to do so.

(This is very similar to the basics of superconductivity.)

Note: I've tried numerous times to find an understandable account of decoherence, without much luck. So the extent to which my ideas are in consonance or dissonance with decoherence is a mystery to me.

Regards,
Reilly Atkinson
 
  • #40
reilly said:
I've tried numerous times to find an understandable account of decoherence, without much luck.

I think this webpage does a decent job at providing a somewhat intuitive view of decoherence:

http://www.ipod.org.uk/reality/reality_decoherence.asp
 
  • #41
nanobug said:
I think this webpage does a decent job at providing a somewhat intuitive view of decoherence:

http://www.ipod.org.uk/reality/reality_decoherence.asp

That article has an awful lot of words; I was hoping for a more concise description. From what I can tell, the notion of decoherence is very similar to the approach to the target state of a non-equilibrium statistical-mechanics system, which, of course, is what my last post is about.

Further, I saw nothing about collapse in the classical case. For example, before the conclusion of a football game, at best we can know the estimated probability of, say, the Seattle Seahawks beating the Oakland Raiders. Once the game is concluded, the initial probability of winning becomes the certainty of winning: the probability system valid prior to the win collapses from, maybe, a 57% chance that the Seahawks win to 100% (or 0%).

Collapse is the handmaiden of any probability system, because we are talking about the application of probability before and after some event. At minimum, the event will result in a new probability system, conditional on the event.
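
For what it's worth, here is that classical "collapse" in a few lines of code (my own illustration; the 57% figure is just the hypothetical number above): conditioning a probability distribution on an observed event.

```python
# Classical "collapse" is just conditioning: once the event is observed, the
# prior distribution is replaced by one conditional on that event.
prior = {"Seahawks win": 0.57, "Raiders win": 0.43}   # hypothetical odds

def condition(dist, observed_event):
    """Update a distribution on the news that observed_event occurred."""
    return {event: (1.0 if event == observed_event else 0.0) for event in dist}

print(prior)                              # before the game: 57% / 43%
print(condition(prior, "Seahawks win"))   # after the game: 100% / 0%
```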

Another strong reason for thermal effects in macroscopic measurements, at least for human vision, is that the light we see is a superposition of many photon coherent states. This means Poisson processes are at work -- we are talking about quantum E&M fields of classical currents -- so the light with which we see is generated by random processes, which, I surmise, tend to behave along the lines of stochastic convergence.
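
A quick check of the Poisson point (my own sketch, with illustrative numbers): for coherent light, the photon count in a detection window is Poisson-distributed, so its relative fluctuation falls off as 1/sqrt(mean count).

```python
# Poisson photon counting for coherent light: relative fluctuations shrink
# as 1/sqrt(mean count) -- the "stochastic convergence" at work.
import numpy as np

rng = np.random.default_rng(1)
for mean_count in (10, 1_000, 100_000):
    counts = rng.poisson(mean_count, size=10_000)
    print(mean_count, counts.std() / counts.mean())   # ~ 1/sqrt(mean_count)
```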

And the rods and cones of your eye are basically photoelectric detectors, and are quantum devices. There's tons of noise, in the sense of a communication system. As Shannon intuited, and Feinstein proved, the best way to beat noise is to take averages, and that's exactly what your visual system does, using both spatial and temporal averages. Not only that, but the samples involved are typically large, so the standard deviations of the means involved are very small. (This is explained very nicely in Dowling's The Retina and in Shannon and Weaver's The Mathematical Theory of Communication.)
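
And the Shannon point in miniature (again, my own sketch with assumed numbers): averaging N noisy samples of a fixed signal shrinks the standard deviation of the estimate like sigma/sqrt(N), which is the sense in which averaging beats noise.

```python
# Averaging beats noise: the standard error of the mean is sigma/sqrt(N).
import numpy as np

rng = np.random.default_rng(2)
signal, sigma = 1.0, 5.0          # weak signal, strong noise (assumed)
for n in (1, 100, 10_000):
    estimate = (signal + sigma * rng.normal(size=n)).mean()
    print(n, estimate)            # estimate tightens around 1.0 as n grows
```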


It seems reasonable to assert that thermal/random perturbations on many systems will result in convergence to the mean -- stochastic convergence, of course -- and thus a single value for a measurement is explainable.

However, it seems to me that there are plenty of systems not so amenable to experimental certainty. For example, consider a variant of the Kramers double-well problem. For simplicity, consider two identical potential wells connected by a barrier.

That is, the potential is piecewise constant:

V(x) = 0 for x < 0; -V for 0 < x < L; V' for L < x < L + L'; -V for L + L' < x < 2L + L'; 0 for x > 2L + L'.

The wells both go from V = 0 to -V, with width L, while the barrier goes from -V to V', with width L'. Assume that the wells are deep enough to have bound states, and that V' >> V. Can you demonstrate how decoherence solves the "collapse" issue in such a setup?

We are, of course, talking about scattering, which here can have four basic outcomes: the particle, incident from the left, can end up captured in either one of the wells, can be buried in the barrier, or can proceed off to the right as a free particle. What can decoherence tell us about the outcomes?
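
For the elastic part of that problem, here is a hedged numerical sketch (my own construction, with illustrative parameters; it says nothing about decoherence, and nothing about capture, which would need an energy-loss mechanism): plane-wave transmission through the double-well-plus-barrier potential above, computed with standard transfer matrices for a piecewise-constant potential.

```python
# Transfer-matrix transmission through: free | well(-V0) | barrier(Vb) |
# well(-V0) | free.  Units hbar = m = 1; all parameter values are assumed,
# illustrative numbers only.
import numpy as np

V0, Vb, L, Lp = 5.0, 50.0, 1.0, 0.5        # well depth, barrier height, widths
edges = [0.0, L, L + Lp, 2 * L + Lp]       # region boundaries
Vs = [0.0, -V0, Vb, -V0, 0.0]              # potential in each region

def transmission(E):
    """Transmission probability for a plane wave of energy E > 0 from the left."""
    ks = [np.sqrt(2 * complex(E - V)) for V in Vs]   # complex k if E < V
    M = np.eye(2, dtype=complex)
    for x0, (k1, k2) in zip(edges, zip(ks, ks[1:])):
        r = k1 / k2                        # match psi and psi' at x0
        step = 0.5 * np.array(
            [[(1 + r) * np.exp(1j * (k1 - k2) * x0),
              (1 - r) * np.exp(-1j * (k1 + k2) * x0)],
             [(1 - r) * np.exp(1j * (k1 + k2) * x0),
              (1 + r) * np.exp(-1j * (k1 - k2) * x0)]])
        M = step @ M
    B1 = -M[1, 0] / M[1, 1]                # reflected amplitude (incident A1 = 1)
    t = M[0, 0] + M[0, 1] * B1             # transmitted amplitude
    return abs(t) ** 2                     # same k on both sides, so T = |t|^2

for E in (1.0, 10.0, 60.0):
    print(E, transmission(E))              # resonances show up as peaks near 1
```

This captures only reflection and transmission; which outcome is actually realized in a single run, and what decoherence adds to that story, is exactly the question posed above.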

Regards,
Reilly Atkinson
 
  • #42
Just a random question, but wouldn't light hitting our retina have decohered long before then, seeing as it travels through the lens and the aqueous and vitreous humour, not to mention through an atmosphere? Why would your photoreceptors be quantum devices?
 
  • #44
reilly said:
Further, I saw nothing about collapse in the classical case. For example, before the conclusion of a football game, at best we can know the estimated probability of, say, the Seattle Seahawks beating the Oakland Raiders. Once the game is concluded, the initial probability of winning becomes the certainty of winning: the probability system valid prior to the win collapses from, maybe, a 57% chance that the Seahawks win to 100% (or 0%).

Of course, decoherence and other interpretations don't mean anything if you think that a quantum state is already a statistical ensemble...

The whole point of all these things is to try to give a picture of how a *single actual physical state* gives rise to a *statistical distribution* over a set of potential states: that is the whole interpretational difficulty.

But my question to you is: IF, as you regularly claim, a quantum state is nothing but a statistical distribution (the particle came through one of the slits, only we didn't know which one), then why don't we work directly with the probability distributions? Why do we bother using amplitudes? In what way, then, does quantum mechanics differ from classical statistical mechanics?

See, in a state |az+>|bz-> - |az->|bz+>, why don't we say that we have 50% |az+>|bz-> and 50% |az->|bz+>, work out the consequences *in the first case*, then the consequences *in the second case*, and consider that the observed statistics will be a 50%-50% mixture of these consequences?
Answer: because this doesn't work out!
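
One can check this with a few lines of linear algebra (my own sketch, using the normalized singlet for the state written above): the entangled state and the 50/50 mixture give identical statistics for z-axis measurements, but they disagree along x, so the "it was secretly one of the two" reading does not work.

```python
# Entangled singlet vs. 50/50 classical mixture of |z+>|z-> and |z->|z+>:
# identical z-axis statistics, different x-axis correlations.
import numpy as np

up, dn = np.array([1.0, 0.0]), np.array([0.0, 1.0])
singlet = (np.kron(up, dn) - np.kron(dn, up)) / np.sqrt(2)
rho_ent = np.outer(singlet, singlet)
rho_mix = 0.5 * (np.outer(np.kron(up, dn), np.kron(up, dn))
                 + np.outer(np.kron(dn, up), np.kron(dn, up)))

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
corr_xx = np.kron(sx, sx)                   # measure sigma_x on both spins
print(np.trace(rho_ent @ corr_xx).real)     # -1: perfect anticorrelation
print(np.trace(rho_mix @ corr_xx).real)     #  0: the mixture predicts none
```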
 
