New Scientist recently published an article entitled 'A Bold New Take on Quantum Theory', which I found interesting.
Unfortunately, it is behind a paywall, but I will give my precis.
How QM, which only predicts probabilities, gives rise to the solid, well-defined world around us is still a mystery.
The author suspects the answer could lie with Boltzmann and thermodynamics. Those quantum possibilities are never lost; instead, they are mixed so thoroughly into the 'cracks of reality' that we can’t see them.
We never (added by me - that should be rarely, e.g. the behaviour of liquid helium) see QM's odd effects in the classical, everyday world. So, what is happening?
When physicists have pondered this question over the years, they have often focused on measurements. No matter where an electron was before being detected (added by me - assuming it is anywhere - we have no idea what is going on before measurement), we only see it in one place once it is measured. Somehow, measurement snaps the wave-like cloud of possibilities to a well-defined position. This has been shown time and again in experiments. The process seems random and instantaneous, but physicists like me aren’t satisfied since nothing else acts this way.
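To make the textbook picture concrete (this bit of notation is my addition, not the article's): a particle spread over two possible positions is written as

```latex
|\psi\rangle \;=\; \alpha\,|x_1\rangle \;+\; \beta\,|x_2\rangle ,
\qquad |\alpha|^2 + |\beta|^2 = 1 .
```

The standard rule says that on measurement this snaps to |x_1⟩ with probability |α|² or to |x_2⟩ with probability |β|² – only the probabilities are predicted, never which outcome actually occurs.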
Debate around how to interpret this weirdness has been raging for more than 100 years. In the 1920s, great thinkers like John von Neumann settled on the idea that when a measurement is made, the wave function “collapses” into a single outcome, deleting the other possibilities. But this explanation, which came to be known as the Copenhagen interpretation, is far from the only reading of the situation. (Added by me - I think the author has what the great von Neumann actually thought a bit off the mark; I will leave the reader to investigate his views. Bohr would be a better person to mention here, IMHO.) The many-worlds interpretation says that every possible outcome of a measurement happens in other worlds we can’t access. Physicist David Bohm’s interpretation says the other possibilities never existed – they were only illusions created by our lack of information.
To make things more complicated, it has been clear since experiments in the 1970s that measurements don’t just happen on lab benches. Even stray air molecules hitting electrons can “measure” them and destroy their 'quantumness'. This process is called decoherence, which explains why we don’t see quantum effects at everyday scales. Once something gets big enough, too many other objects fly around that can “measure” it and upset its delicate quantum properties. But the same question still applies: how precisely does this process happen?
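As a rough illustration of what decoherence does (a toy sketch of my own, using the standard dephasing picture, not anything from the article): the environment's random 'kicks' wash out the off-diagonal terms of a quantum object's density matrix, which is precisely the part that encodes the 'two-states-at-once' behaviour, while leaving the ordinary classical probabilities alone.

```python
# Toy dephasing model of decoherence (my illustration, not the article's model).
# A qubit starts in an equal superposition; environmental noise suppresses the
# off-diagonal (coherence) terms of its density matrix, while the diagonal
# populations -- the classical probabilities -- are untouched.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)      # |+> = (|0> + |1>)/sqrt(2)
rho0 = np.outer(plus, plus)                    # density matrix of the superposition

gamma = 1.0                                    # assumed dephasing rate (arbitrary units)
for t in [0.0, 0.5, 1.0, 2.0, 5.0]:
    rho = rho0.copy()
    rho[0, 1] *= np.exp(-gamma * t)            # coherences decay away...
    rho[1, 0] *= np.exp(-gamma * t)
    print(f"t={t:3.1f}  populations={np.diag(rho)}  coherence={rho[0, 1]:.3f}")
    # ...but the populations stay at 0.5/0.5: the "quantumness" fades,
    # the classical statistics remain.
```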
Quantum entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated or interact in ways such that the quantum state of each particle cannot be described independently.
In the 2000s, physicists Robin Blume-Kohout, then at the California Institute of Technology, and Wojciech Żurek at Los Alamos National Laboratory in New Mexico took the idea of decoherence one step further. They argued that all the information in a system, including the quantum kind, spreads into the surrounding environment during this process. This quantum information includes the system’s superposition. But it also accounts for other intrinsically quantum features, like the bizarre, long-range “entanglement” that appears to allow instantaneous interaction between two quantum objects.
The pair claim that only certain types of information are easy to access after this spreading process, namely, the classical variety. The quantum information is there; it is just practically impossible to see. They named this idea quantum Darwinism in analogy to the evolution of living things. In this reckoning, the environment around a quantum object “selects” for the classical information, akin to how an environment – in a very different meaning of the word – selects for long necks in giraffes.
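Here is a minimal sketch of the situation quantum Darwinism describes (my own toy model, a standard GHZ-type example rather than anything from the Blume-Kohout/Żurek papers): a system qubit in superposition copies its 'which branch' information into a couple of environment qubits. Afterwards, any single environment qubit already carries the classical information about the branch, but no trace of the superposition's phase; meanwhile the global state is still pure, so nothing has been deleted.

```python
# Toy quantum-Darwinism example (my illustration): the branching state
#     (|0>|00> + |1>|11>) / sqrt(2)
# with one "system" qubit and two "environment" qubits.
import numpy as np

psi = np.zeros(8)
psi[0b000] = 1 / np.sqrt(2)          # system in 0, both environment qubits read 0
psi[0b111] = 1 / np.sqrt(2)          # system in 1, both environment qubits read 1
psi_t = psi.reshape(2, 2, 2)         # indices: (system, env1, env2)

# Reduced state of a single environment qubit (trace out system and env2).
rho_env1 = np.einsum('iaj,ibj->ab', psi_t, psi_t)
print("one environment qubit:\n", np.round(rho_env1, 3))
# -> diag(0.5, 0.5): it records which branch occurred (classical, redundantly
#    copied information) but shows no off-diagonal terms, i.e. no phase.

# The global state is still pure: the quantum information has not been
# destroyed, it now lives in correlations spread across system + environment.
rho_full = np.outer(psi, psi)
print("purity of the global state:", np.trace(rho_full @ rho_full))   # 1.0
```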
This framework is a powerful way of describing the interactions between a quantum system and its environment. However, until recently it has lacked a rigorous description of the physical process that makes the selection happen. This is where our group comes in, with the idea of quantum thermodynamics.
Every step in the process can be explained using thermodynamics. And we are interested in what happens to the vanishing quantum states, something quantum Darwinism tends not to focus on. In a nutshell, we think this quantum information gets spread out between the object and the detector. And this spreading-out process mirrors the way things mix according to thermodynamics.
Now, historically, thermodynamics and quantum mechanics don’t go well together. The conventional idea of quantum measurements appears to break the laws of thermodynamics. These laws, which are sacrosanct to physicists, say that energy can’t be created or destroyed and that the universe becomes more disordered over time. The textbook description of a measurement seems to violate all of this. What’s more, it involves deleting information: when the particle goes from being in two places at once to only one, details about the second position seem to be destroyed. This violates the conservation of information, a principle upheld by every other law of physics.
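To spell out the tension in standard textbook terms (my gloss, not the article's wording): ordinary quantum evolution is unitary, and unitary evolution is reversible and conserves information, whereas the collapse rule is neither.

```latex
\text{Unitary evolution:}\quad \rho \;\to\; U\rho\,U^{\dagger},
\qquad S\!\left(U\rho\,U^{\dagger}\right) = S(\rho)
\\[4pt]
\text{Textbook collapse:}\quad \alpha\,|x_1\rangle + \beta\,|x_2\rangle \;\to\; |x_1\rangle
```

Here S is the von Neumann entropy. The first line can always be run backwards and loses nothing; the second discards β and the relative phase, which is exactly the 'deleted information' the article is worried about.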
These problems were easy to sweep under the rug for decades since we couldn’t probe in detail the exact interactions between quantum objects and the thing doing the measuring. It was easy to imagine that the problem was a by-product of inaccurate modelling of the measuring “device”. Though experiments have improved, the discrepancies have become harder to hide.
The strength of our idea is that it can’t help but obey the rules of thermodynamics since these are built in from the start. “Any measurement model should be in keeping with the rest of the laws of physics,” says Sophie Engineer, who is part of our team and works between the University of Bristol and Heriot-Watt University, UK.
At the heart of our idea is a thermodynamic process that Boltzmann studied called equilibration. Our group loves coffee, so we imagine this process by picturing a splash of milk poured into a cup of coffee. At first, the milk is a separate blob, but as the various particles randomly move around, it rapidly spreads and mixes with the coffee. Once the milk and coffee particles are fully mixed, it is extremely unlikely that all the milk particles will spontaneously gather up into a blob again. Eventually, the coffee-milk mixture settles into an equilibrium – we say that it equilibrates.
The laws of thermodynamics, however, say that given long enough, the milk and coffee will spontaneously separate back into the original, unmixed state. We would never see this happen because it would take far longer than the universe's age. But we do see it happen in much simpler setups.
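A rough back-of-the-envelope estimate (mine, not the article's) shows why: if each of the N milk molecules independently had a 50:50 chance of being found back in its original half of the cup, the chance of all of them doing so at once is

```latex
P \;\approx\; \left(\tfrac{1}{2}\right)^{N},
\qquad N \sim 10^{23} \;\Rightarrow\; P \approx 10^{-3\times 10^{22}} ,
```

so the corresponding waiting time dwarfs the age of the universe. For a 'cup' containing only a handful of particles, though, the same estimate gives a perfectly observable probability, which is why the unmixing can be seen in much simpler setups.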
We recently learned that something similar happens in the quantum world, too. In a 2018 study, Jörg Schmiedmayer and his colleagues at the Vienna University of Technology (TU Wien) in Austria showed that quantum decoherence can also undo itself. They observed a few thousand ultra-cold atoms in a box and saw how the atoms’ positions became less correlated with each other through random collisions. The amount of correlation eventually reached a low “equilibrium” value. But, after a few milliseconds, the correlation returned to almost its initial value.
This was a brilliant result. Decoherence should destroy these correlations, so seeing them spontaneously reappear indicates that it isn’t deleting information, just scrambling or hiding it. Findings like this inspired my colleagues, Maximilian Lock and Marcus Huber, also at TU Wien, to wonder if equilibration could also underpin quantum measurements. Along with Emanuel Schwarzhans, formerly at TU Wien, and Felix Binder at Trinity College Dublin in Ireland, they collected their ideas in a framework they dubbed the measurement-equilibration hypothesis (MEH).
MEH describes measurement as a process where a quantum system interacts with a measuring device. A “device” could be anything that interacts with the quantum object, not just what we would typically think of as a measuring device. This spreads information into the device until an information equilibrium is reached between the system and the device. The bigger the device, the more places there are for the quantum information to hide, making it harder to get that information back – but never impossible.
How would this work in practice? Let’s take the simple example of a particle in a cloud of many locations simultaneously. Before a detector measures that particle’s position, there is information about the potential places it could have been detected. When the detector comes into contact with the particle, these pieces of information mix into the particles of the detector. We think this spreading process somehow “broadcasts” information from the system, making the information about its classical position available to read but its “two-places-at-once” information harder to spot.
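In textbook (von Neumann-style) notation, and purely as my own gloss on this picture rather than the authors' actual MEH calculation, the interaction looks like

```latex
\bigl(\alpha\,|x_1\rangle + \beta\,|x_2\rangle\bigr)\otimes|D_0\rangle
\;\longrightarrow\;
\alpha\,|x_1\rangle|D_1\rangle \;+\; \beta\,|x_2\rangle|D_2\rangle ,
```

where |D_0⟩ is the detector's ready state and |D_1⟩, |D_2⟩ are its pointer states. Looked at on its own, the system (or the detector) now shows only a classical mixture of the two outcomes with weights |α|² and |β|²; the superposition has not been deleted, it sits in the global correlations between system and detector, and the bigger the detector, the more hopeless it becomes in practice to dig it back out.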
What are the implications?
The mathematics behind this process is complicated, so the first two papers on the framework, still in peer review, are heavy on calculations. First, my colleagues showed that equilibration between a quantum system and a detector can make the system look classical while only hiding the quantum behaviour, not destroying it. But a small enough detector would still allow quantum effects to peek through. The next paper, led by Engineer and the first to which I contributed, takes the first step towards connecting this to experiments by looking at how best to extract information in this setting.
Eventually, we would like to test our ideas in a lab, and thankfully Schmiedmayer is keen to work with our group to make this happen. With a set-up like Schmiedmayer’s 2018 experiment, we could potentially watch the measurement process happen in a small system, then watch it un-happen, Lock reckons. “We could then maybe show that, as the system gets larger, the ‘un-happening’ gets less likely,” he says. If we saw this, it would be evidence that MEH is on the right track. “That would be an extremely happy day,” says Lock.
So far, we have remained agnostic about what this idea means for any philosophical interpretations of quantum mechanics. But our ideas do brush up against these concepts. For instance, MEH explains what happens to all the measurement outcomes you don’t see – the other “worlds” of the many-worlds idea. They are all still here in our world; we simply can’t control the quantum system finely enough to observe them. “If we could grab hold of every single electron and control them in whatever manner we wanted, we wouldn’t be asking ourselves why the particle went left or right,” says Lock. “The idea of measurement becomes moot.”
This would remove much of the supposed mystique from wave function collapse since measurement only seems mysterious when we overlook how difficult it is in practice. As Lock puts it, it is about asking: “How do I, an inaccurate, ape-sized lump-thing, try to access something as finely detailed as the spin of an electron?”
It would also rule out the idea that collapse is a physical process that deletes information and that there is some harsh transition between classical and quantum realities. “Nobody forces you to make the classical world different from the quantum one,” says Schmiedmayer. “All you can say is that, in the classical world, the complexity is too big. I just can’t see the quantum part.”
There is another possible implication of our ideas. If you take Boltzmann’s ideas about equilibration to their extremes, you can imagine the whole universe equilibrating. If this is the case, some have speculated that long after the last stars die out in the ludicrously distant future, random fluctuations away from equilibrium will happen, resulting in sentient beings spontaneously and very briefly flickering into existence. This thought experiment, known as the “Boltzmann brain”, suggests that equilibration isn’t the end of the story for a big, dynamic system.
What happens if we push our idea to such an extreme? If we could keep track of every single subatomic particle in a detector and its environment, MEH says that you could, in principle, find all the hidden quantum information. This would almost be akin to seeing a particle in two places at once again, even after it has been measured. It is certainly an outlandish idea, and I sometimes can’t help wondering what Boltzmann would make of it if he were still around.
Well, that's the end of the precis. I found it interesting and hope others do too.
Thanks
Bill