Heisenberg's Uncertainty Principle

In summary, the conversation discusses the application of Heisenberg's Uncertainty Principle in the macroscopic world and whether it is possible to measure the position and velocity of a car with 100% accuracy. The participants debate the limitations of instruments and the effects of the Uncertainty Principle on measurements. They also touch on the boundary between classical and quantum measurements and the role of quantum mechanics in understanding the behavior of objects like cars.
  • #1
Niles
Hi all

While watching the first stage of the Tour de France, I thought of Heisenberg and his Uncertainty Principle.

Does the Uncertainty Principle also work in the macroscopic world? I.e., can I tell with 100% accuracy how fast a car is going and where it is?

Regards,
Niles.
 
  • #2
We usually treat objects like cars classically, neglecting any quantum effects. So effectively, [itex]\hbar = 0[/itex] and there is no uncertainty. But if you want to write down the wave function of the combined system of electrons, protons and neutrons that make up the car, be my guest :smile:
 
  • #3
The theoretical guts of the HUP are based on the idea that, prior to measurement, the properties of a system are fundamentally indeterminate; after you've measured it as having some particular values, you can go on to measure even canonically conjugate variables to arbitrary accuracy.
Decoherence theory would say (I think!) that any thermodynamically irreversible interaction is what constitutes a measurement. By virtue of its size, the car will be undergoing billions of such interactions (heat exchange with its surroundings, light bouncing off it, etc).
So the short answer would be no :biggrin:
 
  • #4
Niles said:
Hi all

While watching the first stage of the Tour de France, I thought of Heisenberg and his Uncertainty Principle.

Does the Uncertainty Principle also work in the macroscopic world? I.e., can I tell with 100% accuracy how fast a car is going and where it is?

Regards,
Niles.

How can you tell with 100% accuracy how fast a car is going? You have to measure it first, but the measurement itself changes its speed at the same time. So the accuracy will be less than 100%.
 
  • #5
muppet said:
The theoretical guts of the HUP is based on the idea that prior to measurement, the properties of a system are fundamentally indeterminate; after you've measured it as having some particular values you can go on to measure even canonically conjugate variables to arbitrary accuracy.
Right, but as soon as you started measuring the conjugate variables, the original values you got would immediately become unreliable.

kahoomann said:
How can you tell with 100% accuracy how fast a car is going? You have to measure it first, but the measurement itself changes its speed at the same time. So the accuracy will be less than 100%.
Suppose I set up two lasers at a very accurately measured distance, I measure when the light beams are blocked by the front of the car and I do the calculation (taking into account the processing speed of my circuit, the finite speed of the light and the signals in my wiring, etc). How does that not give me the velocity of the car to arbitrary accuracy? There is nothing in the HUP that prohibits me from measuring it as precisely as I please (and am able to set up my experiment).
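CompuChip's point, that the precision here is set by the instruments rather than by the HUP, is just ordinary error propagation. A minimal sketch with made-up numbers for the gate separation and the timing jitter (all values are hypothetical, not from the thread):

```python
import math

# Hypothetical two-gate setup: the car blocks two laser beams a distance
# d apart; we record the transit time t with some instrumental jitter.
d = 10.000        # gate separation in metres
sigma_d = 1e-4    # uncertainty in the separation (0.1 mm)
t = 0.25          # measured transit time in seconds
sigma_t = 1e-8    # timing jitter of the electronics (10 ns)

v = d / t         # average speed over the interval (40 m/s)

# First-order error propagation for v = d/t:
#   (sigma_v / v)^2 = (sigma_d / d)^2 + (sigma_t / t)^2
rel_err = math.sqrt((sigma_d / d) ** 2 + (sigma_t / t) ** 2)
sigma_v = v * rel_err

print(f"v = {v:.3f} m/s +/- {sigma_v:.1e} m/s")
```

Every term in sigma_v shrinks as the instruments improve; nothing in this classical error budget bottoms out at a Planck-constant-sized floor.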
 
  • #6
CompuChip said:
Suppose I set up two lasers at a very accurately measured distance, I measure when the light beams are blocked by the front of the car and I do the calculation (taking into account the processing speed of my circuit, the finite speed of the light and the signals in my wiring, etc). How does that not give me the velocity of the car to arbitrary accuracy?

I think this method would work, in principle, only until the measurement precision reached the wavelength of the laser light, since then we would have to take into account the exact nature of the reflection on that scale.

There is nothing in the HUP that prohibits me from measuring it as precisely as I please (and am able to set up my experiment).

Your measurements of the reflected laser beams are subject to the energy-time uncertainty principle. In order for your method to work, you need to measure the exact time at which the reflected beam arrives. But this introduces a corresponding uncertainty in the wavelength which introduces all kinds of trouble.
 
  • #7
Even in the macroscopic world you have to distinguish between "100% accuracy" and "accuracy within the limits of my instruments". If you are going to measure the speed of a car with "100% accuracy", you will need to tell us how your instruments are going to give that accuracy.
 
  • #8
Hmm, I can see we are dealing with uncertainties from the instruments and from the HUP. Let us just say that we can neglect the uncertainties from the instruments, so what they tell us is 100% accurate (it's just a thought experiment).

Isn't the setup where I measure the speed of the car equivalent to "Heisenberg's microscope" (http://en.wikipedia.org/wiki/Uncertainty_principle)? So even in the macroscopic world, 100% accuracy doesn't exist because of the HUP (still neglecting the uncertainties from the instruments)?
 
  • #9
Niles said:
Isn't the setup where I measure the speed of the car equivalent to "Heisenberg's microscope" (http://en.wikipedia.org/wiki/Uncertainty_principle)? So even in the macroscopic world, 100% accuracy doesn't exist because of the HUP (still neglecting the uncertainties from the instruments)?

Have you encountered any situation where, the MORE accurately you measure the position of a classical object, the less you are able to predict its momentum? I haven't, and I haven't seen any report of such an observation.

So lacking that fact, we have no basis to claim that the HUP exists in a classical measurement. If it does, then it shouldn't have been that weird in the first place.

Zz.
 
  • #10
ZapperZ said:
Have you encountered any situation where, the MORE accurately you measure the position of a classical object, the less you are able to predict its momentum? I haven't, and I haven't seen any report of such an observation.

So lacking that fact, we have no basis to claim that the HUP exists in a classical measurement. If it does, then it shouldn't have been that weird in the first place.

Zz.

Can you tell me where the boundary between a classical measurement and a quantum measurement is, if such a thing exists? Cars are classical objects, but they are made of atoms, so quantum mechanics still applies.
That's why Schrodinger used his cat to show how absurd it is, since a cat is a classical object.

Following quote is from http://en.wikipedia.org/wiki/Uncertainty_principle
In quantum mechanics, the position and velocity of particles do not have precise values, but have a probability distribution. There are no states in which a particle has both a definite position and a definite velocity. The narrower the probability distribution is in position, the wider it is in momentum.

Physically, the uncertainty principle requires that when the position of an atom is measured with a photon, the reflected photon will change the momentum of the atom by an uncertain amount inversely proportional to the accuracy of the position measurement. The amount of uncertainty can never be reduced below the limit set by the principle, regardless of the experimental setup.
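The reciprocity in the quoted passage, the narrower the position distribution, the wider the momentum one, can be checked numerically for a Gaussian wave packet, for which the product of the spreads equals exactly hbar/2. A sketch in natural units (hbar = 1), using an FFT to obtain the momentum-space distribution:

```python
import numpy as np

hbar = 1.0  # natural units; only the product delta_x * delta_p matters

N = 4096
L = 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

for sigma in (0.5, 1.0, 2.0):
    # Gaussian wave packet of width sigma, normalised on the grid
    psi = np.exp(-x**2 / (4 * sigma**2))
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

    prob_x = np.abs(psi)**2
    delta_x = np.sqrt(np.sum(x**2 * prob_x) * dx)   # <x> = 0 by symmetry

    # momentum-space wavefunction via FFT (phases drop out of |phi|^2)
    phi = np.fft.fftshift(np.fft.fft(psi)) * dx
    p = np.fft.fftshift(np.fft.fftfreq(N, d=dx)) * 2 * np.pi * hbar
    dp = p[1] - p[0]
    prob_p = np.abs(phi)**2
    prob_p /= np.sum(prob_p) * dp
    delta_p = np.sqrt(np.sum(p**2 * prob_p) * dp)

    print(f"sigma={sigma}: dx*dp = {delta_x * delta_p:.4f} (hbar/2 = {hbar/2})")
```

For the Gaussian the bound is saturated; any other pulse shape gives a strictly larger product.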
 
  • #11
kahoomann said:
Can you tell me where the boundary between a classical measurement and a quantum measurement is, if such a thing exists? Cars are classical objects, but they are made of atoms, so quantum mechanics still applies.
That's why Schrodinger used his cat to show how absurd it is, since a cat is a classical object.

Following quote is from http://en.wikipedia.org/wiki/Uncertainty_principle
In quantum mechanics, the position and velocity of particles do not have precise values, but have a probability distribution. There are no states in which a particle has both a definite position and a definite velocity. The narrower the probability distribution is in position, the wider it is in momentum.

Physically, the uncertainty principle requires that when the position of an atom is measured with a photon, the reflected photon will change the momentum of the atom by an uncertain amount inversely proportional to the accuracy of the position measurement. The amount of uncertainty can never be reduced below the limit set by the principle, regardless of the experimental setup.

I've cited several papers already in https://www.physicsforums.com/showthread.php?t=241396 on experiments trying to induce more and more decoherence onto a system. A paper that I had highlighted showed that even ONE interaction with another particle can destroy the single-particle state and make it behave in a more classical fashion.

Here's what we know: classical system and quantum system are very much different from each other. Where they meet, and how they meet, no one knows. We have seen "large" systems that can behave quantum mechanically, ranging from something as large as a buckyball, to a superfluid having 10^11 particles. However, these are quantum systems, not classical. To simply infer that there is a continuous crossover between classical and quantum systems is to trivialize the problem via an unfounded speculation. If it is that trivial, why are we continuing to pour money and effort into studying this boundary?

Zz.
 
  • #12
To simply infer that there is a continuous crossover between classical and quantum systems is to trivialize the problem via an unfounded speculation. If it is that trivial, why are we continuing to pour money and effort into studying this boundary?

ZZ, Phillip Ball and decoherence theory are addressing the transition between quantum superpositions and classical mixtures. The reason we need to address this is that the existence of quantum superpositions contradicts normal classical experiments (I'm not talking about BEC, etc.).

In contrast, the HUP does not contradict normal macroscopic experiments. There is no need to search for a HUP transition because there is no contradiction: the effects of the HUP become vanishingly small at macroscopic scales.

Similarly, we do not need to explore the transition between continuous and discrete energy states, because we can show that discrete states become arbitrarily close together as h -> 0.
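The claim that discrete states become arbitrarily close together can be made concrete with the particle-in-a-box spectrum, E_n = n^2 h^2 / (8 m L^2): instead of letting h shrink, let the mass and box grow to macroscopic size. A rough numerical sketch (the masses and box widths are illustrative choices, not from the thread):

```python
h = 6.626e-34  # Planck's constant, J*s

def level_spacing(m, L, n=1):
    """Energy gap E_{n+1} - E_n for a particle of mass m in a 1-D box of width L."""
    return (2 * n + 1) * h**2 / (8 * m * L**2)

# electron (9.1e-31 kg) confined to 1 nm: the gap is on the eV scale
gap_electron = level_spacing(9.1e-31, 1e-9)

# a 0.1 kg ball "confined" to a 1 m box: the gap is absurdly tiny
gap_ball = level_spacing(0.1, 1.0)

print(f"electron in 1 nm box: {gap_electron:.2e} J ({gap_electron / 1.6e-19:.2f} eV)")
print(f"0.1 kg ball in 1 m box: {gap_ball:.2e} J")
```

The ball's level spacing comes out dozens of orders of magnitude below thermal energies at room temperature (~4e-21 J), which is one way of seeing why quantization never shows up for cars and snooker balls.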

Prior to decoherence theory, there was no way to make superpositions transform into mixtures by letting h get small, or m get large, etc, that is the reason for all the effort you mention.
 
  • #13
Crosson said:
ZZ, Phillip Ball and decoherence theory are addressing the transition between quantum superpositions and classical mixtures. The reason we need to address this is that the existence of quantum superpositions contradicts normal classical experiments (I'm not talking about BEC, etc.).

In contrast, the HUP does not contradict normal macroscopic experiments. There is no need to search for a HUP transition because there is no contradiction: the effects of the HUP become vanishingly small at macroscopic scales.

But that is my point. We cannot detect the effects of the HUP for classical measurements even IF they are there (and that is a big IF). That is why I asked for signatures of the HUP in classical measurements. We know how the HUP rears itself in quantum measurements, and that's how we know the HUP is valid.

It still isn't obvious when and where a quantum system becomes totally classical, where all the quantum behavior goes away. That is why this continues to be studied and tested. Maybe the experimentalist in me is rearing its ugly head again, but considering the myriad of tests and experiments being planned to study this, I'd say that the whole community of physics wants to know as well. This means that this is not a done deal where we can simply assume that there is a smooth, continuous boundary between the two.

Zz.
 
  • #14
The question was "Does HUP also work in the classical world?"

The answer should be "yes, because HUP is a fundamental property of matter and we have no reason to believe it should go away at a certain scale."

Compare this with superposition, which we know does go away. That's why we need to detect the boundary. There are no experiments that indicate the HUP goes away at some large scale, so why would we look for the transition?
Maybe the experimentalist in me is rearing its ugly head again, but considering the myriad of tests and experiments being planned to study this, I'd say that the whole community of physics wants to know as well.

What does your 'this' refer to? I don't know of any experiments being planned to test the HUP in the classical domain. If 'this' refers to 'the classical/quantum transition' then I think that statement is misleadingly vague, because the transition they are interested in is from superpositions to mixtures.
 
  • #15
Crosson said:
The question was "Does HUP also work in the classical world?"

The answer should be "yes, because HUP is a fundamental property of matter and we have no reason to believe it should go away at a certain scale."

I disagree. The HUP actually isn't "fundamental", but rather a consequence of what is sometimes called First Quantization. It is how we define what observables are and how two non-commuting observables influence each other. In fact, and I'm sure you know this, we seldom actually use the HUP in detailed calculations, and certainly not in anything based on first principles, even in QFT. While we invoke the HUP to explain various effects (zero-point energy, the deBoer effect, etc.), you seldom can get away with just using it, without any kind of formal derivation, to describe a phenomenon.

So one simply can't say the HUP works but superposition doesn't, since they are ALL related and came out of the same description. You can't simply unravel one without affecting the other. To me, the HUP is one of only a suite of consequences and properties (such as superposition, Hermiticity, pure/mixed states, etc... ) that comes along with the whole baggage.

Zz.
 
  • #16
It's worthwhile reiterating that the physical origin of the uncertainty principle has nothing to do with the act of observation. If you don't believe that, have a look at a proof of it. You won't find any reference to the wavelength of an incident photon from a gamma-ray microscope or anything like that. Remember that Heisenberg thought that a trajectory did not exist until it was measured; the HUP quantified an indeterminacy inherent in nature.
The reason so many people misunderstand the HUP is that Heisenberg, immediately after its formulation, carried out his famous thought experiment (which he infamously got wrong the first time around...) about a gamma-ray microscope. The reason he did this was to satisfy himself that you couldn't carry out an experiment which would measure something to greater precision than was allowed by the HUP.
 
  • #17
Yes, 'fundamental' was a poor word choice for the HUP. I should have said 'is derived under very general assumptions' which as Zz points out are not totally valid anyway.

So one simply can't say the HUP works but superposition doesn't, since they are ALL related and came out of the same description. You can't simply unravel one without affecting the other. To me, the HUP is one of only a suite of consequences and properties (such as superposition, Hermiticity, pure/mixed states, etc... ) that comes along with the whole baggage.

Interesting, I thought the explanation would be along the lines of 'the decoherence times are short' or 'the uncertainty is small', but you bring up the more interesting possibility of a total cutoff for quantum effects, rather than an asymptotic vanishing.
 
  • #18
Crosson said:
Interesting, I thought the explanation would be along the lines of 'the decoherence times are short' or 'the uncertainty is small', but you bring up the more interesting possibility of a total cutoff for quantum effects, rather than an asymptotic vanishing.

Actually, I don't think I did, because I have no idea if it is simply a smooth crossover or an abrupt phase-transition type either. So I certainly can't insist on a particular viewpoint.

Now, having said that, and having cited the paper where even just ONE interaction with another particle is sufficient to destroy the single-particle quantum state and cause the system to start to resemble a classical description, one is tempted to think that maybe the transition is abrupt. I find this to be very tantalizing, but still quite far from being convincing.

Zz.
 
  • #19
muppet said:
It's worthwhile reiterating that the physical origin of the uncertainty principle has nothing to do with the act of observation. If you don't believe that, have a look at a proof of it. You won't find any reference to the wavelength of an incident photon from a gamma-ray microscope or anything like that. Remember that Heisenberg thought that a trajectory did not exist until it was measured; the HUP quantified an indeterminacy inherent in nature.
The reason so many people misunderstand the HUP is that Heisenberg, immediately after its formulation, carried out his famous thought experiment (which he infamously got wrong the first time around...) about a gamma-ray microscope. The reason he did this was to satisfy himself that you couldn't carry out an experiment which would measure something to greater precision than was allowed by the HUP.

The gamma-ray microscope is just an example or application of HUP. No one is saying it's the proof of HUP itself.

Following quote is from http://en.wikipedia.org/wiki/Uncertainty_principle
Physically, the uncertainty principle requires that when the position of an atom is measured with a photon, the reflected photon will change the momentum of the atom by an uncertain amount inversely proportional to the accuracy of the position measurement. The amount of uncertainty can never be reduced below the limit set by the principle, regardless of the experimental setup.
 
  • #20
kahoomann said:
The gamma-ray microscope is just an example or application of HUP. No one is saying it's the proof of HUP itself.

Following quote is from http://en.wikipedia.org/wiki/Uncertainty_principle
Physically, the uncertainty principle requires that when the position of an atom is measured with a photon, the reflected photon will change the momentum of the atom by an uncertain amount inversely proportional to the accuracy of the position measurement. The amount of uncertainty can never be reduced below the limit set by the principle, regardless of the experimental setup.

But I could come in and change that, and replace it with this, which is equally valid:

One can also determine the "instantaneous" position of an atom by shooting a stream of atoms through a "slit". The ones that pass through the slit had that position the moment they passed through. Here, I used NO PHOTON to measure the position. The uncertainty in the position corresponds to the width of the slit.

I could also locate an atom using anything from an STM to an AFM. No photons there either!

So it is not a "requirement" that one must use a photon to measure a position. That's a fallacy.

Zz.
 
  • #21
Niles said:
Hi all

While watching the first stage of the Tour de France, I thought of Heisenberg and his Uncertainty Principle.

Does the Uncertainty Principle also work in the macroscopic world? I.e., can I tell with 100% accuracy how fast a car is going and where it is?

Regards,
Niles.


Yes, the uncertainty principle poses a fundamental limit on the accuracy of any measurement. A good example of a macroscopic manifestation of this effect, commonly encountered in the lab, is in electric circuits. Even if there is no EMF between two points, a voltmeter will still show fluctuations in the voltage. At room temperature this is mainly due to Nyquist noise (also inside the voltmeter itself). But if you cool everything, including your voltmeter, down to absolute zero, there are still fluctuations in the voltage.

The fluctuations are in fact fluctuations in the power (voltage times current). In a quantum mechanical description, voltage and current are operators that do not commute. They satisfy commutation relations similar to position and momentum.

You can also consider a superconducting LC circuit. Suppose the angular resonance frequency is [itex]\omega[/itex]. Then the circuit has energy levels

[tex]E_n = \left(n + \tfrac{1}{2}\right)\hbar\omega[/tex]

The lowest possible energy the LC circuit can have is thus [itex]\tfrac{1}{2}\hbar\omega[/itex]. So you always have a fluctuating voltage and current, and this is readily detected in the lab.
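The size of those residual fluctuations can be estimated from the ground-state energy. A sketch with hypothetical component values (the 1 nH / 1 pF choice is mine, not from the post); it assumes the zero-point energy is shared equally between capacitor and inductor, so that (1/2) C <V^2> = (1/4) hbar omega:

```python
import math

hbar = 1.0546e-34  # J*s

# Hypothetical superconducting LC circuit (illustrative values):
L = 1e-9   # inductance, 1 nH
C = 1e-12  # capacitance, 1 pF

omega = 1.0 / math.sqrt(L * C)   # resonance ~3.2e10 rad/s (~5 GHz)
E0 = 0.5 * hbar * omega          # zero-point energy, the n = 0 level

# Equal sharing of E0 between the two energy reservoirs gives
# (1/2) C <V^2> = (1/4) hbar omega, hence:
V_rms = math.sqrt(hbar * omega / (2 * C))

print(f"omega/2pi = {omega / (2 * math.pi):.2e} Hz")
print(f"E0 = {E0:.2e} J, V_rms = {V_rms * 1e6:.2f} microvolts")
```

With these assumed values the zero-point voltage noise comes out around a microvolt, consistent with the post's claim that such fluctuations are detectable in the lab.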
 
  • #22
First, the HUP. Of course it holds in all circumstances which are subject to quantum mechanics; uncertainty relations occur for all linear wave equations. There are two issues: first, what does it mean physically? Second, under what circumstances can it be observed in action?

What does it mean? The Fourier transform of a delta function is a constant, 1 in fact. The Fourier transform of a Gaussian is another Gaussian. For practical purposes, one can think of coordinate-momentum or time-energy (E&M) relations. The upshot of this math is that uncertainty relations, associated with linear wave equations, are responsible for the phenomena of diffraction, speaking somewhat figuratively. So the HUP is a natural consequence of the Schrodinger equation.

Let's sneak in superposition, another property of linear wave equations (one that is linear in the wave function and its derivatives). We know that for virtually any F(x), F(X-cT) is a solution of Maxwell's free-field wave equation. Suppose that F is a Gaussian with unit variance. Where's the superposition in F? Well, we can expand F(Z) in a power series, with Z = X-cT, and clearly get a superimposed solution. Then with Fourier transforms we can get other superimposed solutions, as we can also with Hermite polynomials.

What does superposition mean? In and of itself, it does not mean much of anything. It's just a useful mathematical procedure. But, just as spherical coordinates facilitate the description of central-potential problems, the frequency-wavelength approach is particularly suitable for diffraction problems. Here's where the superposition comes in, via Huygens' principle, which says that any slit through which light (or water, or...) passes acts as a distributed source of radiation; hence a superposition of sources is in play. Plug in the notion of optical path length, and the diffraction distribution will emerge, just as the velocity of a rowboat in a stream can be found by a superposition of appropriate vectors.

On the other hand, the Rutherford cross section is identical whether computed classically or quantum mechanically, and shows no signs of superposition, at least at first glance.

Superposition-- just a common garden mathematical technique, sometimes useful, sometimes not. In optics, in particular, it proves to be a very powerful tool, which makes the explanation of lots of phenomena very simple.

When is all this stuff detectable? Macroscopically, generally not. The fluctuations, ultimately induced by the HUP, are too small to be detected, and are averaged out anyway. But phenomena like superconductivity and other cooperative phenomena certainly occur -- in a sense what happens here is that lots of states combine, superimpose if you will, so that the components of the system are in lock step.

In the quantum world, "slits" must be no more than a few hundred angstroms, a few wavelengths (if I remember correctly), to generate a diffraction pattern. That's pretty small. But you can see lots of wave phenomena with water waves.

Regards,
Reilly Atkinson
 
  • #23
Heisenberg's uncertainty principle applies to the macroscopic world. Here is a problem from an old QM textbook:

A snooker ball of mass 0.1 kg rests on top of an identical ball and is stabilized by a dent [tex]10^{-4}[/tex] m (0.1 mm, still pretty big) wide on the surface of the lower ball. Calculate how long the system takes to topple, neglecting all but quantum disturbances.

Now, that's a realistic question, and an experiment like that could be carried out in theory. The answer is [tex]10^{27}[/tex] seconds... And the question is macroscopic...
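A crude order-of-magnitude check of that textbook answer (NOT the textbook's actual calculation, which has to treat the exponentially unstable balance point): the HUP forbids the ball from being both centred in the dent and exactly at rest, so take the position spread to be the dent width and estimate the time to drift out at the corresponding minimum velocity spread:

```python
hbar = 1.0546e-34  # J*s
m = 0.1            # ball mass, kg
a = 1e-4           # dent width, m

# Minimum-uncertainty initial conditions: with Delta_x ~ a, the
# velocity spread cannot be smaller than
dv = hbar / (2 * m * a)   # ~5e-30 m/s

# Time to drift a distance ~a at that residual speed:
t = a / dv                # ~2e25 s

print(f"dv ~ {dv:.1e} m/s, drift time ~ {t:.1e} s")
```

This back-of-the-envelope number lands within a couple of orders of magnitude of the quoted 10^27 s; either way, it is vastly longer than the age of the universe.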
 
  • #24
Crazy Tosser said:
Heisenberg's uncertainty principle applies to the macroscopic world. Here is a problem from an old QM textbook:

A snooker ball of mass 0.1 kg rests on top of an identical ball and is stabilized by a dent [tex]10^{-4}[/tex] m (0.1 mm, still pretty big) wide on the surface of the lower ball. Calculate how long the system takes to topple, neglecting all but quantum disturbances.

Now, that's a realistic question, and an experiment like that could be carried out in theory. The answer is [tex]10^{27}[/tex] seconds... And the question is macroscopic...

"... an experiment like that could be carried out in theory..."

The same can be said about any String Theory predictions.

A question in a textbook tends to IGNORE a whole lot of other non-negligible factors, because the pedagogical idea in presenting it is to test knowledge of a principle, NOT to present a realistic case. A "ball" isn't a coherent object. That is the most crucial aspect of this. It is why all the threads asking for the probability of a ball "tunneling" through a wall (it could be "carried out in theory") are rather meaningless unless one simply ignores a whole set of realistic conditions and makes it into an unrealistic problem. That is what is going on here. It is why it takes a superconductor to produce a coherent, "macroscopic" object with 10^11 particles where quantum effects are evident. It isn't easy!

Zz.
 
  • #25
The fact that a pencil balanced on its tip cannot remain in that state for longer than a minute, due to quantum fluctuations alone, is a better example. The fact that the pencil isn't a coherent quantum object is not relevant in this estimate.
 
  • #26
Count Iblis said:
The fact that a pencil balanced on its tip cannot remain in that state for longer than a minute, due to quantum fluctuations alone, is a better example. The fact that the pencil isn't a coherent quantum object is not relevant in this estimate.

Actually, that is a bad example. Since the object isn't coherent, "quantum fluctuations" would be completely random and thus cancel each other out. When you have random "noise", averaging it out over time or over large numbers cancels its effect. If not, you have to explain why there is an asymmetry in the collective effect that causes it to topple to one side.

I'd like to see an experimental determination arguing that this effect is detectable. Considering how difficult and how minute an effect it is to even start seeing the Casimir effect, I am highly skeptical that one can make such an argument with this pencil example.

Zz.
 
  • #27
ZapperZ said:
But I could come in and change that, and replace it with this, which is equally valid:

One can also determine the "instantaneous" position of an atom by shooting a stream of atoms through a "slit". The ones that pass through the slit had that position the moment they passed through. Here, I used NO PHOTON to measure the position. The uncertainty in the position corresponds to the width of the slit.

I could also locate an atom using anything from an STM to an AFM. No photons there either!

So it is not a "requirement" that one must use a photon to measure a position. That's a fallacy. Zz.

Can you tell me what's the cause of the HUP if it has nothing to do with measurement?

The following quote is from PF Mentor HallsofIvy. Do you mean the PF Mentor was wrong?
https://www.physicsforums.com/showthread.php?t=124886

HallsofIvy said:
How do you determine the position? Imagine you have a really large magnifying glass so that you can actually see the electron. You still have to shine a light on it so that you can see it. That light carries momentum; furthermore, to get a very accurate position you will need to use small-wavelength light (you can only get the position to within a half wavelength of the light you use), which has higher momentum. Although you can use the two positions to accurately calculate the speed (and momentum) of the electron before you shone the light on it, the light itself changes the momentum.
 
  • #28
kahoomann said:
Can you tell me what's the cause of the HUP if it has nothing to do with measurement?

It comes from the basic postulates of quantum mechanics: Hermitian operators, Hilbert space vectors that satisfy the Cauchy-Schwarz inequality, and the fundamental commutation relation:

[tex][\hat{x},\hat{p}] = i \hbar[/tex]

For a full derivation see any undergraduate quantum mechanics textbook, or have a look at the following link:

http://en.wikipedia.org/wiki/Uncertainty_principle#Derivations

kahoomann said:
Following quote is from PF Mentor: HallsofIvy; Do you mean PF mentor was wrong?

No, Halls gave a standard intuitive description of the HUP, but this is nothing more than a thought experiment to help make the HUP seem more intuitive. The thought experiment was first put out by Heisenberg himself, but again this was only to make sense of the result he had rigorously derived earlier.
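The commutation relation at the heart of the derivation linked above can be checked numerically by representing the momentum operator as a finite-difference derivative on a grid: for any smooth psi, [x, p] psi should reproduce i*hbar*psi. A sketch in natural units (hbar = 1); the grid sizes and test function are arbitrary choices:

```python
import numpy as np

hbar = 1.0
N, L = 2000, 20.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

def p_op(f):
    """Momentum operator -i*hbar d/dx via central differences."""
    return -1j * hbar * np.gradient(f, dx)

# smooth test function, well localised away from the grid edges
psi = np.exp(-x**2) * (1 + 0.3 * x)

# [x, p] psi = x*(p psi) - p*(x psi); analytically this equals i*hbar*psi
comm = x * p_op(psi) - p_op(x * psi)

# compare away from the boundaries, where central differences are accurate
interior = slice(100, -100)
err = np.max(np.abs(comm[interior] - 1j * hbar * psi[interior]))
print(f"max |[x,p]psi - i*hbar*psi| = {err:.2e}")
```

The residual is limited only by the finite-difference truncation error, which shrinks as O(dx^2).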
 
  • #29
ZapperZ said:
Actually, that is a bad example. Since the object isn't coherent, "quantum fluctuations" would be completely random and thus cancel each other out. When you have random "noise", averaging it out over time or over large numbers cancels its effect. If not, you have to explain why there is an asymmetry in the collective effect that causes it to topple to one side.

I'd like to see an experimental determination arguing that this effect is detectable. Considering how difficult and how minute an effect it is to even start seeing the Casimir effect, I am highly skeptical that one can make such an argument with this pencil example.

Zz.


Hmmmm, I'm not sure I understand this. What's wrong with the usual procedure of separating out the center of mass movement of a many particle system in quantum mechanics?
 
  • #30
Count Iblis said:
Hmmmm, I'm not sure I understand this. What's wrong with the usual procedure of separating out the center of mass movement of a many particle system in quantum mechanics?

Now it's my turn to say that I don't understand this. How are you able to "separate" out the center of mass movement of a "many-particle" component of a pencil? Who has done this? And how are you then able to detect such fluctuation experimentally?

Zz.
 
  • #31
<< Can you tell me what's the cause of the HUP if it has nothing to do with measurement? >>

For particles, it does have to do with measurement. Zapper's example of using a slit to locate a "photon" is confusing because he hasn't defined what he means by a "photon". Presumably he's thinking of a "photon" as a point particle of some kind. But then he should be more specific about the dynamics of that photon, i.e., whether it moves according to a path integral or a de Broglie-Bohm guidance equation, or a Wigner function.

I think there is lots of confusion on this board because no one has really stated and distinguished the three different formulations of the HUP, which relate to

1) the structure of the wavefunction/packet.

2) a property of an ensemble of particles.

3) a dynamical property of measuring a single particle.

The Heisenberg microscope is an example of 3), and it isn't a very valid example because it is actually a semiclassical thought experiment, not a quantum mechanical one. But the examples of 1) and 2) have not yet been specified here.

An example of 1) is the general property of a Fourier analysis on waves of any kind, classical or quantum mechanical. In the case of a general wave packet, it is manifested as a reciprocity relation between the wavenumber spread [itex]\Delta k[/itex] and the spatial pulse width [itex]\Delta x[/itex], given by [itex]\Delta k \, \Delta x \sim 1[/itex], or between the frequency bandwidth [itex]\Delta \omega[/itex] and the time width [itex]\Delta t[/itex], given by [itex]\Delta \omega \, \Delta t \sim 1[/itex]. Quantum mechanics differs only in that you multiply both sides of these reciprocity relations by hbar (or just apply the de Broglie relation [itex]\Delta p = \hbar \, \Delta k[/itex]), implying that the magnitude of the relations is very small (on quantum mechanical length scales), and this is basically the HUP for quantum wave packets. It says nothing about point particles, which are ultimately all we see in physical experiments.

An example of 2) is the operator algebra that Heisenberg historically formulated and applied to experiments. Starting from the commutation relation

[tex][\hat{x},\hat{p}] = i\hbar[/tex], where [tex]\hat{x} = x[/tex] and [tex]\hat{p} = -i\hbar\,\frac{d}{dx}[/tex],

one can then use these position and momentum operators to compute the expectation values of observables by

[tex]\langle x \rangle = \int \bar{\psi}\,\hat{x}\,\psi\,dx[/tex]

[tex]\langle p \rangle = \int \bar{\psi}\,\hat{p}\,\psi\,dx[/tex]

and then

[tex]dx = \sqrt{\langle x^2 \rangle - \langle x \rangle^2}[/tex]

[tex]dp = \sqrt{\langle p^2 \rangle - \langle p \rangle^2}[/tex]

Physically, what the example of 2) means is the following. Suppose you have an initial wavefunction psi_0(x), so that the particles are distributed according to [tex]\rho(x) = R_0^2(x)[/tex]. Prepare an ensemble of N such systems and, at time t, measure the x-coordinate of the ith particle, x_i(t), across the ensemble; from these N position measurements evaluate the mean position and the spread dx defined above. Then, starting from the same initial wavefunction, perform a series of momentum measurements in the x-direction, letting the ith particle in the ensemble at time t be found with momentum p_x,i(t), and from these N momentum measurements evaluate the mean momentum and the spread dp. The product of dx and dp turns out to satisfy

[tex]dx\,dp \geq \hbar/2.[/tex]

Notice that a single position or momentum measurement is not sufficient for these definitions to work; they require an *ensemble* of measurements. This tells us that the HUP is not a property of any individual particle, but rather of an *ensemble* of particle measurements.
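As a toy illustration of this ensemble reading (again my own sketch, with hbar = 1), one can simulate the two sub-ensembles for a minimum-uncertainty Gaussian packet: position measurements drawn from |psi(x)|^2, and momentum measurements drawn from |phi(p)|^2. The momentum distribution is put in by hand from the known analytic result for a Gaussian, so this checks consistency rather than deriving anything:

```python
import numpy as np

# Ensemble reading of the HUP, hbar = 1. Two independently prepared
# sub-ensembles of identical Gaussian wave packets: one measured in
# position, the other in momentum. For a minimum-uncertainty Gaussian,
# |psi(x)|^2 is N(0, sigma^2) and |phi(p)|^2 is N(0, (hbar/(2 sigma))^2).

rng = np.random.default_rng(0)
hbar = 1.0
sigma = 2.0        # position spread of |psi|^2
N = 200_000        # ensemble size

x_samples = rng.normal(0.0, sigma, N)             # position sub-ensemble
p_samples = rng.normal(0.0, hbar / (2 * sigma), N)  # momentum sub-ensemble

dx = x_samples.std()
dp = p_samples.std()
print(dx * dp)     # ~ 0.5 = hbar/2, saturating the bound
```

Note that no single run of the experiment yields dx or dp; each is a statistical spread over many identically prepared systems, exactly as in the definition above.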

There are of course uncertainty relations for other conjugate quantities in QM, such as for angular momentum, and the same general rules above apply. One comment however about the energy-time uncertainty relation. Wikipedia on the uncertainty relation gets it exactly right, so I will actually just quote them:

"Since energy has the same relation to time as momentum does to space in special relativity, it was clear to many early founders that the following relation holds:

[tex]dE\,dt \geq \hbar/2,[/tex]

but it was not clear what dt is, because the time at which the particle has a given state is not an operator belonging to the particle, it is a parameter describing the evolution of the system. Lev Landau once joked "To violate the time-energy uncertainty relation all I have to do is measure the energy very precisely and then look at my watch!"

Einstein and Bohr however understood the meaning of the principle. A state which only exists for a short time cannot have a definite energy. In order to have a definite energy, the frequency of the state needs to be accurately defined, and this requires the state to hang around for many cycles, the reciprocal of the required accuracy.

For example, in spectroscopy, excited states have a finite lifetime. By the time-energy uncertainty principle, they do not have a definite energy, and each time they decay the energy they release is slightly different. The average energy of the outgoing photon has a peak at the theoretical energy of the state, but the distribution has a finite width called the natural linewidth. Fast-decaying states have a broad linewidth, while slow decaying states have a narrow linewidth.
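For a rough sense of scale (a back-of-the-envelope sketch; the lifetime is chosen as a typical atomic value, not taken from the thread), the energy-time relation gives the order of magnitude of the natural linewidth directly:

```python
# Order-of-magnitude natural linewidth from dE ~ hbar / tau
# (the Lorentzian FWHM of an exponentially decaying state is hbar/tau).

hbar = 1.054571817e-34   # J*s
eV = 1.602176634e-19     # J per eV

tau = 1e-8               # s, typical lifetime of an allowed atomic transition
dE = hbar / tau          # J
print(dE / eV)           # ~ 6.6e-8 eV: an extremely narrow line
```

Compared with optical transition energies of a few eV, this width is tiny, which is why natural linewidths are usually swamped by Doppler and collisional broadening.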

One false formulation of the energy-time uncertainty principle says that measuring the energy of a quantum system to an accuracy ΔE requires a time interval Δt > h / ΔE. This formulation is similar to the one alluded to in Landau's joke, and was explicitly invalidated by Y. Aharonov and D. Bohm in 1961. The time Δt in the uncertainty relation is the time during which the system exists unperturbed, not the time during which the experimental equipment is turned on."

The question that one should now ask is how can one relate the uncertainty relation for wave packets to the empirically deduced uncertainty relation for particles in experiments. The pilot wave theory of de Broglie and Bohm gives an exact answer to this question which I can go into if anyone is interested.
 
  • #32
>> The pilot wave theory of de Broglie and Bohm gives an exact answer to this question which I can go into if anyone is interested. <<
Yes, I'm interested.
Is there wavefunction collapse in the pilot wave theory of de Broglie and Bohm?
 
  • #33
kahoomann said:
>> The pilot wave theory of de Broglie and Bohm gives an exact answer to this question which I can go into if anyone is interested. <<
Yes, I'm interested.
Is there wavefunction collapse in the pilot wave theory of de Broglie and Bohm?


Great. There is no wavefunction collapse in the de Broglie-Bohm theory. There is, however, branching of the wavefunction out of an initial superposition state (after a measurement interaction), and only one branch goes on to pilot the eventually observed point particle. The other branches, now having distinct supports in configuration space, simply propagate away according to Schroedinger's equation. This is called "effective collapse", because it explains why wavefunctions appear to collapse in the phenomenological formalism of textbook QM. To see how the HUP is derived for particles guided by wavepackets, and how the measurement theory works in pilot wave theory, please read this short review paper:

(Week 5 of the Perimeter Institute Interpretation of Quantum Mechanics Lecture Course series):
http://www.iqc.ca/~qipcourse/interpr...-09-10-dBB.pdf
http://www.iqc.ca/~qipcourse/interpret/

Let me know if you have any questions about this.
 
  • #34
ZapperZ said:
Now it's my turn to say that I don't understand this. How are you able to "separate" out the center of mass movement of a "many-particle" component of a pencil? Who has done this? And how are you then able to detect such fluctuation experimentally?

Zz.

Ok, let's look at a slightly different problem: "quantum superpositions of a mirror", http://arxiv.org/abs/quant-ph/0210001

We propose a scheme for creating quantum superposition states involving of order $10^{14}$ atoms via the interaction of a single photon with a tiny mirror. This mirror, mounted on a high-quality mechanical oscillator, is part of a high-finesse optical cavity which forms one arm of a Michelson interferometer. By observing the interference of the photon only, one can study the creation and decoherence of superpositions involving the mirror. All experimental requirements appear to be within reach of current technology.
 
  • #35
Count Iblis said:
Ok, let's look at a slightly different problem: "quantum superpositions of a mirror", http://arxiv.org/abs/quant-ph/0210001

If you do a search on my posts, you'll see that I've mentioned this several times, including the exact reference to the PRL paper.

So what about this paper is relevant here?

Zz.
 
