Is Schrodinger's cat experiment a paradox?

  • Thread starter rede96
In summary, the video explained that in the thought experiment, an atom with a known half-life is placed in a box with a Geiger counter, poison, and the cat. It was explained that statistically the cat had a 50/50 chance of being alive or dead, but according to the Copenhagen interpretation the cat was both alive and dead until someone observed it. According to the video, the problem with the Copenhagen interpretation is that it does not require a person to make the observation, which is why the cat was both alive and dead in the set-up.
  • #36
rede96 said:
Sorry for the stupid question, but how do we know they didn't have a defined spin when they left the cannon if the measurement system we use to detect the spin will affect the result?

You need to first understand QM before delving into the foundations. It seems like you're still confusing the lack of knowledge about a system (where a measurement simply reveals the state the system was in all along) with superposition in the context of QM. The latter cannot be interpreted in that classical way! If we did interpret superpositions that way, then we would run into a contradiction when time-evolving a system backwards using the Schrodinger equation.
 
  • #37
Maui said:
This is assumed, right? And it is the environment that causes the phases to shift out of coherence, and this purported environment is somehow 'classical'. Sorry, but if I understand this correctly, you are talking about religion, not science, and if this is the progress you allude to, we are in the middle of nowhere with no hope whatsoever of an adequate solution to the MP.

Decoherence is not a religion and there is no Measurement Problem according to this:
http://physics.about.com/od/quantumphysics/fl/Decoherence-and-the-Measurement-Problem.htm
There is no clear dividing line between the quantum and classical worlds; it's just that the interference effects become much less noticeable in the latter.
 
  • #38
Jilang said:
Decoherence is not a religion and there is no Measurement Problem according to this:
http://physics.about.com/od/quantumphysics/fl/Decoherence-and-the-Measurement-Problem.htm
There is no clear dividing line between the quantum and classical worlds; it's just that the interference effects become much less noticeable in the latter.
Did you read your own link? It says the same things I outlined in my previous post. How could that be legitimate science?

" As the phase angles lose their coherence through interaction with the surrounding environment, the result is classical probability and the absence of quantum behavior, which means that traditional methods of analyzing the situation using classical physics techniques is a valid approximation."How is that science instead of bad philosophy(aka religion)?BTW, one should maintain a truly sceptical position anytime someone makes a statement that the MP has been resolved, esp. if no Nobel Prizes had been awarded.
 
  • #39
Jilang said:
Decoherence is not a religion and there is no Measurement Problem according to this...

Decoherence doesn't solve the measurement problem. That's not even the intent of the decoherence program. You shouldn't use "About.com" as a resource, just sayin'
 
  • #40
What is the decoherence program and what are its intentions? It sounds kinda sinister!
 
  • #41
Jilang said:
What is the decoherence program and what are its intentions? It sounds kinda sinister!
To attempt to identify the mechanism that decoheres coherent states.

Its primary aim seems to be, somehow and if possible, to engage the 'environment', the measuring apparatus, and a global wavefunction of the world, and, in the end, if enough people accept its assumptions, to establish the idea among peers that collapse of the wavefunction is unnecessary.

Just my impressions from reading this board.
 
  • #42
Jilang said:
What is the decoherence program and what are its intentions? It sounds kinda sinister!

Isn't all of this sinister in the end? :biggrin:

Read p.6 of this paper: http://arxiv.org/pdf/1402.1217.pdf
 
  • #43
Maui said:
This is assumed, right? And it is the environment that causes the phases to shift out of coherence, and this purported environment is somehow 'classical'. Sorry, but if I understand this correctly, you are talking about religion, not science, and if this is the progress you allude to, we are in the middle of nowhere with no hope whatsoever of an adequate solution to the MP.

Decoherence is not a solution to the measurement problem, but you can easily prove to yourself that it happens with a simple computer program. The environment is not "classical", just large compared to the system being measured. Decoherence definitely happens; the question is how general and how important it is. It seems to "solve" the measurement problem for larger systems (it takes a quantum statistical problem and turns it into a "classical" statistical problem), but where individual particles are concerned, it is unclear. Why should we see just one eigenvalue? There still must be a measurement postulate.
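
For anyone who wants to try exactly that, here is a minimal sketch of the kind of program DrewD describes (my own toy example, not his: one system qubit coupled to a handful of environment qubits through random controlled phases, followed by a partial trace over the environment; everything in it is quantum):

[code]
import numpy as np

rng = np.random.default_rng(0)
N = 8                                        # number of environment qubits
plus = np.array([1.0, 1.0]) / np.sqrt(2)     # (|0> + |1>)/sqrt(2)

# Joint initial state: system qubit (first factor) and N environment qubits,
# all in the superposition |+>, as one big tensor product.
psi = plus.copy()
for _ in range(N):
    psi = np.kron(psi, plus)

# "Evolution": environment qubit k picks up a random phase exp(i*theta_k)
# only on the branch where the system qubit is |1> (controlled-phase coupling).
dim_env = 2 ** N
psi = psi.astype(complex).reshape(2, dim_env)   # row 0: system |0>, row 1: system |1>
thetas = rng.uniform(0, 2 * np.pi, size=N)
for k, theta in enumerate(thetas):
    bit_k = (np.arange(dim_env) >> (N - 1 - k)) & 1   # value of env qubit k
    psi[1, bit_k == 1] *= np.exp(1j * theta)

# Reduced density matrix of the system = partial trace over the environment.
rho_system = psi @ psi.conj().T
print(np.round(rho_system, 4))
# The diagonal stays 1/2, 1/2, but the off-diagonal (coherence) terms shrink
# rapidly as N grows, so the state looks ever more like the mixture
# 1/2 |0><0| + 1/2 |1><1| once the environment is ignored.
[/code]

The joint state stays pure the whole time; the interference terms only disappear when you trace out the environment, which is the whole point.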
 
  • #44
DrewD said:
It seems to "solve" the measurement problem for larger systems (it takes a quantum statistical problem and turns it into a "classical" statistical problem)


How? You should not assume the Born rule in order to derive decoherence, as that leads to the idea of privileged decohering chairs, apparatuses, etc. Namely, what you intend to prove is already there. It's injecting classicality to derive classicality. Doesn't sound very convincing, does it?
 
  • #45
ChrisVer said:
About the last link, could someone give an actual example of a mixed and a pure state? I'm having some confusion understanding the difference...
Does it say that a particle could be described by a
pure state
[itex]|\Psi\rangle = \frac{1}{\sqrt{2}} \left( |0\rangle \pm |1\rangle \right)[/itex]
while in a mixed state it can be either 0 or 1?
[itex]|\Psi\rangle = |0\rangle[/itex] or [itex]|\Psi\rangle = |1\rangle[/itex]

The issue is that states are not simply elements of a vector space, as some books, especially those at the intermediate level like Griffiths, will tell you. They are in fact positive operators of unit trace, defined by the general form of the Born Rule. To really grasp this you need to see the two axioms of QM as detailed by Ballentine in his text.

1. To each observation there corresponds a Hermitian operator whose eigenvalues give the possible outcomes of the observation.
2. There exists a positive operator of unit trace P such that the expected outcome of the observation associated with the observable O is E(O) = Trace (PO) - this is the Born Rule in its most general form. By definition P is called the state of the system.

In fact the Born Rule is not entirely independent of the first axiom, as to a large extent it is implied by it via Gleason's Theorem - but that would take us too far afield - I simply mention it in passing.

Also note that the state, just like probabilities, is simply an aid in calculating expected outcomes. It's not real like, say, an electric field. In some interpretations it is real - but the formalism of QM is quite clear - it is simply, like probabilities, an aid in calculation.

By definition, states of the form |x><x| are called pure. States that are a convex sum of pure states are called mixed, i.e. are of the form ∑ pi |xi><xi| where the pi are positive and sum to one. It can be shown that all states are either pure or mixed. Applying the Born rule to a mixed state shows that if you have an observation whose eigenvectors are the |xi>, then outcome |xi><xi| will occur with probability pi. Physically one can interpret this as a system in state |xi><xi| randomly presented for observation with probability pi. In such a case no collapse occurs and an observation reveals what's there prior to observation - many issues with QM are removed. Such states are called proper mixed states.

Pure states, being defined by a single element of a vector space, can be associated with those elements, and that's what's usually done. Of course when you do that they obey the vector space properties, so the principle of superposition holds, i.e. if |x1> and |x2> are any two pure states, a linear combination is also a pure state. This is what is meant by a superposition. Note it deals with elements of a vector space, not with convex sums of pure states considered as operators - those are mixed states. This means the state (|x1> + |x2>)/√2 is a pure state and is totally different from the mixed state 1/2 |x1><x1| + 1/2 |x2><x2|.

Now what decoherence does is transform a superposition like (|x1> + |x2>)/√2 into a mixed state like 1/2 |x1><x1| + 1/2 |x2><x2|. When that is done it can be interpreted as a proper mixed state, which resolves many of the issues with collapse etc.
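
To make the difference concrete, here is a small numerical check (my own sketch, taking |x1>, |x2> to be the computational basis states |0>, |1>): both states give identical statistics when you measure in the |x1>, |x2> basis, but only the pure superposition shows interference in another basis.

[code]
import numpy as np

x1 = np.array([[1.0], [0.0]])                     # |x1>
x2 = np.array([[0.0], [1.0]])                     # |x2>

psi = (x1 + x2) / np.sqrt(2)                      # pure superposition (|x1> + |x2>)/sqrt(2)
rho_pure = psi @ psi.T                            # |psi><psi|
rho_mixed = 0.5 * x1 @ x1.T + 0.5 * x2 @ x2.T     # 1/2 |x1><x1| + 1/2 |x2><x2|

print(rho_pure)     # [[0.5 0.5] [0.5 0.5]] - off-diagonal coherences present
print(rho_mixed)    # [[0.5 0. ] [0.  0.5]] - coherences absent

# Born rule E(O) = Trace(rho O), projecting onto |+> = (|x1> + |x2>)/sqrt(2):
P_plus = psi @ psi.T                              # projector |+><+|
print(np.trace(rho_pure @ P_plus))                # 1.0 - interference
print(np.trace(rho_mixed @ P_plus))               # 0.5 - no interference
[/code]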

Hope this helps.

Thanks
Bill
 
  • #46
Maui said:
This is assumed, right? And it is the environment that causes the phases to shift out of coherence, and this purported environment is somehow 'classical'. Sorry, but if I understand this correctly, you are talking about religion, not science, and if this is the progress you allude to, we are in the middle of nowhere with no hope whatsoever of an adequate solution to the MP.

Of course it's an assumption. Randomly pick objects and they have random properties - it's almost (but not quite) the definition of randomly picking. It's made all the time by actuaries, statisticians, engineers, applied mathematicians, etc. It really is utterly trivial, and if it's invalid we are really up shite creek without a paddle, not just in QM but in many, many areas.

But still, yes, it's an assumption - just one you are scraping the bottom of the barrel to doubt, IMHO. The desperation some are driven to in order to hold certain views in QM never ceases to amaze me.

Thanks
Bill
 
  • #47
rede96 said:
I wasn't too sure what you meant by that to be honest.

I am pretty sure he is just presenting the Many Worlds take on it.

Mathematically it's a very, very beautiful interpretation - utterly beguiling and breathtaking in its elegance and simplicity. I like it very much and even got David Wallace's textbook on it:
https://www.amazon.com/dp/0199546967/?tag=pfamazon01-20

Suffice to say it's not my preferred interpretation, but really, deciding between interpretations without experimental support is philosophy, which, correctly IMHO, is off topic here. We can discuss interpretations and explain them, but philosophically dissecting them is better suited to a philosophy forum, not one like this devoted to physics.

So I will simply say: selecting a preferred interpretation without experimental support is an opinion - opinions are like bums, everyone has one - and having one doesn't make it correct.

Thanks
Bill
 
  • #48
bhobba said:
Of course it's an assumption. Randomly pick objects and they have random properties - it's almost (but not quite) the definition of randomly picking. It's made all the time by actuaries, statisticians, engineers, applied mathematicians, etc. It really is utterly trivial, and if it's invalid we are really up shite creek without a paddle, not just in QM but in many, many areas.

But still, yes, it's an assumption - just one you are scraping the bottom of the barrel to doubt, IMHO. The desperation some are driven to in order to hold certain views in QM never ceases to amaze me.

Thanks
Bill
I do not see the justification for splitting the world into classical and quantum, and you have not provided one. If you need a classical world to get decoherence, why would you need decoherence at all? You'd already have a classical world of environments, apparatuses, tables and chairs... there is nothing to explain.
It really is utterly trivial, and if it's invalid we are really up shite creek without a paddle, not just in QM but in many, many areas.
Are we not?
 
  • #49
Jilang said:
Decoherence is not a religion and there is no Measurement Problem according to this:
http://physics.about.com/od/quantumphysics/fl/Decoherence-and-the-Measurement-Problem.htm

Decoherence is most definitely NOT a religion.

But without reading the link I can assure you there is a measurement problem even with decoherence.

First you need to understand what the measurement problem is. It's not really the collapse issue you read about, because most of the time the system is destroyed by measurement. What it really is, basically, is understanding what the dickens is going on with the Born rule. It has a number of parts, such as the preferred basis problem and, probably the toughest one of all, why we get any outcomes at all. There are others as well - if you want to go deeply into it, get Schlosshauer's textbook:
https://www.amazon.com/dp/3642071422/?tag=pfamazon01-20

Now decoherence definitely explains some parts of the measurement problem, and probably other parts such as the preferred basis problem (further research is required to be sure); however, it is powerless to explain why we get any outcomes at all. That's trivial in Many Worlds, Bohmian Mechanics, etc. - but the formalism of QM by itself simply says: you get outcomes.

Then there are unresolved issues like the so-called factoring problem.

Jilang said:
There is no clear dividing line between the quantum and classical worlds; it's just that the interference effects become much less noticeable in the latter.

That's true - but that in itself creates issues.

Decoherence has made great strides in allowing us to understand the emergence of a classical world, and in formulating a fully quantum theory of measurement - but issues still remain - although of course research is ongoing.

Thanks
Bill
 
  • #50
Maui said:
I do not see the justification for splitting the world into classical and quantum and you have not provided one.

But I don't do that, nor does decoherence do that.

In fact that's the big advantage of decoherence - it avoids that, and allows you to define a measurement as occurring after decoherence - nothing classical at all.

The measurement problem is still there for reasons I stated in my previous post. It has enough issues without giving it problems it doesn't have.

Thanks
Bill
 
  • #51
Jilang said:
What is the decoherence program and what are its intentions? It sounds kinda sinister!

Get Schlosshauer's book:
https://www.amazon.com/dp/3642071422/?tag=pfamazon01-20

It sheds a lot of light on the measurement problem, solves some issues with it, helps in understanding the quantum-to-classical transition, is of great importance in endeavors like constructing a quantum computer, and is quite interesting in its own right.

Thanks
Bill
 
  • #52
Maui said:
How? You should not assume the Born rule in order to derive decoherence, as that leads to the idea of privileged decohering chairs, apparatuses, etc. Namely, what you intend to prove is already there. It's injecting classicality to derive classicality. Doesn't sound very convincing, does it?

You do not need to assume the Born rule for decoherence. Nothing classical is assumed. I don't know where you got that idea. You take a state that is coherent, take the tensor product with a much larger system (also quantum), evolve the combined system, take the partial trace to recover the original system, and it will be found in a more diagonal state than it was before. That is decoherence (obviously there is a bit more to it). This is probably a separate issue from the original topic.

I also don't think that it solves the measurement problem. The reason for my comment that it "solves" the measurement problem is that a many-body system will behave in a classical statistical way rather than a quantum statistical way after decoherence. A diagonal density matrix has the same behavior that you would find from stat mech. This does not solve the measurement problem, but it does recover some of the classical behavior.
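
For concreteness, the recipe just described (tensor product, evolve, partial trace) boils down to a few lines. Here is a sketch of my own, with the bipartite dimensions and the Bell-state example purely illustrative:

[code]
import numpy as np

def reduced_system_state(psi, d_sys, d_env):
    """Partial trace over the environment for a pure state of a
    system (dimension d_sys) plus environment (dimension d_env)."""
    m = np.asarray(psi).reshape(d_sys, d_env)   # rows: system index, cols: environment index
    return m @ m.conj().T                       # rho_sys[i, j] = sum_e psi[i, e] * conj(psi[j, e])

# Fully entangled example: (|00> + |11>)/sqrt(2).  Tracing out the second
# qubit leaves a completely diagonal (maximally mixed) reduced state.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
print(reduced_system_state(bell, 2, 2))         # [[0.5 0.] [0. 0.5]]
[/code]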
 
  • #53
rede96 said:
but at the macro level I find it difficult to think that nature would know every single possible outcome from an event and how it affects the total system, and that they all exist simultaneously until one is observed.

I am not sure, but this difficulty could also occur to someone thinking about the Least Action Principle. In the formalism, it is as if nature knows "a priori" the outcome which minimizes the action and chooses that path as the physical path instead of all the other possible paths (perturbative or not)...
In QM path integrals (at least at the level at which I understand the formalism), it's as if the particle takes all possible paths (each with a different amplitude/phase) and reaches the final point/state. That's where the interference comes in (as in Huygens' principle).

Am I wrong about something? I don't think I am.
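
For what it's worth, that picture can be played with numerically. Here is a toy sketch of my own (free particle, m = 1, one intermediate point per path, an artificially small "hbar" - all just illustrative assumptions): the phases exp(iS/ħ) of paths near the classical one line up, while the rest spin around and largely cancel.

[code]
import numpy as np

hbar = 0.05                            # artificially small, to exaggerate the cancellation
x_mid = np.linspace(-3.0, 4.0, 2001)   # candidate positions of the single intermediate point

# Free particle (m = 1) going from x = 0 at t = 0 to x = 1 at t = 2,
# via x_mid at t = 1.  Action of the two straight segments (dt = 1 each):
S = 0.5 * x_mid**2 + 0.5 * (1.0 - x_mid)**2

phase = np.exp(1j * S / hbar)          # each path contributes exp(i S / hbar)

near = np.abs(x_mid - 0.5) < 0.3       # paths close to the classical midpoint x_mid = 0.5
print(abs(phase[near].mean()))         # roughly 0.85: phases nearly agree near the classical path
print(abs(phase.mean()))               # roughly 0.06: averaged over all paths they mostly cancel
[/code]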
 
  • #54
Maui said:
You should not assume the Born rule in order to derive decoherence, as that leads to the idea of privileged decohering chairs, apparatuses, etc. Namely, what you intend to prove is already there. It's injecting classicality to derive classicality. Doesn't sound very convincing, does it?

I think you may not be understanding what's going on here.

Processes like tracing over the environment require the Born rule to justify them.

All that means, however, is that you can't use them to justify the Born rule - if you don't do that, there is no issue.

Thanks
Bill
 
  • #55
DrewD said:
You do not need to assume the Born rule for decoherence.

Sorry mate - but you do.

It's assumed in justifying tracing over the environment:
http://physics.stackexchange.com/qu...ake-the-partial-trace-to-describe-a-subsystem

See Lubos's reply. The key assumption is 'The expectation values of the operator P in |ψ⟩ are trivially equal to Tr(ρP)' - that's the Born Rule. It's a trivial application of it, but the fact that you need to assume it at all is far from trivial.
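
As an aside, that 'trivial application' is easy to check numerically (my own sketch; the 3-dimensional space, random state and random projector are purely illustrative):

[code]
import numpy as np

rng = np.random.default_rng(1)

psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)               # a normalised pure state |psi>
rho = np.outer(psi, psi.conj())          # rho = |psi><psi|

v = rng.normal(size=3) + 1j * rng.normal(size=3)
v /= np.linalg.norm(v)
P = np.outer(v, v.conj())                # a rank-one projection operator

lhs = psi.conj() @ P @ psi               # <psi|P|psi>
rhs = np.trace(rho @ P)                  # Tr(rho P) - the Born Rule in trace form
print(np.allclose(lhs, rhs))             # True
[/code]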

It's not an issue - all it means is that you can't use decoherence to justify the Born Rule.

But it should be fairly obvious you can't really do that anyway. Gleason's theorem shows the key ingredient is non-contextuality. Decoherence is independent of that. Claims that envariance and the like can do it must have non-contextuality hidden somewhere.

That doesn't mean they are not of value. To me contextuality is very ugly mathematically, and if you can come up with reasonable arguments that 'hide' it, that simply makes contextuality even uglier.

Thanks
Bill
 
  • #56
rede96 said:
but at the macro level I find it difficult to think that nature would know every single possible outcome from an event and how it affects the total system, and that they all exist simultaneously until one is observed. I think I need to do a bit more reading :D

That's not quite what MW says.

Basically, after decoherence, it interprets the pure states as separate worlds and everything keeps evolving. No collapse occurs - nothing happens.

Of course only one world is ever experienced, and deciding which one is experienced is an issue. That's where Born's rule is required. But Gleason comes to the rescue there - if you want a measure that is basis-independent - and since MW takes the state as very real indeed, it would be a bit weird to ascribe a basis dependence to it.

Thanks
Bill
 
  • #57
bhobba said:
Sorry mate - but you do.

I agree. I actually just came back to this thread because I realized I was wrong.
 
  • #58
DrewD said:
I agree. I actually just came back to this thread because I realized I was wrong.

I actually didn't realize it until prompting by others made me look at it a bit more carefully.

It's not a biggie though. All it means is that you can't use it to justify the Born rule - but IMHO Gleason does that well enough anyway, as well as showing its real, deep-down, rock-bottom essence: basis independence.

I have a couple of spare minutes so will spell it out.

Gleason's theorem says that the only probability measure that can be assigned to the subspaces of a Hilbert space is given by the usual Born Rule, i.e. if A is the projection operator onto a subspace, there exists a positive operator P of unit trace such that the measure is Trace(PA).

When you go through the proof you think: it's watertight - no out - Born Rule proved.

Not so fast, Grasshopper. Let's say we have the projection operator X = |x1><x1| + |x2><x2|; then we can find another basis of the same subspace giving the same projection operator, Y = |y1><y1| + |y2><y2|. Of course X = Y.

So far so good. Now treat those operators as observables. Consider X. Let's change the outcome associated with |x2><x2| to 0. Then the expected outcome of this new operator is the probability of getting the first outcome. Similarly, if we change the outcome associated with |x1><x1| to 0, the expected outcome of the new operator is the probability of getting the second outcome. But we must get one or the other, so E(X) = E(|x1><x1|) + E(|x2><x2|). Of course E(X) is the probability of getting the first or the second outcome. Similarly E(Y) = E(|y1><y1|) + E(|y2><y2|).

But X = Y, so E(|x1><x1|) + E(|x2><x2|) = E(|y1><y1|) + E(|y2><y2|). There is nothing in the observable postulate that says this must be the case. It is an inevitable consequence of being able to assign a measure to subspaces in a way that does not depend on which basis is used in the subspace. But bases are man-made things - they are entirely arbitrary. Fundamental physics should not depend on them. If we are to take the vector space formalism seriously, then it should be the case.

However it is in fact a very strong assumption, as the fact that you can prove the Born Rule from it shows. It generally goes by the name non-contextuality, because different basis decompositions of a subspace correspond to physically different observations. Mathematically it's very reasonable, almost trivial. Physically it's very strong.
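
A quick numerical illustration of the point (my own toy example in a 3-dimensional space, nothing more): two different orthonormal bases of the same 2-dimensional subspace give the same projection operator and hence, under E(A) = Trace(PA), the same summed probability - exactly the basis independence described above.

[code]
import numpy as np

rng = np.random.default_rng(2)

# A random state: positive operator of unit trace on a 3-d Hilbert space.
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
rho = A @ A.conj().T
rho /= np.trace(rho)

def proj(v):
    return np.outer(v, v.conj())          # |v><v|

e1, e2 = np.eye(3)[0], np.eye(3)[1]       # |x1>, |x2>: one basis of the subspace
y1 = (e1 + e2) / np.sqrt(2)               # |y1>, |y2>: a different basis of the SAME subspace
y2 = (e1 - e2) / np.sqrt(2)

X = proj(e1) + proj(e2)
Y = proj(y1) + proj(y2)
print(np.allclose(X, Y))                  # True: the projection operator is the same

E_x = np.trace(rho @ proj(e1)) + np.trace(rho @ proj(e2))
E_y = np.trace(rho @ proj(y1)) + np.trace(rho @ proj(y2))
print(np.allclose(E_x, E_y))              # True: E(|x1><x1|) + E(|x2><x2|) = E(|y1><y1|) + E(|y2><y2|)
[/code]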

Thanks
Bill
 
  • #59
Maui said:
Are we not?

Yes we are - but the shite is not as thick as some think and we have mathematics to aid us in making good headway.

Thanks
Bill
 
  • #60
bhobba said:
I think you may not be understanding what's going on here.

Processes like tracing over the environment require the Born rule to justify it.

All that means however is you can't use them to justify the Born rule - if you don't do that there is no issue.

Thanks
Bill



I fail to see what decoherence is supposed to explain if you require measurement/collapse to justify it. It's a dead end. By itself, it adds no interpretational value at all - zero.



bhobba said:
Yes we are - but the shite is not as thick as some think and we have mathematics to aid us in making good headway.

Thanks
Bill


That's what I thought when I first registered here, but I've changed my mind - there is no way beyond where physics is at the moment with regard to the MP, unless new physics is discovered.
 
  • #61
Maui said:
I fail to see what decoherence is supposed to explain if you require measurement/collapse to justify it.

So you don't think being able to interpret a state after decoherence as a proper mixed state - and hence having the observed state there prior to observation, bypassing Kochen-Specker - is an advance?

You don't think being able to define a measurement independent of a vague concept of a measurement apparatus is an advance?

You don't think being able to explain the preferred basis problem is an advance?

There are others as well, but if you can answer the above that would be nice.

Most disagree with you - but let's see why you don't think the three I mentioned are an advance.

Maui said:
That's what I thought when I first registered here, but I've changed my mind - there is no way beyond where physics is at the moment with regard to the MP, unless new physics is discovered.

My view is different. We have all these different interpretations such that, pick any issue with QM, and there is at least one that fixes it. But none fixes them all - there's the rub. What we need is some way to decide via experiment.

Thanks
Bill
 
  • #62
bhobba said:
So you don't think being able to interpret a state after decoherence as a proper mixed state - and hence having the observed state there prior to observation, bypassing Kochen-Specker - is an advance?
So what is this mundane construction supposed to provide? The way you have put them together seems to say more about your philosophical inclination than about physics.
You don't think being able to define a measurement independent of a vague concept of a measurement apparatus is an advance?
The environment is also a vague and foreign concept to QM that you can't define without resorting to that which you are trying to derive (measurement and decoherence). It stops being vague only after one introduces one's own philosophy into the framework, which one somehow considers very obvious and self-evident. It's true that the environment is obvious, but then you are not using quantum theory, you are using your five senses. But if you return to quantum mechanics (which of course you must), the environment is neither obvious nor granted. It's vague. It requires a measurement postulate in all cases, with no exceptions.
You don't think being able to explain the preferred basis problem is an advance?
The 'ever so special' environment that selects the preferred basis is the same environment that is also entangled with the system and obviously in superposition with it. As I said earlier, if you already have the environment in an eigenstate of the observable quantity, you do not need decoherence to explain classicality. But you don't - that's what interpretations do and what decoherence doesn't.
Most disagree with you - but let's see why you don't think the three I mentioned are an advance.
This is wrong. Most people who understand what you are saying will not agree with you (run a poll if you wish; I don't remember seeing such a poll here). Decoherence doesn't add anything of interpretational value (unless you add bits of religion/philosophy, which you seem to not have a problem with), and it definitely doesn't solve the MP at all, I am afraid, even FAPP. I have no problem with your assumptions, but IMO you should not present your view of it as a solution to the MP but as another interpretation.

PS. We are going in circles with this thread, and this isn't very surprising given that it's been mostly about personal philosophies.
 
  • #63
rede96 said:
What I was wondering is why couldn't they just put a timer on the Geiger counter, which started the moment the experiment was started. Once the lid was lifted, say 90 minutes after the start of the experiment, and if the cat was found dead, then they could have just checked the timer to see when the atom decayed. If it was, say, after 60 minutes, doesn't that sort of suggest that the cat didn't exist in both states before the box was opened?
Under the Copenhagen interpretation the timer, assuming it's inside the box, is also in superposition, just like the cat. So having the timer doesn't change anything. The box is a superposition of dead-cat states in which the atom has decayed at different times and hence the timer shows different times, together with live-cat states in which the atom has not decayed and the timer has not yet recorded a time for decay. One of those states is not chosen until we open the lid and look inside.

Alternatively, if the timer is outside the box so that we can see when the atom decays as soon as it does, then the wavefunction collapses at that point and there is no superposition of live and dead cats.

I see this thread has five pages. I've only read half the first one. Perhaps somebody has already made these points.
 
  • #64
phinds said:
I think Schrodinger really came up with the cat thing to show how silly the Copenhagen interpretation can be.

Personally, I'm really confident that the moon is there whether I'm looking at it or not and I think the cat is always either alive or dead.

What about the sun? It's still there whether you're watching it or not...right? Is it possible that the cat is always either visible or invisible?
 
  • #65
Maui said:
The environment is also a vague and foreign concept to QM that you can't define without resorting to that which you are trying to derive (measurement and decoherence). It stops being vague only after one introduces one's own philosophy into the framework, which one somehow considers very obvious and self-evident. It's true that the environment is obvious, but then you are not using quantum theory, you are using your five senses. But if you return to quantum mechanics (which of course you must), the environment is neither obvious nor granted. It's vague. It requires a measurement postulate in all cases, with no exceptions.

? The environment is just a larger quantum system. This is no more vague than it is in classical thermodynamics. How is this philosophy? And yes, it does require a measurement postulate... if it didn't, it would solve the measurement problem. It just produces classical statistical mechanics from quantum. I think you are building a strawman (or have found one that others have put together for you). We have not claimed that the measurement problem is solved by decoherence.
 
  • #66
andrewkirk said:
Under the Copenhagen interpretation the timer, assuming it's inside the box, is also in superposition, just like the cat. So having the timer doesn't change anything. The box is a superposition of dead-cat states in which the atom has decayed at different times and hence the timer shows different times, together with live-cat states in which the atom has not decayed and the timer has not yet recorded a time for decay. One of those states is not chosen until we open the lid and look inside.

Where are you getting this from?

Under Copenhagen there is an assumed world out there that is totally classical and commonsense. There is no timer, cat, or box in a superposition - they are all classical. The observation occurs at the particle detector; that is where collapse occurs. To Copenhagen there is no problem.

What Schrodinger's Cat was meant to show is that, since that classical world is in fact quantum, you should be able to analyse it that way - but problems arise if you do. We need a fully quantum theory of measurement without this division. But Copenhagen has this division, and it leads to no logical contradiction etc. - it's simply a blemish that is better done away with. And much progress has been made towards that end.

Thanks
Bill
 
  • #67
DrewD said:
? The environment is just a larger quantum system. This is no more vague than it is in classical thermodynamics. How is this philosophy?


Of course it is, since it lies outside the scope of the formalism. The environment is the philosophical part of QM.


And yes, it does require a measurement postulate... if it didn't, it would solve the measurement problem. It just produces classical statistical mechanics from quantum.


It doesn't produce it; it describes it with some contrived philosophical assumptions (religion). Bhobba thinks it explains it, but it doesn't. All it does is restate the obvious, which is of no real value, and one cannot use it as an interpretation.

I think you are building a strawman (or have found one that others have put together for you). We have not claimed that the measurement problem is solved by decoherence.


Bhobba thinks it solves it FAPP, and he is dead wrong.
 
  • #68
subquantumboy said:
What about the sun? It's still there whether you're watching it or not...right? Is it possible that the cat is always either visible or invisible?

Under Copenhagen there is a commonsense classical world out there - the sun is there whether you are watching it or not. The last comment about the cat is silly.

Some in this thread simply do not understand what Copenhagen says.

Thanks
Bill
 
  • #69
Maui said:
Bhobba thinks it solves it FAPP, and he is dead wrong.

Please describe the experiment that can tell the difference between a proper and an improper mixed state.

Thanks
Bill
 
  • #70
bhobba said:
Please describe the experiment that can tell the difference between a proper and an improper mixed state.

Thanks
Bill


You can't, because measurements introduce classicality, but you can't use that as an argument for that which you set out to prove. The world is quantum! And it is the quantum-to-classical transition that you are trying to explain. But of course you can't - certainly not in the way you have set it up.
 
