Is action at a distance possible as envisaged by the EPR Paradox?

In summary, John Bell was not a big fan of QM. He thought it was premature, and that the theory didn't yet meet the standard of predictability set by Einstein.
  • #386
my_wan said:
In principle yes; whether cogency can actually pan out for the standard model I can't say. I was recently challenged by one of my favorite skeptics to write a computer program that mimicked EPR correlation statistics. I found this paper, which claims to have done it (I haven't looked that closely yet):
http://msc.phys.rug.nl/pdf/athens06-deraedt1.pdf
I was considering a variation of an encryption scheme I once wrote, based on some (now defunct) notions of cross-frame information embedding. Actually, with FTL models I might reconsider a limited version of that. It embedded an encrypted message inside a fake encrypted message. Anyway, I'm considering these quasirandom sequences and what rules might be needed to mimic detector setting choices. An interesting problem, anyway.

Not meaning to ignore the rest of your post, which I want to review in more detail.

However, I am a computer programmer by profession, and I have performed an extensive analysis of the De Raedt computer simulation you referenced. I obtained the line-by-line source code for their model and created a series of models in Excel that accurately mimic their code (since their setup requires a lot of add-on software to run). Using Visual Basic, I generate trial runs for a large number of iterations at various angles and graph them. The spreadsheet is available from my website, and I will post the link (it was previously posted in another thread).

This shows that it is in fact possible to construct a "local realistic" algorithm that does not violate a Bell inequality but yields a subsample which does. Thus it does not reproduce the QM predictions for the full universe, but it does for a so-called "unfair sample". It is a very interesting piece of work.

However, my spreadsheet goes on to show why the same model is fatally flawed. In fact, it shows why pretty much ANY similar model is also fatally flawed. As far as I know, this analysis is original although I am sure there are others who have figured this out as well. I don't think anyone else has actually programmed the problem area so as to demonstrate it using the same technique as the De Raedt model itself.

My point is that it is far easier to claim success for a model than to actually produce that success. I will gladly take on any local realistic model which, like the De Raedt model, offers a specific algorithm that is actually "local" and "realistic" (since we are talking about computer simulation). I can assure you, there isn't likely to be a model which can withstand attack. All this because Bell is in fact a map.

And keep in mind that the De Raedt model does not purport to mimic the results of all QM in the first place - so technically it is not a local realistic candidate theory. It is really an attempt to demonstrate that Bell can be beat, but it does not actually accomplish that in the end.
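To illustrate the general structure being discussed, here is a minimal sketch in Python (it is not the De Raedt code and not my spreadsheet, just an illustration of the same idea): each station computes its outcome and a time tag from purely local information, and only pairs whose time tags fall inside a coincidence window are kept. Whether the kept subsample ends up violating a Bell inequality depends on the delay model and the window width, which is exactly the parameter space these models play in.

[code]
import numpy as np

rng = np.random.default_rng(1)

def run(angle_a, angle_b, n=200_000, window=0.01, k=4):
    """Toy local-realistic pair simulation with time-tag post-selection.

    Each pair shares a hidden polarization lam fixed at the source.  Each
    station uses ONLY lam and its own setting: a Malus-law probability for
    the +/-1 outcome, plus a local detection delay (the time-tag idea).
    Only pairs whose delays differ by less than `window` are counted.
    """
    lam = rng.uniform(0.0, np.pi, n)                         # shared hidden variable

    def station(setting):
        d = lam - setting
        out = np.where(rng.random(n) < np.cos(d) ** 2, 1, -1)
        delay = rng.random(n) * np.sin(2.0 * d) ** (2 * k)   # local time tag
        return out, delay

    a, ta = station(angle_a)
    b, tb = station(angle_b)
    keep = np.abs(ta - tb) < window                          # coincidence window
    return np.mean(a * b), np.mean(a[keep] * b[keep]), keep.mean()

for deg in (0.0, 22.5, 45.0, 67.5, 90.0):
    full, sub, frac = run(0.0, np.radians(deg))
    print(f"B at {deg:5.1f} deg: full-sample E = {full:+.3f}, "
          f"post-selected E = {sub:+.3f}, kept {frac:.1%}")
[/code]

The full sample is local realistic by construction; the only question is what the coincidence-window post-selection does to the kept subsample, which is where the "unfair sample" issue lives.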
 
  • #387
DrChinese said:
I am not going to waste my time trying to figure out this gibberish. If you want to use his formula to present a valid set of data, I will look at it. BUT QUIT SAYING IT WITHOUT SHOWING IT!
So, what are you saying? That Unnikrishnan's formulation doesn't reproduce the qm predictions? That's just silly. Read the paper.

If you're saying that his formulation shouldn't be interpreted as local realistic in the sense of EPR-Bell, then I agree with you.

But it is an explicitly local model which expresses the conceptual points I've been making.

DrChinese said:
The entire point of Bell is that he cannot follow through on his claims.
His claim is that we don't need nonlocality to understand the correlations in terms of a local common cause. He's made an explicitly local model using a relational variable produced via the emission process. And he correctly reproduces the qm predictions.

I agree with you that Bell put a stopper on explicitly local realistic models which, in effect, require the joint results to be caused by the variables which cause individual results. It's impossible to do that because the joint results are determined by a RELATIONSHIP between the counter-propagating disturbances which is NOT the same thing as the variables which determine the individual results.
 
  • #388
IcedEcliptic said:
I've read that paper, and I believe that Dr. Chinese summed up that tidbit quite well.
Unnikrishnan made an explicitly local model of entanglement that reproduces the qm results. I think that if you had read and understood the paper, then you would see how it relates to the conceptual points I've been trying to get across.

IcedEcliptic said:
The issue is not whether the formalism of QM is untenable in some ways, but rather that you are arguing for a loophole that I, along with many others, believe has been utterly disproved as a factor.
I'm not arguing loophole(s). That's Zonde.

IcedEcliptic said:
You both argue against something, and grasp at straws to do so, but really your pages of arguments boil down to not liking SQM, and not having a superior theory in its place.
I'm not arguing for LHV models or against Bell. I'm arguing that we can understand entanglement correlations without resorting to nonlocality, or weird alternate realities. I don't think you've been paying close enough attention to what's been said to label anyone in this discussion a crackpot.

IcedEcliptic said:
The paper is like your arguments which Dr. Chinese continues to rip to confetti; it is scattered, distracting, and fundamentally lacking in substance. Its only strength is that you have yet to state enough of your position to refute it on purely scientific grounds, but that is yet another weakness. You go on and on about a non-existent loophole, and Zonde has ideas about Malus' Law that are between laughable and upsetting.
Zonde is the loophole person. I'm the one who brought up the applicability of Malus Law to certain situations.

IcedEcliptic said:
Your papers are crocks, you have pages of nonsense and rhetoric, but you continue your circular retreat, and in theory you will keep this up until a mentor finally accepts that you are a genuine crackpot and not just misguided.
Are you calling Unnikrishnan a crackpot now?
 
  • #389
Here is the link to the Excel spreadsheet models I created around the De Raedt simulations:

http://www.drchinese.com/David/DeRaedtComputerSimulation.EPRBwithPhotons.B.xls

To see the code I wrote, go into the Visual Basic editor. Sheet A shows their model working correctly. Sheet B shows their model working incorrectly for a setup which matches their base assumptions.
 
  • #390
DrChinese said:
Here are the links to the Excel spreadsheet models I created around the De Raedt simulations:

http://www.drchinese.com/David/DeRaedtComputerSimulation.EPRBwithPhotons.B.xls

To see the code I wrote, go into the Visual Basic editor.

You are very devoted to your hobby, and I respect this! Thanks for showing us the fruits of your labour.
 
  • #391
ThomasT said:
Unnikrishnan made an explicitly local model of entanglement that reproduces the qm results.

No he didn't. He simply claimed he did.

And I will call any local realist who can't be bothered to generate a dataset demonstrating the local realistic nature of their purported model a "crackpot". And I mean that in the nicest way. Why wouldn't someone generate the dataset? I mean, that would convince anyone that they have a solid model. So you have to wonder. While I am not being literal about the crackpot designation, I am trying to say that if the author wants to be taken seriously, an explanation should be forthcoming as to WHY there is no dataset.

De Raedt et al. at least met this criterion. So my hat is off to them. Of course, I don't wear a hat in the first place, but I think you know what I mean.
 
  • #393
ThomasT said:
Unnikrishnan made an explicitly local model of entanglement that reproduces the qm results. I think that if you had read and understood the paper, then you would see how it relates to the conceptual points I've been trying to get across.

I'm not arguing loophole(s). That's Zonde.

I'm not arguing for LHV models or against Bell. I'm arguing that we can understand entanglement correlations without resorting to nonlocality, or weird alternate realities. I don't think you've been paying close enough attention to what's been said to label anyone in this discussion a crackpot.

Zonde is the loophole person. I'm the one who brought up the applicability of Malus Law to certain situations.

Are you calling Unnikrishnan a crackpot now?

Show me your data or go away. Dr. Chinese has already addressed the rest.
 
  • #394
my_wan said:
I have a bit of confusion about how you are defining contextual variables myself. Earlier I saw it referred to in this thread as measuring separate realities. That was a bit ambiguous considering MWI. Here you say the relational model is not observer independent, but you fail to specify what it is independent of. There is a difference between a configuration space and a variable which is dependent on the perspective in which that configuration space is measured. Thus the whole point of contextual variables is that they are not observer independent, but the reality of the configuration space is. Analogs to these types of variables are everywhere, the most relevant of which are in GR. What follows is not a claim, but a demonstration of the issues involved in complaining that contextual variables are not observer independent.

Consider what a water wave means to a single water molecule. It's nothing more than a small momentary deflection, not even significant relative to the general random motion. Same thing for air molecules when I say "boo". What part of "boo" is contained in each air molecule? Is the sound "boo" a preexisting property of air molecules? Conjugate variables are common enough in classical physics. What properties are preexisting in this world is a good question, perhaps even the constants?

In GR we make a well-justified operational distinction between mass and rest mass. In the general case mass is a contextual variable, but the mass is real. So how relevant is that distinction? Consider a particle in QFT: a particular excitation of a field. Ask what happens if the entire field were uniformly excited by this magnitude. We could assume the total vacuum energy density increases accordingly, but this reasoning leads us to the vacuum catastrophe, and I'd say a prediction 107 orders of magnitude off is trouble for that assumption. Then we have a zero total energy of the universe, [tex]GM_t^2/R = M_tc^2[/tex]. This is a pretty strong indication to me that the entire universe, and everything we empirically measure about it, are purely contextual variables. Could it be that local field variances fully define all empirical properties contextually, such that uniform absolute magnitudes of anything are meaningless, like gauge fields? This does not mean the configuration space that defines the variables isn't real, and almost certainly covariant. But trying to define reality solely in terms of the variables we measure wouldn't make much sense, in spite of the reality of covariant field variances.

As noted, I'm not trying to convince you that this is the way it is. Significant theoretical issues make this outline problematic. I'm merely trying to point out the issues in assuming that, because contextual variables are not observer independent, realism is out. Here I described a scenario where *all* variables are contextual, and still maintained realism. Everything you measure gets its metric from you, or some instrument, self-referencing. You are a product of the very thing you are measuring, and not even space and time itself, the metric on which measurements are predicated, is non-contextual.

EPR denied we live in a world in which Alice changes spacelike separated Bob's reality. So if your theory allows Alice's reality to change Bob's (or vice versa), I consider it to be context dependent. And that would be fully consistent with the Bell result.

Now, a serious problem exists in any NON-contextual candidate theory because you must explain correlations for entangled particles at the same angles, while also explaining why unentangled particles are NOT correlated. You also have the Bell inequalities to contend with. So these are severe constraints which are not present in either contextual or nonlocal theories.
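As a quick illustration of the first constraint, here is a toy Malus-law sketch (an illustration only, not any particular candidate theory): pairs that merely share a common classical polarization match only about 75% of the time at identical settings, whereas entangled pairs match every time.

[code]
import numpy as np

rng = np.random.default_rng(0)
N = 500_000
setting = np.radians(30.0)            # the SAME setting at both stations

# Classical pairs that share a polarization lam but are not entangled:
# each photon independently passes its polarizer with Malus-law probability.
lam = rng.uniform(0.0, np.pi, N)
p = np.cos(lam - setting) ** 2
alice = rng.random(N) < p
bob = rng.random(N) < p
print("shared-polarization pairs, identical settings:", np.mean(alice == bob))  # ~0.75

# Entangled pairs at identical settings: QM match rate is cos^2(0) = 1.
print("entangled pairs, identical settings (QM):", np.cos(0.0) ** 2)            # 1.0
[/code]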
 
  • #395
DrChinese said:
No he didn't. He simply claimed he did.

And I will call any local realist who can't be bothered to generate a dataset demonstrating the local realistic nature of their purported model a "crackpot". And I mean that in the nicest way. Why wouldn't someone generate the dataset? I mean, that would convince anyone that they have a solid model. So you have to wonder. While I am not being literal about the crackpot designation, I am trying to say that if the author wants to be taken seriously, an explanation should be forthcoming as to WHY there is no dataset.

De Raedt et al. at least met this criterion. So my hat is off to them. Of course, I don't wear a hat in the first place, but I think you know what I mean.
Are you saying that Unnikrishnan's model isn't local (not local realist -- just local -- that's all I've been saying -- remember, I agree with you that local realistic models of entanglement are ruled out by Bell)? Are you saying that his model doesn't reproduce the qm expectation value and correlation function (from which you can calculate datasets for any angles)?
 
  • #396
IcedEcliptic said:
Show me your data or go away. Dr. Chinese has already addressed the rest.
What is a 'dataset' going to tell you that the expectation value and correlation function doesn't already??
 
  • #397
ThomasT said:
What is a 'dataset' going to tell you that the expectation value and correlation function doesn't already??

A lot! If there is a model, let's see it model! Otherwise you are saying 1=2 and I can't be bothered showing my work. People can claim they have a secret formula, but that doesn't fly here.
 
  • #398
ThomasT said:
Are you saying that Unnikrishnan's model isn't local (not local realist -- just local -- that's all I've been saying -- remember, I agree with you that local realistic models of entanglement are ruled out by Bell)? Are you saying that his model doesn't reproduce the qm expectation value and correlation function (from which you can calculate datasets for any angles)?

You cannot create a dataset from the expectation values, no. Let's see it working. I don't know if it is local or realistic or both, all I know is the claims.
 
  • #399
DrChinese said:
A lot! If there is a model, let's see it model! Otherwise you are saying 1=2 and I can't be bothered showing my work. People can claim they have a secret formula, but that doesn't fly here.
I honestly don't understand what you're saying.

Are you saying that the qm expectation value and correlation function are wrong?

What is it that you want? For Unnikrishnan, or me, or somebody, to plug in some angular values to make a 'dataset'? Is that really necessary?
 
  • #400
ThomasT said:
I honestly don't understand what you're saying.

Are you saying that the qm expectation value and correlation function are wrong?

What is it that you want? For Unnikrishnan, or me, or somebody, to plug in some angular values to make a 'dataset'? Is that really necessary?

It is simple. If you have a model, you can generate a dataset. Give me the values for 0/120/240 degrees for Alice and Bob, using the formula from the paper. If you think it is the same as QM, then fine, show me. P.S. QM does NOT NOT NOT say there is a realistic dataset.

I will then tear your dataset to shreds. Now, quit saying it is unnecessary when it isn't. I can say that I witnessed my son walking on water, but you would want to see it yourself. Well, here I am, saying I want to see it. Bell would too. :biggrin:

Just like 1 is not 2, a claim of equivalence is NOT equivalence.
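To spell out why 0/120/240 is the acid test, here is a sketch of the standard counting argument (Python, for illustration only; it is not the formula from the paper): any row that assigns definite +/- outcomes at all three angles in advance matches on at least 1/3 of the pairs on average, while QM predicts cos^2(120 degrees) = 1/4.

[code]
from itertools import combinations, product

ANGLES = ("0", "120", "240")
QM_MATCH = 0.25          # cos^2(120 degrees): QM same-outcome rate for any pair

# A local realistic dataset row assigns a definite outcome at every angle
# in advance.  Check the average pairwise match rate of every possible row.
lowest = 1.0
for row in product((+1, -1), repeat=3):
    pairs = list(combinations(range(3), 2))
    rate = sum(row[i] == row[j] for i, j in pairs) / len(pairs)
    lowest = min(lowest, rate)
    print(dict(zip(ANGLES, row)), "average pairwise match:", round(rate, 3))

print("best any predetermined row can do:", round(lowest, 3), "  QM:", QM_MATCH)
[/code]

Since no single row gets below 1/3, no mixture of rows can either, which is exactly what a dataset would have to beat.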
 
  • #401
DrChinese said:
Here is the link to the Excel spreadsheet models I created around the De Raedt simulations:

http://www.drchinese.com/David/DeRaedtComputerSimulation.EPRBwithPhotons.B.xls

To see the code I wrote, go into the Visual Basic editor. Sheet A shows their model working correctly. Sheet B shows their model working incorrectly for a setup which matches their base assumptions.


DrC, I agree with IcedEcliptic, this is very impressive work for a "hobbyist"! Kudos and +11 on "my scale"! :smile:

There’s a lot I want to comment on in the last posts, but time is running out for today; still, your code is so interesting I can’t wait:

I checked the VB code and saw that you are using VB’s pseudorandom number generator, http://msdn.microsoft.com/en-us/library/f7s023d2.aspx (to get a new seed value). Could this be an "issue" (since QM is truly random)?

If you consider this an issue, there could be a solution in http://msdn.microsoft.com/en-us/library/system.security.cryptography.randomnumbergenerator.aspx for automated clients (HTTP interface).

Tomorrow I’ll be back to 'tackle' the rest, cheers!
 
  • #402
ThomasT said:
What is a 'dataset' going to tell you that the expectation value and correlation function doesn't already??

I give up.
 
  • #403
Time for cake! +10,000 views! :biggrin:

[Image: birthday cake]
 
  • #404
DevilsAvocado said:
Time for cake! +10,000 views! :biggrin:

[Image: birthday cake]

Love it!
 
  • #405
Good to know there are so many folks following this great thread! I'm enjoying it very much.
 
  • #406
Tasty looking cake! I assume the two pictures of it are entangled? If Dr. Chinese had photoshopped the candles to be out, in contrast to the "on" above, I would have died laughing.
 
  • #407
Whoooooossssh


Glad you all liked it!

@IcedEcliptic, of course they are entangled! Now I will call 9-1-1! :smile:
 
  • #408
DevilsAvocado said:
Whoooooossssh


Glad you all liked it!

@IcedEcliptic, of course they are entangled! Now I will call 9-1-1! :smile:

Wigner's Cake. :biggrin:
 
  • #409
Yup :biggrin:
 
  • #410
DevilsAvocado said:
DrC, I agree with IcedEcliptic, this is very impressive work for a "hobbyist"! Kudos and +11 on "my scale"! :smile:

There’s a lot I want to comment on in the last posts, but time is running out for today; still, your code is so interesting I can’t wait:

I checked the VB code and saw that you are using VB’s pseudorandom number generator, http://msdn.microsoft.com/en-us/library/f7s023d2.aspx (to get a new seed value). Could this be an "issue" (since QM is truly random)?

If you consider this an issue, there could be a solution in http://msdn.microsoft.com/en-us/library/system.security.cryptography.randomnumbergenerator.aspx for automated clients (HTTP interface).

Tomorrow I’ll be back to 'tackle' the rest, cheers!

Yes, I know it is pseudo-random. After a while you will realize that it does not need to be "truly" random. It is just a simulation to get things moving on how everything *should* work in this area. However, it might be worth adding that element to see how it changes things. Thanks for the reference, by the way; I will check it out.
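For anyone who wants to try that swap, here is a minimal sketch (Python rather than the spreadsheet's VB) showing that the simulation logic is unchanged whichever source the bits come from; only reproducibility differs.

[code]
import random
import secrets

def coin_flips(rng, n=10):
    """Draw n bits the same way regardless of where the randomness comes from."""
    return [rng.randint(0, 1) for _ in range(n)]

pseudo = random.Random(12345)      # seeded PRNG: runs are repeatable
system = secrets.SystemRandom()    # OS entropy source: not repeatable

print("seeded PRNG :", coin_flips(pseudo))
print("system RNG  :", coin_flips(system))
[/code]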
 
  • #411
DrChinese said:
EPR denied we live in a world in which Alice changes spacelike separated Bob's reality. So if your theory allows Alice's reality to change Bob's (or vice versa), I consider it to be context dependent. And that would be fully consistent with the Bell result.
I'm still not sure how you can represent "context dependent" that way from what I described. It would be somewhat analogous to saying Alice changed the reality of spacelike separated Bob by accelerating, thus changing Bob's velocity non-locally.

DrChinese said:
Now, a serious problem exists in any NON-contextual candidate theory because you must explain correlations for entangled particles at the same angles, while also explaining why unentangled particles are NOT correlated. You also have the Bell inequalities to contend with. So these are severe constraints which are not present in either contextual or nonlocal theories.
If it's modeled contextually in the relativistic sense above, then "now" is simply "now" as defined by the detector, with nothing else involved. It is the same way Alice changed Bob's velocity: by accelerating herself "now" as defined by Alice.

I understand the ease of modeling EPR correlation when the setting for one end always stays the same, and how that breaks to varying degrees under arbitrary detector settings. We operate on a reasonable assumption that, with a random spin sequence of particles, single detector settings can't make a statistical difference. I'm going to call this assumption into question with respect to the detection sequence, not detection rates, even though the sequence appears random to us at any single detector.

Assumptions (I'll use spin only here):
1) Spin is a distinct "real" property, regardless of time dependence, relational character, etc.
2) By 1) spin has a distinctly "real" anti-correlation with its correlated pair.

Now, given the above assumptions, when a particle enters a detector's polarizer, the particle polarization relative to the detector's polarizer has a distinctly "real" physical meaning. Thus the empirical (apparently) random detection sequence is determined by this relative particle/polarizer angle. By 1), through nothing more than simple geometry, this provides information about the "real" polarization of its pair by 2). Likewise for the other particle. Thus, through simple geometry and the realness of spin defined by 1) and 2), a zero setting for the detectors is uniquely defined by the polarizer/spin angle, regardless of the experimenter's knowledge or choice in how the detector's zero setting is chosen.

If the particle spin/polarizer angle has "real" physical meaning, arbitrary choices become moot, as that provides a reference to a unique angle inversely common to both particles. Thus a relation that provides for Bell's inequalities, where any one detector is predefined, is valid under arbitrary choices.

So now we can add 3) to our assumptions:
3) By 2), the relative spin-to-polarizer angle at a single detector uniquely identifies the polarization angle of both particles.

Now we obviously can't detect the relative angle between an 'individual' particle's spin and the polarizer setting, but if spin is real it is there, and it apparently affects the detection sequence, though not the overall detection rate. I can't say this is how it is, but the information is there, without FTL. In fact, if this is true, it requires perfect determinacy to perfectly violate Bell's inequalities.
 
  • #412
DrChinese said:
Yes, I know it is pseudo-random. After a while you will realize that it does not need to be "truly" random. It is just a simulation to get things moving on how everything *should* work in this area. However, it might be worth adding that element to see how it changes things. Thanks for the reference, by the way; I will check it out.

You are welcome.

I think simulation of EPR is very interesting. It would be great to have an open "EPR framework" with real-time simulations + graphs + automatic validation of BI, which would allow for input of 'new ideas' for immediate testing, with minimal coding. Maybe a project for the future... if possible...
 
  • #413
I need to polish up my last argument and better outline its consequences. Until that post I only viewed it in the context of more complex physical constructs, but distilled down it's easier to see the bare consequences.

In essence, when we say we have a choice of detector setting we are overgeneralizing. In fact, if the realism assumptions are valid, the particle entering the detector itself defines a unique zero setting via the particle's "real" polarization. The experimenter can only choose an offset from that polarization, and not specifically the offset relative to the distant detector. A violation of Bell's inequalities, in this view, entails a unique and separate perfectly determined detection sequence for each detector offset relative to the particle polarization, likely with quantized offsets to get such perfect experimental results. The inverse of this perfectly determined sequence can be repeated if and only if the distant detector chooses the same offset relative to a distant, but perfectly anticorrelated, particle. Not having prior knowledge of the determining factors, polarization, etc., we can only see it in the coincidences of a pair of otherwise random sequences.
 
  • #414
my_wan said:
I need to polish up my last argument and better outline its consequences. Until that post I only viewed it in the context of more complex physical constructs, but distilled down it's easier to see the bare consequences.

In essence, when we say we have a choice of detector setting we are overgeneralizing. In fact, if the realism assumptions are valid, the particle entering the detector itself defines a unique zero setting via the particle's "real" polarization. The experimenter can only choose an offset from that polarization, and not specifically the offset relative to the distant detector. A violation of Bell's inequalities, in this view, entails a unique and separate perfectly determined detection sequence for each detector offset relative to the particle polarization, likely with quantized offsets to get such perfect experimental results. The inverse of this perfectly determined sequence can be repeated if and only if the distant detector chooses the same offset relative to a distant, but perfectly anticorrelated, particle. Not having prior knowledge of the determining factors, polarization, etc., we can only see it in the coincidences of a pair of otherwise random sequences.

OK, you are really going off the deep end now. :smile: (And I mean that in a nice way.)

Everything you are saying has been refuted a zillion times already. I can demonstrate it either by theory or by experiment, pick your poison. But first, like ThomasT, you will need to show me something! I can't refute NOTHING!

Walk me through some examples. Provide me a dataset. If you want, I will make it easy and you can talk through the perfect (EPR) correlation cases first before moving on to the Bell cases (like 0/120/240 I always mention).

And by the way, I will make a little prediction: when we are done, I will have proven your example wrong. But you won't change your opinion because you will say that there is an example that proves you right, you just haven't found it yet.

So if you are going to follow this line, you can just say so now and save us both time. The question comes down to: are you asking or are you telling? Because I'm *telling* you that your thinking does NOT follow from the facts. I mean you might want to consider this little tidbit before you go much further: photons can be entangled that have NEVER existed within the same light cone. How do you propose to explain that? That certainly would have turned Einstein's head.
 
  • #415
DevilsAvocado said:
You are welcome.

I think simulation of EPR is very interesting. It would be great to have an open "EPR framework" with real-time simulations + graphs + automatic validation of BI, which would allow for input of 'new ideas' for immediate testing, with minimal coding. Maybe a project for the future... if possible...

For those acquainted with C#, I have the same de Raedt simulation, but converted to an object-oriented form (this allows a clear separation between the objects (particles and filters) used in the simulation).
But an open framework should probably be started in something like Maxima (http://maxima.sourceforge.net/).
 
  • #416
ajw1 said:
For those acquainted with C#, I have the same de Raedt simulation, but converted to an object-oriented form (this allows a clear separation between the objects (particles and filters) used in the simulation).

Yes, it still takes a little thought for the coder. I wanted to have something that clearly related to the original De Raedt model, so that there would be little question that my program did the job.

The issue is to make sure that there is nothing happening in the code that: a) has the detectors considered when the particles are prepared initially; or b) has particle 1/detector 1 mixed with particle 2/detector 2 in any way.

I know you are aware of this, I am saying this for the benefit of others who may be reading.
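A bare-bones sketch of those two checks (illustrative function names only, not the VB or C# code): the source must not receive any detector setting, and each station must receive only its own particle and its own setting.

[code]
import math
import random

def source():
    """Check (a): prepare the pair with NO reference to any detector setting."""
    lam = random.uniform(0.0, math.pi)           # shared hidden variable
    return {"pol": lam}, {"pol": lam}

def station(particle, own_setting):
    """Check (b): the outcome depends only on this particle and this setting."""
    p = math.cos(particle["pol"] - own_setting) ** 2
    return +1 if random.random() < p else -1

particle_1, particle_2 = source()
alice = station(particle_1, math.radians(0.0))   # Bob's setting never appears here
bob = station(particle_2, math.radians(30.0))    # Alice's setting never appears here
print(alice, bob)
[/code]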
 
  • #417
I see your site is back up DrC. :smile: I'll go over it soon.

DrChinese said:
OK, you are really going off the deep end now. (And I mean that in a nice way.)

Everything you are saying has been refuted a zillion times already. I can demonstrate it either by theory or by experiment, pick your poison. But first, like ThomasT, you will need to show me something! I can't refute NOTHING!

Walk me through some examples. Provide me a dataset. If you want, I will make it easy and you can talk through the perfect (EPR) correlation cases first before moving on to the Bell cases (like 0/120/240 I always mention).

And by the way, I will make a little prediction: when we are done, I will have proven your example wrong. But you won't change your opinion because you will say that there is an example that proves you right, you just haven't found it yet.

So if you are going to follow this line, you can just say so now and save us both time. The question comes down to: are you asking or are you telling? Because I'm *telling* you that your thinking does NOT follow from the facts. I mean you might want to consider this little tidbit before you go much further: photons can be entangled that have NEVER existed within the same light cone. How do you propose to explain that? That certainly would have turned Einstein's head.
OK, you may have a point, but I'd like to see it. I hope you deliver; I'm arguing in the hope of learning something new. I have a preference for experiment in empirical matters, but without ignoring theory, as theory is what is at issue here. As for whether I'm asking or telling: neither. I'm taking a position to be debated, to sharpen the articulation of the controversial points. The example I'll go through runs from 0° to 45°, and I'll explain how counterfactual reasoning can be interpreted in those discrepancies. In particular, when you say on your website:
http://www.drchinese.com/David/Bell_Theorem_Easy_Math.htm said:
Yet according to EPR, an element of reality exists independent of the act of observation. I.e., all elements of reality have definite values at all times, EVEN IF WE DON'T KNOW THEIR VALUES.
When you say "A, i.e. B", I agree with A but will argue that B implies properties that don't necessarily follow from A. I think it was the above interpretation that you placed on the "realism" I used in my prior post.

Consider the following detection rates:
0° = 1
5° = 0.985
10° = 0.940
15° = 0.867
20° = 0.767
25° = 0.643
30° = 0.5
35° = 0.342
40° = 0.174
45° = 0
This pattern repeats, inverted, after every 45 degrees.
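For what it's worth, these rates appear to be cos(2θ) of the listed angle, to within the rounding above; a quick sketch to check that reading (an assumption on my part, not a claim about where the numbers came from):

[code]
import math

for deg in range(0, 50, 5):
    print(f"{deg:2d} deg: cos(2*theta) = {math.cos(math.radians(2 * deg)):.3f}")
[/code]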

To show the discrepancy with realism as defined, let's consider a set of detection strings where any common setting of the detector pairs matches this (rounded). [0] is a 'coincidence' non-detection and [1] is a 'coincidence' detection.
0° = [1111111111] (100% coincidences)
5° = [1111111111]
10° = [1111111110]
15° = [1111111110]
20° = [1111111100]
25° = [1111111000]
30° = [1111100000]
35° = [1111000000]
40° = [1100000000]
45° = [0000000000]

Now if we pick a pair of arbitrary angles, 10° and 40°, we get:
10° = [1111111110]
Diff 30° = 0.5 (empirical) :: 0.766 if the realities matched (falsified) -> [1111100000] versus [1111111100]: 20° = 0.767
40° = [1100000000]

Now what went wrong with realism here? Note that the strings represent coincidences, not detections. Furthermore, for any given detection, a potentially arbitrarily large, perhaps infinite, number of individual states, vectors, etc., went into defining that detection. Thus when looking at "coincidences", not detections, we can't automatically presume that the detections that define the coincidences between 10° and 40° are the same coincidences in detections as between 0° and 30°. Yet the 'reality' condition being imposed presumes that only a single coincidence pattern can be involved in a given coincidence rate. Thus each coincidence profile would have a distinct detection and coincidence profile for each particle and relative angle of detector, which can only be repeated on a twin to the degree that the relative detector angle matches the original relative detector angle, as defined relative to the polarization of that particle.

In principle, you can take each coincidence term in [1111111111...], [], ..., at each angle, expand each term [1] to contain its own coincidence profile with the other coincidence elements for each variation of angle, and then repeat for those coincidence elements. This would diverge rather quickly, but presumably converge quickly with a measurement for the same reason. I can't prove this, but in some sense taking Hilbert space and wavefunctions seriously as real requires taking infinities pretty seriously, as in actual infinities.

Am I convinced by this? Perhaps on Mondays, Wednesdays, and Fridays, but it is as reasonable as anything else proposed, and I've seen no argument to escape it. Even if it flies in the face of indeterminism in principle, it doesn't even in principle allow an escape in practice, EPR notwithstanding. This mutual dependence on individual 'real' particle properties versus detector settings, and the resulting variation in specific detections versus coincidences, is how relational interpretations escape EPR while maintaining realism in the event sets that define them.

The key point here is that the specific detection pattern of a series of particles at one angle can't be the same detection pattern at another angle; cross-setting counterfactual assumptions are presumptuous with or without realism. Thus the coincidences between two pairs of detector patterns and settings are even further removed from counterfactual claims from alternative settings. Yet the "realism" as defined by impossibility claims requires coincidences from random sequence pairs to counterfactually match entirely different coincidences in entirely different random sequences as a proxy for "realness" in values. The summation of events that defines the outcome can nonetheless be real, so long as you don't require a summation of them in one physical configuration, defined by the detector settings, to match the summation of the same events with another set of detector settings. It would be analogous to saying mass, space, time, etc., can't be real because observers measure them differently in different circumstances.

About your "prediction" (I hope so):
Hopefully my point is fairly clear now; I hope you can offer more, because this is where I'm stuck at the moment. To tell me I have to explain it isn't realistic, as the alternative hasn't explained anything either. To say I will not change my mind presumes I have made up my mind, but so long as a fundamental weakness exists in FTL claims through counterfactual reasoning, and reasonable arguments exist that justify invalidating counterfactual reasoning even in realism-based toy models, I'll be stuck with uncertainty. Yes, counterfactual reasoning is a 'fundamental' weakness of Bell's theorem et al., not to mention the trouble it creates for realism-based FTL theories. My position will remain a mere choice, which I can only hope helps lead me forward in some way, unless you can deliver.
 
  • #418
my_wan said:
1. I see your site is back up DrC. :smile: I'll go over it soon...

2. When you say "A, i.e. B", I agree with A but will argue that B implies properties that don't necessarily follow from A. I think it was the above interpretation that you placed on the "realism" I used in my prior post.

Consider the following detection rates:
0° = 1
5° = 0.985
10° = 0.940
15° = 0.867
20° = 0.767
25° = 0.643
30° = 0.5
35° = 0.342
40° = 0.174
45° = 0
This pattern repeats, inverted, after every 45 degrees.

To show the discrepancy with realism as defined, let's consider a set of detection strings where any common setting of the detector pairs matches this (rounded). [0] is a 'coincidence' non-detection and [1] is a 'coincidence' detection.
0° = [1111111111] (100% coincidences)
5° = [1111111111]
10° = [1111111110]
15° = [1111111110]
20° = [1111111100]
25° = [1111111000]
30° = [1111100000]
35° = [1111000000]
40° = [1100000000]
45° = [0000000000]

Now if we pick a pair of arbitrary angles, 10° and 40°, we get:
10° = [1111111110]
Diff 30° = 0.5 (empirical) :: 0.766 if the realities matched (falsified) -> [1111100000] versus [1111111100]: 20° = 0.767
40° = [1100000000]

Now what went wrong with realism here? Note that the strings represent coincidences, not detections. Furthermore, for any given detection, a potentially arbitrarily large, perhaps infinite, number of individual states, vectors, etc., went into defining that detection. Thus when looking at "coincidences", not detections, we can't automatically presume that the detections that define the coincidences between 10° and 40° are the same coincidences in detections as between 0° and 30°. Yet the 'reality' condition being imposed presumes that only a single coincidence pattern can be involved in a given coincidence rate. Thus each coincidence profile would have a distinct detection and coincidence profile for each particle and relative angle of detector, which can only be repeated on a twin to the degree that the relative detector angle matches the original relative detector angle, as defined relative to the polarization of that particle.

1. Yes...!

2. OK, now we are getting somewhere. But you have already jumped a few places too far here, and so we need to go back a step or two.

a. EPR defines realism as being the ability to predict the outcome in advance. That is a separate criterion from the Bell test itself, and something which is assumed to be true. In other words, if we have a Bell state, we have perfect correlations. If we have perfect correlations, then there is an element of reality. If we have elements of reality at all angles, then there must be beginning values which were predetermined IF realism applies. Do you follow this argument? This is straight from EPR, and Bell too. So if you agree on this definition of realism, we can apply it in your example.

b. To apply this to your example: we cannot simply say the correlations are [1111100000] or whatever. We need to specify the Alice values and the Bob values, as well as values for Chris. Later, during the test, we will make a separate selection of which pair (2 of the 3) we will actually pick. Then we calculate the coincidences. If you agree with this, then we can proceed to the next steps.

And I do agree that the set of coincidences for 0 and 30 degrees is different from that for 10 and 40 degrees. They have no causal connection to each other. I am glad you see that point; we will return to it later, I suspect. For the purposes of our example, don't worry about randomizing the results: just get values that work correctly when we ultimately do look at coincidences. We will likely need 12 items instead of 10 in order to make the example arithmetic work out, which lets us see that there are 3/12 coincidences per QM vs. 4/12 for local realism in my 0/120/240 example. By the way, that is also the same as 0/30/60 degrees, so we only need your 30 degree value to work everything out (since the perfect correlations are always 100%). Simple, eh?

Also, we need to agree about what a coincidence is. I call it a coincidence if there is a match, with no deduction for non-matches. Your formula seems to deduct for non-matches, which is confusing to me. Can we use the terms such that Match = Coincidence? That way, the coincidence rate at 45 degrees is 50%. Actually, I don't entirely follow your labeling of detections vs. coincidences. There are always detections in our ideal example.
 
  • #419
And just to be clear: We need observer Chris (i.e. 3 sets of values) because the open question is: Does the choice of observation (i.e. which 2 observers are selected out of 3) affect the outcome? You are arguing that it cannot (assuming the observers are spacelike separated). I say it does matter, that you cannot arrive at the QM predictions otherwise.
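To make the 3/12 vs. 4/12 arithmetic from the previous post concrete, here is a sketch (assuming predetermined +/- values for Alice, Bob, and Chris at 0/120/240; this is nobody's candidate model, just the counting): a dataset built to minimize agreement still averages 4 matches out of 12 over the three possible pair selections, and this particular one gives 4/12 for each pair, while QM calls for 3/12 at every pair.

[code]
from itertools import combinations, product

ANGLES = ("0", "120", "240")          # one predetermined value per angle per row

def match_counts(dataset):
    """Rows on which the two selected observers agree, for each pair choice."""
    return {
        (ANGLES[i], ANGLES[j]): sum(row[i] == row[j] for row in dataset)
        for i, j in combinations(range(3), 2)
    }

# Least-agreeing predetermined dataset: cycle through the six rows in which
# exactly one observer disagrees with the other two (each such row has 1 match).
rows = [r for r in product("+-", repeat=3) if len(set(r)) == 2]
dataset = (rows * 2)[:12]             # 12 rows, as in the example above

counts = match_counts(dataset)
for pair, hits in counts.items():
    print(pair, f"{hits}/12 matches")
print("average over pair choices:", sum(counts.values()) / 3, "out of 12")
print("QM prediction at these angles: 3/12 matches for any pair choice")
[/code]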
 
  • #420
Yes, in my string notation I ignored random detections not attributable to a causal mechanism per the "reality" postulate, and reordered them in nonrandom sequences. It would be trivial to simply randomize 0° and change the remaining values accordingly. I did this to grab the main content in a nonrandom, handwritten way, to directly compare what was considered "real" about the coincidences. I'm a little strapped for time at the moment, but your issue with detections vs. coincidences is something that needs to be worked out. My string notation can't really have helped, considering your version. I need to reformulate it as something you are more familiar with. You used a third observer, where I simply compared one pair of coincidences at one set of detector settings to a different pair, rather than to a third observer.

I'll be back later, hopefully with a third-person version, and also to reiterate my earlier issues with realism as defined in paragraph a, and again why EPR used it knowing its limitations. You're right that we should take it piece by piece.
 
