Is action at a distance possible, as envisaged by the EPR paradox?

In summary: John Bell was not a big fan of QM. He thought it was premature, and that the theory didn't yet meet the standard of predictability set by Einstein.
  • #596
my_wan said:
When you define 2 arbitrary polarizations, such as 22.5 and 30, this already requires a common coordinate system where both detectors agree on where the settings representing 22.5, 30, and every other setting, including 0, are. So even arbitrary settings require FTL information of some sort, albeit predefined. We don't consider this FTL because space has covariant symmetries wrt various coordinate systems. Yet, in the EPR case, relative covariance is maintained, i.e., the difference in detector settings, but covariance with the numerical labels we put on that coordinate system is broken. It makes our coordinate system look broken in this respect.
No, a common reference does not require FTL. The common reference is established when you set up the experiment. You measure the birefringence of the fiber used to transport photons from the source to the measurement site, and you compensate for that birefringence to establish the common reference.
For example, imagine the fiber is arranged so that at one measurement site photons are received from the up direction and the polarizer is rotated in the horizontal plane. In that case you simply establish the common reference by finding out what the angle of polarization is for photons coming out of the fiber, given a certain angle of polarization for photons going into the fiber at the other end.
So the common reference is established by the setup of the experiment. Nothing like FTL.
 
  • #597
DevilsAvocado said:
The 'trouble' I see with this explanation is that changing the setup in a real way (wires) creates real changes in the outcome... To me this indicates the wave function must "be there" to produce these changes... (as much as a water wave is real)

Look at this picture, and hopefully you see what I’m aiming at:

Changes in the setup are all you need to change the distribution of outcomes -- you don't need any 'thing' other than equipment characteristics, i.e., no reference to quantum entities, waves, etc. For example, see section 4.3, Geometrical Account of QLE, starting on p. 28 of our FoP paper, http://users.etown.edu/s/stuckeym/FOP 2008.pdf. In particular, notice how Eq. 31 becomes Eq. 32 on p. 29.
 
  • #598
my_wan said:
The way I constructed the HVs in the photons allows any level of violation of Bell's inequalities. I defined photons by a default polarization, followed by a binary digit for each angle available to the detector. A random number generator (min/max = 0/1) that exceeded the Malus' law probability for that angle set the digit to 0. So I did match the QM rates for any arbitrary angle 'difference', but only when the difference was definable.

Maybe we are not saying the same thing. Of course you DON'T have HV values for arbitrary thetas. I already gave you one set, 0/22.5/45. You got 85% for 0/22.5, so you must have about 65% for 22.5/45, which is incorrect of course. That's the point: you cannot have the ratios work out in your model unless you use 0 as one of the two points. If those work, others will not.
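A minimal sketch of one reading of my_wan's construction (my assumptions: a single shared random draw per photon, and the default polarization pinned to the 0 axis) reproduces both figures quoted in this exchange:

```python
import math
import random

def make_photon(angles, rng):
    """One photon, per one reading of the construction above: a single
    shared uniform draw r, plus a predetermined pass/block bit for each
    detector angle, set against Malus' law cos^2(theta). The default
    polarization is taken as the 0 axis (an assumption on my part)."""
    r = rng.random()
    return {theta: 1 if r < math.cos(math.radians(theta)) ** 2 else 0
            for theta in angles}

def match_rate(a, b, n=100_000, seed=1):
    """Fraction of photons whose predetermined bits agree at settings a, b."""
    rng = random.Random(seed)
    agree = 0
    for _ in range(n):
        bits = make_photon((a, b), rng)
        agree += bits[a] == bits[b]
    return agree / n
```

Under this reading, match_rate(0, 22.5) comes out near 0.854 (the QM value) but match_rate(22.5, 45) comes out near 0.65, while QM demands 0.854 there too, which is exactly the objection being raised here.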
 
  • #599
DrChinese said:
Maybe we are not saying the same thing. Of course you DON'T have HV values for arbitrary thetas. I already gave you one set, 0/22.5/45. You got 85% for 0/22.5, so you must have about 65% for 22.5/45, which is incorrect of course. That's the point: you cannot have the ratios work out in your model unless you use 0 as one of the two points. If those work, others will not.

Yes, but all the photons' default polarizations are perfectly randomized, so it makes no difference which polarization you call 0. So by labeling 22.5/45, you're assigning absolute meaning to a relative value. Consider this analogy:

You have 3 inertial observers A, B, and C. Relative to A, B and C are going 85 km/hour and 50 km/hour respectively. This doesn't mean that, relative to B, C is going 15 km/hour or 135 km/hour. So when you say 22.5/45 is incorrect, you are in a sense correct. For the same reason it's incorrect to say that, relative to B, B and C have velocities of 85 km/hour and 50 km/hour respectively.

Similar to the velocity vectors above, the photon interaction with a polarizer is the product of a vector space, and the 0 condition is not fundamentally different from a 0 self velocity of an inertial observer. So 22.5/45 is 0/22.5. You can't say that means the original 0 angle, in 0/22.5/45, must then match coincidences 100%, any more than you can say 2 inertial observers with 0 self velocity must then have 0 relative velocity wrt each other.

You specifically violated the condition I specified for it to work, to demonstrate it wouldn't work that way. Because this is ostensibly predicated on Bell's realism, i.e., pre-QM classical physics, I justified this required condition with 3 purely classical objects with a measurable property (velocity).

Of course it could be argued that a detection either happens or doesn't, unlike velocity. With detector settings 0/90, each is detecting a different 50% of the photons. If you move 0 -> 22.5 and 90 -> 112.5, then the 50% of photons being detected at 22.5 is a different 50% than what was detected at 0, yet it remains exactly the 50% not being detected at 112.5. The same holds for any settings, like 0/22.5. Even if you maintain that classical-type hidden variables can't mimic this, it is exactly what QM predicts in the photon detection statistics of a 'single' polarizer being rotated in a single beam of randomly polarized light: a photon that passes a polarizer at 0 has a ~85% chance of passing that same polarizer at 22.5. This is without a correlated pair in existence at all to communicate with FTL. Thus the detection statistics are a local QM phenomenon, local to the way a photon interacts with a polarizer. Yet whatever this local phenomenon is, it's deterministically replicable in a perfectly (anti)correlated particle.
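The ~85% figure in this paragraph is just Malus' law for a 22.5 degree rotation; a one-line check (my own sketch):

```python
import math

def malus(theta_deg):
    """Malus' law: probability that a photon known to pass a polarizer at
    0 degrees also passes the same polarizer rotated to theta_deg."""
    return math.cos(math.radians(theta_deg)) ** 2

# The ~85% quoted above, with no entangled partner anywhere in sight:
print(malus(22.5))  # ≈ 0.8536
```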
 
  • #600
my_wan said:
Yes, but all the photons' default polarizations are perfectly randomized, so it makes no difference which polarization you call 0. So by labeling 22.5/45, you're assigning absolute meaning to a relative value.

It does matter, because you won't get the QM-predicted values, and you won't.

All I ask is that the percentage of matches for 0/22.5 equal the percentage for 22.5/45, and that the 0/45 matches be 50%. Your model does not do this for any dataset with more than about 20 items. You should acknowledge that this is so. I assume you now understand why such a dataset is not possible.
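The impossibility claim here is the standard Bell-type counting argument; a minimal sketch, assuming each item in the dataset carries a predetermined outcome at all three angles and that QM's mismatch rate is sin^2 of the offset:

```python
import itertools
import math

# For ANY predetermined triple of outcomes (a bit at each of 0, 22.5 and
# 45 degrees), disagreement obeys a triangle inequality item by item:
for a, b, c in itertools.product((0, 1), repeat=3):
    assert int(a != c) <= int(a != b) + int(b != c)

# Averaged over any dataset, this forces
#   mismatch(0, 45) <= mismatch(0, 22.5) + mismatch(22.5, 45).
# The QM mismatch rate, sin^2 of the offset, violates that bound:
sin2 = lambda t: math.sin(math.radians(t)) ** 2
print(sin2(45.0), sin2(22.5) + sin2(22.5))  # 0.5 vs ~0.293
```

Since 0.5 exceeds 0.293, no assignment of predetermined bits at all three angles can hit the QM rates for every pair simultaneously, whatever the dataset size.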
 
  • #601
ThomasT said:
1. Say Alice and Bob are counter-propagating sinusoidal (light) waves that share a cloned property, e.g., they're identically polarized. Analyze this cloned property with crossed polarizers and you get entanglement correlation, cos^2 |a-b| in the ideal. It's just optics. Not that optics isn't somewhat mysterious in its own right. But we can at least understand that the entanglement stats so produced don't have to be due to Alice and Bob communicating with each other, or that nonseparability means Alice and Bob are the same thing in the sense of being physically connected when they reach the polarizers.

2. Bell didn't address this case, because it's precluded by the EPR requirement that lhv models of entanglement be expressed in terms of parameters that determine individual results.

3. On the other hand, since a local realistic computer simulation of an entanglement preparation is not the same as a local realistic formal model (in the EPR sense), then it wouldn't be at all surprising if such a simulation could reproduce the observed experimental results, and violate a BI appropriate to the situation being simulated -- and this wouldn't contradict Bell's result, but, rather, affirm it in a way analogous to the way real experiments have affirmed Bell's result.

1. I have news for you: this is patently FALSE. If you take 2 identically polarized photons and run them through the polarizers as you describe here, you do NOT get Cos^2 |a-b| or anything close to it. You ONLY get this for ENTANGLED photons. In other words: in the case where your assumption is actually valid - and I do mean identical and identically polarized photons coming out of a PDC crystal - you do NOT get entangled state statistics. You ONLY get those when the output is in a superposition of states. (Whether you get one or the other is a decision that the experimenter can make by altering the setup slightly.)

2. Bell did in fact discuss the case where the correlations are due to anti-symmetric considerations.

3. I would like to see one (and yes, it would surprise me). This is a somewhat complex subject and I am currently working with the De Raedt team (and another independent theoretical physicist) regarding some concerns I have expressed about their model. Their model does have some very interesting features. If it were possible to suitably express such a simulation, I think it might require some additional experimental analysis. It would not affect Bell's Theorem.
 
  • #602
DrChinese said:
It does matter IF you won't get the QM predicted values, which you won't.

All I ask is that the percentage of matches between 0/22.5 match the percentage from 22.5/45 and that the 0/45 matches are 50%. Your model does not do this for any dataset with more than about 20 items. You should acknowledge that is so. I assume you now understand why such a dataset is not possible.

And all I ask is that if you and I both have 0 self-momentum, we must not have any momentum relative to each other. Even with plain old Galilean relativity, that's not a very reasonable thing to ask, is it?

And all I ask is that if our relative velocity is 30 km/hour, and we both increase our velocity by 30 km/hour, our relative velocity must remain unchanged. Not a very reasonable thing to ask, is it?

Yet I do get the QM-predicted values if I'm allowed to define ANY angle as 0, just as any inertial observer can describe their own velocity as 0. If I change the definition of my velocity by X, it does not mean my measurement of your velocity changes by X. The same goes for linear changes in relative polarizer angles.

So why demand even stricter linearity in measurables than even Galilean relativity supports? Fundamentally, all EPR correlations measure is how much of the 50% of photons one polarizer detects overlaps with different polarizer settings. Yet, counterfactually, it is being presumed that the same subset of photons, with a common detection overlap, is involved at 2 different detector settings.
 
  • #603
my_wan said:
And all I ask is that if you and I both have 0 self-momentum, we must not have any momentum relative to each other. Even with plain old Galilean relativity, that's not a very reasonable thing to ask, is it?

And all I ask is that if our relative velocity is 30 km/hour, and we both increase our velocity by 30 km/hour, our relative velocity must remain unchanged. Not a very reasonable thing to ask, is it?

Yet I do get the QM-predicted values if I'm allowed to define ANY angle as 0, just as any inertial observer can describe their own velocity as 0. If I change the definition of my velocity by X, it does not mean my measurement of your velocity changes by X. The same goes for linear changes in relative polarizer angles.

So why demand even stricter linearity in measurables than even Galilean relativity supports? Fundamentally, all EPR correlations measure is how much of the 50% of photons one polarizer detects overlaps with different polarizer settings. Yet, counterfactually, it is being presumed that the same subset of photons, with a common detection overlap, is involved at 2 different detector settings.

I truly have no idea what you are talking about. I am discussing polarization, not velocity or relativity. If you generated a realistic dataset that works like real QM does, then simply show it. It is easy for me to request this since I KNOW you don't have it.

And why do you talk about the 50% detected? There is no 50%! They are ALL detected! The relevant issue is subensembles of the entire universe.
 
  • #604
DrChinese said:
I truly have no idea what you are talking about. I am discussing polarization, not velocity or relativity. If you generated a realistic dataset that works like real QM does, then simply show it. It is easy for me to request this since I KNOW you don't have it.

And why do you talk about the 50% detected? There is no 50%! They are ALL detected! The relevant issue is subensembles of the entire universe.

So what you are saying here is that a given polarizer setting passes all photons at that polarization? No, a given polarizer setting passes 50% of a randomly polarized beam of light. A polarizer setting at 90 degrees to that will pass exactly the other 50%. Thus any setting between those 2 must pass some of the photons from the 0 setting and some from the 90 setting. This is true with or without classical mechanisms. See a visual here:
http://www.lon-capa.org/~mmp/kap24/polarizers/Polarizer.htm

The only thing to explain, per Bell's ansatz, is why the transition between 0 and 90 is not, counterfactually, linear with changes in the angle. The exact same paradox exists, in the polarizer applet above, when you add a second inline polarizer and notice that, with arbitrary offsets from the first polarizer, the percentage of photons passing both polarizers does NOT fall off linearly between 0 and 90. EXACTLY the same non-linearity seen in EPR correlations, without ANY correlated photons. Yet EPR correlations indicate this is deterministically replicable if the photons are exactly (anti)correlated. But it is a LOCAL non-linearity producing it, exactly as seen in that applet.
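The nonlinearity in the applet is easy to tabulate; a sketch under the idealized Malus picture (unpolarized beam, first polarizer fixed at 0, second offset by theta), compared with a straight-line falloff:

```python
import math

def through_two(theta_deg):
    """Fraction of an unpolarized beam passing a polarizer at 0 followed
    by a second polarizer offset by theta_deg (Malus' law after the first)."""
    return 0.5 * math.cos(math.radians(theta_deg)) ** 2

def linear_guess(theta_deg):
    """Straight-line interpolation between theta = 0 and theta = 90."""
    return 0.5 * (1.0 - theta_deg / 90.0)

# The two curves agree only at 0, 45 and 90 degrees:
for theta in (0, 22.5, 45, 67.5, 90):
    print(theta, round(through_two(theta), 4), round(linear_guess(theta), 4))
```

At 22.5 degrees the surviving fraction is 0.4268 of the beam, i.e. 85.36% of the light that cleared the first polarizer, versus 75% for the linear falloff, which is the same pair of numbers that appears in the Bell-limit discussion below.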

Unlike the restrictions of Bell's realism, I include 'LOCAL' QM effects as valid effects to explain this non-linearity. If nature is defined as a 'real' pure field, how can you expect properties to be linear representations of parts? If the relevant issue is "subensembles", we are likely not dealing with a finite universe, even on the microscopic scale. But that is not, in itself, a violation of realism. Einstein did build GR as a causally connected field theory, however problematic that is for quantization.
 
  • #605
my_wan said:
So what you are saying here is that a given polarizer setting passes all photons at that polarization? No, a given polarizer setting passes 50% of a randomly polarized beam of light. A polarizer setting at 90 degrees to that will pass exactly the other 50%. Thus any setting between those 2 must pass some of the photons from the 0 setting and some from the 90 setting. This is true with or without classical mechanisms. See a visual here:
http://www.lon-capa.org/~mmp/kap24/polarizers/Polarizer.htm

I keep trying to tell you that this is NOT how most real experiments are performed. Polarizing beam splitters are used. 100% of the light emerges, and it goes one way or another. That way, there is no question that there is a match.

You are creating artificial confusion by talking about the counterfactual "overlapping" or whatever it is. In fact, Alice and Bob are always counted (ideal case of course).
 
  • #606
my_wan said:
The exact same paradox exists, in the polarizer applet above, when you add a second inline polarizer and notice that, with arbitrary offsets from the first polarizer, the percentage of photons passing both polarizers does NOT fall off linearly between 0 and 90. EXACTLY the same non-linearity seen in EPR correlations, without ANY correlated photons. Yet EPR correlations indicate this is deterministically replicable if the photons are exactly (anti)correlated. But it is a LOCAL non-linearity producing it, exactly as seen in that applet.

You know, that is an interesting similarity. But it actually has nothing directly to do with Bell test correlations. Those are obtained by a different technique, and yes, there is an underlying mathematical relationship connecting them. But that is where the connection ends.

If you can formulate a non-local connection between 2 polarizers in series, go for it. But that analogy does not apply to Bell tests. In fact, I am sure that there probably IS a connection at some deep level as you suggest. After all, the Heisenberg Uncertainty Principle is at work in both cases so that is to be expected. In my opinion, the same quantum non-locality is at work whenever the HUP is invoked. But everyone may not agree with that opinion.

However, that does not change the fact that it is the ENTANGLED connection which is of interest in Bell tests. It is that paradox which is at hand, and which is the subject of EPR.
 
  • #607
Note: Experimental constraints are too high for me to worry about anything but the ideal case.
DrChinese said:
And why do you talk about the 50% detected? There is no 50%! They are ALL detected! The relevant issue is subensembles of the entire universe.
This is a single polarizer with a photon detector:
[Attachment: 1x50.jpg, single polarizer with photon detector]

Now when we turn a second polarizer to 22.5 degrees, relative to that one, we get:
[Attachment: 2x22.5.jpg, second polarizer offset 22.5 degrees]

Given that only 50% of the original beam hits the second polarizer, it's passing 85.36% of the polarized light hitting it. This is precisely the percentage of EPR correlations at that same angle offset. It also matches at EVERY arbitrary offset. I take this to empirically mean that 2 polarizers, with a 22.5 degree offset, will counterfactually detect 85.36% of the same individual photons.

DrChinese said:
You know, that is an interesting similarity. But it actually has nothing directly to do with Bell test correlations. Those are obtained by a different technique, and yes, there is an underlying mathematical relationship connecting them. But that is where the connection ends.
So a point-for-point, angle-for-angle exact match is no connection? Let's look at what Bell's ansatz operationally assumed: that the correlations of any local realistic EPR mechanism must transition linearly from 50% to 100% max. Hence the 75% Bell limit on correlations at 22.5 degrees. But if a beam of polarized light does NOT transition linearly between 0 and 90 degrees, how can you possibly expect (presumed deterministic) correlations to?

DrChinese said:
If you can formulate a non-local connection between 2 polarizers in series, go for it. But that analogy does not apply to Bell tests. In fact, I am sure that there probably IS a connection at some deep level as you suggest. After all, the Heisenberg uncertainty principle is at work in both cases so that is to be expected. In my opinion, the same quantum non-locality is at work whenever the HUP is invoked. But everyone may not agree with that opinion.

The first sentence is kind of interesting, but my only point was that if the mechanism that induces the non-linearity in uncorrelated photons is a QM property of the way a photon interacts with a polarizer, and this interaction is fundamentally deterministic, it cannot be used as an ansatz to define a non-local mechanism. There is no doubt whatsoever that the HUP is empirically valid, but that doesn't rule out a local deterministic underpinning, with or without finite parts (subensembles). Bell's ansatz is contingent upon countable subensembles with 'absolute' (measurable) properties.

DrChinese said:
However, that does not change the fact that it is the ENTANGLED connection which is of interest in Bell tests. It is that paradox which is at hand, and which is the subject of EPR.
Yes, but the locality claims about the meaning of the ENTANGLED connection are predicated on a linearity that is trivially violated generally, even in Newtonian physics, and specifically in polarizer/photon interactions without EPR correlations. Yes, the entangled states are interesting, but the non-linearity across relative detector settings does not represent a test of locality, except under the rawest assumption that all observables have perfectly linear relationships with things.

You keep asking for a dataset, but you'll just hang on to the notion that you must be able to plug in 22.5/45 and get the same answer as 0/22.5. To that I have 1 question: if predefining a common coordinate system such that 22.5/45 has a relative difference of 22.5 is not a FTL cheat, why then is predefining ONLY the relative difference a FTL cheat? Coordinate systems are by definition non-physical; only the symmetries on them are.
 

  • #608
ThomasT said:
You asked if the mathematical legitimacy of Bell's theorem is irrefutable. The mathematical form of Bell's theorem is the Bell inequalities, and they are irrefutable. Their physical meaning, however, is debatable.

In order to determine the physical meaning of the inequalities we look at where they come from, Bell's locality condition, P(AB|H) = P(A|H)P(B|H).

Then we can ask what you asked and we see that:
1. A and B are correlated in EPR settings.
2. Bell uses P(AB|H) = P(A|H)P(B|H)
3. P(AB|H) = P(A|H)P(B|H) is invalid when A and B are correlated.

Conclusion: The form, P(AB|H) = P(A|H)P(B|H), cannot possibly model the experimental situation. This is the immediate cause of violation of BIs based on limitations imposed by this form.

What does this mean?

P(AB|H) = P(A|H)P(B|H) is the purported locality condition. Yet it is, first of all, the definition of statistical independence. The experiments are prepared to produce statistical dependence via the measurement of a relationship between two disturbances by a joint or global measurement parameter, in accordance with local causality.

Bell inequalities are violated because an experiment prepared to produce statistical dependence is being modeled as an experiment prepared to produce statistical independence.

Bell's theorem says that the statistical predictions of qm are incompatible with separable predetermination. Which, according to certain attempts (including mine) at disambiguation, means that joint experimental situations which produce (and for which qm correctly predicts) entanglement stats can't be viably modeled in terms of the variable or variables which determine individual results.

Yet, per EPR elements of reality, the joint, entangled, situation must be modeled using the same variables which determine individual results. So, Bell rendered the lhv ansatz in the only form that it could be rendered in and remain consistent with the EPR meaning of local hidden variable.

Therefore, Bell's theorem, as stated above by Bell, and disambiguated, holds.

Does it imply nonlocality -- no.

DrChinese said:
This is not correct because it is not what Bell says. You are mixing up his separability formula (Bell's (2)), which has a different meaning. Bell is simply saying that there are 2 separate probability functions which are evaluated independently. They can be correlated; there is no restriction there, and in fact Bell states immediately following that "This should equal the Quantum mechanical expectation value...", which is 1 when the a and b settings are the same. (This being the fully correlated case.)

DrC, ThomasT.

You both appear to agree that Bell uses P(AB|H) = P(A|H).P(B|H) in his work.

I cannot see how EPR studies using that formula could be serious. If H includes a hidden variable for each particle, that formula gives P(AB|H) = P(A|H).P(B|H) = (1/2).(1/2) = 1/4.

Can you direct me to an example where Bell uses P(AB|H) = P(A|H).P(B|H) in his work, please?

[Apologies for possible hi-jack; I will add this under Understanding Bell's mathematics.]
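The 1/4 in the question above is what you get only if the outcomes carry no dependence on H; Bell's condition constrains the probabilities conditional on the hidden variable, and the average over H retains the correlation. A toy sketch (my own illustration, not from Bell's papers):

```python
from fractions import Fraction

# Toy hidden-variable model: lam is 0 or 1 with probability 1/2 each,
# and deterministically fixes both outcomes (A = B = lam).
half = Fraction(1, 2)
lams = [(0, half), (1, half)]

def p_a(lam):
    """P(A = 1 | lam): Alice's outcome is just lam."""
    return Fraction(lam)

def p_b(lam):
    """P(B = 1 | lam): Bob's outcome is also lam (perfect correlation)."""
    return Fraction(lam)

# Bell's condition P(AB|lam) = P(A|lam) P(B|lam) holds for every lam,
# trivially, since the outcomes are deterministic given lam.

# But the lam-averaged joint probability is NOT the product of marginals:
p_ab = sum(w * p_a(lam) * p_b(lam) for lam, w in lams)  # comes out to 1/2
p_a_marg = sum(w * p_a(lam) for lam, w in lams)         # 1/2
p_b_marg = sum(w * p_b(lam) for lam, w in lams)         # 1/2
# p_ab = 1/2, while p_a_marg * p_b_marg = 1/4.
```

So the factorization and the correlation coexist: the product rule applies at fixed lam, not to the lam-averaged marginals, which is where the 1/4 calculation goes wrong.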
 
  • #609
my_wan said:
...

You are wandering all over the place. When you want to tackle a point, I will be glad to discuss. I have asked you before to stop your meandering. Listen to what I am saying, and re-read your responses. You are just flailing.

I told you to look at PBSs not polarizers. I know how polarizers work, you don't need to provide a diagram. They have nothing to do with the discussion. We are talking about Bell's theorem and Bell tests.

I know you have a lot of pet ideas. So what? We are NOT here to discuss your pet ideas. The point is to discuss the science of EPR and Bell. I know you are "supremely" confident of your ideas, but you have yet to demonstrate a single cogent idea. I refuse to continue if you won't be a well-behaved participant.
 
  • #610
JenniT said:
DrC, ThomasT.

You both appear to agree that Bell uses P(AB|H) = P(A|H).P(B|H) in his work.

I cannot see how EPR studies using that formula could be serious. If H includes a hidden variable for each particle, that formula gives P(AB|H) = P(A|H).P(B|H) = (1/2).(1/2) = 1/4.

Can you direct me to an example where Bell uses P(AB|H) = P(A|H).P(B|H) in his work, please?

[Apologies for possible hi-jack; I will add this under Understanding Bell's mathematics.]

Will continue that part of the discussion in that thread...
 
  • #611
I was reading over the rebuttals, and it seems I often misinterpreted your claim of 100% of photons emerging from a polarizer. I argued the polarizer effect by narrowing attention to a particular subsystem of the experiment. I do need to include the polarizing beam splitter, if for no other reason than to avoid some confusion.

Yes it's true that a PBS effectively detects ~100% of the light. Yet this still represents a single detection axis. So let's see what looking at both outputs of a PBS entails in the argument I posed. Consider a PBS in front of a randomly polarized beam of light. ~50% will be diverted to 1 detector, while the other ~50% is diverted to another. By the argument I proposed, if you rotate that PBS 22.5 degrees, ~15% of the light that would have been diverted 1 way is now diverted the other way.

Now consider a pair of PBS/detectors at each end of an EPR experiment. With both PBSs set on the same axis we get ~100% correlation. We offset 1 PBS by 22.5 degrees. Each photon has a certain tolerance for how far the PBS axis can be from the photon's default polarization before it's diverted the other way by the PBS. When this tolerance in the difference between the PBS detection axes is exceeded, then, in spite of being anticorrelated with its partner, the photon reads as uncorrelated.
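One way to make this 'tolerance' picture concrete (my own toy model, not necessarily my_wan's exact construction): route each photon to whichever PBS axis lies within 45 degrees of its polarization. A sharp tolerance like that flips ports linearly in the offset (delta/90, so 25% at 22.5 degrees), while the quantum rate is sin^2(delta), about 14.6%, which is the very nonlinearity under discussion:

```python
import math
import random

def port_nearest_axis(pol_deg, pbs_deg):
    """Sharp 'tolerance' routing: the photon exits port 0 if its
    polarization is within 45 degrees of the PBS axis, else port 1."""
    d = abs(pol_deg - pbs_deg) % 180.0
    return 0 if min(d, 180.0 - d) < 45.0 else 1

def flipped_fraction(delta_deg, n=100_000, seed=2):
    """Monte Carlo: fraction of randomly polarized photons that change
    ports when the PBS rotates from 0 to delta_deg."""
    rng = random.Random(seed)
    flips = 0
    for _ in range(n):
        p = rng.uniform(0.0, 180.0)
        flips += port_nearest_axis(p, 0.0) != port_nearest_axis(p, delta_deg)
    return flips / n

# Sharp tolerance gives delta/90 (linear), e.g. ~0.25 at delta = 22.5,
# whereas sin^2(22.5 degrees) is ~0.146, the "~15%" figure above.
```

So a fixed, sharp tolerance reproduces exactly the linear profile Bell's bound describes, not the observed sin^2 one; any local account has to smear the tolerance in just the right way.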

Bell's ansatz assumes a locally realistic mechanism must take a form that transitions linearly with the change in angle. What we have is a transition that changes with the square of the cosine of the angle. Yet this empirical fact is ubiquitous: the same rule applies to polarizers, to the efficiency loss in aerial antennas offset from the ideal setting, etc. This empirical fact may or may not have a realistic basis. But the fact that EPR correlations exhibit the same detection profile says nothing about locality when the same effect occurs without any correlations involved. EPR correlations, in this view, would only indicate that the mechanism is deterministically replicable.

By the way, if the wavefunction is assumed to be real, with particles being a projection from a Hilbert space construct, it's reasonable that the square of the angle defines the observables. Even if only a subset of the 'possibilities' formally defined in Hilbert space represent an actual state. Self interaction still seems to require an ensemble (possible infinity) of micro-states.
 
  • #612
my_wan said:
1. I was reading over the rebuttals, and it seems I often misinterpreted your claim of 100% of photons emerging from a polarizer. I argued the polarizer effect by narrowing attention to a particular subsystem of the experiment. I do need to include the polarizing beam splitter, if for no other reason than to avoid some confusion.

Yes it's true that a PBS effectively detects ~100% of the light. Yet this still represents a single detection axis. So let's see what looking at both outputs of a PBS entails in the argument I posed. Consider a PBS in front of a randomly polarized beam of light. ~50% will be diverted to 1 detector, while the other ~50% is diverted to another. By the argument I proposed, if you rotate that PBS 22.5 degrees, ~15% of the light that would have been diverted 1 way is now diverted the other way.

Now consider a pair of PBS/detectors at each end of an EPR experiment. With both PBSs set on the same axis we get ~100% correlation. We offset 1 PBS by 22.5 degrees. Each photon has a certain tolerance for how far the PBS axis can be from the photon's default polarization before it's diverted the other way by the PBS. When this tolerance in the difference between the PBS detection axes is exceeded, then, in spite of being anticorrelated with its partner, the photon reads as uncorrelated.

2. Bell's ansatz assumes a locally realistic mechanism must take a form that linearly transitions with the change in angle...

1. This is correct, you end up with subensembles where you have HH, VV, HV and VH. These are experimentally verifiable. What is counterfactual is the realistic case where there are 3 settings, and you get 8 permutations: HHH, HHV, ... , VVV.

2. Bell does not say this. He says that the local realistic formula ideally should reproduce the quantum expectation value. That is, if there is to be agreement between local realism and QM. So then you notice that it more or less requires the function to have a second derivative of zero (i.e. stationary) so that the realism requirement works. Now, this is not an absolute requirement per se. But you can see that he is setting things up to hint strongly that there will be a contradiction. And he is sharing some of his thoughts about how he arrives at his proof.
 
  • #613
1. Yes, but hidden variables may themselves be subassemblies of those measurables, which define the measurables, rather than just a hidden appendage to them.

2. Wrt: "Bell does not say this."
So, his ansatz, which assumes a maximum classical correlation of 0 = 100%, 22.5 = 75%, 45 = 50%, 67.5 = 25%, and 90 = 0%, is not a requirement that max correlation statistics must transition linearly with the angle?

Here is an approach that takes a generally similar tack to my argument, with the a priori known probability distribution, but in the context of classical nonlinear filtering in a stochastic system.
http://arxiv.org/abs/0907.2327
Abstract: A model is developed to describe state reduction in an EPR experiment as a continuous, relativistically-invariant, dynamical process. The system under consideration consists of two entangled isospin particles, each of which undergoes isospin measurements at spacelike separated locations. The equations of motion take the form of stochastic differential equations. These equations are solved explicitly in terms of random variables with a priori known probability distribution in the physical probability measure. In the course of solving these equations a correspondence is made between the state reduction process and the problem of classical nonlinear filtering. It is shown that the solution is covariant, violates Bell inequalities, and does not permit superluminal signaling. It is demonstrated that the model is not governed by the Free Will Theorem, and it is argued that the claims of Conway and Kochen, that there can be no relativistic theory providing a mechanism for state reduction, are false.
 
  • #614
my_wan said:
1. Yes, but hidden variables may themselves be subassemblies of those measurables, which define the measurables, rather than just a hidden appendage to them.

2. Wrt: "Bell does not say this."
So, his ansatz, which assumes a maximum classical correlation of 0 = 100%, 22.5 = 75%, 45 = 50%, 67.5 = 25%, and 90 = 0%, is not a requirement that max correlation statistics must transition linearly with the angle?

3. Here is an approach that takes a generally similar tack to my argument, with the a priori known probability distribution, but in the context of classical nonlinear filtering in a stochastic system...

1. They can only go as deep as A and B. There is no C, hence no realism.

2. I think you mean the boundary point of a Bell inequality. Bell does not require that boundary to be the actual expectation function. Rather, that QM and LR are on different sides of it.

3. Again, another author who does not feel the need to provide for realism in their "realistic" solution. Hey, Joy Christian just came up with yet another "disproof of Bell" this week! Same thing, proof of hidden variables for A and B but not C. So what is the point of touting realism when no realistic dataset is forthcoming? A single counterexample should do it!
 
  • #615
DrChinese said:
1. They can only go as deep as A and B. There is no C, hence no realism.
The 3rd variable is counterfactual in Bell's EPR argument, so the realism is suspect in that case.

DrChinese said:
2. I think you mean the boundary point of a Bell inequality. Bell does not require that boundary to be the actual expectation function. Rather, that QM and LR are on different sides of it.
Well naturally the linear assumption is a boundary rather than a prediction. Yet it remains that Bell's ansatz assumes a classical mechanism cannot exceed this linear boundary.
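To make that boundary concrete, here is a minimal sketch (Python; function names are mine) tabulating the linear "classical" match rate quoted earlier in the thread against QM's cos² prediction at the same angles. QM sits above the line below 45° and below it above 45°:

```python
import math

# Match probability for a pair analyzed at relative angle theta (degrees).
# QM predicts cos^2(theta); the linear boundary runs straight from
# 100% at 0 degrees down to 0% at 90 degrees (the values quoted above).
def qm_match(theta_deg):
    return math.cos(math.radians(theta_deg)) ** 2

def linear_match(theta_deg):
    return 1.0 - theta_deg / 90.0

for theta in (0, 22.5, 45, 67.5, 90):
    print(f"{theta:5.1f} deg: linear={linear_match(theta):.3f}  QM={qm_match(theta):.3f}")
```

At 22.5° the table reads linear=0.750 vs QM=0.854, which is exactly the gap the Bell tests exploit.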

DrChinese said:
3. Again, another author who does not feel the need to provide for realism in their "realistic" solution. Hey, Joy Christian just came up with yet another "disproof of Bell" this week! Same thing, proof of hidden variables for A and B but not C. So what is the point of touting realism when no realistic dataset is forthcoming? A single counterexample should do it!
Actually, wrt the authors mentioned, I have to agree... :blushing: They tend to overstate the significance of what they provide. Such attempts do remain important though.

The thing is, claims about what violations of Bell's inequalities actually mean tend to be overstated on both sides of the fence. We are both arguing on the grounds of what we don't know: the nature of a connection between spacelike separated correlations. The argument from ignorance is inherent in the whole debate. I appreciate you making me think though.

Wrt a dataset, you're not going to be happy with the floating 0 angle to maintain relative detector data locally. Neither am I really, but the physical significance of a choice of coordinate labels, distinct from the symmetries, is also dubious. My modeling attempts are to articulate the issues in my mind. They involve generating a list of thousands of random virtual photons and looping through them with a set of virtual detectors. I'm still trying some new, likely dubious, ideas. If we are dealing with transfinite subensembles it may not be possible with or without FTL. But the objective is to learn the issues in as much detail as possible. Adding both sides of the PBS output is actually quite useful.
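For what it's worth, the kind of virtual-photon loop described above can be sketched in a few lines (Python; the Malus-law detection rule is my own assumption for the sketch, not a claim about the actual model). Each pair shares one random polarization, and each detector transmits with probability cos² of the relative angle. The resulting same-port coincidence curve is 1/2 + cos(2Δ)/4, which never reaches perfect correlation at Δ=0 and stays well short of QM's cos²Δ:

```python
import math, random

random.seed(1)

def same_port_rate(delta_deg, n=20000):
    """Toy local model: each photon pair shares a random polarization lam.
    Each detector transmits (+1) with Malus-law probability cos^2(setting - lam),
    otherwise the photon exits the other PBS port (-1)."""
    delta = math.radians(delta_deg)
    same = 0
    for _ in range(n):
        lam = random.uniform(0.0, math.pi)
        a = 1 if random.random() < math.cos(0.0 - lam) ** 2 else -1
        b = 1 if random.random() < math.cos(delta - lam) ** 2 else -1
        same += (a == b)
    return same / n

for d in (0, 22.5, 45, 90):
    qm = math.cos(math.radians(d)) ** 2
    print(f"{d:5.1f} deg: toy={same_port_rate(d):.3f}  QM={qm:.3f}")
```

Counting both PBS ports, as suggested above, is what makes the same/different bookkeeping here straightforward.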
 
  • #616
my_wan said:
The 3rd variable is counterfactual in Bell's EPR argument, so the realism is suspect in that case.

That is the definition of realism. If there is no simultaneous C, there is nothing to discuss in a hidden variable theory. It simply isn't a hidden variable theory.

Because Bell slips this requirement in such a subtle manner, it doesn't jump out to many folks. But there it is, right after his (14), and it is quite evident: a, b and c are all together in one equation.

So it is simple: if you reject this assumption as meaningful, then the Bell result is not meaningful.

But you will be part of a small minority. Hey, some people don't like the Beatles either.
 
  • #617
If we take fully generalized thermodynamic models and/or Hilbert space seriously, we could also be looking at a version of Hilbert's paradox of the Grand Hotel. Of course that begs the question of why QM is normalizable. Yet that's a bit of a soft spot from a foundational perspective anyway. Yet, again, if a unit vector is a sum over an infinite number of local "hotel rooms", infinitesimals momentarily occupying a finite subset of those rooms, it still doesn't require FTL as a mechanism.

Would you consider 'actual infinities' a violation of realism? Even the linked paper by Bedingham, using a stochastic model, appears to be stuffing an arbitrary number of possible states into a singular ensemble. Same for the thermodynamic model, with statistically complete variables, linked a few pages back. Hilbert space, with its required metric completeness, appears to require the same thing, if it's taken to be physically real in some sense.

This also appears to be a required property for Quantum Computers to work as expected. Who was it that offered Quantum Computers as proof of MWI, due to there not being enough particles in the Universe to mimic them? The Axiom of Choice also appears to be related in some sense.

So what is your view wrt realism if it's defined in terms of 'actual infinities'?
 
  • #618
my_wan said:
If we take fully generalized thermodynamic models and/or Hilbert space seriously, we could also be looking at a version of Hilbert's paradox of the Grand Hotel. Of course that begs the question of why QM is normalizable. Yet that's a bit of a soft spot from a foundational perspective anyway. Yet, again, if a unit vector is a sum over an infinite number of local "hotel rooms", infinitesimals momentarily occupying a finite subset of those rooms, it still doesn't require FTL as a mechanism.

Would you consider 'actual infinities' a violation of realism?... So what is your view wrt realism if it's defined in terms of 'actual infinities'?

There is no dividing line between QM and realism on this subject. I don't see how the problem of infinities relates to realism. I guess you are saying that infinities cannot exist, and that somehow that means that counterfactuals don't have to exist. But I am not asserting counterfactuals exist, you are. Or at least you are if you are a realist.
 
  • #619
DrChinese said:
There is no dividing line between QM and realism on this subject. I don't see how the problem of infinities relates to realism. I guess you are saying that infinities cannot exist, and that somehow that means that counterfactuals don't have to exist. But I am not asserting counterfactuals exist, you are. Or at least you are if you are a realist.
You're reading way too much into my words, apparently based on a 'perception' of my position. In fact I said "actual infinities" may indeed exist, a sentiment that I have stated several ways before. Here I suggested perhaps the incongruence in counterfactual measures might be a real 'physical' result of Hilbert's paradox of the Grand Hotel.

I'm not holding nature to a conception of my choice. I am debating points for which I lack certainty, in the hopes of learning something that increases or decreases that certainty. The highly limited few things I have a fair degree of certainty on are not even included in my arguments. I've asserted how I think it's possible for counterfactuals to be interpreted within a 'particular' contextual construct, but mostly dropped it for lack of clarity. But I can't a priori reject reasonable arguments, even if they lack the conclusiveness the authors wish them to have.

When you objected with: "another author who does not feel the need to provide for realism in their "realistic" solution", I had to agree that, in spite of some reasonable content, your objection was essentially valid. I don't see any solid justification on either side. The non-realist seems to say: we don't see it, so it must not exist. The realist seems happy to suggest mechanisms without actually stating what's real. So I began thinking about how Bedingham and others smooth over Bell violations, where they hide the inequalities, and why that's not sufficient for some definitions of realism.

I'm not asking you to accept Hilbert's hotel paradox as an actual explanation, only that something like it might in principle be so. My question was far more limited: to get a better picture of what you would 'in principle' accept as a realistic model. Because a repeat of Bell's realism really leaves me with a lot of questions about the range of what can and can't qualify as realism in your view. It seems the definitions used by various authors are incongruent, even when based on the same words, like the definition used by Bell.

Note: 'actual infinities' is a distinct concept from infinities in general. 'Actual infinities' are existential, so by definition they relate to realism. And it seemed to me the approach Bedingham et al. used implicitly stuffed extra occupants into Hilbert's hotel, and even provided some justification in terms of Hilbert space, quantum computers, etc. I remain at a loss for how you define the constraints of what qualifies as realism. You've rejected my characterization as a linear part-->measurable property, and continually quote my text that says one thing and characterize it as saying another. My desire for more concrete definitions is hampered by assumptions about my positions, opposite of what I stated, on the very questions I ask to articulate those definitions. So I can only guess what your answer might have been.
 
  • #620
ThomasT said:
1. Say Alice and Bob are counter-propagating sinusoidal (light) waves that share a cloned property, eg., they're identically polarized. Analyze this cloned property with crossed polarizers and you get entanglement correlation: cos^2 |a-b| in the ideal. It's just optics. Not that optics isn't somewhat mysterious in its own right. But we can at least understand that the entanglement stats so produced don't have to be due to Alice and Bob communicating with each other, or that nonseparability means that Alice and Bob are the same thing in the sense that they're actually physically connected when they reach the polarizers.

DrChinese said:
1. I have news for you: this is patently FALSE. If you take 2 identically polarized photons and run them through the polarizers as you describe here, you do NOT get Cos^2 |a-b| or anything close to it.
In the cases you're talking about, the explanation is that the photons (while sometimes very closely polarized) aren't identically polarized. They're not 'clones' of each other. How do we know that? Precisely because when you run them through the polarizers you don't get cos^2 |a-b| entanglement stats (but you do get a range of approximations of essentially the same sinusoidal angular dependency -- which suggests to me that 'entanglement' is simply a special case involving the same underlying physical principles, which include, but aren't limited to, (1) the principle of locality and (2) the cos^2 theta rule).

DrChinese said:
You ONLY get this for ENTANGLED photons.
I agree. They (or a common property that's being jointly measured) are clones of each other. Which means that they're, eg., identically polarized. Which is deduced via the production of entanglement stats.

DrChinese said:
In other words: in the case where your assumption is actually valid - and I do mean identical and identically polarized photons coming out of a PDC crystal - you do NOT get entangled state statistics.
Then, as I said above, these photons aren't cloned (ie., entangled) wrt polarization. In this case, we can assume that |L1 - L2| > 0 (ie., we can assume that they weren't identically polarized), where L1 and L2 denote the optical vectors of the photons.
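For reference, here is what one simple shared-polarization model gives (a sketch; the deterministic pass/block rule is one common toy choice, my assumption here, not either poster's model). If each photon of a pair carries the same random polarization λ and passes a polarizer exactly when λ lies within 45° of the analyzer axis, the match rate comes out linear in the offset, 1 − 2Δ/π, not cos²Δ:

```python
import math, random

random.seed(2)

def passes(setting, lam):
    # Deterministic toy rule: the photon passes iff its polarization lam
    # lies within 45 degrees of the analyzer axis (angles taken mod 180 deg).
    d = abs((setting - lam + math.pi / 2) % math.pi - math.pi / 2)
    return d < math.pi / 4

def match_rate(delta_deg, n=20000):
    # Fraction of pairs giving the same pass/block outcome at offset delta.
    delta = math.radians(delta_deg)
    same = 0
    for _ in range(n):
        lam = random.uniform(0.0, math.pi)
        same += (passes(0.0, lam) == passes(delta, lam))
    return same / n

for d in (0, 22.5, 45, 90):
    print(f"{d:5.1f} deg: toy={match_rate(d):.3f}  linear={1 - d / 90:.3f}  "
          f"QM={math.cos(math.radians(d)) ** 2:.3f}")
```

So under this rule identically polarized pairs do produce a sinusoid-free, straight-line angular dependence, illustrating the gap between "identically polarized" and the entangled cos²|a-b| stats being debated here.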

------------------------

ThomasT said:
2. Bell didn't address this case, because it's precluded by the EPR requirement that lhv models of entanglement be expressed in terms of parameters that determine individual results.
DrChinese said:
2. Bell did discuss the case where the correlations are due to anti-symmetric considerations.
That's not what I'm talking about -- which is that if Bell had modeled the joint situation in the global (ie., nonseparable) terms that it actually required (involving some modification in the representation of the 'beables' involved), then he might have presented a local realistic model which would have reproduced the qm correlation. The point of departure for viable local realistic models is that an experimental situation measuring a joint microphysical parameter via a joint measurement parameter requires a 'nonseparable' representation. Such models have been produced, they work, and they remain unrefuted.

(Wrt my statement 2. above, I've come to think that EPR's definition of reality doesn't require that LR models of entanglement be expressed in terms of parameters that determine individual results. That is, there can be a common, underlying parameter that determines joint results while not determining individual results, and this, realistic, conception isn't contradicted by the EPR's conception of reality and definition thereof vis elements of reality.)

-----------------------------------------------

ThomasT said:
3. On the other hand, since a local realistic computer simulation of an entanglement preparation is not the same as a local realistic formal model (in the EPR sense), then it wouldn't be at all surprising if such a simulation could reproduce the observed experimental results, and violate a BI appropriate to the situation being simulated -- and this wouldn't contradict Bell's result, but, rather, affirm it in a way analogous to the way real experiments have affirmed Bell's result.

DrChinese said:
3. I would like to see one (and yes, it would surprise me). This is a somewhat complex subject and I am currently working with the De Raedt team (and another independent theoretical physicist) regarding some concerns I have expressed about their model. Their model does have some very interesting features. If it were possible to suitably express such a simulation, I think it might require some additional experimental analysis. It would not affect Bell's Theorem.
Not the math itself, no, but it would affect the physical interpretation of BI violations wrt locality and determinism -- rendering them irrelevant wrt those considerations.

---------------------------------------------------

From the thread: "Why the De Raedt Local Realistic Computer Simulations are wrong", you stated:

DrChinese said:
In trying to show that there "could" be an exception to Bell, please consider the following to add to your list of tests for your candidate LHV theory:
... snip ...
DrChinese said:
b) The formula for the underlying relationship will be different than the QM predictions, and must respect the Bell Inequality curve. I.e. usually that means the boundary condition which is a straight line, although there are solutions which yield more radical results.
If you're requiring that an LR model of entanglement not agree with qm predictions or experimental results, then I now see the point of your 'LR dataset' requirement. Well, yes, I certainly agree that one way to rule out qm compatible and viable LR accounts of entanglement is to simply require them to be incompatible with qm and inaccurate. But that would be inane. So I must be misunderstanding what you mean.
 
  • #621
ThomasT said:
1. Then, as I said above, these photons aren't cloned (ie., entangled) wrt polarization. In this case, we can assume that |L1 - L2| > 0 (ie., we can assume that they weren't identically polarized), where L1 and L2 denote the optical vectors of the photons.

2. If you're requiring that an LR model of entanglement not agree with qm predictions or experimental results, then I now see the point of your 'LR dataset' requirement. Well, yes, I certainly agree that one way to rule out qm compatible and viable LR accounts of entanglement is to simply require them to be incompatible with qm and inaccurate. But that would be inane.

1. Again, this is patently false. They most certainly ARE polarization clones of each other. And they are entangled. But they are not polarization entangled, which is quite different. If we accept your physical assumption of "counter-propagating influences", then these should produce the same statistics as entangled particles. But they don't.

Now why are these particles acting differently? Because they are NOT in a superposition of polarization states. This is meaningful within QM but has no counterpart in a local realistic theory - in which there is no such thing as a superposition (by definition). Take a look at how these photon pairs are produced and you will see how ridiculous your assertion is. A reference:

Theory of two-photon entanglement in type-II optical parametric down-conversion
M. Rubin, D. Klyshko, Y. Shih, A. Sergienko
Physical Review A, December 1994
http://sws.bu.edu/alexserg/PRA_50_5122.pdf

"Using Eq. (41), it is easy to see that |Phi'> is a product state when Psi=pi/8; otherwise it is in an entangled state. It is an EPR state if Psi=0 or pi/4 and is a linear superposition of two EPR states for all other Psi's..."

What this means is that the only difference in producing the entangled state versus the product state is a small rotation of a wave plate. Perhaps you could explain how that separates these streams using a local realistic viewpoint. (P.S. this is a trick question because any accurate answer would show where to find the physical source of entanglement, and there isn't one.) Similarly there are other ways to break polarization entanglement and all of them rely on gaining knowledge of "which path" and therefore do not produce a superposition.
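For concreteness, the product-state/entangled-state distinction referenced above can be checked directly from the two states (a minimal sketch; amplitudes are written in the HH, HV, VH, VV basis, and the helper names are mine). The product state's both-pass probability factors as cos²(a)·cos²(b), while the Bell state's depends only on the difference, cos²(a−b)/2:

```python
import math

def both_pass(state, a, b):
    # state = 4 real amplitudes in the (HH, HV, VH, VV) basis.
    # Probability both photons pass linear polarizers at angles a, b
    # is the squared overlap with the product direction |a>|b>.
    direction = [math.cos(a) * math.cos(b), math.cos(a) * math.sin(b),
                 math.sin(a) * math.cos(b), math.sin(a) * math.sin(b)]
    return sum(c * s for c, s in zip(direction, state)) ** 2

product_HH = [1.0, 0.0, 0.0, 0.0]                      # |HH>
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]  # (|HH> + |VV>)/sqrt(2)

a, b = math.radians(30.0), math.radians(52.5)
print(both_pass(product_HH, a, b))  # cos^2(a) * cos^2(b): depends on a and b separately
print(both_pass(bell, a, b))        # cos^2(a - b) / 2: depends only on a - b
```

Rotating both analyzers together leaves the Bell-state result unchanged but shifts the product-state result, which is the signature of the superposition being discussed.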

Again, I keep calling you out on this subject and you are operating in denial. The fact is that entangled particles have attributes that do not follow a local realistic explanation. You are simply trying to claim your ideas are equivalent to QM and they are not. If you are going to make an assumption with physical implications, then you lay yourself open to seeing that disproved. Which it has been, over and over.

2. Talk to Bell about this. Or God. I did not create our universe, so it is not my requirement. Next you will be complaining about the 4 color map theorem as being "inane".
 
  • #622
Here is an interesting paper by Michael Seevinck, which rigorously derives a version of Bell's inequalities for correlations.
http://philpapers.org/rec/SEETQW
Found. Phys. 36, 1573-1586 (2006)

He makes a more heuristic case here:
Seevinck said:
It is possible that one thinks that the requirement of local realism is too strong a requirement for ontological robustness. However, that one cannot think of entanglement as a property which has some ontological robustness can already be seen using the following weaker requirement: anything which is ontologically robust can, without interaction, not be mixed away, nor swapped to another object, nor flowed irretrievably away into some environment. Precisely these features are possible in the case of entanglement and thus even the weaker requirement for ontological robustness does not hold.
This same case against ontological robustness made here naturally also applies to the properties in Bell's inequalities. If ontologically robust variables exist, independent of any observation of them, this tells us they can't innately contain the properties, or observables, that we associate with the realism of classical properties. These properties must be generated dynamically.

At a foundational level, any such ontologically robust variables, independent of the dynamically generated properties, must by definition be independent variables. As Schneider put it so well in "Determinism Refuted", an independent variable cannot even in principle be observed. However, if they play a role in dynamically generating observables, they may still have deterministic underpinnings. Thus Schneider has not refuted determinism, nor ontologically robust variables, in principle, but merely described exactly why they can't be directly observed in experiments, whether they exist or not. Schneider's argument only holds if absolutely nothing we can't see exists. The standard QM interpretation is predicated on this notion.

The positivist can yell poppycock, but existential postulates are fundamentally no different from any mathematical postulate, so long as they're used for more than just sweeping the ontological and/or empirical difficulties of QM under the rug. So long as QM and GR remain disconnected, even failing the above criterion, it remains a legitimate, open, and worthy question. There is sound reason to consider observables synonymous with what is 'real'; it is the sole source of cogency of any theory. Yet failing to make a distinction, in principle, between what is observed and what is ontologically real has been referred to as sleepwalking by some authors.

The point here is that ontological realism, i.e., ontologically robust variables, does not explicitly depend on any given measurable having any direct relation to those variables. Schneider's argument should make it clear that, even if realism is factual in principle, the notion that these ontologically robust variables are in themselves measurables is untenable. For a realist to assume a thing is observable without interaction amounts to ESP, at which point a self-referential interaction is observed, not the thing. From this perspective, the very notion of classical realism, used by Bell, Einstein, etc., is fatally flawed at the foundational level. Still, realism may yet persist, or not.
 
  • #623
my_wan said:
At a foundational level, any such ontologically robust variables, independent of the dynamically generated properties, must by definition be independent variables. As Schneider put it so well in "Determinism Refuted", an independent variable cannot even in principle be observed. However, if they play a role in dynamically generating observables, they may still have deterministic underpinnings. Thus Schneider has not refuted determinism, nor ontologically robust variables, in principle, but merely described exactly why they can't be directly observed in experiments, whether they exist or not. Schneider's argument only holds if absolutely nothing we can't see exists. The standard QM interpretation is predicated on this notion.

That Schneider guy makes some good points, thanks for pointing this out.

:smile:
 
  • #624
DrChinese said:
That Schneider guy makes some good points, thanks for pointing this out.

:smile:

:wink:
 
  • #625
Here is a more rigorous treatment of the idea that, if QM holds locally, then it indicates a violation of Bell's inequalities with no-signaling:
http://arxiv.org/abs/0910.3952
Phys. Rev. Lett. 104, 140401 (2010)

This paper also uses an argument I previously attempted here wrt classical variables:
http://arxiv.org/abs/0804.0884

It still seems to me, based on my modeling, that in order to define EPR in terms of variables, each offset in detector settings has to be defined by a separate (probably relativistically related) probability space, as defined by Hess et al. Unless of course I'm allowed to define one of the detector settings as 0, and simply rotate the whole coordinate system to change its settings. Otherwise the number of variables required grows excessively large for arbitrary settings, perhaps even diverges. Quantum computers appear to require an arbitrary number of variables also.

QM, in a sense, consists of discontinuous solutions to differential equations. Along with the Born rule and HUP, this sums up the main conceptual difficulties with QM. I suspect Bell violations may be related more to a physical manifestation of the Born rule than of the HUP. As if nature's measurables really are a projection from an entirely different coordinate symmetry than we assume.
 
  • #626
my_wan said:
This paper also uses an argument I previously attempted here wrt classical variables:
http://arxiv.org/abs/0804.0884

That reference deserved to be labeled with the author's name. Hess is a persistent local realist who has attacked Bell and Bell tests from numerous angles. His work is widely rejected.

In this piece, he basically argues for the QM position by asserting that there are no classical probability spaces. He discusses the idea of incompatible measurements (i.e. >2) which is in fact the QM position. I guess if you move the bar far enough, everyone can claim victory.

The question I always ask myself for these arguments is really quite simple: what would history's greatest local realist - Einstein - think of the argument? Of course, we can only speculate but speculate I will. Einstein would have appreciated the Bell argument and would NEVER try to con his way out of it with an argument like Hess has made. Please, feel free to disagree...
 
  • #627
Ok, so Hess has his critics, but on what grounds are the counterarguments predicated? In fact this is why I chose this reference, rather than the original version: because it was a response to criticisms, and thus contained references to those criticisms.

Criticizing it on the grounds that it fails to explain EPR correlations, or provide a mechanism for doing so, is a non-starter. Consider the following quote from the Hess paper, responding to the implied content of his critics:

Hess (http://arxiv.org/abs/0804.0884) said:
It also should be noted that the author subscribes fully to the teachings of both quantum and Kolmogorov probability (as different and well proven probability frameworks) and to their well known relationship to actual experiments (see e.g. [16]). The author has neither any criticism for these frameworks nor for the definition of the “elements of physical reality” of the EPR paper [17] nor for the EPR-type experiments performed by Aspect [18] and others. The author criticizes exclusively the work of Bell as not being general enough to apply to general physics problems (quantum and/or classical) and the work of Bell’s followers for the same reason and for actual logical and mathematical mistakes.

So when you say widely rejected, precisely what was widely rejected? No specific claim was made that the given mechanism would even provide a realistic explanation of the inequality violations. Only that Bell's argument, as posed, lacked the generality often assumed in its interpretation, whether in a classical or quantum context. So his critique proceeds on the grounds that the assumed variables have a presupposed relationship to the measurables, and destroys the argument on those grounds. Well duh... So the implied meaning of "His work is widely rejected" is of little import to the questions that remain open and unanswered. Facts are not a democracy, and the claims here presuppose a generality lacking in the argument. Thus no complete proof or disproof exists atm.

I get a queasy feeling anytime I start trying to second-guess how someone else would view something. I suspect Einstein had his own perspective, one that didn't lack a full appreciation of the empirical validity of QM, nor of how at a loss he was to explain what quanta were.
I consider it quite possible that physics cannot be based on the field concept, i.e., on continuous structures. In that case, nothing remains of my entire castle in the air, gravitation theory included, [and of] the rest of modern physics. (Albert Einstein, 1954)
Here he placed the importance of describing what actually is above his own life's work. So presupposing where Einstein would go with any given piece of empirical evidence is more than a little presumptuous.
 
  • #628
my_wan said:
1. Ok, so Hess has his critics, but on what grounds are the counterarguments predicated? In fact this is why I chose this reference, rather than the original version: because it was a response to criticisms, and thus contained references to those criticisms.

So when you say widely rejected, precisely what was widely rejected? No specific claim was made that the given mechanism would even provide a realistic explanation of the inequality violations. Only that Bell's argument, as posed, lacked the generality often assumed in its interpretation, whether in a classical or quantum context. So his critique proceeds on the grounds that the assumed variables have a presupposed relationship to the measurables, and destroys the argument on those grounds. Well duh... So the implied meaning of "His work is widely rejected" is of little import to the questions that remain open and unanswered. Facts are not a democracy, and the claims here presuppose a generality lacking in the argument. Thus no complete proof or disproof exists atm.

2. I get a queasy feeling anytime I start trying to second-guess how someone else would view something. I suspect Einstein had his own perspective, one that didn't lack a full appreciation of the empirical validity of QM, nor of how at a loss he was to explain what quanta were. Here he placed the importance of describing what actually is above his own life's work. So presupposing where Einstein would go with any given piece of empirical evidence is more than a little presumptuous.

1. It is normal, in this forum, to identify work which is not generally accepted (or worse, is generally rejected). Hess makes note of the fact that his position is rejected by Mermin. As to the substance of his argument: Hess is constantly trying new attacks on Bell. It is hard not to get the feeling that his position is based on emotion rather than science. When he comes up with something worth looking at in more detail, I will. In the meantime, I am waiting for a specific counterexample to discuss. He doesn't offer any.

2. Well, I presume to state that Einstein would have no part of Hess' ideas. He would have understood Bell immediately, and would never have tried to weasel out of it with anything less than something equally substantial. As you mention, Einstein would be willing to give up everything for one good argument. Fortunately, Bell only requires Einstein give up 1 thing.
 
  • #629
Again, what exactly has been rejected: the claim that there is a class of variables the Bell argument doesn't address, or the claim that no such class has been constructed to do so?

I mention this paper only because I did use a similar argument as one possibility among others. I am also dissatisfied with it, as I have noted. The 0 angle definition condition I was forced into to make it work is physically quite similar to what Hess et al proposed in making a new HV set for each possible angle. A new set for each angle gets out of the 0 angle condition I had, but creates a new problem. The variables must still define the offset, and one or the other detector, but not both, has to count off from that offset. Thus it introduces the same relative coordinate condition I was forced to impose with an arbitrarily defined 0 setting.

The thing about Mermin's counter is that he presupposes the counterfactual coincidences must have the same coincidence rates as a separate run in which they were empirically established. In fact Mermin states he uses his red/green light toy model to articulate issues with it.

Let's consider a pair of unfair coins. These coins are special, and have a micro-dial setting to determine how unfair they are. You set it so they have a 15% chance of landing on opposite sides. Now you take a 3rd coin, and want to set it so it has a 50% chance of landing on the same side as the 1st coin, and an 85% chance of landing on the opposite side from the 2nd coin. Does the fact that it can't be done invalidate the reality of the coin settings? Yet taken separately you can do just that.

Are we arbitrarily imposing a similar physical absurdity, and hiding it behind a presupposed 3-way correlation? Do the variables we suppose are carried by the photons physically preclude such 3-way correlations for perfectly valid physical reasons? In fact, in QM, the probabilities must be considered jointly, precluding probabilities greater than 1. It is only through an a priori imposition that such conditions, contrary to the rules of QM, are demanded. So we are also violating the rules of QM, as well as the physical constraints of the coin analogy, with such counterfactual a priori demands.

So if the rules of QM are not being violated, show how QM predicts a probability greater than 1 without presupposing it through counterfactual choices. Otherwise Bell's inequality sneaks a QM rules violation in the back door, via a counterfactual claim. The physical constraint, like the coins, would be in physically creating the 3rd correlated particle (variables) with the specified properties, not in what the detectors read after the fact, nor in any single pairing of properties and HVs.

That may be the strongest objection yet: like trying to define 3 coins that can all land on opposite sides, because counterfactually any 2 can.
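That three-coin picture can be checked by brute force: enumerate every joint outcome of three two-sided coins and count the pairs showing opposite faces. A short Python sketch (again just an illustration of the claim, not from the thread):

```python
from itertools import product

# For every joint outcome of three two-sided coins (0 = heads, 1 = tails),
# count how many of the three pairs land on opposite sides.
max_opposite = 0
for a, b, c in product([0, 1], repeat=3):
    opposite_pairs = (a != b) + (a != c) + (b != c)
    max_opposite = max(max_opposite, opposite_pairs)

print(max_opposite)  # -> 2: any two pairs can be opposite, but never all three
```

The parity constraint is what the counterfactual reasoning ignores: each pair separately can be made opposite, but no single joint outcome makes all three pairs opposite at once.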
 
  • #630
my_wan said:
Again, what exactly has been rejected, the claim that the Bell argument contains this class of variables the Bell argument doesn't address, or the claim that no such class has been constructed to do so?

I mention this paper only because I did use a similar argument as one possibility among others. I am also dissatisfied with it, as I have noted. The 0 angle definition condition I was forced into to make it work is physically quite similar to what Hess et al proposed in making a new HV set for each possible angle. A new set for each angle avoids the 0 angle condition I had, but creates a new problem. The variables must still define the offset, and one or the other detector, but not both, has to count off from that offset. Thus it introduces the same relative coordinate condition I was forced to impose with an arbitrarily defined 0 setting.

The thing about Mermin's counter is that he presupposes the counterfactual coincidences must have the same coincidence rates as a separate run in which they were empirically established. In fact, Mermin states that this is exactly what his red/green light toy model is meant to articulate.

Let's consider a pair of unfair coins. These coins are special, and have a micro-dial setting to determine how unfair they are. You set them so they have a 15% chance of landing on opposite sides. Now you take a 3rd coin, and want to set it so it has a 50% chance of landing on the same side as the 1st coin, and an 85% chance of landing on the opposite side from the 2nd coin. Does the fact that this can't be done invalidate the reality of the coin settings? Yet, taking the pairs separately, you can do exactly that.

Are we arbitrarily imposing a similar physical absurdity, and hiding it behind a presupposed 3-way correlation? Do the variables we suppose are carried by the photons physically preclude such 3-way correlations for perfectly valid physical reasons? In fact, in QM the probabilities must be considered jointly, precluding probabilities greater than 1. It is only through an a priori imposition that such conditions are demanded of QM, conditions which are contrary to the rules of QM. So with such counterfactual a priori demands we are violating the rules of QM, as well as the physical constraints in the coin analogy.

So if the rules of QM are not being violated, show how QM predicts a probability greater than 1 without presupposing it through counterfactual choices. Otherwise Bell's inequality sneaks a QM rules violation in the back door, via a counterfactual claim. The physical constraint, like the coins, would be in physically creating the 3rd correlated particle (variables) with the specified properties, not in what the detectors read after the fact, nor in any single pairing of properties and HVs.

That may be the strongest objection yet: like trying to define 3 coins that can all land on opposite sides, because counterfactually any 2 can.

I reject the idea that a realistic theory is possible. It is really that simple. My definition of reality being the same as the EPR definition: if it can be predicted in advance, there must be an element of reality. But there cannot be 3 simultaneously real. This is not a requirement of QM, and in no way is QM given a preferred status in Bell other than by way of comparison. You cannot get a realistic theory with ANY function where there is rotational invariance, as Mermin demonstrated. Hess has provided nothing for me to reject other than his conclusion. There is no realistic model. Again.

QM does not ask for counterfactuality, so your argument is backwards. It is realism that requires extra assumptions, not QM. So if you think these requirements are absurd, well, that would simply mean you reject realism. Sorry, but you cannot have your cake and eat it too.

So define realism however you like. Define it like Hess if that makes you happy (or whatever his latest absurd definition of the week happens to be). But I won't agree that day is night, that blue is red, or whatever. I will stick with Einstein's elements of reality.
 
