CERN team claims measurement of neutrino speed >c

In summary, before posting in this thread, readers are asked to read three things: the section on overly speculative posts in the thread "OPERA Confirms Superluminal Neutrinos?" on the Physics Forums website, the paper "Measurement of the neutrino velocity with the OPERA detector in the CNGS beam" published on arXiv, and the previous posts in this thread. The original post discusses the potential implications of a claim by Antonio Ereditato that neutrinos were measured to be moving faster than the speed of light. There is debate about the possible effects on theories such as Special Relativity and General Relativity, and about the problems of synchronizing the clocks and measuring the distance over which the neutrinos traveled.
  • #176
I believe they intend to get MINOS to use better metrology and redo the experiment whilst OPERA moves on with their original charter (oscillations).

As for supernova neutrinos, they've not yet been detected at high energies per experiments like Antares. Is it possible they only exist at those energies fleetingly until they escape the event?
 
  • #177
Buckleymanor said:
Come on, you would not have to cut a tunnel 730 km long if you just managed to do the one-way measurement of light using GPS and the same systems and direction as the OPERA experiment.
I mean, how long does the tunnel or evacuated tube have to be to get an experimental handle on comparing the results?
If you did, would it not be somewhat clearer when evaluating the experimental results for both neutrinos and light speed?

Perhaps I'm not understanding your question. If you tried to measure time of flight for photons over a shorter segment of the same path the neutrinos being detected at OPERA are following, how would you get neutrino results to compare it to? Are you proposing to move the OPERA detector? I don't think it's easily movable.

If you're just suggesting that we set up a shorter-length experiment to measure photon time of flight and neutrino time of flight over the same path, not necessarily from CERN to OPERA but someplace more easily manageable, that's different.
 
  • #178
Parlyne said:
I'd think that, if the problem came from any sort of temperature effect, the experiment would have to see a large seasonal variation; and they seem to have been very careful to demonstrate that they, in fact, do not.

Hi Parlyne,

You notice I didn't say anything about seasonal variations. I once worked in a place that was sometimes warmer in winter than in summer. The temperature had little correlation with the seasons, but it did vary quite a lot. Measuring is a whole lot safer than assuming, especially when the conclusion is quite startling. The thing to do here is to be grindingly thorough, because measuring a 10 µs proton pulse to an accuracy of a few ns is not trivial. The pulse detector SNR doesn't seem to support single-pulse measurements, so they used multi-pulse analysis. There are some pernicious things that might be buried in multi-pulse analysis which could confuse the result.

They seem to have done a good job; I like that paper from a quality viewpoint, but when the layers are peeled back there are always some assumptions. So for this paper every assumption should be brought into the light and examined somehow. That's exactly what they tried to do, but did they catch every effect? Are the assumptions good ones? I'm not worried about the time and position references; it looks like they did those right. I'm a lot more worried about the proton pulse assumptions that the multi-pulse statistical approach needs in order to be safe. I'm slightly suspicious about a 30 µs fiber optic. A slightly wrong assumption in either of those two can bend the result a lot.
 
  • #179
If you built the OPERA setup on opposite ends of the Earth and got roughly 17 times the OPERA error (diameter of the Earth divided by 730 km), would this confirm the superluminal thing?

Sure (presuming that you just mean opposite sides of the Earth). It would be a harder experiment to perform, but that is what would be expected if this result is correct.
By my admittedly crude estimate, it would take a proton beam density ~306 times greater than theirs to achieve a similar neutrino detection statistic (assuming linear beam dispersion, but that seems safe), or the detector would have to be 306 times larger in area. This is using their figure of a FWHM beam width (I'm assuming diameter) of 2.8 km.

Doable I'm sure, although a more ambitious and expensive project.
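
For what it's worth, that scaling is just geometry. A quick sketch of the arithmetic (the only inputs are the Earth's equatorial diameter and the 2.8 km FWHM quoted above; treat it as a back-of-the-envelope check, nothing more):

[code]
# Rough check of the antipodal-baseline scaling above.
# Assumes the beam width grows linearly with distance, so the flux per unit
# detector area falls as 1/distance^2.
baseline_opera = 730.0        # km, CERN to Gran Sasso
baseline_antipodal = 12756.0  # km, Earth's equatorial diameter (rough)
fwhm_at_opera = 2.8           # km, quoted FWHM beam width at 730 km

scale = baseline_antipodal / baseline_opera
print(f"beam FWHM at the antipode: ~{fwhm_at_opera * scale:.0f} km")
print(f"flux dilution factor: ~{scale ** 2:.0f}x")  # ~305, consistent with the ~306 above
[/code]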

PS I'm open to correction.
 
  • #180
kmarinas86 said:
Also, just because the extraction time is on the order of ten microseconds does not in any way forbid time resolutions on the order of nanoseconds. Due to the relatively steep rise and fall at the beginning and end of each pulse, the beginning and end times of the barrage of neutrinos can obviously be ascertained using atomic clocks of nanosecond resolution.

.. (60.7 ± 6.9 (stat.) ± 7.4 (sys.)) ns was measured.

Perhaps they really meant to say 'was calculated'.

The time of flight of CNGS neutrinos (TOFν) cannot be precisely measured at the single interaction level since any proton in the 10.5 μs extraction time may produce the neutrino detected by OPERA. However, by measuring the time distributions of protons for each extraction for which neutrino interactions are observed in the detector, and summing them together, after proper normalisation one obtains the probability density function (PDF) of the time of emission of the neutrinos within the duration of extraction.


So they are burrowing statistically into a 10.5 µs rectangle to resolve 60.7 ns. That's a statistics 'gain' of ~173, and you do need to be very careful of your PDF/correlation assumptions when chasing this much 'gain', especially indirectly.

I make it that a skew error of ~1.0053 in the real-to-assumed shape of that rectangle will deliver the same time result.
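
To put rough numbers on that (just the same arithmetic spelled out; I'm not claiming to reproduce the exact skew figure above):

[code]
# How much statistical 'gain' the fit is asked to deliver, and roughly what
# fractional distortion of the assumed pulse shape would mimic the quoted shift.
extraction = 10500.0  # ns, ~10.5 us proton extraction
shift = 60.7          # ns, quoted early-arrival result

print(f"statistics 'gain': ~{extraction / shift:.0f}")                  # ~173
print(f"equivalent fractional shape error: ~{shift / extraction:.2%}")  # ~0.6%
[/code]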
 
  • #181
The kicker signal is just used as a pre-trigger and as an arbitrary time origin. The measurement of the TOFν is based instead on the BCT waveforms, which are tagged with respect to the UTC.

The UTC time stamp is based on the kicker signal, and the TOFν does not appear to have the 50.2 ± 2.3 ns added to the OPERA waveform UTC timestamp at the end. It's a pity Fig. 2 doesn't show the actual point where the kicker signal is collected.

The arrival time distribution of the photons to the photocathode and the time walk due to the discriminator threshold in the analogue frontend chip as a function of the signal pulse height were accurately parameterized in laboratory measurements and included in the detector simulation.

Several checks were performed by comparing data and simulated events, as far as the earliest TT hit timing is concerned. Data and simulations agree within the Monte Carlo systematic uncertainty of 3 ns for both the time difference between the earliest and the following hits, and for the difference between the earliest hit and the average hit timing of muon tracks.

As the detector simulation has the same error, and the FPGA lag had been parameterised (included in the calcs) instead of being used as an end UTC time stamp adjustment, you would get a consistent error.
 
  • #182
DevilsAvocado said:
which says "Time accuracy; 20 nsec".

So all the timing is based on some proprietary GPS receiver with proprietary firmware & maybe even proprietary rounding errors, with the antenna on top of the mountain & the equipment somewhere below, and every piece of wire & equipment in the whole project adding to latency.
 
  • #183
CERN >c result - eliminating the errors

I was just thinking that it would be helpful to list the possible experimental errors and eliminate them as it is shown that they have already been accounted for. (See http://www.universetoday.com/89191/faster-than-the-speed-of-light-opera-update/ for some comments from the GSL people re their metrological accuracy.)

In broad terms there are at least the following possible types of errors:

wrong timing (at either end)
wrong distance (between CERN and the Gran Sasso Laboratory (GSL))
wrong neutrinos (ie they just happened to pick up stray neutrinos and misattributed them)
wrong calculation
wrong equipment

if there are other broad types of errors, point them out.

Wrong timing errors

To get an accurate timing, the clocks at CERN and GSL would need to be well synchronised and running at the same rate - what are the relative positions of the two facilities? An image of GSL seems to show that it is up in the mountains, and if I recall correctly, CERN is buried - do their clocks take this into account? Would an error in the difference in altitude be in the right order? Did the work carried out by CERN and GSL take this into account?

Could the measurement process have affected the measurement? (That is, are we looking at a sort of Heisenberg effect, where our observation of the neutrinos is somehow affecting timing in a way that we haven't figured out?)

I'm assuming that any systematic errors in measuring the emission times and arrival times would have been identified and eliminated quite early in the investigations.

Any other possible timing errors?

Wrong distance errors

As pointed out elsewhere, the 60 ns that is involved translates to roughly 18 m, so an error in measuring the distance between the transmitter (CERN) and the receiver (OPERA at GSL) could in principle account for it. The update article, however, did state that "the measurements of the initial source of the neutrino beam and OPERA has an uncertainty value of 20 cm over the 730 km. The neutrino flight time has an accuracy of less than 10 nanoseconds, and was confirmed through the use of highly regarded GPS equipment and an atomic clock. Every care was given to ensure precision". So 18 m is well outside of their uncertainty value, as is 60 ns.

The only questionable part of this is that they measured the distance between two points using GPS, rather than measuring the path that light would take between the points.

Personally, I think this is where the error is. The path that would be taken by light between "the initial source of the neutrino beam and OPERA", if light could take that path without being absorbed/deflected by the chunk of the Earth in between, could possibly be shorter than the distance calculated to lie between the two points. This would not be in contravention of relativity, but might require a slight reinterpretation. I'll go into more depth in a follow up post.

The use of GPS eliminates one of the obvious errors, in that the surface distance between the points is not the shortest distance between them. In any event, the chord subtending 732 km of arc on the Earth's surface is only about 400 m shorter than the arc, and such a miscalculation would result in an error on the order of microseconds, not nanoseconds.
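
For the record, here is the arc-versus-chord arithmetic (mean Earth radius assumed; only the order of magnitude matters):

[code]
import math

# Compare a 732 km surface arc with the straight chord through the Earth.
R = 6371.0   # km, mean Earth radius (assumed)
arc = 732.0  # km
theta = arc / R                        # central angle, radians
chord = 2.0 * R * math.sin(theta / 2)  # straight-line distance

c = 299792.458  # km/s
dt_us = (arc - chord) / c * 1e6        # light-time difference, microseconds
print(f"chord = {chord:.2f} km, difference = {arc - chord:.2f} km (~{dt_us:.1f} us)")
[/code]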

Any other distance errors?

Wrong neutrino errors

Included for completeness. If the experiment was done once, then it could be possible (but highly unlikely) that stray neutrinos could have been picked up 60+ns before the expected neutrinos. Even so, the expected neutrinos should have been picked up 60+ns later (unless the experimental equipment was arranged so that they were ignored).

However, the experiment was repeated. I strongly doubt that this is the error.

Wrong calculation errors

Included for completeness. Calculating a speed is so simple that the possibility of an undetected error in calculation is remote. Once the timings and distances are correct, the error would have to be a repeated misentering of data, and that is probably automated anyway.

If it's automated, then I'd assume that the figures would have been crunched by hand as well (I know I would).

Wrong equipment errors

Included for completeness. Basically they are measuring three things, the emission time, the arrival time and the distance between emitter and receiver. Wrong equipment will just affect their timing and distance.

However, I did think of a possible equipment error that would lead to timing issues. Transmission lag. Presumably, the CERN clock is not right on top of the emitter. Therefore information from the emitter would have to be sent to the clock, saying something like "neutrinos emitted now". At the other end, it would be the same, with information being sent to a clock saying "neutrino(s) detected now". Was this taken into account and eliminated?

Although I only included them for completeness, what other possible wrong neutrino, wrong calculation or wrong equipment errors are there?

neopolitan
 
  • #184


DaveC426913 said:
They traveled how far? Miles.

It means, light would have taken, say, 10,000ns to get to the detector, but these neutrinos arrived in less than 9,950ns. They didn't so much see them traveling at >c as they did see them arrive miles away before they were expected.

(I totally spitballed the numbers. Just trying to make the point.)

Ahhhhh, this makes perfect sense now! So it is possible for SR to still work, because the neutrinos may have used a higher dimension or another way to traverse the distance at speeds < c. But wouldn't this mean that from our reference point the neutrino's speed was still > c?
 
  • #185
Can anyone describe what technique would have been used to determine the chord length through the Earth between the two surface points? I suppose it's a commonly used technique, but I'm curious because the Earth is not a perfect sphere and each point is at a different altitude. Seems like it would be a tough thing to get within a few cm, but (afaik) nobody has so far detailed how that might be a possible source of error, except to question the GPS methodology. Edit: I see that neopolitan alluded to it in #433, but in the context of a possible GPS surface positioning error.

I just can't find any reference to how they did that.
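
My guess at the technique (a minimal sketch only: convert each site's geodetic coordinates to Earth-centred Cartesian (ECEF) coordinates on the WGS-84 ellipsoid and take the straight-line distance; the coordinates below are rough placeholders for CERN and LNGS, not the surveyed points):

[code]
import math

# WGS-84 ellipsoid constants
A = 6378137.0            # semi-major axis, m
F = 1.0 / 298.257223563  # flattening
E2 = F * (2.0 - F)       # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Geodetic latitude/longitude (deg) and ellipsoidal height (m) -> ECEF x, y, z (m)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)  # prime vertical radius of curvature
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

# Rough placeholder coordinates only, not the actual surveyed reference points.
cern = geodetic_to_ecef(46.24, 6.05, 430.0)
lngs = geodetic_to_ecef(42.45, 13.57, 960.0)

chord = math.dist(cern, lngs)  # straight line through the Earth, m
print(f"chord distance ~ {chord / 1000:.1f} km")  # comes out near the ~730 km baseline
[/code]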
 
  • #186


neopolitan said:
Wrong calculation errors
Included for completeness.
Also, they could have wrongly calculated the statistical accuracy. There were about 16000 events, of which only about 100-200 were at the leading and trailing edges of the neutrino time distribution. See fig. 12 from the paper http://hal.archives-ouvertes.fr/in2p3-00625946

Meanwhile, most of the events are in the flat region of the time distribution. They don't matter for the calculation of the time. (See fig. 11.)
If they use 16000 events to get 6 sigma, then the actual accuracy, calculated from the 100-200 edge events, is about 1 sigma or less.
 
  • #187
I guess that, like many people here, I find it tempting to "guess what went wrong" in this experiment, but some modesty is of course in order, as the people looking at this result aren't idiots. So it is fair to assume that the things mentioned in the paper are "well done", and it is hard for an outsider to do better than the people who have had their noses in the equipment for years.

I've read the paper and there are two things that weren't mentioned. One has already been brought up, and that is the GR effect of "plunging into a gravitational potential", which would change the interval as compared to the interval in a Euclidean space. However, as was pointed out, one might expect changes on the order of 10^-10, so it can hardly explain a 10^-5 effect.

The other thing I was wondering about, and I didn't see it in the paper, is:

in what reference frame is UTC defined? Is this reference frame a "rotating frame" (in which case I have a hard time seeing how a universal time can be defined), or is it a non-rotating frame?

Because the rotation speed of the Earth at the Earth's surface is of the order of 10^-5 of the light speed.

In other words, if you look at the neutrinos moving in a reference frame that doesn't turn with the Earth (because you've defined your UTC time in that frame), then you should consider that the Gran Sasso lab is moving wrt the neutrino beam. Of course, in the reference frame of CERN + Gran Sasso, the time has been corrected for the dilation, and the velocity of light is "the same" in this "moving frame", but the question is: is it the time of that reference frame (of the "moving CERN + Gran Sasso"), or is the reference time (UTC) that of a "non-moving" (non-rotating) reference frame to which the two clocks are tuned?

Probably it is silly, as I said before, to try to guess what could be wrong as an outsider when a whole team of professionals has been looking into this; it is just a matter of my own understanding of how the reference time frame was picked.
 
  • #188
Hello,

I just spent some time reading the recent OPERA-CNGS paper on apparently FTL neutrinos (http://arxiv.org/abs/1109.4897) .
In this reading, I had some difficulty seeing clearly how the arrival events are processed.
I am not even sure I properly understood basic information like:

- what the "chronometer" start event is
- how many 10 µs pulses were included in the analysis
- how many neutrinos were detected (is that the 16111 events mentioned?)
- how the 200 MHz source intensity oscillations are used/needed in the data processing
- if the rise time of the 10µs proton waveform plays any role in the analysis
- what is meant by "extraction"
- ...

I would like to understand more clearly how the data analysis proceeds, without the inessential technical details. I would like to select the useful information from this paper, as far as data processing is involved.

My current understanding is that when a 10 µs proton pulse is produced, most often no neutrino is detected at Gran Sasso. During this 10 µs proton pulse, the proton intensity oscillates about 2000 times between high and low intensity (5 ns period). Therefore, a neutrino occasionally detected at Gran Sasso is more likely to have been produced during one of the 2000 high-intensity phases than during any of the 2000 low-intensity phases.
However, I do not see why any one of the 2000 high-intensity periods would have a higher probability than the others, and therefore I also do not understand how the time of flight could be determined with a precision better than 10 µs, while the effect being discussed deals with a precision of about 10 ns!
I really must have misunderstood something.

How was it possible to measure the time of flight with a 10 ns precision, based on this 10 µs proton pulse?

Thanks for your help.

Michel

(before eventually re-starting a specific thread focusing on data analysis)
 
  • #189
neopolitan said:
If this is in reference to my comment, you might have conflated things. I honestly don't know what the quantitative effect of dipping into an area of lower gravitational potential is. Would it be enough to account for this observation, or too little or too much? I just don't know. What I think, though, is that there might be some effect to some degree.

It is interesting that it was not mentioned. It might be worth thinking about a little more deeply.

cheers,

neopolitan

It was Vanadium who mentioned this, earlier in the thread (too lazy to look up the post).

If anything, I'd bet more on the "simultaneous time coordinate" at Gran Sasso and at CERN. Simultaneity is frame-dependent, and in order for this to make sense as a measurement of "c" (assuming flat spacetime), you need to use the reference frame in which the source and the detector are stationary.
But if the simultaneous time coordinate is defined in a non-rotating frame (I simply don't know how the time calibration is done, in what frame they consider it to be simultaneous), then you cannot use moving sources and detectors wrt this frame in order to measure a velocity, because you then make the elementary SR application error of mixing time and space coordinates of different frames.

It is just that I don't know how the synchronisation between the two clocks (at CERN and at Gran Sasso) is done. I thought it was through GPS, but is GPS not using a "fixed reference" frame, independent of the rotation of the Earth, for its time coordinate? I'm asking, I don't know. It is just that I don't see directly how one could define a simultaneous time coordinate in a rotating frame, as it is not inertial.

The other thing that makes me bet on that, is that the correction must be of the order of 10^-6 as the beta (velocity due to rotation at Earth surface wrt inertial frame) is about 10^-6 c at the equator, which isn't too far from the effect that is observed.

0.5 km/second at the equator and light is 300 000 km/second, so 10^-6.
 
  • #190
vanesch said:
I've read the paper and there are two things that weren't mentioned. One has already been brought up, and that is the GR effect of "plunging into a gravitational potential", which would change the interval as compared to the interval in a Euclidean space. However, as was pointed out, one might expect changes on the order of 10^-10, so it can hardly explain a 10^-5 effect.

The other thing I was wondering about, and I didn't see it in the paper, is:

in what reference frame is UTC defined? Is this reference frame a "rotating frame" (in which case I have a hard time seeing how a universal time can be defined), or is it a non-rotating frame?

Because the rotation speed of the Earth at the Earth's surface is of the order of 10^-5 of the light speed.

The rotation speed of the Earth's surface at the equator, as well as at 45 degrees latitude (between France and Italy), is on the order of 10^-6, not 10^-5, times the speed of light.

So the rotation of the Earth cannot be the reason for the anomaly.
 
  • #191
kmarinas86 said:
The rotation speed of the Earth's surface at the equator, as well as at 45 degrees latitude (between France and Italy), is on the order of 10^-6, not 10^-5, times the speed of light.

So the rotation of the Earth cannot be the reason for the anomaly.

You're right that it is 10^-6 and not 10^-5 (as I mentioned in another post). It might even be taken into account (seems elementary), but I haven't seen it mentioned in the article.
 
  • #192
DaleSpam said:
To my understanding, both the time and distance were measured by GPS, which measures in an earth-centered inertial frame.
GPS time is measured by atomic clocks on the surface of the rotating Earth. It is a fixed 19 second offset from International Atomic Time (TAI), which is also a mean sea level, rotating Earth based time frame. One TAI second = one GPS second = one UTC second. Are you thinking of Geocentric Coordinate Time (TCG)? TCG ticks slightly faster than GPS/TAI/UTC.
 
  • #193


cordially said:
Meanwhile, most of the events are in the flat region of the time distribution. They don't matter for the calculation of the time. (See fig. 11.)
If they use 16000 events to get 6 sigma, then the actual accuracy, calculated from the 100-200 edge events, is about 1 sigma or less.

This was brought up at the Q&A after the presentation. To quote one of the scientists: "I can fit anything in [the flat region]". The OPERA guys tried to claim that the flat region wasn't all that flat, and that the peaks and valleys there did matter.

I'm certainly no expert on this, but it seems to me they should follow up by using more, but shorter, pulses instead of two long ones. As you say, the most influential events are at the start and end of each pulse.

Also it would perhaps be interesting to analyze the flat region separately. For example add some small offset of the event timings and see how good the fit is still. If the "flat" region is indeed not all that flat, then the fit should quickly become poor.
 
  • #194
DaleSpam said:
To my understanding, both the time and distance were measured by GPS, which measures in an earth-centered inertial frame.

Mmm, but how come, then, that they find agreement with a land-based survey, which, as they say in the paper, measures in the rotating frame?

Now, I know this is somewhat ridiculous, because the people of the experiment also know all this. It is just that I'm trying to wrap my mind around exactly what has been measured.
 
  • #195
D H said:
GPS time is measured by atomic clocks on the surface of the rotating Earth.

I'm having difficulty imagining how you can have a simultaneous time coordinate in a rotating frame. After all, as compared to an inertial frame, clocks at the poles don't suffer any time dilation, as their velocity wrt the inertial frame is 0, while clocks at the equator, which have a significant velocity, do suffer a dilation. So I don't see how you can "keep them synchronous".
 
  • #196


Lord Crc said:
The OPERA guys tried to claim that the flat region wasn't all that flat, and that the peaks and valleys there did matter.
I can't understand why they did the likelihood analysis based on the averaged proton distribution waveform, rather than using the individual waveform for each event.

If you compare the example waveform for a single event (fig. 4 in the paper) with the averaged waveform (fig. 9), they differ significantly, so the waveforms must also differ significantly from event to event.

A relatively small number of events occurring when the first peak (of the 5-peak structure) is strongest (opposite to fig. 4) may cause the likelihood fit (if computed with the averaged PDF) to be shifted towards low values. As the sawtooth is strongly left-asymmetric, too-low probabilities used in the likelihood analysis affect the left (rising) edge more than the right one, causing a systematic error towards low values of [itex]\delta t[/itex].
 
  • #197
vanesch said:
I'm having difficulty imagining how you can have a simultaneous time coordinate in a rotating frame. After all, as compared to an inertial frame, clocks at the poles don't suffer any time dilation, as their velocity wrt the inertial frame is 0, while clocks at the equator, which have a significant velocity, do suffer a dilation. So I don't see how you can "keep them synchronous".
Clocks at the poles do suffer time dilation. The poles are 21.36 km closer to the center of the Earth than is the equator; they are deeper in the Earth's gravity well. All ideal clocks at sea level tick at the same rate. Sea level is an equipotential surface of gravitational plus centrifugal forces.
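
A crude way to see that the two rate offsets are of the same order (point-mass potential only; the remainder is made up by the Earth's oblateness term, which is exactly why the clean statement is about the geoid):

[code]
# Order-of-magnitude check: velocity (SR) vs gravitational-potential (GR)
# clock-rate offsets between equator and pole, treating the Earth as a point mass.
GM = 3.986004418e14  # m^3/s^2, Earth's gravitational parameter
C = 299792458.0      # m/s
R_EQ = 6378.137e3    # m, equatorial radius
R_POL = 6356.752e3   # m, polar radius
V_EQ = 465.1         # m/s, rotation speed at the equator

sr_term = V_EQ ** 2 / (2 * C ** 2)              # ~1.2e-12, slows equatorial clocks
gr_term = GM * (1 / R_POL - 1 / R_EQ) / C ** 2  # ~2.3e-12, slows polar clocks (deeper in the well)
print(f"velocity term:  {sr_term:.1e}")
print(f"potential term: {gr_term:.1e}")
# Same order of magnitude; once the oblateness (J2) part of the potential is
# included, the offsets cancel on the geoid, which is the point made above.
[/code]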
 
  • #198
DarkDrag0nite said:
I'm very confused here. Are we talking about "neutrino speed > c", as in the topic title, or "neutrino speed > light"?

All the news reports I've seen just said that it is faster than light.

>c, and not "light passing through the earth" because no light passing through the Earth was used to make a comparison.
 
  • #199
First, a lot of the issues people (especially first time posters) are bringing up are addressed in the paper. Read it. There are very few people who can tell what another group did wrong without knowing what they did.

Second, the difference between a rotating Earth frame and a stationary frame is essentially irrelevant. If you draw the space-time diagram for the setup, including the GPS satellites (one is enough if you assume it's already synchronized), you will discover that what they are measuring is very close to the interval between emission and detection, which is a Lorentz invariant. There are two corrections that need to be applied - one is the fact that LNGS is moving 50 mph faster than CERN because of the Earth's rotation: that's a 10^-15 effect. The other is that the Earth has moved between the emission and detection times by a few feet. That should be properly taken into account by the GPS receiver (and I have questioned this), and if it is, it's a 10^-6 effect on the 10^-5 effect, or 10^-11.

As I have said before, the application of using GPS to synchronize two distant stations to a nanosecond is not common, and as such I am less confident that the firmware in the unit is bug-free than had the application been more widely used.

Third, the statistical techniques for determining whether Model A or Model B fit the data better (say a 0ns offset and a -60 ns offset) are almost a century old, and well-described in the paper, and shown clearly in Figure 8. The idea that some people here can do a better job with the statistics in their heads is ridiculous.

In any event, Figure 12 makes it clear - this is not a simple statistical fluke: if you moved the data 1.2 bins to the left or right, you would see the difference.
 
  • #200
D H said:
Surely, it is. Maybe I didn't communicate it right. vanesch remarked that "clocks at the poles don't suffer any time dilation, as their velocity wrt the inertial frame is 0, while clocks at the equator, which have a significant velocity, do suffer a dilation." vanesch forgot about general relativistic effects. The net effect is that clocks at sea level (better: clocks on the geoid) tick at the same rate.

Right. I stand corrected. I didn't realize that the GR effect was important here, as Vanadium stated that gravitational effects account for something like 10^-10 and I took that for granted.

However, SR effects account for about 10^-6 (relative velocities), so if what you say is correct, this means that GR effects are also of the order of 10^-6 for a depth of 20 something km. Now, the chord of a 700 km arc dips about 10 km deep into the earth, so one would expect then a similar GR correction to the interval.
 
  • #201
Vanadium 50 said:
As I have said before, the application of using GPS to synchronize two distant stations to a nanosecond is not common, and as such I am less confident that the firmware in the unit is bug-free than had the application been more widely used.

I'd say it is very common. GPS is one of two systems used for time transfer within UTC itself, meaning this is done routinely. Granted, it is the less accurate system, but the reason they used it here is presumably because it is good enough. Note that their clocks were calibrated by PTB and METAS and checked by movable time transfer. Hence, I think we can be pretty sure that someone would have told them if they were doing something wrong.
 
  • #202
vanesch said:
Mmm, but how come, then, that they find agreement with a land-based survey, which, as they say in the paper, measures in the rotating frame?
The difference in distance between the land-based and GPS surveys is several orders of magnitude too small.
 
  • #203
neopolitan said:
I'm just not sure that there is a general acceptance that a path which passes through a region of (albeit slightly) lower gravitational potential might be shorter than the calculated distance between two points on the surface of geoid.

Whatever effect there might or might not be from this, it is way too small to matter. See post #177; at most a change in path length due to the change in gravitational potential would be about a 10^-10 effect.
 
  • #204
vanesch said:
Right. I stand corrected. I didn't realize that the GR effect was important here, as Vanadium stated that gravitational effects account for something like 10^-10 and I took that for granted.

However, SR effects account for about 10^-6 (relative velocities), so if what you say is correct, this means that GR effects are also of the order of 10^-6 for a depth of 20 something km. Now, the chord of a 700 km arc dips about 10 km deep into the earth, so one would expect then a similar GR correction to the interval.

How do you get the SR effect you claim? For gamma to differ from 1 by 1 part in 10^7, I get a required relative speed of 83 miles per second. For a relative speed of 1000 mph, gamma differs from 1 by 1 part in 10^12 or so.
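
Those two figures are easy to check; for small beta, gamma - 1 ≈ beta²/2:

[code]
import math

C = 299792458.0  # m/s
MILE = 1609.344  # m

def gamma_minus_one(v):
    beta = v / C
    return 1.0 / math.sqrt(1.0 - beta * beta) - 1.0

# Speed needed for gamma - 1 = 1e-7: v ~ sqrt(2e-7) * c ~ 134 km/s ~ 83 mi/s
v_needed = math.sqrt(2e-7) * C
print(f"gamma - 1 = 1e-7 needs v ~ {v_needed / 1000:.0f} km/s ~ {v_needed / MILE:.0f} mi/s")

# 1000 mph gives gamma - 1 of order 1e-12
v_1000mph = 1000.0 * MILE / 3600.0
print(f"1000 mph: gamma - 1 ~ {gamma_minus_one(v_1000mph):.1e}")
[/code]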
 
  • #205
Vanadium 50 said:
Second, the difference between a rotating Earth frame and a stationary frame is essentially irrelevant. If you draw the space-time diagram for the setup, including the GPS satellites (one is enough if you assume it's already synchronized), you will discover that what they are measuring is very close to the interval between emission and detection, which is a Lorentz invariant.

Well, I don't need to know how the GPS system actually works; what counts is its result. If it gives you the synchronised reference time in a stationary frame, then you assume that they have built in all the necessary corrections to do so.

What I wanted to say was this: if you "synchronize" in a stationary reference frame Oxyzt, which means that at the events "Emission" and "Reception" you measure "t" (the t of the reference frame Oxyzt), but you measure the distance between "Emission" and "Reception" in a frame Ox'y'z't' using worldlines of stationary points (that is, with 0 velocity in frame Ox'y'z't'), so that it is easy to measure that distance in that frame, then you cannot combine this distance measured in Ox'y'z't' with a time measured in Oxyzt.

My question was what kind of time coordinate (in what kind of frame) is used in the GPS system (no matter how they actually do it, assuming they do it right), and I thought that it was only possible in an inertial frame. However, I stand corrected: this can also be a time on the rotating geoid, which carries its own "universal time", as I had forgotten about the GR correction.

But it DOES matter what reference frame one uses to define "synchronised time", because mixing a time coordinate from one frame and a distance from another is at the origin of all "paradoxes" in introductory SR, such as the pole-barn paradox and the like.

There are two corrections that need to be applied - one is the fact that LNGS is moving 50 mph faster than CERN because of the Earth's rotation: that's a 10^-15 effect.

Which should then, according to DH, be cancelled by the geoid effect.

The other is that the Earth has moved between the emission and detection times by a few feet. That should be properly taken into account by the GPS receiver (and I have questioned this), and if it is, it's a 10^-6 effect on the 10^-5 effect, or 10^-11.

That's if you're working in an inertial frame! If you work in the rotating frame, that is not the case. This is why defining the correct reference frame is so important, and rather tricky in this case.

The point is not that I think I'm smarter than those guys; it is just that none of this was mentioned in the paper.
 
  • #206
PAllen said:
How do you get the SR effect you claim? For gamma to differ from 1 by 1 part in 10^7, I get a required relative speed of 83 miles per second. For a relative speed of 1000 mph, gamma differs from 1 by 1 part in 10^12 or so.

I was talking about beta and was thinking of a series expansion in beta, but now that you mention it, for most relativistic corrections the first non-zero term is beta-squared. So this indeed pushes the effects into the 10^-12 range or so.

I guess this closes the discussion about a relativistic effect due to Earth's gravity or rotation...
 
  • #207
lalbatros said:
Hello,

How was it possible to measure the time of flight with a 10 ns precision, based on this 10 µs proton pulse?

Thanks for your help.

Michel

(before eventually re-starting a specific thread focusing on data analysis)

One pulse does not have a good enough signal-to-noise ratio to get a time-of-flight precision of a few ns. The proton pulse actually doesn't carry much current in terms of everyday lab measurements, although it's a huge current in terms of TeV particles. So they made a model by using many emitter pulses, and compared many received pulses to it. Several people pointed out that there can be hidden assumptions when that is done. For example, one hidden assumption might be that the emitter pulse has an invariant shape, except for band-limited Gaussian noise. If that's wrong, then the mathematical processing used to put together the 'average' of many pulses goes a little wrong and might introduce a bias which could go unaccounted for.

All my tentative "might" and "could" words are there because they're smart guys and maybe they already did it just right, but a paper with that level of total detail in it would be unreadable! There are deep exam-style questions here about experimental technique, just as there should be. It's a lot of work for them to answer even a few of the most carefully considered issues from the critics. This will take time. There is no way around it, and they understand that.
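
To make that hidden-assumption point concrete, here is a deliberately crude toy (a made-up trapezoidal pulse with made-up edge times and event count; it has nothing to do with OPERA's real waveforms or likelihood code). Events are drawn from one pulse shape and fitted with a slightly different one, and the fitted time offset lands of order 100 ns away from zero even though the true offset is zero:

[code]
import numpy as np

rng = np.random.default_rng(0)

PULSE = 10500.0       # ns, flat-top length standing in for the ~10.5 us extraction
RISE_TRUE = 1000.0    # ns, "real" rise/fall time of the edges (made up)
RISE_ASSUMED = 900.0  # ns, slightly wrong edge shape used in the fit (made up)
N_EVENTS = 16000      # order of the OPERA event count

def trapezoid_pdf(t, rise, flat=PULSE):
    """Normalized trapezoidal emission-time density: linear rise, flat top, linear fall."""
    norm = flat + rise  # area of a unit-height trapezoid
    y = np.zeros_like(t)
    up = (t >= 0) & (t < rise)
    top = (t >= rise) & (t < rise + flat)
    down = (t >= rise + flat) & (t < 2 * rise + flat)
    y[up] = t[up] / rise
    y[top] = 1.0
    y[down] = (2 * rise + flat - t[down]) / rise
    return y / norm

def sample_trapezoid(n, rise, flat=PULSE):
    """Draw emission times from the trapezoid by rejection sampling."""
    out = np.empty(0)
    while out.size < n:
        t = rng.uniform(0.0, 2 * rise + flat, size=4 * n)
        keep = rng.uniform(0.0, 1.0, size=t.size) < trapezoid_pdf(t, rise, flat) * (flat + rise)
        out = np.concatenate([out, t[keep]])
    return out[:n]

# "Data": arrival times drawn from the TRUE shape, with a true time offset of zero.
events = sample_trapezoid(N_EVENTS, RISE_TRUE)

def neg_loglike(delta, rise):
    p = trapezoid_pdf(events - delta, rise)
    return -np.sum(np.log(np.maximum(p, 1e-12)))  # floor avoids log(0) beyond the edges

deltas = np.arange(-300.0, 300.0, 1.0)  # scan candidate offsets in 1 ns steps
fit_right = deltas[np.argmin([neg_loglike(d, RISE_TRUE) for d in deltas])]
fit_wrong = deltas[np.argmin([neg_loglike(d, RISE_ASSUMED) for d in deltas])]

print(f"fitted offset, correct edge shape:    {fit_right:+.0f} ns")  # close to 0 ns
print(f"fitted offset, mismatched edge shape: {fit_wrong:+.0f} ns")  # shifted, of order 100 ns
[/code]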
 
  • #208
f95toli said:
I'd say it is very common. GPS is one of two systems used for time transfer within UTC itself, meaning this is done routinely. Granted, it is the less accurate system, but the reason they used it here is presumably because it is good enough.

That's the point - who is using something more complicated than something you buy at Fry's for this particular application? The bigger the market for this, the less likely something is odd in the firmware.
 
  • #209
PAllen said:
They said they used a 3-D coordinate system, which implies they considered this.

Sorry, didn't know that. But another problem arises with the use of GPS. The satellites which are making these measurements may slip a bit in their orbits - they are not in absolutely perfect geostationary orbits. Even a deviation of [itex]\pm[/itex]1 meter could have an enormous effect on the accuracy of the neutrino reading.
 
  • #210
GPS satellites are not in geosynchronous orbits.

Whatever mistake was made, if a mistake was made, was quite subtle. That group has been building up this data for a few years. They looked for obvious explanations, not so obvious explanations, asked outside groups for help, and still couldn't find anything that explained their results.

I'm guessing that they did do something wrong. I'm also guessing that we at PhysicsForums will not be the ones to ferret that mistake out.
 
