Sorry, me again. In another thread (#47) I gave an experimental setup to detect an anisotropy in light speed of the form ##c(\hat{n}) = c_0 + \alpha (\hat{n}\cdot\hat{v})##. It was pointed out, quite correctly, that the error incurred in the long arm to the sensor would cancel the anisotropy, making the time delay between pulses appear as ##L/c_+ = L/c_- = L/c##. It was also pointed out that this is because there exists a coordinate transform which ensures the isotropic result will always be measured. Cool.
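If it helps to make the cancellation explicit, here is a quick first-order check (my own sketch, which may not be exactly the argument from that thread). Writing ##\mu = \hat{n}\cdot\hat{v}##, the round trip along an arm samples ##+\mu## one way and ##-\mu## the other, so the linear term drops out at first order in ##\alpha##:
$$t_{\text{round}} = \frac{L}{c_0 + \alpha\mu} + \frac{L}{c_0 - \alpha\mu} = \frac{2Lc_0}{c_0^2 - \alpha^2\mu^2} = \frac{2L}{c_0} + O(\alpha^2).$$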
It's fair to ask whether this remains true if I consider another form for the anisotropy. If one instead takes ##c(\hat{n}) = c_0 + \alpha (\hat{n}\cdot\hat{v})^3##, then the error incurred in the long arm vanishes as the inverse cube of the distance to the detector when the long arm lies in the plane ##\hat{n}\cdot\hat{v} = 0##, while the anisotropy along the short arm would remain unaffected. I surmise that a coordinate change introducing such an anisotropy cannot exist; otherwise the experiment wouldn't work, for the reasons given before.
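As a sanity check on that scaling, here is a minimal numerical sketch (my own illustration, not from the thread; all parameter values are arbitrary placeholders). If the long arm is nearly in the plane ##\hat{n}\cdot\hat{v} = 0##, the residual misalignment ##\mu = \hat{n}\cdot\hat{v}## shrinks roughly like ##d/L## for a fixed transverse offset ##d##, so the speed correction falls like ##1/L## in the linear model but ##1/L^3## in the cubic one:

```python
# Minimal numerical sketch (parameter values are arbitrary placeholders).
# With the long arm nearly in the plane mu = n.v = 0, take mu ~ d/L for a
# fixed transverse offset d. The speed correction is alpha*mu in the linear
# model and alpha*mu**3 in the cubic one, so it should fall off like
# 1/L versus 1/L**3 as the arm length L grows.

alpha = 1e-3   # assumed anisotropy amplitude, alpha << c0
d = 1.0        # assumed fixed transverse offset of the long arm

for L in (10.0, 100.0, 1000.0):
    mu = d / L                       # residual misalignment, -> 0 as L grows
    print(f"L={L:7.1f}  mu={mu:.1e}  "
          f"linear: {alpha * mu:.3e}  cubic: {alpha * mu**3:.3e}")
```

Running it, the linear correction drops by one decade per decade of ##L## while the cubic correction drops by three, which is the inverse-cube behavior described above.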
Is this correct or am I still confused?