Lorentz violation severely restricted: M_QG/M_Planck > 1200

  • Thread starter MTd2
In summary, this conversation discusses the recent detection of high-energy emission from a short gamma-ray burst, GRB 090510, using the Fermi Gamma-ray Space Telescope. This emission shows a significant deviation from the Band function, suggesting two distinct spectral components and challenging the prevailing gamma-ray emission mechanism. The detection of a 31 GeV photon during the first second of the burst also sets the highest lower limit on a GRB outflow Lorentz factor, indicating that the outflows powering short GRBs are highly relativistic. This photon also sets limits on a possible linear energy dependence of the photon propagation speed, requiring a quantum-gravity mass scale significantly above the Planck mass. However, this result does not by itself disfavor loop quantum gravity or other approaches to quantum gravity.
  • #106
Micha said:
... Lubos Motl concludes, that LQG is dead. Where is your voice, Marcus?

Hi Micha. I have never found L especially reliable on the facts or successful at anticipating the future course of research. I really have no business getting involved in this thread. It is an exciting lively discussion and everybody is doing a great job. I think I will try to keep "my voice" out of it (unless it gets too tempting to resist) but thanks so much for asking! Maybe I will start a quieter thread trying to anticipate how QG research is likely to go, emphasis-wise, over the next 6 months or so. Then if I make predictions, and they turn out wrong, anybody who bothers to read can laugh at me.
 
  • #107
It is interesting that a paper (cited in the OP) suggesting that there is no frequency-dependent photon dispersion (to some constraint) rests on the capture of one high-energy photon in one observation. That's not really good science, regardless of the number of names and sponsoring agencies on the paper. That signal could have been unrelated to the GRB in question.

Astronomy is an observational science, and if it is going to be used to test and constrain cosmology (a really good idea, IMO), people have to take a breath and wait for trends in repeatable observations to lead the way.
 
  • #108
atyy said:
I agree that the major question about asymptotic safety is its existence. But suppose the UV fixed point for gravity exists, and the critical surface is finite dimensional - will that be enough to make predictions, or will there still be a problem coming in from electroweak theory not having a continuum limit (ie. can asymptotically safe gravity and electroweak theory be combined at Planck scale energies)?
Dear atyy, this is a whole research project of yours. Great questions. ;-)

But there are no clear-cut answers known to these questions. It is not known whether the critical surface of the UV fixed point is finite-dimensional even without the electroweak stuff added. So of course, it's also unknown whether it would remain finite-dimensional with the non-gravitational forces included. And it is unclear how the Landau poles would be treated. At any rate, it's clear that one cannot "neglect" the non-gravitational forces at the Planck scale because they're actually *guaranteed* to be stronger and more important than gravity, so one would be neglecting the bulk of the forces; see

http://arxiv.org/abs/hep-th/0601001

Also, it is unknown how to actually extract predictions from asymptotically safe theories although it should be possible: but no systematic procedure is known. If it were known, it would have to be possible to prove the finiteness of the UV surface, too. And so on. So asymptotic safety's only realistic goal at this moment is to defend an idea, not predict or explain things beyond this idea, which I find too little, too late.
 
  • #109
Micha said:
Hi Lubos,
as Christine and others pointed out, we are talking about one photon. I don't think it is appropriate to say that some theory has been falsified by measuring one photon. Never. Of course this does not change anything in all the arguments exchanged in this very interesting thread. You can always add the small if-clause "if the result is confirmed, then..." and we can put a probability on that. But by completely dropping it, I think you make your position attackable at a point where it is not necessary. Because whether this is confirmed or not, we can just wait and see. There is no point in putting energy into this if you are a theorist.
Dear Micha, what you write is just nonsense.

Strict and careful analysis implies that it doesn't matter whether a theory is falsified by one photon or 2009 photons. What matters is the confidence level. There is no rule in science that one needs at least two particles to falsify a hypothesis, and there can't be one. Such a rule would be completely arbitrary. And the confidence level that the photon couldn't have been delayed/sped up by minutes is well above 99%: just look at how these things are argued in the paper. The last line of page 16 explains, for example, what conclusion holds at the 99% confidence level:

http://arxiv.org/ftp/arxiv/papers/0908/0908.1832.pdf

It's the inequality with 1.22 times the Planck mass. If one looks at somewhat lower confidence levels, one gets to those 100 times the Planck mass. A 99% confidence level is higher than the confidence level declared by the IPCC that the 20th century warming was mostly man-made - yet the latter statement is often said to be "settled" (and even I tend to agree that this particular IPCC statement is true, except that I don't think that 0.4 deg C per century - a majority of the observed warming - is worrisome in any way).

I am not "dropping" any confidence levels. I am just saying that they are not simple functions of the number of photons and that even with the observations they made, they're so high that the question is de facto settled, especially if the LQG people were really predicting between minutes and weeks (!) of lags - this is surely not above one second and probably not even above 10 ms. (And the actual prediction of Lorentz-breaking theories is that the speed of light is completely arbitrary!) The question will never be "quite" settled, at 100%, because this is impossible in the real world. One can only be getting "closer" to 100%. However, if you want, I am ready to make a bet that future photons from similar events will just confirm the same thing: no lag. Ready to make a bet? USD 1,000?

Before you decide, let me say that there was also a 3.4 GeV photon about 0.2 seconds from the 31 GeV photon. With 99% confidence, more than 90% of the future photons will arrive within the same 2-second window during all future measurements. That's because the theory - relativity - predicting this statement has been supported by a 99% CL evidence. Forget all crazy comments (in the media, and not only media) about a 4-minute delay produced in MAGIC etc. The delay accumulated by a photon crossing the whole visible Universe can't exceed 2 seconds, at 99.9% confidence level. That's what this Fermi observation shows.
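Purely for orientation, here is a back-of-the-envelope sketch of the arithmetic behind that inequality. It assumes the standard linear-dispersion lag formula, a redshift of roughly 0.9 for GRB 090510, and round cosmological parameters; none of these inputs, nor the code, are taken from the paper or from this thread.

```python
# Back-of-the-envelope estimate of the photon arrival lag under a linear-in-energy
# modification of the photon speed, v(E) ~ c * (1 - E / (M_QG * c^2)).
# Assumed inputs (not from the thread): z ~ 0.9 for GRB 090510, flat LambdaCDM
# with H0 = 71 km/s/Mpc, Omega_m = 0.27, Omega_Lambda = 0.73.
import numpy as np
from scipy.integrate import quad

H0 = 71.0e5 / 3.086e24          # Hubble constant in 1/s (71 km/s/Mpc)
OMEGA_M, OMEGA_L = 0.27, 0.73   # assumed density parameters
E_PHOTON = 31.0                 # GeV, the highest-energy photon of the burst
E_PLANCK = 1.22e19              # GeV, the Planck energy

def lag_seconds(z, m_qg_in_planck_units=1.0):
    """Linear-LIV arrival lag accumulated by a photon of energy E_PHOTON over redshift z."""
    integrand = lambda zp: (1.0 + zp) / np.sqrt(OMEGA_M * (1.0 + zp)**3 + OMEGA_L)
    k, _ = quad(integrand, 0.0, z)
    return (E_PHOTON / (m_qg_in_planck_units * E_PLANCK)) * k / H0

print(lag_seconds(0.9))         # ~1 s if M_QG equals the Planck mass
print(lag_seconds(0.9, 1.2))    # ~0.9 s; an observed lag below this pushes M_QG above ~1.2 M_Planck
```

This only reproduces the order of magnitude of the quoted bound; the published limits use the measured burst timing and carefully defined confidence intervals.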

I actually don't believe that you believe otherwise. It would be downright preposterous.
 
  • #110
MTd2 said:
There would be no QCD fathers if they thought like that before studying it.
You're completely wrong about the history, too. Read, for example, "Twenty Five Years of Asymptotic Freedom" by Gross:

http://arxiv.org/abs/hep-th/9809060

It contains a detailed section about his path to asymptotic freedom.

It is very clear that he - and others - never studied theories that they thought were wrong (and not even theories that had no good circumstantial evidence of being right). So instead of a non-existent QCD (or QCD without a proof of the right behavior), he focused on sum rules and good no-go theorems (about positive beta functions), which were shown to have a loophole - and that loophole is what really created QCD. But there was no QCD before this discovery. Yang-Mills theory had been around for decades, but its relevance for the strong interactions couldn't have been understood.

The QCD discovery *is* the discovery of its asymptotic freedom, so there are no fathers of QCD (I mean the theory primarily with gluons) before the discovery of asymptotic freedom. The closest people may be Yang and Mills but saying that Yang-Mills theory would have been relevant for the strong force would have been pure and vacuous wishful thinking before some evidence was known - and the negative beta-function was the first evidence.
 
  • #111
humanino said:
You know, MTd2, Lubos admitted that he is in a political fight against LQG. I am not sure what the point is of continuing any scientific argumentation for or against it here; we merely have lobbying activity, and that is against PF rules. So I am not sure how much will be necessary and whether it is worth pushing in this direction; for instance, Lubos made several references to "God", which should be enough for moderation of a "regular" member.
I didn't write I am in political war with LQG. I am in political war with some people behind it, like Smolin, who want to destroy science as we've known it for centuries. It just happens that they also defend LQG - but it's not a coincidence that the people who defend unscientific methods to determine the truth also end up with unscientific theories.

"My" well-known signature about God who wrote the world in the language of mathematics is due to Galileo, and very similar comments came from Einstein as well as the majority of famous physicists, too. Feel free to suggest censoring but it shows something about you, not about Galileo, Einstein, or me, for that matter.
 
  • #112
marcus said:
I don't understand your reasoning, Sauron. LQG researchers tried for some years to derive a prediction of dispersion, but could not make the 4D theory yield such a prediction.
This observation makes dispersion less likely.

That's very interesting, Marcus. And what about e.g. these papers

http://arxiv.org/abs/gr-qc/9809038
http://arxiv.org/abs/hep-th/0108061
http://arxiv.org/abs/gr-qc/0411101
http://arxiv.org/abs/gr-qc/0403053
http://arxiv.org/abs/hep-th/0603002
http://arxiv.org/abs/hep-th/0111176
http://arxiv.org/abs/hep-th/0208193
http://arxiv.org/abs/hep-th/0501116
http://arxiv.org/abs/gr-qc/0207030
http://arxiv.org/abs/gr-qc/0207031
http://arxiv.org/abs/gr-qc/0207085
http://arxiv.org/abs/hep-th/0501091
http://arxiv.org/abs/hep-th/0605052
http://arxiv.org/abs/gr-qc/0404113

and hundreds of papers that cite them to almost unanimously conclude that LQG predicts an energy-dependent speed of light? What about all those loud LQG people who were so proudly saying that they had a "prediction"? What about basic physics knowledge that makes it completely obvious that every discrete model of space - with discrete real spectra - is inevitably incompatible with Lorentz symmetry because Lorentz symmetry implies that these spectra are both continuous and allowed to be imaginary?
marcus said:
If it is borne out by other similar observations, then this will help guide their development of LQG and save them trouble.

It certainly does not falsify the approach :biggrin: since there was no prediction that actually derived from the theory. I see this kind of Fermi-LAT observation as stimulating for LQG and the other QG approaches.

The task of deriving predictions still remains, and various avenues are being explored. But that is a separate topic. All this observational result does is give more direction and focus to the effort. Or? Please explain if you see it differently.

I don't think that you can be both mentally healthy and have elementary human honesty at the same moment if you're able to write these things. This story is over.
 
  • #113
lumidek said:
I am in political war with some people behind it, like Smolin, who want to destroy science as we've known it for centuries. It just happens that they also defend LQG - but it's not a coincidence that the people who defend unscientific methods to determine the truth also end up with unscientific theories.

In what way does Smolin want to destroy science? We all know that Smolin is no longer as keen on string theory as he maybe was a long time ago, but assuming you're not equating "science in physics" with string theory, can you be a little bit more specific?

Smolin seeks a little more diversity and suggests we should not put all our eggs in one basket. Never have I read him say that string theory should not be researched. Smolin is quite open-minded, in contrast to some other people who appear very intolerant of differently thinking minds - for no convincing reason.

I don't see how trying to actively suppress variety is good scientific method. The rational choice would be to allocate resources according to probable potential. Of course, each scientist may (without contradiction) rate differently which programs are more probable. That's why there is diversity in the community.

So is the politics what you have in mind? Or are you talking about Smolin's rejection of eternal, timeless laws?

If I'm not mistaken, you're a hired professional, right? Why would other programs threaten you? If you are right and everyone else is wrong, given time you will be the hero. Why not, for the sake of healthy diversity, give your opponents a break?

I don't think anyone here suggests you should drop doing string theory. You obviously burn for it, so go for it with all you've got.

Also, I don't like the main LQG program, for other reasons. But fortunately LQG and string theory aren't the only options either. And if there's no existing program, no intolerance should ban me from exploring one.

You seem to lump all "non-string approaches" together and seem to think that the string framework is certainly true and unquestionable, and that whatever comes next will fit into your string world.

This is the intolerance I react to. I don't see Smolin displaying even a fraction of such intolerance.

Can you enlighten me as to what is so horrible about Mr. Smolin? Also, as far as I know, Smolin isn't into pure LQG; he has been elaborating a lot of ideas, including the idea of evolving law, which really doesn't fit in the LQG framework. I find Rovelli's and Smolin's reasoning to be in strong contrast on several important points.

/Fredrik
 
  • #114
Certainly the finding in this thread would - given that the confidence can be established - constrain only certain linear LIV models (which appear to be only a small subset of possible "non-stringy approaches"), which is nothing that worries me anyway, and it is no convincing argument for string theory.

/Fredrik
 
  • #115
lumidek said:
That's very interesting, Marcus. And what about e.g. these papers

That's because you call "LQG papers" those that are actually about non-LQG quantum gravity, or ones that failed to see Lorentz-breaking effects in LQG! :)
 
  • #116
lumidek said:
It is very clear that he - and others - never studied theories that they thought were wrong (and not even theories that had no good circumstantial evidence to be right).
As opposed to people who study theories that they think are wrong? This is not an issue. People study new things because they don't know their final form. This is the same case with gravitational asymptotic safety.
 
  • #117
Hans de Vries said:
Now, while agreeing with you, how would you explain that your favorite theory doesn't exhibit the same problem? Why doesn't the photon propagator become "fuzzy" with all these complicated geometry paths at the Planck scale?

The (not so well known) "photon self-propagator", which has the photon field itself as a source rather than the current j, does a wonderful job of canceling out the contributions of all paths other than the light-cone path (see sect. 1.19 of http://physics-quest.org/Book_Chapter_EM_basic.pdf ), but it needs a flat geometry at the Planck scale.

Regards, Hans

lumidek said:
Because all the interactions that govern physics in the 10/11-dimensional space are Lorentz-invariant i.e. under SO(3,1), all effective actions encoding how the subtle massive and wound objects will influence photons in 3+1D will be Lorentz-invariant, too. So there's never any predicted dispersion or anything like that. All the fundamental laws of string theory are exactly Lorentz-invariant, which means that all the effective laws obtained with them - and all predictions - will be Lorentz-invariant, too. Lorentz invariance implies that the speed of light is a universal constant.

Hi, Lubos

I see you went into this issue by answering ensabah6's post. Thank you for doing so.

lumidek said:
Lorentz invariance implies that the speed of light is a universal constant.

Nevertheless, the propagation speed of massless particles, in this case the photon, is often not equal to c even if the theory is Lorentz invariant, the best-known example being flat spaces with an even number of spatial dimensions, whose photon Green's functions are non-zero inside the light cone. (Figure 1, http://physics-quest.org/Higher_dimensional_EM_radiation.pdf )

Gravitational lensing is another case where you get a dispersion of the photon propagator.

So it doesn't seem that easy for string theory to escape dispersion due to its complicated geometry paths at the Planck scale (except for the trivial brane-world case, of course).

Do you want to say that you expect the holonomy requirements which lead to CY and G2 manifolds to be responsible for the cancellation of the non-light-cone paths? And if so, do you have any interesting links?

Regards, Hans
 
  • #118
A counterpart to Lubos' reaction would be to affirm that the recent results by LIGO, e.g.,

http://egregium.wordpress.com/2009/08/20/news-from-ligo/

completely falsify the string theory programme (where in fact they only appear to rule out some ST models that favor the existence of cosmic strings).
 
  • #119
The great-grandfather of spin foams is Penrose. If we take him seriously, our 4-dimensional space-time emerges from an underlying twistor space. In this context, light-cone structures are maintained even with quantum fluctuations of the geometry, so Lorentz invariance should hold firm. Witten took an interest in this direction, as you know, and it's quite an interesting possibility even for string theory. Also, in that case, locality should be reconsidered.
 
  • #120
lumidek said:
...
Strict and careful analysis implies that it doesn't matter whether a theory is falsified by one photon or 2009 photons.
...
Ready to make a bet?

What if the single photon was due to a detector malfunction? Do you know all of its inner workings well enough to be sure?

In the real world you never get 100% certainty, yes. But if a different group with a different detector got the same result, then I think people would start to accept a strong word like "falsify". This is exactly not about betting, but about careful language.

The LHC has two technically completely different detectors constructed and operated by different groups. Is it a waste of resources?
 
  • #121
Micha said:
What if the single photon was due to a detector malfunction? Do you know all of its inner workings well enough to be sure?

In the real world you never get 100% certainty, yes. But if a different group with a different detector got the same result, then I think people would start to accept a strong word like "falsify". This is exactly not about betting, but about careful language.

The LHC has two technically completely different detectors constructed and operated by different groups. Is it a waste of resources?

This is an interesting question - the experiment should be done by a "competent" group, but the competence can only be verified by another "competent" group (or groups) repeating their results. But if the competence is verified, then the first observation would constitute the falsification.

Like the proof of Fermat's last theorem - when is it proved?

It's like the event horizon - it forms in the past but is known only in the future? :-p

Edit: I guess Perelman's proof would illustrate the sociological issues better?
 
  • #122
Would quantization of time break Lorentz invariance? Does string theory posit that time is infinitely continuous?

Would it break Lorentz invariance if both space and time are quantized, and all photons of all frequencies can travel one Planck length per Planck time?
 
  • #123
atyy said:
This is an interesting question - the experiment should be done by a "competent" group, but the competence can only be verified by another "competent" group (or groups) repeating their results. But if the competence is verified, then the first observation would constitute the falsification.

Like the proof of Fermat's last theorem - when is it proved?

It's like the event horizon - it forms in the past but is known only in the future? :-p

Edit: I guess Perelman's proof would illustrate the sociological issues better?

I consider these questions to be mostly artificial, created by our language.

Proof, falsification, and so on - these words work like Boolean algebra, either true or false. Anything in the real world, however, like collecting evidence, is a continuous process. Only our impatient minds want to attach labels. And labels are useful for orientation.

I just think Lubos is a little too impatient in applying them, which invites criticism and discussion where it is unnecessary.
 
  • #124
Micha said:
Proof, falsification, and so on - these words work like Boolean algebra, either true or false. Anything in the real world, however, like collecting evidence, is a continuous process. Only our impatient minds want to attach labels. And labels are useful for orientation.

Yes, continuous is better than discrete - otherwise you violate Lorentz invariance :smile:
 
  • #125
atyy said:
Yes, continuous is better than discrete - otherwise you violate Lorentz invariance :smile:

:biggrin:
 
  • #126
Hans de Vries said:
Hi, Lubos

I see you went into this issue by answering ensabah6's post. Thank you for doing so.

Nevertheless, the propagation speed of massless particles, in this case the photon, is often not equal to c even if the theory is Lorentz invariant, the best-known example being flat spaces with an even number of spatial dimensions, whose photon Green's functions are non-zero inside the light cone. (Figure 1, http://physics-quest.org/Higher_dimensional_EM_radiation.pdf )

Gravitational lensing is another case where you get a dispersion of the photon propagator.

So it doesn't seem that easy for string theory to escape dispersion due to its complicated geometry paths at the Planck scale (except for the trivial brane-world case, of course).

Do you want to say that you expect the holonomy requirements which lead to CY and G2 manifolds to be responsible for the cancellation of the non-light-cone paths? And if so, do you have any interesting links?

Regards, Hans
Dear Hans, the constancy of the speed of light in the vacuum - regardless of the motion of the source, observer, or frequency - is a postulate of special relativity, so it's valid everywhere as long as relativity is taken seriously.

One may produce theories with a "different light", building upon special relativity. Massive light will obviously move slower than the "speed of light" (= the ultimate cosmic speed limit). There are good theoretical and experimental reasons to think that light can't be massive. Also, in string theory, one can slow the light down by the B-field two-form.

If there is such a B-field, gravitons continue to move at the maximum speed - which is often misleadingly called the "speed of light" - while the photons may become slower. But the B-field in the (realistic) vacuum must be zero for many other reasons, within string theory or independently of string theory.

In the context of a curved space, motion "at the speed of light" is generalized to "motion along null trajectories". All light in GR moves along null trajectories. Nothing is changed about this in the case of light bending (quite the contrary: the light-like shape of the trajectory is crucial in any calculation of light bending), and nothing is changed about it because of the G2 holonomy of compact manifolds in string theory, other shapes, embedded branes or fluxes, or any other feature of string theory.

The constancy of the speed of light is a fundamental and universal law of physics and you're confusing yourself if you want to think that any of the things you mentioned - or almost any other effect - destroys the validity of this postulate of relativity. It doesn't. And it doesn't destroy the validity of any consequence of relativity, either. It remains true in string theory as well as any other theory that is compatible with the observations of spacetime geometry as known in 2009. Relativity is a universal set of rules about the relationships between space and time, momentum and energy, electricity and magnetism, and other things. It's valid at all distance scales.
 
  • #128
Hi, Lubos

Thank you for replying, I do appreciate it.

lumidek said:
Dear Hans, the constancy of the speed of light in the vacuum - regardless of the motion of the source, observer, or frequency - is a postulate of special relativity, so it's valid everywhere as long as relativity is taken seriously.

One may produce theories with a "different light", building upon special relativity. Massive light will obviously move slower than the "speed of light" (= the ultimate cosmic speed limit). There are good theoretical and experimental reasons to think that light can't be massive. Also, in string theory, one can slow the light down by the B-field two-form.
I was referring here to the standard theory, where the Green's functions of the photon in spaces with an even number of spatial dimensions ((2n+1)-dimensional spacetime) are non-zero inside the light cone (although still expressed in the Lorentz-invariant parameter t^2 - r^2).

So, although the Green's functions are mostly concentrated on the light cone, they are not 100% so, and there is dispersion. This is the origin of the fact that Huygens' principle is not valid for (2n+1)-dimensional spaces. There was a discussion on this a while ago at sci.physics.research.
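For concreteness, the standard retarded Green's functions of the massless wave equation make the contrast explicit (c = 1, overall constants omitted; this is textbook material rather than anything taken from the references below):

```latex
% 3+1 dimensions: support only on the light cone, so Huygens' principle holds.
G^{(3+1)}_{\mathrm{ret}}(t,\mathbf r) \;\propto\; \frac{\delta(t-r)}{r},
\qquad
% 2+1 dimensions: a tail everywhere inside the light cone, although it is still
% a function of the Lorentz-invariant combination t^2 - r^2 only.
G^{(2+1)}_{\mathrm{ret}}(t,\mathbf r) \;\propto\; \frac{\theta(t-r)}{\sqrt{t^2-r^2}}.
```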

You may find my old derivation, http://physics-quest.org/Higher_dimensional_EM_radiation.pdf , the most accessible, but it was not the first. Some relevant links:

[1] S. Hassani, Mathematical Physics (Springer-Verlag, New York, 1998), which is said to contain a complete derivation of the massless propagators in the space-time domain in any dimension.

[2] D. V. Gal'tsov, Radiation reaction in various dimensions, Physical Review D 66, 025016 (2002). hep-th/0112110
http://arxiv.org/abs/hep-th/0112110

[3] Cardoso et al., Gravitational Radiation in D-dimensional Spacetimes, Physical Review D 67, 064026 (2003). hep-th/0212168
http://arxiv.org/abs/hep-th/0212168

Regards, Hans
 
  • #129
Dmitry67 said:
I wonder what superstring theories say about this: http://en.wikipedia.org/wiki/Faster...t_.28Casimir_vacuum_and_quantum_tunnelling.29
-- the Scharnhorst effect

Is it always so small? Even at the Planck scale?
Also, no matter how small the difference is, is the "c" in all formulas the apparent light speed (reduced by the vacuum) or the fundamental one?
Dear Dmitry, good points.

These Casimir changes to the speed of light are correctly derived from low-energy approximate physical theories, and because string theory reduces to them at distances longer than the Planck scale, it also reproduces the conclusions about the speed-up in the Casimir context.

The constancy of the speed of light refers to the vacuum, and when you have the Casimir plates, it's no longer considered "vacuum". Still, if one properly calculates how quickly the "information" in the light can be transmitted, this speed never exceeds the speed of light in the vacuum.
 
  • #130
Hans de Vries said:
Hi, Lubos

Thank you for replying, I do appreciate it.

I was referring here to the standard theory, where the Green's functions of the photon in spaces with an even number of spatial dimensions ((2n+1)-dimensional spacetime) are non-zero inside the light cone (although still expressed in the Lorentz-invariant parameter t^2 - r^2).

So, although the Green's functions are mostly concentrated on the light cone, they are not 100% so, and there is dispersion. This is the origin of the fact that Huygens' principle is not valid for (2n+1)-dimensional spaces. There was a discussion on this a while ago at sci.physics.research.

You may find my old derivation, http://physics-quest.org/Higher_dimensional_EM_radiation.pdf , the most accessible, but it was not the first. Some relevant links:

[1] S. Hassani, Mathematical Physics (Springer-Verlag, New York, 1998), which is said to contain a complete derivation of the massless propagators in the space-time domain in any dimension.

[2] D. V. Gal'tsov, Radiation reaction in various dimensions, Physical Review D 66, 025016 (2002). hep-th/0112110
http://arxiv.org/abs/hep-th/0112110

[3] Cardoso et al., Gravitational Radiation in D-dimensional Spacetimes, Physical Review D 67, 064026 (2003). hep-th/0212168
http://arxiv.org/abs/hep-th/0212168


Regards, Hans
Dear Hans, very good points. Indeed, the Green's functions are zero or nonzero inside the light cone depending on the dimension. But this fact doesn't mean that light doesn't move at the speed of light.

There are several kinds of Green's functions and they differ in details of their interpretation and behavior. Generally, they're the response functions, and any point inside the future light cone of P is allowed to "respond" to changes at point P.

But the propagation of physical photons is only governed by the so-called "on-shell physics", namely the behavior of the Fourier-transformed Green's functions - which are functions of energy and momenta - near the on-shell loci of the momentum space, e.g. near E^2 = p^2 c^2 in this case.

If you change the momentum-space Green's functions away from this locus arbitrarily, the propagation of the light won't be affected, but the behavior of the spatial Green's functions inside the light cone will. This proves that not all information in the Green's functions is relevant for questions about the propagation of physical particles: for example, a field redefinition leads to completely different Green's functions but completely the same propagation of physical particles. So you must be very careful when you try to extract conclusions about propagation from the Green's functions. You have apparently used an algorithm that was oversimplified - well, it was incorrect.

Well, it's true that the strong quantitative version of Huygens' principle fails in odd spacetime dimensions, because of the point you wrote about.
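One way to phrase that on-shell point schematically (the notation is chosen here and is not from the post; c = 1):

```latex
% Momentum-space photon propagator: only the pole position controls the
% propagation of physical (on-shell) photons.
\tilde G(E,\mathbf p) \;=\; \frac{Z}{E^{2}-\mathbf p^{2}+i\epsilon}
\;+\;\bigl(\text{terms regular at } E^{2}=\mathbf p^{2}\bigr).
% The pole at E=|\mathbf p| gives the same speed for all frequencies; the regular
% terms can change G(t,\mathbf r) inside the light cone without shifting the pole,
% i.e. without introducing any dispersion.
```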
 
  • #131
lumidek said:
Dear Hans, the constancy of the speed of light in the vacuum -
The constancy of the speed of light is a fundamental and universal law of physics and you're confusing yourself if you want to think that any of the things you mentioned - or almost any other effect - destroys the validity of this postulate of relativity. It doesn't. And it doesn't destroy the validity of any consequence of relativity, either. It remains true in string theory as well as any other theory that is compatible with the observations of spacetime geometry as known in 2009. Relativity is a universal set of rules about the relationships between space and time, momentum and energy, electricity and magnetism, and other things. It's valid at all distance scales.

Or so you assert.

I agree with you that the role of science is to reduce a mess of observational data to a minimum of elegant principles that reproduce the observational data as an emergent feature.

Sometimes one makes missteps along the way. One embraces fundamental truths which turn out to be only approximate truths in retrospect - Euclidean geometry and Galilean relativity, to name just a few.

Sometimes the way forward is to take a step back. Lorentz invariance may be just such a case. You seem to treat it as an article of faith. << personal insult removed by Mentors >>
 
  • #132
Eelco said:
Sometimes, the way forward, is to take a step back. Lorentz invariance may be just such a case. You seem to treat it as an article of faith.

Well, if religion is a belief without sufficient proof then Special Relativity
is certainly no religion, to say the least...

For string theorists, any proof that Lorentz invariance holds at "all" length scales is of course welcome. Otherwise they would not only have to deal with a complex geometry at the Planck scale but also with a deviation from one of the most elementary principles of physics. Not really a workable situation...

Regards, Hans
 
  • #133
Hans de Vries said:
Well, if religion is a belief without sufficient proof then Special Relativity is certainly no religion, to say the least...
As you say in the following paragraph, he implies invariance at all scales. So that is religion for you too, since you apparently doubt Lorentz invariance at all scales.
 
  • #134
Hans de Vries said:
Well, if religion is a belief without sufficient proof then Special Relativity
is certainly no religion, to say the least...
Neither was Euclidean geometry. Euclidean geometry is no religion, but asserting that geometry could only ever possibly be Euclidean, because you say so, is a religious attitude, even in the absence of any observational evidence to the contrary.

Does the fact that you cannot conceive of any other possibility prove it must be so, or is it evidence of your own limitations?


Hans de Vries said:
For string theorists, any proof that Lorentz invariance holds at "all" length scales is of course welcome. Otherwise they would not only have to deal with a complex geometry at the Planck scale but also with a deviation from one of the most elementary principles of physics. Not really a workable situation...
Ditching Euclidean geometry could even more aptly be described as 'not really a workable situation'.

It is of course perfectly workable; the trick is finding an alternative paradigm. Or maybe the real trick is wanting to find an alternative paradigm, which indeed probably isn't that welcome if you've invested the best years of your life into one already.
 
  • #135
lumidek said:
Dear Dmitry, good points.

These Casimir changes to the speed of light are correctly derived from low-energy approximate physical theories, and because string theory reduces to them at distances longer than the Planck scale, it also reproduces the conclusions about the speed-up in the Casimir context.

The constancy of the speed of light refers to the vacuum, and when you have the Casimir plates, it's no longer considered "vacuum". Still, if one properly calculates how quickly the "information" in the light can be transmitted, this speed never exceeds the speed of light in the vacuum.

Thank you.
What about different sorts of very dense "false vacuum" very soon after the Big Bang?
In the reply above, by "vacuum" do you mean
* the lowest-energy vacuum, like ours (but it is not guaranteed to be the lowest one!), or
* the 'current' vacuum in the Universe (so at the time of the false vacuum we should use the false vacuum)?
* Could the Casimir effect (and FTL between plates) be much stronger in the false vacuum?
 
  • #136
MTd2 said:
As you say in the following paragraph, he implies invariance at all scales. So that is religion for you too, since you apparently doubt Lorentz invariance at all scales.

The difference between all length scales and "all" length scales (all relevant length scales) is an altogether different discussion.

I would think that Lorentz invariance can be an emergent effect, but it's certainly not trivial because of the extreme accuracy required and, specifically in this case, the total lack of dispersion.

Regards, Hans
 
  • #137
Eelco said:
Or so you assert.

I agree with you that the role of science is to reduce a mess of observational data to a minimum of elegant principles that reproduce the observational data as an emergent feature.

Sometimes one makes missteps along the way. One embraces fundamental truths which turn out to be only approximate truths in retrospect - Euclidean geometry and Galilean relativity, to name just a few.

Sometimes the way forward is to take a step back. Lorentz invariance may be just such a case. You seem to treat it as an article of faith. << personal insult removed by Mentors >>
Except that the local Lorentz symmetry can't be an approximation, unless the whole structure collapses - a fact that Einstein realized very well and emphasized in all his essays. He made many mistakes, but this was not one of them, as we know a century later. There exist approximate truths, but there also exist generally valid principles, and relativity is clearly among the latter (much like a few conservation laws, the general postulates of quantum mechanics, and, in the macroscopic context, the laws of thermodynamics).

The right religious analogy is between Lorentz violation and Allah. The difference between the Lorentz violation and Allah is that Allah has already been seen twice. ;-)

Making bold and loud statements of your kind without any glimpse of any violation of relativity, well... fails to be sensible at the basic level. Lorentz symmetry is a fact of Nature, regardless of the volume that you or someone else invests into the denial of the basic and most important laws of physics. These principles are the insights that true physics knowledge is all about, and whoever doesn't understand them and their inevitability just doesn't understand the core of 21st-century physics.

Incidentally, effective field theories are meant to be approximations, but string theory can't be an approximation, either. If the world is described by string theory, there can't be any "small corrections" added to the physics of string theory. String theory can't be deformed, a fact that is demonstrated in the major textbooks of the subject. I know that the universal validity of some laws is extremely unpopular with the postmodernists who want everything to be "relative", but it's true anyway.
 
  • #138
Hans de Vries said:
specifically in this case: The total lack of dispersion.

When you spoke about gravitational lensing, I thought you were referring to the idea that some of the photons could travel along slightly longer paths due to quantum fluctuations, and that the smaller the wavelength, the more frequently a photon would go through such fluctuations and the longer its path would be. The speed of light would not be violated, but it would bring an apparent stochastic element to the arrival times of such photons.
 
  • #139
Dmitry67 said:
Thank you.
What about different sorts of very dense "false vacuum" very soon after the Big Bang?
In the reply above, by "vacuum" do you mean
* the lowest-energy vacuum, like ours (but it is not guaranteed to be the lowest one!), or
* the 'current' vacuum in the Universe (so at the time of the false vacuum we should use the false vacuum)?
* Could the Casimir effect (and FTL between plates) be much stronger in the false vacuum?
Dear Dmitry,

the false vacuum e.g. during inflation may have a higher curvature, i.e. it is a de Sitter space. So is the vacuum in our world. The isometry of the de Sitter space is the de Sitter group. Its contraction is the Poincaré group - and the latter is thus relevant for physics at distances much shorter than the radius of de Sitter curvature.
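A minimal sketch of the contraction being referred to, in conventions chosen here rather than taken from the post:

```latex
% so(4,1) generators M_{AB}, A,B=0,\dots,4, with \eta=\mathrm{diag}(-,+,+,+,+).
% Rescale by the de Sitter radius \ell to define the would-be translations:
P_\mu \;\equiv\; \frac{M_{\mu 4}}{\ell}
\quad\Longrightarrow\quad
[P_\mu,P_\nu] \;=\; \frac{i}{\ell^{2}}\,M_{\mu\nu},
\qquad
[M_{\mu\nu},P_\rho] \;=\; i\bigl(\eta_{\mu\rho}P_\nu-\eta_{\nu\rho}P_\mu\bigr).
% As \ell \to \infty the translations commute and the algebra contracts to the
% Poincare algebra iso(3,1), relevant at distances much shorter than \ell.
```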

To make the sentence about the Lorentz symmetry strict, "vacuum" should be the lowest energy state of an asymptotically Minkowski superselection sector of the Hilbert space - an empty flat spacetime.

But that doesn't mean that the statement is completely wrong in any other context. It can be seen that the empty de Sitter space with the cosmological constant 10^{-120} in Planck units preserves the same statements, up to corrections of order 10^{-120} (or zero, in properly formulated statements).

The vacuum with some matter inside conserves the Lorentz symmetry, too. In fact, the Lorentz symmetry is supposed to act on all the states in the same superselection sectors, to make any sense and to be useful.

There are all kinds of "variations" of the idealized situation of the empty Minkowski space, and many conclusions from the empty space are still exactly valid while many others are valid up to corrections that can be shown to be small.

I can't enumerate all related statements in all conceivable contexts in this comment because that would be like listing all facts about physics: all of physics is, in some sense, encoded in the physics of the vacuum. Your question is just not specific enough to have a good answer.

The Casimir forces and the vacuum energy from a false vacuum are, at least in the first approximation, two independent corrections to the flat Minkowski space that simply add up. In the second order, their effects are nonlinear and don't just add up, but the addition is surely sufficient for virtually all situations worth considering (or measurable in practice).

Is there some reason why you're asking this completely technical question? I don't think it has any far-reaching consequences.
 
  • #140
Hans de Vries said:
The difference between all length scales and "all" length scales (all relevant length scales) is an altogether different discussion.

I would think that Lorentz invariance can be an emergent effect, but it's certainly not trivial because of the extreme accuracy required and, specifically in this case, the total lack of dispersion.

Regards, Hans
Well, Lorentz symmetry could have a priori been an emergent effect, an approximate fact that only arises in the long-distance approximation. Except that one can

1) show theoretically that, unlike baryon number conservation and its U(1) symmetry (which hold in the Standard Model only because no Lorentz-invariant, gauge-invariant, renormalizable term built from the SM fields can violate them - but nonrenormalizable ones can!), the Lorentz symmetry can't be an accidental symmetry for complicated systems with many fields. The amount of fine-tuning needed to obtain a Lorentz-invariant long-distance limit from a Lorentz-violating short-distance fundamental theory is effectively infinite if there are many fields (and particle species). There is no reason - neither a "dynamical" nor an "anthropic" reason - why Nature would make such a huge amount of fine-tuning just in order to confuse us about the composition of the symmetries in Her physical laws.

2) show experimentally that the Lorentz symmetry holds not only at long distances, something that was shown by 100+ years of successful tests of relativity, but even at the Planck scale, as Fermi has just shown. The probability that the Lorentz violation is visible in other effects - so that it physically exists - but just happens to be invisible in the propagation of photons, at the 1% accuracy level, is negligible. Why would Nature play these silly games with us all the time?

Isn't the assumption of such games equivalent to the belief that there are elephants flying everywhere in the skies on Sunday evening but the laws of physics just prevent us from looking at them on Sunday evening and no one has noticed? All these possibilities are silly conspiracy theories, not legitimate hypotheses that should be put on par with relativity or other justified laws of Nature. Such conspiracy theories may always be made even more awkward to avoid a falsification at a new level, so you can never "quite" falsify all of them, but the goal of science is not to create increasingly artificial conspiracy theories, but rather increasingly unified theories that actually describe an increasing set of observations with a decreasing collection of input (assumptions, independent concepts, and parameters).

I just don't understand why you still treat this question - about the emergent and/or fundamental character of the local Lorentz symmetry - as an open one. It is no longer open in August 2009, much like the question of creationism vs. evolution is no longer open. The Lorentz invariance holds at the Planck scale, too. Consequently, it can't be an emergent symmetry resulting from a high number of degrees of freedom conspiring in some way. This is an empirical fact by now. Get used to it.
 