LHC - absence of supersymmetric particles

In summary: The hierarchy problem is that the Higgs mass receives quantum corrections as large as the highest energy scale in the theory, so keeping it at its observed value without new physics (such as supersymmetry) requires an enormous amount of fine tuning of the bare mass against those corrections.
  • #1
Dennis Plews
Thread title edited by moderator.

I've not seen any reports from CERN concerning the detection of any putative supersymmetric particles. Is the absence of such detections a problem?
 
  • #2
It is a problem for theorists working on supersymmetry and trying to get funding.

Apart from that: We don't know if the universe has supersymmetry. If it has, we don't know the energy scale of it. "Visible at the LHC" was nice theoretically before the LHC started, but it is getting unlikely.
There are many possible models of new physics, supersymmetry is the most prominent one but not the only one.
 
  • Like
Likes Dennis Plews and dextercioby
  • #3
I disagree with mfb. The LHC has accumulated about 1% of the data it will eventually collect, and while it has ruled out many models, I don't think it's anywhere near having ruled out even most SUSY models.

For example, suppose you have a 100 GeV gravitino, a 400 GeV sneutrino, a set of degenerate squarks at 1750 GeV, and 4 TeV gauginos. I don't think anything excludes this spectrum, and the LHC will eventually get evidence for it, but it will take a while.
 
  • #4
mfb said:
It is a problem for theorists working on supersymmetry and trying to get funding.

Apart from that: We don't know if the universe has supersymmetry. If it has, we don't know the energy scale of it. "Visible at the LHC" was nice theoretically before the LHC started, but it is getting unlikely.
There are many possible models of new physics, supersymmetry is the most prominent one but not the only one.
Thank you for your reply.
Apart from Loop Theory, f/k/a "Chain Mail" (I don't recall whether it predicts particles heavier than Standard Model particles), I am unfamiliar with other models. What are they, and do they postulate particles within the energy range of the LHC?
 
  • #5
Vanadium 50 said:
I disagree with mfb. The LHC has accumulated 1% of the data that it will eventually, and while it has ruled out many models, I don't think it's anywhere near having ruled out even most SUSY models.
You cannot rule out SUSY.
And I didn't say we would have done so.

There is a huge parameter space left, but the most promising regions didn't show a signal. While the LHC will be able to study a larger range with more data in the future, most of the possible parameter space for SUSY will stay way above the capabilities of the LHC. The LHC can just access the most natural part - the TeV scale.
Dennis Plews said:
I am unfamiliar with other models. What are they and do they postulate particles within the energy range of the LHC?
Heavier versions of gauge bosons (W', Z', ...), a fourth generation, gravitons, more Higgs-like particles, technicolor, composite quarks, black holes, excited fermions, ... (these categories are not exclusive, and not exhaustive either).
There are also more model-independent searches: invisible Higgs decays, lepton-flavor-violating processes (e.g. Higgs -> tau mu), "any new resonance in the x+y spectrum" for various x and y, and so on.
And then you have the really exotic things: fractional charges or multiply charged particles, magnetic monopoles, long-lived very heavy particles (decaying in the calorimeters), and so on.
 
  • #6
mfb said:
most promising regions

This is the part I disagree with. "Most promising" is really "most popular". Popularity is determined by, among other things, how much cool stuff you might see right around the corner. I don't think the spectrum I posited does any worse than more popular models. It just has fewer cool things to discover in it.
 
  • Like
Likes Urs Schreiber
  • #7
It makes SUSY less natural and requires more fine-tuning.

Fine, let's call it "most popular". That is still an issue if you want to get funding as SUSY theorist.
 
  • #8
If you are looking for your car keys, and you search 0.1% of your house and can't find them, do you give up the search? This is why I think drawing conclusions at this phase of the LHC is way, way premature.

As far as "requires more fine-tuning": this idea of quantifying fine tuning is new, and I don't think it is helpful. How much fine-tuning do you need because the sun and the moon have the same angular size? Is this a problem with our models of solar system formation? The hierarchy problem can be expressed as ## m^2_h = m^2_{\rm bare} + \Lambda^2 ## (I'll ignore factors like ##16 \pi^2##). If this theory is good to the Planck scale, you need the Higgs bare mass to exactly match the radiative corrections, with opposite sign, to 38 digits. I agree this is a problem.

If you have SUSY at a few hundred GeV, you only need to match one digit. If it's at a few TeV, you need two digits. I don't see why the former should count as a perfectly good solution while the latter is completely unacceptable. And until the theory community got into a tug of war of "my theory is 30% more natural than yours!", not many other people did either.
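The digit-counting above can be sketched numerically. This is a rough illustration only: the exact Planck-scale figure depends on conventions and loop factors, so this crude estimate lands near 34 digits rather than the 38 quoted above.

```python
import math

M_H = 125.0  # observed Higgs mass in GeV

def tuning_digits(cutoff_gev):
    """Decimal digits to which m_bare^2 must cancel a correction
    of order cutoff^2 to leave m_h^2 ~ (125 GeV)^2."""
    return math.log10(cutoff_gev**2 / M_H**2)

# A few illustrative scales: a few hundred GeV, a few TeV, the Planck scale.
for scale in (500.0, 2000.0, 1.22e19):
    print(f"{scale:.3g} GeV -> ~{tuning_digits(scale):.0f} digits of cancellation")
```

With SUSY at 500 GeV this gives roughly one digit of cancellation, at 2 TeV roughly two, and with no new physics below the Planck scale roughly 34, matching the qualitative point of the post.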
 
  • Like
Likes Dennis Plews and Urs Schreiber
  • #9
Vanadium 50 said:
If you are looking for your car keys, and you search 0.1% of your house and can't find them, do you give up the search? This is why I think drawing conclusions at this phase of the LHC is way, way premature.
You know how bad that analogy is.
If I searched the whole house once, I can search it again spending more time on the search, but the probability that I lost the keys outside (or never had them) has increased.
Currently we have data equivalent to ~50/fb at 13 TeV (a bit more or less depending on the analysis). With 300/fb we can increase the search range, but it won't be 6 times as large - more typical is something like 1.5 times the mass range. With 3000/fb we can increase it again, but it won't be 60 times as large - we might get 2 times the current mass range.

That gives a large region left to search. But it also means we searched quite a large region already.
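Purely as an illustration of those numbers (not a real reach projection), the quoted ratios are roughly consistent with the mass reach growing logarithmically in integrated luminosity; the constant `k` below is simply fitted to reproduce the figures in the post and is not derived from anything.

```python
import math

def reach_ratio(lumi_fb, base_lumi=50.0, k=0.28):
    """Illustrative log-law: relative growth of the accessible mass
    range as integrated luminosity grows from base_lumi to lumi_fb.
    k is a made-up constant tuned to reproduce the quoted ratios."""
    return 1.0 + k * math.log(lumi_fb / base_lumi)

print(round(reach_ratio(300.0), 2))   # 6x the data -> about 1.5x the mass range
print(round(reach_ratio(3000.0), 2))  # 60x the data -> about 2x the mass range
```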
Vanadium 50 said:
If you have SUSY at a few hundred GeV, you only need to match one digit. If it's at a few TeV, you need two digits. I don't see the former as a perfectly good solution but the latter as completely unacceptable.
No one said that, and there is no hard cut. SUSY just gets less natural the higher the energy gets, and you don't even have to put percentages on it.
 
  • Like
Likes Dennis Plews and Dr Aaron
  • #10
Vanadium 50 said:
The hierarchy problem can be expressed as ## m^2_h = m^2_{\rm bare} + \Lambda^2 ## (I'll ignore factors like ##16 \pi^2##). If this theory is good to the Planck scale, you need the Higgs bare mass to exactly match the radiative corrections but in the opposite direction to 38 digits. I agree this is a problem.

What evidence is there that the Planck scale is relevant to the Higgs sector?
 
  • #11
Vanadium 50 said:
As far as "requires more fine-tuning", this idea of quantifying fine tuning is new, and I don't think helpful. How much fine-tuning do you need because the sun and the moon are the same angular size? Is this a problem with our models of solar system formation? The hierarchy problem can be expressed as ## m^2_h = m^2_{\rm bare} + \Lambda^2 ## (I'll ignore factors like ##16 \pi^2##). If this theory is good to the Planck scale, you need the Higgs bare mass to exactly match the radiative corrections but in the opposite direction to 38 digits. I agree this is a problem.

If you have SUSY at a few hundred GeV, you only need to match one digit. If it's at a few TeV, you need two digits. I don't see the former as a perfectly good solution but the latter as completely unacceptable. And until the theory community got into a tug of war with "my theory is 30% more natural than yours!" not many other people did either.

I don't agree that the hierarchy problem is really a "problem" that needs to be solved in any meaningful sense. But, the hierarchy problem was an important motivation to formulate SUSY in the first place and the higher the energy scale to which SUSY phenomena are constrained, the less well motivated SUSY is from this perspective.

Another major motivation for SUSY was to provide a cold dark matter candidate, but there are now pretty much no SUSY particles which can satisfy the experimental boundaries on dark matter parameters. SUSY predicted a WIMP in the tens or hundreds of GeV to explain dark matter, but direct detection experiments like LUX have ruled out these candidates, and astronomical evidence suggests that thermal relic dark matter needs to have a mass around 2-14 keV rather than at the GeV scale.

A third motivation for SUSY is that it naturally produces gauge unification of the three Standard Model coupling constants, which requires the beta functions of those coupling constants to differ materially from their Standard Model counterparts. But, thus far, the LHC has not discerned any deviation of the observed beta functions from the Standard Model, even though deviations in two of the three coupling constants should be discernible at the LHC unless the SUSY scale is very high.

A looming limitation on SUSY theories relates to neutrinoless double beta decay. SUSY theories generically tend to assume Majorana neutrinos, which give rise to neutrinoless double beta decay. But the experimental upper bounds on the rate at which this can occur have become increasingly tight, while, generically, a higher mass scale for SUSY favors higher rates of neutrinoless double beta decay. So it is not, in general, true that one can simply increase the SUSY scale without bound and save the theory from experimental falsification.

Another probe of very high energy physics that can't be directly tested at the LHC is the value of muon g-2, which is significantly different from the theoretically predicted value that a great deal of effort has gone into refining in recent years. This points to some BSM physics at high energies, but its quite small and well-defined absolute magnitude leaves a pretty modest window within which any SUSY theory has to fit, and hence provides upper and lower bounds on a SUSY scale to some extent. Naively, muon g-2 measurements point to a SUSY scale on the order of tens of TeV in that model-dependent analysis, which the LHC wouldn't be able to see. Of course, even if the muon g-2 discrepancy is real and not just a combination of systematic and theoretical error and statistical flukes in measurements, there is no fundamental reason why SUSY has to be the particular kind of BSM physics behind that discrepancy.

SUSY is also not great at providing a motivation for lepton universality violations, which are the flavor-of-the-week experimental anomaly at the LHC.
 
Last edited:
  • Like
Likes Dennis Plews, Dr Aaron and kodama
  • #12
kodama said:
what evidence is there that the Planck scale is relevant to the Higgs sector?
It doesn't have to be the Planck scale, it is generally the scale of new physics. With the expectation that this new physics doesn't have diverging contributions. The Planck scale is just the first scale where we are absolutely sure there has to be BSM physics.
SUSY is an example. You get contributions up to the scale of SUSY breaking. Above that, the supersymmetric particles cancel the contribution of the other particles, and the total contribution is finite.
 
  • Like
Likes kodama
  • #13
mfb said:
It doesn't have to be the Planck scale, it is generally the scale of new physics. With the expectation that this new physics doesn't have diverging contributions. The Planck scale is just the first scale where we are absolutely sure there has to be BSM physics.
SUSY is an example. You get contributions up to the scale of SUSY breaking. Above that, the supersymmetric particles cancel the contribution of the other particles, and the total contribution is finite.

What if there is no new physics until the Planck scale, and the Planck-scale BSM physics is not relevant to the Higgs sector?
 
  • #14
It has to be relevant. Otherwise you have infinities hanging around and the whole approach is wrong.
 
  • Like
Likes kodama
  • #15
mfb said:
It has to be relevant. Otherwise you have infinities hanging around and the whole approach is wrong.

What I have in mind is the Shaposhnikov and Wetterich paper predicting a 126 GeV Higgs, with no new physics from the Fermi scale to the Planck scale, and the Planck scale described by the asymptotic safety program. Given the assumptions of the Shaposhnikov and Wetterich paper and a 126 GeV Higgs, is there a hierarchy problem in their theory?
 
  • #16
kodama said:
What I have in mind is the Shaposhnikov and Wetterich paper predicting a 126 GeV Higgs, with no new physics from the Fermi scale to the Planck scale, and the Planck scale described by the asymptotic safety program. Given the assumptions of the Shaposhnikov and Wetterich paper and a 126 GeV Higgs, is there a hierarchy problem in their theory?

Yes, presumably a maximal hierarchy problem... Note, however, that there are a lot of technical caveats in this business, and it doesn't really make sense to discuss them until you understand the much simpler Standard Model hierarchy problem: namely, that particles of mass M coupled to the Higgs field generically give quantum contributions that scale like: $$\delta m^{2}_{h}\sim \alpha M^{2}$$
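To put a rough number on that scaling (a schematic estimate, taking the top Yukawa coupling ##y_t \approx 1## and an explicit one-loop factor as assumptions):

$$\delta m^{2}_{h} \sim \frac{y_t^2}{16\pi^2}\,\Lambda^2 \approx \frac{(1.2\times10^{19}\ \mathrm{GeV})^2}{158} \sim 10^{36}\ \mathrm{GeV}^2,$$

which, for a Planck-scale cutoff ##\Lambda##, dwarfs the observed ##m_h^2 \approx (125\ \mathrm{GeV})^2 \approx 1.6\times10^{4}\ \mathrm{GeV}^2##.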
 
  • Like
Likes Greg Bernhardt
  • #17
Haelfix said:
Yes, presumably a maximal hierarchy problem... Note, however, that there are a lot of technical caveats in this business, and it doesn't really make sense to discuss them until you understand the much simpler Standard Model hierarchy problem: namely, that particles of mass M coupled to the Higgs field generically give quantum contributions that scale like: $$\delta m^{2}_{h}\sim \alpha M^{2}$$

Since Shaposhnikov and Wetterich assume no new physics from the Fermi scale to the Planck scale, presumably the heaviest particle of mass M coupled to the Higgs field is the top quark in their approach, which predicts 126 GeV to within 1 GeV.

Is there a hierarchy problem and a fine-tuning problem if the heaviest particle coupled to the Higgs field is the top quark?
 
  • #18
ohwilleke said:
SUSY is also not great at providing a motivation for lepton universality violations, which are the flavor-of-the-week experimental anomaly at the LHC.
I am sorry to interrupt the discussion, but I am unfamiliar with lepton universality violations. Could you explain this a bit? Thanks!
 
  • #19
kodama said:
Is there a hierarchy problem and a fine-tuning problem if the heaviest particle coupled to the Higgs field is the top quark?
No. I said there was a maximal hierarchy, which means the cutoff is the Planck scale, which one way or the other is the ultimate limit of viability of the effective theory. There might be new physics before that scale, which improves the hierarchy problem, but absent that it certainly does not go away in any approach that allows gravity to couple quantum mechanically to matter. Now you could ask for a miracle and postulate that some new Planckian miracle makes the equations above vanish, but that's tantamount to solving the hierarchy problem.
 
  • #20
Haelfix said:
No. I said there was a maximal hierarchy, which means the cutoff is the Planck scale, which one way or the other is the ultimate limit of viability of the effective theory. There might be new physics before that scale, which improves the hierarchy problem, but absent that it certainly does not go away in any approach that allows gravity to couple quantum mechanically to matter. Now you could ask for a miracle and postulate that some new Planckian miracle makes the equations above vanish, but that's tantamount to solving the hierarchy problem.

How do Shaposhnikov and Wetterich predict the Higgs mass correctly with asymptotically safe gravity?

Also, is a conformal Higgs still not ruled out?
 
  • #21
kodama said:
How do Shaposhnikov and Wetterich predict the Higgs mass correctly

I looked it up. This is the eighteenth time you have brought this up. Eighteen.
 
  • Like
Likes arivero
  • #23
kodama said:
How do Shaposhnikov and Wetterich predict the Higgs mass correctly

Now we know, given mfb's link. They make multiple predictions, sometimes with a wide range: 150 +/- 24 GeV.
 
  • #24
Is there a hierarchy problem in the S&W theory, which does not use SUSY?
 
  • #25
kodama said:
Is there a hierarchy problem in the S&W theory, which does not use SUSY?

No idea. You're the one who keeps pushing this paper. Maybe you should read it.
 
  • Like
Likes arivero and weirdoguy
  • #26
In my opinion that paper absolutely deserves attention (it has over 200 citations), and I say that even though I favor string theory. I also have to say that the prediction of the Higgs mass was narrow and based on simple hypotheses (asymptotic safety of gravity, a desert above the Fermi scale, a positive gravitational contribution to the anomalous dimension of the Higgs quartic coupling).

Anyway, I have created a thread specifically for the question of whether and how the hierarchy problem exists for this theory.
 
Last edited:
  • Like
Likes kodama
  • #27
mfb said:
I wonder if all the other "correct" predictions got the same love.

Which other predictions do you think are also worth serious consideration?

To narrow the field: predictions that get close to the 126 GeV value within a range of plus or minus 6 GeV, have not been ruled out, and are consistent with other known facts given the current LHC data set.
 
  • #28
Dr Aaron said:
I am sorry to interrupt the discussion, but I am unfamiliar with lepton universality violations. Could you explain this a bit? Thanks!
See, for example:

We present a measurement of angular observables and a test of lepton flavor universality in the B→K∗ℓ+ℓ− decay, where ℓ is either e or μ. The analysis is performed on a data sample corresponding to an integrated luminosity of 711 fb⁻¹ containing 772×10⁶ BB̄ pairs, collected at the Υ(4S) resonance with the Belle detector at the asymmetric-energy e+e− collider KEKB. The result is consistent with Standard Model (SM) expectations, where the largest discrepancy from a SM prediction is observed in the muon modes with a local significance of 2.6σ.

Belle Collaboration, "Lepton-Flavor-Dependent Angular Analysis of B→K∗ℓ+ℓ−" (December 15, 2016).

See also a recent review pre-print on the subject https://arxiv.org/abs/1706.07808

A number of more recent results have cast doubt on these anomalies, however. Such as https://arxiv.org/abs/1705.05802.

Mick Mulder, for the LHCb Collaboration, "The branching fraction and effective lifetime of B0(s)→μ+μ− at LHCb with Run 1 and Run 2 data" (submitted May 9, 2017):

After Run 1 of the LHC, global fits to b→sℓℓ observables show a deviation from the Standard Model (SM) with a significance of ∼4 standard deviations. An example of a b→sℓℓ process is the decay of a B0s meson into two muons (B0s→μ+μ−). The latest analysis of B0(s)→μ+μ− decays by LHCb with Run 1 and Run 2 data is presented. The B0s→μ+μ− decay is observed for the first time by a single experiment. In addition, the first measurement of the B0s→μ+μ− effective lifetime is performed. No significant excess of B0→μ+μ− decays is observed. All results are consistent with the SM and constrain New Physics in b→sℓℓ processes.

<Moderator's note: Some links have been removed in accordance to our rules and policy.>
 
Last edited by a moderator:
  • Like
Likes Dr Aaron
  • #29
Was there a theoretical reason to have expected B0 decays to be in excess? I am at (or beyond) the limits of my understanding of this, but isn't the starting mass of the B0 the determining factor for the predicted decays?
 
  • #30
Dr Aaron said:
Was there a theoretical reason to have expected B0 decays to be in excess?
Excess relative to the SM prediction? Various BSM models change the rate relative to the SM, and typically they lead to a larger rate.
Dr Aaron said:
but isn't the starting mass of the B0 the determining factor for the predicted decays?
What do you mean by "starting mass"?
 
  • Like
Likes Dr Aaron
  • #31
From what I gather, the "universality" of the three varieties of leptons is the real problem here. They are lumped together as a single type of entity when they each have their own composition and therefore variances in the mass each type starts out with. To expect them to all decay similarly seems quite curious. You can fire three bullets of different masses from the same rifle but you certainly would not expect all three to land at exactly the same location downrange. The "violations" would seem to be inherent due to the false assumption that they were all the same in the beginning.
 
  • #32
Dr Aaron said:
From what I gather, the "universality" of the three varieties of leptons is the real problem here. They are lumped together as a single type of entity when they each have their own composition and therefore variances in the mass each type starts out with. To expect them to all decay similarly seems quite curious. You can fire three bullets of different masses from the same rifle but you certainly would not expect all three to land at exactly the same location downrange. The "violations" would seem to be inherent due to the false assumption that they were all the same in the beginning.
That doesn't make any sense.
 

FAQ: LHC - absence of supersymmetric particles

What is the LHC?

The Large Hadron Collider (LHC) is the world's largest and most powerful particle accelerator. It is operated by the European Organization for Nuclear Research (CERN) and straddles the Franco-Swiss border near Geneva, Switzerland.

What are supersymmetric particles?

Supersymmetric particles are hypothetical particles that are predicted by supersymmetry, a theory that proposes a symmetry between particles with integer and half-integer spin. They are thought to exist in order to explain certain discrepancies in the Standard Model of particle physics.

Why is the absence of supersymmetric particles at the LHC significant?

The LHC was built in part to search for evidence of supersymmetric particles, as their existence would provide support for the theory of supersymmetry. The absence of these particles at the LHC suggests that the theory may not be correct and could lead to a reevaluation of our understanding of particle physics.

What are some possible explanations for the absence of supersymmetric particles at the LHC?

One possible explanation is that supersymmetric particles are too heavy to be produced at the LHC's collision energies. Another is that they are produced too rarely, or decay in ways that are hard to distinguish from background. It is also possible that supersymmetric particles do not exist at all and the theory of supersymmetry is incorrect.

What are the implications of the absence of supersymmetric particles for the future of particle physics?

The absence of supersymmetric particles at the LHC has led to a reassessment of the theory of supersymmetry and its predictions. It may also open up new avenues of research and lead to the development of new theories to explain the fundamental nature of the universe. Further experiments and data analysis at the LHC and other particle accelerators will be crucial in determining the implications for the future of particle physics.
