Inelegant, Unnatural, Ugly BSM theme books

  • Thread starter: star apple
In summary, the conversation is about finding books similar to Peter Woit's "Not Even Wrong" and Lee Smolin's "The Trouble with Physics" for entertainment purposes. The discussion also touches on the concept of naturalness in physics and the upcoming publication of a new book by Sabine Hossenfelder. The conversation then shifts to a paper that discusses the limitations of the Standard Model in explaining electroweak symmetry breaking and the structure of particles, and ends with a comment on the different perspectives and audiences in physics.
  • #36
star apple said:
You mean our search for new physics is because the standard model is not asymptotically safe? and if it is safe.. no need for new physics even at the general relativistic level? no need for superstrings and loop quantum gravity, etc.? hmm...

I'm wondering what controls whether a field is asymptotically safe or not at certain energy.. or more technically as Sabine put it:
http://backreaction.blogspot.com/2014/03/what-is-asymptotically-safe-gravity-and.html

"But how theories in general depend on the energy scale has only been really understood within in the last two decades or so. It has been a silent development that almost entirely passed by the popular science press and goes under the name renormalization group flow. The renormalization group flow encodes how a theory depends on the energy scale, and it is at the basis of the idea of effective field theory."

So what controls the renormalization group flow?
 
  • #37
star apple said:
Why is naturalness less important in the final theory, any reference?

The arguments for naturalness are most natural in the context of considering our current theories as effective theories, i.e., low-energy theories that are useful at low energies but incomplete at high energies. The renormalization group is the tool that allows us to understand why we can have useful theories at low energies, even though we are ignorant of the true high-energy theory.
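To make the scale dependence concrete, here is a minimal sketch of one-loop running in QCD, the textbook example of renormalization group flow. The formula is the standard analytic one-loop solution; the reference values (alpha_s of about 0.118 at mu = 91.2 GeV, with n_f = 5 flavours) are illustrative choices, not a precision calculation:

```python
import math

def alpha_s(mu, alpha_ref=0.118, mu_ref=91.2, n_f=5):
    """One-loop running of the strong coupling with the energy scale mu (GeV).

    Analytic solution of d(alpha)/d(ln mu) = -(b0 / 2*pi) * alpha^2,
    with b0 = 11 - 2*n_f/3 > 0, i.e. QCD is asymptotically free.
    """
    b0 = 11.0 - 2.0 * n_f / 3.0
    return alpha_ref / (1.0 + alpha_ref * (b0 / (2.0 * math.pi)) * math.log(mu / mu_ref))

# The "same theory" looks different at different scales: the coupling shrinks
# as the energy grows, which is exactly what the RG flow encodes.
for mu in (10.0, 91.2, 1000.0):
    print(f"alpha_s({mu:6.1f} GeV) = {alpha_s(mu):.4f}")
```

The sign of b0 is what "controls" the flow here: a positive b0 drives the coupling down at high energies, a negative one would drive it up.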
 
  • #38
star apple said:
You mean our search for new physics is because the standard model is not asymptotically safe? and if it is safe.. no need for new physics even at the general relativistic level? no need for superstrings and loop quantum gravity, etc.? hmm...

If gravity and matter are asymptotically safe, then that means that quantum general relativity is valid to infinitely high energies, and there is no need for superstrings. The relationship between loop quantum gravity and asymptotic safety is unknown - heuristic arguments suggest that if loop quantum gravity does work, then it will be a form of asymptotic safety - however, this is only a very rough argument.
 
  • #39
star apple said:
So what controls the renormalization group flow?

If a quantum field theory is asymptotically safe, that means that it is valid up to infinitely high energy.
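As a toy illustration of that statement (with a made-up beta function, not anything from the Standard Model): if the flow dg/d(ln mu) = beta(g) has a UV-attractive fixed point g*, the coupling runs to a finite value instead of diverging, so the theory stays predictive at arbitrarily high energies. A minimal numerical sketch with the hypothetical beta(g) = 2g - g^2, whose nontrivial fixed point is g* = 2:

```python
def flow(g0, t_max=20.0, dt=0.01):
    """Euler-integrate the toy RG flow dg/dt = 2g - g^2, with t = ln mu.

    g* = 2 is a UV-attractive fixed point (beta(g*) = 0, beta'(g*) < 0):
    any g0 > 0 flows to g* instead of diverging as t -> infinity.
    """
    g = g0
    steps = int(t_max / dt)
    for _ in range(steps):
        g += dt * (2.0 * g - g * g)
    return g

# Very different starting couplings all end up at the same finite UV value,
# which is the sense in which an "asymptotically safe" coupling stays tame.
for g0 in (0.1, 1.0, 5.0):
    print(f"g0 = {g0:>3}: g at the UV end of the flow = {flow(g0):.4f}")
```

In this caricature, "what makes the theory safe" is just the shape of the beta function; whether gravity's beta function actually has such a fixed point is the open question.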
 
  • #40
atyy said:
If a quantum field theory is asymptotically safe, that means that it is valid up to infinitely high energy.

But what makes the QFT asymptotically safe in the first place at infinitely high energy? Doesn't that require new physics? So when we say we don't need new physics if the standard model is asymptotically safe, isn't whatever makes it asymptotically safe in the first place itself some new physics?
 
  • #41
I'm still wondering about my earlier question, which I'll repeat here since it seems relevant for this topic:

To what extent is finetuning (and hence naturalness) an artefact of doing perturbation theory? Are there exactly soluble QFTs which suffer from naturalness/finetuning problems?

I mean, how would finetuning of the Higgs mass show up in a non-perturbative formulation of the SM?

I thought the question was appropriate here, so I won't start a new topic. Without wanting to hijack this thread, of course ;)
 
  • #42
haushofer said:
To which extent is finetuning (and hence naturalness) an artefact of doing perturbation theory? Are there exactly soluble QFT's which suffer from naturalness/finetuning problems?

I mean, how would finetuning of the Higgs mass show up in a non-perturbative formulation of the SM?

There is an interesting discussion in https://www.google.com.sg/url?sa=t&...0D0sQFgg6MAY&usg=AOvVaw2-LVf2T6qnYUCeZ5kTGnKa.
 
  • #43
atyy said:
Yes. Nonperturbatively, naturalness relates to the sensitivity of the theory to small changes in a dimensionless parameter: https://www.google.com.sg/url?sa=t&...0D0sQFgg6MAY&usg=AOvVaw2-LVf2T6qnYUCeZ5kTGnKa

Great article. The first thing that came to my mind was why physicists didn't focus more on a nonperturbative scheme instead of proposing supersymmetry to handle the quadratic divergences. Supersymmetric particles wouldn't appear in a nonperturbative scheme, just as virtual particles are a side effect of perturbation theory that isn't there in lattice QFT. Unless they think the perturbation method could be intrinsically chosen by nature?
I think it is analogous to the criteria for well-posedness: http://reference.wolfram.com/language/tutorial/DSolveWellPosedness.html
 
  • #45
haushofer said:
Thanks, I'll check it out!

Please share how you understand the paper. There is a passage on page 3 that puzzled me: "In brief, the quadratic divergences are completely irrelevant for the naturalness and fine-tuning problems involving the physical parameters."

How do you interpret that statement? Does it mean the nonperturbative approach does or doesn't remove the Higgs hierarchy problem? And when it mentioned the "gauge hierarchy problem", did it mean the Higgs?

Also, the paper was written in 1983, a time before we even had cellphones, so it is quite old. Now, after more than 30 years, is there any update to it, or new jargon in use? For example, "relativistic mass" is no longer used; is something similar true of the terms in the paper? atyy? Anyone?

Thank you.
 
  • #47
atyy said:
I don't understand the paper well, but Wetterich has written more recent papers that do mention naturalness etc., so those could help us understand whether his thinking has changed or not.
https://arxiv.org/abs/0912.0208
https://arxiv.org/abs/1612.03069

Haushofer's mention of the nonperturbative approach yesterday got me thinking about the electron's gyromagnetic ratio, for which perturbation theory can produce a value accurate to better than one part in 10^10, or about three parts in 100 billion. I meant to bring this up yesterday, so let me ask about it now. After reading the archives on the nonperturbative approach, I found this message of yours, written on April 4, 2011, in message 78 of https://www.physicsforums.com/threads/non-perturbative-qft-without-virtual-particles.485597/page-4

rogerl asked: "In Hierarchy Problem, the Higgs can have Planck mass because of quantum contributions. So what they do is propose that the virtual particles of Supersymmetric particles can cancel the very large quantum contributions in the Hierarchy Problem. Why do they have to take drastic measure and radical idea just to get rid of the large contribution if virtual particles are just multivariate integrals. Why didn't they just go to lattice methods to solve it?"

atyy replied: "That's an interesting question. I don't know. My understanding is that the underlying theory is given by special relativity, quantum mechanics, Wilsonian renormalization, and the standard model Lagrangian. I would guess that the fine tuning problem is a heuristic argument based on Wilsonian renormalization, so it should have a counterpart in a lattice language.

Also, is there such a thing as non-perturbative QED? Unless a QFT is asymptotically free or safe, isn't it by definition only perturbatively defined? According to http://www.sciencedirect.com/scienc...02d57ae15e181b9774e884147a99780a&searchtype=a , QED is likely not asymptotically safe. The only question then is how we choose to name the terms in a particular perturbation expansion."

atyy, it's been six long years since you wrote the above. Please update us on your understanding now. Do you think the Higgs hierarchy problem has a counterpart in lattice language? And after so many LHC results and half a dozen years of pondering, is there such a thing as a non-perturbative QED? What do you think? What's new in your thinking now compared to 2011?
 
  • #48
According to an expert/professor (Demystifier), finetuning and naturalness are not artifacts of doing perturbation theory. Also, for instance, if you study the SM on the lattice, you have to choose some UV cutoff for the lattice. The physical quantities may depend strongly on that choice, which can lead to a fine-tuning problem.
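To put a number on that kind of tuning, here is a caricature, assuming the schematic relation m_phys^2 = m_bare^2 - Lambda^2 with the coefficient of the quadratic divergence set to 1 and the cutoff at the Planck scale. Both choices are illustrative simplifications, not the actual SM computation:

```python
from decimal import Decimal, getcontext

getcontext().prec = 50  # double precision (~16 digits) could not even represent this

Lambda = Decimal("1.22e19")  # cutoff at the Planck scale, in GeV (illustrative)
m_phys = Decimal(125)        # observed Higgs mass, in GeV

# Schematic cutoff dependence: m_phys^2 = m_bare^2 - Lambda^2 (coefficient set to 1)
m_bare2 = m_phys ** 2 + Lambda ** 2
tuning = Lambda ** 2 / m_phys ** 2
print(f"m_bare^2 = {m_bare2:.6E} GeV^2")
print(f"m_bare^2 must cancel Lambda^2 to about 1 part in {tuning:.3E}")
```

The point of the high-precision arithmetic is the point of the hierarchy problem itself: the bare parameter has to be specified to some 34 digits for the leftover to come out at 125 GeV.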

So with the nonperturbative approach crossed out as a solution to the Higgs hierarchy problem, we are back to:

1. Supersymmetry (example of Naturalness)
2. Extra Dimensions (Randall RS1, RS2)
3. Natural Finetuning (Lubos')
4. Multiverse Anthropic principle
5. Scale Symmetry (is this an example of Naturalness?)

Let me ask you: when a grenade explodes on the ground, does anyone ever ask whether it's caused by Naturalness (simply by formula) or by the Multiverse? It may sound silly, does it not? So if we eliminate those three, we are left with:

1. Extra Dimensions (Randall RS1, RS2)
2. Scale Symmetry (is this an example of Naturalness?)

If we don't have extra dimensions, we are left with Scale Symmetry.

But is an exploding grenade caused by Scale Symmetry, where the distances on the ground and the size of the grenade were created on impact?

What seems to be missing from the choices is an anthropic principle without a Multiverse, in other words Intelligent Design. But let's not use those words, as they automatically suggest mindlessness; let's use the word "programming" instead. That's right: the Standard Model parameters could be programmed that way instead of coming from naturalness or extra dimensions or a multiverse, could they not?

What could still solve the Higgs hierarchy problem is if the Higgs is a composite. Is this still possible?

Again, someone please share whether scale symmetry is an example of naturalness, because I can't decide. Thanks.
 
  • #49
@star apple regarding your original question: you can find a list of books in the same spirit as Woit's and Smolin's here, and essays written in a similar spirit here.
 
  • #50
ohwilleke said:
The suggestion, I think it was in one of Lubos's blog posts, that the universe likes to be as extreme and "unnatural" as possible without breaking is probably a better hypothesis than "naturalness".

A list with pointers to where this idea has been voiced is on the nLab here: universal exceptionalism.
 
  • #51
Urs Schreiber said:
A list with pointers to where this idea has been voiced is on the nLab here: universal exceptionalism.

It is always good to learn new terms.
 
  • #52
Urs Schreiber said:
A list with pointers to where this idea has been voiced is on the nLab here: universal exceptionalism.

Lubos is as knowledgeable as Witten and one of the most powerful defenders of superstring theory and supersymmetry. But I thought superstrings and supersymmetry were about naturalness, with people looking for a certain Calabi–Yau manifold configuration to explain the constants of nature (is this not the goal of superstring theory?). So in his article https://motls.blogspot.com/2017/04/like-james-bond-nature-loves-to-walk.html, why was Lubos supporting unnaturalness? Was he saying that we were just lucky that, in one shot (without any Multiverse scenario), all the constants of nature, in the form of Calabi–Yau manifold configurations, lined up to produce our universe, as if we had won a one-in-a-billion-raised-to-the-tens lotto?
 
  • #53
ohwilleke said:
So what. Standard Model parameters aren't random variables and the claim that we have any plausible basis upon which to expect that they have any particular value a priori is nothing but disciplinary folk myth. The laws of Nature should fit together exactly perfectly and lo and behold, they do.
Which laws fit perfectly together? ;-) This is the problem with the thinking reflected in your comment.

I think there is sometimes confusion between understanding the process of learning and understanding knowledge. For some of us who think otherwise, this is not a myth; it's just the modest requirement of putting things into an evolutionary perspective.

The task at hand is to find these laws, and the guiding principles we use.

What seems unnatural and unexplainable seems so only because we do not yet see the evolutionary development. For example, human existence may seem unnatural to some, but understood in an evolutionary perspective it is rather natural. Evolution is as natural as anything gets.

Sabine put the case against naturalness clearly on her blog, though:

"But note that changing the parameters of a theory is not a physical process. The parameters are whatever they are."
-- http://backreaction.blogspot.se/2017/11/naturalness-is-dead-long-live.html

This was, I think, a clear statement, which is why I like it; however, I disagree with it.

If we look at "theory" as human science knows it, it unquestionably IS a physical process in theory space. We can call it learning, inference, or abduction to the best explanation, etc.

Step 2 is then to ask how an atomic nucleus "knows" which theory to obey. You might object that it must have obeyed the same laws even 100 years ago, when human science had not yet understood them. Yes, of course, this is true. But things are more subtle. If we think the laws of physics are universal, they apply also to complex systems, and to the BEHAVIOUR of complex systems. And if you also think about how microcausality could be implemented with any reasonable soundness, then it starts to seem absurd that atomic structures would "OBEY" rules written in the sky; that, if anything, is an irrational idea. Instead, it seems the only way to have some causality is for these rules to be literally encoded in the microstructure. This all leads to the idea of the evolution of law, if you add a principle of equivalence that the "laws of physics" (or, more correctly, the rules for self-organisation) must be the same on all complexity scales. The problem, though, is to understand what the real core of physical law IS. Maybe it is NOT a fixed mathematical structure? Maybe the core of the law is relations between structures? It may also be a fallacy to think of these as existing in a gigantic space of possible structures.

It's not fair to say this is a myth; it is rather a fairly new and unexplored idea.

Urs Schreiber said:
A list with pointers to where this idea has been voiced is on the nLab here: universal exceptionalism.
I do not see any conceptually sound reason behind those ideas. To me it sounds like some version of the old "mathematical beauty" argument or similar things.

Obviously, if string theory could simply PICK the RIGHT solution out of the landscape, one that describes our universe and unifies all forces, then the critique against the landscape would fade. But right now, the insight seems to be that the existence of this problem is telling us something about our strategy for navigating theory space. In short, we seem to be lost according to the map, but not in reality. So the way we charted the map seems wrong.

/Fredrik
 
  • #54
Fra said:
Obviously, if string theory could simply PICK the RIGHT solution out of the landscape, one that describes our universe and unifies all forces, then the critique against the landscape would fade. [...]

/Fredrik

The Microsoft Windows and MacOS operating systems don't uniquely pick out a certain company's data, because Windows and MacOS are operating systems and are programmable. If superstring theory is likewise an operating system and programmable, does this make superstring theory a success even now? It is only called a failure because string theory couldn't "simply PICK the RIGHT solution" out of the landscape, to use your words. But do Windows and MacOS pick out a certain company's data (like the company profile and data of Mercedes-Benz)? They don't, so could superstring theory be similar?
 
  • #55
star apple said:
I thought superstrings and supersymmetry were about naturalness where they were looking for certain Calabi–Yau manifold configuration to explain the constants of nature

There is no known mechanism in string theory that would dynamically prefer Calabi-Yau compactifications over other compactifications. The interest in CY-compactifications was entirely driven by the prejudice that nature should feature one unbroken supersymmetry at low (here: weak breaking scale) energy. For more see the string theory FAQ: "Does string theory predict supersymmetry?"
 
  • #56
Fra said:
I do not see any conceptually sound reason behind those ideas.

The entry starts out with the words "The philosophical sentiment..."
 
  • #57
star apple said:
The Microsoft Windows operating system or the MacOS operating system doesn't uniquely pick out a certain company data. [...] could Superstring Theory be similar?
Assuming I get your analogy, that is probably what some string theorists hope, but the problem I see is this:

There is nothing wrong with a "hypothesis space" as such, because that is how action and inference under uncertainty work.

The pathology is that a rational inference system would be unable to generate more hypotheses than we can manage to test or even handle. In an intrinsic inference, bounded resources for computing and encoding will always keep the map of hypothesis space manageable. Anything else should intuitively be an evolutionary disadvantage. This always ensures naturality.

In my eyes this merely shows that string theory, with all its nice features, is unfortunately not the right system. To find rational measures on the landscape that, after scaling, naturally explain the standard model probably requires some extra constructing principles.

Maybe these will be found and added to string theory to tame the landscape, though. Then, in retrospect, we will understand the fine-tuning issue and the landscape in a new light.

/Fredrik
 
  • #58
Fra said:
Maybe these are found and added to string theory to tame the landscape though.

Are you aware that the space of solutions to all other theories of nature that we know is much larger than the landscape of type II flux compactifications? The solution spaces of standard theories, such as general relativity or Yang-Mills theory, are continuous spaces of infinite dimension, hence of cardinality at least that of the real numbers. So a claim that the space of IIB flux compactifications is a finite number such as ##10^{500}## implies that it is tiny compared to what is to be expected from a space of solutions to a theory of nature. Even if finite numbers of this form "feel large", they are negligible compared to the cardinality of the continuum.

It is worthwhile to soberly think about what it would really mean if there were a unique string vacuum, or maybe two or three. It would be the Hegelian dream of all of nature derived uniquely from pure thought become real. While it is (or was) fun to hope that this is the case with string theory, it makes no sense to speak of a "problem" if it turns out not to be the case. That would be like speaking of a problem if you don't win the billion dollar lottery. It would have been great if you did, but now that you didn't this just means the situation is normal as it was before you bought the lottery ticket.
 
  • #59
Urs Schreiber said:
While it is (or was) fun to hope that this is the case with string theory, it makes no sense to speak of a "problem" if it turns out not to be the case.
I think for people like some of us here, this is the kind of "problem" that motivates us. So for me it IS a problem, although we can agree to put the "problem" in an appropriately geeky context that only a fraction of us care about. We certainly have bigger, but less intriguing, problems on Earth.
Urs Schreiber said:
That would be like speaking of a problem if you don't win the billion dollar lottery. It would have been great if you did, but now that you didn't this just means the situation is normal as it was before you bought the lottery ticket.
Your odds comparison I agree with. It isn't the first time I've heard that exact analogy. But there is only one way, and that is forward.
Urs Schreiber said:
Are you aware that the space of solutions to all other theories of nature that we know is much larger than the landscape of type II flux compactifications? The solution spaces to standard theories, such as general relativity or Yang-Mills theory, are continuous spaces of infinite dimension, hence of cardinality at least that of the real numbers. So a claim that the space of IIB flux compactificatins is a finite number such as ##10^{500}##, implies that it is tiny compared to what is to be expected from a space of solutions to a theory of nature. Even if finite numbers of this form "feel large", they are negligible compared to the cardinality of the continuum.
I am glad you bring up cardinality and measures. You are indeed touching on the core of the problems here. In fact I have been thinking a lot about this, and the problem of how to compare "evidence" in an inferential abstraction is one of the things that has led me to my current stance on all this.

Many problems are rooted in the fact that it is ambiguous to compare infinities appearing in a formal expression. Infinities are really defined by means of limits, and in continuum mathematics I feel that one has very often lost track of the original limiting procedure, its order, and its "rate". You can of course fix this, but there are many degrees of freedom in these models that are nonphysical, to the point where we confuse ourselves about what we are doing. You have similar problems in the foundations of probability theory and inference. When you try to build inferences, you have to be quite careful about counting, because if you want to compare two sets of evidence and both sets are infinite, then something is wrong. You then have to find integration measures on the spaces that are tuned to comply with the underlying limiting procedures. One of the problems with infinities, in my opinion, is that we have lost physical track of the real degrees of freedom, and we are LOST among the huge mathematical degrees of freedom. Especially if you start out from a classical system (a Lagrangian), you have this baggage of uncountable sets in there, and moreover in a disastrous mess! My goal is to make a reconstruction, starting not from classical mechanics but from an abstraction of inference. Continuum models will obviously still be preferred in the large-complexity limit, but it is just a gigantic mess to start with uncountable sets from square one.

/Fredrik
 
  • #60
Urs Schreiber said:
Are you aware that the space of solutions to all other theories of nature that we know is much larger than the landscape of type II flux compactifications? [...] Even if finite numbers of this form "feel large", they are negligible compared to the cardinality of the continuum.

How does this work, given that string theory is supposed to contain the other theories, like GR?
 
  • #62
Fra said:
So for me it IS a problem, although we can agree to put the "problem" in an appropriate geeky context which only a fraction of us care about. We sure have bigger - but less intriguing - problems on earth.
It is certainly an open problem of fundamental physics as such, but it is not a defect of string theory.

Besides, there is so little known for sure about points in the landscape, that all debate about whether it is "large" or "small" might better be postponed until it is really understood. It is easy to forget how many simplifying assumptions enter the identification of string backgrounds. One day most of these counting arguments will be obsolete, since they don't properly deal with the mathematics of string backgrounds. This point was made for instance in Distler-Freed-Moore 09:

"We hope that our formulation of orientifold theory can help clarify some aspects of and prove useful to investigations in orientifold compactifications, especially in the applications to model building and the “landscape.” In particular, our work suggests the existence of topological constraints on orientifold compactifications which have not been accounted for in the existing literature on the landscape."

but this kind of careful analysis tends to be ignored these days.
 
  • #63
Urs Schreiber said:
It is certainly an open problem of fundamental physics as such, but it is not a defect of string theory.
What I meant by pathology was in the context of inference, and of the defence of why movement in theory space can (should?) be seen as a physical process when talking about naturalness. If string theory offers no solution there, as a proper inferential theory in my opinion should, then ST isn't a proper inference theory. You can still think that string theory is fine for other purposes.

Of course, no one really claimed it was an inference theory. But I see some remote links, and that is the ONLY merit I personally see in it; then again, for sure there are pieces missing. This is not just a technical issue; it's much easier to see from the conceptual side.

I posted about it here, as I found some random thoughts in that direction:
https://www.physicsforums.com/threa...old-the-thought-of-jonathan-j-heckman.923630/
(But I have something more radical in mind)
Urs Schreiber said:
Besides, there is so little known for sure about points in the landscape, that all debate about whether it is "large" or "small" might better be postponed until it is really understood. It is easy to forget how many simplifying assumptions enter the identification of string backgrounds. One day most of these counting arguments will be obsolete, since they don't properly deal with the mathematics of string backgrounds. This point was made for instance in Distler-Freed-Moore 09:

"We hope that our formulation of orientifold theory can help clarify some aspects of and prove useful to investigations in orientifold compactifications, especially in the applications to model building and the “landscape.” In particular, our work suggests the existence of topological constraints on orientifold compactifications which have not been accounted for in the existing literature on the landscape."

But this kind of careful analysis tends to be ignored these days.
It seems to me that if the inference-scheme associations to ST prove right, then there is likely a mathematical way to solve the problems without going via the reconstruction that I have in mind, but it will likely be far more technically complex than necessary, not to mention a very non-physical and conceptually awkward way, which raises the question of what to use as guidance for coming up with the needed constraints. It's like starting with a description with a massive amount of redundancy and trying to extract the real options by finding all the constraints, versus starting with the physical options and then finding what they look like in the continuum approximation.

But I cannot judge in detail whether the former way is viable. I figure people like you are the ones we need to go that route, but going that route, I figure, takes a different mindset and guidance than I have.

I try to use my intuition about physical inferences and let that guide me to the tools, instead of the other way around. String theory to me is an example of an interesting mathematical framework, but it is not really clear what it means and of what use it is for the problem at hand.

/Fredrik
 
  • #64
Fra said:
It's not fair to say this is a myth; it is rather a fairly new and unexplored idea.

It is a myth in the sense that it derives from an a priori Platonic assumption rather than being rooted in empirical evidence or a necessary theoretical consistency.

Naturalness is basically a form of Bayesian statistical reasoning. But there is no scientific source for the Bayesian distribution we are drawing from, and Bayesian reasoning is particularly weak when there is no meaningful empirical basis for your priors. Bayesian statistics exists to avoid wasting existing empirical data points, and it is outside its domain of applicability when you have no empirical data points from which to derive your priors.
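A toy sketch of that point (my own illustration, not from the post, with made-up hypothesis names and likelihoods): with an empty data set, Bayes' rule returns whatever prior we chose by hand, unchanged, so the "conclusion" is purely the prior.

```python
# Minimal Bayes' rule over a discrete hypothesis space, to show that
# with no data the posterior is identical to the hand-picked prior.

def posterior(prior, likelihoods, data):
    """prior       -- dict: hypothesis -> prior probability
    likelihoods -- dict: hypothesis -> function(datum) -> P(datum | hypothesis)
    data        -- list of observed data points"""
    unnorm = {}
    for hyp, p in prior.items():
        for datum in data:
            p *= likelihoods[hyp](datum)  # multiply in each likelihood factor
        unnorm[hyp] = p
    z = sum(unnorm.values())              # normalization constant
    return {hyp: p / z for hyp, p in unnorm.items()}

# Hypothetical likelihoods for a binary observation under two hypotheses.
likelihoods = {"natural":    lambda d: 0.9 if d else 0.1,
               "fine-tuned": lambda d: 0.2 if d else 0.8}

flat   = {"natural": 0.5,  "fine-tuned": 0.5}
biased = {"natural": 0.99, "fine-tuned": 0.01}

# No data: the posterior is literally the prior, whatever we picked.
print(posterior(flat, likelihoods, []))    # {'natural': 0.5, 'fine-tuned': 0.5}
print(posterior(biased, likelihoods, []))  # returns the biased prior unchanged

# With data, the likelihood starts to dominate, and the two choices
# of prior are pulled toward the same answer.
print(posterior(flat, likelihoods, [True, True, True]))
print(posterior(biased, likelihoods, [True, True, True]))
```

The point is just that the machinery only discriminates between hypotheses once data enters; before that, it echoes whatever distribution was assumed.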
 
  • #65
ohwilleke said:
But there is no scientific source for the Bayesian distribution we are drawing from, and Bayesian reasoning is particularly weak when there is no meaningful empirical basis for your priors. Bayesian statistics exists to avoid wasting existing empirical data points, and it is outside its domain of applicability when you have no empirical data points from which to derive your priors.
Like I said, these are unexplored ideas, but the answer to your critique here is to consider evolution of law and attach Bayesian reasoning to information-processing agents, and second, to associate these information-processing agents with matter. This is the direction I am personally trying to work in.

We should not put in priors by hand; that is no good, I agree. The prior is just the current state of evolution. It is a learning scheme, NOT a statistical approach based on a fixed probability space. The probability spaces themselves must also evolve.

The scientific empirical support for the prior lies in the interaction HISTORY of the agent, which has carefully selected the prior.

So this is MORE than simply Bayesian probability. You must put the machinery of Bayesian inference in an interacting and evolving context. Then it comes alive.
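A toy sketch of this idea (my own illustration, not Fra's actual scheme, with hypothetical hypotheses and likelihoods): no prior is put in by hand as a final assumption; each posterior becomes the prior for the next step, so the current prior encodes the agent's interaction history.

```python
# Sequential Bayesian updating: the agent's "current prior" is just
# the posterior from all of its past interactions.

def update(prior, likelihoods, datum):
    """One Bayesian step; the returned posterior is carried forward
    by the agent as its next prior."""
    unnorm = {hyp: p * likelihoods[hyp](datum) for hyp, p in prior.items()}
    z = sum(unnorm.values())
    return {hyp: p / z for hyp, p in unnorm.items()}

# Hypothetical likelihoods for a binary observation under two hypotheses.
likelihoods = {"H1": lambda d: 0.8 if d else 0.2,
               "H2": lambda d: 0.3 if d else 0.7}

# Two agents that start from very different, arbitrary priors...
agent_a = {"H1": 0.9, "H2": 0.1}
agent_b = {"H1": 0.1, "H2": 0.9}

# ...exposed to the same interaction history.
history = [True] * 10 + [False] + [True] * 2
for datum in history:
    agent_a = update(agent_a, likelihoods, datum)
    agent_b = update(agent_b, likelihoods, datum)

# After enough shared history, both agents' "current priors" nearly
# agree: the history, not the initial hand-picked choice, dominates.
print(agent_a)  # H1 close to 1
print(agent_b)  # H1 close to 1 as well
```

This is of course still ordinary Bayesian machinery on a fixed hypothesis space; letting the probability spaces themselves evolve, as the post proposes, would go beyond this sketch.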

/Fredrik
 
  • #67
martinbn said:
Are you saying that there are solutions of general relativity, infinitely many, that are not in any way described, even just in principle, by string theory, since string theory has only finitely many solutions!?

Indeed, perturbative string theory is more constrained than the effective field theories that it reduces to at low energy. The popular discussion of the landscape always gets this backwards: faced with the landscape of consistent string theory vacua being possibly very large (which is not completely clear yet, since it is so little understood in mathematical detail), people forget that the usual solution spaces of plain QFT are vaster still. Vafa tried to drive this point home by speaking of the vast Swampland surrounding the landscape.

It's like the children's game of big numbers: one of the 4-year-olds cries out "one thousand!" and the others are silenced and awed by the immensity of this number, ignorant as they are. In reality it's the other way around: in physics, spaces of solutions generically have the infinite cardinality of the continuum, and something special and noteworthy has to happen to make that become finite.
 
Last edited:
  • #68
Your ncatlab.org site is awesome. I can also see your and your co-authors' passion for rigour, and your wish to bring order into the field of theoretical physics and to explain. There is no question that this is a very important task. Our views also operate at different levels; I am not referring to mathematical inconsistencies at this point.
[URL='https://www.physicsforums.com/insights/author/urs-schreiber/']Urs Schreiber[/URL] said:
It is certainly an open problem of fundamental physics as such, but it is not a defect of string theory.
...
Besides, there is so little known for sure about points in the landscape, that all debate about whether it is "large" or "small" might better be postponed until it is really understood. It is easy to forget how many simplifying assumptions enter the identification of string backgrounds. One day most of these counting arguments will be obsolete, since they don't properly deal with the mathematics of string backgrounds. This point was made for instance in Distler-Freed-Moore 09:

One core point in there seems to be
Distler-Freed-Moore 09 said:
...do not actually point to a problem in string model building that would be worse than in model building in other theories.
This might be so, but it is a weak defence against a justified critique of such a massive investment. A program that with open eyes accepts and builds on a questionable foundation has made the choice that the foundation is good enough.

The critique is in place because string theory, unlike classical GR, is not just a normal theory; it's a toolbox for constructing various theories, but the mapping to reality is vague not only experimentally but also conceptually. I.e., it has traits of an inference system in several ways: first because it's a theory of theories, second because it loosely provides a microstructure (strings) whose interactions ENCODE its interaction with the environment, similar to rationally acting information-processing players in a game of expectations. I don't expect that you connect with this, but what I am effectively saying is that there is a possible WAY to see/understand this that is NOT in terms of "geometry" but in terms of inferences. And while the two might well be mathematically isomorphic, the intuition is much better in my brain in the inference abstraction, and my gut feeling is that this is also the right "natural" perspective.

So the critique is also a bit of "tough love", because I think we rightfully can have, and should have, much higher expectations of a theory like this. I personally hope string theorists turn their attention to the direction hinted at by Heckman and come to understand the ST toolbox in a new light.

/Fredrik
 
