Will LQG explain the constants?

In summary, the thread discusses the values of some fundamental numerical constants in the standard models of matter and cosmology and why they are favorable to life. It explores cosmic natural selection (CNS) and LQG as potential explanations for these constants. CNS proposes that the formation of a black hole reproduces the universe, so the constants are selected for their ability to promote black hole production and thus reproductive success; LQG supplies the theoretical tools to probe whether a black hole singularity can in fact connect to a new big bang. This is a controversial and, for some, unsettling idea, since it may reveal violations or limitations of accepted physical laws. Others argue that as long as such theories predict and explain what is observed in our universe, they are valid, falsifiable alternatives to the anthropic principle.
  • #1
marcus
the values of some of the fundamental numerical constants occurring in the standard models of matter and cosmology are considered to be favorable to life, and people wonder why they are what they are

for example the fine-structure constant alpha: why is it around 1/137?

for example the cosmological constant Lambda: why is it around 1E-120?

(the currently estimated value in rationalized Planck units is 0.85E-120)
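That figure can be checked with a rough back-of-the-envelope computation. The sketch below is my own; the cosmological parameter values (H0, Omega_Lambda) are round numbers of roughly that era, not taken from the thread, and the factor of 8*pi reflects the "rationalized" convention in which 8*pi*G = 1:

```python
import math

# rough cosmological parameters (my own round-number assumptions)
H0 = 71e3 / 3.086e22        # Hubble constant: 71 km/s/Mpc converted to 1/s
Omega_Lambda = 0.73         # dark-energy fraction of the critical density
c, hbar, G = 2.998e8, 1.055e-34, 6.674e-11

Lambda_SI = 3 * Omega_Lambda * (H0 / c) ** 2      # Lambda in 1/m^2
lP_sq = hbar * G / c ** 3                         # ordinary Planck length squared
Lambda_planck = Lambda_SI * 8 * math.pi * lP_sq   # rationalized Planck units

print(Lambda_planck)   # on the order of 1e-120
```

With these inputs the result comes out close to the 0.85E-120 quoted above; without the 8*pi (ordinary Planck units) it would be a couple of orders of magnitude smaller, which is why the convention matters.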

A WAY HAS BEEN SUGGESTED by which LQG could explain some of these basic constants without appealing to the existence of life (which is unfalsifiable and thus not a scientific theory: it indiscriminately "predicts" anything we could possibly observe and any experimental outcome we could measure).

The model for explaining basic constants, called CNS by Lee Smolin, is FALSIFIABLE, in the sense that it predicts quantitative outcomes of observations which conceivably could go against it and prove it wrong. So it has some definite predictive value----it UNpredicts certain outcomes of future experiment.

What Motl has called "The Anthropic Lack of Principles" does not unpredict anything. It does not bet its life on the outcome of some future measurement and risk empirical disproof. Instead, it can accommodate any future observation. Therefore by the traditional standards of empiricism it is meaningless.

However the CNS principle achieves similar explanatory aims and is falsifiable. CNS has predictive content in the sense that it unpredicts certain things, which might be observed next year or tomorrow or whenever.

CNS stands for "cosmic natural selection" but it could also be thought of as signifying "constants natural selection". It proposes an evolutionary mechanism which selects for "good" values of the basic constants.

In this case "good" means favoring REPRODUCTIVE SUCCESS analogous to what drives darwinian natural selection processes in other contexts.
 
Last edited:
  • #2
the CNS principle is based on the premise that the formation of a black hole reproduces the universe
(with some variation of physical constants analogous to genetic mutation)

so values of the basic constants (like alpha and Lambda) which promote the plentiful production of black holes contribute to the REPRODUCTIVE FITNESS of the universe and to its reproductive success.

this leads to experimental checks and tests of the same sort as one may apply to biological natural selection.

in particular Smolin derives from the CNS principle an upper bound of around 3 solar masses on the mass of a neutron star. (I don't recollect the exact number; maybe it is 2.5 solar masses.)

If tomorrow some astronomers observe a pulsar (a type of neutron star) which belongs to a binary pair allowing reliable determination of the mass, and the mass of this pulsar turns out to exceed Smolin's CNS upper bound then CNS is shot down. It is able to predict something that might not be observed.
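The falsification test is nothing more than a comparison of measured pulsar masses against the bound. A minimal sketch, in which the pulsar names and masses are illustrative stand-ins (not data from the thread) and the bound is the nominal 2.5 solar masses mentioned above:

```python
# Smolin's CNS upper bound on neutron star mass, in solar masses
# (the thread quotes it as roughly 2.5-3; take the lower nominal value)
CNS_UPPER_BOUND = 2.5

# hypothetical binary-pulsar mass measurements (illustrative values only)
measured_pulsars = {
    "PSR A (hypothetical)": 1.44,
    "PSR B (hypothetical)": 1.34,
    "PSR C (hypothetical)": 2.01,
}

# CNS survives only as long as no reliably measured mass exceeds the bound
violations = {name: m for name, m in measured_pulsars.items()
              if m > CNS_UPPER_BOUND}
cns_falsified = bool(violations)
print(cns_falsified)   # False: no measurement exceeds the bound
```

A single entry above the bound would flip `cns_falsified` to True, which is exactly the sense in which CNS "bets its life" on future measurements.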

there is more to it, and I am oversimplifying, but that is the main thrust
there is more in
"Scientific Alternatives to the Anthropic Principle"
http://arxiv.org/hep-th/0407213
 
  • #3
marcus said:
the CNS principle is based on the premise that the formation of a black hole reproduces the universe
(with some variation of physical constants analogous to genetic mutation)

... that is the main thrust, there is more in
"Scientific Alternatives to the Anthropic Principle"
http://arxiv.org/hep-th/0407213

the reason having alpha be around 1/137 contributes to reproduction is that it allows a large number of stable chemical elements to exist (a big periodic table with a rich chemistry) without which

clouds of dust and gas which are ready to condense down to form stars would be less able to radiate away surplus heat. any kind of condensation requires dumping heat---usually accomplished by radiating it off into space

in other words, the existence of carbon and oxygen speeds up the condensation of stars

if the only elements were hydrogen and helium then maybe there could be some stars, but they would not form so readily

and fewer stars eventually means fewer black holes, so fewer offspring

so if a cosmos has a bad alpha that only lets it have two chemical elements (hydrogen and helium), then it won't have as many children to pass its bad alpha along to.

but if a cosmos has a good alpha then (other things being equal) it will have a lot of stars and black holes and it will have a lot of children to inherit its good alpha.
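The selection loop marcus describes can be caricatured in a few lines of Python. Everything here is invented for illustration: the fitness function (black holes produced as a function of alpha), the mutation size, and the population cap have no physical basis. The point is only that reproduction through black holes, plus small mutations, drives a population of universes toward the black-hole-maximizing alpha:

```python
import random
random.seed(42)

ALPHA_OPT = 1 / 137.0   # assumed BH-optimal alpha (illustrative, not derived)

def black_holes(alpha):
    # toy fitness: black hole yield peaks at ALPHA_OPT and falls to zero
    # once alpha deviates by more than 20 percent
    return max(0, int(100 * (1 - 5 * abs(alpha - ALPHA_OPT) / ALPHA_OPT)))

# start with universes whose alpha is scattered widely
universes = [random.uniform(0.001, 0.02) for _ in range(200)]

for generation in range(40):
    children = []
    for alpha in universes:
        for _ in range(black_holes(alpha)):
            # each black hole spawns a child universe whose constants
            # mutate slightly, as genes do
            children.append(alpha * (1 + random.gauss(0, 0.01)))
    # cap the population so the toy model stays manageable
    universes = random.sample(children, 200) if len(children) > 200 else children

mean_alpha = sum(universes) / len(universes)
print(mean_alpha)   # settles near 1/137 after selection
```

After a few dozen generations the population mean sits close to the fitness peak, which is the whole content of the "good alpha has more children" argument.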
 
  • #4
the role of LQG here is that it permits a theoretical probing of the major singularities of Gen Rel, to see if it is theoretically possible that
the formation of a black hole reproduces the universe

you can read more about this in Smolin's essay
"Scientific Alternatives to the Anthropic Principle"

Basically the Bang and Hole singularities have to be cured by an improved Gen Rel (a quantized version) which is what LQG tries to be. and then conditions where the singularities used to be have to be studied, to see if hole leads to bang.

this is an unfamiliar and even (for some people I imagine) unsettling prospect: it may reveal violations of accepted physical laws, or limitations on their applicability. To put it simply, we have to take our physics through what was once a singularity. Physical constants may not remain perfectly constant in those abnormal circumstances, and the laws of thermodynamics may balk, and all that.

so the business of hole leading to bang needs to be approached gingerly with a great deal of caution on the part of theorists.
 
  • #5
marcus said:


this is an unfamiliar and even (for some people I imagine) scary prospect, it may reveal violations, or limitations of applicability, of accepted physical laws that work in more usual context. basically we have to take our physics thru what was until recently a singularity. physical constants may not remain perfectly constant in those abnormal circumstances, and the laws of thermodynamics may balk, and all that.

This does not worry me; it may have been a unique event, and as long as it predicts what is in our universe, what is the problem?
It goes against the grain to think that a theory like this is correct, but why not? What we have now is a mishmash of splintered theories that go nowhere, so any explanation is welcome.
 
  • #6
wolram said:
... as long as
it predicts what is in our universe, what is the problem?
...

I should qualify what I said, and make clear my reservations, so that no one expects too much of the CNS principle.

1. it may be proven wrong. even as we speak some astronomer may be measuring the mass of a neutron star which is more massive than Smolin's limit. or someone may be observing something else which contradicts the notion that cosmic evolution optimizes the constants for BH production.

2. it needs a lot more work. there are several dozen basic constants (in standard model physics and cosmology). Which of them bear on black hole formation? If they were optimized for BH production, then what would their values likely be? Are the observed values close to those optimal ones, or not? Not all the theoretical work has been done.

3. it involves judging what are reasonable odds. Like we can believe that gazelles were selected for speed, because you examine details of the animal and they look optimized, within a few percent of what they ought to be if the aim was to make the animal fast. If you come across something that doesn't fit, well, maybe it just hasn't evolved well YET. But since most stuff looks optimized at least within reasonable tolerances, the idea that it evolved for speed is PLAUSIBLE.

To sum it up, the CNS could be shot down observationally today or tomorrow. Or LQG might fail to show that hole and bang conditions are theoretically compatible, leaving doubt as to how hole could CONNECT to bang.

Or when they look carefully at all the physics constants they might find one that is NOT evolved for hole production. As if when you examine gazelles closely you were to discover that each animal has a 5 pound left toenail----something obviously interfering with speed.

And then there is the business that part of the argument depends on judging how close to the ideal optimum you are going to say is optimized. Suppose one of our constants turns out to be within 5 percent of perfect (for making holes) but not to be within 1 percent. What do you say? Do you say that it was on its way to evolving towards the optimum but hadn't yet been through ENOUGH ITERATIONS?

So I am not discussing some neatly tied, trim package. the idea is still in the works. (and it risks falsification, like any good idea should). But I will say frankly that NONE OF THESE RESERVATIONS that I've stated worries me in the slightest.
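The "how close is close enough" judgment in point 3 can be made concrete with a tolerance check. In this sketch, both the observed values and the hypothesized BH-optimal values are placeholders I made up for illustration; only the classification logic is the point:

```python
# fractional deviation between an observed constant and a hypothesized
# BH-optimal value (all numbers below are placeholders, not real estimates)
candidates = {
    #  name:    (observed,    hypothesized optimum)
    "alpha":    (7.297e-3,    7.20e-3),
    "Lambda":   (0.85e-120,   0.84e-120),
}

results = {}
for name, (observed, optimum) in candidates.items():
    deviation = abs(observed - optimum) / optimum
    if deviation < 0.01:
        verdict = "looks optimized"
    elif deviation < 0.05:
        verdict = "plausibly still evolving toward the optimum"
    else:
        verdict = "a 5-pound left toenail: evidence against CNS"
    results[name] = (deviation, verdict)
    print(f"{name}: {deviation:.1%} off -> {verdict}")
```

The awkward middle band (between 1 and 5 percent here, chosen arbitrarily) is exactly where the "not enough iterations yet" escape hatch lives, which is why this part of the argument calls for judgment rather than a sharp test.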
 
  • #7
LQG seems to be a possible melting pot for trial and error: out of the multitude of events, one came about that allowed our existence, but that is not science, it's more philosophy.
LQG seems to be the only candidate to date that may unlock the DNA of the universe. I doubt it will be the final answer, but I will bet that something similar takes its place.
 
  • #8
outside, our young apple tree is covered with white blossoms
and the bees are taking care of scrambling its genes, which was the idea
of evolving flowers in the first place.
While outside enjoying the sun for a while, I thought of something by William Yeats the poet:

IN GRATITUDE FOR UNKNOWN INSTRUCTORS

what they undertook to do
they brought to pass.
all things hang like a drop of dew
upon a blade of grass.
 
  • #9
Marcus, I live but a few miles from where the bard once lived. Local legend says that he was a drunkard, but who cares? His words are forever.
 
  • #10
I tend to doubt we can derive all the fundamental physical constants from first principles. We may, however, be able to explain the relationships - i.e., dimensionless numbers like alpha. I was doing some casual surfing on this and came up with some pretty weird stuff. While checking NIST I noticed they show the value of alpha being derived from the 'Wales constant' instead of the expected [by me] e^2/hc. I did some checking on this Wales constant thing, still expecting to find e^2/hc at the end of the trail. Instead, I ended up here:
http://www.btinternet.com/~ugah174/
 
  • #11
wolram said:
Marcus i live but a few miles from where the bard once lived,..

whoa! Warwickshire (which you give as your location) is where Stratford-on-Avon is! Not being a whiz at geography I had not made the connection. So, if I understand you, your home is not far from where Shakespeare lived, when he was not busy in London.

I believe that Yeats also spent some time in Warwickshire, but that is a relatively unimportant association compared with the Stratford-on-Avon one. The local people would hardly have noticed Yeats, and tourists would not want to be told about him. But he was as good a rhymer in his own way.
 
  • #12
I wanted to expand a bit on neutron stars and CNS. Smolin is a little vague on what might constitute a deal-busting mass. In the paper he states
"Sufficiently high is certainly 2.5 M⊙, although if one is completely confident of Bethe and Brown's upper limit of 1.5 solar masses, any value higher than this would be troubling."

More to the point, I am intrigued by the notion that physical constants are free to choose arbitrary values when emerging from whatever it is they emerge from. Could it be there is only one truly arbitrary constant and the others are forced to fall in line once the 'master' constant has chosen a value? Or could it be the universe develops chaotically, where the fundamental constants are initially arbitrary but self-tuning until a stable configuration evolves? What I'm visualizing here is the universe not freezing its adjustable parameters until finding a stable combination that allows it to lose its negentropy by creating atoms, stars, etc.
 
  • #13
Winter approaches

For some of us, this is more seasonal:

When all aloud the wind doth blow,
And coughing drowns the parson's saw,
And birds sit brooding in the snow,
And Marian's nose looks red and raw,
When roasted crabs hiss in the bowl,
Then nightly sings the staring owl,
Tu-who;
Tu-whit, tu-who: a merry note,
While greasy Joan doth keel the pot.

William Shakespeare, Love's Labor's Lost
 
  • #14
Kea said:
For some of us, this is more seasonal:
...
Tu-whit, tu-who: a merry note,
While greasy Joan doth keel the pot.

that's right! Keas and Kiwis get winter at the Other time of year!
our sympathies.

I was curious about an apparently unfavorable reaction to the main idea here, which you posted in another thread:

"These prejudices are so invasive that they pervade even the best work in LQG. Picture a Black-Holes-Generate-Baby-Universes scenario...always described from the viewpoint of a metametaobserver, which the theory says cannot possibly exist!"

The paper where Smolin discussed this idea did not use any LQG formalism (Smolin does write papers which are not LQG research! :smile: ). "Scientific Alternatives to the AP" was what i would call a philosophy of science paper. It dealt with the logical possibility of a connection between hole and bang in a general way.

Your post contains the suggestion that there are actual LQG papers qualifying as some of "the best work in LQG" dealing with the hole-to-bang connection. I have not seen any LQG papers that do this, exemplary or otherwise. If you actually know some please give me links to them! I would love to read them.

As far as I know the bang singularity was only removed in 2001 and replaced by a bounce from a prior contraction (but Bojowald did not identify this contraction as coming from a black hole, he simply extended the model somewhat back in time to before the classical singularity without identifying what was there).

And even if Bojowald HAD mentioned that the prior contraction phase that he discovered looked technically similar to a black hole collapse, I don't suppose that would have made him a "metameta" (not sure what you mean by that), nor would I find it inconsistent with the LQG research framework. If two regimes are found to be mathematically similar one is permitted to point this out.

All the same, he did NOT speculate as to the nature of the prior contraction phase, and one reason is that, far as I know, the hole singularity has technically still not been removed! One still has to find out WHAT THE BLACK HOLE COLLAPSE LOOKS LIKE MATHEMATICALLY, before one can compare that with the contraction prior to the classical singularity in cosmology. Some preliminary results by Ashtekar and Bojowald have appeared, but nothing like a hole to bang "scenario" is discussed there.

In seeming contradiction, your post suggests that there are several exemplary LQG papers which "always" describe the hole to bang "scenario" in a certain fashion. I regret to say that this is disconnected from the reality of the LQG literature that I know. And I have been watching the LQG literature rather closely for a couple of years. So what you hint at in your post surprises me and excites my curiosity.

I will fetch you a non-LQG paper (Husain, Winkler) exploring the hole singularity and finding a bounce. As I recall, Husain and Winkler use the ADM variables, as in the Wheeler-DeWitt formalism, no spin networks, no Ashtekar variables. They use their own extension of pre-1986, pre-LQG quantum gravity. Husain and Winkler's methods are also used by Modesto, who derives a bounce in the hole by means he says were "suggested" by LQG---this is related to LQG but not very representative. A second Ashtekar and Bojowald paper (which would be more representative) is said to be in preparation and I hope to see it before very long.

And I shall hope that in return you will provide me with some links to already posted LQG papers which treat some case of a black hole and derive a bounce (even if they don't explicitly say that the bounce is part of a "scenario").

This would be essential before we can sensibly talk about LQG papers "always describing" a hole-to-bang "scenario" from whatever viewpoint.
:smile:

And by the way, i am rather confident that in the future LQG WILL finish removing the hole singularity, and will find a mathematical resemblance between the prior-to-bang contraction and the contraction down the hole. I see clear signs that LQG will study this possible connection, to see if it works at a technical level.

I have no way of telling whether technical compatibility will be found when they examine the hole and bang ex-singularities. If a theoretical JOINT is made then the prior contraction is PART OF OUR UNIVERSE and we can look for various observable signatures that might serve to check the theory.
If LQG does not find compatibility then all bets are off although I suppose some other quantum gravity theory might.
 
  • #15
Let's get the links for the Husain-Winkler and Modesto papers about black hole bounce. Even though Husain and Winkler are not using representative LQG methods, they are still interesting.


http://arxiv.org/gr-qc/0410125
Quantum resolution of black hole singularities
Viqar Husain, Oliver Winkler
4 pages

"We study the classical and quantum theory of spherically symmetric spacetimes with scalar field coupling in general relativity. We utilise the canonical formalism of geometrodynamics adapted to the Painleve-Gullstrand coordinates, and present a non-Schrödinger quantisation of the resulting field theory. We give an explicit construction of operators that capture curvature properties of the spacetime and use these to show that the black hole curvature singularity is avoided in the quantum theory."

As I say, they use ADM variables (the metric on the 3-manifold, not the connection), which is associated with "Geometrodynamics", the circa-1970 Wheeler-DeWitt approach, and not typical of LQG.

My impression was that Leonardo Modesto was not doing regular LQG either, although the TITLE of his second paper says LQG. His first paper's abstract says that his approach is "suggested" by LQG but actually follows Husain and Winkler's ADM-variables formulation (no Ashtekar variables, no spin networks!).

http://arxiv.org/abs/gr-qc/0407097
Disappearance of Black Hole Singularity in Quantum Gravity
Leonardo Modesto
9 pages
Phys.Rev. D70 (2004) 124009

"We apply techniques recently introduced in quantum cosmology to the Schwarzschild metric inside the horizon and near the black hole singularity at r = 0. In particular, we use the quantization introduced by Husain and Winkler, which is suggested by Loop Quantum Gravity and is based on an alternative to the Schrodinger representation introduced by Halvorson. Using this quantization procedure, we show that the black hole singularity disappears and spacetime can be dynamically extended beyond the classical singularity."


http://arxiv.org/gr-qc/0411032
The Kantowski-Sachs Space-Time in Loop Quantum Gravity
Leonardo Modesto

"We extend the ideas introduced in the previous work to a more general space-time. In particular we consider the Kantowski-Sachs space time with space section with topology R x S^2. In this way we want to study a general space time that we think to be the space time inside the horizon of a black hole. In this case the phase space is four dimensional and we simply apply the quantization procedure suggested by Loop Quantum Gravity and based on an alternative to the Schroedinger representation introduced by H. Halvorson. Through this quantization procedure we show that the inverse of the volume density is upper bounded and so space time is singularity free. Also in this case we can extend dynamically space time beyond the classical singularity."

though the title says LQG, the Halvorson approach (that Husain and Winkler say they are using) is not based on Ashtekar-type connection-variables and does not use spin networks. So this may be a cousin of LQG but it is somewhat on the margin: not typical or representative work.

yes, I checked this paper as well. it quantizes the metric, or rather two parameters that parametrize the black hole metric. that is not what I understand LQG to be about, although some part of it may have been suggested by LQG as the author says. The approach developed by Husain Winkler, and in these papers by Modesto, may be of interest on its own, however!
 
  • #16
marcus said:
...The paper where Smolin discussed this idea did not use any LQG formalism...

Your post contains the suggestion that there are actual LQG papers qualifying as some of "the best work in LQG" dealing with the hole-to-bang connection. I have not seen any LQG papers that do this, exemplary or otherwise. If you actually know some please give me links to them! I would love to read them.

...I don't suppose that would have made him a "metameta" (not sure what you mean by that)...

All the same, he did NOT speculate as to the nature of the prior contraction phase, and one reason is that, far as I know, the hole singularity has technically still not been removed! One still has to find out WHAT THE BLACK HOLE COLLAPSE LOOKS LIKE MATHEMATICALLY...

Hi Marcus

Sorry! In the context (other thread) in which I made that remark I used the term LQG very loosely to refer to anything in mainstream QG outside of Strings and the Third Road and its relatives, including naive spin foam models. This includes the more philosophical papers of Smolin, even if, as you say, they are not technically within the LQG framework.

We need to sort out some terminology once and for all. Did you do that on PF somewhere?

As for references to LQG work on BH-BB: as far as I know they don't exist, as you say. By metameta I was referring to the objective observer of the multiverse. This is a philosophical issue rather than a mathematical one, at this point, although I have not been discussing Category Theory for some time without some physical motivation.

Cheers
Kea
:-p
 
  • #17
Marcus, the connection of these papers to echt LQG is closer than it appears from the summaries. From the Husain & Winkler singularity resolution paper:

It is widely believed that a quantum theory of gravity will give insights into the question of what becomes of classical curvature singularities. This is based largely on intuition from uncertainty principle and fundamental length scale arguments in regions of large spacetime curvatures. What is required to address the problem quantitatively is quantization of model systems that contain classical metrics with curvature singularities. Such models are usually symmetry reductions of general relativity or other generally covariant metric theories. Within a model an obvious approach is to look at classical observables such as curvature scalars, and see if they can be represented as operators on a suitable Hilbert space. Their spectra and quantum dynamics may give an indication of what becomes of the classical singularity.

This question has been studied using models derived from symmetry reductions of general relativity since the late 1960's [1, 2, 3, 4]. All of this work used the Arnowitt-Deser-Misner (ADM) (metric variable) Hamiltonian formulation of general relativity ("geometrodynamics") as the classical starting point, and the Schrödinger representation as the quantum starting point for developing a quantum gravity model. The results obtained from various mini- and midi-superspace models were largely inconclusive. Some indicated singularity avoidance, others did not, but no insights emerged as general and definitive in the sense of transcending the model studied.

After the development of the Ashtekar (connection variable) Hamiltonian formulation ("connection dynamics"), many of the questions studied in the ADM formulation were revisited, including the general canonical quantum gravity program (for reviews see [5, 6]). The different classical variables led to the development of a non-Schrödinger representation program based on holonomy variables ("loop quantum gravity"). Recently results from this programme were applied by Bojowald [7, 8] to the old question of quantizing mini-superspace models, with a view to studying what happens to classical curvature singularities upon quantization. This application has produced some interesting results: for Friedman-Robertson-Walker (FRW) mini-superspace models the Hamiltonian constraint acts like a difference operator on the space of states, and there is an upper bound on the spectrum of the inverse scale factor operator. Taken together, these results lead to the conclusion that the big bang singularity is resolved in the loop quantum gravity approach [9].

A number of questions may be asked at this stage concerning classical variables, quantization procedures, and singularity resolution: What criteria constitute singularity avoidance? Is the singularity avoidance conclusion from the loop quantum gravity programme a result of both the classical starting point and the choice of representation? Would a non-Schrödinger representation quantization in the geometrodynamical ADM variables give the same results?

Motivated by these questions, we study a new quantization of flat FRW cosmology (the model in which the loop quantum gravity results were first obtained [7, 8]). Our classical starting point is the geometrodynamical Hamiltonian formulation, which we quantize via a non-Schrödinger representation motivated by holonomy-like variables. We obtain results qualitatively similar to those obtained in loop quantum cosmology: the Hamiltonian constraint acts like a difference operator on the space of states, and the spectrum of the inverse scale factor has an upper bound.
Thus at every stage they are comparing their results to Bojowald's, and they are motivated by his success. Notice the reference to "holonomy-like variables". This is along the lines of doing the same thing with a different technique, which has a long and honorable history in science. It could be that their non-Schrodinger representation will become a valued tool in LQG research.
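The "upper bound on the spectrum of the inverse scale factor" both groups find can be illustrated numerically. The snippet below is my own cartoon of the Thiemann-style trick of rewriting 1/a through derivatives of sqrt(a) and then discretizing, with the step `delta` standing in for the quantum of geometry; it is not the actual LQC or Husain-Winkler operator, just a toy showing how discretization bounds a classically divergent quantity:

```python
import math

def inv_a_classical(a):
    # classical inverse scale factor: diverges as a -> 0
    return 1.0 / a

def inv_a_quantum(a, delta=1.0):
    # classically, 1/a = (2 d(sqrt a)/da)^2; replacing the derivative
    # by a finite difference of step delta keeps the result finite
    diff = (math.sqrt(a + delta) - math.sqrt(abs(a - delta))) / delta
    return diff * diff

# at a = 0 the regularized expression vanishes instead of blowing up
print(inv_a_quantum(0.0))     # 0.0

# far above the discreteness scale it agrees with the classical 1/a
print(inv_a_quantum(100.0))   # close to 0.01 = 1/100

# and it is bounded everywhere, peaking near a ~ delta
peak = max(inv_a_quantum(0.05 * n) for n in range(1, 400))
print(peak)                   # finite, of order 2
```

So the singular classical quantity acquires a finite maximum at the discreteness scale while reproducing the classical behavior for large a, which is the qualitative shape of the bounded inverse-scale-factor spectrum described in the quoted passage.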
 
  • #18
These prejudices are so invasive that they pervade even the best work in LQG.
Picture a Black-Holes-Generate-Baby-Universes scenario...always described from the viewpoint of a metametaobserver, which the theory says cannot possibly exist!

Kea said:
As for references to LQG work on BH-BB: as far as I know they don't exist, as you say.

Hi Kea, I think that is right, (there is no LQG work on the "hole-to-bang" connection) and it is a very interesting point. I am wondering why there is not any such work! It is a striking absence. Maybe I can learn something from it.

In light of that I was wondering what it is that is so pervasive throughout LQG that it "pervades even the best work in LQG." Because your post indicates that it has something to do with "Black-Holes-Generate-Baby-Universes scenario...always described from the viewpoint of a metametaobserver"

However I do not know of any LQG work that deals with a multiverse! Please point to some papers which would be exemplary or representative enough to count among "the best LQG work" which has an observer of a multiverse, or any multiverse at all!

By metameta I was referring to the objective observer of the multiverse. This is a philosophical issue rather than a mathematical one, at this point,...

This is very interesting, since I don't consider that Bojowald is describing a multiverse when he removes the BB singularity and extends time back to a prior contraction.

In my view IT IS JUST THE SAME UNIVERSE so I do not perceive anything different from the normal business of extrapolating back in time except that it does not stop at the former, or classical, singularity.

So I don't see any new philosophical problem arising.

If you do please explain it, Kea, since I would be delighted to hear about it.

thx, I have to go help with supper. Back later



:smile:
 
  • #19
pervade even the best work in LQG. Picture a Black-Holes-Generate-Baby-Universes scenario...always described from the viewpoint of a metametaobserver, which the theory says cannot possibly exist!

this, and your other quote are especially interesting because they suggest to me that you think using Category theory solves some philosophical difficulty from which "even the best" or at least some representative LQG work suffers.

As far as I have seen, LQG has not led to what is usually thought of as "multiverses". Although by resolving the BB singularity and pushing back to a prior contraction, it may SEEM to some people that Bojowald has crossed some "philosophical" boundary. To me, he has simply extended the universe back a few moments in time beyond where we used to go!

However, I compare this to what you say here, Kea

By metameta I was referring to the objective observer of the multiverse. This is a philosophical issue rather than a mathematical one, at this point, although I have not been discussing Category Theory for some time without some physical motivation.

again I regret to say I have to go out shortly. But this is quite a stimulating bunch of ideas and i am looking forward to getting back to you and selfAdjoint!

cheers, :smile:
 
  • #20
if each black hole leads to a new universe, then doesn't that imply a different one each time, thus your "multi"tude of uni"verses"? or is it the same one every time? and if you rewind the big bang back to its supposed e-verse universe, where everything is mirrored and time runs backwards, does that imply a possible infinite universe somewhere else contracting to a crunch, and then bang, another one forms? or is it the same one?

so basically, is it one universe at a time with a parallel negaverse running backwards, or many universes all different, running in multiversal time which an observer will never see, as we will always be locked into our 4d one where time runs forward?

pardon my ignorance but it was kinda where I was going with the white hole thread
 
  • #21
spicerack said:
...

pardon my ignorance but it was kinda where I was going with the white hole thread

hi spicerack, let me tell you in a nutshell what this thread is about.

it is about an empirically testable way to explain why the constants of physics/astronomy favor black hole formation (if in fact they do), without falling back on unfalsifiable anthropic reasoning.

the thread is based on a paper of Smolin called
"Scientific Alternatives to the Anthropic Principle"
http://arxiv.org/hep-th/0407213

if you want to take part in the thread then I suggest that you

1. have a look at "Scientific Alternatives" because that paper is absolutely central to the discussion

2. know what some of the main physical constants are, especially alpha and Lambda, since they are so important to the very existence of everything around us

3. think a little about the constants and ask yourself what mechanism might cause alpha and Lambda to be what they are rather than something else.

their values, which we would like to be able to explain, are approximately 1/137 for alpha and 0.85E-120 for Lambda

both these values are amazingly fine-tuned to produce black holes!

if Lambda were a little different the universe would either have expanded so rapidly that galaxies and stars would never have a chance to condense, or else would have contracted in a crunch before any black holes had a chance to be born! Lambda is amazingly right to produce lots of black holes.
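For the record, the 0.85E-120 figure can be roughly reproduced from WMAP-era cosmological parameters. A sketch, with the caveat that the inputs (H0 around 71 km/s/Mpc, Omega_Lambda around 0.73) and my reading of "rationalized Planck units" as 8*pi*Lambda*l_p^2 are assumptions on my part:

```python
import math

# Rough reconstruction of Lambda in Planck units. Assumptions (mine, not
# from the thread): WMAP-era inputs H0 ~ 71 km/s/Mpc, Omega_Lambda ~ 0.73,
# and "rationalized" taken to mean 8*pi*Lambda*l_p^2.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.0546e-34    # reduced Planck constant, J*s
Mpc = 3.0857e22      # megaparsec in meters

H0 = 71e3 / Mpc                      # Hubble rate, s^-1
Omega_L = 0.73                       # dark-energy fraction
Lam = 3 * Omega_L * H0**2 / c**2     # cosmological constant, m^-2

l_p2 = hbar * G / c**3               # Planck length squared, m^2
Lam_planck = 8 * math.pi * Lam * l_p2
print(f"Lambda in rationalized Planck units: {Lam_planck:.2e}")  # ~0.85e-120
```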

Also if alpha were a little different then space would be full of dilute gas unable to condense into stars, for lack of adequate means to radiate off its heat: so few black holes would get to form in that case as well.

What could have caused alpha and Lambda to be so favorable to the birth of black holes?

We are discussing one or more possible mechanisms.
 
Last edited by a moderator:
  • #22
sorry Marcus but in line with the paper am I not asking whether 2 of the mechanisms lead to the same universe or many ?
 
  • #23
spicerack said:
sorry Marcus but in line with the paper am I not asking whether 2 of the mechanisms lead to the same universe or many ?

what two mechanisms are you talking about?
 
  • #24
One mechanism is: does black hole baby-universe production lead to the same universe, as in many paths to the same place, or does each individual black hole lead to the reproduction of a different universe every time, thus giving us a multiverse?

The other mechanism is Bojowald pushing through the big bang to his backwards everted universe, where I would assume white holes take the place of black ones? So does that in turn lead to more universes, or the same one being created every time if the process is cyclic?

apologies for just firing off questions off the top of my head. next time I should try and at least read the background info rather than just follow trains of thought.

thanks Marcus for indulging me once again
 
  • #25
spicerack said:
sorry Marcus but in line with the paper am I not asking whether 2 of the mechanisms lead to the same universe or many ?

not sure i understand what two mechanisms

you may need to go to another thread if you want to talk about "multiple universes"

I don't usually visit threads where people talk about "multiple universes" cause it seems like an oxymoron, or just faddish baloney. I don't approve when scientists do it.

What LQG seems to do is extend our understanding a small amount further back in time, so that our mathematical model doesn't get stuck where it used to (at the 'big bang singularity' where the old model broke)

it is the same universe; it is just that our analysis of it extends back a few moments further, to when space was contracting. You can decide to call that, for some arbitrary reason, a "different universe", but then it is just a verbal conversation about what words mean: what does universe mean, what is same, what is different. I don't go in for semantics. For me, if the model of our universe can extrapolate back a little earlier then it is still our universe.

THE QUESTION I WOULD SUGGEST YOU FOCUS ON IS not this 'multiple universes' stuff but WHY ARE THE CONSTANTS apparently fine tuned to produce AN ABUNDANCE OF BLACK HOLES?

(and incidentally the same alpha and Lambda that provide for plenty of black holes to form ALSO, as an accidental byproduct I suppose, provide conditions favorable to our earth-type biology---that is to pond scum, mushrooms, octopuses, ostriches and other glorious creations of Father Selection and Mother Nature. but this does not matter to the main topic)
 
  • #26
spicerack said:
...The other mechanism is Bojowald pushing through the big bang to his backwards everted universe where i would assume white holes take the place of black ? So does that in turn lead to more universes or the same one being created everytime if the process is cyclic ?

Bojowald doesn't make time go backwards or turn black into white. he just does what cosmologists have done for generations which is EXTRAPOLATE BACK IN TIME. just like detectives do in mysteries. I think of it as running the model in reverse, to find out how it was at an earlier time.

You have gotten confused and thought there were two mechanisms when you have only described one: that a black hole contraction could proceed into a big bang expansion. I may have contributed to the confusion by mentioning the eversion of the volume element----forget about that, it is a comparatively unimportant detail that crops up in the math. doesn't figure in the main outlines (but might cancel out another unimportant detail and help fix a minor problem later)

BTW I do not like the expression "baby" because it suggests the offspring is necessarily smaller. Also that it should be imagined as a different individual rather than just an extension.

Bojowald has not discussed this kind of speculative ramification in any paper I've seen. he only supplies the math----technical stuff----for how a bounce, or transition from contraction to expansion, might work.
His analysis, as far as I have seen, concerns the time right around the changeover. He specializes in a brief timespan from one second before until one second after. or even briefer.

You make of it whatever speculations you will. Bojowald is known basically for removing the old theory's singularity (the point where it broke). For patching the kettle you might say. he is not noted for philosophy.

As a matter of taste, in the use of English, in answer to yr question, i would say ONE universe, not two or several. when you extrapolate back to one second before expansion began then (as a matter of taste in the use of words) it is still the same universe.

the same set of equations is describing things, the equations describe a mechanism. you can run the equations back, extrapolating back in time to what was before. the equations say it was a contraction. OK because you can extrapolate back to it with a mathematical model, I say it is the same universe. And I also say it is the same simply for the sake of not sounding like a faddish chattering nincompoop: the universe is what there is one of
 
Last edited:
  • #27
so basically are you saying that all black holes lead to the same universe, and that it is an extension of this one but in a different place? They are not babies or different universes but different parts of this same one. so is it reasonable to speculate that black holes there lead to here? BTW that sounds more like a worm hole to me

Isn't what you then claim of "extensions" rather than "babies" contradictory to what Smolin claims: that black holes bounce to form new, distinctively different universes, and that cosmic natural selection by random mutation of the constants determines which is the fittest to have life spring forth in it?

the confusion i get is from a bounce, which implies matter is dissipated and reflected back into our universe, but what is actually referred to as a bounce is everted big bangs pushing through black holes to become a new universe in another place

pardon me, but as you can see i really have no idea what I'm on about, which is why I'm not bothering to use logic and reason too much to think about what i type, just reacting using instinct and intuition

am I close though ?

marcus said:
THE QUESTION I WOULD SUGGEST YOU FOCUS ON IS not this 'multiple universes' stuff but WHY ARE THE CONSTANTS apparently fine tuned to produce AN ABUNDANCE OF BLACK HOLES?

Intelligent design ?

oh no

I didn't just say that

:wink:
 
  • #28
spicerack said:
so basically are you saying that all black holes lead to the same universe and that it is an extension of this one but in a different place. They are not babies or different universes but different parts of this same one, so is it reasonable to speculate that black holes there lead to here ? BTW that sounds more like a worm hole to me
...

Yes! I think it is just semantics what you call same and different in this case. The constants are allowed to change only very very slightly during the bounce (in Smolin CNS).

dont hold me to a precise figure but think of a tenth of a percent in alpha, or a hundredth, just to get a feel for it

People already have been endlessly writing and discussing about that much change in alpha happening over time in OUR universe. So semantically we can contemplate small gradual changes in constants while maintaining a sense of identity.

As I say, I think it is a matter of taste whether one says same or different. how one chooses to speak English, not a physics issue.

BTW Smolin has put the CNS idea up to see if empirical observation shoots it down. he seems to be taking a "wait and see" empirical attitude, not trying to convince or persuade. Chronos has some thoughts about why Smolin has put CNS on the table at this time. If you read the "Scientific Alternatives" paper you may have gotten an idea.

so is it reasonable to speculate that black holes there lead to here ?

No, actually. It would violate causality, like something coming back from the future. We aren't talking "science fiction" here! Everything inside a black hole is in the future of our region of spacetime. You shock me, spicerack!
:smile: What a notion!

BTW that sounds more like a worm hole to me
What do you mean by a worm hole? this CNS transition is not a hole in any sense that I can imagine---more the opposite. It is an extremely dense region of spacetime, and very hot (as you may have pictured the big bang).

I think the image of a hole is the opposite of correct. holes afford passage (this does not); holes are empty (this is not! it is the opposite, it is jam-packed to the max and hot to the max)

thinking "hole" is sort of like picturing the core of a star as hollow; the core of a star is not hollow, it is very dense and hot.
this region is like that only millions of times denser and hotter

if you have ever heard of a "quark star"-----like a neutron star but denser and hotter so that it has a core made of quarks that have busted out of their jeans and are no longer even arranged in separate neutrons----well this region is denser and hotter and stranger still

And BTW it may not exist. this is just a theory, that extreme contraction can lead to a bounce-----it comes out of LQG mathematics and LQG is not yet a complete theory nor has it been empirically tested.

I would say one should imagine, but withhold belief at this point.

And I make no claims about it, of course. I am just describing a mechanism that might (if it passes some tests) explain the curious fact that the constants we measure appear "fine-tuned" to produce plenty of black holes


Isn't what you then claim of "extensions" rather than "babies" contradictory to what Smolin claims,
sorry spicerack, you lose my attention when you say "claims". I do not claim. Smolin does not claim.

The following is a distortion of the model Smolin offered. I believe that in this type of discussion you are morally obligated to read the paper and paraphrase honestly:
that black holes bounce to form new distinctively different universes and that cosmic natural selection by random mutation of the constants determine which is the fittest to have life spring forth in it ?

Shame on you spicerack! :mad: You are putting your own spin on it with those modifiers.
1. the whole point is that the new set of constants is NOT distinctively different. the new set is NOT random but almost the same. the mechanism wouldn't work unless the offspring were virtually indistinguishable from the parent (as in biological evolution: let not the octopus mutate abruptly into an oak tree but each beget after its own kind with only small changes so that selection worketh properly)

2. self-optimising systems work by taking small steps

3. life is irrelevant to this. Don't you feel a bit disingenuous inserting a phrase like "the fittest to have life spring forth in it" into a discussion of somebody else's model where life plays no role?
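Points 1 and 2 can be illustrated with a toy simulation. To be clear, everything in it -- the single parameter, the shape of the fitness curve, the ~0.1% mutation size, the fixed population -- is my own invention for illustration, not Smolin's actual model. Universes whose parameter sits near the black-hole-yield peak leave more offspring, and small mutations alone carry the population toward the peak:

```python
import random

# Toy sketch of the cosmic-natural-selection idea (NOT Smolin's model):
# each "universe" carries one parameter x, its black-hole yield peaks
# at x = 0, and every black hole spawns one offspring universe whose
# x inherits the parent's value plus a tiny (~0.1%-scale) mutation.
random.seed(1)

def bh_yield(x):
    # Hypothetical fitness curve: black holes produced, peaked at x = 0.
    return max(0.0, 100.0 * (1.0 - x * x))

population = [random.uniform(-1.0, 1.0) for _ in range(50)]
for generation in range(200):
    offspring = []
    for x in population:
        n = int(bh_yield(x))  # one offspring universe per black hole
        offspring.extend(x + random.gauss(0.0, 0.001) for _ in range(n))
    population = random.sample(offspring, 50)  # keep population size fixed

mean_x = sum(population) / len(population)
print(f"mean parameter after selection: {mean_x:.3f}")  # drifts toward 0
```

The fixed-size resampling is just bookkeeping to keep the sketch small; the point is that selection plus small steps suffices, with no appeal to life anywhere in the loop.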

a bounce which implies matter is dissipated and reflected back into our universe

no! this is a different kind of bounce! what word in English would you propose we use instead? to me the process is very bounce-like and since we have inherited no better word I think it is a good one to use. but be aware that in this case what bounces does not rebound back on us but goes forward into its own future.

a fork in time is also a good verbal image, in my opinion. but basically I am just watching from the sidelines


Intelligent design ?

oh no

No, of course not. If I thought you gave any credence to that notion I'm afraid I would not bother to chat with you. :smile:
Cosmic Natural Selection is obviously not about some self-willed busybody interfering in the affairs of nature. Indeed to the extent that black hole optimization appeals to me (as more than just another cosmology model to be tested by its predictions) it is probably because it may help protect the scientific endeavor from the danger of that kind of corruption.
 
Last edited:
  • #29
In the beginig there was only gravity and radiation, if gravity is the space time,
the metric, then radiation is the only thing left that could "encode" allowable
states in this metric, maybe via the spin, tension of connections in the mico
states, or some other" memorable" condition, from then onwards the remembered
states would have to have some mixing constraints as in DNA, from this the
constants emerge. babble babble.
 
  • #30
Spicerack appears to be a cute lady. She just mentioned Intelligent Design (tho archly denying its appeal to her in the same breath). This causes me to think what a debased sacrilege the notion of Intelligent Design is. It offends one's feeling of the sacredness of nature to be told that some Divinity has chosen the value of the fine structure constant.
I assume that alpha's value (near 1/137) was arrived at naturally, by some (probably sweet and elegant) mechanism to be eventually discovered.

If someone wants to say that a divine power created the whole Works, well fine and dandy! But I shall suppose that some analysable mechanism such as natural selection created details like the electron and the gazelle.

I wish to show my reverence for nature by supposing that natural selection chose the streamlined shape of birds, and the beautifully adjusted cosmological constant Lambda.

Part of the way I revere nature is my sporadic but repeated efforts to understand how it works.

I assume that some evolutionary mechanism arrived at the periodic table of elements---by a process of gradually adjusting the parameters like alpha which determine it.

The idea that a Designer interfered in the construction of the periodic table with its 90 or 100-odd chemical elements, is basically offensive to the spirit, it desecrates something which I feel is sacred.

So basically I would say that what is wrong with Intelligent Design is that it gives a debased idea of God. It is ignoble and a bit perverted for a Divinity to get down and tweak physical parameters of the Standard Model in order, say, to make Marilyn Monroe, or eels, possible. Or to save the bacon of some contemporary theoretical monster like "String Theory" with its "Landscape", if it is possible to forestall its self-destruction.

People who promote talk of Intelligent Design are involving the Creator in petty mechanistic details, which runs counter to what I see as the main direction of Western monotheism and also, I suspect, the forms of spirituality in other high religious traditions.

And they are also eroding the standards of scientific inquiry. So they are, in my view, harming two of our highest and most precious traditions. they are managing to degrade both at once!

But this is not the topic of this thread. the main thing I want to focus on, as I said to spicerack, is how do you explain why the parameters of the Standard Models as we measure them seem so marvelously adjusted to favor the production of black holes? If indeed, as Smolin's paper suggests, they are.
 
Last edited:
  • #31
Hmm I don't see the relevance really. I mean the constants are fine-tuned for a lot of things, like the abundance of both young and old galaxies, as well as for chemistry and the stuff that makes life.

I mean I don't think we will ever get rid of all constants in a physical theory, I'd just like to have as few as possible. It would be rather nice if we had maybe 3 or 4 master constants in some equation that then output everything we observe, with all the other so-called constants just derived.

Which ones you select as fundamental or derived are of course model/philosophy dependent. Personally I don't mind having force coupling constants. Especially if there is some grand unification scheme which ends up giving you 1 master constant, and various symmetry-breaking scales to get the rest.

However things like the cosmological constant (or nonconstant) as an input do bother me! In principle we should be able to calculate it from first principles. It's precisely the sort of thing that screams for a mechanism.
 
  • #32
wolram said:
In the beginning there was only gravity and radiation, if gravity is the space time,
the metric, then radiation is the only thing left that could "encode" allowable
states in this metric, maybe via the spin, tension of connections in the micro
states, or some other" memorable" condition, from then onwards the remembered
states would have to have some mixing constraints as in DNA, from this the
constants emerge.

don't say babble babble. you have originality and vision. it is not babble (at least so far) but a concise visionary account. and rather common-sensical as such things go

I have fixed some spelling and punctuation.

people need visions, the mind does not live entirely by the scientific method alone.

However let's see if we can get back to empirical stuff. Are the constants, as measured, actually in the right ranges so as to favor the abundance of black holes?
this is a really hard question.
Smolin talks in his paper about the top quark mass. Why is that important?
why would having that mass be in a certain range help with black hole production? Maybe that is too hard a question for us.
 
  • #33
Haelfix said:
...I mean I don't think we will ever get rid of all constants in a physical theory, I'd just like to have as few as possible. It would be rather nice if we had maybe 3 or 4 master constants in some equation that then outputed everything we observe and all the other socalled constants are just derived.

Which ones you select for being fundamental or derived are of course model/philosophy dependant. Personally I don't mind having force coupling constants. Especially if there is some grand unification scheme which ends up giving you 1 master constant, and various symmetry breaking scales to get the rest.

However things like the cosmological constant (or nonconstant) as an input does bother me! In principle we should be able to calculate it from first principles.. Its precisely the sort of thing that screams for a mechanism.

I assume you have read Smolin's paper
http://arxiv.org/hep-th/0407213
which is the focus of discussion. What did you think of it?

Does the mechanism determining the cosmological constant which Smolin offers in this paper seem satisfactory to you (I agree it screams for a mechanism) or do you see flaws?
 
Last edited by a moderator:
  • #34
I honestly cannot see that any constants would favour BH production.
I see BHs as a degenerate area of space time, where the coherent
information of the metric and the original gravitating body have been
splattered around the event horizon. the power house of the BH is
informationless gravity. I think in the end it is the origin of the mass
that created the BH that may be governed by primordial disturbances
and hence constants.
 
  • #35
wolram said:
I honestly can not see that any constants would favour BH production,
.

I can wolram, but I cannot make you see.

the world's material (as I picture it) started out fairly evenly dispersed.

before it could collapse to form BH it had to condense into galaxies and stars

too large a Lambda prevents condensation
so does a bad choice of alpha (but by a more complicated argument having to do with mechanisms for radiating heat off into space)

Smolin goes into some detail about this and it makes sense to me (but not, I take it, to you)
 