Is the Faint Young Sun Problem Solved by Increased Greenhouse Gases?

In summary, global warming is considered a fact within the science community. Global surface temperatures and ocean temperatures are rising, while there is an overall melting of ice and permafrost. Sea levels are also rising and the rate of sea level rise has accelerated in the past century. The majority of heat from global warming is going into the oceans, as seen in Figure TS.15 of the IPCC report. The warming is also affecting plant and animal life, causing shifts in climate zones. While there were some discussions of global cooling in the 1970s, it was never a consensus view and there is now ample evidence that the Earth is currently experiencing a warming trend. Some individuals may deny this evidence for political reasons, but the scientific consensus remains that global warming is occurring.
  • #71
Eric McClean said:
Even i think it is a fact but , some continents like Europe are experiencing fall in temperature that they have never ever had before , global warming is reversing itself?

If you are referring to the short term, then keep in mind that global warming does not mean there will no longer be winter weather. Even a year is considered to be a brief period for the climate. Generally 10 years is the shortest period, but technically a 30 year average is needed to establish the climate for a particular region.
 
  • #72
Absence of falsifiability

The basis for belief in global warming from carbon dioxide emissions is the set of models that are referenced by the United Nations Intergovernmental Panel on Climate Change (IPCC) in its 2007 report. According to an author of this report, the climatologist Kevin Trenberth ( <http://blogs.nature.com/climatefeedback/recent_contributors/kevin_trenberth/> ), the IPCC models do not make predictions. It follows that: a) the IPCC models are not falsifiable and b) the IPCC models are not scientific, by the definition of "scientific."

One should not argue that a model built by scientists is a scientific model. To do so is to employ the logical fallacy of arguing from authority.
 
  • #73
Terry Oldberg said:
The basis for belief in global warming from carbon dioxide emissions is the set of models that are referenced by the United Nations Intergovernmental Panel on Climate Change (IPCC) in its 2007 report.

Not quite. That is part of working out some of the details; but the major basis for identifying carbon dioxide as the major factor for warming is the thermodynamics of radiation transfer in the atmosphere, which allows one to calculate the associated forcing. This is not based on climate models.

No other forcing is known that is as large as this one. This is not based on models either; it's rather a case that evidence indicates all the proposed factors simply have smaller, or negative, forcings. The major basis for this conclusion is empirical studies and measurements; not models.

According to an author of this report, the climatologist Kevin Trenberth ( <http://blogs.nature.com/climatefeedback/recent_contributors/kevin_trenberth/> ), the IPCC models do not make predictions. It follows that: a) the IPCC models are not falsifiable and b) the IPCC models are not scientific, by the definition of "scientific."

This is a link to an archive of all Kevin Trenberth's contributions to the Climate feedback blog at nature.com. This is a good set of articles to learn more about how the science works.

It is not clear what article you are referring to, but I think the most relevant may be this one: http://blogs.nature.com/climatefeedback/2007/06/predictions_of_climate.html, from June 04, 2007. Here is the start and end of that article. The whole article is good as well.
I have often seen references to predictions of future climate by the Intergovernmental Panel on Climate Change (IPCC), presumably through the IPCC assessments (the various chapters in the recently completed Working Group I Fourth Assessment report can be accessed through this listing). In fact, since the last report it is also often stated that the science is settled or done and now is the time for action.

In fact there are no predictions by IPCC at all. And there never have been. The IPCC instead proffers “what if” projections of future climate that correspond to certain emissions scenarios. There are a number of assumptions that go into these emissions scenarios. They are intended to cover a range of possible self consistent “story lines” that then provide decision makers with information about which paths might be more desirable. But they do not consider many things like the recovery of the ozone layer, for instance, or observed trends in forcing agents. There is no estimate, even probabilistically, as to the likelihood of any emissions scenario and no best guess.

[...]

So if the science is settled, then what are we planning for and adapting to? A consensus has emerged that “warming of the climate system is unequivocal” to quote the 2007 IPCC Fourth Assessment Working Group I Summary for Policy Makers (pdf) and the science is convincing that humans are the cause. Hence mitigation of the problem: stopping or slowing greenhouse gas emissions into the atmosphere is essential. The science is clear in this respect.

However, the science is not done because we do not have reliable or regional predictions of climate. But we need them. Indeed it is an imperative! So the science is just beginning. Beginning, that is, to face up to the challenge of building a climate information system that tracks the current climate and the agents of change, that initializes models and makes predictions, and that provides useful climate information on many time scales regionally and tailored to many sectoral needs.

We will adapt to climate change. The question is whether it will be planned or not? How disruptive and how much loss of life will there be because we did not adequately plan for the climate changes that are already occurring?

I recommend people look at the whole thing; and the other articles in the blog as well. Kevin Trenberth is an expert in energy balance in particular, and is a fair-minded reporter of what is known and unknown in the science of climate. In particular, he points out that one important aspect of actually giving reasonable predictions of climate -- rather than general trends and understanding of the major forcings -- is being able to model the short term variations, like the Pacific Decadal Oscillation and other such factors. The science does indeed give falsifiable conclusions about the importance of carbon dioxide and the fact that it drives a substantial part of the current measured global warming phenomenon. But there are still many unknowns before definite predictions could be given.

The IPCC reports give broad ranges of likely outcomes for different emissions scenarios, and explicitly notes the uncertainties of the details in climate patterns as the planet warms.

Cheers -- sylas
 
  • #74


Terry Oldberg said:
The basis for belief in global warming from carbon dioxide emissions is the set of models that are referenced by the United Nations Intergovernmental Panel on Climate Change (IPCC) in its 2007 report. According to an author of this report, the climatologist Kevin Trenberth ( <http://blogs.nature.com/climatefeedback/recent_contributors/kevin_trenberth/> ), the IPCC models do not make predictions. It follows that: a) the IPCC models are not falsifiable and b) the IPCC models are not scientific, by the definition of "scientific."

One should not argue that a model built by scientists is a scientific model. To do so is to employ the logical fallacy of arguing from authority.

This is all wrong. For one thing, "the basis for belief" is the fundamental physics of radiative transfer, and the "broad brush science" was laid down late in the 1800's and early in the 20th century. If you can show that CO2 is not a greenhouse gas and does not possess the chemical properties to absorb and emit infrared radiation, then you can "falsify" AGW as we know it (although "AGW" is a very ill-defined and broadly encompassing term, so we should establish what exactly we're trying to falsify). In fact, many of the more important parts of anthropogenic climate change, including stratospheric cooling, tropospheric warming, polar amplification, oceans heating slower than land, etc (some of which are unique to greenhouse perturbation, others of which happen in a warmer climate of any cause) have their basis in theoretical or very simple radiative-convective models, and are not sensitive to the various assumptions which end up producing differing results across models.

Trenberth is probably referring to the fact that models make projections, not predictions. The difference may seem trivial to a non-specialist, but these things have very different implications. It's the difference between saying "if traffic is flowing like business-as-usual today, then I will arrive at my next destination in 30 +/- 4 minutes" versus "we will have no car accidents, no unusual number of red lights, no traffic jams, etc, and therefore we will arrive at our destination in 30 +/- 4 minutes." I hope this is clear. Aside from that (note I haven't actually researched the Trenberth quote), models do indeed give falsifiable projections, although doing so requires careful consideration of the timescale involved, the model and observational uncertainties, the "noise" in your record, etc.
 
  • #75
Bill Illis said:
Anyone ever put Myhre's forcing estimates into the Stefan-Boltzmann equations?

If the total forcing increase from GHGs is 1.7 W/m2, the Stefan-Boltzmann equations predict very little temperature change from an increase this small.

Surface TempK Today = (390 W/m2/5.67E-08)^.25 = 287.98K = 15.0C

Surface TempK Pre-Ind = (388.3 W/m2/5.67E-08)^.25 = 287.66K = 14.7C

So either Myhre's estimates are not really the traditional watts/metre^2 measure we use normally or the Stefan-Boltzmann equations aren't even being used.
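As a quick numerical check, here is a minimal Python sketch of the inversion quoted above; it assumes the quoted 390 and 388.3 W/m2 fluxes and treats the surface as an ideal blackbody, which simply reproduces the quoted arithmetic rather than the full forcing-and-feedback treatment discussed in the replies.

[code]
# Minimal sketch: invert the Stefan-Boltzmann law for the two quoted fluxes.
# Assumes an ideal blackbody surface (emissivity = 1), as in the quote above.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_temperature(flux_w_m2):
    """Temperature of an ideal blackbody emitting the given flux."""
    return (flux_w_m2 / SIGMA) ** 0.25

t_today = blackbody_temperature(390.0)   # ~288.0 K
t_preind = blackbody_temperature(388.3)  # ~287.7 K
print(f"Today ~{t_today:.2f} K, pre-industrial ~{t_preind:.2f} K, "
      f"difference ~{t_today - t_preind:.2f} K")
[/code]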

Bill,

You might find this interesting in view of the recent emails disclosed and what was left out.




Temperature Rise of the Earth from Global Warming derived from the Stefan-Boltzmann Law of Physics, ignored by the U.N.

Notes on global warming, one way or the other:

The Stefan-Boltzmann Law concerns the radiation striking the Earth and other bodies. It also covers the radiation of energy back into space from the earth.

It states:
Power = (Surface area of the earth) times (the Stefan-Boltzmann constant) times (the net emissivity of the earth) times (the temperature of the body) raised to the fourth power.

So let's use this formula with and without man made global warming, GW. The real problem is to eliminate the net emissivity variable which is where GW believers fuzz up the math, the science, and say this law of physics doesn't apply. Really? It does and that can be proved.

The surface area of a sphere like the solid earth, SA, is almost constant. So is the Stefan-Boltzmann constant, SB, verified by Wien and Planck.

P1 = SA * E * SB * T1^4 without GW
P2 = SA * E *SB *T2^4 with GW from man

Just below, all the constants are combined into one number that is a new constant, K. Its value is about 1, (SA*E1*SB)/(SA*E2*SB). Let's assume the emissivity of the Earth is constant. If it wasn't, then the temperature fluctuations over time on this planet would be larger than they are, year to year or within a year. So K is really equal to one on a constant climate earth, for now.

So, dividing these two formulas yields:

P2/P1 = K * (T2/T1)^4 = 1 * (T2/T1)^4

Or for a warming of 0.1 degrees at our average temperature on Earth we get,

P2/P1 = K * [(288.1° Kelvin) / (288° Kelvin )]^4

P2/P1 is a power ratio that can be expressed as a fraction or percentage. T2/T1 can also be expressed as a percentage but it varies as the fourth power.

All we need to do first is calculate the percentage of change of K from a slight 0.1 degree of temperature rise to get a feel for the change in E per 0.1 degree Kelvin or Centigrade.

(288.1º / 288º )^4 = 1.00138 = 1.0014

That's how much 0.1 degree of GW will change the constant K. It will change by 1.0014 or go up by a whopping 0.14%




Now let's look at the other side of the formula, the power ratio. Here we have the same problem. How do you measure the radiation from the activities from man alone? Well, let's go outside the environmentalist box and run a few numbers.

Here are the two facts we need:
3.9×10^22 Joules, the estimated energy contained in the world's TOTAL fossil fuel *reserves* as of 2003.
5.5×10^24 Joules, the total energy from the Sun that strikes the face of the Earth each year. This is the value of P1 normally hitting the earth.
P2 is really P1 plus the extra (heat) power from GW, Pgw, or P2 = P1 +Pgw. However, we are interested in only the GW portion from Pgw caused by man, not the total increase from natural heating by the Sun.

Now remember, this is the total energy from all fossil fuel reserves not yet burned, but let's burn all the fossil fuels up in one year in a super duper gas guzzler engine and the coal in a gazillion new Chinese and Indian power plants.

(3.9x 10^22 Joules burned up in one year) / (5.5 x 10^ 24 Joules from the sun per year) =

(0.709 X 10^-2) = 0.00709 or 0.709%
from only man's activities as defined above.

So if we burned up all the fossil fuels remaining on the Earth in one year, what would the resulting temperature rise be? K was only changed by 0.14% from our fourth power of T calculation.

0.14% raises the temperature of the Earth by 0.1 degree. So (0.1) * (0.709) / (0.14) is about 0.5 degrees Kelvin, Centigrade, or Celsius of warming from burning all the fossil fuel reserves in and on the Earth in one year!

What a big threat. The temperature will go up 0.5 degrees after we burn up a 100 to 200 year supply of all fossil fuels in one year. So the real temperature rise from the activities of man over time will be 0.5 degrees spread out over a hundred years or 0.005 degrees per year, assuming a straight line plot of usage. If I use 200 years, it will be even lower.

In my calculations, I assumed my Earth had a constant emissivity because that is a sticky problem for environmentalists. The U.N. hates and doesn't use the S-B law and besides the true emissivity is hard to determine. Can we actually back calculate the value of the emissivity of the Earth or at least its range? Yes we can.

I assumed that E2/E1 was equal to one in my outside the box calculation.
Now E for the Earth is about 0.64. E can vary but how much does it vary because of MAN MADE GW? All we have to do is look at K = (SA*E2*SB)/(SA*E1*SB) = E2/E1.
So for the earth, E1 = 0.64. So any change in temperature has to be directly affecting E2, the new emissivity from the extra GW.

"The emissivities of terrestrial surfaces are all in the range of 0.96 to 0.99"******
"Clouds, however, which cover about half of the Earth's surface, have an average emissivity of about 0.5"******
"Taking all this properly into account results in an effective Earth emissivity of about 0.64"*****
"E ice is 0.98"
"E water is 0.67"
"E black stuff is 0.96"
"E aluminum foil is 0.09!"******
"E gold, polished foil is 0.03 (reflects infrared better than Al)"

Now the Earth is not a polished gold surface nor a perfect reflector. It is more like a mix of sand, dirt, clouds, water, ice, and the biggest green house gas of all, water vapor. So we can increase the Earth's emissivity to a totally outrageous painted black Earth to near the new absurd E value of one by using the factor 1.0 / (0.64). Applying this new directly proportional and ridiculous factor, one can derive a new value for 0.005 degrees per year which I derived from the power side of the equation to yield a new maximum increase of 0.0078 degrees per year,
[(1 * 0.005) / 0.64]. What's this ridiculous temperature rise in 100 years? 0.78 degrees. You'll see this number as 0.74 later.

So let's check this calculation with KNOWN published GW facts. In the last 100 years, the Earth has only warmed less than a degree or 0.5 degrees.

IPCC_Fourth_Assessment_Report
"Warming in the last 100 years has caused about a 0.74 °C increase in global average temperature. This is up from the 0.6 °C increase in the 100 years prior to the Third Assessment Report."
So the real UN temperature rise from a new recalculation is being used to prove GW was only in error by a puny 0.14 degrees.

Oh darn. My bonfire guzzler temperature rise number of 0.5 degrees for a hundred years is off from the real rise of global temperatures of the "new" 0.74° C rise in the new "reevaluation" in the "new" UN report. Who's collecting this data and doing these calculations?

However, if the Earth was painted almost so called black (E=0.999), then,
(100 years) * [(1 * 0.005° C) / 0.64] is 0.78 degrees. So the UN number of 0.74 appears to be real and has just proved the Earth is painted black. Their effective emissivity value for E2 must be nearer to 0.99 but none of this matches our real color of the earth, does it?

I am crushed. The infallible UN report has proved we were doomed over the last 100 years and are already dead. We were living on a black water world that absorbed all the energy since the Civil War in the USA.

So if the power ratio for man GW goes up by a factor of 0.00709 times, what is the temperature rise in a back calculation in our Stefan-Boltzmann derived formula as related to the rise or fall in E2? What is the real E2 of the earth?
The total power ratio would be 1 plus 0.00709.

1.00709/(E1/E2)(SA*SB/SA*SB) = 1.00709E2/E1 = (T2/288)^4
For E1 = 0.64 for the real earth,
1.5736 * E2 = (T2/288°)^4
The UN says the rise from man made GW is now recalculated to be 0.74 degrees. Fine.
So that's T2 = 288.74°
E2 = [(288.74/288)^4]/1.5736 = 0.642
So Mother Nature has buffered the effects of any global warming by using the biggest green house gas of all, water vapor, which can change its state of matter to do that buffering, unlike the trace gas that stays a gas, CO2. E2 really is almost constant like I initially used above.

Now let's look at the painted black Earth of the UN where E2 = 0.99
1.00709/(E1/E2)(SA*SB/SA*SB) = 1.00709E2/E1 = (T2/288)^4
For E2 = 0.99,
1.5736 * 0.99 = (T2/288)^4
1.5579^.25 * 288 = T2 = 321.8
That's a whopping rise of 33.75 degrees C or K for a UN doomsday emissivity of 0.99 for the good black earth.
So 321.8°K yields an average temperature of the Earth of 119.6 degree F. Really?

Clearly the emissivity of the Earth changed very little to 0.642 and is almost constant.
Okay. It's a 0.31% rise that the climate of Earth and man changes the Earth's emissivity to yield an incorrect rise of 0.74 degrees C, according to the UN report.
Remember I used the infallible UN's new temperature rise number, 0.74, and the factor for burning up all the fossil fuels on Earth in one year, 1.00709.

The levels of CO2 have risen 50% in the last hundred years but the emissivity has not changed much at all and neither has the Earth warmed up to the UN's 119.6 degrees F average temperature. A 50% rise in the emissivity would be E2 = 0.64 * 1.50 = 0.96. Oops! CO2 didn't make the emissivity rise to 0.96, did it?
My average temperature at my house has never ever peaked to 119.6 degrees F (E2 = 0.99).

So CO2 levels are changing wildly. What didn't change much? What was the green house gas that can buffer the effects of the Sun's output and those of man? What is the only condensible gas that is a green house gas above -30 C? What gas is present in high concentrations so that minor fluctuations will not affect the average emissivity or Earth climate much? The answer is water vapor, the biggest green house buffering gas of them all.

The burning of all the fossil fuel reserves on the Earth in one year is like throwing a tanker truck of sulfuric acid into the huge volume of the ocean and saying you raised the pH of the ocean an alarming amount after mixing it in. How about a pH change of one part per googolplex?
When things get out of control, Mother Nature either condenses water vapor with its associated removal of the heat of vaporization, forms ice from water with its removal of its heat of fusion, heats up the top layer of the ocean to volatilize water and create clouds with its added heat of vaporization, or melts ice to water with an added heat of fusion. CO2 can't do all this and is a trace gas.

Mother Nature created a beautiful water molecule feedback system to maintain her various creations of DNA macromolecules and buffer the effects of the variable output of her creation, our Sun. It's a beautiful system that follows the Stefan-Boltzmann law proven by the likes of Max Planck, Wien, and others mentioned in the 1911 Nobel Prize speech available online. Stefan and Boltzmann never knew how big and fundamental their law really was. It was Wien that hammered in the first nail to prove half the Stefan-Boltzmann Law of physics. It was Planck and Planck's Law that hammered in the last nail to prove the Stefan-Boltzmann law of physics covers a wide range of radiation that must be part of any global heating or cooling.

No wonder the UN never used the Stefan-Boltzmann Law in their report to prove man made global warming.

cheers,

stefan.
 
  • #76
stefanslaw said:
The Stefan-Boltzmann Law concerns the radiation striking the Earth and other bodies. It also covers the radiation of energy back into space from the earth.

I have explained how this law is used in [post=2497769]msg #28[/post] of the thread.

stefanslaw said:
Now let's look at the other side of the formula, the power ratio. Here we have the same problem. How do you measure the radiation from the activities from man alone? Well, let's go outside the environmentalist box and run a few numbers.

Here are the two facts we need:
3.9×10^22 Joules, the estimated energy contained in the world's TOTAL fossil fuel *reserves* as of 2003.
5.5×10^24 Joules, the total energy from the Sun that strikes the face of the Earth each year. This is the value of P1 normally hitting the earth.
P2 is really P1 plus the extra (heat) power from GW, Pgw, or P2 = P1 +Pgw. However, we are interested in only the GW portion from Pgw caused by man, not the total increase from natural heating by the Sun.

This is not relevant. Human impact on the climate is not from the energy released, but primarily by the atmospheric greenhouse effect from changes in atmospheric composition.

In my calculations, I assumed my Earth had a constant emissivity because that is a sticky problem for environmentalists. The U.N. hates and doesn't use the S-B law and besides the true emissivity is hard to determine. Can we actually back calculate the value of the emissivity of the Earth or at least it's range? Yes we can.

Note that the science reported by the IPCC does in fact use radiation laws correctly, including the Stefan-Boltzmann law. This is a law for a blackbody, which has no frequency dependence on emissivity. In general, Planck radiation laws are used. And they really are used.

I assumed that E2/E1 was equal to one in my outside the box calculation.

You also assumed the Earth radiates like a simple blackbody surface. However, the Earth has an atmosphere, which results in a difference between temperature at the surface, and temperature of thermal radiation emitted out into space. The greenhouse effect basically changes this difference. You can calculate that from radiation physics with a few details of the atmosphere; but you have to use frequency-dependent Planck radiation laws, and consider equations for radiation through a partially transparent medium -- the atmosphere.

The major greenhouse gas in the atmosphere is water vapour, but the amount of water in the atmosphere is determined largely by temperatures. This makes it a feedback. The second most important greenhouse gas is carbon dioxide. As you add CO2, this increases the difference between surface temperatures and the temperature of radiation into space... effectively warming the surface. This in turn increases the capacity of the atmosphere to hold water, which gives an amplification with a further greenhouse effect. This is called a feedback.
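A toy illustration of that gap between the surface temperature and the effective emission temperature is the standard single-layer grey-atmosphere exercise; this is a textbook sketch, not the frequency-dependent radiative transfer described above, and the 1366 W/m2 solar constant and 0.30 albedo are round assumed values.

[code]
# Toy single-layer grey-atmosphere model (a textbook exercise, not the IPCC's
# line-by-line radiative transfer): shows why the surface sits warmer than the
# effective emission temperature seen from space.
SIGMA = 5.67e-8   # W m^-2 K^-4
S0 = 1366.0       # solar constant, W/m2 (assumed round value)
ALBEDO = 0.30     # planetary albedo (assumed round value)

absorbed = S0 * (1 - ALBEDO) / 4.0        # ~239 W/m2 absorbed per unit area
t_emission = (absorbed / SIGMA) ** 0.25   # ~255 K radiated to space
# A single fully absorbing layer re-emits half of what it absorbs back down,
# so the surface must emit twice the outgoing flux to stay in balance:
t_surface = (2 * absorbed / SIGMA) ** 0.25  # ~303 K (the toy overshoots ~288 K)
print(f"Emission temperature ~{t_emission:.0f} K, "
      f"one-layer surface temperature ~{t_surface:.0f} K")
[/code]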

Cheers -- sylas
 
  • #77
stefanslaw said:
However, if the Earth was painted almost so called black (E=0.999), then,
(100 years) * [(1 * 0.005° C) / 0.64] is 0.78 degrees. So the UN number of 0.74 appears to be real and has just proved the Earth is painted black. Their effective emissivity value for E2 must be nearer to 0.99 but none of this matches our real color of the earth, does it?

Now let's look at the painted black Earth of the UN where E2 = 0.99
1.00709/(E1/E2)(SA*SB/SA*SB) = 1.00709E2/E1 = (T2/288)^4
For E2 = 0.99,
1.5736 * 0.99 = (T2/288)^4
1.5579^.25 * 288 = T2 = 321.8
That's a whopping rise of 33.75 degrees C or K for a UN doomsday emissivity of 0.99 for the good black earth.
So 321.8°K yields an average temperature of the Earth of 119.6 degree F. Really?

First, as Sylas explained it's not the heat from the combustion of fossil fuels that drives global warming. Instead the warming is due to the heat trapping properties of CO2 as it persists in the atmosphere.

Also, emissivity is not a measure of the color of the Earth's surface (that would be albedo). Instead, emissivity is a measure of how well the atmosphere transmits infrared radiation. It's basically the ratio of the infrared flux at the top of the atmosphere to that at the surface. The "thicker" the atmosphere, the lower the emissivity and the warmer the surface.

So, as emissivity rises the Earth surface temperature will fall in response.

Finally, by thickness, I do not mean to imply that emissivity is proportional to density.
CO2 is, after all, a trace gas, but when it comes to infrared radiation, it behaves like a dye.
It becomes especially important at higher elevations in the atmosphere where the amount of water vapor is low.
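A small sketch of the "effective emissivity" idea in this post, using the usual round energy-budget numbers (about 240 W/m2 of outgoing longwave radiation against about 390 W/m2 emitted by a ~288 K surface; the exact figures are assumptions here):

[code]
# Sketch of "effective emissivity": the ratio of outgoing longwave radiation
# at the top of the atmosphere to the infrared flux emitted by the surface.
# The 240 and 390 W/m2 figures are the usual round energy-budget numbers.
SIGMA = 5.67e-8
olr_toa = 240.0            # W/m2 escaping to space
surface_emission = 390.0   # W/m2 emitted by a ~288 K surface
eps_eff = olr_toa / surface_emission
print(f"Effective emissivity ~{eps_eff:.2f}")  # ~0.62

# A "thicker" infrared atmosphere means a lower effective emissivity, and the
# surface must warm until the same ~240 W/m2 escapes to space again:
for eps in (0.62, 0.60):
    t_surface = (olr_toa / (eps * SIGMA)) ** 0.25
    print(f"eps = {eps:.2f} -> surface ~{t_surface:.1f} K")
[/code]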
 
  • #78
No one is up to the challenge?

I'm asking for examples of significant corrections (alters the result) and retractions of mainstream climate science papers (papers that did not stand as a challenge to conventional ideas of climate).

The presence of such corrections and retractions would be a good indicator that healthy professional criticism (not to be confused with layman/politicized/crackpot skepticism) is present and that errors are being caught.
 
  • #79
Responses to posting #73

Sylas:
Thank you for taking the time to comment on my posting. I'll preface my remarks by explaining that I'm an engineer (mechanical, electrical, nuclear) with a background in research and in the successful development of statistically validated models of complex systems. My background in climatology is limited to what I've learned in the past couple of months. However, I have a strong background in fluid mechanics and heat transfer plus a bit of background in atomic physics.

A couple of months ago, I decided to look into the controversy over anthropogenic global warming as a kind of civic duty. As I started this work, I assumed that climatologists knew what they were talking about when they claimed that carbon dioxide emissions were warming the Earth to a significant degree.

One of my first acts was to Google the phrases "IPCC models" and "validation." This search turned up nothing resembling a validation exercise. However, it did turn up a Web
posting ( http://www.tech-know.eu/uploads/Spinning_the_Climate.pdf ) by a physical chemist named Vincent Gray. Gray explained that he had been a reviewer of each successive IPCC report. In that capacity, he said he had flagged misuse of the words "validation" and
"prediction" and that the IPCC had responding by changing these words to "evaluation" and
"projection" in some areas of its reports but not others. "Validation" and "prediction"
are statistically meaningful words. In particular, if a model makes predictions, it can be
statistically validated. "Evaluation" and "projection" are statistically meaningless; use of
these terms by the IPCC obscures the important issue of whether the IPCC models can be
validated.

According to Gray, he urged description by the IPCC of how the IPCC's models could be
statistically validated. He says that the IPCC blew him off on this issue and implies that the IPCC's models are neither validated nor susceptible to validation. If Gray is correct, then the
IPCC's models are not "scientific" models under Karl Popper's criterion of falsifiability.

I read the IPCC's 2007 report and found no evidence that a validation exercise had been performed. Replacing validation were various comparisons of projected to measured temperatures.
It appeared that the authors might themselves have confused "evaluation" with "validation" and "projection" with "prediction," thus arriving at an illogical conclusion for presentation to policy makers. Later, I stumbled across the posting by Kevin Trenberth which I referenced in my posting to the Physics Forum. In his posting, Trenberth seems to confirm what Gray has to say.

In my posting to the Physics Forum, my intent is to open the issue of the falsifiability up
for discussion. I've looked into the matter and have found that a "projection" is a
different kind of entity than a "prediction"; the latter supports statistical validation but
not the former. If true, this finding is of fundamental importance to the debate on policy.

I doubt that there are any policy makers, journalists or political activists with interests in anthropogenic global warming that currently are aware of this finding. Climatologists seem confused by the issue.

With my preface complete, I'll try to respond to issues which you seem to raise in
posting #73.

1. It sounds as though you may be quibbling about my claim that the IPCC models are the
"basis for belief" in a CO2-temperature relationship but I'm unsure of what this quibble is.

So far as I am aware, the IPCC models are the sole vehicle by which the IPCC produces its projections of temperatures, with and without regulation of carbon dioxide emissions, for
consideration by policy makers.

2. The builders of the IPCC models employ the method of reasoning that is called
"mechanistic reductionism." Under this method, the model builder attempts to project the
phenomenology onto well known and interacting mechanisms. In some cases, understanding of the mechanism is relatively secure. In others, the mechanism is unknown but has a large
potential effect on the temperature. Radiative transport, which you mention in your posting,
is one of the mechanisms for which understanding is relatively secure. According to the IPCC
itself, understanding of the mechanism by which cloud formation affects the albedo is not
secure, yet variations in the albedo may, and according to some studies do, have a large effect on the temperature. This effect may dwarf the effect of CO2.

Mechanistic reductionism sometimes works. It is, for example, the basis for engineering
design and often produces successful models in that context. However, when we employ
mechanistic reductionism in research on complex systems, it tends to fail. Whether the
attempt at applying mechanistic reductionism to modeling the climate is successful can be
determined only by testing the validity of the proposed models. The validity of a model
cannot be tested unless this model is structured to be falsifiable.

Meteorology is relatively rich in observed statistical events; thus, meteorologists are in a
good position to validate their models and they do so. Climatology is relatively poor in
observed statistical events. I wonder if this feature of climatology has led to disinterest
among climatologists in defining the nature of climatological events and interest in
obfuscating the issue.

3. You may be unclear on the distinction between a "projection" and a "prediction" so I'll
expand upon this topic. A "projection" is a mathematical function that maps the time to the
computed global average temperature. A "prediction" is a logical proposition that states the
outcome of a statistical event. A prediction has a variable which is called its "truth-
value." The truth-value of a prediction is true or false. A projection has no truth-value.

A projection supports comparison of the projected to the measured temperature and
computation of the error. A projection does not support the falsification of a model for the
conclusion cannot be reached that this model is falsified by the evidence. As the IPCC
models cannot be falsified, they lie outside science under Karl Popper's widely accepted
definition of "science."

To make the distinction between a "projection" and a "prediction" more concrete, I'll supply a fictional example. In the example, the projected temperature on January 1, 2020 at 00:00 hours Greenwich mean time is 16.3724 Celsius. The measured temperature is 16.3424 Celsius. The two temperatures differ. Does this observation falsify the associated model or does it not? This question cannot be answered without reference to the associated and as yet unspecified statistical event.

Climatology is not about the instantaneous values of climatological variables. It is about
the averages of these variables over time. In defining the statistical event, the period of
time over which this average is taken must be specified. Let us assume this period has been
specified and that the average measured temperature over this period on January 1, 2020 at
00:00 hours GMT is 16.3817 Celsius. Is the model falsified by this evidence or is it not?

The answer depends upon the as yet unspecified definitions of the outcomes. If each outcome is a temperature, then the model is falsified, for the predicted temperature differs from the measured one. Suppose, however, that an outcome is an element in the set of RANGES of temperatures {...16.2-16.3, 16.3-16.4, 16.4-16.5...}. In this case, the predicted outcome is 16.3-16.4. The measured outcome is 16.3-16.4. As the predicted outcome matches the measured one, the model is not falsified by this evidence.
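A small sketch of the range-outcome comparison just described, using the two temperatures from the example above; the 0.1-degree bin width is the one implied by the listed ranges.

[code]
# Sketch of the range-outcome comparison described above: both temperatures
# are mapped to 0.1-degree bins before the match is judged.
import math

def outcome_bin(temp_c, width=0.1):
    """Lower edge of the range containing temp_c (e.g. 16.38 -> 16.3)."""
    return math.floor(temp_c / width) * width

predicted, measured = 16.3724, 16.3817
same_outcome = outcome_bin(predicted) == outcome_bin(measured)
print(f"Predicted outcome {outcome_bin(predicted):.1f}-{outcome_bin(predicted) + 0.1:.1f}, "
      f"measured outcome {outcome_bin(measured):.1f}-{outcome_bin(measured) + 0.1:.1f}, "
      f"falsified by this evidence: {not same_outcome}")
[/code]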

In order for a model to avoid falsification, the outcomes must be ranges of temperatures
rather than temperatures. There is a further complication. When we model complex systems, there is sure to be missing information about the outcome, given the observed state of the system at the time a prediction is made. It follows that a model can do no better than to predict the probabilities of the various outcomes and the uncertainties that are associated with these probabilities. It also follows that whether the model is or is not falsified must be determined by comparison of the predicted probabilities of the various outcomes to the observed relative frequencies of the same outcomes in a base of out-of-sample validation data. So far as I've been able to determine, there is no such data.

If my understanding of this situation is correct, then we are in the middle of a social phenomenon in which scientists are stating to policy makers they have a high level of confidence in models that are not scientific models. Lay persons, including politicians and journalists, continually confuse a model that is built by scientists with a scientific model thus reaching conclusions that are logically and empirically unwarranted.
Terry
 
  • #80


Terry Oldberg said:
Sylas:
Thank you for taking the time to comment on my posting. I'll preface my remarks by explaining that I'm an engineer (mechanical, electrical, nuclear) with a background in research and in the successful development of statistically validated models of complex systems.

Welcome to physicsforums, Terry.

Engineering is, of course, rather different to science. "Validation" is a normal part of working for designed or engineered constructions; but it does not have quite so central a position in empirical science.

For example... what would it mean to "validate" a climate model? We already know that they are not complete and only give a partial picture of climate. To quote a common phrase: climate models are always wrong but often useful.

According to Gray, he urged description by the IPCC of how the IPCC's models could be statistically validated. He says that the IPCC blew him off on this issue and implies that the IPCC's models are neither validated nor susceptible to validation. If Gray is correct, then the IPCC's models are not "scientific" models under Karl Popper's criterion of falsifiability.

Trenberth's article that you cited previously is a good guide on this. A climate model is not a scientific theory, but a tool that is based on many scientific theories. The hypotheses used in climate science are falsifiable in the usual sense of the word, scientifically. But a model is not a hypothesis that can be judged as scientific or not using Popper's criterion. Climate models can be used to help test some theories. They have limited skill, and their skill is improving. It is a good hypothesis in the usual sense of the word that a climate model does capture some aspect of climate to a certain accuracy; and people are testing ideas like that all the time. But a complete model, able to give validated projections? No such thing.

1. It sounds as though you may be quibbling about my claim that the IPCC models are the "basis for belief" in a CO2-temperature relationship but I'm unsure of what this quibble is.

So far as I am aware, the IPCC models are the sole vehicle by which the IPCC produces its projections of temperatures, with and without regulation of carbon dioxide emissions, for consideration by policy makers.

It's not a quibble; but a fundamental point about the nature of a scientific hypothesis. The scientific hypothesis is that CO2 has a strong effect on temperature. This is a scientific hypothesis in good standing, and it is now confirmed by multiple independent lines of evidence and theory to the point of being basic background to atmospheric physics. In particular, the associated forcing is known to a good level of accuracy; better than any of the other forcings involved.

However, that is not enough to make a prediction for temperatures. There's more involved than the CO2 forcing. Using some basic physics you can indeed get into the right ball park for the magnitude of CO2 effects on temperature. I've described how this works in the thread [thread=307685]Estimating the impact of CO2 on global mean temperature[/thread]; the thread has been inactive now for a couple of months but the first few posts spell out how this estimate proceeds -- and it doesn't use climate models. The basis of this kind of calculation is from work that is all reported in the IPCC; which is a pretty comprehensive summary of the relevant science.

To get an actual prediction for future temperatures, however, you would need to know all the other forcings as well, plus all the details of climate response. And -- as Trenberth points out -- we don't have that level of detail.

This does nothing to damage the strong support for the basic fact of the warming effect of carbon dioxide. It just means there's a lot more than this required to make predictions. We need to keep these two distinct aspects in mind.

What has been calculated are rough estimates of what is possible under different emission scenarios. For a given scenario, there is still a wide uncertainty in the consequences; but these are quantified. There are strong lower bounds on the consequences as far as global temperature rise is concerned; and this does indeed constitute a conventional falsifiable hypothesis. But because it's not possible to give fully validated models of the whole climate system, the best scientific information gives you a range of possible outcomes; and there are a heap of open research questions for sorting that out and refining our understanding further.

2. The builders of the IPCC models employ the method of reasoning that is called "mechanistic reductionism." Under this method, the model builder attempts to project the phenomenology onto well known and interacting mechanisms. In some cases, understanding of the mechanism is relatively secure. In others, the mechanism is unknown but has a large potential effect on the temperature. Radiative transport, which you mention in your posting, is one of the mechanisms for which understanding is relatively secure. According to the IPCC itself, understanding of the mechanism by which cloud formation affects the albedo is not secure yet variations in the albedo may, and according to some studies do, have a large effect on the temperature. This effect may dwarf the effect of CO2.

Your question about albedo is legitimate... and something that has to be sorted out using empirical studies. A model is not going to be good enough for this, because clouds have such a strong albedo contribution, and they are one of the hardest things to model well.

Of course, people ARE studying albedo. The evidence so far strongly suggests that albedo changes are not as great as the forcing from carbon dioxide.

The situation with albedo is a bit vexed, because albedo changes can arise as a feedback from rising temperatures. A purely empirical study of changing albedo levels is not sufficient for sorting that out... for this I think Trenberth's call for better monitoring of energy flows over all may help allow the different theories for cloud to be tested. But in any case, the albedo measures themselves, from satellites and other indirect methods, all seem to indicate that the albedo effect is not as great as the direct CO2 forcing. For more on these studies, see [post=2497270]msg #17[/post] of thread "Another climate update".

The answer depends upon the as yet unspecified definitions of the outcomes. If each outcome is a temperature, then the model is falsified, for the predicted temperature differs from the measured one. Suppose, however, that an outcome is an element in the set of RANGES of temperatures {...16.2-16.3, 16.3-16.4, 16.4-16.5...}. In this case, the predicted outcome is 16.3-16.4. The measured outcome is 16.3-16.4. As the predicted outcome matches the measured one, the model is not falsified by this evidence.

In order for a model to avoid falsification, the outcomes must be ranges of temperatures rather than temperatures. [...]

It is perfectly normal in all areas of science to give results with quantified uncertainties, and to falsify them to a certain level of confidence with observations that are statistically implausible given the probability distributions of theory. As evidence of this accumulates, the model is falsified. This is not limited to climate; it occurs in all areas of science.

I think your account is a bit too limited to really explain the nature of a scientific model and hypothesis testing.

There is also the more fundamental point that even if climate models were shown to be severely incorrect because of (for example) a pervasive systemic error in the handling of cloud albedo common to all the models, this would still not falsify the warming effects of carbon dioxide. It would just falsify the rather more comprehensive numerical model of climate that takes a lot of other things into account.

The temperature effect of carbon dioxide is twofold. There is the forcing -- which is well known to be about 5.35 W/m2 per natural log of the atmospheric concentration ratio. And there is the climate sensitivity to that forcing, which is much less well constrained and subject to ongoing scientific work, attempting to propose and falsify different scientific hypotheses. Climate models are not the only tool for this. A common hypothesis is that the temperature response is greater than 2 degrees per doubling of concentrations. There have been some lines of evidence proposed to falsify this, which we discuss from time to time. But these are an exception; and widely considered to be flawed themselves. The vast bulk of work attempting to constrain response has rather falsified the idea of smaller sensitivities.
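A minimal sketch of that two-part split, forcing versus response: the 5.35*ln(C/C0) fit for CO2 forcing, and an assumed sensitivity range of 2 to 4.5 K per doubling (the bracketing values are assumptions for illustration).

[code]
# Sketch of the two-part estimate: well-constrained CO2 forcing from the
# 5.35*ln(C/C0) fit, then the less-constrained temperature response for an
# assumed range of climate sensitivities (2-4.5 K per doubling).
import math

def co2_forcing(c_ppm, c0_ppm):
    """Radiative forcing in W/m2 for a change in CO2 concentration."""
    return 5.35 * math.log(c_ppm / c0_ppm)

f_doubling = co2_forcing(560.0, 280.0)   # ~3.7 W/m2 per doubling
for sensitivity in (2.0, 3.0, 4.5):      # K per 2xCO2, assumed range
    print(f"{sensitivity} K per doubling -> "
          f"{sensitivity / f_doubling:.2f} K per W/m2 of forcing")
[/code]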

As is normal in active science, this is all being investigated, and new lines of evidence and argument are published and considered. Too simplistic a focus on the falsification criterion for science tends to miss much of the scientific process, in which people test ideas, test them again, and work their way towards improved understanding.

Cheers -- sylas
 
  • #81
Terry Oldberg said:
When we model complex systems, there is sure to be missing information about the outcome, given the observed state of the system at the time a prediction is made. It follows that a model can do no better than to predict the probabilities of the various outcomes and the uncertainties that are associated with these probabilities. It also follows that whether the model is or is not falsified must be determined by comparison of the predicted probabilities of the various outcomes to the observed relative frequencies of the same outcomes in a base of out-of-sample validation data.

True. However, you may do well to read Chapter 8 on Climate Models and their Evaluation in the IPCC Physical Science Basis:

http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch8.html

Anyhow, validation isn't the correct term for climate models; try detection and attribution.
Climate Science is basically a signal to noise problem.

Framing anthropogenic global warming as a true/false proposition is too simplistic.
We know that CO2 causes warming. However, what we don't know is how fast. The consensus science is that it's in the range of 2 to 4.5 C/CO2 doubling with a 95% confidence. Also, what we don't know very well is how precipitation patterns will be altered.
There is a wide spread among the models in this regard and precipitation patterns may turn out to be more important to us humans than temperature.

To make a testable prediction, the initial state of the system must be understood in the first place. The oceans comprise about 90% of the thermal inertia of the Earth's climate, so it is not possible to make a prediction without modeling the oceans. Up until 2003, there was limited data on the oceans, and the ARGO system wasn't considered complete until around 2006. So, we only have a few years of reasonably good data for an initial state, and it will probably be several years before the validation-type test results you are looking for are available. At that time, the range for CO2 warming will likely be narrowed. It could be 2 to 3C/CO2 doubling, or maybe 3.5 to 4.5C/doubling, or it may even turn out to be 4 to 5C/doubling. However, it is extremely unlikely that it will be found to be less than 1.5C/doubling. The lower end of sensitivity has been examined closely and there is too much data supporting the higher sensitivities. There is clearly a skewed distribution of probabilities.
 
  • #82
Response to posting #81

Xnn: Thanks for taking the time to respond.

I've read Chapter 8.

I wonder if you'd share your understanding of how numerical values are assigned to the probabilities of the various ranges of sensitivities. For example, how is the value of 0.95 assigned to the probability that the range of sensitivities lies in the interval between 2 and 4.5 C/CO2 doubling?

Terry
 
  • #83
Terry;

Basically, in order to constrain climate sensitivity, a stable period of time is needed that is long enough that we can assume equilibrium is reached. The last glacial maximum (LGM) is frequently used. Models are then run, with all that is known about the LGM, about 1000 times. The results are that sensitivities greater than about 4.5C or less than 2C generally don't match the data very well. Similarly, it can be demonstrated that < 1.5C and > 6C can probably be ruled out with greater confidence.

Here is a link to a paper on the subject:

http://www.pik-potsdam.de/~stefan/Publications/Journals/Schneider_etal_ClimDyn_2006.pdf

Notice, this paper found a range of 1.2–4.3C/doubling. I believe the IPCC range of 2-4.5 is based on a number of different models, but like I said, until we get a better handle on the oceans there won't be much of a narrowing of the range, and about 3C/doubling looks to be most likely.
 
  • #84
Terry Oldberg said:
Xnn: Thanks for taking the time to respond.

I've read Chapter 8.

I wonder if you'd share your understanding of how numerical values are assigned to the probabilities of the various ranges of sensitivities. For example, how is the value of 0.95 assigned to the probability that the range of sensitivities lies in the interval between 2 and 4.5 C/CO2 doubling?

Xnn gives a nice link to an interesting study of the Last Glacial Maximum. (Thanks, Xnn! That's a useful one and I've taken a copy.) This uses climate models with a range of parameters to fit the LGM, and the likelihood comes from a Monte Carlo sampling of the parameter space. This is a good example of how climate models are often used in research. They allow for a kind of virtual experiment and constrain physically credible results, without actually giving specific projections or predictions.
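For a flavour of how such a Monte Carlo constraint works, here is a heavily simplified toy version: it weights sensitivity samples by how well a one-line energy-balance prediction matches an assumed LGM cooling of -5 +/- 1 C under an assumed total forcing of about -5.8 W/m2. All of those numbers are illustrative assumptions; the actual studies run full climate models over the parameter space.

[code]
# Toy Monte Carlo / Bayesian sketch of constraining climate sensitivity from
# the Last Glacial Maximum.  All inputs are illustrative assumptions:
# total LGM forcing ~ -5.8 W/m2, observed LGM cooling ~ -5 +/- 1 C.
# Real studies use full climate models instead of this one-line energy balance.
import math
import random

F_2XCO2 = 3.7           # W/m2 per CO2 doubling
LGM_FORCING = -5.8      # assumed total LGM forcing, W/m2
LGM_COOLING_OBS = -5.0  # assumed observed LGM cooling, C
OBS_SIGMA = 1.0         # assumed observational uncertainty, C

random.seed(0)
samples, weights = [], []
for _ in range(100_000):
    sensitivity = random.uniform(0.5, 10.0)          # flat prior, K per 2xCO2
    predicted = sensitivity * LGM_FORCING / F_2XCO2  # toy energy-balance cooling
    # Gaussian likelihood of the observed cooling given this sensitivity:
    weight = math.exp(-0.5 * ((predicted - LGM_COOLING_OBS) / OBS_SIGMA) ** 2)
    samples.append(sensitivity)
    weights.append(weight)

posterior_mean = sum(s * w for s, w in zip(samples, weights)) / sum(weights)
print(f"Posterior mean sensitivity ~{posterior_mean:.1f} K per doubling")
[/code]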

However, given your questions about models, there's another study you may find interesting discussed in thread [thread=334005]A low likelihood for high climate sensitivity[/thread]. This shows how Bayesian analysis is applied, and in particular looks at the dependence on the priors used. The authors, Annan and Hargreaves, have done a fair amount of study on climate sensitivity. Two major references, both discussed in that thread and directly relevant to your question, are:
  • Annan, J. D., and J. C. Hargreaves (2006), http://www.agu.org/pubs/crossref/2006/2005GL025259.shtml, in Geophys. Res. Lett., 33, L06704, doi:10.1029/2005GL025259.
  • Annan, J.D., and Hargreaves, J.C. (2009) On the generation and interpretation of probabilistic estimates of climate sensitivity, Climatic Change, online-first Oct 10 2009, doi: 10.1007/s10584-009-9715-y. (http://www.jamstec.go.jp/frcgc/research/d5/jdannan/probrevised.pdf ).

For details on how the IPCC report summarizes sensitivity estimates, chapter 9 of the 4AR is best.

Cheers -- sylas
 
  • #85
The ice ages do not provide any evidence for the CO2 sensitivity. Global temperatures are thought to decline by -5.0C and the changes in CO2 can only explain about -1.8C of the temperature change at 3.0C per doubling.

The ice ages do not match the 100,000 year orbital cycle or the high-latitude summer solar insolation Milankovitch Cycles either.

What Albedo estimate are they using in these climate simulations? All those glaciers and sea ice and snow are reflecting much more sunlight than is currently the case. I have read a dozen of these papers and I have never seen a single Albedo estimate provided yet.

I like to download data and check the numbers against the assertions made. The only way to get the ice ages to work is that Albedo must increase to about 0.333 (this is as high as one can get) and then it has to become its own self-sustaining climate forcing that even the Milankovitch Cycles can only break about a third as often as they should.

http://img51.imageshack.us/img51/2127/last3iceages.png

...

http://img109.imageshack.us/img109/9195/milkanvsiceages.png

...

http://img27.imageshack.us/img27/3616/dustandiceages.png
 
  • #86
Bill Illis said:
The ice ages do not provide any evidence for the CO2 sensitivity.

Do you have any basis for this claim in the scientific literature, other than your own personal skepticism of the various papers that use the last glacial maximum to infer bounds on sensitivity? It's usually expected that we discuss the normal progress of science, by referring to ideas that have support in the literature.

Global temperatures are thought to decline by -5.0C and the changes in CO2 can only explain about -1.8C of the temperature change at 3.0C per doubling.

Minimum CO2 levels were about 180ppm, and the current value (ignoring anthropogenic increases since the industrial revolution) is about 280. The number of doublings is log2(180/280) = -0.64, which gives about -1.9C difference; so I get a little bit more cooling than -1.8, but we are in the same ball park. To this you also add a little extra for the drop in methane and possibly nitrous oxide, to a slightly stronger net cooling from reduced greenhouse effects at the LGM.

Albedo effects are generally thought to be comparable to greenhouse forcing at the LGM. An albedo of 0.33 as opposed to the current 0.3 would give a forcing of 0.03 * 342 = over 10 W/m2. That's enormous; about 4 times as much as the carbon dioxide forcing of around 2.5. I don't believe your numbers; do you have a reference or is this your own estimate?

Conventionally, a forcing of -3 W/m2 would correspond to an albedo of just over 0.31. This is the estimate in the paper by Annan and Hargreaves which I cited for you previously.
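The arithmetic in the two paragraphs above as a short sketch, using the 180 and 280 ppm values, the 3 C per doubling figure, and a 342 W/m2 mean insolation as stated:

[code]
# Sketch of the arithmetic above: LGM CO2 cooling at an assumed 3 C/doubling,
# and the forcing implied by a given change in planetary albedo.
import math

doublings = math.log2(180.0 / 280.0)   # ~ -0.64 doublings
cooling = 3.0 * doublings              # ~ -1.9 C from CO2 alone
print(f"{doublings:.2f} doublings -> {cooling:.1f} C")

mean_insolation = 342.0                # W/m2, top of atmosphere
for delta_albedo in (0.03, 0.009):     # 0.30 -> 0.33 vs. 0.30 -> ~0.31
    forcing = -delta_albedo * mean_insolation
    print(f"Albedo change +{delta_albedo} -> forcing {forcing:.1f} W/m2")
[/code]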

Your final diagram states "There is almost no period of increased dust during a glacial expansion phase." What is your basis for this claim, and what confidence can you assign it? It conflicts with what I have seen in the literature. Dust-based forcings are considered uncertain; but there is believed to be an increase... described in Xnn's reference.

The ice ages do not match the 100,000 year orbital cycle or the high-latitude summer solar insolation Milankovitch Cycles either.

I would say it matches both, but at different times within the Quaternary.

What Albedo estimate are they using in these climate simulations? All those glaciers and sea ice and snow are reflecting much more sunlight than is currently the case. I have read a dozen of these papers and I have never seen a single Albedo estimate provided yet.

The papers I have cited suggest an albedo forcing of about -3 W/m2 at the LGM, which corresponds to about 0.31, and possibly a little bit more with dust effects; but nowhere near 0.333

Cheers -- sylas
 
  • #87
sylas said:
It's usually expected that we discuss the normal progress of science, by referring to ideas that have support in the literature.

Actually more like -1.9. Minimum CO2 levels were about 180ppm, and the current value (ignoring anthropogenic increases since the industrial revolution) is about 280. The number of doublings is log2(180/280) = -0.64, which gives about -1.9C difference.

Albedo effects are generally thought to be comparable to greenhouse forcing at the LGM. An albedo of 0.33 as opposed to the current 0.3 would give a forcing of 0.03 * 342 = over 10 W/m2. That's enormous; about 4 times as much as the carbon dioxide forcing of around 2.5. I don't believe your numbers; do you have a reference or is this your own estimate?

Conventionally, a forcing of -3 W/m2 would correspond to an albedo of just over 0.31. This is the estimate in the paper by Annan and Hargreaves which I cited for you previously.

Your final diagram states "There is almost no period of increased dust during a glacial expansion phase." What is your basis for this claim, and what confidence can you assign it? It conflicts with what I have seen in the literature. Dust based forcings are considered uncertain; but there is believe to be an increase... described in Xnn's reference.

The papers I have cited suggest an albedo forcing of about -3 W/m2 at the LGM, which corresponds to about 0.31, and possibly a little bit more with dust effects; but nowhere near 0.333

Cheers -- sylas

I'm just using the data from the Antarctic ice cores.

The CO2 minimum is 184.4 ppm at 22,000 years ago and at 3.0C per doubling of CO2 [using CO2 as a proxy for all the GHGs], the temperature change is 1.8C.

I am also using the simplest form of the Climate Framework so that one can separate out the Albedo and Solar Irradiance changes from the impacts of the GHG forcing (I don't have a Climate Model).

Earth Surface Temperature = Solar Energy Effect + Greenhouse Effect

Which is equal to:

288K = 15C = 255K + 33K

And the Solar Energy Effect equals:

Temp Solar Effect = [Solar Irradiance * (1-Albedo) /4 /Stefan-Boltzmann Constant] ^0.25

255K = [1,366 * (1 -0.298) / 4 / 5.67E-08]^0.25

And the Greenhouse Effect equals:

Temp Greenhouse Effect = [33K +/- CO2 Sensitivity@CO2 Doublings]
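A minimal sketch of the solar-effect term in this framework, using the quoted 1366 W/m2 irradiance and 0.298 albedo:

[code]
# Sketch of the "solar energy effect" term above, with the quoted
# 1366 W/m2 irradiance and 0.298 albedo.
SIGMA = 5.67e-8   # W m^-2 K^-4

def solar_effect_temperature(irradiance_w_m2, albedo):
    """Effective radiating temperature from absorbed sunlight alone."""
    return (irradiance_w_m2 * (1 - albedo) / 4 / SIGMA) ** 0.25

t_solar = solar_effect_temperature(1366.0, 0.298)   # ~255 K
print(f"Solar-only temperature ~{t_solar:.1f} K; adding the ~33 K "
      f"greenhouse term gives ~{t_solar + 33:.0f} K")
[/code]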

Now if one is going to use a temperature sensitivity of 0.75C/watt/metre^2 of forcing as the temperature change at the surface, then one could end up with a lower Albedo estimate for the last glacial maximum but that contradicts the estimates one gets from the Stefan-Boltzmann equations for the impact of solar forcing. The Stefan-Boltzmann equations do not predict a temperature change of 0.75C/watt/metre^2.

I am also using this framework to go farther back in time when the Faint Young Sun becomes important and Solar Irradiance was lower so I need to use a simpler framework. It still provides the same temperature change.

Albedo is an important concept that should not be buried inside a climate model. No matter what Albedo estimate one uses, 3.0C per doubling of CO2/GHGs only explains a small part of the temperature change and the Milankovitch Cycles only partly match up.

On the dust concentrations issue, I just plotted the numbers against the timelines. The dust concentrations could be explained by the dust and loess created by glaciers as they advance. When they stop advancing or melt back, all that material is left behind, and when it dries out and the wind blows (and there is a lot of wind around glacial fronts) there is increased dust. The other explanation of dry conditions and increased deserts in mid-latitudes is certainly part of the picture; it's just that the numbers don't match up entirely with that explanation. There is one other explanation: when the Antarctic glaciers stop accumulating snow and ice (there is little snowfall, and the layers accumulate years of material in very thin layers, as at glacial maximums), there is just increased dust recorded in the layers. In years when more snow is falling, there is little dust accumulation in the thicker layers.
 
Last edited:
  • #88
Bill Illis said:
I'm just using the data from the Antarctic ice cores.

The CO2 minimum is 184.4 ppm at 22,000 years ago and at 3.0C per doubling of CO2 [using CO2 as a proxy for all the GHGs], the temperature change is 1.8C.

OK... except that you can't use CO2 as a proxy for the others. There's a small additional contribution. CO2 is the major greenhouse contribution, but CH4 and N2O add on to that.

The forcing for CO2 should be about 5.35*Ln(184.4/280)... using your numbers... which gives about -2.23 W/m2. To this you add about -0.3 for CH4 and the same for N2O. So it gets up to -2.8 W/m2 for the greenhouse forcing, which (at 3 K/2xCO2) would be close to -2.3 K total temperature contribution.

You can find the estimates for the other gases in chapter 6 of the IPCC 4AR WG1: the chapter on paleoclimate, page 448.
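If it helps, here is the same arithmetic as a few lines of Python (the -0.3 W/m2 add-ons for CH4 and N2O are just the rough round figures quoted above):

[code]
# Greenhouse forcing and temperature contribution at the LGM, using the
# standard simplified expression 5.35*ln(C/C0) for CO2 plus the rough
# CH4/N2O add-ons quoted above (all illustrative round numbers).
import math

f_co2 = 5.35 * math.log(184.4 / 280.0)   # about -2.2 W/m^2
f_ghg = f_co2 - 0.3 - 0.3                # roughly -0.3 each for CH4 and N2O
response = 3.0 / 3.7                     # 3 K per 2xCO2, with 2xCO2 ~ 3.7 W/m^2

print(f_co2, f_ghg, f_ghg * response)    # ~ -2.2 W/m^2, ~ -2.8 W/m^2, ~ -2.3 K
[/code]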

I am also using the simplest form of the Climate Framework so that one can separate out the Albedo and Solar Irradiance changes from the impacts of the GHG forcing (I don't have a Climate Model).

That's fine; you can get into the ballpark that way.

Earth Surface Temperature = Solar Energy Effect + Greenhouse Effect
Which is equal to:

288K = 15C = 255K + 33K

And the Solar Energy Effect equals:

Temp Solar Effect = [Solar Irradiance * (1-Albedo) /4 /Stefan-Boltzmann Constant] ^0.25

255K = [1,366 * (1 -0.298) / 4 / 5.67E-08]^0.25

And the Greenhouse Effect equals:

Temp Greenhouse Effect = [33K +/- CO2 Sensitivity@CO2 Doublings]​

Now if one is going to use a temperature sensitivity of 0.75C/watt/metre^2 of forcing as the temperature change at the surface, then one could end up with a lower Albedo estimate for the last glacial maximum but that contradicts the estimates one gets from the Stefan-Boltzmann equations for the impact of solar forcing. The Stefan-Boltzmann equations do not predict a temperature change of 0.75C/watt/metre^2.

That's an improper use of Stefan-Boltzmann in all kinds of ways. Most crucially, you can't just treat the Earth as a static radiator. The 3 C/doubling you are using already assumes a substantial feedback, and the same applies here with albedo.

The more useful quantity is the forcing. With insolation at the top of the atmosphere at close to 342 W/m2, each 0.01 change in albedo gives you a forcing of about 3.4 W/m2.

A simple no-feedback response estimate can be given as T/4Q, where Q is emission of about 240 W/m2 and T is the surface temperature of about 288K. (See [post=2497769]msg #28[/post] for more detail.) This gives you about 0.3 K per W/m2, which is less than 0.75 as you say. But by using a sensitivity of 3 K per 2xCO2, which is what you proposed previously, and noting that 2xCO2 is a forcing of about 3.7 W/m2, the real response is 3/3.7 ≈ 0.8 K per W/m2.

So a 0.01 increase in albedo is a temperature contribution of 3.4*0.8 ≈ -2.7 degrees; about the same as the greenhouse contribution. This is pretty standard in analyses of the LGM in the literature. Greenhouse and ice-albedo effects are of a similar magnitude.
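In code form, the same rough estimate looks like this (342 W/m2 mean insolation and the 0.8 K per W/m2 response from above; the two albedo values are just examples):

[code]
# Albedo forcing and temperature contribution (rough, no-model estimate).
INSOLATION = 342.0      # W/m^2, global mean at the top of the atmosphere
RESPONSE = 3.0 / 3.7    # K per W/m^2, implied by 3 K per 2xCO2

def albedo_forcing(new_albedo, old_albedo=0.298):
    # Negative forcing if the albedo rises
    return -(new_albedo - old_albedo) * INSOLATION

for a in (0.31, 0.333):
    f = albedo_forcing(a)
    print(a, f, f * RESPONSE)   # 0.31 -> ~ -4 W/m^2, ~ -3 K; 0.333 -> ~ -12 W/m^2, far too big
[/code]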

Albedo is an important concept that should not be buried inside a climate model. No matter what Albedo estimate one uses, 3.0C per doubling of CO2/GHGs only explains a small part of the temperature change and the Milankovitch Cycles only partly match up.

Your analysis makes a simple error by taking feedback into account for the greenhouse forcing but not for the albedo forcing. You need to apply the feedback to both.

On the dust concentrations issue, I just plotted the numbers against the timelines. The dust concentrations could be explained by the dust and loess created by glaciers as they advance. When they stop advancing or melt back, all that material is left behind, and when it dries out and the wind blows (and there is a lot of wind around glacial fronts) there is increased dust. The other explanation of dry conditions and increased deserts in mid-latitudes is certainly part of the picture; it's just that the numbers don't match up entirely with that explanation.

Well, from your graph it appears that there is substantial dust at the LGM, so you can't ignore it. This is pretty standard in the literature.

Your framework will give you rough ballpark agreement with the published work done more thoroughly by working researchers in paleoclimate, once you treat the climate sensitivity consistently for the various forcings.

Cheers -- sylas
 
  • #89
sylas said:
A simple no-feedback response estimate can be given as T/4Q, where Q is emission of about 240 W/m2 and T is the surface temperature of about 288K. (See [post=2497769]msg #28[/post] for more detail.) This gives you about 0.3 K per W/m2, which is less than 0.75 as you say. But by using a sensitivity of 3 K per 2xCO2, which is what you proposed previously, and noting that 2xCO2 is a forcing of about 3.7 W/m2, the real response is 3/3.7 ≈ 0.8 K per W/m2.


Cheers -- sylas


Thanks sylas, I guess I will have to give up this effort.


But I still have a problem with the inconsistency in all these numbers.

240 watts/metre^2 of solar energy results in 255.0K or 1.06K/watt/metre^2

3.7 watts/metre^2 of 2XCO2 forcing results in 3.0K or 0.8K/watt/metre^2

1.0 watt/metre^2 of extra energy in the SB equation results in 0.3K/watt/metre^2

288K at the surface is the equivalent of 390 watts/metre^2 in the SB equation (or 0.74K/watt/metre^2)

adding 1.0 extra watt/metre^2 to 390 watts/metre^2 only results in an extra 0.18K/watt/metre^2


I think there has been too much averaging in all these estimates and in the climate models and the incremental differentials are not being used (the equations should be logarithmic).

This is the way I look at it and I will delete this post (and probably move on to other issues) if people have a problem with it.

http://img187.imageshack.us/img187/6840/sbEarth'surfacetemp.png

You can extend this chart all the way out to 63,250,000 watts/metre^2 for the surface of the Sun and the temperature of 5,779K will be correct.

The incremental effect of each extra watt/metre^2 at the surface is only 0.18K. The Sun needs to add 50,000 watts/metre^2 to add 1.0K to its surface temperature.

http://img189.imageshack.us/img189/2608/sbtempcperwatt.png
 
Last edited by a moderator:
  • #91
Xnn said:
Bill;

Here's a paper on albedo during the last glacial maximum.
Haven't had time to study it, but hopefully it's useful, although it is in watts/m^2

http://kiwi.atmos.colostate.edu/pubs/Burt_Randall_Otto-Bliesner_submitted.pdf

BTW, some of the figures in your post are not coming thru.
They look like just small boxes with a red x.

Thanks Xnn, this is the only paper which goes through it in understandable detail that I have seen.

"As expected, the global ice albedo feedback is positive from LGM to PI; the change in the clear-sky absorbed shortwave at the surface is 11.68 W m-2 ..."​
... which would translate into an Albedo number of about 0.331 (I will have to go through the paper in more detail, because clear-sky is not quite the full picture, given the cloud factor again.)
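The conversion I'm doing there is simply this (my own rough translation of the paper's clear-sky number into an albedo, nothing more sophisticated):

[code]
# Rough conversion of the paper's clear-sky shortwave change into an albedo number
delta_sw = 11.68          # W/m^2, LGM-to-preindustrial clear-sky change quoted above
current_albedo = 0.298
mean_insolation = 342.0   # W/m^2

print(current_albedo + delta_sw / mean_insolation)   # ~0.33
[/code]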

But if anyone else is having problems with the charts showing up, here are direct links to the charts in the order I posted them which might work.

http://img51.imageshack.us/img51/2127/last3iceages.png

http://img109.imageshack.us/img109/9195/milkanvsiceages.png

http://img27.imageshack.us/img27/3616/dustandiceages.png

http://img187.imageshack.us/img187/6840/sbEarth'surfacetemp.png

http://img189.imageshack.us/img189/2608/sbtempcperwatt.png
 
Last edited by a moderator:
  • #92
Bill Illis said:
Thanks sylas, I guess I will have to give up this effort.

I've tried similar kinds of effort myself, and ironing out the bugs can be a great way to learn.

But I still have a problem with the inconsistency in all these numbers.

240 watts/metre^2 of solar energy results in 255.0K or 1.06K/watt/metre^2

That's the effective radiating temperature of the planet. Basically, Earth radiates the same amount of thermal energy as an equally sized blackbody at 255 K. However, the simple ratio 255/240 doesn't mean anything by itself. The relation is Stefan-Boltzmann, given by
[tex]Q = \sigma T^4[/tex]​
with Q = 240, T = 255, and σ = 5.67×10^-8 as the constant.

3.7 watts/metre^2 of 2XCO2 forcing results in 3.0K or 0.8K/watt/metre^2

That's our best guess, but there are significant uncertainties. The forcing is the difference in energy at the top of the atmosphere, and the response to a forcing includes feedbacks.

1.0 watt/metre^2 of extra energy in the SB equation results in 0.3K/watt/metre^2

Right; except that is using the surface temperature, not the effective radiating temperature. You can get a rough approximation by introducing an effective emissivity, so that the radiation to space is related to the surface temperature as follows:
[tex]Q = \epsilon \sigma T^4[/tex]​
In this case, T is a mean surface temperature; taking a mean temperature over the Earth so as to give the right energy flux into the atmosphere from the surface. The effective emissivity basically relates the energy radiated from the surface to the energy radiated into space.

The full energy details are more subtle than this, of course. You have energy flowing up from the surface by convection as well, with the balance made up by backradiation from the atmosphere. However, the equation with ε gets into the ball park.

With this approximation, we have
[tex]\frac{dQ}{dT} = 4 \epsilon \sigma T^3 = \frac{4Q}{T}[/tex]​
Thus the change in energy out the top of the atmosphere per unit temperature is 4Q/T = 4*240/288 ≈ 3.3 W/m2 per K, or equivalently a temperature response of about 0.3 K per W/m2. This is called the no-feedback response, or Planck response, of the planet.

It is the same temperature response you would have for a grey body with a constant emissivity.
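A two-line check of that number, using nothing beyond the grey-body approximation above:

[code]
# No-feedback ("Planck") response: dQ/dT = 4Q/T, so the response is T/(4Q).
Q = 240.0   # W/m^2 emitted to space
T = 288.0   # K, mean surface temperature

print(4.0 * Q / T)    # ~3.3 W/m^2 per K out the top of the atmosphere
print(T / (4.0 * Q))  # ~0.3 K per W/m^2, the no-feedback response
[/code]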

288K at the surface is the equivalent of 390 watts/metre^2 in the SB equation (or 0.74K/watt/metre^2)

adding 1.0 extra watt/metre^2 to 390 watts/metre^2 only results in an extra 0.18K/watt/metre^2

You are just doing a simple division here, which doesn't work. You can use the Stefan-Boltzmann law to get about the right magnitudes, as shown above.

I think there has been too much averaging in all these estimates and in the climate models and the incremental differentials are not being used (the equations should be logarithmic).

Well, climate models do the best we possibly can with as much physics as they can handle. Without a climate model, we can use the simple averaging methods described above to get approximate answers, but a climate model looks at the whole distribution of temperature around the planet and all the energy flows. They are not complete by any means, but they do use physically sensible equations for the energy flows that they do represent.

One of the things we haven't looked at is the effect of feedbacks. Climate models can represent various other processes; for example they calculate humidity, using equations like the Clausius–Clapeyron relation and others.

Water, as you may know, is the major greenhouse gas on Earth... but you can't increase humidity by just emitting water into the atmosphere. The lifetime of water in the atmosphere is very short, and so the humidity is determined largely by temperature.

Here's a really simple thought experiment.

Suppose that adding a greenhouse gas gives a forcing, and the non-feedback temperature response is given by 0.3 degrees per unit forcing, as estimated above. But suppose also that any rise in temperature gives additional humidity, with a greenhouse forcing of 2 W/m2 per degree. This is feedback.

What happens to temperature ΔT when we add a forcing F? The total additional forcing is going to be 2 ΔT + F. The equation becomes
[tex]\begin{array}{lll}
& \Delta T & = 0.3(2 \Delta T + F) = 0.6 \Delta T + 0.3 F \\
\Rightarrow & \Delta T & = 0.75 F
\end{array}[/tex]​

The number 2 there is not far off, in fact, as far as we can tell... and there are all sorts of other feedback processes as well which give an additional secondary forcing (either positive or negative) in response to temperature change. If f is the feedback in forcing per degree, and λ0 is the no-feedback response, then the real response turns out to be
[tex]\frac{\lambda_0}{1 - f \lambda_0}[/tex]​
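Numerically, with the toy numbers above (λ0 = 0.3 and f = 2 are illustrative, not measured values):

[code]
# Feedback toy model: dT = lambda0*(f*dT + F)  =>  dT = lambda0/(1 - f*lambda0) * F
lambda0 = 0.3   # K per W/m^2, no-feedback response
f = 2.0         # W/m^2 of extra (water vapour) forcing per K of warming

sensitivity = lambda0 / (1.0 - f * lambda0)
print(sensitivity)         # 0.75 K per W/m^2
print(sensitivity * 3.7)   # ~2.8 K for a 2xCO2 forcing of about 3.7 W/m^2
[/code]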

This is the way I look at it and I will delete this post (and probably move on to other issues) if people have a problem with it.

I don't think you need to delete it; but that's up to you.

You can extend this chart all the way out to 63,250,000 watts/metre^2 for the surface of the Sun and the temperature of 5,779K will be correct.

The incremental effect of each extra watt/metre^2 at the surface is only 0.18K. The Sun needs to add 50,000 watts/metre^2 to add 1.0K to its surface temperature.

Actually, the Sun is very close to a blackbody, so it is much simpler. The proper equation from differentiating Stefan-Boltzmann gives
[tex]\frac{dQ}{dT} = 4 \sigma T^3 = 43,800[/tex]​
Try taking the difference between σT^4 for 5779 K and 5780 K.
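For example, in Python, doing just the check suggested above:

[code]
# dQ/dT for the Sun as a blackbody, two ways.
SIGMA = 5.67e-8

print(4 * SIGMA * 5779**3)                  # ~43,800 W/m^2 per K
print(SIGMA * 5780**4 - SIGMA * 5779**4)    # essentially the same number
[/code]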

Cheers -- sylas
 
  • #93
sylas said:
You can get a rough approximation by introducing an effective emissivity, so that the radiation to space is related to the surface temperature as follows:
[tex]Q = \epsilon \sigma T^4[/tex]​
In this case, T is a mean surface temperature; taking a mean temperature over the Earth so as to give the right energy flux into the atmosphere from the surface. The effective emissivity basically relates the energy radiated from the surface to the energy radiated into space.

The full energy details are more subtle than this, of course. You have energy flowing up from the surface by convection as well, with the balance made up by backradiation from the atmosphere. However, the equation with ε gets into the ball park.

With this approximation, we have
[tex]\frac{dQ}{dT} = 4 \epsilon \sigma T^3 = \frac{4Q}{T}[/tex]​
Thus the change in energy out the top of the atmosphere per unit temperature is 4Q/T = 4*240/288 ≈ 3.3 W/m2 per K, or equivalently a temperature response of about 0.3 K per W/m2. This is called the no-feedback response, or Planck response, of the planet.

It is the same temperature response you would have for a grey body with a constant emissivity.

Water, as you may know, is the major greenhouse gas on Earth... but you can't increase humidity by just emitting water into the atmosphere. The lifetime of water in the atmosphere is very short, and so the humidity is determined largely by temperature.

Here's a really simple thought experiment.

Suppose that adding a greenhouse gas gives a forcing, and the non-feedback temperature response is given by 0.3 degrees per unit forcing, as estimated above. But suppose also that any rise in temperature gives additional humidity, with a greenhouse forcing of 2 W/m2 per degree. This is feedback.

What happens to temperature ΔT when we add a forcing F? The total additional forcing is going to be 2 ΔT + F. The equation becomes
[tex]\begin{array}{lll}
& \Delta T & = 0.3(2 \Delta T + F) = 0.6 \Delta T + 0.3 F \\
\Rightarrow & \Delta T & = 0.75 F
\end{array}[/tex]​


Cheers -- sylas

Thanks for continuing to indulge me (because I do want to figure this out).

In your formulae above, can we just substitute the different Q's and T's between the surface and the tropopause emission level?

Surface Q = 390 W/m2
Surface T = 288K

Tropopause Emission Q = 240 W/m2
Tropopause Emission T = 255K


4 Q/T Surface = 4*390/288 = 5.4W/m2/K = 0.18K/W/m2

I can certainly see that an increased greenhouse forcing can be doubled through increased humidity, but that then only results in 0.36K/W/m2.

---

If one just sticks to the tropopause emission level and assumes all the 2XCO2 forcing estimates are calculated from that perspective (and are not meant to apply to the surface):

4 Q/T = 4*240/255 = 3.76 W/m2/K = 0.265K/W/m2 or 0.53K/W/m2 (assuming doubled humidity forcing).

Do we have to make the assumption then that the surface will respond according to the Planck response - that the Wet Adiabatic Lapse Rate will remain stable and the surface will warm at the same rate as the tropopause emission level? I think the Stefan-Boltzmann equations imply that the lapse rate will in fact change by some amount.
 
  • #94
Bill Illis said:
In your formulae above, can we just substitute the different Q's and T's between the surface and the tropopause emission level?

Surface Q = 390 W/m2
Surface T = 288K

Tropopause Emission Q = 240 W/m2
Tropopause Emission T = 255K

4 Q/T Surface = 4*390/288 = 5.4W/m2/K = 0.18K/W/m2

This is correct, but it is not the change in radiation into space. At the surface you have a lot more than merely the radiant energy flows to consider. There's also a thermal radiant flux downwards from the atmosphere (backradiation), and convection moves a significant amount of additional energy up from the surface into the atmosphere. A recent estimate of all the various energy flows is in this diagram:
[Figure: KiehlTrenberth2009-EnergyFlows.jpg]

Figure 1 from Trenberth et al. (2009) http://ams.allenpress.com/archive/1520-0477/90/3/pdf/i1520-0477-90-3-311.pdf , in Bulletin of the AMS, Vol 90, pp 311-323.

Indulge me... I'll try a crude back of the envelope estimate.

Simplifying those flows, and balancing up a bit to look at an equilibrium state with no heating of the ocean, suppose we have the following averaged flows between space and the planet:
  • 342 from space to the Earth (SW).
  • 102 reflected back (SW) for an albedo of 0.298
  • 240 thermal emission out the top of the atmosphere (LW)
Flows at the surface, excluding the backradiation:
  • 162 solar absorbed at the surface (SW)
  • 390 thermal up from the surface (LW)
  • 97 by convection up from the surface (including latent heat of evaporation)
Assuming an equilibrium energy balance, we can derive from these numbers:
  • 78 solar absorbed in the atmosphere (SW)
  • 325 = 97+78+390-240 thermal backradiation from the atmosphere down to the surface.

Sticking with integers, consider what happens when you get a forcing from an enhanced greenhouse effect of 4 W/m2, but with no feedbacks. This is a little more than what you get with doubling CO2. All the shortwave fluxes remain unchanged. The convection should be about the same as well. What happens is that the thermal fluxes at the surface increase.

Now, to a first approximation, we can simply say that the forcing means 4 W/m2 less escaping out the top of the atmosphere, effectively delivered to the surface, until the equilibrium is restored. In the no-feedback response, the surface and the atmosphere all heat up until you get balance again.

The lapse rate remains the same, given no feedbacks, and the top and the bottom of the atmosphere will heat up by about equal amounts. The greenhouse effect basically works because the atmosphere radiates into space from the upper regions where it is cold, thus reducing the efficiency of Earth as a thermal radiator.

Assuming that all the radiation into space is from the atmosphere (which isn't really true; there's about 40 W/m2 getting through the infrared window when cloud cover permits), we have presumed that the thermal radiation out of the atmosphere dropped from 240 to 236.

Using Stefan-Boltzmann Q = σT^4 to estimate the effective radiating temperatures:
  • 240 is an effective radiating temperature of about 255.
  • 236 is an effective radiating temperature of about 254.
Now the backradiation out the bottom of the atmosphere is 325, with an effective radiating temperature of about 275. This is cooler than the surface, of course; we can say that we are getting thermal radiation mostly from the lower troposphere.

Raise that by a degree, and you get about 330, so the surface needs to heat up to emit about 395, which has an effective temperature of about 289; not quite another degree.

The numbers here are not precise. The key point is that a forcing at the top of the atmosphere results in a larger forcing at the bottom. Your calculation of 0.18 K per W/m2 is about right. Combined with 0.3 K per W/m2 as the non-feedback response to the top of the atmosphere forcing, you can derive about 5 W/m2 at the surface boundary for every 3 W/m2 at the top of the atmosphere. That makes sense given the higher temperatures at the surface.
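Here is the same bookkeeping as a short script, if that helps; it uses only the rounded flows above, so it inherits the same imprecision:

[code]
# The back-of-envelope above, transcribed literally (rounded flows, no feedbacks).
SIGMA = 5.67e-8

def t_eff(flux):
    # Effective radiating temperature for a given flux
    return (flux / SIGMA) ** 0.25

print(t_eff(240.0), t_eff(236.0))   # ~255 K and ~254 K effective radiating temperatures

back = 325.0                         # backradiation derived from the rounded flows above
print(t_eff(back))                   # ~275 K: thermal radiation mostly from the lower troposphere

back_plus = SIGMA * (t_eff(back) + 1.0) ** 4   # warm that layer by a degree: ~330 W/m^2
surface = 390.0 + (back_plus - back)           # the surface now has to emit ~395 W/m^2
print(t_eff(surface))                          # ~289 K: not quite another degree
[/code]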

Do we have to make the assumption then that the surface will respond according to the Planck response - that the Wet Adiabatic Lapse Rate will remain stable and the surface will warm at the same rate as the tropopause emission level? I think the Stefan-Boltzmann equations imply that the lapse rate will in fact change by some amount.

The lapse rate is not based on radiation transfers, so Stefan-Boltzmann is not used. Rather, the lapse rate largely depends on the adiabatic movement of air. The major feedback effect is that with increased humidity, the lapse rate falls, as the moist adiabat is less than the dry adiabat. This works as a negative feedback, since it effectively raises the temperature of the upper atmosphere by comparison with the surface, making the Earth a more effective radiator. This moderates the larger positive feedback from the strong greenhouse effect of the additional water in the atmosphere. But I was doing a non-feedback estimate above.

Cheers -- sylas

PS. Let me repeat my suggestion from [post=2498277]msg #32[/post]. Get hold of a free copy of "Principles of Planetary Climate". Given how much you like calculating things for yourself, this book will be right up your alley!
 
Last edited by a moderator:
  • #95
I think the general consensus by the scientific community is:
The Earth's average temperature is rising.

However, it is not so clear (conclusive) why (i.e. man-made, natural causes, etc., or combinations).

Sound about right?
 
  • #96
Thanks a bunch sylas.

I think I'm starting to see now.

I've gone through parts of Pierrehumbert's book before but it is a heavy read. I will come back when I've gone through it all this time.
 
  • #97
Winzer said:
I think the general consensus by the scientific community is:
The Earth's average temperature is rising.

However, it is not so clear (conclusive) why (i.e. man-made, natural causes, etc., or combinations).

Sound about right?

Yes, although I'd put it more strongly. The rising temperature is a measurement, and the amount of rise is well beyond the measurement uncertainties. This is data.

Full details of all the causes are not known, and the sensitivity of climate to different forcings is known only approximately. But the primary cause is almost certainly anthropogenic, not natural. There are still plenty of details to sort out, but the major warming contribution is anthropogenic, with most of the natural forcings since the mid-twentieth century likely to be a net cooling effect. That is, over recent decades, natural causes seem to have reduced the anthropogenic warming impact a little bit, rather than contributing to the observed warming.

Cheers -- sylas
 
  • #98
Bill Illis said:
I like to download data and check the numbers against the assertions made.

I was finally able to view the graphs; it may have been security on the other system that prevented their display.

Anyhow, I also like to download data and check it out.
Do you know where the data for the time series charts can be found?
Solar irradiance, CO2 and ice dust levels for the last 400,000 years.
 
  • #99
Winzer said:
I think the general consensus by the scientific community is:
The Earth's average temperature is rising.

However, it is not so clear (conclusive) why (i.e. man-made, natural causes, etc., or combinations).

Sound about right?

No; the scientific consensus is that man-made causes have dominated over natural causes for the last 60 years of warming.

If it weren't for all of us humans, the world would be icing up.
 
  • #100
Xnn said:
I was finally able to view the graphs; it may have been security on the other system that prevented their display.

Anyhow, I also like to download data and check it out.
Do you know where the data for the time series charts can be found?
Solar irradiance, CO2 and ice dust levels for the last 400,000 years.

Most of the material I've accumulated comes from the NOAA paleoclimatology website. There is a lot of data there.

http://www.ncdc.noaa.gov/paleo/data.html

If you want to spend some time looking through subdirectories, the FTP site has lots of little surprises.

ftp://ftp.ncdc.noaa.gov/pub/data/paleo/

I've also put up most of the data I'm using in easy-to-use Excel spreadsheets, together with direct links to the data sources used, at this Mediafire link. (A few of the direct links don't work anymore, and there is some newer data from the last month or so, particularly with respect to CO2 in the last 20 million years from Tripati and Pagani, which is not included, but one could just add it to the spreadsheets as newer info becomes available.)

http://www.mediafire.com/?sharekey=cc80321597b8ea7b4c17ca8801618ef794bf8b95c230d06a947708e37b913e74

Newer CO2 data could be added from these two papers.

http://www.seas.harvard.edu/climate/seminars/pdfs/tripati.etal.sci.2009.pdf

http://www.nature.com/ngeo/journal/v3/n1/abs/ngeo724.html

http://www.nature.com/ngeo/journal/v3/n1/extref/ngeo724-s1.pdf
 
Last edited:
  • #101
Xnn said:
If it weren't for all of us humans, the world would be icing up.

I think that is going much too far. The cooling contribution of natural forcings since the middle of the twentieth century has been tiny. If you go back further, natural forcings certainly contributed to the heating in the first part of the century. If you look on the scale of millennia, the Holocene is usually believed to have a long time yet to run, even if humanity were out of the picture entirely.

But we risk topic drift here.

Cheers -- sylas
 
  • #102
There is a relevant paper on the subject of an early Anthropocene.

http://earth.geology.yale.edu/~avf5/teaching/Files_pdf/Ruddiman2003.pdf
 
Last edited by a moderator:
  • #103
Xnn said:
There is a relevant paper on the subject of an early Anthropocene.

http://earth.geology.yale.edu/~avf5/teaching/Files_pdf/Ruddiman2003.pdf

Yes; I am familiar with this notion. The major proponent is William Ruddiman, who is famous for his "early anthropocene", meaning that he believes the human impact on climate began with agriculture, thousands of years ago, and it has prevented the onset of the next ice age.

This is a minority view at present, and the weight of evidence is running against it. The debate goes on, and Ruddiman's stature ensures that it continues to be taken seriously.

Basically, Ruddiman proposed that the Holocene would be coming to an end by now, and a new ice age would have begun, if not for the climatic effects of widespread agriculture.

The alternative view, proposed by Loutre and Berger and others, is that the Holocene should be expected to be a long interglacial due to low eccentricity, and that absent any human impact the next ice age is still some 30 thousand years or more in the future. A similarly long interglacial occurred in stage 11, some 400 thousand years ago, when eccentricity was also low.

Pre-industrial human impact on climate is mostly regional, rather than global, according to other research.

The nice thing about the IPCC reports is that they give a pretty comprehensive survey of the literature on competing ideas like this. Chapter 6, on paleoclimate, is the relevant part. The more common view at present is for a long interglacial, discussed in section 6.4.1.8 (When Will the Current Interglacial End?). Ruddiman's early anthropocene idea, and criticisms, are discussed in section 6.5.1.2 (Why Did Holocene Atmospheric Greenhouse Gas Concentrations Vary Before the Industrial Period?).

The evidence seems to be running against Ruddiman's proposal, though it is still open. The paper you have cited was followed shortly by a "comment" paper by other researchers indicating that Ruddiman's proposal was flawed.

References (starting from the paper cited by Xnn and looking at the subsequent exchange of contrasting views.)
  • Berger, A. and Loutre, M.F. (2002) "An Exceptionally Long Interglacial Ahead?" in Science, v297 (23 Aug 2002), pp 1287-1288.
  • Ruddiman, W.F. (2003) "The Anthropocene Greenhouse Era began Thousands of Years Ago", in Climatic Change v 61 pp 261–293.
  • Claussen, M. et al. (2005) "Did Humankind Prevent a Holocene Glaciation?" (comment on Ruddiman 2003) in Climatic Change v 69, pp 409-417. (The authors' answer is "no".)
  • Ruddiman, W.F. (2007), "The early anthropogenic hypothesis: Challenges and responses", in Rev. Geophys., 45, RG4001, doi:10.1029/2006RG000207. (Acknowledges widespread criticisms and proposes answers to them.)

There are more papers by both Claussen and Ruddiman and their respective colleagues that continue to debate this matter. Ruddiman appears to acknowledge that he has not so far managed to convince the rest of the paleoclimate community, and also that there is still not a conclusive case. By and large, the long interglacial hypothesis seems to have more support, and pre-industrial human impact on climate is not considered to have prevented an ice age.

Cheers -- sylas
 
Last edited by a moderator:
  • #104
Sylas;

Appreciate your feedback and understand that Ruddiman's original hypothesis should not be completely accepted. However, my impression is that it is accepted that there was a pre-industrial human contribution of roughly 10 ppm CO2 and 100 ppb CH4 to the atmosphere. In addition, we know that over the last 5000 years orbital changes have led to a gradual cooling of the Arctic that is expected to continue for several thousand years. So, absent human activities, we could have expected an expansion of glacial coverage in the northern hemisphere. This doesn't mean that there should have been a rapid expansion of ice conditions, but rather a continuation of what is known as the Little Ice Age. That is, there would have been a gradual icing of the earth, especially in the northern hemisphere.

Unfortunately, I haven't been able to locate copies of the papers criticizing Ruddiman's work to study. However, I notice that an integrated analysis of solar insolation such as that performed by Huybers (figure 2E) appears to distinguish between recent solar forcing and that of 420 Kyrs ago.

http://www.sciencemag.org/cgi/reprint/313/5786/508.pdf
 
Last edited:
  • #105
sylas said: "Engineering is, of course, rather different to science. "Validation" is a normal part of working for designed or engineered constructions; but it does not have quite so central a position in empirical science.

For example... what would it mean to "validate" a climate model? We already know that they are not complete and only give a partial picture of climate. To quote a common phrase: climate models are always wrong but often useful.
"

Sorry, I can't accept that. If you want to say that it is against my religion, fine. But in this case my religion is what I learned from professors and colleagues that it takes to do statistics right. The difference can be summed up in two books: Lies, Damned Lies and Statistics by Michael Wheeler, and The Visual Display of Quantitative Information by Edward Tufte. Go read both books, then go look at the original hockey stick again and decide which author describes it...

Terry Oldberg said: "According to Gray, he urged description by the IPCC of how the IPCC's models could be statistically validated. He says that the IPCC blew him off on this issue and implies that the IPCC's models are neither validated nor susceptible to validation. If Gray is correct, then the IPCC's models are not "scientific" models under Karl Popper's criterion of falsifiability."

Amen Terry! For me this sums up perfectly why I count myself an anthropogenic global warming skeptic even though I think there are good reasons for reducing CO2 emissions.

Although much of my work has been in the area of programming languages and algorithms, I have a MS in Statistics. I recently made a post on a completely different topic in which I said: "Beware of EDA (exploratory data analysis) without validation. There are times when you must split the sample available to you in half, one part for EDA, the other for validation. Anything else is junk science, or at least junk statistical analysis." (Since I am quoting myself here, I felt I could make some minor edits without adding brackets. ;-)

So the problems I have with climate science as a whole are the incestuous sharing and editing of data, which makes anything like an independent test almost impossible, and the attitude that even when falsified, climate models should be kept, the parameters juggled, new data added, etc. No one should be surprised that climate models have no predictive power, given the culture in which they have been created.

Will there be a real climate science at some point in the future? Probably. But I can't see it evolving from the current climate experts. They are not real scientists, although they do claim that on TV.

Am I being too harsh? I don't think I am being harsh enough. As a statistician who finds EDA to be fun, I often ask for the data behind papers in in areas such as cosmology, superconductivity, and the solar neutrino problem to mention a few. (I have also asked for such data in fields where I am known, but I am ignoring that.) In every area except climate science, the only problem I have is that the scientists are so glad to have someone interested who is a statistician and programmer who can help with validation, that my validation (or falsification) is not independent.* But in climate science the story is much different.

Yes, I have looked at the data which is now publicly available, and I am regularly shocked at how poor it is. What use is data from sites where measurements were only taken when it wasn't too cold, snowing, or raining? Or where normal and unavailable are represented by the same symbol (0). Climate researchers have often further processed these data to substitute proxies where original data is missing, but now any validation is impossible. At best, attempts to duplicate results will end up either accepting the researcher's assumptions, or with a much different data set. This is the "one tree in Siberia" problem.** If you have access to the original data you can run tests to determine the possible degree of measurement error and the amount of (unbiased) random error present. But when the only data available have been merged, outliers removed, and otherwise "cleaned up," you have to either accept it on religious grounds or reject it as unverifiable.

The other way, of course, to test models is to use them for prediction. Yes, I have seen models which did predict colder temperatures in 2008 and 2009--but they are based on sunspots, and cosmic rays. They are definitely not part of any IPCC consensus.

Finally, Bill Illis has linked to some (decent) NOAA data showing CO2 levels over 6000 ppm, or 20 times current levels millions of years ago. The simple application of the Stefan-Boltzmann law would call for about 16 degrees C of radiative forcing, which was clearly not the case. I certainly understand why the simple thermodynamics doesn't work. There are windows in the CO2 absorption spectrum which remain open no matter how many doublings of CO2 partial pressures occur. It doesn't take much data analysis either to realize that the answer is that water vapor and clouds have a complex response to temperature. But the shouting down by "climate scientists" of weathermen who use statistics to develop and validate complex models of how water vapor actually works is pretty shameful. (They probably watch the predictions of those models every night on TV to see what to wear in the morning, while shouting the models down during the day.)
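For reference, here is what the standard simplified CO2 forcing expression 5.35 ln(C/C0) gives for those concentrations (a different route from a direct Stefan-Boltzmann estimate, so take the numbers as a comparison only):

[code]
# Forcing from 6000 ppm CO2 relative to 280 ppm, using the common
# simplified expression 5.35*ln(C/C0); purely for comparison.
import math

c, c0 = 6000.0, 280.0
doublings = math.log2(c / c0)          # ~4.4 doublings
forcing = 5.35 * math.log(c / c0)      # ~16 W/m^2
print(doublings, forcing, 3.0 * doublings)   # ~4.4, ~16 W/m^2, ~13 K at 3 K per doubling
[/code]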

I'd love to share some of those weather models with you, but there is an entirely different problem involved. The weather data itself is available from NOAA to anybody, in more detail than most desktop computers can handle. But there is a tremendous competition, part of it commercial, part of it military, to come up with better long-term weather prediction models. (And when long-term is long enough, it becomes climate.) Most of this work is done within a few miles of Bedford, MA, either at commercial weather forecasting companies, or by government agencies headquartered at Hanscom AFB.

I worked at MITRE Bedford for over a decade, and as I said at the beginning, my work involvement in these projects usually involved such things as programming languages support, distributed systems development, or (complexity) analysis of algorithms. So the data and models were not mine to distribute. However, the in-house development effort spent approximately three times as much on validating models as on creating new ones. When the models were turned over to the Air Force, the additional validation costs were huge. The additional work created by running multiple models side-by-side for a year or so is appalling. Back then it was basically one model run per supercomputer, with multiple runs of the same model necessary every day. Part of the pain is that the data used in the run are usually twelve to twenty-four hours stale by the time the run finishes--and it often took two supercomputers to produce one result every twelve hours. If you read the forecasts from the National Hurricane Center you will find that today (well, this year) they usually run four different models against the data from each hurricane every few hours.

The big prediction problem, incidentally, is still the rain/snow line in three dimensions. It is easy to see how snow vs. rain on the ground results in a different albedo once the storm is over, but the same line is very important in heat transport within summer thunderstorms. And validation there is just a bit harder. I've seen pictures of aircraft that came back with (fortunately only slightly) bent wings. And with lots of dents from flying into hail.


*I remember one case where I suggested non-parametric statistics as a way to tease a signal out of the background. The Wilcoxon rank sum test pulled out a very clear and statistically significant result--which after a few more days of data turned out to be from the heating system in the building. Sigh! At least I helped him track down what would have been an embarrassing systematic error, even if it did mean he had to start data collection all over again.

** Yes, I know that the tree ring data from Siberia was not based on data from just one tree. The problem is that the processed data wipes out the normal random variations that can be used to test for significance.
 
