The pressing need for ever-upward temperature adjustments … A matter of life or death to the AGW hype.

In July I wrote a blog post pointing out a strange and very conspicuous step change in global mean temps relative to the trended AMO (North Atlantic SSTa), occurring across the 8-year period of 1963-70:


Animation 1.

As you can clearly see, the two curves generally follow each other in remarkable style all the way from 1860 till today, except for the relatively sudden and substantial global upward shift taking place across the last half of the 60s, firmly established by the end of 1970. After this point, the curves are back to tracking each other as impressively as before the shift, only now with the global curve raised 0.25 degrees above the North Atlantic one.
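The offset itself is easy to pull out of data like this. As a minimal sketch (using synthetic stand-in series, not the actual global/AMO data), the step shows up as a shift in the mean difference between the two curves before 1963 versus after 1970:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1860, 2014)

# Synthetic stand-ins for the two curves: a shared natural signal,
# plus a 0.25-degree step added to the "global" series across 1963-70.
shared = np.cumsum(rng.normal(0, 0.02, years.size))           # common variability
step   = np.clip((years - 1963) / (1970 - 1963), 0, 1) * 0.25  # the ramped shift
amo    = shared + rng.normal(0, 0.03, years.size)
glob   = shared + step + rng.normal(0, 0.03, years.size)

# The step appears as a shift in the mean difference between the curves.
diff = glob - amo
before = diff[years < 1963].mean()
after  = diff[years > 1970].mean()
print(f"offset before 1963: {before:+.2f}, offset after 1970: {after:+.2f}")
```

With the step built in at 0.25 degrees, the recovered pre/post offset difference comes out at roughly that value, noise notwithstanding.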

So why this step change? How did it occur? Continue reading

A case to prove a point: The claims of major (ongoing) Antarctic Peninsula warming

“Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.”

H.L. Mencken (1918)

It is chronically advanced by the members and fans of the climate establishment as an ostensibly documented (and hence undeniable) Truth – one of many such ‘Truths’ typically laid down as premises considered facts in argument by the warmists, one of many cornerstones of the ongoing promotional campaign for their ‘CO2 global warming hobgoblin’:

‘The Antarctic Peninsula endures some of the highest warming rates of any region of the world, warming several times (three, at least) as fast as the globe at large. Major events such as the breaking apart of the Larsen A and B ice shelves in 1995 and 2002 respectively are clear indicators of this calamitous warming.’

From wikipedia:

“(…) the Larsen Ice Shelf is a series of three shelves that occupy (or occupied) distinct embayments along the [eastern] coast [of the Antarctic Peninsula]. From north to south, the three segments are called Larsen A (the smallest), Larsen B, and Larsen C (the largest) by researchers who work in the area. The Larsen A ice shelf disintegrated in January 1995. The Larsen B ice shelf disintegrated in February 2002. The Larsen C ice shelf appeared to be stable in 2008, though scientists predict that, if localized warming continues at its current rate, the shelf could disintegrate at some point within the foreseeable future.

The Larsen disintegration events were unusual by past standards. Typically, ice shelves lose mass by iceberg calving and by melting at their upper and lower surfaces. The disintegration events are linked to the ongoing climate warming in the Antarctic Peninsula, about 0.5 °C per decade since the late 1940s, which is a consequence of localized warming of the Antarctic peninsula. This localized warming is caused by anthropogenic global warming, according to some scientists through strengthening of the winds circling the Antarctic.”

(My emphasis.)

Such statements clearly indicate an ongoing warming. Continue reading

How the IPCC turn calculated numbers into heat

‘Climate ScienceTM’ (represented and promoted by the IPCC) has so corrupted ordinary people’s way of thinking, that in order to demonstrate why there is no ‘atmospheric radiative greenhouse effect’ (rGHE), you have to start all the way from scratch. You have to step completely outside the framework of their concocted ‘mental model’ within which they shape their arguments.

‘Climate ScienceTM’ is afflicted with a dual case of monomania, two major fixations that they cannot and will not drop under any circumstances.

The first one is a complete linear trend line mania. They are unable to look at a data time series and not mentally project one onto it. The data – and especially the variation in it – basically doesn’t matter. Only the straight trend line plastered across it, from the one end to the other, does.
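What this fixation amounts to can be sketched numerically (with a purely illustrative series: a large multidecadal oscillation plus a modest rise, not any actual temperature record). A single end-to-end trend line can leave most of the variance in such a series unaccounted for:

```python
import numpy as np

years = np.arange(1900, 2000)
# Illustrative series: a large 60-year oscillation plus a modest linear rise.
series = 0.3 * np.sin(2 * np.pi * (years - 1900) / 60) + 0.005 * (years - 1900)

# The "trend line" treatment: one straight line plastered from end to end.
slope, intercept = np.polyfit(years, series, 1)
fitted = slope * years + intercept

residual_var = np.var(series - fitted)
total_var = np.var(series)
print(f"slope: {slope:.4f} deg/yr; "
      f"variance left unexplained: {residual_var / total_var:.0%}")
```

The straight line duly reports a positive slope, while saying nothing about the oscillation that dominates the series – which is precisely the complaint above.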

The second one, of direct relevance to this post, is their peculiar obsession with radiative flux intensities and their perceived direct correlation with the surface temperature of objects, expressed by the purely radiative Stefan-Boltzmann relationship. They clearly misinterpret and hence stretch the applicability of this law in the real world far beyond its actual justified range of operation, but absolutely refuse to recognise it. They worship (and use) it as sanctified truth.

Basically, they see the world in terms of radiation first and last. Everything in their world is in the end determined and controlled by thermal radiation. When it comes down to it, according to the warmists, you can simply scrape away everything else and just look at instantaneous radiative emission fluxes and directly know surface temperatures. As if we all lived in Max Planck’s conceptually pure radiative universe.

‘Climate ScienceTM’ thinks (or promotes the idea) that the temperature of any object – even real-world objects on Earth – is determined strictly by its radiative energy output (its emission flux), likewise that this final temperature is known and fixed even from the onset of heating, simply by the instantaneous intensity of its radiative energy input (the absorbed flux) minus convective loss (!).

In other words, if you only know the total (added) intensity of the instantaneous radiative energy flux input to the surface of an object, and you are at the same time able to determine its energy loss through convection per unit of time, you will be able to tell its final temperature – no actual thermo-measurement required. Or, turning it around: if you know the temperature of an object, you instantly know the intensity of its radiative energy output, regardless of any simultaneous convective loss of energy.
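The bookkeeping being criticized here can be sketched as a straightforward inversion of the Stefan-Boltzmann law, T = (Q/(εσ))^¼, with whatever flux is left after convection treated as if it alone fixed the temperature (the numbers below are illustrative only, not measured values):

```python
# Sketch of the purely radiative bookkeeping described above: invert the
# Stefan-Boltzmann law, T = (Q_rad / (eps * sigma))**0.25, treating the
# radiative flux left after convection as if it alone fixed the temperature.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def sb_temperature(q_absorbed, q_convective=0.0, emissivity=1.0):
    """Temperature implied by a net radiative emission flux (illustrative)."""
    q_rad = q_absorbed - q_convective
    return (q_rad / (emissivity * SIGMA)) ** 0.25

# Illustrative only: 390 W/m^2 emitted radiatively maps back to ~288 K.
print(round(sb_temperature(390.0), 1))  # prints 288.0
```

On this scheme, subtracting a convective loss simply lowers the implied temperature – which is exactly the kind of purely radiative shortcut the paragraph above objects to.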

(Well, you also need to know its surface emissivity/absorptivity, but according to ‘Climate ScienceTM’ most relevant real-world materials (like soil, rock, water, vegetation) possess emissivities close to unity anyway, and so can be approximated as (convecting) black bodies …!) Continue reading

‘Noise + Trend’?

Judith Curry just recently asked the following question in her blog post “The 50-50 argument”:

“So, how to sort this out and do a more realistic job of detecting climate change and (…) attributing it to natural variability versus anthropogenic forcing? Observationally based methods and simple models have been underutilized in this regard.”

There is a very simple way of doing this that people at large still seem to be absolutely blind to. To echo the words of ‘Statistician to the Stars!’ William M. Briggs: “Just look at the data!” You have to do it in detail. Both temporally and spatially. I have done this already here, here and here + a summary of the first three here. In this post I plan to highlight even more clearly the difference between what an anthropogenic (‘CO2 forcing’) signal would and should look like and a signal pointing to natural processes.

Curry has many sensible points. She says among other things:

“Because historical records aren’t long enough and paleo reconstructions are not reliable, the climate models ‘detect’ AGW by comparing natural forcing simulations with anthropogenically forced simulations. When the spectra of the variability of the unforced simulations is compared with the observed spectra of variability, the AR4 simulations show insufficient variability at 40-100 yrs, whereas AR5 simulations show reasonable variability. The IPCC then regards the divergence between unforced and anthropogenically forced simulations after ~1980 as the heart of the their detection and attribution argument. (…)

The glaring flaw in their logic is this. If you are trying to attribute warming over a short period, e.g. since 1980, detection requires that you explicitly consider the phasing of multidecadal natural internal variability during that period (e.g. AMO, PDO), not just the spectra over a long time period. Attribution arguments of late 20th century warming have failed to pass the detection threshold which requires accounting for the phasing of the AMO and PDO. It is typically argued that these oscillations go up and down, in net they are a wash. Maybe, but they are NOT a wash when you are considering a period of the order, or shorter than, the multidecadal time scales associated with these oscillations.

Further, in the presence of multidecadal oscillations with a nominal 60-80 yr time scale, convincing attribution requires that you can attribute the variability for more than one 60-80 yr period, preferably back to the mid 19th century. Not being able to address the attribution of change in the early 20th century to my mind precludes any highly confident attribution of change in the late 20th century.”

And Continue reading

Why ‘atmospheric radiative GH warming’ is a chimaera

Science of Doom (SoD) has apparently issued a challenge of some sort to a commenter going by the name of ‘Bryan’. This is how SoD describes Bryan:

“Bryan needs no introduction on this blog, but if we were to introduce him it would be as the fearless champion of Gerlich and Tscheuschner.”

And the challenge appears to be a return to the ‘Steel Greenhouse’, a setup that is meant to convey in the simplest possible way the basic mechanism behind ‘atmospheric radiative greenhouse warming’ of the surface of the Earth.

The challenge is as follows: Continue reading

On Heat, the Laws of Thermodynamics and the Atmospheric Warming Effect

On average, Earth’s solar-heated global surface is warmer than the Moon’s by as much as 90 degrees Celsius! This is in spite of the fact that the mean solar flux – evened out globally and across the diurnal cycle – absorbed by the latter is almost 80% more intense than the one absorbed by the former.

The Earth’s global surface, absorbing on average 165 W/m2 from the Sun, has a mean temperature of ~288K (+15°C).

The Moon’s global surface, absorbing on average 295 W/m2 from the Sun, has a mean temperature of <200K (-75°C).

A pure solar radiative equilibrium for each of the two bodies (according to the Stefan-Boltzmann equation: Q = σT⁴, assuming emissivity (ε) = 1) would provide them with maximum steady-state mean global temps of 232K (-41°C) and 269K (-4°C) respectively.
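These equilibrium figures can be checked directly from the quoted mean absorbed fluxes (a quick sketch, assuming ε = 1 as above):

```python
# Check the pure solar radiative equilibrium temps: T = (Q / sigma)**0.25
# with emissivity = 1, using the mean absorbed fluxes quoted above.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp(q_absorbed):
    """Steady-state temperature for a given absorbed flux, eps = 1."""
    return (q_absorbed / SIGMA) ** 0.25

print(f"Earth: {equilibrium_temp(165):.0f} K, "
      f"Moon: {equilibrium_temp(295):.0f} K")  # prints Earth: 232 K, Moon: 269 K
```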

As you can well gather from this, the Earth’s surface is 56 degrees warmer than its ideal solar radiative equilibrium temperature, while the lunar surface is at least 70 degrees colder than its ideal solar radiative equilibrium temperature. That’s a spread of no less than 126 degrees! On average …

Still, these two celestial bodies are at exactly the same distance from the Sun: 1AU.

So what could possibly account for this astounding difference between such close neighbours?

Very simple: The Earth has an atmosphere. The Moon doesn’t. Continue reading

‘Modern Global Warming’ in three steps – the (fairly) short version

In its Fifth Assessment Report (AR5) of last year, the IPCC stated the following:

“It is extremely likely [95 percent confidence] that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.”

‘More than half.’ That sounds like a pretty conservative guess. Well, they end up going further than that. Much further.

What caused global warming over the last 60 years or so, according to the IPCC? Apparently, human ‘greenhouse gas’ emissions alone (100%):

“The best estimate of the human-induced contribution to warming is similar to the observed warming over this period … The observed warming since 1951 can be attributed to the different natural and anthropogenic drivers and their contributions can now be quantified. Greenhouse gases contributed a global mean surface warming likely to be in the range of 0.5°C to 1.3 °C over the period 1951−2010, with the contributions from other anthropogenic forcings, including the cooling effect of aerosols, likely to be in the range of −0.6°C to 0.1°C.”

That amounts to a net anthropogenic ‘contribution’ to the general global temperature rise between 1951 and 2010 of 0.6 to 0.7°C.
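That net figure follows from simple midpoint arithmetic on the two quoted ranges (my own back-of-envelope check, not a calculation the IPCC spells out):

```python
# Midpoints of the two likely ranges quoted from AR5, combined:
ghg = (0.5 + 1.3) / 2             # greenhouse-gas contribution, deg C
other_anthro = (-0.6 + 0.1) / 2   # other anthropogenic forcings incl. aerosols
print(ghg + other_anthro)         # prints 0.65
```

0.65°C sits squarely in the ‘0.6 to 0.7°C’ bracket, matching the observed warming over the period.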

So, then, what did not contribute at all (0%) to that same general warming, according to the IPCC? Apparently, natural external factors like solar activity, and natural internal factors like ocean cycles:

“The contribution from natural forcings is likely to be in the range of −0.1°C to 0.1°C, and from internal variability is likely to be in the range of −0.1°C to 0.1°C.”

That should make up a total natural contribution to the general global temperature rise between 1951 and 2010 of exactly 0°C. Continue reading