Figure 1. Original found here: https://tamino.wordpress.com/2015/12/11/ted-cruz-just-plain-wrong/
A good month ago, the perennially unsavoury character calling himself Tamino once again tried to hold up the spotty “global” network of radiosondes (weather balloons) as somehow a better gauge of the progression and trend of tropospheric temperature anomalies over the last 37 years than the satellites, by virtue of being essentially – as he would glibly put it – “thermometers in the sky”.
So his simple take on the glaring “drift” between current surface records and the satellites over the last 10-12 years is this: The surface records are right and the satellites are wrong. Why? Because the surface records agree with the radiosondes while the satellites don’t! The radiosondes implicitly – in his world – representing “Troposphere Truth”.
And so, when your starting premise is that the radiosondes = thermometers in the sky = troposphere truth, any “drift” observed between them and the satellites (as in Fig. 1 above) will – by default – be interpreted as a problem with the latter.
To repeat Tamino’s fairly simplistic reasoning, then, in the form of some sort of logical-sounding argument: Surface and satellites don’t agree. Radiosondes and satellites don’t agree. But surface and radiosondes do agree. Which means the latter two are right, their agreement robustly verifying the ‘rightness’ of each. (And also, the radiosondes represent “Troposphere Truth”.) Which leaves the satellites out in the cold …
There is, however, a definite issue to be had with this line of argument.
More than fifteen months ago I wrote the post “What of the Pause?”, where I tried to analyse the state of the global climate with a special focus on the interesting developments following the 2011/12 La Niña. I have since discussed that particular time period further here.
I have earlier pointed out the close connection between the SSTa of “NINO3.4” – the central-eastern part of the narrow Pacific equatorial zone – and “global” SSTa over decadal time frames: how the former consistently seems to lead the latter in a tight-knit relationship, firmly constraining the progression of global mean anomalies through time. That progression stays flat (though with much noise) for as long as the NINO3.4 signal remains strong enough to override (and/or control) all other regional signals around the globe – which, most of the time, it does.
I have then proceeded to show how “global warming” (or “global cooling”) only appears to come about at times when the influence of this tight relationship on the global climate is somehow offset by surface processes elsewhere, meaning outside the NINO3.4 region. This obviously doesn’t happen too often, because it would take a very powerful and persistent process to disrupt or even break the sturdy grip of the NINO3.4 region on the leash with which it controls the generally flat progression of global mean temps over time.
In fact, from 1970 to 2013 it evidently happened only three times – which means that the entire “global warming” between those years is contained within these three instances of abrupt extra-NINO surface heating. Before, between and after them, global temp anomalies obediently follow NINO3.4 in a generally (though pretty noisy) horizontal direction, with no intervening gradual upward (or downward) divergence whatsoever.
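The lead-lag relationship described above can be checked with a simple lagged-correlation calculation. Below is a minimal sketch of that kind of test, using synthetic monthly series in place of the real NINO3.4 and global SSTa data (the AR(1) noise model, the 3-month lead, and all parameter values are illustrative assumptions, not results from the actual datasets):

```python
# Sketch of a lead-lag check: correlate a NINO3.4-like index against a
# "global" series at a range of lags and find the lag of peak correlation.
# All data here is synthetic; the assumed lead is built in for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 480  # 40 years of monthly anomalies

# Synthetic NINO3.4-like signal: simple AR(1) red noise (an assumption,
# standing in for the real monthly index).
nino = np.zeros(n)
for t in range(1, n):
    nino[t] = 0.9 * nino[t - 1] + rng.normal(scale=0.3)

lead = 3  # assumed lead of NINO3.4 over the global series, in months
glob = np.roll(nino, lead) + rng.normal(scale=0.1, size=n)
glob[:lead] = 0.0  # discard the wrapped-around values from np.roll

def lag_correlation(x, y, max_lag=12):
    """Correlation of x against y at each lag; positive lag means x leads y."""
    lags = list(range(-max_lag, max_lag + 1))
    corrs = []
    for k in lags:
        if k >= 0:
            c = np.corrcoef(x[: n - k], y[k:])[0, 1]
        else:
            c = np.corrcoef(x[-k:], y[: n + k])[0, 1]
        corrs.append(c)
    return lags, corrs

lags, corrs = lag_correlation(nino, glob)
best = lags[int(np.argmax(corrs))]
print(f"peak correlation at a lag of {best} months (index leads if positive)")
```

Run against the real NINO3.4 and global SSTa series, the lag of peak correlation would indicate how far the equatorial signal leads the global mean, if at all.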
With the year 2015 completed, I felt an update of this NINO3.4-global SSTa relationship was in order. Is there evidence of a new step as of late …?
My answer to this can only be: ‘It is still too early to tell.’ But interesting things have happened – and are indeed still happening – over the last two to three years, since about mid-2013:
Climate (or global atmospheric) reanalyses are an alternative way to assess how the global climate evolves over time – a blend of models and observations. They tend to include a multitude of variables, but I would like to focus on the one specifically pertaining to our recent discussion about GISTEMP vs. HadCRUt3: global temperatures.
There’s a host of different climate reanalyses around; among the most reputable, though, are those conducted by the American NCEP (NOAA) and NCAR, the Japanese JMA, and the European ECMWF.
“A climate reanalysis gives a numerical description of the recent climate, produced by combining models with observations. It contains estimates of atmospheric parameters such as air temperature, pressure and wind at different altitudes, and surface parameters such as rainfall, soil moisture content, and sea-surface temperature. (…)
ECMWF periodically uses its forecast models and data assimilation systems to ‘reanalyse’ archived observations, creating global data sets describing the recent history of the atmosphere, land surface, and oceans. Reanalysis data are used for monitoring climate change, for research and education, and for commercial applications.
Current research in reanalysis at ECMWF focuses on the development of consistent reanalyses of the coupled climate system, including atmosphere, land surface, ocean, sea ice, and the carbon cycle, extending back as far as a century or more. The work involves collection, preparation and assessment of climate observations, ranging from early in-situ surface observations made by meteorological observers to modern high-resolution satellite data sets. Special developments in data assimilation are needed to ensure the best possible temporal consistency of the reanalyses, which can be adversely affected by biases in models and observations, and by the ever-changing observing system.”