The officially published global temperature records all converge on a total temperature rise since the late 19th century of about 0.9 (0.8–1.0) degrees Celsius.
But to what extent can we be confident that this is how the ‘global average surface temperature’ (GAST) anomaly actually evolved over this time frame?
The truth is: We can’t. At all.
This is fundamentally a matter of data coverage, but – just as importantly – it is also a matter of methodology. How do you make up for a paucity of data? How do you properly compile, weight and interpolate data into a reliable “global average” when that data – the actual observational information you have collected and thus have at your disposal – provides nothing like full global coverage? And how do you keep this “global average” of yours consistent over time when your data coverage (both in total and in spatial distribution) changes vastly over that same time frame? What basic assumptions will you have to rely on? Because, make no mistake, an interpretive undertaking such as this will crucially have to rest on a foundation of some rather sweeping presuppositions.
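To make the weighting and coverage problem concrete, here is a minimal sketch in Python. It is purely illustrative and not the method of any actual temperature product: it assumes a hypothetical coarse latitude-longitude grid of anomaly values with missing cells, and computes a cosine-of-latitude weighted mean over only the cells that have data. The grid values, band latitudes and function name are all invented for the example.

```python
import math

# Hypothetical 4x8 lat-lon grid of temperature anomalies (deg C).
# None marks cells with no station data. Real products use far finer
# grids and elaborate infilling; this only shows why area weighting
# and uneven coverage matter.
lat_centers = [-67.5, -22.5, 22.5, 67.5]  # one latitude band per row
grid = [
    [None, None, 0.4, None, None, 0.6, None, None],  # sparse southern band
    [0.8,  0.7,  None, 0.9, 0.8,  None, 0.7,  0.9],
    [1.0,  1.1,  0.9,  1.0, None, 1.1,  1.0,  0.9],
    [1.5,  None, 1.6,  1.4, 1.5,  1.6,  None, 1.5],  # well-sampled Arctic band
]

def weighted_global_mean(grid, lat_centers):
    """Cosine-of-latitude weighted mean over cells that have data."""
    num = den = 0.0
    for row, lat in zip(grid, lat_centers):
        w = math.cos(math.radians(lat))  # cell area shrinks toward the poles
        for value in row:
            if value is not None:
                num += w * value
                den += w
    return num / den

def naive_mean(grid):
    """Unweighted mean of whatever cells happen to have data."""
    values = [v for row in grid for v in row if v is not None]
    return sum(values) / len(values)

print(round(weighted_global_mean(grid, lat_centers), 3))
print(round(naive_mean(grid), 3))
```

With this toy grid the naive mean comes out noticeably warmer than the area-weighted one, because the densely sampled high-latitude band (small in area but warm in anomaly) dominates a simple average. The same arithmetic also shows how adding or losing stations in one band shifts the “global” number even if no temperature actually changed.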