In a previous post, I discussed a graph suggesting that the CO2 and CH4 levels in the atmosphere are unprecedented in the last 800,000 years, and proposed that it is misleading to compare high-resolution data with low-resolution data. After I published that post, I wondered whether I could illustrate this with an example. It should be possible if I had a sufficiently detailed dataset: I could make a detailed graph, see what it looks like, then sample that dataset the same way a proxy record samples it and make a second graph. Comparing both graphs should make the effect clear.
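To sketch the idea, here is a toy example (my own invented numbers, not any real dataset): a high-resolution series containing a short spike, then the same series sampled at the coarse spacing of a typical proxy record. At proxy resolution the spike simply falls between the samples.

```python
# Hypothetical high-resolution record: one value per "year", flat at
# 280 except for a 20-year spike to 400 around year 450.
high_res = [280.0] * 1000
for year in range(450, 470):
    high_res[year] = 400.0

# A proxy-style record: one sample every 100 years.
proxy_like = high_res[::100]

print(max(high_res))    # 400.0 — the spike is visible at full resolution
print(max(proxy_like))  # 280.0 — the spike falls between proxy samples
```

A 20-year excursion is invisible in a record with 100-year spacing, which is exactly why comparing a high-resolution modern record against a low-resolution proxy record can mislead.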
The previous post was about “the most popular contrarian argument” according to skepticalscience (“climate changes before, so current climate change is natural”) and what they seem to consider a live example of such a claim. I proposed in that post that it actually was not a good example of what they wanted to prove.
What I haven’t discussed yet is how skepticalscience “debunked” this most popular contrarian argument. They did this on the “Climate’s changed before” myth page, which was apparently based on this example.
They “debunked” this “myth” by stating that the climate is indeed always changing, but the difference is that it is changing much faster now than in the past because of our increasing emissions. This is how it starts:
In a previous post I explored my misconception that a long-term average global temperature could be derived from sparse weather stations. Imagine my surprise when I read the newspaper the next day and came across a perfect example of what I had been explaining: this year’s November was the warmest ever. The article seemed to be taken from the VTM news of December 17, 2013 (see the screenshot on the right for how it was presented). This was the quote from the news (translated from Dutch, my emphasis):
Last November was the warmest in 134 years worldwide and this basically means it was the warmest November ever measured. Normally the average temperature worldwide in November is 12.9 °C. This year it was 0.78 °C warmer.
The numbers are from NOAA via GHCN-M (monthly mean land temperature) combined with ERSST.v3b (Extended Reconstructed Sea Surface Temperature).
Look closely: it is presented as if this 0.78 °C was somehow accurately measured. That is obviously not the case. It is a statistical analysis under the assumption that these land + ocean measurements represent the real temperature of the Earth. For the public it is tempting to think that they do, or that the calculations solve all sampling problems. At least I used to think so.
But the measurements are only taken in places where people are happy to live. That is called “convenience sampling” and it introduces bias into the measurements. If these biased measurements are fed into the calculation, the result will be a biased global average temperature.
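A small simulation makes the point (all numbers here are my own assumptions for the sketch, not real station data): imagine a planet where half the grid cells are remote and cold, but all the stations sit in the habitable, warmer half. The station average then lands well above the true average.

```python
import random

random.seed(42)

# Hypothetical grid: 5000 habitable cells averaging 15 °C, and 5000
# remote cells (oceans, poles, deserts) averaging 5 °C.
habitable = [random.gauss(15.0, 3.0) for _ in range(5000)]
remote = [random.gauss(5.0, 3.0) for _ in range(5000)]

# True global mean: every cell counts.
true_mean = sum(habitable + remote) / 10000

# Convenience sample: stations only where people live.
station_mean = sum(habitable) / 5000

print(f"true mean:    {true_mean:.1f}")   # close to 10 °C
print(f"station mean: {station_mean:.1f}")  # close to 15 °C — biased warm
```

The bias does not shrink by adding more stations in the same convenient places; that only makes the biased estimate more precise, not more accurate.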
Garbage In, Garbage Out
Earth’s temperature is very complex. There is no place on Earth that keeps the same temperature for long, and even regionally temperatures can differ quite a lot. How could one ever calculate the correct average temperature of the Earth (510 million square kilometers!) with, oh dear, some thousands of weather stations/buoys/drifters, and do so with an accuracy of …gasp… 0.01 °C?!
Moreover, what is this compared with? Measurements before the 1980s were very sparse and hardly existed before the 1940s. Think of sea temperatures (3/4 of the Earth’s surface) being taken from buckets hauled aboard ships that happened to be passing by. I find it hard to believe that calculations with such sparse data yield the same incredibly high accuracy. Can it even be calculated reliably at all?
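As a very rough back-of-envelope check (assuming independent random stations, which a real network is not, and a spread of local temperatures that I am simply guessing at), the standard error of a mean from n samples scales as sigma divided by the square root of n:

```python
import math

sigma = 15.0  # assumed spread of local temperatures, °C (my guess)
for n in (100, 3000, 100000):
    # Standard error of the mean for n independent samples.
    print(n, round(sigma / math.sqrt(n), 3))
# 100    -> 1.5
# 3000   -> 0.274
# 100000 -> 0.047
```

Real analyses work with anomalies and spatial gridding rather than raw station means, so this is only an order-of-magnitude intuition; still, under these simple assumptions even thousands of independent stations leave an uncertainty well above a hundredth of a degree.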