The influence of the zombie thermometers

(Image: zombie thermometer)

When I saw the response of the NCDC press office to the questions raised by Tony Heller (aka Steven Goddard), Paul Homewood and Anthony Watts about the reliability of the NCDC temperature network, it was not exactly what I was expecting:

Our algorithm is working as designed.

As far as I could understand the issue, it has to do with how their program (which calculates the average US temperature) works when temperature data is missing. Tony Heller claimed that 40% of the data is “fabricated”, meaning not coming from measurements. When a measurement is missing for a certain station, the program estimates it by looking at the neighboring stations and fills in the gap with that estimate. So far so good, but something went horribly wrong: the program called that routine even when the underlying raw data was complete and, more mind-boggling, even for stations that had been closed for many years, so estimates were generated for them anyway.
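To make that more concrete, here is a minimal sketch (purely my own illustration in Python, not NCDC's actual code) of how such infilling from neighboring stations is commonly done: a distance-weighted average of what the nearby stations reported. The station distances and temperatures are made up.

    # Hypothetical sketch of neighbor-based infilling, not NCDC's algorithm.
    def infill_from_neighbors(neighbors):
        """Estimate a missing reading from (distance_km, temperature_C) pairs
        using inverse-distance weighting."""
        weights = [1.0 / d for d, _ in neighbors]
        total = sum(w * t for w, (_, t) in zip(weights, neighbors))
        return total / sum(weights)

    # Made-up example: three neighbors at 10, 25 and 40 km
    neighbors = [(10.0, 21.4), (25.0, 22.0), (40.0, 20.8)]
    print(round(infill_from_neighbors(neighbors), 2))  # weighted toward the closest station

Nothing wrong with that in itself; the estimate is dominated by the closest stations, which is exactly the intent. The trouble starts when the routine is called where it shouldn't be, or when the neighbors themselves are unreliable.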

Yet they claim that their algorithm is working … as designed.

Not a bug, a feature. Nothing to see here, move along.

Although this infilling is perfectly fine (mathematically speaking that is) in a reliable network, there is an issue with it in a system with many discontinuities.

Like surface weather station data.

Only five years ago I got drawn into the global warming issue when visiting the Surface Stations website. From that moment on I realized that the temperature measurement network was not really in good shape. According to the current data, more than 90% of the stations have siting issues and report temperatures with an error larger than 1 °C. It came as a surprise that there were many, many issues like heat-absorbing asphalt, stones, nearby buildings, air conditioners/external heat sources and whatnot. All of these influence temperature readings upwards.

So in this case, if more than 90 percent of the stations really have siting issues and there is infilling from neighboring stations, how reliable would that infilled data be?
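To illustrate with the same made-up sketch as above: if the neighbors used for infilling each read, say, 1 °C too warm because of siting issues, the infilled value inherits that bias in full. The numbers below are hypothetical.

    def infill_from_neighbors(neighbors):
        # Same hypothetical inverse-distance sketch as above.
        weights = [1.0 / d for d, _ in neighbors]
        return sum(w * t for w, (_, t) in zip(weights, neighbors)) / sum(weights)

    true_neighbors   = [(10.0, 21.4), (25.0, 22.0), (40.0, 20.8)]  # what the neighbors "should" read
    biased_neighbors = [(d, t + 1.0) for d, t in true_neighbors]   # each neighbor reads +1 °C too warm

    print(round(infill_from_neighbors(biased_neighbors)
                - infill_from_neighbors(true_neighbors), 2))  # -> 1.0: the bias passes straight through

A weighted average of biased readings is just as biased as its inputs; the infilling can't correct for a problem that is common to the whole neighborhood.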

Does NOAA/NCDC fabricate data, as has been insinuated? Well, it depends on the definition of “fabricate”. If it means willfully altering the data for a specific goal, then I don’t think that is the case. But if it means creating data where there was no data before, then yes, I think they are fabricating data, and chances are high that the adjustment will be upwards.

Fine, but just don’t call it high quality data anymore…
