
Australian temperature records shoddy, inaccurate, unreliable. Surprise!

The BOM say their temperature records are high quality. An independent audit team has just produced a report showing that as many as 85–95% of all Australian sites in the pre-Celsius era (before 1972) did not comply with the BOM’s own stipulations. The audit shows 20–30% of all the measurements back then were rounded or possibly truncated. Even modern electronic equipment was at times so faulty and unmonitored that one station’s readings were rounded to whole degrees for nearly 10 years! These sloppy errors may have created an artificial warming trend. The BOM are issuing pronouncements of trends to two decimal places, like the “0.52 °C above average” in the BOM’s Annual Climate Summary 2011, yet relying on patchy data that did not meet their own compliance standards around half the time. It’s doubtful they can justify one decimal place, let alone two.

We need a professional audit.

 

A team of independent engineers, scientists, statisticians and data analysts (brought together by the joannenova blog) has been going through the Australian Bureau of Meteorology (BOM) records. They’ve audited some 8.5 million daily observations across 237 High Quality and other nearby sites in Australia. Shockingly, while the BOM calls its database “High Quality” and instructed observers before 1972 to record in tenths of a degree Fahrenheit, the auditors started finding sites with long stretches of records where the weather suspiciously rose and fell only in whole Fahrenheit degrees, like 72.0, 73.0, 72.0, 71.0, 73.0, 72.0. After 1972, the BOM went metric, and oddly, so did parts of the Australian climate: numerous sites started warming and cooling in pure Celsius integers.
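For the technically minded, here is a minimal sketch (in Python, and not the audit team’s actual software) of how such suspicious stretches can be flagged automatically. The 30-day default cut-off is purely an illustrative assumption:

```python
def whole_degree_runs(temps, min_run=30):
    """Yield (start_index, run_length) for every run of consecutive
    readings with no decimal part. min_run=30 is an illustrative cut-off."""
    run_start = None
    for i, t in enumerate(temps):
        if float(t).is_integer():
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_run:
                yield run_start, i - run_start
            run_start = None
    if run_start is not None and len(temps) - run_start >= min_run:
        yield run_start, len(temps) - run_start

# Example: six consecutive whole-degree maxima inside an otherwise tenths record
series = [22.4, 23.0, 24.0, 22.0, 23.0, 25.0, 24.0, 23.1]
for start, length in whole_degree_runs(series, min_run=5):
    print(f"whole-degree run of {length} days starting at index {start}")
```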

The bottom line:

  1. The BOM records need a thorough independent audit.
  2. A significant part of the 20th Century Australian warming trend may have come from something as banal as sloppy observers truncating records in Fahrenheit prior to 1972.
  3. Many High Quality sites are not high quality and ought to be deleted from the trends.
  4. Even current electronic equipment is faulty, and the BOM is not checking its own records.
  5. Even climate scientists admit that truncation of Fahrenheit temperatures would cause an artificial warming effect.

The Audit Team identifies a suspicious problem

It was the sharp eye of Chris Gillham who noticed the first long string of continuous whole numbers in a site record. I wondered if it was faulty equipment, and thus whether other sites were affected, so people started looking, and suspicious stretches started turning up everywhere. The audit team were astonished at how common the problem was. Ian Hill and Ed Thurstan developed software to search the mountain of data and discovered that while temperatures ending in .0 ought to have been 10% of all the measurements, some 20–30% of the entire BOM database was recorded as whole numbers, or “.0”.
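If observers really recorded to the nearest tenth, each decimal fraction should turn up about 10% of the time. Here is a minimal sketch of that frequency count (an assumption-laden stand-in, not a copy of Ian and Ed’s software):

```python
from collections import Counter

def tenths_distribution(temps):
    """Return the share of readings ending in each tenth, .0 through .9.
    round() guards against float noise, e.g. 23.4 stored as 23.39999..."""
    tenths = [round(t * 10) % 10 for t in temps]
    counts = Counter(tenths)
    return {d: counts[d] / len(temps) for d in range(10)}

# With honest tenths, every share should hover near 0.10; a .0 share of
# 0.20-0.30 is the signature of rounded (or truncated) observations.
```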

Ken Stewart has the whole in-depth report at his site: “Near Enough For a Sheep Station”

 

Fahrenheit era observations (prior to 1972) have even higher proportions of .0 than those of the Celsius era. Note the strange peaks at .1 and .9: those are likely due to converting raw Fahrenheit to Celsius and back.

Celsius era observations (1972–2011) have obvious peaks at .0 and .5. All decimal fractions ought to be equally represented.

One thing we can say for sure is that observers (who were often unpaid) didn’t know how much stock would be placed on their records years after they were taken. Cold mornings tended to strip their enthusiasm, possibly explaining why minima records are less accurate than maxima.

Only 15% of sites were compliant with the BOM stipulations for Fahrenheit maxima, and only about 5% were compliant for Fahrenheit minima. That’s 85% and 95% non-compliant! Around half of all sites were so bad that three out of ten records were rounded or truncated to whole degrees.

With Celsius records, about 50% of sites for maxima and 65% of sites for minima have months, and sometimes years, of consecutive daily temperatures presented in the BOM’s raw data feed as rounded integers, along with a high proportion of .5 °C readings. There is evidence that in some instances this rounded raw data, rather than precise data, is used to calculate the BOM’s HQ daily temperature series.
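The report scores each station against a Compliance Index; its exact formula isn’t reproduced here, but a crude stand-in shows the idea: flag any site whose share of .0 readings strays far above the expected 10%. The 15% cut-off below is our illustrative assumption, not the report’s criterion:

```python
def is_compliant(temps, max_zero_share=0.15):
    """Crude site-level check: too many .0 readings implies rounding.
    max_zero_share=0.15 is an illustrative threshold, not the report's."""
    zero_share = sum(round(t * 10) % 10 == 0 for t in temps) / len(temps)
    return zero_share <= max_zero_share
```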

Converting Fahrenheit to Celsius adds another error…

Ken points out: “The occurrence of large percentages of .1 and .9 fractions in the Fahrenheit listings suggests this is an artifact of the original conversion from F to C (rounded to one decimal place), then the conversion back from C to F (again rounded), and suggests that the percentage of .0 is in reality much more. Below is a sample of Fahrenheit to Celsius to Fahrenheit conversions.”

For example, converting 67.0 °F gives 19.4 °C, which converts back to 66.9 °F (not 67.0) if the result is rounded to one decimal place at each step.

The oddly high number of .1 and .9 records probably comes from converting Fahrenheit to Celsius and back, which means the rounding in the Fahrenheit era was probably even worse than the raw counts suggest.
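The round trip is easy to reproduce. A minimal sketch, rounding to one decimal place at each step exactly as described above:

```python
def f_to_c(f):
    return round((f - 32) * 5 / 9, 1)  # rounded to one decimal, as archived

def c_to_f(c):
    return round(c * 9 / 5 + 32, 1)

for f in [66.0, 67.0, 68.0, 69.0]:
    print(f"{f} F -> {f_to_c(f)} C -> {c_to_f(f_to_c(f))} F")

# 67.0 F -> 19.4 C -> 66.9 F  and  69.0 F -> 20.6 C -> 69.1 F:
# whole-degree Fahrenheit readings re-emerge with .9 and .1 fractions,
# which is exactly where the spurious peaks appear in the histograms.
```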

Even climate scientists admit truncation and conversions could be a problem

A seminal 1996 climate study into historic Australian temperature records by Simon Torok et al. acknowledged that truncation of Fahrenheit temperatures would have caused artificial warming in the early 1970s:

“Some observers, prior to the change to metric units, recorded temperatures in whole degrees Fahrenheit, instead of recording to the nearest tenth of a degree, as specified in directions to observers. If many observers truncated their measurements to the nearest whole degree below the actual measurement, prior to metrication, and after metrication recorded to tenths of a degree, this would result in an artificial warming in the early 1970s. Discontinuities caused by such a practice, if it was wide-spread, would not be detected by the statistical programs used here. Examination of field books does not suggest this practice was sufficiently common …”

Reference: Torok, S.J. and Nicholls, N. (1996). A historical temperature dataset for Australia. Aust. Met. Mag. 45: 251-260.

Torok and Nicholls did not think the practice was common, but this audit shows that almost all sites were non-compliant during some part of the pre-1972 era, and the number of .1 and .9 records at the worst sites suggests those sites were recorded almost 100% in whole degrees.

The worst sites in Australia

For those who are interested, the worst state in Australia by far was NSW, but the Gold Medal for the Worst Celsius-era Site in Australia goes to Katanning WA, where some 75% of all readings were rounded to whole degrees.

The all-time Most Preposterous Station Award goes to Tamworth Air AWS. It is supposed to be a High Quality record, yet from 1996 to 2006 there were 3,485 consecutive days of whole-degree Celsius records. The temperature was recorded electronically by an Automatic Weather Station and transferred directly into the HQ record, and seemingly no one noticed for nearly ten years.

Pre-1910 records – neither better nor worse than newer ones

The BOM often tells us to ignore the readings prior to 1910, but when compared with the sloppy state of the century since then, on the issue of whole-degree rounding the early records are just as good. Surely the warm period in the late 1800s is not being conveniently ignored?

Higher bars mean sites with more rounding errors. About 35% of sites were non-compliant from 1880–1910, which is better than the data around the 1960s. The best data comes from 1937–1957. From Ken Stewart’s description: “Another way of showing the relationship between age and quality is by plotting the 20 site running median of the Compliance Index (inverted) with the start-up date of sites.”
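For readers who want to reproduce that kind of curve, a running median is simple to compute once the sites are in order. A minimal sketch, assuming a list of per-site Compliance Index values (as defined in the report) already sorted by site start-up date:

```python
import statistics

def running_median(values, window=20):
    """Median of each consecutive block of `window` values; mirrors the
    report's 20-site running median (plotted inverted against start date)."""
    return [statistics.median(values[i:i + window])
            for i in range(len(values) - window + 1)]
```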

What does it mean?

It all depends on something that may be unknowable: how many whole-number records were truncated, rather than rounded up or down? If truncation was common in the pre-1972 era, the records would have been artificially lower than they should have been. The audit team ran through many scenarios and found that, since more than half of all sites in Australia probably had rounding in more than 50% of their records, truncating at significant rates (say 33%, 50%, or 100%) before September 1972 would cause artificial warming of between +0.1 °C and +0.4 °C per 100 years.
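A back-of-envelope version of that arithmetic is easy to check: truncating uniform tenths drops 0.45 °F on average (0.25 °C), so if a fraction p of pre-1972 readings were truncated, the Fahrenheit era is biased low by about p × 0.25 °C, making the post-1972 record look correspondingly warmer. A minimal Monte Carlo sketch under our illustrative assumptions, not the audit team’s actual scenario runs:

```python
import random

def truncation_bias_c(p_truncated, n=100_000, seed=1):
    """Average cooling (deg C) imposed on Fahrenheit-era records when a
    fraction p of observers truncate to the whole degree below.
    Assumes the true tenths digits are uniformly distributed."""
    random.seed(seed)
    dropped_f = 0.0
    for _ in range(n):
        tenths = random.randint(0, 9) / 10        # true fraction, .0 to .9
        if random.random() < p_truncated:
            dropped_f += tenths                   # truncation discards it
    return (dropped_f / n) * 5 / 9                # convert deg F to deg C

for p in (0.33, 0.5, 1.0):
    print(f"p = {p}: pre-1972 mean biased low by ~{truncation_bias_c(p):.2f} C")
# ~0.08, 0.13 and 0.25 C respectively; a step of that size part-way through
# a century of data inflates the fitted trend, broadly consistent with the
# +0.1 to +0.4 C per 100 years range quoted above.
```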

 

Ken Stewart and the auditing team call for a thorough independent audit of the BOM

Conclusion

This audit of a large sample of daily temperature observations at all sites associated with Australia’s High Quality Temperature Network provides convincing evidence that the record is of very poor quality and is replete with errors. Many HQ sites have recorded large amounts of data in recent years that may be in error by up to 0.5 °C, being rounded to whole degrees, and more than half of the sample studied have recorded erroneous data at some time in the past 40 years. As well, the vast majority of sites used to compile the HQ Annual temperature dataset inaccurately recorded observations in the Fahrenheit era by recording in whole degrees. For nearly half of all sites, this amounts to at least 50% of their total observations. It is probable that more than 50% of all Australian observations were rounded. This alone means that temperatures before 1972 may be inaccurate by up to 0.25 °C. If significant proportions of temperatures were rounded down, this would have the effect of making post-1972 temperatures relatively warmer, increasing warming trends by between 0.1 °C and 0.4 °C. Evidence is presented that this may have been the case. There is also evidence of very poor quality control in compiling the HQ record. The large amount of uncertainty in the records of so many sites means that homogenisation as practised by BOM researchers must be in question, and with it all analyses of Australia’s temperature trends, plus the calibration of past proxy studies.

“Near enough for a sheep station” may have been the understandable attitude of hundreds of poorly trained weather observers in the past. However, it is NOT good enough for a modern scientific organisation such as the Bureau of Meteorology or the CSIRO, especially when climate analyses based on such poor quality historical data inform government policy on climate change.

A thorough audit of the Bureau of Meteorology’s practices is long overdue.

 ——————————————————

BACKGROUND

The driving inspiration behind this audit was Chris Gillham. He has written his interpretation of our findings at http://www.waclimate.net/round/rounded-australia.html

This independent audit of raw daily temperature records at 237 High Quality and nearby stations was conducted over seven weeks from January to March 2012.  It was a collaborative data validation analysis by:

Ken Stewart

Chris Gillham

Ian Hill

Ed Thurstan

Geoff Sherrington.

Others who contributed are Joanne Nova, Warwick Hughes, Lance Pidgeon and Anthony Cox.

For a description of the origins and process of the audit, see http://www.waclimate.net/round/rounded-background.html

None of the participants in this audit has received any financial reimbursement or other assistance from any third party, corporate or political, and all results are accurate and verifiable.

———————————————————–

Ken Stewart has the whole in-depth report at his site: “Near Enough For a Sheep Station”.

The full audit results for all Australian HQ and associated weather stations are within:

The Excel macro applications built to conduct this audit are:

Both macros and the Table of Sites should be run simultaneously. Full instructions are included within the tenth fraction distribution calculator macro in the sheet titled Instructions.

Thank you Ken, Chris, Ian, Ed and Geoff!
