
Blockbuster: Are hot days in Australia mostly due to low rainfall, and electronic thermometers — not CO2?

Blame dry weather and electronic sensors for a lot of Australia’s warming trend…

In this provocative report, retired research scientist Bill Johnston analyzes Australian weather records in a fairly sophisticated and very detailed way, and finds they are “wholly unsuitable” for calculating long term trends. He uses a multi-pronged approach looking at temperatures, historical documents, statistical step changes, and in a novel process studies the way temperature varies with rainfall as well.

His two major findings are that local rainfall (or the lack of it) has a major impact on temperatures in a town, and that the introduction of electronic sensors in the mid 1990s caused an abrupt step increase in maximum temperatures across Australia. There will be a lot more to say about these findings in coming months — the questions they raise are very pointed. Reading between the lines, if Johnston is right, a lot of the advertised record heat across Australia has more to do with equipment changes, homogenisation, and rainfall patterns than with a long-term trend.

Bill Johnston: On Data Quality [PDF]

“Trends are not steps; and temperature changes due to station changes, instruments and processing is not climate change”, he said. “The Bureau are pulling our leg”.

The years when more rain falls are more likely to be years without high maximums. Bill Johnston finds that for every 100mm of rainfall, the maximum temperatures were about a third of a degree cooler.
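For readers who want to see what that sort of fit looks like, here is a minimal sketch in Python (not Johnston’s code and not BoM data; a slope of roughly -0.35 °C per 100 mm is built into the synthetic series, so the print-out simply recovers it):

    # Illustrative only: regress annual Tmax on annual rainfall and report
    # the slope in degrees C per 100 mm. Values are synthetic, not BoM data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    years = np.arange(1960, 2016)
    rain_mm = rng.normal(620, 150, years.size)                          # hypothetical annual rainfall
    tmax_c = 24.0 - 0.0035 * rain_mm + rng.normal(0, 0.4, years.size)   # slope of -0.35 C/100 mm built in

    fit = stats.linregress(rain_mm, tmax_c)
    print(f"slope: {fit.slope * 100:+.2f} C per 100 mm of rainfall")
    print(f"R^2:   {fit.rvalue ** 2:.2f}")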


On the left-hand side, the step-ups in temperature are shown; electronic sensors were introduced in the mid 1990s. The right-hand graphs show how rainfall keeps maximum temperatures cooler. (Data are grouped “a, b, c” between the steps.)

Johnston uses these rainfall correlations as a tool to check the quality of temperature records. When combined with step-change analysis, he finds that unrecorded site moves or station changes are common. When the automatic sensors were introduced, temperatures suddenly jumped up and their relationship with rainfall broke down.
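The flavour of a step-change test can be sketched in a few lines, again on synthetic data and without claiming this is the statistical machinery Johnston uses: scan the series for the single breakpoint that best separates the means, then (in a fuller version) compare the Tmax/rainfall fit on either side of the detected break.

    # Illustrative single-breakpoint search on a synthetic Tmax series.
    # Not Johnston's method; just the general idea of a step-change test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    years = np.arange(1950, 2016)
    tmax = 23.0 + rng.normal(0, 0.4, years.size)
    tmax[years >= 1996] += 0.7                     # inject an artificial step at 1996

    best = None
    for i in range(10, years.size - 10):           # require at least 10 years on each side
        t, p = stats.ttest_ind(tmax[:i], tmax[i:], equal_var=False)
        if best is None or p < best[1]:
            best = (years[i], p, tmax[i:].mean() - tmax[:i].mean())

    year, p, shift = best
    print(f"most likely step: {year}, shift {shift:+.2f} C, p = {p:.3g}")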

“Fleeting parcels of hot air, say from passing traffic or off airport runways, are more likely to be sensed by electronic instruments than by thermometers”, he said. 

Automatic weather stations (AWS) were introduced across Australia’s network within a few years. Because so many stations made the switch around the same time, homogenization procedures don’t detect their bias, and assume a natural step up in warming occurred. Worse, the artificial warm bias is transferred to stations that are not automated, reinforcing trends that don’t exist!
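Why a synchronised change slips through relative homogenisation can be seen with a toy simulation (purely illustrative, not the Bureau’s algorithm): if every neighbour steps up in the same year, the candidate-minus-reference difference series stays flat, so the comparison has nothing to detect.

    # Toy demonstration: a step shared by all stations in the same year leaves
    # the between-station difference series flat, so relative comparisons miss it.
    import numpy as np

    rng = np.random.default_rng(2)
    years = np.arange(1980, 2016)
    n_stations = 5

    temps = 22.0 + rng.normal(0, 0.3, (n_stations, years.size))
    temps[:, years >= 1996] += 0.6                 # the same artificial step at every station

    target, neighbours = temps[0], temps[1:]
    diff = target - neighbours.mean(axis=0)        # "candidate minus reference" series

    pre = diff[years < 1996].mean()
    post = diff[years >= 1996].mean()
    print(f"difference-series shift across 1996: {post - pre:+.3f} C (shared step goes undetected)")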

“Homogenisation is nonsense, and an open public inquiry into the Bureau’s activities is overdue”, he said.

The Bureau must be audited. Stations should not be homogenized until they are analyzed individually. And the analysis should start with site inspections and a detailed historical account of what is known about each site.

“The Bureau has scant knowledge about many important sites”, Bill said, “and some of what they claim cannot be trusted”.

Temperature is strongly related to local rainfall

Hot years are dry years, and wet years are not hot. It’s tritely obvious, yet kinda profound.  Johnston finds that from half to three quarters of the temperature variation is caused by changes in rainfall. When the land is bone dry, it heats up fast. But when soil moisture is high, the Sun has to evaporate the water first in order to heat the soil. The atmosphere above dry land has lower humidity and will rise and fall in temperature swings that are far larger than the atmosphere above moist landscapes. It varies somewhat with every town, but the relationship is consistent across the continent.

He also found that using rainfall to analyze temperature records reduces variation, while revealing aberrations in the data. What do we make of a town where the effect of rainfall on temperatures has a linear relationship for decades then suddenly changes to a random pattern? Homogenization can produce trends that don’t belong to the site.

New electronic sensors cause an artificial jump

Not only do electronic sensors pick up shorter spikes in temperature, they are also not linear instruments like mercury thermometers. “AWS don’t measure temperature linearly like thermometers do, which causes them to spike on warm days. This biases the record.”

There was also a wave of undocumented site and station changes around the time the Bureau took over weather observations from the RAAF in the 1950s. To mention a few, affected locations included Alice Springs, Norfolk Island, Amberley, Broome, Mt. Gambier, Wagga Wagga and Laverton.

Disturbingly, he finds that after many site and equipment changes have been accounted for, and the variability due to rainfall has been ruled out, no temperature trend remains.

Below Johnston discusses Cowra and Wagga in detail. He observed the weather at Wagga Wagga Research Centre, visited many other sites and discussed issues with the staff. He has worked with climate data, calibrated commercial automatic weather stations and used climate data in many of his peer-reviewed studies.

Bill is pushing for an open public inquiry into the Bureau’s methods, its handling of data, and biases in climate records.

Bill Johnston points out that local rainfall is useful for checking temperature data

Ignoring heat storage in the landscape, which is cyclical, the local energy balance partitions heat loss between evaporation (which is cooling when the local environment is moist) and sensible heat transfer to the atmosphere (advection) during the day and radiation at night (which is warming when the environment is dry). Thus the longer it’s dry, the hotter it gets.

Rainfall is (or should be) linearly related to temperature, especially Tmax, but independent of it.

If the local heat-load changes, forcing a significant base-level shift, the relationship is still linear but is offset by the impact of the change.
If data become grossly disturbed or dislocated from the site (if, for example, they are fabricated or implanted from somewhere else), variation around the relationship increases, the data become random with respect to each other, and statistical linearity is lost. (Rainfall seasonality may also impact on this, but is not considered here.)

So linearity is expected, and we need a statistic that indicates how good (or bad) the relationship is.
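One common choice for such a statistic is the adjusted R² of the straight-line fit (the quantity reported as R²adj in the figure captions). A minimal sketch with invented numbers:

    # Illustrative adjusted R-squared for a straight-line fit of Tmax on rainfall.
    # Synthetic numbers only.
    import numpy as np

    rng = np.random.default_rng(3)
    rain = rng.normal(600, 140, 50)
    tmax = 24.0 - 0.0035 * rain + rng.normal(0, 0.5, 50)

    slope, intercept = np.polyfit(rain, tmax, 1)
    resid = tmax - (slope * rain + intercept)

    n, k = tmax.size, 1                            # observations, predictors
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((tmax - tmax.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    print(f"R^2 = {r2:.3f}, adjusted R^2 = {r2_adj:.3f}")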

Cowra

...

Figure 2. Cowra’s annual average Tmax and Tmin increase with time (dotted line). Tmax, weighted by within-year variance to improve precision, increases by 0.4 °C/decade (R²adj = 0.332); Tmin by 0.2 °C/decade (R²adj = 0.296). Relationships seem highly significant (P < 0.001). The Tmax trend includes step-changes in 1979 (+0.63 °C; P = 0.02) and 1997 (+1.04 °C; P < 0.001); the Tmin trend, step-changes in 1973 (+0.69 °C; P < 0.001) and 1998 (+0.29 °C; P = 0.04). Mainly due to missingness, the highlighted Tmax data may be faulty. On the right, Tmax declines as rainfall increases: overall, including faulty data, by -0.35 °C/100 mm rainfall (R²adj = 0.484); excluding faulty data, by -0.42 °C/100 mm (R²adj = 0.580); and factored by step-changes (excluding faulty data), by -0.30 °C/100 mm (R²adj = 0.779). The Tmax lines are statistically parallel. Tmin varies randomly with rainfall, so there is no association.

Local rainfall at Cowra explains half of the temperature changes, but when suspect data and step changes are accounted for, rainfall explains over three quarters of the temperature changes:

The main points are that local rainfall naïvely explains 48.4% of overall Tmax variation; 58.0% when four suspect years are ignored; and 77.9% when data step-changes are also accounted for.
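That progression can be imitated with an ordinary analysis of covariance: regress Tmax on rainfall alone, then add the step-change segments as a categorical factor (parallel lines, different intercepts) and watch the adjusted R² rise. The sketch below uses synthetic data, so the printed numbers will not reproduce the Cowra figures:

    # Illustrative ANCOVA-style fit: rainfall alone, then rainfall plus the
    # step-change segment as a factor. Synthetic data, not the Cowra record.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    years = np.arange(1960, 2016)
    rain = rng.normal(620, 150, years.size)
    group = np.where(years < 1979, "a", np.where(years < 1997, "b", "c"))
    offset = np.where(group == "a", 0.0, np.where(group == "b", 0.6, 1.6))
    tmax = 24.0 - 0.0035 * rain + offset + rng.normal(0, 0.4, years.size)

    df = pd.DataFrame({"tmax": tmax, "rain": rain, "group": group})
    naive = smf.ols("tmax ~ rain", data=df).fit()
    stepped = smf.ols("tmax ~ rain + C(group)", data=df).fit()
    print(f"rainfall only:       adj R^2 = {naive.rsquared_adj:.3f}")
    print(f"rainfall + segments: adj R^2 = {stepped.rsquared_adj:.3f}")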

What happened to the warming?

“We shouldn’t think naively of temperature changing in time. What we should look at are factors like rainfall and site changes that impact on measurements; account for their effects, then check for trend,” Bill said.

“Importantly, with step-changes and rainfall accounted-for, there is no residual time-trend.”
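That idea, account for the known effects and then check for trend, can be sketched as a residual check: fit rainfall plus the step-change segments, then regress the residuals on year and see whether any slope is left. Synthetic data again, so it only shows the mechanics:

    # Illustrative residual-trend check: after fitting rainfall and a step segment,
    # regress the residuals on year. Synthetic data only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy import stats

    rng = np.random.default_rng(5)
    years = np.arange(1960, 2016)
    rain = rng.normal(620, 150, years.size)
    group = np.where(years < 1997, "pre", "aws")
    tmax = (24.0 - 0.0035 * rain + np.where(group == "aws", 0.8, 0.0)
            + rng.normal(0, 0.4, years.size))

    df = pd.DataFrame({"tmax": tmax, "rain": rain, "group": group})
    fit = smf.ols("tmax ~ rain + C(group)", data=df).fit()

    trend = stats.linregress(years, fit.resid)
    print(f"residual trend: {trend.slope * 10:+.3f} C/decade, p = {trend.pvalue:.2f}")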

Bill explains: “Wagga Wagga and Cowra get more rain in summer than Rutherglen does. Summer rain cools maximum and minimum temperatures together. Cloud and fog reduce the number of frosty days in wet winters, however. Minimum temperatures, which are sensed around dawn each morning, are not as cold as when the sky is clear and the air is dry.”

“Taken overall, summer cooling and winter warming cancel each other out, which is why Tmin is not so sensitive to annual rainfall”, he said.

“Rutherglen receives a higher proportion of its annual rainfall in winter in wet years, which results in a much clearer Tmin response”.

Wagga Wagga

The effect of rain on temperatures in Wagga Wagga

In Figure 4, below, more rain at Wagga means cooler maximums (top right) but slightly warmer minimums.


Figure 4. Composite analysis of Wagga Wagga Research Centre Tmax and Tmin. Vertical dotted lines indicate metrication in 1972 and the relocation in 1977. Missing data between 1956 and 1962, in 1968 and 1969, and after 1996 are problematic. However, only a few are outliers in the rainfall domain (right; red squares). Ignoring them made no difference to trends in the time- or rainfall-domains and had little impact on R²adj.

 

Rutherglen

This graph shows Rutherglen raw data. The step up in 1996 is almost certainly the introduction of the automatic weather station system, which resulted in a whopping three-quarters of a degree rise in maximum temperature.

The blue line across the top indicates the number of days with data (so there is a big missing slab around 1960).

Bill has compared several brands of commercial automatic weather stations with thermometer data measured in Stevenson screens in the past.

“Upper-range temperatures are mainly affected,” he said. “Data for Rutherglen and elsewhere show they over-range or ‘spike’ on warm days. This affects averages, and shows up as a step-change.”

The red squares mark years where temperatures look suspect because they don’t match the rainfall or they have many missing days (say in summer or winter).

 


Figure 5. Step changes in Rutherglen’s raw data are indicative of site moves and other data problems. Highlighted data are detected as faulty. Transition to the automatic weather station in 1996 (vertical line) caused an abrupt Tmax increase of 0.75 °C relative to previous thermometer data.

Minima are warmed by rain in winter, but cooled in summer

At Rutherglen there is a clear relationship between rainfall and minimum temperature. The record before 1923 is a mixture of Post Office and local data. Some data may also be in-filled between 1924 and 1996. Data from 1997 are from the AWS (and the step is clearly visible).

The faulty data show up as red squares —  they don’t fit with local rainfall.


Figure 6. Tmin increases with rainfall at Rutherglen and other southern temperate/Mediterranean sites because wet years are cloudy, foggy winter days are more frequent, and rainfall is more winter-dominant than at Wagga Wagga and Cowra. Groups are defined by Tmin step-changes; their group-means are different; regression lines are parallel; some outliers (red squares) are due to missing data, others may be faulty. (Median rainfall (566 mm) is indicated by the vertical dotted line.)


Mysterious outliers and the trouble with homogenization

To show how widespread and insidious the problems are, Johnston compares two towns in very different and distant locations. Rutherglen is part of the wine-growing region of north-east Victoria, whereas Ceduna is a coastal spot near the bone-dry Nullarbor Plain of the Great Australian Bight.

See Figure 7, which graphs the temperature and rainfall patterns after the AWS systems were installed at both locations.

Notice two things:

  1. Firstly, the long-term average rainfall and temperature of both locations are plotted as dotted lines, straight across and straight up. These represent the data from the years before the AWS was installed. The data from the electronic AWS system form a trend line that does not pass through the point where those former averages intersect.
  2. The same years at both places (labelled 2007, 2009, 2013 and 2014) are well out-of-range relative to the rainfall for those years. At Rutherglen rainfall was near-average in 2013 and 2014; at Ceduna it was above average. (A rough check of this kind is sketched just after this list.)
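A rough version of that out-of-range check can be sketched as follows (illustrative only; the data are synthetic and the flagged years are generated, not the real Rutherglen or Ceduna results): fit Tmax against rainfall on the pre-AWS years, then flag later years that sit far above what their rainfall predicts.

    # Illustrative out-of-range check: fit Tmax vs rainfall on pre-AWS years,
    # then flag later years whose Tmax sits well above the prediction.
    # Synthetic data; the flagged years are not real BoM results.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    years = np.arange(1965, 2016)
    rain = rng.normal(600, 130, years.size)
    tmax = 22.0 - 0.003 * rain + rng.normal(0, 0.35, years.size)
    tmax[years >= 2007] += 0.9                     # inject a high bias into the later years

    pre = years < 1997
    fit = stats.linregress(rain[pre], tmax[pre])
    pred = fit.intercept + fit.slope * rain
    resid_sd = np.std(tmax[pre] - pred[pre], ddof=2)

    flagged = years[(~pre) & (tmax - pred > 2 * resid_sd)]
    print("years more than 2 sigma above the pre-AWS relationship:", list(flagged))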

“It’s inconceivable that temperatures would be so out-of-range for those years; notably it’s those years that the Bureau and CSIRO have relentlessly marketed in support of various political goals”, Bill said.

“Ceduna was not specifically chosen as a comparator to Rutherglen. It just happened to be about as far away as, say, Alice Springs and Bourke, two stations whose data were used to homogenise each other”.

Why are so many years such outliers, and almost all to the upside? (And the wind whispers… audit the BOM. Audit the BOM…)


Figure 7. Relationship between temperature and rainfall for Rutherglen and, as a contrast, Ceduna’s AWS (post 1996). A non-parametric LOcally WEighted Scatterplot Smoothing (LOWESS) curve (smoothing parameter 0.8), fitted with bootstrapped 95% confidence intervals, tracks curvature. Dotted lines are pre-AWS average Tmax (21.7 °C and 23.5 °C) and each site’s median rainfall (566 mm and 282 mm). Indicated out-of-range values are suspiciously high relative to the distribution of other data in those years.
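For anyone wanting to reproduce that kind of curve on their own data, a LOWESS fit with a bootstrapped 95% band can be put together roughly as follows (synthetic Tmax/rainfall pairs; only the smoothing fraction of 0.8 and the 95% level are taken from the caption):

    # Illustrative LOWESS curve with a bootstrapped 95% band.
    # Synthetic Tmax/rainfall pairs, not Rutherglen or Ceduna data.
    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    rng = np.random.default_rng(7)
    rain = rng.uniform(250, 900, 60)
    tmax = 24.0 - 0.004 * rain + rng.normal(0, 0.5, 60)

    grid = np.linspace(rain.min(), rain.max(), 50)
    fits = []
    for _ in range(500):                                  # bootstrap resamples
        idx = rng.integers(0, rain.size, rain.size)
        sm = lowess(tmax[idx], rain[idx], frac=0.8)       # sorted (x, fitted) pairs
        fits.append(np.interp(grid, sm[:, 0], sm[:, 1]))

    fits = np.array(fits)
    lo, hi = np.percentile(fits, [2.5, 97.5], axis=0)
    centre = np.interp(grid, *lowess(tmax, rain, frac=0.8).T)
    print("rain   fit    95% band")
    for x, c, a, b in zip(grid[::10], centre[::10], lo[::10], hi[::10]):
        print(f"{x:5.0f}  {c:5.2f}  [{a:.2f}, {b:.2f}]")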

Johnston points out that homogenizing data can mean spreading artificial changes into good records:

Like Wagga Wagga airport (72150) and Bathurst Agricultural Institute (63005), Rutherglen is an homogenised ACORN-SAT (Australian Climate Observations Reference Network – Surface Air Temperature) site used to calculate Australia’s warming.

Without accounting for local rainfall, Cowra’s daily data with its faux-trend is used to homogenise Bathurst, Dubbo, Wyalong and Canberra. Wagga Wagga Research’s un-trending dataset is used to homogenise Wagga Wagga airport, Cabramurra, Kerang and Rutherglen.

Also without adjusting for rainfall, Wagga Wagga airport and Rutherglen’s ACORN data are used to homogenise other ACORN sites, including some that are quite distant. Much potential exists in the process for homogenisation to implant problems into other data, especially data that are not adjusted for rainfall and are homogenised at daily time-steps.

“It makes no sense that if local rainfall explains most local temperature variation, it is not accounted for somehow in the homogenisation process”, Bill said.


Norfolk Island

At Norfolk Island, below, wet years are warmer than dry ones. The electronic AWS readings are also biased-high like at Rutherglen. 

On the left-hand side the step up is obvious. On the right-hand side the blue triangles represent the AWS data and its relationship with rainfall. Obviously the AWS is reading artificially high temperatures compared to the rainfall.

An important point is that the site was moved to a more exposed location after the Bureau took it over from the Royal New Zealand Air Force in 1948. Together with AWS over-ranging, this results in an artificial Tmin trend. There are many other examples of undocumented site moves that the Bureau seem unaware of and which also cause artificial trends. 



Figure 8. Like Rutherglen, Norfolk Island’s automatic weather station over-reports upper-range temperatures (Tmax +0.17 °C, P = 0.07; Tmin +0.38 °C, P < 0.001). This causes the instrument to be biased-high relative to the thermometer record, which ended in 1996. There is a Tmin step-change, likely caused by an undocumented site move after the site was taken over by the Bureau from the Royal New Zealand Air Force in 1948 (+0.78 °C, P < 0.01), and a Tmax step-change in 1970 (+0.17 °C, P = 0.03); both are undocumented. Grey circles represent the first data group; red squares, the intermediate group; blue triangles are AWS data. Lines with unique subscripts (a, b, c) are statistically dissimilar. (Lines a and b are uniquely dissimilar in Tmax; all lines are dissimilar in Tmin.) Data are raw averages and have not been screened for outliers.

Discussion

The dire prognosis from Johnston:

  • It’s hard to find sites where AWS instruments are not biased-high, as they are at Rutherglen (Figure 5), Ceduna (Figure 7), and Norfolk Island (Figure 8).
  • In addition to sites already mentioned, AWS bias occurs in data for Sydney airport, Sydney Observatory, Cape Leeuwin, Hay, Melbourne Regional Office, Geraldton airport (WA), Deniliquin, Mildura, Ballarat airport, Wilsons Promontory, Cape Otway, Bourke, Montague Island, Loxton, Gabo Island and Trangie (ag), to mention some that are analysed.
  • Data for ‘record-hot’ years at many places is highly unusual, and defies explanation (Figure 7).
  • Relationships between local temperature (especially Tmax) and local rainfall are also useful for detecting incongruent data. For instance, Tmax data from 1981 to 1993 at Bridgetown are random with respect to rainfall, and thus probably imported from somewhere else. (Some sites, like Cape Otway, can be diagnosed using derived metadata, such as counts of high and low extremes (Figure 10).)
...

Figure 10. Cape Otway’s AWS markedly over-reports upper-range temperatures. Data are log10 ratios of counts of upper to lower Tmax extremes. Vertical dashed lines indicate other inhomogeneities. The AWS was installed in 1994 and replaced in 1995. Similar problems are evident across the Bureau’s network.
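A metric of that general kind is easy to compute from daily data: count the days above an upper threshold and below a lower threshold in each year and take log10 of the ratio. The thresholds and data below are placeholders, not the ones used for Figure 10:

    # Illustrative "extreme-count ratio": for each year, count daily Tmax values
    # above the 95th and below the 5th percentile of the whole record, and take
    # log10 of the ratio. Thresholds and data are placeholders, not Johnston's.
    import numpy as np

    rng = np.random.default_rng(8)
    years = np.repeat(np.arange(1990, 2000), 365)
    daily = 22.0 + 6.0 * rng.standard_normal(years.size)
    daily[years >= 1995] += 1.0                    # an over-ranging instrument, say

    hi_thr, lo_thr = np.percentile(daily, [95, 5])
    for yr in np.unique(years):
        d = daily[years == yr]
        hi = np.count_nonzero(d > hi_thr)
        lo = np.count_nonzero(d < lo_thr)
        ratio = np.log10((hi + 1) / (lo + 1))      # +1 avoids division by zero
        print(yr, f"{ratio:+.2f}")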

Conclusions

Temperature data in Australia are not up to the task of tracking climate warming. Most datasets have multiple problems that are not smudged away by homogenisation.

A confounding issue is that AWS bias infects raw data, homogenised data, and comparisons between networks (ACORN vs. AWAP). The bias is therefore undetected by station comparisons.

There is evidence also that data for manually observed stations (Gunnedah, Moruya PS, and Kerang for instance) are adjusted by the Bureau to agree with the AWS network.

Bias is transferred; it reinforces trends, extremes, and trends in extremes that are due to the instrument, possibly the politics, but demonstrably not the climate.

“Unfortunately, the Minister responsible (Greg Hunt), my local Member (Dr Andrew Leigh, ALP) and the Bureau are uninterested in the issues raised here,” Bill concluded.
