Hiding something? BOM throws out Bourke’s hot historic data, changes long cooling trend to warming

Hello, Soviet-style weather service? On January 3, 1909, an extremely hot 51.7C (125F) was recorded at Bourke. It’s possibly the hottest temperature ever recorded in a Stevenson Screen in Australia, but the BOM has removed it as a clerical error. There are legitimate questions about the accuracy of readings made so long ago, when standards were different. But there are very legitimate questions about the BOM’s treatment of this historic data. The BOM has also removed the 40 years of weather recorded before 1910, which includes some very hot times. Now we find out the handwritten original notes from 62 years of the mid-20th century were supposed to be dumped in 1996 as well. Luckily, these historic documents were saved from the dustbin and quietly kept in private hands instead.

Bourke has one of the longest datasets in Australia, but the BOM, supposedly so concerned about long-term climate trends, appears to have little curiosity about the hot weather of the 1880s and 1890s (I talked about the amazing heatwave of 1896 here, when hundreds died and people in Bourke escaped on special trains). If those had been cool spells, would the BOM feel more inclined to put some effort into analyzing them? All of the 50C-plus temperatures recorded do have a story to tell, yet they lie invisible in news reports of the 21st century.

Perhaps most seriously, the regular BOM press releases on hottest-ever records now rarely give any indication that these earlier hot records existed at all.

Ian Cole lives in Bourke and runs the local radio station. For years his father, Neville, took the meticulous readings every three hours. Ian Cole is very frustrated that he can’t broadcast any of that information to listeners in weather reports: the BOM won’t supply any data from before the year 2000 to the official service provider, Weatherzone. He remarked that “We keep on being told about records that are not actually records and averages that are not quite right”.

Bourke got new automated weather recorders in 1994, with a two-year overlap of the manual and the new equipment.

Bourke raw Maxima trend: was a 1.7C cooling trend, now changed to a slight warming trend

Bourke raw Minima trend: was a 0.53C warming trend, now increased to a 1.64C warming trend

– These trends, calculated by Jennifer Marohasy, have not been disputed by the BOM

Graham Lloyd at The Australian continues to ask the questions the BOM should have been asked for the last ten years.

The modern “trends” do not convey the temperatures that were actually recorded at Bourke.

Where is the respect for historic records?

Weatherman’s records detail heat that ‘didn’t happen’

AS a child, Ian Cole would watch his father Neville take meticulous readings from the Bureau of Meteorology thermometer at the old post office in the western NSW town of Bourke and send the results through by teleprinter.

The temperature was recorded every three hours, including at night when the mercury sometimes plunged to freezing, and the data was logged in handwritten journals that included special notes to help explain the results.

For Mr Cole it is a simple matter of trusting the care and attention of his father. “Why should you change manually created records?” Mr Cole said. “At the moment they (BOM) are saying we have a warming climate but if the old figures are used we have a cooling climate.”

 Thank goodness someone saved the original notes:

The Stevenson Screen went to the dump and, but for fate, the handwritten notes could have gone there too. But without instruction, the records were kept and are now under lock and key, held as physical evidence of what the weather was really doing in the mid-20th century.

Amazingly, Bourke residents are told only of records since the year 2000. (This almost seems too hard to believe. Very Soviet.)

The records are also important in an ongoing row that frustrates Mr Cole. The Bourke cotton farmer may be managing director of the local radio station 2WEB but Mr Cole can only broadcast temperature records that date back to 2000 because the Bureau of Meteorology won’t supply historic records to service provider Weatherzone.

As a result “hottest day on record” doesn’t really mean what it seems. “We keep on being told about records that are not actually records and averages that are not quite right,” Mr Cole said.

Again, it’s the homogenisation practice that changes the trends:

Worse still there are concerns about what has happened to the precision of those handwritten records in the earlier years. Bourke now forms part of a network of weather stations used to make up the national record known as ACORN-SAT. The raw temperature records are “homogenised”, a method BOM says has been peer-reviewed as world’s best practice and is used by equivalent meteorological organisations across the world.

Independent research, the ­results of which have not been disputed by BOM, has shown that, after homogenisation, a 0.53C warming in the minimum temperature trend has been increased to a 1.64C warming trend. A 1.7C cooling trend in the maximum temperature series in the raw data for Bourke has been changed to a slight warming.

The Australian is doing a remarkable job.

The BOM explained Bourke in their reply to Graham Lloyd (discussed a few days ago here).

Bourke: the major adjustments (none of them more than 0.5 degrees Celsius) relate to site moves in 1994 (the instrument was moved from the town to the airport), 1999 (moved within the airport grounds) and 1938 (moved within the town), as well as 1950s inhomogeneities that were detected by neighbour comparisons which, based on station photos before and after, may be related to changes in vegetation (and therefore exposure of the instrument) around the site.

We will be discussing this record more in future, along with the “world’s best practice” of using thermometers hundreds of kilometres away as a reason to change the original data, with rationalisations of what “may” have happened.

Jennifer Marohasy has written this up on her blog. She also talked about old weather records changing at other sites in a 7-minute podcast on Melbourne radio station 3AW yesterday.

No doubt some adjustments were required in Bourke, because there are station moves in the record. But again we find the warming trend is larger because of the adjustments, and Ken Stewart’s work shows that across the whole Australian ACORN dataset the warming adjustments to minima are larger than the cooling adjustments. Changes this large need to be carefully documented and explained.

Australians might just see Climate Change, renewable energy and carbon taxes differently if they also knew how meaningless most of the Headlines of Hottest Ever Weather really are.


117 comments to Hiding something? BOM throws out Bourke’s hot historic data, changes long cooling trend to warming

  • #
    Yonniestone

    Jo, first line, isn’t it a Stevenson Screen?

    A very hot day in 1909 just misses out on the BOM’s 1910 start of history in Australia, bad luck I guess…

    Fixed – thanks – Jo


    • #
      Jennifer Marohasy

      Hey Yonniestone

      A Stevenson screen was installed in August 1908. In time for the heat wave. Then it was written up in the Sydney Morning Herald… http://trove.nla.gov.au/ndp/del/article/15025008

      Sometimes things go right. 🙂


      • #

        The numbers listed for Bourke in the highlighted article of the SMH appear to be for a previous heatwave, in 1896.

        The actual temperature for 1909 is in the text above: 125°F on the 3rd of January, 1909.

        Interestingly, the text for the graphic image says 119.5 for January the 15th, 1896(?), whereas the image “clearly” shows what appears to be 120.5. Similarly, January 2nd was “103” according to the text, but the image shows 108. January 6th was 112.5; and the dates January 9 and January 11th have been “interpreted” as what the reader thought was meant.

        I’ve fixed the temperatures (in text), but not the dates.


        • #
          Lord Jim

          1896…

          January 2 … 108°F = 42.2°C
          January 3 … 102°F = 38.9°C
          January 4 … 105.5°F = 40.8°C
          January 5 … 112.5°F = 44.7°C
          January 6 … 113.5°F = 45.3°C
          January 7 … 118°F = 47.8°C
          January 8 … 117°F = 47.2°C
          January 9 … 108.5°F = 42.5°C
          January 10 … 108°F = 42.2°C
          January 11 … 108.5°F = 42.5°C
          January 12 … 110°F = 43.3°C
          January 13 … 113.5°F = 45.3°C
          January 14 … 115°F = 46.1°C
          January 15 … 120.5°F = 49.2°C
          January 16 … 118°F = 47.8°C
          January 17 … 118°F = 47.8°C
          January 18 … 115.5°F = 46.4°C
          January 19 … 119.5°F = 48.6°C
          January 20 … 119°F = 48.3°C
          January 21 … 114°F = 45.6°C
          January 22 … 118.5°F = 48.1°C
          January 23 … 118°F = 47.8°C
          January 24 … 119°F = 48.3°C
          January 25 … 115.5°F = 46.4°C

          Now /that’s/ what I call a heatwave.


          • #
            Andrew McRae

            We have had to remind ourselves of the proper definition of a heatwave many times. The definition with the most consensus (ugh!) behind it is: a heatwave is 5 days in a row, each with a maximum temperature more than 5 degrees above the long-term average for that location at that time of year.

            From BoM’s Climate Data Online, the long-term average of Tmax for Januaries in Bourke is 36.3°C. In the ten years centred on 1896 it was closer to 37.4°C.

            Taking the data you show above, which appears to be daily Tmax, and computing a running minimum of the previous 5 days’ Tmax gives the statistic that has to clear the threshold. The requirement that consecutive Tmax values stay above 41.3°C (36.3 + 5) is met by all 17 days from 9 January to the 25th.
            Even using the higher short-term average of Tmax, nearly all of that period still qualifies.

            So you see, Lord Jim, it’s not just your own opinion that this was a heatwave, it satisfies the definition of a heatwave used by Europe, Canada, and Kevin Trenberth.
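
            For anyone who wants to replay the arithmetic, here is a minimal sketch in Python. It is illustrative only: the maxima are the converted figures above, and 36.3°C is the long-term January average quoted from Climate Data Online.

            ```python
            # Minimal sketch of the heatwave test described above: a day counts
            # if it and the previous 4 days all have Tmax above the threshold
            # (long-term average + 5 C).

            JAN_1896_TMAX_C = [42.2, 38.9, 40.8, 44.7, 45.3, 47.8, 47.2, 42.5,
                               42.2, 42.5, 43.3, 45.3, 46.1, 49.2, 47.8, 47.8,
                               46.4, 48.6, 48.3, 45.6, 48.1, 47.8, 48.3, 46.4]  # Jan 2-25

            THRESHOLD = 36.3 + 5.0   # long-term January Tmax average plus 5 C

            def heatwave_days(tmax, threshold, run=5):
                """Return days of the month that complete a qualifying 5-day run."""
                days = []
                for i in range(run - 1, len(tmax)):
                    window = tmax[i - run + 1 : i + 1]   # this day and the 4 before it
                    if min(window) > threshold:          # running minimum beats threshold
                        days.append(i + 2)               # series starts on 2 January
                return days

            print(heatwave_days(JAN_1896_TMAX_C, THRESHOLD))
            # -> days 9 through 25: seventeen consecutive qualifying days
            ```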


          • #

            They were probably taken outside of a screen, given that one was only installed in 1908.

            The screen serves (at least) three purposes other than physical protection of the equipment: it prevents direct radiation from heating/cooling the thermometer; it prevents rain from turning your dry-bulb thermometer into a temporary wet bulb; and the vents prevent direct cooling/heating of the bulb by wind, i.e. winds are diffused and the rate of exchange of air inside the screen is limited.

            The standards for screens haven’t been complied with in the long run. Apart from maintaining an appropriate location over suitable ground, removed from screening vegetation, structures, etc., the requisite whitewash was often neglected and in some cases replaced with acrylic paints that have different spectral reflection and insulating properties. Degradation of whitewash would notionally allow the shelter structure itself to heat more in direct sunlight, which would introduce a warming bias. Or a cooling one if it was whitewashed in spring.

            Those aren’t so much a problem in terms of the original design intent of the screens. How could Thomas Stevenson foresee that scientists would be looking for an anthropogenic climate change “signal” through temperature analysis to thousandths of a degree Celsius, when it was only “normal” to report to the nearest half a degree? It was only weather!


          • #
            Robert O

            The old measurements were taken to half a degree Fahrenheit, which is fine. The conversions worry me a little, because you can only go to an accuracy of about a quarter of a degree Celsius, not a tenth of a degree, without changing the original data. And in the context of global warming of less than a degree Celsius for the century, any slight change of data is fairly important, particularly when data from Australia is a significant component of the global figure. It appears that Bourke has been omitted from the stations used, but the same scenario probably occurs for other sites which are used.
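
            To put numbers on that concern, here is a quick Python sketch (illustrative only): a half-degree Fahrenheit graduation is worth about 0.28C, so quoting converted values to a tenth of a degree Celsius implies precision the original reading never had.

            ```python
            # Sketch: what half-a-degree-Fahrenheit resolution means after conversion.

            def f_to_c(f):
                """Exact Fahrenheit-to-Celsius conversion."""
                return (f - 32.0) * 5.0 / 9.0

            # The smallest step the observer could record was 0.5 F, about 0.28 C:
            print(0.5 * 5.0 / 9.0)            # 0.2777...

            # Adjacent readings on the old half-degree scale, quoted to tenths of
            # a degree C, look more precise than the instrument ever was:
            print(round(f_to_c(118.0), 1))    # 47.8
            print(round(f_to_c(118.5), 1))    # 48.1
            ```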


          • #

            Lord Jim
            Your 120.5 F reading for the 15th does not agree with the old newspapers. Most have 118 F for that day.
            E.g.:
            http://trove.nla.gov.au/ndp/del/article/39624723?
            http://trove.nla.gov.au/ndp/del/article/3077495
            http://trove.nla.gov.au/ndp/del/article/20444355?zoomLevel=5
            The newspapers get it all mixed up and contradict each other. I am not sure the BoM has the days right either (they may have the dates out by one). Getting dates out by one day does not matter much until it is homogenised away. Far better to look at very old government publications.


        • #

          Bernd
          I should read all the comments before I post. I saw the change to the 119.5 temp first and got sus about who altered it.
          It helps to zoom right in and compare the text digits to other clear ones.
          It is 119.5, and that is what the BoM has too, but many papers at the time had 118F.
          Here it is zoomed:
          http://trove.nla.gov.au/ndp/del/article/15025008?zoomLevel=7


      • #
        Yonniestone

        Hi Jennifer, I was just pointing out a typo of ‘Stephenson Screen’ in the first line.

        The quip about the BOM using 1910 for starting their recent homogenized temperature datasets should have a /sarc on the end. 🙂


        • #

          No problems Yonnie. I am very literal.


          • #
            Yonniestone

            Jennifer, what’s your opinion on using data from private weather stations, such as the CWOP program? http://en.wikipedia.org/wiki/Citizen_Weather_Observer_Program

            I was surprised at how many private stations there are globally when searching online the other day, and can see arguments both for and against their use.


            • #

              Hey Yonnie

              I’ve no problems with the use of data from private weather stations.

              What we need, whether it be private or public data, is an agreed methodology for dealing with discontinuities that might be associated with a site move or changes at that site.


              • #
                the Griss

                And that methodology must be proven NOT to change the overall trend, or at the very least give a balanced adjustment over the whole of Australia.


              • #
                Andrew McRae

                And that methodology must be proven NOT to change the overall trend, or at the very least give a balanced adjustment over the whole of Australia.

                I thought that reverse-engineering the local adjustments from a desired long-term trend line was the very problem that we were fighting against.

                The moment you agree to use “Global Mean Surface Air Temperature” as a proxy indicator of global change in climate, you already commit to synthesising a global statistic dependent on individual site measurements. To decide the site adjustment based on the global mean it produces is reversing the information flow direction.

                Any site series adjustment has to be justifiable based on the changes local to that site, nothing more. Using nearest neighbours for comparison is just a way of detecting site changes in the absence of better metadata, and nearest has to be near enough, not the 100km+ smoothing nonsense we’ve been seeing.

                When the lab coats make a picture of observed rates of global warming and show it as a heat map, it’s immediately obvious that the heatmap is not a single uniform colour across the world, and different regions are warming at slightly different rates. It may be that the southern region of Australia was warming faster than the northern, so the real world warming may not be balanced over all Australia.

                It may also be that every time a station is moved, it’s because the town had grown around it and it had been moved out to a more isolated spot. The temperatures just after the move should be more authentically ambient than just before the move. That is not a balanced pattern of adjustment because the temps will always be adjusted downwards and if most moves were recent then most adjustments will be to old temperatures not newer ones. It’s unbalanced in timing and direction, yet it would be accurate to what happened.

                So there is no reason to require the site adjustments to be balanced over all of Australia or to not change the trendline. Indeed adjustments to offset UHI must change the overall trendline, as they’re offsetting a population growth trend.

                Maybe the fairest method is to choose whatever adjustment makes the local derivative continuous. I don’t know. The wizards over at Climate Audit should be way ahead of me on that one.
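
                A minimal sketch of that idea in Python/NumPy (purely illustrative, and not ACORN-SAT’s actual method): estimate the step at a documented move from the site’s own data either side of the breakpoint, then shift the earlier segment to remove it. Comparing a full year on each side roughly cancels the seasonal cycle; a real version would compare anomalies.

                ```python
                import numpy as np

                def adjust_for_move(series, move_idx, window=365):
                    """Shift the pre-move segment so the series joins up at a known move.

                    series   : 1-D float array of daily temperatures
                    move_idx : index of the first reading at the new site
                    window   : days compared on each side (a full year damps seasonality)
                    """
                    before = series[max(0, move_idx - window):move_idx]
                    after = series[move_idx:move_idx + window]
                    step = after.mean() - before.mean()   # estimated offset from the move
                    adjusted = series.copy()
                    adjusted[:move_idx] += step           # align the old site with the new
                    return adjusted, step
                ```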


      • #
        CJ

        Jen: what are the ‘peer reviewed’ papers underlying the BOM’s claim for their methods?
        Are these Hansen/GISS methods? Could you post links, or at least the names of the peer-reviewed procedure papers?
        Thanks, CJ


  • #
    Lord Jim

    I would take Neville Cole’s annotated thermometer readings over BOM’s homogenised milk any day.


  • #
    handjive

    That 3AW podcast is a humdinger!

    Quote this zinger from JM’s page:

    “Independent research, the results of which have not been disputed by BOM, has shown that, after homogenisation, a 0.53C warming in the minimum temperature trend has been increased to a 1.64C warming trend. A 1.7C cooling trend in the maximum temperature series in the raw data for Bourke has been changed to a slight warming.”


  • #
    thingadonta

    I remember reading about Australia’s supposed ‘worst ever drought’ from ~2002-2007, but noted that the BOM’s own data showed that the drought from ~1895-1902 was longer and more severe.

    Bourke was caught up in this.

    Bourke may be on the margins of climatic zones (semi-arid, and possibly also sensitive to subtropical influence), and it might be expected not to behave in a linear fashion with regard to climate change, which so obsesses those who work at the BOM.


    • #
      Andrew McRae

      The federal government controls the climate, therefore there were no dangerously hot days before Federation. It’s only logical, dear Citizen. 😉

      That’s an intriguing point about Bourke being on the edge of two climate zones. When I overlay a map of Australia’s Köppen-Geiger climate zones [1.6MB, PDF, Fig 9.] onto a map of towns and borders, I find Bourke is definitely in the middle of an Arid Steppe Hot (“BSh”) climate zone. It is nowhere near any other type of Köppen zone.
      Do these zones change much between one decade and another?
      Are there other ways of assigning climate zones to areas which reach a different conclusion?


      • #

        Are there other ways of assigning climate zones to areas which reach a different conclusion?

        Building Code climate zones map:
        http://www.abcb.gov.au/en/major-initiatives/energy-efficiency/climate-zone-maps.aspx
        There are other systems, e.g. Atkinson (tropical), Trewartha (modified Köppen).


        • #
          Andrew McRae

          Haha! Ahh, that was good for a laugh. The boundary between Zone 3 and Zone 4 is… a straight line… which is also exactly the boundary line between Qld and NSW.
          Oh what fools we skeptics have been! It turns out the climate really does obey political directions!

          Please tell me you posted that link as a joke.

          But actually it is partly my fault for not being specific enough in my question. I should have asked “Are there other scientific ways of assigning climate zones to areas which reach a different conclusion?”

          And the conclusion was the same anyhow, as Bourke still appears comfortably inside their Zone 4 by a distance of 120km or 1 degree of latitude. The Yass valley is smaller than that, yet is shown as a climatic zone distinct from its neighbours.

          Presumably to get a higher resolution on the climatic zones between Qld and NSW there has to be a small armed rebellion south of Bourke to break away from their shire council and form new towns, leaving the area north of Bourke in a smaller local government area of its own. Who said climate science wasn’t exciting?


          • #

            Yep, the building code climate mapping and associated codes had to be negotiated. I was one of the people giving them a hard time up here at latitude 19°S. Trying to explain that heat is the problem and cold is a novelty, e.g. solve the heat, not the cold. (This year has been cool from May to now.) Also the difference between humid tropics and dry tropics, e.g. the dry tropics are still humid, but with less relief from evening rain.
            Climate zone 3 – hot dry summer, warm winter; climate zone 4 – hot dry summer, cool winter. I expect that Bourke winters are cool. Except when they aren’t.


      • #
        thingadonta

        Climate zones are a complicated beast. I suspect there is way more than one way to define them.

        I simply note that central Qld and far northern NSW are in a zone of transition between subtropical and warm temperate influences; both are shaped by regional factors which are very different from each other, so any change in climate might result in larger local, and potentially non-linear, changes than might otherwise be expected. I would have to do more research, but one thing I would look at is rainfall and humidity over time, to see whether these have changed.


  • #
    Ursus Augustus

    This whole matter has moved from some sort of scientological obsession into straight up and down fraud, in my opinion.

    It may well be that the whole thermometer-record-driven CAGW fantasy is not so much an empty-headed fantasy as just a good ol’ boondoggle, like collateralised debt obligations: too easy to exploit once you figured out how easy it was.

    The gullibility/complicity of the MSM is also a large part of this, in that they are like puppies shivering with excitement when you dangle the prospect of DISASTER in front of them. The only thing that will get their attention in the face of potential disaster is evidence of [out and out corruption, major errors and large flaws].

    ——-
    Forgive me for editing. Can commenters be careful with the use of the word fraud? There are clear legal requirements of financial profit and intent. – Jo


    • #
      Ursus Augustus

      Apologies for indiscriminate use of the ‘f’ word without the ‘imo’ qualification, Jo.

      That said, Handjive down at #16 links to a letter to the Oz that puts it rather well referring to scientists ‘selling their souls to maintain funding of their pet projects’. I am not sure of the legal implications of selling one’s soul for some benefit, be it material, reputational or spiritual, [snip. Let’s not go there, though I get your point – Jo]


    • #
      bobl

      I disagree, Jo. There is a common usage of the term, meaning misrepresentation, and a legal term, which requires that someone be harmed; that harm, to my knowledge, does not have to be monetary, but often is. For example, I think (it’s my opinion that) the Climate Council’s scared-scientists campaign is bordering on fraudulent, since many of the opinions they are using to get money are misrepresentations of fact, or could be construed as menaces (give us money or you will fry). It certainly couldn’t meet advertising standards for truth.

      In publishing, the term is usually taken to be the first form: a common-usage term meaning a misrepresentation.


      • #
        cohenite

        Misrepresentation is merely one element of fraud; a plaintiff must prove 5 elements of fraud and that is fairly difficult.

        What is more straightforward is litigating for alleged defamation, which Jo does not want to be the target of. Calling the alarmists wrong and saying why is probably preferable.


        • #
          Richard C (NZ)

          >”a plaintiff must prove 5 elements of fraud and that is fairly difficult.”

          (1) a false statement of a material fact,
          (2) knowledge on the part of the defendant that the statement is untrue,
          (3) intent on the part of the defendant to deceive the alleged victim,
          (4) justifiable reliance by the alleged victim on the statement, and
          (5) injury to the alleged victim as a result.

          http://legal-dictionary.thefreedictionary.com/fraud

          Interesting that “A statement of belief is not a statement of fact and thus is not fraudulent.”

          In the case of temperature records: 1) and 2) to date, maybe; 3), 4) and 5) impossible to date, I think. But as time goes on (and on) into the future?

          NZCSC got nowhere (went backwards) simply asking for “a Declaratory Judgement in respect of temperature records published by the National Institute of Water and Atmospheric Research (NIWA)”. Although I think they missed their chance by pulling in too much trivia (“unnecessarily prolix” – Judge). They should have focussed on their R&S93 series.

          That case had to be won before even thinking about other legal avenues – it failed dismally.

          I’ve been tracking the US legal developments in the ‘USA’ thread at CCG, e.g. the recent Opinion delivered in

          Southeastern Legal Foundation v. Environmental Protection Agency

          http://www.climateconversation.wordshine.co.nz/open-threads/climate/regions/usa/#comment-858342

          Not much joy for the SLF. Included in the evidence was this (which appears to have been ignored, as was the NZCSET’s 7SS in NZCSET v. NIWA):

          Petition (Scientist’s brief)
          I. The Conclusion That EPA Drew From Its Three Lines Of Evidence Is Demonstrably Invalid
          A. First Line Of Evidence: EPA’s GHG Fingerprint (Or Hot Spot) Theory
          B. Second Line Of Evidence: The Purported Unusual Rise In GAST
          C. Third Line Of Evidence: Climate Models
          II. Serious Deficiencies In EPA’s Process Contributed To Its Scientific Errors

          Again, if that can’t be won then a “f***d” claim is out of the question.

          >”alleged defamation”

          Yes, best to drop statements that include the word “f***d” in connection with the entity in question. Flippant or otherwise.

          >”Calling the alarmists wrong and saying why”

          Yes, exactly, but as above, for how long can that go on before the threshold is crossed and some form of legal transgression is apparent?

          I’m wondering whether that form would be negligence or similar rather than “f***d”:

          http://en.wikipedia.org/wiki/Negligence

          Tony Cox, are you there?


          • #
            bobl

            So, this would be legal then…

            I believe that you are Desmond Tutu’s long-lost heir, and I believe that I have 25 billion dollars to deposit into your account if you give me $100,000 first.

            If you send me $100,000 then I can keep it, because I only believed you were the heir and I just believed there was 25 billion dollars to give you, right?


            • #

              No, because just using the word “belief” is not evidence that you believe the statement. Your intention is clearly fraudulent (which would be backed up with other evidence in this sort of scam) and can be shown through various avenues to be not a statement of belief.


  • #
    the Griss

    Thing is, if you “homogenise” a spurious warming trend into Bourke, Alice Springs, and whatever other remote points they use out there….

    You only need to change 4 or 5 points and you are basically changing more than half the country !!

    If you follow Ken’s site, you will see that MANY sites to the west of the ranges in QLD, NSW and Vic have been “altered” in the same way as Rutherglen, Amberley, etc.

    So all up they have “fudged” some 70-80% (or maybe significantly more) of the whole area of Australia !


    • #
      the Griss

      ps.. iirc, Alice Springs carries some 10% of Australia, just by itself.

      I’m pretty sure BOM have “homogenised” Alice Springs quite thoroughly, although I have no idea against what!


      • #
        TdeF

        So if Australia is 25% of the world temperature record and Alice Springs is 10% of Australia, the Alice is 2.5% of the world temperature record. Homogenised? Probably nuked. You can see why such locations would have been targets. What is surprising from all this is that the old records, critical to the Global Warming story, are deleted or just ignored and, if possible, destroyed. In a digital age, this is bordering on criminal.


        • #
          the Griss

          I don’t think Australia gives 25% of the world temperature record, does it ?


        • #
          the Griss

          I thought Australia was a bit over 5% of the world land area.


          • #
            TdeF

            I had speculated as much in my post.

            50% of the world is below the equator, but most of the world’s land is above it, especially if you leave out Antarctica; most of the Southern Hemisphere is water. If you leave out the tropics and Antarctica, the continent of Australia dominates both the land and the measurements.
            As a consequence the Southern Hemisphere is much colder than the Northern; the Antarctic is 20C colder. Here in Melbourne we are at the latitude of Libya, but apart from the lack of snow it is a cooler place. This is probably the real reason we have an ozone hole here too, as 80% of the people live north of the equator, so the CFC explanation does not make sense. It is also why there was the search for the “Great Southern Land” to balance the planet.

            Anyway, ROM replied to my speculation and I repeat the post here, but have not verified what was said. I would not be surprised. This alone would explain international demands to edit our data to fit the narrative. Maybe the BOM simply obliged?

            ROM. August 26th
            TdeF @ # 38

            What very, very few people realise is that the combined Australian and NZ weather and climate data account for ONE QUARTER of the total global surface in modelling computations where the land temperature data is used.

            So our data has an undue importance within the global climate network particularly as we together with NZ also have by far, or did have by far the most accurate historical weather and climate records for the entire southern Hemisphere which have been fed into the Global Historical Climate Network, the GHCN.

            The GHCN is rarely discussed as the USHCN or the latest variations of it has predominated in just about all discussions across the climate blogdom.


            • #
              the Griss

              25%.. that’s crazy, but does explain why the rabid warmists at BOM are going to such lengths to [snip “find”] a warming trend in Australia.

              Problem for them is that nature just doesn’t seem to want to co-operate 🙂

              By manipulating the past to create the warming trend, they are feeding total garbage into their models.

              That certainly isn’t going to help with any realistic “predictions” 🙂


              • #
                ROM

                Just to repeat my post in full of the 26th August that TdeF refers to above.

                And note that I refer to computations involving land temperatures in the GHCN [Global Historical Climate Network], the name given to the weather data now used for climate research, collected from nationally run weather stations in countries on all continents.

                The GHCN is the basic source of global weather records now used for Global climate related research.

                The USHCN is the US Historical Climatology Network and only applies to the USA land station coverage.

                The entirely land-based GHCN data going back to the start of the 20th century is the ONLY pre-satellite historical global data available for climate research prior to the beginnings of satellite weather coverage in about 1978. Hence the GHCN’s importance in trying to decipher the climate prior to the mid-1970s.

                Climate models:

                All climate models that use back-casting begin their runs with the particular weather and climate conditions recorded in the GHCN at a specific historical period, some decades ago, to demonstrate that they can accurately predict the climate as it then occurred through the following decades until the present.
                They all use this historical data from the GHCN to check whether the model can accurately reproduce the historical climate. The models are then “tuned”, i.e. frigged around with, using varying amounts of aerosols or clouds or water vapour and about every other climate item you can think of, other than ALL of the actual real conditions of the climates of the times, to get them to match that past historical climate pattern from their back-casting start period to the present.

                Once they have been “tuned” using all these usually far-from-reality tuning methods and inputs, usually in ways and amounts that “Nature” of this age and times has never used, and can accurately reconstruct the past historical global climate from the start date of their back-casting to the present, it is then claimed they can accurately predict the future climate, using those same models that were supposedly proven able to predict the past climate.

                And we all know by now just how badly that has come unstuck, and how much angst, suffering, impoverishment and destruction of personal and national economies the climate modellers have managed to achieve in a short decade and a half, with those now known to be quite spurious claims of being able to accurately predict future global climate trends with their highfalutin and totally impotent climate models, which even the IPCC has started to back away from as the source of all climate wisdom.

                That’s the role the very important global climate data source, the GHCN, has played in the climate research industry over the last two and a half decades.
                Now remember, this is Global Climate research. The global climate is a vast interacting set of very poorly understood energy interactions of every conceivable type, with heat energy contained, to varying degrees, in enormous and constantly moving masses of air and water, which are the most important factors in distributing that vast pool of heat energy across the lands and oceans of the planet, and in doing so creating our global climate.

                Australia’s and NZ’s role as historically long-term, very high quality contributors of great importance to the global land-based GHCN data series was covered in my post of the 26th August, repeated in full below:

                ________________

                ROM
                August 26, 2014 at 9:58 pm · Reply
                TdeF @ # 38

                What very, very few people realise is that the combined Australian and NZ weather and climate data account for ONE QUARTER of the total global surface in modelling computations where the land temperature data is used.
                So our data has an undue importance within the global climate network particularly as we together with NZ also have by far, or did have by far the most accurate historical weather and climate records for the entire southern Hemisphere which have been fed into the Global Historical Climate Network, the GHCN.
                The GHCN is rarely discussed as the USHCN or the latest variations of it has predominated in just about all discussions across the climate blogdom.

                To see what I am posting about, just take a globe-type map, tilt it away from you, and have a look at the amount of land mass in that bottom quarter of the globe below the equator, stretching from around the mid Indian Ocean at about 75 degrees East longitude, then east across Oz and NZ to the mid South Pacific at around longitude 140 degrees West: a coverage of the southern, almost land-mass-free regions of the planet, except for Oz and NZ, spanning some 145 degrees of longitude south of the equator.
                Hence our data is vital to climate science research, a point that is almost completely lost in the climate debate and discussions.
                And a point that, if our data from this very limited-in-area southern quadrant land mass is corrupted or altered or has been made to go missing, has a downstream flow-on effect and an impact on the accuracy of any climate research far beyond the impact from the similar and increasingly revealed corruptions of NH data, with its greater land area covered by NH stations.


              • #
                ROM

                Just to add to my above post on the global temperature data, this WUWT post gives further insight into that data: where the best stations are located, how few there are with a long data history, plus the interesting bit of info that the historical trend in global temperatures, as derived from those long-history baseline stations [1706 to 2011], shows an increasing [global?] temperature trend of:

                1. A rise of 0.41°C per century is observed over the last 250 years.

                Analysis of Temperature Change using World Class Stations

                That 0.41C-per-century increase in those long-history global stations is not far off the 0.53C warming trend that Jo has posted above in the following passage.

                Independent research, the results of which have not been disputed by BOM, has shown that, after homogenisation, a 0.53C warming in the minimum temperature trend has been increased to a 1.64C warming trend. A 1.7C cooling trend in the maximum temperature series in the raw data for Bourke has been changed to a slight warming.

                The actions of the BOM are, to put it very politely, a very long way indeed from honest, bias-free science, if such corruption of the data can actually be classed as falling under the title of “science”.


            • #
              ROM

              Post script:
              250 years ago, when the above stations started their records, was in the second half of the 1700s, and therefore still in the later stages of the cold-climate Little Ice Age [LIA], which goes a long way towards explaining that 0.41C increase per century since.


            • #
              The Backslider

              Australia dominates both land and measurements.

              I think that South America is a little bigger and more significant than you realise.


            • #
              The Backslider

              I’d say that a comparison is in order… Southern Africa is not to be sneezed at either.

              Are you saying that these are left out in the calculations?


              • #
                ROM

                Southern Africa is not to be sneezed at either.

                Are you saying that these are left out in the calculations?

                Nowhere did I say that nor did I imply that.

                I have pointed out that Oz and NZ account for about one quarter of the Earth’s surface for which the only temperature and climate data available come from the Australian and NZ weather and climate databases.
                A quadrant of the earth’s surface that stretches from the mid Indian Ocean east to around the mid Pacific Ocean, all south of the equator, roughly 145 degrees of longitude wide.

                Simply because Oz and NZ are the ONLY significant land masses in that sector of the earth’s surface.

                Consequently OZ and NZ are the ONLY sources for land temperature data for the GHCN in that approximate quarter of the Earth’s surface prior to the introduction of operational weather satellites in the late 1970’s.

                It is quite obvious that the other land surface areas south of the equator, that other half of the southern hemisphere, which includes the south-of-the-equator African and South American continents and the island nation of Madagascar, are all located in the quadrant of Earth’s surface stretching west from the mid Indian Ocean to around the mid Pacific Ocean, a longitudinal range of about 215 degrees or thereabouts.

                Equally obvious is that the data records for the GHCN from those continents’ various national weather organisations account for that quadrant’s contribution to the GHCN’s database, and are treated as representative of that south-of-the-equator sector of the earth’s past weather and climate from the time when their records were first kept.

                Equally obvious is that there are NO records or data of a long-term nature from any part of the Southern Ocean areas which would be of use to the GHCN and climate research.
                Pacific islands, with their very small land areas and their almost unvarying ocean-SST-controlled weather, are of little use even if such long-term island weather records could be located.

                So land surface records from continental-scale areas such as Australia, together with the NZ data, are the only historical weather and climate records of real use in this southern quadrant, this approximate one quarter of the global surface area of the planet, prior to the satellite era.


    • #
      the Griss

      Alice Springs? Interesting tale!!


  • #
    Leigh

    Jo, I think you and everybody else are applying pressure to an organisation, the BOM,
    that really couldn’t care less about telling or giving any information to you or anybody that is questioning the [results].
    Pressure from here and your readers should be applied to their local members.
    I know a couple of politicians have tried, and continue to try, to get action, but the BOM just continues on its merry way,
    thumbing their collective noses at us all, hoping it all just goes away.
    Politicians usually only “act” if there is an imminent voter backlash facing them.
    We all know what the BOM has done.
    It’s now the responsibility of their masters, the federal government, to put it right.
    Not the “peer reviewed world’s best practice” by [snip]

    —-

    Re Snip: Think very carefully about who you are making that claim about. Careful of the use of “fraud”. – Jo]


    • #
      Leigh

      Jo, it is not lightly or flippantly that I and so many others refer to global warming as the greatest fraud inflicted on mankind,
      and those who perpetrated the fraud as fraudsters.
      But till somebody comes up with a better explanation of what your learned “friends” at “our BOM” and its equivalents around the world are doing, there is no alternative description.
      I am assuming our laws would be pretty well in sync with England’s, and probably the rest of the world’s, when it comes to defining what does and what doesn’t constitute a fraud.
      “The Fraud Act 2006 (c 35) is an Act of the Parliament of the United Kingdom.
      The Act gives a statutory definition of the criminal offence of fraud, defining it in three classes—fraud by false representation, fraud by failing to disclose information, and fraud by abuse of position.”
      There is substantially more to the legislation enacted by the English parliament, but the definitions in a nutshell are enough.
      Fraud it is.


  • #
    manalive

    Like the global surface temperature record, the Australian surface record is a shambles.
    An apt metaphor might be the botched restoration of a damaged fresco by a Spanish housewife or even Mr Bean’s Whistler’s Mother.
    [snip. Rephrased. “It’s unscientific to” -j] throw away data, any data, that can’t be repeated.


  • #
    Jennifer Marohasy

    Don’t forget: if you like what Jo does, throw her some cash. There is a tip jar in the top right-hand corner. But be generous. Cheers,


  • #
    Bulldust

    It will be amusing to see WC try to pooh-pooh these transgressions by BoM as insignificant or justified. He hasn’t the proverbial leg to stand on.

    Quite embarrassing that BoM has to use vague qualifiers in its response. Don’t they know what is happening at their key installations?


  • #
    gesta non verba

    The old adage “tell a lie often enough”, you know the rest.
    What’s worse is that they will do it with a straight face.


  • #
    RealOz

    I have this hazy memory that in the Climategate scandal (you know, the one that just did not happen on THEIR ABC) mention was made in the emails of how the Australian temp data was a disgrace and that Australia needed to regularize/normalize their figures to be of any use.
    Wonder if any of the informed JO NOVA readers can comment on this please, and in particular, if I am correct, whether this is relevant to the excellent work done by “amateur scientist” Jennifer Marohasy!


    • #
      ianl8888

      The HARRY_READ_ME.txt from CG1

      http://www.anenglishmanscastle.com/HARRY_READ_ME.txt

      amongst many caches available on googling


    • #
      Andrew McRae

      ianl has nipped in ahead of me with the Harry_Read_Me link, which is the best answer.

      There was also a mention in the emails of unusable Australian data from Blair Trewin at the BoM:

      As far as the prospects for pre-1910 Australian data are concerned, it varies a bit from
      state to state (as you probably know, the various colonies operated as independent
      organisations until the Bureau was formed as a federal body in 1908). In summary:
      Queensland – good prospects of data back to early 1890s – Wragge was pretty diligent
      about getting Stevenson screens rolled out once he came in 1888.

      New South Wales – probably intractable. No standardisation before Stevenson screen
      installation, with many thermometers under tin verandahs or similar, and some earlier
      ones in unheated rooms, with very limited documentation. It might be possible to piece
      together a data set of some kind with station-by-station comparisons but I think this is
      doubtful.

      Could be just coincidence that the BoM doesn’t use any data collected prior to the formation of the BoM. Could be that the BoM introduced equipment and procedure standardisation, and so downplaying prior data makes the institution seem more important. Who knows.


      • #
        Greg Cavanagh

        That to me reads as being in two mindsets:

        too lazy to go through all the old data and try to make something of it,
        and too nit-picking about precision that isn’t needed.


  • #
    bemused

    One can only hope that there is karma and that eventually these [snip] get their comeuppance.


  • #
    ColA

    It is not surprising the BOM is “adjusting” numbers to suit some wonky “homogenisation” dreamed up by the Yanks; what is totally unacceptable is that there is any possibility that original raw data of any sort is destroyed or lost. After all, the BOM will need that info in 10 years’ time so they can “re-homogenise” when climate change turns to GLOBAL COOLING and they are trying to pacify the public that it is “really not that cold!!!”

    All jokes aside, how can you ensure that the old original weather records are kept and properly archived? It would indeed be criminally reprehensible if the original handwritten records were lost.


  • #
    Athelstan.

    “Those are my principles, and if you don’t like them… well, I have others.” – Groucho Marx

    We know how it goes in the BOM: the eco-space cadet warrior class were in a frenzy because…..

    “Damn it if God and nature don’t do what they are telled to do, well…we’re just gonna have to fix it so that nature has to conform………………”

    “Lose them historical measurements matey, I’ve got some others…”

    Yep and it’s: Garbage in, Greenwash homogeneity out


  • #
    pat

    fascinating how, as in so many matters, Oz/NZ is punching above its weight when it comes to the temperature datasets. thanx to those who have posted about this.

    btw, i posted the following for anthony & joe bastardi on WUWT’s Bastardi thread yesterday:

    2 pages: 28 Aug: New Scientist: Catherine Brahic: And now the weather, featuring climate change blame
    A new technique connecting individual weather events with the impact of greenhouse gas emissions could bring climate change into everyday weather reports…
    “Explaining why we’re getting the weather we’re getting should be part of the job of meteorological offices, as well as predicting it,” says Myles Allen at the University of Oxford. Allen was part of a team that carried out pioneering research that examined the impact of greenhouse gas emissions on weather…
    In the new set-up, a real-world seasonal forecast driven by data on current sea-surface temperatures will be run alongside a simulated “no global warming” seasonal forecast, in which greenhouse gas emissions have been stripped out…
    A main obstacle to bringing this kind of powerful climate modelling into standard weather forecasts is computing power. Models must be run many thousands of times to obtain statistically significant results, which requires expensive supercomputers…
    Ultimately, though, the key contribution of this work may be to get through to a general public for whom climate change has long been an abstract concept. By showing that what’s going on outside someone’s window is directly linked to climate change, researchers hope it will become obvious that what they are saying isn’t just a load of hot air.
    http://www.newscientist.com/article/mg22329842.400-and-now-the-weather-featuring-climate-change-blame.html

    something more not to look forward to in the MSM.


  • #
    Rolf

    @Joanne, these days I think it is important that you be frank and not hide behind politically correct wording. Just have a look at all the weak politicians we have around the world, and we also have the new Hitler arising in Moscow. This might end the same way as last time, 1939. So let people speak out; if they think it should be called fraud, I guess that is the right name.

    ————–
    It has nothing to do with political correctness. The most useful comments are correct in English and legal terms. (Exceptions granted for funny.) – Jo


  • #
    Bob Koss

    If your BoM won’t provide you with old records, I suggest you look here.
    http://www.surfacetemperatures.org/databank

    They have an embedded link to their newly constructed databank. Stage 3 data is supposed to be the closest they could come to original data. It is quite comprehensive (32,000+ stations) and broken into hourly, daily, and monthly databases. Their daily database has Bourke max/min data back to 1871, provided by GHCN-D. That one is a 2 gigabyte download tho’.

    I’m using the monthly database. Only half a gig download for that file. That also goes back to 1871 for Bourke. The Bourke record is quite good, having very few years with missing months since 1878 while downtown. Better than after they moved to the airport.

    Point of interest: the airport had 12 months of data during 1995. That was also the last full year for the downtown site. The airport Tmax averaged 0.17C higher than downtown for the year. Airport Tmin averaged 1.21C lower than downtown. Airport Tavg was 0.52C lower than downtown. The Tavg trend is -0.65C per century over the length of the downtown site record.
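
    For anyone repeating this from the databank files, the arithmetic is straightforward. A minimal sketch in Python/NumPy, with hypothetical array names: the overlap offsets are mean differences over the common 12 months, and the trend is a least-squares slope scaled to degrees per century.

    ```python
    import numpy as np

    def overlap_offset(site_a, site_b):
        """Mean difference (site_b minus site_a) over a common period."""
        return (np.asarray(site_b) - np.asarray(site_a)).mean()

    def trend_per_century(years, annual_means):
        """Least-squares trend scaled to degrees per century."""
        slope = np.polyfit(years, annual_means, 1)[0]   # degrees per year
        return slope * 100.0

    # e.g. with the 12 overlapping months of 1995 (hypothetical arrays):
    # overlap_offset(downtown_tmax_1995, airport_tmax_1995)  # ~ +0.17
    # trend_per_century(years, downtown_annual_tavg)         # ~ -0.65
    ```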


  • #
    handjive

    Great letter in The Australian:

    BoM’s Brainstorming (via Twitter)


  • #
    ROM

    I would be careful in calling the Australian historical weather and climate data a shambles.
    I would think that all the original weather records are still there somewhere in the archives of the BOM.
    If they aren’t, or have been discarded and/or destroyed, then a few BOM personnel should finish up serving quite some time in jail at the very least.

    The present situation is probably driven by ideology and confirmation bias, as the BOM has never really been confronted and challenged to the extent that is being done here and now,
    and now even by major powers in the media, with the politicals to come on board soon, unless the BOM starts coming up with some cast-iron reasons for its current attitudes and ways of processing weather and climate data, and/or backs right off from its rather haughty, arrogant ivory tower approach and gets back to its former and now lost culture of integrity and honesty in its dealings with the public, who pay its rather large and munificent funding.

    The reason for my suggestion that the BOM’s data is NOT a shambles, at least by world standards, is that Australian weather records are arguably at least as good as the USA’s.
    The Europeans lost much of their historical weather data during WW2.
    The South American weather data was probably a real shambles, a dog’s breakfast of records, given the instability of the various historical uprisings and revolutions of South America, plus their lack of continuing development, which is now being corrected and overcome.
    Possibly Argentina, out of all of South America, had good records into the first half of the 20th century, as it was one of the most prosperous and richest per capita nations on earth until the end of WW2, but it has been all downhill since then, both economically and politically.

    Civil strife, revolutions and societal chaos usually means that the unimportant things like recording temperatures and etc just fall by the wayside.

    Africa possibly had some quite good records early to mid 20th century, albeit very, very scattered and sparse, while the European colonial powers still held sway over the African continent.
    In fact, in the Warwick Hughes / Phil Jones Climategate e-mail I posted here a few days ago, David Jones of the BOM’s Climate Analysis was quoted by Phil Jones as trying to track down some weather records from Africa, as Jones was in dispute with some skeptics [ 🙂 ]

    With the ejection of the European colonial powers from Africa, the usual political and economic instability followed, which is now almost at an end as Africa powers away economically, albeit from a very low base.
    So it is doubtful if there are more than a very small handful of good records from most of Africa outside of South Africa, which kept its former European-dominated recording and analysis culture until the very recent period.

    Asia is also in a similar situation, but is decades ahead of South America and Africa in stability, and hence in reliable record keeping of climate and weather, although any records will probably only relate to quite recent times, a couple of decades of data.

    The Chinese lost nearly all of their early 20th century weather and climate data during Mao’s “Great Proletarian Cultural Revolution” and the even more disastrous “Great Leap Forward”, when an estimated 40 million died of starvation.

    When looking at weather and climate data gathering, archiving and processing on a global comparison scale, Australia along with NZ is amongst the best there is.
    It is just that we are in a time where a totally irrational, unrealistic, culture- and integrity-destroying meme has been inflicted upon our peoples by a tiny, fanatical, ideologically driven, messianic-syndrome-afflicted, save-the-planet, green socialistic, hatred-driven movement, which has penetrated deep into the psyche of so many organisations and bureaucracies that they have consequently almost completely lost sight of their iron-bound responsibilities to the citizens of the country they operate in, for and on behalf of.

    The green movement will die, as all such movements do, and then, as happens right through history, the revision of past history begins.
    If the original records are all still there, then it is only our generation that will be denied the real truth by those in the weather and climate bureaucracies and research academia who have lost sight of their obligations and responsibilities to society and its people, the same society and people they profess to serve but have failed, quite disastrously and despicably, to serve in an honest and disciplined manner.

    120

    • #
      handjive

      Warwick Hughes asks a similar question:

      “I saw these diaries in the BoM library at Lonsdale Street, Melbourne in the early 1990’s – a series of maybe foolscap ledgers.
      I assumed they were put into Commonwealth archives – the National Archives of Australia.
      If anybody can locate them – please let me know.”

      http://www.warwickhughes.com/blog/?p=3218

      30

    • #
      Duster

      “… The reasons for my suggestion that the BOM’s data is NOT a shambles at least by world standards is that Australian weather records are arguably at least as good as the USA’s. …”

      And that, says an American, is sad. If you follow Steven Goddard’s site, WUWT, or several other sites, changed trends, shifted temperatures, missing data and even “zombie stations”, where “data” are recorded for closed stations, are all serious obstacles to trusting the US record. More disturbing, scientifically, is the fact that the written historical temperature records are the closest thing we can get to “the event.” They are the sole and most exact empirical evidence available. As such they should be copied, the copies checked to be at least as legible as the originals, and stored at multiple sites. Under no circumstances should primary, raw data be discarded.

      Destroying such data is reminiscent of the ancient Egyptian Empire “erasing” Pharaohs, the destruction of historical records accompanying dynastic changes in Imperial China, and similar mistreatment of the historical record by the Germans and the Russians. It is by no means strictly a “Soviet” or “leftist” act. It is totalitarian behaviour in all its true ugliness, and an authoritarian (totalitarian) viewpoint is not bounded by anything but a desire to control others.

      21

  • #
    TdeF

    Our red-hander is at it again, clicking dislike even on questions, not statements. There were even two on Jennifer’s suggestion to put something in the tip jar, so this indulgent ratbag is conflicted: reading the blog for free, clicking for free, but disagreeing with paying for the right to do so. This is someone who would steal from the poor box.

    213

  • #
    Don B

    In the US, homogenization and the dust bin have not yet changed some interesting facts. Of the 50 states, 31 set their all-time temperature record prior to 1940. Two states set the record in the 1800’s, in 1888 and 1898.

    http://usatoday30.usatoday.com/weather/wheat7.htm

    90

  • #
    john robertson

    As the collapse of this CAGW cult accelerates, there is no hiding the decline.
    Our bureaucrats are so remote from a shovel that they will never understand the law of holes.
    Therefore the BOM will keep attempting CYA ruses until they isolate the politicians, embarrassing a few.
    Then they get fired.
    Oh dear, how sad.
    As this wave of mass hysteria recedes, the clean-up will have to ask: why did our institutions fail?
    I resent having ever more taxes extorted from me to fund such activist idiots.
    CAGW is and remains a creation of the bureaus.
    For this there is no forgiveness.
    The watchdogs are rabid and have betrayed their reason for existing.
    The cost to civil society is yet to be calculated.
    The damage to trust in government is obvious.
    Without trust, civil society crumbles.
    By my reasoning there is no punishment for bad actors within government that can be too extreme.
    These are not conscripts; each and every one of them begged the public for their opportunity to perform these roles.

    70

  • #
    john robertson

    Oops, preview not post.
    As the collapse of this CAGW cult accelerates, there is no hiding the decline.
    Our bureaucrats are so removed from the use of a shovel that they will never understand the rule of holes.
    The BOM bosses will keep frantically trying CYA ruses to the point where they betray their political masters.
    As the hysteria of the masses recedes, they must ask: how did this happen, again?
    CAGW was and is a creation of bureaucrats.
    For their part in this there can be no forgiveness.
    The agencies of government were created to prevent these kinds of excesses, not cause them.
    Civilization is under attack by these worms hiding within it, gnawing at the foundations.

    30

  • #
    Bruce

    Can anyone explain to me how much of an adjustment one makes to a recorded temperature when the site has been changed?

    Why is the adjustment always upward?

    30

    • #
      TdeF

      If the new site is cooler or hotter than the old, there would be at most a single discontinuity. What Jennifer is pointing out is a change in a trend over many years and thousands of readings, generally from cooling to warming. That is not explained by a shift of location.

      A great deal of care would have been taken in placing the Stevenson screen in the first place, after much debate. It would not have been careless, thoughtless or irresponsible. As much care and thought would have been taken in moving it, so you would not expect a discontinuity; the planners were trying to avoid such a discontinuity, and even to prove there was none. A move would have been carefully documented and the records examined to show continuity. A previous blog entry shows that two screens were run at the same time for two years just to document and prove continuity, which is being as careful as you could possibly be. Yet many temperatures are not only shifted, trends are reversed. The site-shift explanation for temperature trend reversal is just absurd.

      The clear implication that the people who set these things up and operated them for decades were simply careless is surreal, and insulting to thousands of people. To use this waffle as an excuse for making major changes to real data is unforgivable and certainly not science. You never alter real data. Any post-measurement implied correction not only has to be fully justified, it has to be explained, documented and proven. You cannot say you think the box may have been moved; you would at the very least have to prove it. Otherwise you are blatantly changing data to suit a model, which is not science.

      Whoever did this thought no one would notice. This homogenization is a slur on the honour of the entire BOM, formed in 1906, and all the tens of thousands of people who have worked there over 108 years, and even the people around the country who took such careful measurements in the years before and since. Like the IPCC summary reports, this is likely the work of very few people. Given the size of Australia, its significance in world measurements for the Southern Hemisphere and the fact that Australia has the best records south of the equator, we have a huge impact on establishing the history of world land temperatures, so it is obvious why it was done.

      If, as seems so obvious, our summary records were altered simply to fit the global warming picture of the IPCC, this must be rectified by the BOM, who need to stop defending a few people and publish the true temperature picture, unhomogenised. Science should not be politicized. Data alteration is not for the greater good.

      110

    • #
      Richard C (NZ)

      Bruce #25

      >”Can anyone explain to me how much of an adjustment one makes to a recorded temperature when the site has been changed?”

      See

      ‘Techniques involved in developing the Australian Climate Observations Reference Network – Surface Air Temperature (ACORN-SAT) dataset’

      CAWCR Technical Report No. 049
      Blair Trewin
      March 2012

      7.7 Implementation of data adjustment in the ACORN-SAT data set [Page 69]

      The size of adjustments generated by the techniques above was checked by comparing the means of pre- and post-adjustment data for the five calendar years prior to the inhomogeneity. The adjustment was only implemented if the means differed by at least 0.3°C on an annual basis, 0.3°C (not necessarily of the same sign) in at least two of the four seasons, or 0.5°C in at least one season (the last two provisions are in order to capture inhomogeneities that have different impacts in different seasons which might cancel each other out in an annual mean). If the difference failed to satisfy one or more of these criteria the inhomogeneity was considered to be too small to justify adjustment.

      http://www.cawcr.gov.au/publications/technicalreports/CTR_049.pdf

      You can see examples of the actual ACORN adjustments made for site moves in that report in answer to your question (Note there is no 0.3 C lower limit in the NZ 7SS – see below).
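
      The acceptance test quoted above is simple enough to sketch in code. A minimal sketch (my own hypothetical reconstruction from the quoted text, in Python; not BOM’s actual code):

      def adjustment_justified(annual_diff, seasonal_diffs):
          # annual_diff: pre- vs post-adjustment difference in annual means (deg C)
          # seasonal_diffs: the four seasonal mean differences (deg C), any sign
          return (abs(annual_diff) >= 0.3
                  or sum(abs(d) >= 0.3 for d in seasonal_diffs) >= 2
                  or any(abs(d) >= 0.5 for d in seasonal_diffs))

      # Seasonal steps of opposite sign can cancel in the annual mean yet
      # still trigger an adjustment via the two-season test:
      print(adjustment_justified(0.05, [0.4, 0.0, -0.35, 0.0]))  # True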

      But it is NOT the site change adjustments that are contentious in this case (BOM/ACORN-SAT, Amberley, Rutherglen, Bourke). The contentious adjustments have been made in ACORN-SAT when none are justified for any LOCAL reason (e.g. no site move or instrument change, no local change recorded) i.e. an UNKNOWN breakpoint.

      The situation is different in NZ (NIWA/7SS). The contention there is that NIWA cannot provide their methodology as BOM have done (TR049 above, Menne & Williams 2009) for adjustments at KNOWN breakpoints (e.g. site moves). When the established methodology (Rhoades and Salinger 1993) is applied to breakpoints, a different adjustment value is arrived at than the one NIWA applies. The effect on the entire series is that the warming trend (which also starts in a relatively cool period) is about 3 times greater by NIWA than by R&S93 (0.91 vs 0.34 C/century). See:

      ‘Statistical Audit of the NIWA 7-Station Review’

      THE NEW ZEALAND CLIMATE SCIENCE COALITION, July 2011

      http://www.climateconversation.wordshine.co.nz/docs/Statistical%20Audit%20of%20the%20NIWA%207-Station%20Review%20Aug%202011.pdf

      The R&S93 method (applicable to NZ) is clearly defined in Appendix A, page 31.

      The equivalent ACORN method from TR049 above (not applicable to NZ) is M&W09 here:

      ‘Homogenization of Temperature Series via Pairwise Comparisons’

      MATTHEW J. MENNE AND CLAUDE N. WILLIAMS JR.

      NOAA/National Climatic Data Center, Asheville, North Carolina
      (Manuscript received 2 October 2007, in final form 2 September 2008)

      ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/menne-williams2009.pdf

      >”Why is the adjustment always upward?”

      It isn’t. See Jo’s post above, second-to-last paragraph.

      60

    • #
      Duster

      It isn’t always upward. Rather, adjustments to historical data tend to be downward before about 1950 and upward after that, which preserves the “mean” value of the time series while steepening the trend. I sometimes suspect that the basic problem is not agenda but sheer incompetence in the thinking department. There is no point in being an adept statistician or mathematician if you can’t identify and deal properly with real-world issues. Being intelligent doesn’t necessarily make one smart, or vice versa.

      20

  • #
    TedM

    I used to be able to look back over years of weather data using Weatherzone Pro. Now you can’t even go back to last year; obviously the BOM no longer permits it.

    They clearly want to shut down debate, or scrutiny. It’s good that some of the data is already out there. Don’t take the heat off the BOM.

    110

  • #
    John Smith101

    At 8:04am August 31, TdeF said: You cannot say you think the box [Stevenson screen] may have been moved; you would at the very least have to prove it. Otherwise you are blatantly changing data to suit a model, which is not science.

    Slightly off-topic, but the following quotes from the Climategate emails might perhaps provide some background to BOM thinking, given that homogenisation has occurred at the BOM as well as in the many other national temperature databases used in modelling:

    Phil, Here are some speculations on correcting SSTs [sea surface temperatures] to partly explain the 1940s warming blip. If you look at the attached plot you will see that the land also shows the 1940s blip (as I’m sure you know). So, if we could reduce the ocean blip by, say, 0.15 degC, then this would be significant for the global mean — but we’d still have to explain the land blip. I’ve chosen 0.15 here deliberately. This still leaves an ocean blip, and i think one needs to have some form of ocean blip to explain the land blip (via either some common forcing, or ocean forcing land, or vice versa, or all of these). When you look at other blips, the land blips are 1.5 to 2 times (roughly) the ocean blips — higher sensitivity plus thermal inertia effects. My 0.15 adjustment leaves things consistent with this, so you can see where I am coming from. Removing ENSO does not affect this. It would be good to remove at least part of the 1940s blip, but we are still left with “why the blip”. [Tom Wigley, to Phil Jones and Ben Santer – Email 1254147614]

    Solution 1: fudge the issue. Just accept that we are Fast-trackers and can therefore get away with anything. [Email 636]

    In any simple global formula, there should be at least two clearly identifiable sources of uncertainty. One is the sensitivity (d(melt)/dT) and the other is the total available ice. In the TAR, the latter never comes into it in their analysis (i.e., the ‘derivation’ of the GSIC formula) — but my point is that it *does* come in by accident due to the quadratic fudge factor. The total volume range is 5-32cm, which is, at the very least, inconsistent with other material in the chapter (see below). 5cm is clearly utterly ridiculous. [Tom Wigley Email 5175]

    I will press on with trying to work out why the temperature needs a ‘fudge factor’ along with the poorer modelling for winter. [Colin Harpham, UEA, 2007 Email 5054]

    With GCMs the issue is different. Tuning may be a way to fudge the physics. For example, understanding of clouds or aerosols is far from complete – so (ideally) researchers build the “best” model they can within the constraints of physical understanding and computational capacity. Then they tweak parameters to provide a good approximation to observations. It is this context that all the talk about “detuning” is confusing. How does one speak of “detuning” using the same physical models as before? A “detuned” model merely uses a different set of parameters that match observations – it not hard to find multiple combinations of parameters that give the similar model outputs (in complex models with many parameters/degrees of freedom) So how useful is a detuned model that uses old physics? Why is this being seen as some sort of a breakthrough? [Milind Kanliker, 2004 Email 1461]

    We had to remove the reference to “700 years in France” as I am not sure what this is, and it is not in the text anyway. The use of “likely”, “very likely” and my additional fudge word “unusual” are all carefully chosen where used. [Briffa, 2005 Email 1047]

    Either the scale needs adjusting, or we need to fudge the figures… [Elaine Barrow, UEA 1997 Email 723]

    My notes: as far as I can tell the above quotes pertain to modelling, but I suspect this might be a two-way street. Furthermore I think, from memory, that the Tom Wigley mentioned above was involved, in part, in the homogenisation and adjustments across a number of national temperature databases, but I stand to be corrected.

    In modelling these alterations are called, in part, flux adjustments. From the IPCC’s Third Assessment we have this admission: the Third Assessment Report frankly admits that “flux adjustments” have been inserted in the global climate models to make their results agree more closely with observations. The TAR cites only 16 of C. D. Keeling’s papers, yet it refers to “flux adjustments” 135 times. Science labels such techniques, flux adjustments and inter-network calibrations, disparagingly as fudge factors. Flux adjustments: TAR, Technical Summary, p. 49 (Box 3) & AR4 p. 117.

    70

  • #
    Gbees

    Here’s a newspaper article from 1914 discussing 120F at Bourke being the “highest since 1909”, when the 1909 temperature at Bourke was recorded as 125F, and Brewarrina 123F.

    50

  • #
    Richard C (NZ)

    Re the “f***d” word.

    I don’t think the radical changes from cooling trend to warming trend were an intentional outcome by BOM. And it is intent that determines the legal view (see below).

    I think it is simply a result of ignoring local comparator stations in favour of remote ones (same as NIWA, GISS, BEST), and a consequent over-reach with their homogenization methodology, applying adjustments where none should be made. And it is their attitude of defence when called out that exacerbates the mistrust and the too-loose use of the “f” word.

    BOM’s method is clearly defined and publicly available (see #25.2 above), contrary to Graham Lloyd’s claim. But BOM now have some explaining to do in regard to their application of it to specific breakpoints their method identifies but for which there is no local justification, i.e. they haven’t been thinking about their process enough. That’s not “f***d”, that’s sloppy.

    NIWA was just plain sloppy from the outset (no documented method basis at all when the appropriate one already existed; just ad hoc instead). But having got a favourable result (“unequivocal” warming), they knew it had leverage value, and even managed to convince a judge that their ad hoc method was “internationally recognized”, when all that really means is that HadCRU use NIWA’s NZ 7SS in CRUTEM4/HadCRUT4 by default – no questions, no audit (just as HadCRU use the Australian HQ series), see here:

    ‘CRUTEM4 Temperature station data’
    http://www.cru.uea.ac.uk/cru/data/temperature/crutem4/station-data.htm

    BOM’s attitude to the leverage value of a strong warming trend (good for business), and their defence of it, appears to be similar to NIWA’s, however.

    So although intent does not necessarily arise during the process at either BOM or NIWA (I don’t think it does at all), the subsequent respective BOM/NIWA defences of highly questionable practice must border on intent in a new context: having produced a strong warming trend in both countries, they both intend to keep those trends despite contrary evidence.

    NIWA won their new-found ongoing intent in court, i.e. they acted on their intent. But although the NZCSET lost their court hearing v NIWA (ill-conceived strategy, “unnecessarily prolix” – J Venning; I agree), their alternative 7SS series (see ‘Statistical Audit’ #25.2 above), ignored by the court, has not gone away. It still stands on its scientific and statistical merits (reviewed by 3 independent professional statisticians). Essentially NZ now has 2 alternative 7SS series, of whatever validity the respective methods offer (established vs ad hoc).

    In short, and in NIWA’s case, their ongoing intent and action in defending their sloppy (but advantageous to them) work has been accepted by the court: no “f***d”.

    But how long can this intent and action go on without becoming “f***dulent”?

    50

    • #
      Richard C (NZ)

      >”Re the “f***d” word”

      That was from Jo’s caution at #7

      30

    • #
      handjive

      Re: the f***d word.

      Scientists Dr Judy Ryan and her colleague Dr Marjory Curtis are among many highly qualified scientists who, as skeptics of the wrong-headed hysteria over supposed man-made global warming, are fighting to restore scientific integrity.

      18th February 2014

      “Dear Professor Karoly,

      We have been writing to you for a year requesting that you provide one credible study that supports your hypothesis of catastrophic, human caused global warming (CAGW).
      You have not been able to provide one.
      The letters and your responses are all on the public record https://www.facebook.com/DavidKarolyEmailThread?ref=hl

      In March 2013 we issued you the opportunity to either renounce your alarmist claims on the ABC news, or publicly provide empirical data-based evidence, that is available for scientific scrutiny, to support them.

      Almost a year has passed and still you have not provided the evidence.

      We remind you that the Australian people are experiencing financial disadvantage as a result of the host of policies and administrative decisions driven by advice regarding the science of climate change.

      Is that advice false or misleading? Does it deceive by concealing or omitting or embellishing or misrepresenting relevant facts?

      Graphs on the following pages were obtained or produced by various independent, non-aligned examiners and auditors of BOM records.
      Are you the author of the original regional temperature data or graphs used by BOM?

      Every graph shows that the raw data, which shows either a flat or downward (cooling) trend, has been “adjusted” to a warming trend.
      Are you associated in any way with producing BOM’s adjusted graphs?

      If so, in our opinion it is very misleading of both you and the BOM personnel to adjust the data to the extent that it misrepresents reality.
      We also think that it is very misleading of both you and BOM to omit to declare to the Australian people that you have “adjusted” the raw data.”

      read on: http://www.principia-scientific.org/aussie-alarmists-in-spin-as-government-climatologist-prof-karoly-is-cornered.html

      60

    • #
      Duster

      You made a perfectly cogent argument that the driving force at NIWA was incompetence. Asking how long it can go on before it becomes effectively malicious ignores too many aspects. There is no evidence that competence has increased among the staff. Defending old work is a time-honoured approach to not having to redo it, and to passing it along as someone else’s problem. It also conserves budget and disguises the inadequacies of the leadership. So, how long? How does “forever” sound? The f-word requires competence.

      20

  • #
    pat

    apologies if this has already been posted. not worth excerpting, but read it all, along with the usual mostly-partisan, uninformed comments by people with no curiosity whatsoever:

    27 Aug: Guardian: Graham Readfearn: Climate sceptics see a conspiracy in Australia’s record breaking heat
    Bureau of Meteorology says claims from one climate sceptic that it has corrupted temperature data are false
    You could cut the triumphalism on the climate science denialist blogs right now with a hardback copy of George Orwell’s Nineteen Eighty-Four…
    http://www.theguardian.com/environment/planet-oz/2014/aug/27/climate-sceptics-see-a-conspiracy-in-australias-record-breaking-heat

    30

  • #
    pat

    no MSM coverage at all, but Graham Readfearn/desmog have something to say:

    30 Aug: Desmogblog: Australia’s Climate Change Conspiracy Theorists Get Angry Over Radio Interview That Never Happened
    In the space of six days, Rupert Murdoch’s The Australian newspaper has published five news stories and an opinion piece attacking the credibility of the Australian government’s weather and climate agency, the Bureau of Meteorology.
    I’ve covered the guts of the early stories over on my Planet Oz blog for The Guardian…
    Now for those that don’t know, John Cook is the founder of the Skeptical Science website and the Climate Communication Fellow at the University of Queensland’s Global Change Institute. Another sceptic blogger JoNova also commented on the ABC interview with Cook. “We’re looking forward to seeing John Cook explain that on his blog,” she wrote…
    John Cook was not interviewed by ABC Goulburn Murray and he has apparently never met or spoken to the host in question, Bronwen O’Shea.
    Cook even offered an alibi! He was with his mum and before anyone asks, no I’ve not called John Cook’s mum to verify that the person she was with that morning was actually John Cook, her son.
    Just to be doubly sure, I asked the ABC for a response. I was told that they did not interview John Cook, but they did have a talkback caller who came on the line after the phone dropped out and this was “David from Sandy Creek” which… well… sort of sounds like John Cook… but not much!…
    http://www.desmogblog.com/2014/08/29/australia-s-climate-change-conspiracy-theorists-get-angry-over-radio-interview-never-happened

    30

    • #

      Hey Pat

      This may not be such a bad outcome… there are links from Desmog to both Jo’s site and also my ‘ABC of Rutherglen’ post… http://jennifermarohasy.com/2014/08/the-abc-of-rutherglen/

      While John Cook may not be able to understand the charts that I link to here…
      http://jennifermarohasy.com/temperatures/rutherglen/

      Others reading Desmog may just be indirectly getting access to information they would not otherwise stumble across.

      60

      • #
        Richard C (NZ)

        Readfearn doesn’t address the specific location issues raised by you, Jennifer, except to parrot BOM. If his readers have enough comprehension they will see that (if they are willing to understand, that is).

        He’s right about Lloyd’s claim that BOM has not made their algorithms public, though. They have. But now BOM have to justify, in no uncertain terms (vague hand-waving is all there has been so far), their application of them when there is no local reason.

        That will make interesting reading – if we ever see it. As will Readfearn’s reporting of it – as if we will ever see that.

        30

  • #
    ROM

    Bruce @ # 25 posted;

    “Can anyone explain to me how much of an adjustment one makes to a recorded temperature when the site has been changed?

    Why is the adjustment always upward?”

    ________________

    There are at least four major adjustments, including homogenization, applied to station temperature data, and no doubt more, but these are the main ones.

    1 / TOBS: Time of Observation, a very contentious adjustment which can go either way at any individual station.

    All station temperatures are adjusted to provide a corrected temperature at a clearly defined common time, applied to every station, so that trends can be accurately [?] computed across all stations.

    As anybody would know, a few minutes’ difference in readings before or after, say, the 9 am or 6 pm observation, when temperatures are rapidly rising as the day warms up or falling as the evening cools down, can make a big difference to the readings. So if the TOBS is known, adjustments will be made to the station temperatures to compensate for the TOBS differences.
    If the station’s TOBS data is not known, they guess! It’s called applying statistical corrections to the data.
    And it is a serious bone of contention within the climate research community.
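
    To make the TOBS mechanics concrete, here is a toy illustration (my own, in Python, with made-up numbers; not any bureau’s actual algorithm) of why the reset time of a once-a-day maximum thermometer biases the mean of the daily maxima:

    import numpy as np

    rng = np.random.default_rng(0)
    hours = np.arange(24 * 60)                              # two months, hourly
    cycle = 10 * np.sin(2 * np.pi * (hours % 24 - 9) / 24)  # daily peak near 3 pm
    spells = np.repeat(rng.normal(0, 4, 60), 24)            # warm/cool spells
    temp = 20 + cycle + spells + rng.normal(0, 1, hours.size)

    def mean_daily_max(reset_hour):
        # Each observation "day" is the 24 hours ending at the reset time.
        starts = range(reset_hour, temp.size - 24, 24)
        return np.mean([temp[i:i + 24].max() for i in starts])

    # A late-afternoon reset lets one hot afternoon straddle two observation
    # days and be counted twice; a 9 am reset for maxima largely avoids this,
    # so the 5 pm figure typically reads warmer:
    print(mean_daily_max(9), mean_daily_max(17))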

    2 / The second adjustment is for the UHI [Urban Heat Island] effect on recorded temperatures, which has been discussed at length right across the blogosphere.

    Basically, most stations in the developed world started life in a predominantly rural landscape. But over the decades, urban sprawl has surrounded those stations.
    Where houses, buildings, sealed roads, parking lots, air-cons, vehicles and all the other associated paraphernalia of a town or city exist, there is a strong warming effect within, close to, and downwind of such newly urbanised locations. Consequently, any station established before urbanisation in that locality will show a marked warming in recorded temperatures, which is supposedly compensated for by adjustments made by the climate data and data-processing organisations, i.e. NCDC, BOM, CRU, GISS, etc.

    The size of the UHI adjustment is a very serious point of considerable argument, with the IPCC relying on a Phil Jones paper claiming only a 0.5 C adjustment for UHI, a figure he claimed was derived from a Chinese paper which has since been shown to be fraudulent, as the supposed Chinese data sources for that collaborative paper have never been revealed or located, nor are they likely to be, since the data was supposedly collected before and during the near decade-long chaos of Mao’s Cultural Revolution.
    It has since been admitted that the Chinese data supporting the very low 0.5 C downward UHI adjustment is non-existent, but the IPCC sticks with this figure, as moving from it would destroy a large part of their claims of dangerously increasing temperatures.

    The real story on UHI is that its effect on temperatures is very much higher than the IPCC has admitted, and that it varies a great deal between stations, depending on the type of natural environment the station was originally established in, such as grassland, forest or desert-like environments, and on how urbanisation grew up around that station.

    For a short but very good read on this very wide variation in UHI in relation to the surrounding natural environment, I would suggest this “Climate Audit” post:

    New Light on UHI

    3 / The third point, and one which is being torn apart within the climate research community and the technical climate blogosphere, is the adjustment for break points in station temperature data.
    A “break point” is a sudden shift up or down in the station’s temperature data [or in any data set] which is assumed to be caused by a shift of the station to a different environment; say, from a formerly rural but slowly urbanised location, with its consequent warming over time, to a new location further out in a rural environment, usually also at a different altitude, which likewise affects the temperatures recorded, leading to a significant break point: a considerable fall [rarely an increase] in the temperatures recorded by the newly relocated station.

    And so we have a “break point”, which, in theory and for computational purposes, means the station data must now be adjusted to a common homogeneous standard so the recorded station temperatures can be used by the climate modelers.

    And this is where the fight really starts, and where the really big corruption of the global temperature databases is located, leading to a completely spurious warming trend and, crazily, the constant altering of decades-old temperatures recorded back into the early 1900s, which are nearly always cooled to well below the temperatures actually recorded at the time, owing to the incredibly irrational, from a layperson’s viewpoint, manner in which the break points are treated in the data-processing algorithms.
    Break points, as above, are assumed to occur due to relocation of stations, but that is far from the whole truth. A station can begin in a wide-open environment, but over the years, for example, a plantation of trees, or even a small patch of bushland or forest a few tens or hundreds of metres upwind, can become quite large and of considerable height.

    [In agriculture it has been found that rainfall is affected downwind of a row of trees for around ten times the height of the trees.

    I ran comparative experiments on our farm, strictly to satisfy my own curiosity, from the early 1970s through to the early 1990s, and some of my findings were subsequently included in the official Vic Dept of Ag research results some years ago.
    In one instance, on some experimental wheat plots located 250 metres downwind of the 15-metre-high Buloke timber, which is about the only tree species of note that grows across the heavy cracking clay soils of the Wimmera plains north of Horsham, we installed two absolutely identical quality rain gauges which we had carefully checked and calibrated against each other [why two gauges, I can’t remember].
    One gauge was 250 metres downwind of that patch of Buloke timber.
    The other identical gauge was a further 100 metres on, or about 350 metres downwind of that timber.
    Over a four-month period, the gauge closest to the timber in every case measured consistently very close to ten percent less rainfall than the more distant gauge just 100 metres further away from the timber.
    This despite us swapping gauges and all the other repeated attempts to find differences between the gauges. Now that’s rainfall, but temperatures are likely to be affected just as much, either up or down, by adjacent trees when they are upwind in the prevailing winds of the area.]

    Now, where such a station is located and the timber has grown over the years to quite a height, slowly affecting and altering the recorded station temperature trends, somebody eventually comes along and figures that those trees need lopping. So they are lopped or cut down, which opens up the area around the station in a matter of hours, and you suddenly have what appears to be a significant break point in the station data: a station that has never moved or been altered or shifted, but the data-processing algorithms then assume that the station has had some major change and apply a break-point step adjustment that is entirely unwarranted and totally inappropriate for that station, thus altering the station’s entire recorded data base and therefore the total temperature data base.

    For an explanation of the rather poorly thought out, and possibly deliberate, temperature-increasing break-point step adjustment algorithms, including the temperature-decreasing, i.e. cooling, adjustments of historical temperatures which then provide the required long-term historical warming trends so beloved of the warmunists, this WUWT post will explain the process better than I can.

    The psychology that drives the utter stupidity and irrationality of quite deliberately altering past historical temperatures through the use of these adjustment algorithms, creating falsely cooled historical temperatures which are then used to establish an apparent but fictitious steep warming trend in global temperatures, which is then used in turn as public propaganda, is near criminal in its exploitation of a false statistical exercise to promote a catastrophic climate warming meme which, in cold hard numerically measured temperature terms, does not exist and never has in historical times.

    To see for yourselves how these break-point adjustments are actually applied, and the consequences of HOW they are applied, see:

    Why Automatic Temperature Adjustments Don’t Work
    To quote from the opening remarks in this WUWT post:
    _________________
    In a recent comment on Lucia’s blog The Blackboard, Zeke Hausfather had this to say about the NCDC temperature adjustments:

    “The reason why station values in the distant past end up getting adjusted is due to a choice by NCDC to assume that current values are the “true” values. Each month, as new station data come in, NCDC runs their pairwise homogenization algorithm which looks for non-climatic breakpoints by comparing each station to its surrounding stations. When these breakpoints are detected, they are removed. If a small step change is detected in a 100-year station record in the year 2006, for example, removing that step change will move all the values for that station prior to 2006 up or down by the amount of the breakpoint removed. As long as new data leads to new breakpoint detection, the past station temperatures will be raised or lowered by the size of the breakpoint.”

    In other words, an automatic computer algorithm searches for breakpoints, and then automatically adjusts the whole prior record up or down by the amount of the breakpoint.
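
    As a rough sketch of what that quote describes (my own toy code in Python, not NCDC’s actual pairwise algorithm): estimate the step at a detected breakpoint, then shift the entire prior record by that amount.

    import numpy as np

    def adjust_at_breakpoint(series, bp, window=60):
        # Estimate the step as the difference of means either side of bp,
        # then move the whole record before bp by that amount.
        step = series[bp:bp + window].mean() - series[bp - window:bp].mean()
        adjusted = series.copy()
        adjusted[:bp] += step
        return adjusted, step

    rng = np.random.default_rng(1)
    raw = rng.normal(15.0, 0.5, 1200)   # 100 years of monthly means
    raw[900:] += 0.4                    # a non-climatic step in "year 75"
    homog, step = adjust_at_breakpoint(raw, 900)
    # Every value before the breakpoint has been raised by roughly 0.4 C:
    print(round(step, 2), round(homog[:900].mean() - raw[:900].mean(), 2))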

    60

    • #
      mmxx

      ROM

      I appreciated your interesting contribution of information and views.

      In your rain gauge siting trial, the separation distance was 100m.

      Activities that set out to homogenise recorded weather data across sites separated by several hundred kilometres carry a high risk of error. Those attempting it also open themselves to questions of bias in the assumptions made, for example in estimating break points. As Jo’s and Jennifer’s posts have uncovered, these questions become more pertinent when most homogenised data reverses cooling or neutral trends to warming.

      20

    • #
      Electronica

      ROM, I don’t believe your statement “A ‘break point’ is a sudden shift up or down in the station’s temperature data [or in any data set] which is assumed to be caused by a shift of the station to a different environment” is accurate. Breakpoints can be caused by station moves, record gaps, TOBS changes, and what are called “empirical breaks” (this covers a lot of other reasons whereby systematic errors can cause shifts in the raw data that are not climate related).
      There is so much mainstream literature about the breakpoint issue that you needn’t resort to WUWT to get, shall we say, a very particular point of view on the problem. Quite randomly, I’d point you to an article like:

      http://onlinelibrary.wiley.com/doi/10.1029/2012JD017729/full
      (I’ve linked to this as the full text is available.)

      Mainstream climate science appears to be taking the whole issue of homogenisation and breakpoints very seriously, and it behoves people like you to read the mainstream literature and then see whether or not it makes more sense.
      All I can say is that if temperature data is simply served up RAW it will most likely give you the wrong picture. Why? Because temperature measurement is deliciously susceptible to systematic error. That suits the notion of conspiracy, but if you’ve ever worked as an experimental scientist, you’ll know that systematic error is a bugbear for almost any experimental data ever gathered.

      11

      • #
        ROM

        For a scientific explanation pitched at somebody of your level, Electronica, you are no doubt correct.

        For the guy or gal who has another life to live and only limited time to delve into such details, scientific explanations with all the ifs, buts, wherefores and why-fores are just an overload they won’t bother to read.

        Something the guest commenters at WUWT seem to be very aware of and which is evident in their presentations.

        So I try to make it readable for those people who come to Jo’s blog for information so at least they have some concept of what is being discussed in climate matters.

        Your take on the science is one reason why science has become so narrow and compartmentalized: there are so many aspects to any one subject that nobody can cover the lot anymore, and for a layperson such as myself there is only so much I will try to take in and digest before moving on.

        If the information I am given to digest is simple, presentable and correct in its basics, that will do nicely as far as I am concerned, and I suspect the same goes for most others, apart from the experts in those increasingly narrow fields.

        No doubt Jo would also welcome a contribution in the comments section, and maybe even in a headline post, from the likes of yourself, explaining all those items on adjustments to temperature data that I got wrong or should have gone into in greater depth.
        That’s what the comments section is all about.

        10

  • #
    pat

    Jennifer –

    yes, the desmog thread may cause some to investigate for themselves, let’s hope so. i appreciate many CAGW sceptics were on to the scam long before i was but, post-Climategate, i can’t understand how anyone could not be at least mildly sceptical.

    nowadays, with the nearly 18-yr pause being admitted to by all & sundry, it’s insane how people who otherwise seem quite intelligent continue to believe the most extreme predictions are correct. there’s no explaining it.

    50

  • #
    pat

    31 Aug: UK Daily Mail: David Rose: Myth of arctic meltdown: Stunning satellite images show summer ice cap is thicker and covers 1.7million square kilometres MORE than 2 years ago…despite Al Gore’s prediction it would be ICE-FREE by now
    •Seven years after former US Vice-President Al Gore’s warning, Arctic ice cap has expanded for second year in row
    •An area twice the size of Alaska – America’s biggest state – was open water two years ago and is now covered in ice
    •These satellite images taken from University of Illinois’s Cryosphere project show ice has become more concentrated
    http://www.dailymail.co.uk/news/article-2738653/Stunning-satellite-images-summer-ice-cap-thicker-covers-1-7million-square-kilometres-MORE-2-years-ago-despite-Al-Gore-s-prediction-ICE-FREE-now.html

    20

  • #
    pat

    meanwhile, over in CAGW-land:

    31 Aug: Times of India: Arup Chaterjee: Kolkata wildlife photographer off to ‘shoot’ polar bears
    City-based wildlife photographer Amartya Mukherjee, who has been in the forests of India and many other countries to capture breathtaking moments from animal and bird life, is set to break new ground for himself.
    On Wednesday, the 36-year-old chartered accountant will be off to the Arctic to open a new album. And, as on most occasions, to click on a message.
    “My main focus will be on polar bears. The reason why I’m making this trip is similar to those that had me climb Kilimanjaro,” he told TOI. “Global warming and the accompanying climate changes have had glaciers retreat in most parts of the world. I saw far less than what Hemingway would have seen when he wrote ‘The Snows of Kilimanjaro’ and, by the time my three-year-old son grows up, there will be very little snow left on Africa’s highest peak. In polar regions, depletion of ice cover means lesser space for polar bears. As the frozen land disintegrates, they will have to swim unimaginable distances through cold oceanic water to go from one place to another and experts feel the polar bears may disappear in as little as 40 years time. The earlier one goes and documents these lovely creatures in their threatened world, the more conscious people will become. That’s what I am trying as a wildlife photographer.”…
    “We will be in the Arctic Sea for 11 nights with 10 other passengers on an ‘ice-strengthened’ ship called ‘Stockholm’. I’ve chosen a small ship because the deck, from where most of the photographs will be clicked, is relatively low and will help me get closer to eye-level. We will also land on an island or two in a rubber dinghy to get close without being intrusive,” added Mukherjee, who says he is braving a colder climate to get the right light conditions in a place where half the year is in daylight and the other in darkness.
    “Most people go on polar expeditions in July-August when temperatures are kinder. But the sun is overhead and harsh and it’s not good for photography…
    http://timesofindia.indiatimes.com/city/kolkata/Kolkata-wildlife-photographer-off-to-shoot-polar-bears/articleshow/41292207.cms?

    10

  • #
    Roy Hogue

    What if they doctored up the data and nobody came?

    What if they showed off their latest climate model and nobody came?

    Would the climate obey them even if we didn’t look to see what’s going on?

    Does the climate obey them now for that matter?

    No! No! No! And no!

    So why do we look? I could sleep in every morning as profitably as I can worry about what some climate scientist is doing. I wish they would all just stay in bed too. On the other hand, if they did, then the birth rate would surely go up, and there goes another complaint in the making. Is there maybe some deserted island we can send them all to, say Mars? Then we could let them doctor up all the records they want to.

    No way to win. 😉

    20

  • #

    While downloading the NCDC/NOAA data, I spotted this readme file:

    These data originated from the Australian Government Bureau of Meteorology. Data is made public at the following FTP site:
    ftp://ftp.bom.gov.au/anon/home/ncc/www/change/HQdailyT/. Data were extracted and uploaded to FTP on 9 Sep 2011 by Jared Rennie

    Below is a description of the data

    An Australian high-quality daily temperature dataset has been developed by Trewin (2001). The development of the dataset involved detection and removal of gross single-day errors and examination of available metadata for evidence of inhomogeneities. Rather than making homogeneity adjustments in mean temperatures, the daily temperature records were adjusted for discontinuities at the 5, 10, …, 90, 95 percentile levels. This makes the dataset particularly useful for examining changes in the occurrence of extreme temperature events, such as numbers of hot and cold days per year.

    There are 99 non-urban stations in the Australian high-quality daily temperature dataset, with a further four located at major cities. Only limited amounts of Australian daily temperature data have been digitised for the years prior to 1957. Consequently, most corrected daily temperature records are only available from this time. However, a project within the Bureau of Meteorology to digitise hourly and daily data prior to 1957 at 50 key locations throughout the country should enable some of the records in the dataset to be extended backwards in time.

    As well as being useful for studies of temperature extremes, the high-quality temperature dataset has been used as the base dataset for an operational seasonal temperature prediction scheme and is currently being used for routine monitoring of Australian and regional mean temperatures at the monthly and seasonal timescale.

    Reference: Trewin, B.C. 2001. Extreme temperature events in Australia. PhD Thesis, School of Earth Sciences, University of Melbourne, Australia.

    The core of that text hasn’t been changed in the latest published BoM dataset.

    So it looks like an earth sciences PhD thesis is the basis for homogenisation. Must be an excellent one. It’s certainly BIG. Check.

    Appendix B (supplementary info) lists weather stations and where their discontinuities were adjusted. More minima than maxima were adjusted, e.g. Amberley Feb 1952, Aug 1980, Dec 1943; Bourke Dec 1958, Feb 1963; Rutherglen Mar 1966, Sep 1974, Aug 1979; Melbourne Aug 1925, Jan 1930; … just a few examples.

    10

    • #
      Richard C (NZ)

      >”So it looks like an earth sciences PhD thesis is the basis for homogenisation”

      Can’t open/download the tome, but I don’t think it is (and that is in respect of HQ Daily, not ACORN-SAT – see below).

      Refer CAWCR Technical Report No. 049

      http://www.cawcr.gov.au/publications/technicalreports/CTR_049.pdf

      Page 21 pdf,

      3.2 Previous networks [to ACORN-SAT] used for climate change analyses in Australia

      (a) Annual data set. This set was originally developed by Torok and Nicholls (1996) and enhanced by Della-Marta et al. (2004).

      (b) Daily data set. This set was developed by Trewin (2001a)

      Page 53 pdf,

      7. DEVELOPMENT OF HOMOGENISED DATA SETS
      7.2 The detection of inhomogeneities

      Since the ability to detect a breakpoint in a time series is a function of the ratio of the size of the
      breakpoint to the standard deviation of the data, a common technique to improve the signal-to-noise ratio is to apply statistical tests to the difference between the time series at the candidate site and that of a reference series that is representative of the background climate at the candidate site (Peterson et al., 1998). […]

      Reference series are commonly constructed as a weighted mean of data from neighbouring locations. This method was used in the development of previous Australian temperature data sets (Della-Marta et al., 2004; Trewin, 2001a)

      ++++++++++

      So the homogenization basis for the HQ Daily dataset by Trewin (2001) appears to be Peterson et al (1998). This is the dataset incorporated in CRUTEM4/HadCRUT4:

      ‘CRUTEM4 Temperature station data’
      http://www.cru.uea.ac.uk/cru/data/temperature/crutem4/station-data.htm

      ftp://ftp.bom.gov.au/anon/home/ncc/www/change/HQdailyT/HQdailyT_info.pdf
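
      For readers wanting the gist of the difference-series technique quoted above, a minimal sketch (my own, in Python, with invented stations and inverse-distance weights; real schemes vary):

      import numpy as np

      rng = np.random.default_rng(4)
      climate = np.linspace(0.0, 0.5, 480) + rng.normal(0, 0.3, 480)  # shared signal
      candidate = climate + rng.normal(0, 0.2, 480)
      candidate[240:] += 0.6                       # non-climatic step at obs 240
      neighbours = np.stack([climate + rng.normal(0, 0.2, 480) for _ in range(4)])

      # Reference series: weighted mean of neighbours (inverse-distance weights)
      weights = 1.0 / np.array([25.0, 60.0, 110.0, 300.0])
      reference = np.average(neighbours, axis=0, weights=weights)

      # The shared climate signal cancels in the difference series, leaving
      # the candidate's step with a much better signal-to-noise ratio:
      diff = candidate - reference
      print(round(diff[:240].mean(), 2), round(diff[240:].mean(), 2))  # ~0.0 vs ~0.6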

      10

      • #

        >”Can’t open/download the tome, but I don’t think it is (and that is in respect of HQ Daily, not ACORN-SAT – see below).”

        The link I provided is to the Uni Melbourne site. Two download links are at the bottom of the page. The paper has been scanned, which is quite unexpected for a PhD thesis published “recently” at a university. Candidates usually have to provide papers in digital form (commonly TeX-based, using stylesheets designated by the organization) for a number of reasons beyond mere readability.

        The Abstract from that paper appears, almost verbatim, in the PDF accompanying the 2013 dataset provided by BoM.

        I’ve downloaded the CTR’s as well as a bunch of climate data; even though I don’t have the time to spend on that right now. (Paying work beckons.)

        Most disturbing to my mind is the ready and leisurely disassociation of the measurement of physical phenomena from the data. Data no longer have any physical meaning, nor is their variability bounded by the physical universe; they are statistically tortured as though they were just numbers. They may as well be from n-space. They’re not even field intensities (temperatures) any more.

        Data that doesn’t fit their curves becomes “noise”; something to eliminate. Fitting data to models. As in e.g. the above quoted:

        Since the ability to detect a breakpoint in a time series is a function of the ratio of the size of the breakpoint to the standard deviation of the data, a common technique to improve the signal-to-noise ratio is to apply statistical tests to the difference between the time series at the candidate site and that of a reference series that is representative of the background climate at the candidate site

        (bold mine)

        I smell a rat.

        10

        • #
          Richard C (NZ)

          Bernd

          >”Two download links are at the bottom of the page”

          Yes, I tried those but nothing was forthcoming (my connection, my problem). But no matter: the HQ Daily basis was Peterson et al (1998), as Trewin describes in TR049 above, even though the development of HQ Daily was the Trewin (2001) thesis. I’m OK with that. I wouldn’t be if the basis was simply the thesis; that’s the NIWA situation.

          >”Data no longer have any physical meaning”

          Now you’re talking, Bernd. For ACORN-SAT it’s not the homogenization method (Menne & Williams 2009) that’s the problem to my mind; it is the adjustment method (Percentile Matching – PM95) and the application of it.

          See the #38.2 thread below for definition of PM (from a Health Policy page) and an example of percentile matching (from the Casualty Actuarial Society).

          We’re talking about statistical models and parameter (not data) estimates.

          00

        • #
          Richard C (NZ)

          >”I smell a rat”

          In respect to Trewin (my emphasis):

          “the difference between the time series at the candidate site and that of a reference series that is representative of the background climate at the candidate site”

          Yes, a rat. This was also a core bone of contention in NZCSET v. NIWA (although you would hardly know from the Statement of Claim – and it failed).

          They (both BOM and NIWA) are selecting REMOTE comparator stations (calling them “nearby” or “neighbouring”) to represent the background climate at the candidate site when in fact they are nothing like it.

          And I’ll repeat here part of my reply to Mikky below:

          I say, if there is no LOCAL reason to make the adjustment even though the adjustment method (PM95) requires one by REMOTE reason, don’t make the adjustment unless a LOCAL reason for the break can be clearly determined and documented for scrutiny.

          20

    • #
      Richard C (NZ)

      Following on from #38.1

      Homogenization and adjustment are two entirely different concepts in respect to ACORN-SAT.

      Homogenization Method – M&W09

      For the homogenization method refer:

      CAWCR Technical Report No. 049
      http://www.cawcr.gov.au/publications/technicalreports/CTR_049.pdf

      From page 54 pdf (note the italicized element):

      7.2 The detection of inhomogeneities

      The method used for statistical detection (but not adjustment) of potential inhomogeneities in the ACORN-SAT data set broadly follows the method used by Menne and Williams (2009) for the continental United States, to which readers are referred for a full description. Details of the implementation of the method in the Australian context, and of deviations from the original method, are described in the following paragraphs.

      The main deviations from the Menne and Williams method are:….(continues)….

      Adjustment Method – PM95

      For the adjustment method refer Pages 56 – 65 pdf:

      7.3 Adjustment of data to remove inhomogeneities – an overview
      7.4 The percentile-matching (PM) algorithm
      7.4.2 The non-overlap case
      7.5 Monthly adjustment method
      7.6 Evaluation of different adjustment methods

      Page 65 pdf,

      “Based on this evaluation, the PM95 method [7.4] with 10 neighbours was selected for use in most cases. Details of the implementation are given in the following section.”

      Page 69 pdf,

      7.7 Implementation of data adjustment in the ACORN-SAT data set [PM95]

      # # #

      So for ACORN-SAT it’s the M&W09 homogenization method (7.2) and the PM95 adjustment method (7.4) that everyone should get familiar with.

      More to follow on the percentile matching method (PM).

      10

      • #
        Richard C (NZ)

        Following on from #38.2

        TR049 is a bit vague on percentile-matching (no reference is given). It is a statistical model method, from a Health Policy page:

        Term: Matching (Moments and Percentile)

        Definition:
        This is one of the methods used in obtaining parameter estimates of a statistical model by equating the theoretical and empirical moments or percentiles. For moment matching for instance, the theoretical mean or standard deviation will be equated to the empirical mean or standard deviation. For percentile matching, the theoretical mode, for example, will be equated to the empirical mode.

        http://mchp-appserv.cpe.umanitoba.ca/viewDefinition.php?definitionID=103797

        The closest to a tutorial I could find is this from Casualty Actuarial Society http://www.casact.org/:

        ‘Estimation, Evaluation, and Selection of Actuarial Models’

        Stuart A. Klugman
        November 20, 2002

        https://www.casact.org/library/studynotes/klugman4.pdf

        Page 27 pdf,

        2.3 Estimation for parametric models
        2.3.1 Introduction
        2.3.2 Method of moments and percentile matching

        Page 29 pdf,

        Definition 2.27 A percentile matching estimate of θ is…….(equations)

        The motivation for this estimator is that it produces a model with p percentiles that match the data (as represented by the empirical distribution). As with the method of moments, there is no guarantee that the equations will have a solution, or if there is a solution, that it will be unique.

        One problem with this definition is that percentiles for discrete random variables are not always well-defined. For example, with Data Set B and using the definition of percentile from the Part 3 Study Note, the 50th percentile is any number between 384 and 457 (because half the sample is below and above any of these numbers). The convention is to use the midpoint. However, for other percentiles, there is no “official” interpolation scheme.

        The following definition will be used in this Note.

        Definition 2.28 The smoothed empirical estimate of a percentile is found by……(equations)

        Page 30,

        Example 2.29 Use percentile matching to estimate parameters for the exponential and Pareto distributions for Data Set B.

        # # #

        So for ACORN-SAT adjustments, we are dealing with “parameter estimates”, with no guarantee that an estimate will be unique.
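
        To illustrate (my own rough sketch in Python, in the spirit of percentile matching; the real PM95 uses up to 10 neighbour stations and the percentiles 5, 10, …, 95 per TR049 section 7.4):

        import numpy as np

        rng = np.random.default_rng(2)
        before = rng.normal(18.0, 4.0, 3650)  # daily maxima before a breakpoint
        after = rng.normal(18.5, 3.5, 3650)   # daily maxima after it

        # One offset per percentile: the distribution of the 'before' segment
        # is matched to the 'after' segment, so hot days and cold days get
        # different adjustments, unlike a single step offset.
        p = np.arange(5, 100, 5)
        offsets = np.percentile(after, p) - np.percentile(before, p)
        print(dict(zip(p.tolist(), offsets.round(2).tolist())))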

        Enjoy everyone.

        10

  • #
    Mikky

    It should be possible to shed a lot of light on what has been done with a simple bit of reverse engineering.

    Simply subtracting the raw from the homogenised data gives the offset that has been applied.

    At a quick glance it looks like continuously time-varying offsets have been applied, whereas the algorithms described only give step changes in offsets (e.g. from a site move or a time correction) between periods of constant offset.

    If continuously time-varying offsets have been applied, then it may be hard to avoid the conclusion that some highly unscientific data manipulation has been used to arrive at Politically Correct and Funding Friendly conclusions.
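
    That reverse engineering is only a few lines of code. A minimal sketch, assuming the raw and homogenised series are available as aligned NumPy arrays (the function names and the tolerance are mine, purely for illustration):

    import numpy as np

    def applied_offsets(raw, adjusted):
        """Recover the adjustment actually applied: adjusted minus raw."""
        return np.asarray(adjusted) - np.asarray(raw)

    def count_offset_levels(offsets, tol=0.05):
        """Rough step-versus-drift test: round away noise, then count the
        distinct offset levels.  A handful of levels means piecewise-constant
        step adjustments; many levels suggests a continuously varying offset."""
        return len(np.unique(np.round(np.asarray(offsets) / tol) * tol))

    If count_offset_levels comes back as two or three, the record was adjusted with constant offsets between breakpoints, as the described algorithms would produce; dozens of levels would point to the time-varying offsets suspected above.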

    • #
      Richard C (NZ)

      >”Simply subtracting the raw from the homogenised data gives the offset that has been applied.”

      Example of Amberley and Bourke by Marohasy, Abbot, Stewart, and Jensen:

      Amberley Figure 2, page 5,
      Bourke Figure 7, page 11.

      http://jennifermarohasy.com/wp-content/uploads/2011/08/Changing_Temperature_Data.pdf

      >”At a quick glance it looks like continuously time-varying offsets have been applied, whereas the algorithms described only give step changes in offsets”

      No: steps, not time-varying offsets. In the Amberley example above there are two steps, at 1980 and 1996, with a combined downward change to the early series temperature of over 1.5 degrees C.

      For Bourke, the steps are down by about 0.7 degrees C between 1911 and 1915, and then up by about 0.45 degrees C between 1951 and 1953.

      >”If continuously time-varying offsets have been applied”

      They have not been, in either example.

      The contentious step adjustments (not time-varying) have been made in ACORN-SAT when none are justified for any LOCAL reason (e.g. no site move or instrument change, no local change recorded), i.e. an UNKNOWN breakpoint determined by REMOTE comparator sites, as sketched below. The ACORN-SAT adjustment method for such a breakpoint is at #38.2 above.
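
      To illustrate how a breakpoint can be declared purely from REMOTE data, here is a generic difference-series step test in Python. It is a simplified stand-in, not the actual M&W09 detection code, and the names are mine:

      import numpy as np

      def detect_break(candidate, comparator_mean, min_seg=12):
          """Find the most likely breakpoint in a candidate series by scanning
          the difference from a composite of comparator sites (generic step
          test, NOT BOM's algorithm).  Returns (index, step_size) for the
          split that maximises the jump in the mean difference."""
          diff = np.asarray(candidate) - np.asarray(comparator_mean)
          best_k, best_step = None, 0.0
          for k in range(min_seg, len(diff) - min_seg):  # keep both segments usable
              step = diff[k:].mean() - diff[:k].mean()
              if abs(step) > abs(best_step):
                  best_k, best_step = k, step
          return best_k, best_step

      Notice that the comparators enter only through their composite: nothing in a test like this knows, or cares, whether they are 20 km or 500 km away.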

      The case of Rutherglen illustrates this,

      ‘Big adjustments? BOM says Rutherglen site shifted, former workers there say “No”’

      http://joannenova.com.au/2014/08/bom-claims-rutherglen-data-was-adjusted-because-of-site-move-but-it-didnt-happen/#more-37863

      Rutherglen
      http://jennifermarohasy.com/temperatures/rutherglen/

      As does Bourke,

      ‘Rewriting the History of Bourke: Part 2, Adjusting Maximum Temperatures Both Down and Up, and Then Changing Them Altogether’

      By Jennifer Marohasy on April 6, 2014

      […] In a report entitled ‘Techniques involved in developing the Australian Climate Observation Reference Network – Surface Air Temperature (ACORN-SAT) dataset’ (CAWCR Technical Report No. 049), Blair Trewin explains [see #38.2 above] that up to 40 neighbouring weather stations can be used for detecting inhomogeneities and up to 10 can be used for adjustments. What this means is that temperatures, ever so diligently recorded in the olden days at Bourke by the postmaster, can be changed on the basis that it wasn’t so hot at a nearby station that may in fact be many hundreds of kilometres away, even in a different climate zone.

      Consider the recorded versus adjusted values for January 1939, Table 1. The recorded values have been changed. And every time the postmaster recorded 40 degrees, Dr Trewin has seen fit to change this value to 39.1 degrees Celsius. Why?

      http://jennifermarohasy.com/2014/04/rewriting-the-history-of-bourke-part-2-adjusting-maximum-temperatures-both-down-and-up-and-then-changing-them-altogether/

      Yeah, why?

      • #
        Mikky

        Thanks for the info; it was the case of Rutherglen that made me think of “tapered” offsets.

        Looking at the case of Amberley (Figure 1 of the Marohasy, Abbot, Stewart and Jensen PDF), it does look like there is a “sudden” step down, by around 2C, in minimum temperatures around 1980. I can well believe that an algorithm would detect that step down, and that if no nearby data had a similar step then the software might make the “corrections” seen.

        What may have been lacking is human intervention to question what the software has done: what could be the reason for ALL measurements prior to 1980 being around 2C too high? A genuine step down around 1980 may be a much more plausible hypothesis.

        • #
          Richard C (NZ)

          >”if no nearby data had a similar step then the software might make the “corrections” seen.”

          Almost, Mikky, but there’s a complication. If the adjustment method did actually look at nearby data it might be OK, but that’s the problem: in these case studies it does not. They have been shown to ignore nearby data in preference for locations hundreds of kilometres away. Marohasy and others have demonstrated this.

          And then the adjustment is way out of proportion to the break detected.

          >”What may have been lacking is human intervention to question what the software has done”

          Yes, exactly; I think so too. But that gap in human intervention to question the software has now been filled.

          The contentious adjustments are for supposedly non-climatic inhomogeneities, e.g. weather-related ones (would you believe? see TR049) or lawn mowing. Weather should be left in (I would have thought that was a no-brainer), and lawn mowing is regular, not once in a blue moon, so mowing breaks should show up every few weeks or months, or maybe once a year in Oz.

          I say: if there is no LOCAL reason to make the adjustment, even though the adjustment method (PM95 above) calls for one on REMOTE grounds, don’t make the adjustment unless the LOCAL reason for the break can be clearly determined and documented for scrutiny.

          • #
            Richard C (NZ)

            >”They have been shown to ignore nearby data in preference for locations hundreds of kilometres away.”

            Case of Rutherglen:

            Jo says: Let’s check out those [near] neighbors (Deniliquin, Wagga Wagga, Sale, Kerang, Cabramurra)

            http://joannenova.com.au/2014/08/bom-finally-explains-cooling-changed-to-warming-trends-because-stations-might-have-moved/

          • #
            Mikky

            >”I say: if there is no LOCAL reason to make the adjustment, even though the adjustment method (PM95 above) calls for one on REMOTE grounds, don’t make the adjustment unless the LOCAL reason for the break can be clearly determined and documented for scrutiny.”

            Totally agree, and I expect most people would.

            No doubt every “algorithm” used by BoM has some general merit; what seems to be lacking is common sense in how those algorithms are applied, and sensible sanity checks on the results.

  • #