The future of climate alarmism is bogus statistics

Dr David Evans and Joanne Nova

The temptation is all too strong. How many bureaucrats would work just as hard to show that their department was less important, less necessary, and less deserving of funding? It’s the fatal trap of socialist management. The incentives are wrong.

When governments are faced with poor reports, but they write their own report cards, they have many options to upgrade their “score”. It’s insane to think that people might not take every opportunity they can to improve their mark. They are human.

Big problems like inflation, unemployment, national growth, or global temperatures can be “improved” two ways – one way takes tough decisions and years of work, and the other way takes a quiet statistical summit, a white paper and an in-house training weekend. It’s easier to “solve” big problems by changing the way you measure them. By changing definitions, methods of interpreting the data, or through sheer statistical chicanery, it’s possible to issue press releases with the words “improvement”, “better than expected” or at least “figures have plateaued”.

For example, the inflation of the 1970s was partly “cured” by defining inflation as the consumer price index (CPI), then changing the way the CPI is measured so that it reads lower. Today, the US CPI is about 3 percentage points lower than it would be if the methods of 1980 were used. Another example is unemployment, where governments continually refine what counts as “unemployed” so as to lower the unemployment number.

There doesn’t need to be a conspiracy. There is constant pressure and reward in one direction, and very little incentive to “find” or “discover” errors that increase the CPI. It’s a ratchet effect: dozens of incremental changes slowly work the CPI figure endlessly downwards.

It’s hard to measure prices. There is a lot of room for tweaking. You’d think an arithmetic average would be enough, but apparently it’s better to average geometrically. Did you know? It doesn’t work like that for us at the checkout. And the quality of goods is improving, true, but how do you compare a modern personal computer with one from 1990? Someone has to decide just how much better the new computer is. That’s open to interpretation, and that interpretation affects welfare payments. Seriously, if someone decides a modern PC is worth, say, ten old ones, then its price effectively “falls” tenfold; the CPI “falls”, and so do future social security cheques. It’s sophisticated theft.
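
A minimal sketch of the arithmetic-versus-geometric point, with made-up price changes (by the AM-GM inequality, the geometric index can never read higher than the arithmetic one):

```python
# Hypothetical price relatives: each item's price this year divided
# by its price last year. Numbers are invented for illustration.
price_relatives = [1.10, 1.02, 1.25, 0.98, 1.07]
n = len(price_relatives)

# Arithmetic mean of the price changes.
arithmetic = sum(price_relatives) / n

# Geometric mean: multiply the relatives, then take the n-th root.
product = 1.0
for r in price_relatives:
    product *= r
geometric = product ** (1.0 / n)

# By the AM-GM inequality the geometric mean never exceeds the
# arithmetic mean, so an index averaged geometrically reads lower.
print(f"arithmetic: {(arithmetic - 1) * 100:.2f}% inflation")  # 8.40%
print(f"geometric:  {(geometric - 1) * 100:.2f}% inflation")   # 8.02%
```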

There is an excellent summary of John Williams’ work “auditing” the CPI numbers at his site Shadow Statistics. In a way, he is the “Steve McIntyre and Anthony Watts” of US economic statistics. If you are not aware of how deep the “adjustments” go, and how much those numbers are transformed, get ready to be shocked.

Governments are going to have the same temptation with global temperature, mismeasuring it to make it appear to be rising faster than it really is.

There will be tremendous pressure on those who measure global temperature to show a rising temperature. The minister who commissions a report that finds that CO2 is minor may be right, but he or she is also now a minister of a smaller department. Likewise if Directors of Climate Institutes announce a finding that there is less reason to fear carbon, they are also announcing their own budget cut.

The organizations that measure temperature are in government control. No private group has an interest in measuring global temperatures over the decades required. The government organizations that measure temperatures will tend to produce the answers the government needs. Anyone with a passion to find the adjustments that cool the statistics will feel like they wade through mud uphill; all that hard work and yet no one will get excited. Rewards will be muted. People who do not comply with the unspoken culture will be effectively forced out of those organizations—their careers will stall, or they won’t be hired in the first place if they don’t have the right “attitude”. The staff will self select. Only true believers, for whom the ends justify the means, and those with a sufficiently flexible attitude to truth, will survive and thrive in those organizations.

Think a future of systematic cheating of temperature data to justify taxes, profits, and world government is too cynical? It’s already started:

  • 89% of official ground-based thermometers used to measure temperatures in the USA violate siting requirements—because they are too close to an artificial heat source. NOAA has a $4 billion budget, but it took a team of volunteers to go out and do the site surveys that NOAA ought to have done. Thanks to Anthony Watts.
  • More than half of the worldwide official temperature-measuring stations are at airports, near radiating heat sources like tarmac, and even in the wash of jet exhaust. Land-based thermometers are mainly measuring increased urbanization and increased air travel.
  • Since 2003 the ARGO network has used the most advanced temperature sensors in the ocean, but the results have already been adjusted “up”. 3,000 buoys roam the world’s oceans, continually diving down to measure temperatures and radioing back the results. To the surprise and annoyance of the government climate organizations, Argo initially showed ocean cooling. Soon afterwards, the buoys giving the coolest results were excluded from the network – their results were simply omitted. Maybe this was a legitimate correction, but like almost all the “adjustments” to any equipment, it effectively increased “measured” warming. This is not random…
  • The Briffa Hockey Stick Graph effectively “adjusted” down all the temperatures for the last 1000 years thanks to one freak tree that grew wildly fast in the last 15 years. Somehow hundreds of reviewers at the IPCC didn’t look at the data or include all the other studies that disagree. Any implication that this graph was “expert”, verified, or carefully checked, was therefore so baseless it was equivalent to fraud. Again, despite billions in research budgets for climate research, we wouldn’t know about the entirely invalid nature of this graph if it weren’t for the work of a dedicated volunteer – Steve McIntyre.

How would any one person know if global warming is occurring?

People simply don’t notice changes as small as 0.5°C, the amount of global warming over the last century. Even if you thought it was cooling, those with control of the statistics could explain that it was just “regional variation”. It’s beautiful: this scam could go for decades.

Even if snow falls earlier than usual and temperatures are noticeably cooler, what’s to stop a dedicated searcher “finding” warming in the vague tropics, or the deep ocean, or in the upper troposphere? The only thing stopping them is the work of volunteers. This is a crazy way to run civilization.

Cross posted with CFACT as Government-has-the-wrong-incentives

67 comments to The future of climate alarmism is bogus statistics

  • #
    Henry chance

    We have recently seen data that is tampered with and data that is secret. How can we call it science if it is hidden data?

  • #
    Lionell Griffith

    Henry: “How can we call it science if it is hidden data?”

    It’s easy when you hold that reality is a social construct and that words are references to subjective intentions derived from that social construct. All you have to do is get enough people saying the same things and it becomes “real”. It’s the flower and fruit of post-modern philosophy. Yes, it’s rotten to the core and will result in the total collapse of modern technological civilization. THAT is its purpose. It comes from the wish to free one’s self from reality and the responsibility for knowing it. The one who has that wish knows on some level he is not fit to be called human and thus hates the fact he IS human. He seeks to destroy that which he hates.

    Reality doesn’t care which way you choose. Choose correctly and you thrive. Choose incorrectly and you die. There is no middle ground except that of the cannibalism of the productive for the sake of the wishful. Such a practice appears to work as long as there is other people’s wealth to steal and consume. When that runs out, it’s the end. Our current “leaders” are moving us rapidly to that end.

  • #
    Alan Sutherland

    Yes, this is the way bureaucracy works. There is however interdepartmental competition to take into account. The Ministry of Economic Development (as an example) can be downscaled because the Ministry of Climate Change is saying we have to curb development for the climate. Now the MED is unhappy about this because it would mean less staff. The person in charge has less status and this will be reflected in his/her salary. Eventually there will be a fightback, usually by the MED forming a new department (e.g. called the Economic Response Unit to Climate Change). Once staffed up, a secret agenda will be to permit some people to research climate, the purpose of which is to reduce the scaremongering so that at budget time the MCC gets less of the new budget allocations than the MED. But not to kill climate change fears, because everyone knows that ALL departments and Ministries get more money the more we keep the fear going.

    In NZ, we had a referendum on the issue of parental smacking. 84% of the 48% who responded said that a light smack should not be a criminal offence. Mr Key didn’t like this answer, nor did a number of other politicians. So the numbers were tortured. 84% of 48% is only 40% of the voters, so the result is not clear! Then of course we have surveys where 1,000 people are asked who they would prefer as Prime Minister. 58% might say John Key, so this is a positive response Mr Key can be happy with. Hang on a minute, 1,000 is only 0.025% of the population, so 58% of 0.025% is 0.0145% of the population. The point of all this being that you pick the answer before you torture, or don’t torture, the numbers.

  • #
    Bruce

    Here are some encouraging statistics from the UK that show that despite all the govt propaganda, the public aren’t buying it.

    http://www.timesonline.co.uk/tol/news/environment/article6916648.ece

  • #

    We added a new chart on Friday that speaks to the issue of temperature mis-measurement. The chart can be found here:

    http://www.c3headlines.com/

    or, here, once it moves from the top of our home page:

    http://www.c3headlines.com/chartsimagespdfs/

    C3H Editor

  • #
    Girma

    The one and only one issue in the global warming debate is the TRUTH.

    The one and only one question is “Does human emission of CO2 cause global warming?”

    One method to find the TRUTH is to compare the computer model projections with actual observations.

    Case Closed!

  • #
    Denny

    Girma, Post 6,

    What are you? A Parrot!!! Go to the last article Joanne posted and see my reply to your parrot talk..I do agree though on what you state! :o)

  • #
    Denny

    Joanne, wow, where do you get your energy??? Another great article!..I came across this one yesterday..It took me a while to go thru it and post. Information that will help the Realists’ cause!

    http://www.globalwarminghoax.com/e107_plugins/forum/forum_viewtopic.php?1226.last

    Here’s a record of 447 peer-reviewed papers that help the Realists! You’ll scratch your head at the number, but I explain why!

  • #
    Mark Stevens

    This beautiful Saturday morning, the useful idiots in chief unleash the latest “scientific” scare tactic/prognosis of the forthcoming apocalypse in Australia (the script could be plagiarised from the new flick 2012!). And that only the good and the great people’s heroes kevi & penny of COP15 can lead you to salvation.
    Welcome one and all to the dystopic Orwellian new world order. The prospective imposition of an AGW-contrived global government will be a definitive scorched mark in mankind’s short history. I have a feeling that there will be tears before bedtime.

  • #
    Lea

    Why is manufactured indispensability solely the domain of “socialist management”? Don’t you hairy-chested freelancers try the same thing on?

    Why would any government be motivated to overplay AGW with bogus statistics? To do so only brings heat and scrutiny from all parts of the political spectrum. The government is accused of not doing enough or wasting time and money on a non-problem. Are you suggesting any government globally is enjoying this trauma? The issue, implying as it does a threat to toys, playtime and Cornucopian whimsy, is pure poison. Governments everywhere have been downplaying the data for more than two decades.

    Australian governments and oppositions have been dragged kicking and screaming to acknowledge AGW, and that action on emissions should be taken, and the result? Mock schemes that allow the maintenance of the political pantomime, pushing targets into a contingent-laden future, without in the least mitigating the general cynicism. The government would leap on any substantial, scientifically coherent refutation of AGW. It doesn’t exist, despite – or perhaps because of – dogged volunteerism.

  • #
    Tel

    I think the real problem is the central clearing-house for raw data. If the raw data gets published (e.g. downloadable on a web page) then systematic searching for weirdness can be done by anyone and everyone. If I set a big concrete block into the cliffs near the ocean and mark tide lines on it then that makes a real measurement, but if I send the measurements in secret to a central office who aggregates the result of many such sea-level markers then we are all trusting a single point of failure. Broad publication of the raw data solves this: the central collection point must be forced to use public channels to obtain raw data, such that anyone can obtain the same data.

    With regards to the economic data, often a single central authority gets to collect the raw data and present it for publication the way they want it. With prices, anyone can collect data and it is pretty well understood amongst householders that the price of all the basic things you need to survive has gone up in recent years, much faster than incomes have. All the official number juggling can’t convince people better than what they see for themselves in the supermarket. With things like GDP and employment, people are kind of forced to trust government bogostats because these things are too difficult to collect for oneself.

    On the inter-generational issue, there is a genuine measurement problem with trying to figure out in a quantitative manner how much better my life is compared with my grandfather’s. It just doesn’t have an easy answer: the world has changed, lifestyles have changed and standards have changed. Lots of complex systems are genuinely difficult to measure.

  • #
    Tel

    Why is manufactured indispensability solely the domain of “socialist management”? Don’t you hairy-chested freelancers try the same thing on?

    In a free market, the economic term for this behaviour is “rent seeking” and yes it happens. However, sometimes such rent seeking opportunities are directly bestowed upon favoured individuals by government decree (e.g. the Patent system) so the real authority comes from the central planner. When such things are done correctly, the rent seeking is limited (e.g. Patents expire after a time and the invention goes into the public domain).

    In other cases, rent seeking opportunities are found by achieving a monopoly status or at least a partial monopoly status (known as cornering the market) and this is a well understood problem in free market systems. That is why it is desirable to have not only a free market but a competitive market where no single player can make themselves into an indispensable gatekeeper.

    The real question is what mechanisms exist to break out of a monopoly stranglehold? In a free market it is pretty much impossible to hold a monopoly position for a long time without government support. If government does its job correctly and works against monopolists then the market will regain competitiveness over time as people find workarounds. The harder the monopolist milks their advantage, the more incentive they generate for workarounds.

    Now we ask what mechanisms exist for escaping a central planned government monopoly situation? Competition won’t help when the government can just rule competitors illegal. The only way to break it is to use the democratic system to politicise the issue, which takes huge time and effort. The core purpose of this website is to defeat the centralisation of science and return to a competitive marketplace for ideas where all comers play on equal footing — which requires politicising the issue.

  • #
    Tel

    Like magic, out pops the perfect example of climate statistics at work:

    http://www.abc.net.au/am/content/2009/s2742793.htm

    So for example if we’re talking about the return frequency of the, say, the one-in-100-year storm, that is going to affect different parts of the country in a different manner.

    And for example, I think in Sydney there were some major storms a while ago and if you’re looking at the one-in-100-year event that’s going to become far more frequent by the year 2100.

    Someone is going to have to explain the bold part to me in very simple words.

    But there’s more:

    One hundred and twenty ports are within 200 metres of the Australian coastline, …

    Putting ports on the coastline, dumbarse Australians. That could never work. For starters, the tide goes up and down by more than a metre, so the port must obviously get flooded once a day. Gotta build a port on top of Uluru where it will be safe.

    The Climate Change Minister Penny Wong says she too is concerned by this assessment and a national emergency plan needs to be put in place.

    And from now on, every planning and development approval must consider climate change as one of the greatest risks.

    Ahhh central planning, it fixes almost as many things as beer does.

  • #
    Lea

    Tel, if anyone and everyone is interested in the raw data, then they need to observe rigorous standards in its handling and, very importantly, its dissemination. Systematic searching for weirdness already applies, despite cynicism about scientific process by an agitated few. Who is going to audit the competing meteorological products that the free market delivers to government? Will there not ever be pressure to sugar the pill?

    Meanwhile, the suggestion in this article is that somehow there is an equivalence between the political handling of manipulable social and market indices and the handling of physical data like air temperature, because they are both collected/commissioned by government. Argument by false equivalence is pretty weak. One simply can’t omit meteorological data without quickly falling below a threshold that renders the pure and/or derived product useless. While fiddling the CPI is naughty, it doesn’t render the product entirely useless, and there are other means of estimating the full detail pretty well.

    Government has a monopoly on meteorological data because it is expensive to gather thoroughly and process, and private enterprise showed little interest in its value and potential because it requires rather more investment than the immediate return it delivers. Sad but true. This isn’t the centralising of science by decree, it’s market forces at work. Has the government ruled the gathering of met data by private groups illegal? No. The market failed to identify a profitable way of gathering detailed information and left the field to community subsidy. With the rise of markets for meteorological products stimulated by the rise of communication technology, private enterprise now sees profit in buying some of that data to package for resale. Private enterprise also provides equipment and re-analysis, but how many private dedicated weather satellites are there, and how many private met station networks?

    The unanticipated current public interest in meteorological data quality is now used as a justification to decry the “socialistic” “government” “non-transparent” involvement in its collection. If this website wanted to “defeat the centralisation of science” and bring on the triumph of the marketplace, they wouldn’t be offering trite ‘analysis’ and unsupported assertion in place of a real technical product. Or begging for chocolate.

  • #
    Tel

    Tel, if anyone and everyone is interested in the raw data, then they need to observe rigorous standards in its handling and, very importantly, its dissemination.

    Yes, absolutely, but it is not difficult. A person takes a reading and puts together a data record for that one reading, then they sign the record with their own personal cryptographic key and upload that record to a database which is replicated to many places around the Internet. The person who took the reading is personally responsible for the honesty of all records signed with their personal key.

    Once distributed to various websites, anyone can download these records and verify the signatures.

    If you want even better protection, each measurement station should be set up so that the hash of the previous record from that station is incorporated as a field in the current record. This will make it obvious if any record is missing from the chain.

    There’s a good explanation of what can be done here:

    http://en.wikipedia.org/wiki/Public-key_cryptography

    Of course, I’m not suggesting actually encrypting the data records, merely signing them for protection against tampering. Software distribution already uses this technique, so it has been tested, but if particular flaws were discovered in the process then they could be addressed as they came to light.

    I offer my consulting services at extraordinarily reasonable rates should anyone want to implement my suggestion.
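
For what it’s worth, here is a minimal sketch of the scheme Tel describes: signed records chained by hash. It assumes the third-party Python `cryptography` package for Ed25519 signatures, and the record fields and station name are invented for illustration, not any real network’s format.

```python
# Sketch of Tel's scheme: each reading is a record, signed by the
# observer's private key, and chained to the previous record's hash.
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric import ed25519

# The observer generates a key pair once; the public key is published.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

def make_record(station, temp_c, timestamp, prev_hash):
    """Build one signed reading, chained to the previous record's hash."""
    body = json.dumps(
        {"station": station, "temp_c": temp_c,
         "time": timestamp, "prev": prev_hash},
        sort_keys=True,
    ).encode()
    return {"body": body, "sig": private_key.sign(body)}

def record_hash(record):
    return hashlib.sha256(record["body"]).hexdigest()

# Build a short chain of readings (made-up station and values).
chain = []
prev = "genesis"
for t, temp in enumerate([15.2, 15.4, 15.1]):
    rec = make_record("station-42", temp, t, prev)
    prev = record_hash(rec)
    chain.append(rec)

# Anyone can verify: signatures prove who took each reading, and the
# hash chain makes a missing or altered record obvious.
prev = "genesis"
for rec in chain:
    public_key.verify(rec["sig"], rec["body"])  # raises if tampered
    assert json.loads(rec["body"])["prev"] == prev
    prev = record_hash(rec)
print("chain verified")
```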

  • #
    Lea

    As far as dating and tracking data, that process is fine, Tel. But competent people need to ‘tamper’ with machines and their output for many practical reasons, and stakeholders need to recognise this. One cannot screen the stakeholders for technical literacy and motivation.

  • #
    Tel

    Government has a monopoly on meteorological data because it is expensive to gather thoroughly and process, and private enterprise showed little interest in its value and potential because it requires rather more investment than the immediate return it delivers. Sad but true. This isn’t the centralising of science by decree, it’s market forces at work.

    There are two quite distinct issues here, one being who is funding the data collection process, and the other being what happens to the data once it is collected.

    With regards to collection, private collection is real and has always been a part of data collection although you will rarely see a full thank-you list coming from any of the big budget operators. One example:

    http://www.smh.com.au/news/technology/take-the-weather-with-you/2009/03/18/1237054977910.html

    As a general rule, individual collection points don’t make much money, the money goes to the people who aggregate the data. However, technology is making it much cheaper and easier to build this kind of collection network.

    I don’t have a problem with public money going into stimulating additional data collection beyond the “natural” level that a free market would entail. However, it must be recognised that by deliberately stimulating additional supply, they automatically discourage other potential suppliers (that’s how market feedback works). Thus, the explanation, “We have to do this because no one else is!” means nothing. I’ll also point out that government could get a lot more data collection for a lot less money by working with private data collectors (e.g. offering subsidised weather stations) rather than working against them.

    With regards to dissemination, as I said most small-scale private collectors are perfectly happy to give away their data either free or for a very low price. If my tax money is going into government data collection then, as the financier of this operation, I would like the government raw data to also be released to the public, in total, either free or at a very low price. When taxpayer money goes into analysis then the results of this should fall under the same rule as far as I’m concerned.

    With regards to private data aggregation and analysis, I recognise that these can be quite costly, so a competitive market in such things would ensure that buyers get the best price available. If government is determined to push taxpayer money into large scale aggregation and analysis of data such that they kill any private interest in supply, then from a science point of view there will always be someone willing to run an independent analysis (out of determination to prove a point, even though it may be unprofitable to do so).

    Finally I’ll also point out that the worldwide interest in climate data right now is primarily the result of governments threatening to crank up our tax for little or no detectable benefit. In this sense, it is not genuine productive interest in such things, it is merely a defensive interest to prevent panic merchants using it as a club to whack their opponents. I assure you that I have much better things to do than worry over global warming, but if my tax keeps going up and up every time I get a little bit ahead, and if a litany of weirdo excuses are used to discourage me from even making any effort in life then I simply cannot do all those other things that I’d actually enjoy doing so I’m stuck here manning the barricades and trying to deflect cut number 998.

    I don’t want to be the enemy of the central planners, but they put me in a position where I have no other good options. All they need to do is leave me free to get on with things and I’ll be out of their hair. I’m even willing to pay my share of protection money and write that off as one of life’s expenses, but this open-ended “take what we want” arrangement is too much.

  • #
    Mark Stevens

    The Future Of Climate Alarmism Is Bogus Statistics

    no kidding. penny peril`s bedtime tales of terror SLEEP WELL MY PRETTIES.

    And what a globally governed coincidence, the Nobel prize winning pres (good bloke, left-of-field category) approves an `asian trading block` THIS WAY TO THE NEW WORLD ORDER

    interesting times indeed…

  • #

    Thanks for the other examples of “adjustments” or data corrections in favour of warming.

    This is the perfect thread to put up all the examples you can find, or even just write up examples you remember. Let’s compile a list. It would make a very persuasive paper or post. I’d really appreciate the help.

    Links to graphs of adjusted data or scientific papers would be very useful. 🙂

    Cheers,

    Jo

  • #
    Tel

    This is off topic and indulgent but it’s also the funniest comment I’ve seen in a while:

    http://blogs.abc.net.au/offair/2009/11/a-big-dry-australia.html

    In California Democratic Senator Dianne Feinstein is trying to block the building of solar power plants in the Mojave Desert. It’s a move that, understandably, bemuses the Governator.

    “If we can’t put solar power plants in the Mojave Desert, I don’t know where the hell we can put it,” Arnold Schwarzenegger said.

    You have to get the Conan accent working right and read it nice and slow 🙂 think along the lines of, “If you don’t bring me strength in battle…”

    At any rate, the Luddite aspect of the Green movement is starting to accumulate disgruntlement, good!

  • #
    Rod Smith

    I guess I’m just being an old stick in the mud but —

    re: Raw weather data availability —

    The WMO Global Telecommunication System (GTS) still moves all significant surface and upper-air observations plus forecasts in near real-time to telecommunications hubs and thence to customers around the world. Basically this is to support world wide aviation operations, centralized short and long term forecasting, and weather alerts/warnings.

    Personally, I don’t include most climate network observations, especially those of just temperature, plus possibly a daily precip accumulation, as being “significant,” and I don’t believe they meet WMO basic reporting standards or are generally available via GTS, however I could be mistaken.

    These world-wide observations and forecasts are generally taken automatically or by trained and qualified weather observers and are transmitted very shortly after creation. Thus there is only preliminary quality control. The “raw data” is generally collected at several central locations, put through quality control, and the results archived. I think that some of this pre-archive “quality control” now amounts to fabrication, but that is just my opinion.

    I would agree whole-heartedly that the “raw” data, BEFORE ANY CORRECTION, should be archived and available to all potential users. In fact, I don’t believe this is the case in any of the current archives. And since all (as far as I know) of these archives are operated or sponsored by governments, it seems to be a political problem.

    I would think the best way to force the issue would be through WMO. Every person here ought to be able to pick up the phone and ask his own particular national WMO representative why this can’t be required. I haven’t talked to a WMO representative for years, but when I was in the weather “business” I did, and I got good, straight, timely answers to my questions.

    I would also assume that in most cases, governments can (and I’m sure, do) direct their WMO representatives to push for desirable regulations, hence it might be useful to contact legislators as well.

    As an addendum, I will add a paragraph describing the format of 2008 world-wide surface observations provided on DVD by NCDC:

    “Element Quality Data Section – The element quality data section contains information on data that have been determined erroneous or suspect during quality control procedures. Also, some of the original data source codes and flags are stored here. This section is variable in length and contains 16 characters for each erroneous or suspect parameter. The section has a minimum length of 0 characters and a maximum length of 1587 (1584 plus a 3 character section identifier) characters.”

    Has anyone checked that source? Is the raw data there?

  • #
    Coolio

    @ Alan Sutherland

    Last year I asked CDIAC for an update of their aged data tables.

    Their response was:

    It could be any one of a number of reasons, depending on which data set you are talking about.

    Some are politically sensitive and require a lot of scrutiny.

    Politically sensitive… ahhh.

  • #
    Lea

    So, Coolio, if you ran a database and you noticed that some users were producing sub-standard products from that data, would you not be wary?

  • #
    Ian George

    David and Jo,
    Totally agree. I am checking some statements they have recently made and am looking into the following.
    ‘NSW statewide average maximum temperature was 25.2°C which is 0.7°C above the historical average of 24.4°C.’
    This is a statement from the BOM ‘The Recent Climate – summaries and diagnostics’ posted on their website at:
    http://www.bom.gov.au/climate/current/month/nsw/summary.shtml
    But check the summary statistics table which lists all the W/S in NSW – there appear to be more cooler temps than warmer temps.
    So I added all the temps in the summary and divided them by the 185 stations listed and found that the average should read below 24.4°C, not above.
    I asked the bureau how they arrive at the average temp but so far no answer. Does anyone know how they do it?
    They also announce events like Darwin’s hottest October ever (‘Maximum temperatures were unusually warm in Darwin this October, with Darwin recording its warmest month on record’). However, they neglect the higher Oct max temps in the late 1800s, especially 1889 with 35.7C (http://www.bom.gov.au/jsp/ncc/cdio/cvg/av). If they keep giving this sort of misinformation over the media, people believe it and they fulfil their agenda.
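
A possible explanation for Ian’s discrepancy (hedged, since the bureau hasn’t answered him): statewide averages are generally computed from an area-weighted gridded analysis rather than a straight mean over stations. A toy sketch with invented numbers shows how the two can differ when stations cluster in one climate zone:

```python
# Toy illustration (made-up stations, temps and area shares): a
# straight mean over stations differs from an area-weighted mean
# when stations cluster in one climate zone.
stations = [
    # (name, mean max temp in C, share of state area it represents)
    ("coastal-1", 23.0, 0.05),
    ("coastal-2", 22.5, 0.05),
    ("coastal-3", 23.5, 0.05),
    ("inland-1",  26.5, 0.45),
    ("inland-2",  27.0, 0.40),
]

simple = sum(t for _, t, _ in stations) / len(stations)
weighted = sum(t * w for _, t, w in stations)  # weights sum to 1

# Many cool coastal stations, but they cover little area, so the
# area-weighted figure comes out warmer than the simple station mean.
print(f"simple station mean: {simple:.1f} C")   # 24.5 C
print(f"area-weighted mean:  {weighted:.1f} C") # 26.2 C
```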

  • #
    Lea

    Ian, did you check Note 1 at the foot of the table? Some listed stations do not meet the minimum time criteria for calculating variations in the long term. I think the claim for Darwin is based on the Darwin Aero site, which has been the official site since 1941… as in fact the BOM press release states. I agree they could have raised the old Post Office data as a caveat, though I’d add that pre-1910 data has quality control issues in many cases.

  • #
    Tel

    So, Coolio, if you ran a database and you noticed that some users were producing sub-standard products from that data, would you not be wary?

    No, you really don’t get it. The idea is to have each record signed off against the source who generated that record. Databases have filter algorithms, so if someone wants to filter out a particular source for whatever reason then that would be specified in the query. The important point is that all the raw data is available and tampering is not possible. That is the whole and complete job of the database operator.

    Analysis and filtering of that data is a separate step so the person doing the analysis gets to decide which filters are appropriate for their particular purpose. If they produce a product then they document the source of the raw data, and they put their own trademark to the product that they produce.

    It is absolutely unacceptable for the database itself (or admin thereof) to decide to impose filters and to start deleting or adjusting records, nor should they stick their nose into trying to tell people what purpose the data might be used for. Such things are not the business of the database administrators.

    If you believe that calibration is important then by all means encourage a system of calibration signoff on the data source (for a given date). There’s a good place for government in this process (and I don’t mean by taking over the whole process):

    * Setting standards for data record format to encourage easy exchange of records.

    * Setting standards for a signoff system so that verification and filtering can be done on a systematic basis

    * Pushing small subsidies into the industry such as offering cheap calibration services, or an easy to use national register of approved calibration services (i.e. foster a marketplace).

    * Putting together an open standard for how a data aggregation server should operate (in terms of delimiting the responsibilities and interfaces) so that it becomes a commodity rather than a special case.

  • #
    Lea

    Tel @ 26, thank you, but I do get it; you’ve presented a model data admin scheme. Perhaps if Coolio provided the raw data – the full detail of his interaction with CDIAC – we’d have some context to evaluate the truncated response we have at present.
    You may find BOM’s data practices unacceptable but, given that in their case the data administrator is also the dominant collector and user of that data, they are more than aware of the historical and technical issues that compromise data quality.
    In response to your post @ 17, have you any evidence that government is ‘working against’ private data collectors in this area? Or does the need for certain standards impose high costs on small players? Folks everywhere are running independent analyses of government data, but their independence doesn’t guarantee the soundness of the work, no matter how superficially noble or ideologically attractive their motivation. Re your penultimate paragraph, it’s unfortunate that you should find the matter so demotivating. Your allusion to government keeping private enterprise out of meteorological and climate research ignores the historical context; while a partial/full privatisation model for BOM may be workable today, it took a century of socialising the losses to get it here. Ooh, I see my ‘data’ @10 has been ‘filtered’…

  • #
    Tel

    I’m well aware that the current system involves a single monolithic entity that does all the jobs: data collection, data aggregation and data analysis. What you don’t get is why this design is precisely where the problem comes from. The internal couplings of the current monolithic entity are non-transparent. In effect we are handed the “just trust us” line, again and again.

    What you also don’t get is the operation of the basic law of supply and demand in a marketplace. Artificial stimulation of supply only has a small effect on increasing demand, so it crowds out all other suppliers. To use a very simple example: suppose a government department was created to deliver free (or very cheap) oranges all over the community. To some extent people would eat more oranges, but mostly there’s a limit on how many oranges you can eat, so the free oranges merely replace the supply of oranges that people were buying anyhow. Everyone in the business of selling oranges would try to get a government supply contract, but if that was not available they would quickly give up and shut down their business.

    Suppose government really pushed to make people consume more oranges with advertising campaigns, health promotions, and what have you. People would start using oranges for other purposes just to find something to do with them. They would be degreasing engine parts, throwing them at the neighbours’ barking dogs, munging them into fertiliser so they can grow something that is not an orange, etc. The end result would be expensive, wasteful, and inefficient. Then government would say, “But look, we are the ONLY ones who produce oranges, without us you would have no oranges at all!”

    Going back to the problem of a monolithic government department, if the internal workings of such a department are sealed from outside scrutiny then it becomes impossible for niche suppliers to work in a way that usefully couples with the monolithic entity. The niche supplier in the market must effectively rebuild the entire structure from bottom to top, which is an enormous waste of effort. Thus, not only do they crowd out other suppliers but then they put a large barrier of entry in the way of anyone trying to startup a new idea. This is a bad problem when it happens with a private monopoly but worse when done with public money because the people paying for it have no choice.

    If you want to go into the history of meteorological data collection, it was done for military reasons and I suspect that a great deal of current meteorological activity remains driven by the military. I reject the implication that there is some debt that needs to somehow be repaid from past “socialised losses”, I’m only asking for data that the public paid for to be made available to said public (within the bounds of military security classification, should that apply here).

    You seem to believe that the monolithic entity achieves better quality control than would a bunch of semi-independent suppliers and I would argue that not only is there no evidence for this belief, there is good reason to believe the opposite. For starters, who exactly audits the monolithic government department? Well, it audits itself! No one outside has the necessary visibility to be able to perform an audit.

    To provide a more practical explanation, let’s go back to the oranges example. In the one scenario you go into your local office and you get handed a bag of oranges and that’s it, you are told to leave. You have a choice of one supplier. In your bag of oranges you get what you get. Does it make you feel any better if the bag comes with a note, “we checked em”?

    In the other scenario, you have a diverse range of suppliers and some of their oranges are rotten, but you don’t have to buy those; you get to shop around and choose what you want. If the supplier wants your business, they will deliver what you want. The role of government in this situation is to make sure that the suppliers cannot cheat, so they cannot promise one thing and deliver something else. This requires a framework where evaluation of the product can easily be done by all parties; the product does not have to be good, it merely has to be something that can be evaluated.

    In the case of Coolio’s data, by my thinking “politically sensitive” is never a valid explanation for scientific manipulation but making the original raw measurement data available would avoid any need for explanation because Coolio could independently re-run the analysis. In the current situation, no matter how many details of his correspondence Coolio publishes, it will always be one person’s word against another — a useless standoff.

    In the case of the blog filtering comments, of course with a click anyone can read them, but if that’s not convenient I’m sure it could be structured for each user to set their personal threshold settings to hide or unhide comments. You would have to ask Joanne nicely and see what she can get implemented. There are many other potential features that might be nice like keyword tags in comments, the ability to ignore certain people (on a personal preference basis). I think it is harsh and unfair to criticise some limitations of one particular set of tools (put together on a tiny budget) and use this as justification for deficiencies in systems with many thousands of times bigger budgets.

  • #
    Ian George

    Lea
    What’s interesting is that the Darwin PO, which has records from 1888 to 1941, posts an average yearly max temp of 32.6C.
    The Darwin airport W/S began recording in 1941 and goes to the present day and has an average max temp of 32.0C. This is a drop of 0.6C over the past 70 years. Since I doubt the UHI effect would have been very great in the first half of the 1900s, does that mean that temps are cooling in Darwin?
    By the way, Stevenson Screens were used at Darwin PO in the 1890s so it can’t be that either.

  • #
    Brian G Valentine

    Did anyone ever notice how “global temperatures” seem to have a habit of rising when they are actually measured, instead of estimated?

    How many temperature measurements of the North and South Polar regions do you think were actually made 100 years ago?

    And who would have made them?

    As far as I can tell, despite all the advertisement of their capabilities to measure things, all of the data gathered by the Catlin Research Team in their hapless late-winter journey to the Arctic last year were as follows:

    “temperatures never exceeded -40 degrees for a period of two weeks.”

    That is about as much data as any other “research team” would have gathered in a foray into the Arctic or Antarctic in winter 100 years ago, too.

    [I would be interested to know if Catlin made any meaningful observations whatsoever. At least they didn’t have to fool around making sextant measurements of stars to determine their location, which requires considerable practice and repeated measurements to pinpoint their location anywhere within about 100 km of the Pole]

  • #
    Charles Bourbaki

    Ian George; I asked the bureau how they arrive at the average temp but so far no answer. Does anyone know how they do it?

    More to the point; does anyone know why they do it? As an intensive thermodynamic variable, the average temperature of two systems has no meaning if the systems are not in equilibrium. An average temperature of the globe of 288K or whatever is a statistic that has no meaning in physics. And then to get hot and sweaty about a 0.6C rise in it really defies belief.

    See C. Essex, R. McKitrick and B. Andresen, “Does a Global Temperature Exist?”, J. Non-Equilib. Thermodyn., vol. 32, pp. 1-27 (2007), and here for an article by Prof Essex.

  • #

    Hi Lea and others. People, would it be more useful if we saved our negative ratings for comments that were rude, or broke the laws of logic? There have been plenty of those, and I would be more than happy for them to be hidden, as they contribute nothing to the discussion.

    While most of us may not agree with Lea, there is a polite discussion. This is a good thing. Lea may well be asking questions that many people think, so these points are useful, and Tel especially has done an excellent job of answering. (Thanks!)

    Lea, I’ve tweaked the ratings. I hope your point stays visible, but you do understand why your point in #10 is not the case?

    Why is manufactured indispensability solely the domain of “socialist management”? Don’t you hairy-chested freelancers try the same thing on?

    Yes, and it’s called a monopoly, and the legal system or the government ought to try and limit any situation where one group is the main controller of any supply chain. Plus the freelancers, or corporates, are not the ones making the laws, so there should never be a situation where there is a conflict of interest where the group measuring the item is also the group who makes the rules on who has to buy it, and how much they have to pay for it. (Microsoft does push this to extremes, because it does kind of “make the rules”.)

    With the CPI, the government has to pay more if the CPI is higher, so, perhaps unconsciously, everyone in the department heaves a sigh of relief when it is “lower”, and no one parties in the office when someone suggests a factor that would “raise” it. Why? The department-head knows he’ll have to ring the Minister of Pensions or what-not, and explain why they’ve been underestimating CPI for 20 years and really ought to massively increase social security (retrospectively). No one wants to make that phone call.

    Why would any government be motivated to overplay AGW with bogus statistics?

    Read the blog.
    “The minister who commissions a report that finds that CO2 is minor may be right, but he or she is also now a minister of a smaller department. Likewise if Directors of Climate Institutes announce a finding that there is less reason to fear carbon, they are also announcing their own budget cut.”

    All ministers want to be ministers of the biggest most powerful departments. Who wants to be the one promoted to the $100 million Dept of the Environment, only to have to sack half your workers and be in charge of a $50 million department?

  • #
    Louis Hissink

    Jo,

    I think you are right in pointing out that there is no conspiracy behind AGW – the problem is the use of the scientific method to prove an agreed-on consensus that emitting CO2 causes warming. This is called pseudoscience, and it’s a technique that derives from Plato, so it’s not a new thing at all.

    I’ve started looking at the Fabian connection as well, and their aspirations for a just society etc. are indeed laudable and sincere, but they seem unable to understand that socialism actually does not work except under a totalitarian system, or as the mutually agreeing networked collective that Maurice Strong started to create after 1989.

    What we are really fighting against is statism – and on reflection it’s really no different to the feudal system that we are supposed to have left behind centuries ago. Socialism is better understood in this light, rather than through the older Marxist cliches used before. Same robber barons, but this time veneered with impeccable manners and a fondness for statism, where they run the state and indirectly its institutions.

    Hayek pointed out that the road to serfdom happens accidentally – from incremental increases in government regulation to solve mainly imagined problems. Those among us who lust for power (the lust is the same whether for spiritual or secular power) will always remain with us and take advantage of the situation handed to them on a plate, designed by our politicians with their ever-increasing enactment of bills and regulations.

    I often wonder what a politician’s KPI is – the creation of an increasing number of new acts of parliament?

    So Hayek was right – we are literally sliding into serfdom and the majority don’t seem to realise it, especially the younger generation.

    But it’s worth remembering that the long term goal of the Fabians is a world socialist system and they are close to achieving it if Copenhagen is signed.

    As for the Greenie belief in a future catastrophe – well, Velikovsky, writing in his capacity as a psychoanalyst, summarised it in his posthumously published work “Mankind in Amnesia”, for those interested in the topic.

  • #
    Louis Hissink

    Jo,

    Monopoly – originally meant a government-sanctioned exclusivity granted to a particular individual or group, but has now subtly changed to its present meaning. Murray Rothbard wrote on it on the Mises site.

  • #
    Ian George

    Charles,
    I was only asking how the BOM came up with the figure for each state’s average monthly temperature. Does anyone know?
    Do they average out all the available state W/S averages? Do they choose the average between the highest and lowest temps in each area?

  • #
    Brian G Valentine

    Charles #31 –

    You are correct, there really is no way to define the “temperature” of something that never reaches a steady state, let alone an equilibrium.

    Moreover, the “average” temperature of the Earth is described by a variable that is consistently too high to represent an “averaged” temperature; this has been discussed at length by Gerlich & Tscheuschner.

    [I used the “Foundations of Mathematics” series by the “author” with your last name as text books when I was a grad student. I keep thinking you are a group and not an individual. Just joking]

  • #
    Gaz

    With the CPI, the government has to pay more if the CPI is higher, so, perhaps unconsciously, everyone in the department heaves a sigh of relief when it is “lower”, and no one parties in the office when someone suggests a factor that would “raise” it.

    Jo, I can only admire the Quixotic zeal with which you charge into areas about which you have no clue.

    Do you have any evidence whatsoever that the construction of the CPI is influenced by political considerations in even the slightest way?

    By the way, it’s the Bureau, not the “department”.

    If you’re going to attack someone’s reputation, you could at least get their name right.

  • #
    Brian G Valentine

    In Australia the CPI is compiled by the Australian Bureau of Statistics (ABS), which by a 1975 law is not a Government entity. (In the USA, the CPI is compiled by the Bureau of Labor Statistics, which is part of the Department of Labor.) So the CPI would not be expected to be influenced by the Administration’s politics in Australia. But I don’t think Joanne is claiming that the CPI is influenced by the Administration’s or anybody else’s politics; she simply notes that the ABS is not particularly enthusiastic at any time to deliver bad news to the Government or the Public or anybody else.

    I think this has more to do with human nature, frankly

  • #
    Fiona

    Compelling viewing on YouTube: The Great Climate Swindle.

    Documentary includes commentary from many scientists saying that man-made global warming is propaganda not science.

  • #
    Fiona

    Sorry that should have been “The Great Global Warming Swindle”. The layman’s science explained.

  • #

    Jo, I can only admire the Quixotic zeal with which you charge into areas about which you have no clue.

    And my link in the article to Shadow Statistics, and referral to hedonic calculations (via computer prices), as well as geometric calculations, leaves you thinking I haven’t looked carefully into this?

    True, literally it’s the ABS, or the Bureau of Labor Statistics in the US, but I was talking generically, not specifically. I’m not accusing anyone of being dishonest; I’m pointing out how the system creates a ratchet effect with incentives to lower the CPI.

    As for evidence: sure, when I take a breather from debunking climate change I’ll get back to economics. I’ve graphed CPI and money supply for five decades, and they fit tightly until they diverge in the early 1980s. Yes, there is a very real reason, and empirical data, to suspect that the CPI is not measuring what it used to.

  • #
    Girma

    David & Joanne.

    You wrote, “… this scam could go for decades.”

    I disagree. It will not continue more than a couple of years, because we will hold them to account for this discrepancy between computer projections and actual global temperature measurements.

  • #
    Tel

    Interesting article by Essex, McKitrick and Andresen above, but although I see what they are getting at, I really don’t like the article because it claims “statistics are not actually temperatures”. The problem is that ALL temperature measurements are averages; it doesn’t even make physical sense to have a temperature existing at a point in space for an instant in time. Temperature is a statistical property of kinetic energy in atoms; you can’t have a statistical property without collecting a group of samples and performing some sort of digest calculation.

    The authors obviously know this because they state:

    Science and engineering are used to such indirect measurements. Temperature itself, for example, is almost never measured according to its thermodynamic definition, T = (∂U/∂S)_{V, N1, N2, N3, …}, but by measuring a volume, bending, electrical conductance, eigenfrequency of a crystal, radiation spectrum etc. All such measurements rely on different assumptions being met. Physicists and chemists are well aware of these restrictions.

    They make it sound like there is some Mojo within physicists and chemists setting them apart from other mortals, but I assure you that the only Mojo is that physicists and chemists (and engineers) are willing to accept that their measurement has errors, and go through some process to put upper and lower bounds on those errors.

    Consider a practical example: a doctor uses a pyrometer to check a patient for fever by poking it into the patient’s ear. Suppose the pyrometer detects a single photon emitted by the cavity; does that give the temperature? No, you need a whole histogram of photons collected before you can reasonably estimate the temperature. In other words, you need a statistically significant sample. The inside of a human ear is not precisely uniform in temperature, nor is it a black body, but it’s a good enough approximation within the bounds of what you are trying to measure.

    The authors go to extremes to try and prove their point by claiming that a measurement is only valid when no overlap occurs:

    In contrast, it is obviously valid to make the observation that the temperature field of the Sun is hotter than the temperature field of Pluto, yet each of these bodies have nonequilibrium temperature fields. What makes this comparison different? The simple answer is that there are no common values in the respective temperature fields. This was not the case for the example in the preceding section, where the interval spanned by the range of temperature at time t = 0 contained the interval spanned at later times.

    But if you want to go to extremes, the temperature of a lightning channel here on Earth is far hotter than the surface of the sun. Thus there is overlap between the respective temperature fields of Earth and Sun, so would this invalidate a statement that the sun is hotter than the Earth? You need some sort of spatial average to reduce the significance of a handful of lightning bolts.

    By the way, ignoring all questions of physical interpretation and considering only the mathematics of the situation, suppose I take a Lorenz attractor (only 3 state variables) and calculate a time series using its equations. At some time I make a small step in one of the parameters of the attractor and continue the time series. The final output data gets handed to someone else who is given the task of discovering at which time the change occurred and how big the step was. Can it be done? What is the error?

    What if the output of the time series is chopped down to only a single state variable (with the other two hidden)?
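
Tel’s thought experiment is easy to set up. A minimal sketch (plain Euler integration and textbook parameter values; the half-way step in rho stands in for the hidden “climate shift”):

```python
# Sketch of Tel's thought experiment: integrate the Lorenz system,
# step the parameter rho partway through, and publish only one of
# the three state variables.
sigma, beta = 10.0, 8.0 / 3.0
dt, steps = 0.01, 10_000
x, y, z = 1.0, 1.0, 1.0

series = []
for i in range(steps):
    # The hidden step: rho moves from 28 to 28.5 at the half-way mark.
    rho = 28.0 if i < steps // 2 else 28.5
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
    series.append(x)  # only one state variable is "published"

# The challenge: from `series` alone, detect when the step occurred
# and how large it was. Chaotic variability swamps a 0.5 step in rho,
# which is exactly Tel's point about attribution from noisy records.
print(len(series), min(series), max(series))
```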

  • #

    Tel: “…I really don’t like the article because it claims “statistics are not actually temperatures”. The problem is that ALL temperature measurements are averages,….”

    You are making an error of construction. You are assuming that, since temperature is itself an average of sorts, it makes physical sense to sum temperatures and then divide by the number of temperatures. It does not.

    Temperature is RELATED to the average of the kinetic energy of the atoms/molecules in the substance measured but it is NOT a measure of kinetic energy itself. It does make physical sense to sum kinetic energies to find the total kinetic energy. If you were to sum the kinetic energy of the atoms/molecules of the globe, divide by the number of atoms/molecules, you could find the actual average kinetic energy. Then, if you could convert THAT average into temperature, you would actually have an “average” temperature. This is NOT what you do when you sum temperatures and then divide by the number of temperatures taken.

    Consider: if I have two glasses of water, each at 82 degrees F, and I “add” them to each other, I have water at 82 degrees F and not 164 degrees F. I have twice the kinetic energy BECAUSE I have twice the water, with each glass contributing half of the total. I do not have twice the temperature BECAUSE I also have twice the atoms/molecules. Hence, a summation of temperatures does not make physical sense, and computing a Global Temperature by summing temperatures ALSO does not make physical sense.

    Simply because an action feels like it’s OK to perform does not make it proper to perform. It actually must be proper to perform. The error is that averaging feels so right, because many averages we use daily are quite proper averages. Yet an average of temperatures has no meaning in reality without first averaging the kinetic energy of the particles at the points where the temperatures were measured and computing the compound temperature from that.


  • #
    Keith

    Australian CPI statistics are not only used to increment pensions and other payments. They actually get used to adjust revenue as well: fuel excise and capital gains tax, for example.

    The Australian Statistician is the most independent public servant in the country. He/she cannot be sacked by their Minister (the Treasurer). It takes both Houses of Parliament to do this.


  • #
    Tel

    If you were to sum the kinetic energy of the atoms/molecules of the globe, divide by the number of atoms/molecules, you could find the actual average kinetic energy. Then, if you could convert THAT average into temperature, you would actually have an “average” temperature.

    By saying that, you are also conceding that, at least from a theoretical standpoint, a global temperature does actually exist. The article above claims that there is no global temperature, even in principle.

    Of course any statistical property of a macroscopic object cannot fully describe the object. You say “two glasses of water each at 82 degrees F”, but no glass of water is at a precisely uniform temperature, nor is water precisely the same as an ideal gas. If anyone were to take Essex, McKitrick and Andresen at their word, then it would not be possible to measure the temperature of a glass of water (or anything at all). Exactly the same problem of averaging happens when you measure a glass of water as when you measure the entire Earth. There is no difference in principle; the only difference is that the glass of water is closer to the ideal theory, so the measurement error caused by non-uniformity is smaller.

    It makes a lot more sense to do a formal estimate of the error, based on whatever definition of global temperature you want to go with and on what sort of coverage we currently have with surface temperature probes. For example, in a city you can easily see a temperature gradient of the order of 10 degrees within one block if you are walking out of the shade of some trees into a carpark or across a road. Based on knowledge of typical local temperature gradients and their upper and lower bounds, it is possible to get a reasonably good estimate of how many temperature probes would be required to model a city to some particular desired accuracy. For a low-accuracy estimate, a few probes are enough. If you want more accuracy you need more probes… that’s normal with statistical measurements. All statistical measurements throw away data; deal with it.
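    As a rough sketch of that calculation (the “city” temperature field below is entirely invented: a smooth trend plus shade/carpark-scale variation):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Invented 1 km x 1 km "city" field on a 200 x 200 grid.
    g = np.linspace(0.0, 1.0, 200)
    xx, yy = np.meshgrid(g, g)
    field = 20.0 + 5.0 * xx + 4.0 * np.sin(12 * np.pi * yy) + rng.normal(0.0, 1.0, xx.shape)
    true_mean = field.mean()

    for n_probes in (5, 25, 100, 400):
        trials = []
        for _ in range(200):   # repeat the survey to estimate the typical error
            idx = rng.integers(0, field.size, size=n_probes)
            trials.append(abs(field.ravel()[idx].mean() - true_mean))
        print(f"{n_probes:>4} probes: typical error ~ {np.mean(trials):.2f} degrees")
    ```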

    Using this sort of approach, you could get some idea of the error margin in global temperature estimates, and also the error margin in the thermal maps that you see. The result would be much more useful than a strictly correct but completely impractical claim that temperature does not exist.


  • #
    CyberForester

    “Governments are going to have the same temptation with global temperature, mismeasuring it to make it appear to be rising faster than it really is.”

    They are also going to manipulate the data to show that their actions have had an effect on the climate.

    The CPI has always been a bone of contention with me. I have protested that there is the CPI, the PPI, and every individual’s personal inflation experience, which is usually a few percentage points higher than any official “measure”.


  • #
    Denny

    Update on Post 8,

    Here’s a record of 447 peer-reviewed papers that help the Realists! You’ll scratch your head at the number, but I explain why!

    Well, I must “apologize” for an incomplete referral. After finding out from the author of this article that he couldn’t reach the third link, I noticed that the whole article hadn’t come through. It’s the largest I’ve ever posted, so I had to break it into 3 parts. Also, at this time I still have an issue with one paper; it’s being taken care of soon. Here they are:

    http://www.globalwarminghoax.com/e107_plugins/forum/forum_viewtopic.php?1236

    Part II: http://www.globalwarminghoax.com/e107_plugins/forum/forum_viewtopic.php?1237

    Part III: http://www.globalwarminghoax.com/e107_plugins/forum/forum_viewtopic.php?1238

    Sorry for the inconvenience if you only read the article at GWH.com. Now you know why I have a “Frustrated Monkey” for an avatar on my postings! Yes, I even make him do it too! 🙂


  • #

    Tel,

    You continue to equivocate.

    The point they were making was that averaging point temperatures does not produce a true global thermodynamic temperature. It’s simply an average of a set of numbers: a number with all the meaning of the average of all the phone numbers in a phone book. It’s simply a number arrived at by a particular process. That process violates the nature of the underlying reality and, as such, is total nonsense.

    I pointed out a process by which one could produce an “average” temperature that had a physical basis. The process is anything but computing an average of point temperatures, no matter how numerous.

    If you don’t like the thought experiment of two glasses of water at the same temperature, how about the following?

    One quart of water at 175 degrees F and ten quarts of water at 25 degrees F. The “average” temperature of the two containers is 100 degrees F. Mix the two together and the resulting temperature is about 39 degrees F, nowhere near that “average”.

    The AVERAGING of point temperatures does not produce a thermodynamically meaningful average temperature. The reason is that summing temperatures and dividing by the number of temperatures has no relevance to the thermodynamics of the situation. It is only a mathematical operation, one that sometimes makes physical sense but does not in all cases, even though it can always be done.
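    For what it’s worth, here is the arithmetic of the quart example sketched out. Assuming water parcels with equal, constant specific heat, the energy-based recipe described above reduces to a mass-weighted mean of the readings, which is exactly the weighting the naive average throws away:

    ```python
    def mix_temperature(masses, temps):
        """Equilibrium temperature of mixed parcels (equal specific heats assumed)."""
        return sum(m * t for m, t in zip(masses, temps)) / sum(masses)

    # Two equal glasses at 82 F: both methods happen to agree.
    print(mix_temperature([1, 1], [82, 82]))    # 82.0

    # One quart at 175 F plus ten quarts at 25 F:
    print(mix_temperature([1, 10], [175, 25]))  # ~38.6 F
    print((175 + 25) / 2)                       # 100.0 -- the naive "average"
    ```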

    To know “error margin” you must know what the true value is. You admit that each measurement will have an error AND that it is impossible to know THE temperature at a given point because there is a natural thermodynamical variation in the point energy level. Hence, it is impossible to know the “error margin”. The most you can know is the variation in repeated measurements of a given point. As demonstrated above, averaging point temperatures has no physical meaning. Hence, such “average” numbers are not even wrong; they are totally arbitrary. An error bar on a meaningless number is meaningless squared.

    Again we get back to the issue that reality is real and that words refer to existent things in that reality. What you want, wish, need, demand, command, intend, or say is totally and completely irrelevant to what reality actually is, no matter how many of you there are.

    Tel: “The result would be much more useful than a strictly correct but completely impractical claim that temperature does not exist.”

    What more can I say? You are saying that a totally meaningless number is much more “useful” than what is actually the case: that a global temperature cannot be measured, or at least cannot be measured by the methods used to date. Sure, the warming alarmists are “using” the numbers produced by a fraudulent and invalid method. Of what value is this to us? Looks to me like we are the ones who are going to have to pay the price for the fraud.


  • #
    Tel

    The point they were making was that averaging point temperatures does not produce a true global thermodynamic temperature.

    Incorrect. They claim that there is no such thing as a global temperature, regardless of measurement methodology.

    Physical, mathematical and observational grounds are employed to show that there is no physically meaningful global temperature for the Earth in the context of the issue of global warming.

    That’s the first part of their abstract.


  • #
    Tel

    To know “error margin” you must know what the true value is. You admit that each measurement will have an error AND that it is impossible to know THE temperature at a given point because there is a natural thermodynamical variation in the point energy level. Hence, it is impossible to know the “error margin”.

    That’s not how measurement of any physical quantity actually works. If you knew the true value you would not need to measure it.

    For example, when digitising an analog signal you get an error margin of half the LSB in either direction. You simply never know where the true value is within that error span, but if it was outside that span then it would toggle to the next whole bit (presuming the ADC is working and you are not saturated at either end of the scale). The whole point of error estimation is figuring out ways to put bounds on values without knowing the true value.
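    A sketch of the half-LSB point, with an invented full-scale range and bit depth (and staying away from saturation, per the caveat above):

    ```python
    import numpy as np

    FULL_SCALE = 5.0          # volts, illustrative
    BITS = 8
    LSB = FULL_SCALE / 2**BITS

    rng = np.random.default_rng(1)
    true_values = rng.uniform(0.0, FULL_SCALE - LSB, size=10_000)

    codes = np.round(true_values / LSB)   # the only thing the ADC hands you
    reconstructed = codes * LSB
    worst = np.max(np.abs(reconstructed - true_values))
    print(f"LSB = {LSB * 1000:.2f} mV, worst error = {worst * 1000:.2f} mV (<= LSB/2)")
    ```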


  • #

    Tel,

    Again you equivocate.

    Error margin means ERROR margin. Error is the difference between the true value and the measured value. Since you cannot know the true value all you can do is measure it repeatedly and compute the variation of measurement about the mean measurement. You expect (wish, hope, feel, want) that the resultant variation is the margin of error but all it is, is the computed variation in your measurements.

    If your mean value were actually the true value, your computed variation about the mean would be your error margin. It may be close. You hope it is close. Hope doesn’t make it so. Why not call it what it actually is: VARIATION OF MEASUREMENT?

    I understand that the reality-based meanings of words are mostly irrelevant to you. You simply use your foggy internal intention of meaning to pretend you are saying something about reality. As a consequence you hold that reality is not real: it’s only what you imagine it is. That way you can bombast, distort, and equivocate in argumentation with others and appear to get away with it.

    It might come as a surprise to you but reality doesn’t give a damn about how many argumentation points you win. The ONLY thing that counts is to understand clearly what reality actually is and act accordingly.


  • #
    Neil Lindsay

    I have dealt in statistical data analysis all my life and refused to take data out to fit what I had expected to prove. It is common for statisticians to remove data that doesn’t fit the idea they want to prove.

    There is a saying among statisticians, “Figures don’t lie, but liars figure.”


  • #
    Tel

    Once again you win Lionel.


  • #
    Ian George

    Lionel
    Do you know how the BOM calculates the state monthly average temperatures that it posts in its climate summary report?
    i.e. at http://www.bom.gov.au/climate/current/month/nsw/summary.shtml

    Does anyone?


  • #

    Ian,

    From note 1 on your link:

    “Averages: Averages are based on the period 1961 to 1990 which is a convention of the World Meteorological Organisation

    Normals are long-term averages based on observations from all available years of record, which vary widely from site to site. They are not shown for sites with less than 20 years of record for temperature and less than 30 years of record for rainfall, as they cannot then be calculated reliably.”

    I presume “average” has the usual interpretation: sum the measured values and divide by the number of values summed. However, I am not able to find any explicit statement of how they compute the “average” beyond that noted above. How they measure the values and how they select the measured values to average is not obvious to me at this time.

    As I suggested in posts 44 and 49 (see post 31 for a link to a more exhaustive technical discussion of the issue), an average of temperatures produces a number without physical meaning, especially if it’s an average of measurements from widely dispersed points over a long period of time.

    Clearly, the “professionals” pretend their average temperatures have meaning and the general public buys it. That’s how they keep their jobs and expensive toys even when they can’t predict the weather all that well beyond three days or so. All you can be sure of is that there is a 100% chance of a forecast.


  • #
    Denny

    Tel: Post 54,

    Once again you win Lionel


    Tel, don’t you know it’s hard to stop a Lionell “train”! You know the one all young boys always wanted for Christmas? Ooops, maybe that was before your time…Right? Brian would know, he probably has one!


  • #
    ian George

    Thanks Lionel.
    I understand what you’re saying, i.e. these averages have no real meaning, but it’s the only comparison we have at the moment.
    When I looked at the NSW monthly average maximum summary for October, I noticed more cool anomalies than warm ones.
    So I added the 185 temperatures listed in their summary statistics and divided by that number to get around 23.0C, well below the average of 24.4C (not the 0.7C above that they record). I emailed the BOM but as yet haven’t received a reply.
    Why I mention this is because I believe this is the sort of thing that Jo is talking about in her post. Maybe someone in the know can help me work out why there is such a discrepancy.
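    One hedged guess at the discrepancy, sketched with invented numbers: if the Bureau averages anomalies (each station’s departure from its own 1961–1990 normal) rather than raw readings, and the listed stations over-represent cool sites, then a simple mean of the readings can sit well below the state-wide normal while the mean anomaly is still positive:

    ```python
    # Invented station data: (1961-1990 normal, October observation), degrees C.
    stations = [
        (27.0, 27.5),   # hot inland site
        (26.5, 27.2),
        (18.0, 18.9),   # cool tableland site
        (17.5, 18.2),
    ]

    raw_mean = sum(obs for _, obs in stations) / len(stations)
    anomaly_mean = sum(obs - normal for normal, obs in stations) / len(stations)
    print(f"simple mean of readings: {raw_mean:.1f} C")       # ~23.0
    print(f"mean anomaly:            {anomaly_mean:+.1f} C")  # ~+0.7
    ```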


  • #
    Tel

    The Australian BOM explain their statistical process…

    http://www.bom.gov.au/climate/cdo/about/about-stats.shtml

    Mean values

    The mean value, also known as the average, is one of the most common statistics used to provide an estimate of what is most likely to happen. It is not necessarily equal to the most commonly occurring value, which is known as the mode, but for most elements it will be close. By itself, the mean does not provide any information about how the observations are scattered around the mean; whether they are tightly grouped or broadly scattered.

    Median is also explained.

    They did a major reprocessing of their statistics in 2007 with the intent of improving data quality. Some of the classification methods changed.

    http://www.bom.gov.au/jsp/ncc/climate_averages/temperature/index.jsp

    Average annual temperatures (maximum, minimum or mean) are calculated by adding daily temperature values each year, dividing by the number of days in that year to get an average for that particular year. The average values for each year in a specified period (1961 to 1990) are added together and the final value is calculated by dividing by the number of years in the period (30 years in this case).
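    A sketch of that two-step procedure, with dummy data standing in for real observations (leap years ignored for simplicity):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Dummy daily maxima for each year of the base period.
    daily = {year: rng.normal(loc=22.0, scale=5.0, size=365)
             for year in range(1961, 1991)}

    annual_means = [values.mean() for values in daily.values()]   # step 1: per year
    period_average = np.mean(annual_means)                        # step 2: over 30 years
    print(f"1961-1990 average: {period_average:.2f} C")
    ```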


  • #

    ian: I understand what you’re saying ie these averages have no real meaning but it’s the only comparison we have at the moment.

    Isn’t that a bit like the drunk looking for his lost car keys under the street lamp because the light is better, even though he lost them in a dark alley two blocks east?

    Especially if the “only comparison we have” is used to justify the UN taking over the world’s economies and running everyone’s life down to the most minute detail. As in: we must use only one sheet of toilet paper, use only “renewable” energy, have our capital wealth transferred by force and fraud to the non-productive, and stop the industrial revolution for everyone – especially for the formerly free USA?

    It seems that no matter what the question, the answer is always more government control and more redistribution of wealth from the producers to the non-producers. So why not use the price of postage stamps for an index? That at least has a physical significance and is more related to the ultimate goal of destruction of the world’s economy, in that it is part of that economy.

    Freedom, the free market, and capitalism are the only answer that works. However, what works depends upon your goal. A surviving and thriving global technological civilization is not the UN’s goal. Hence they choose a different path, to achieve the universal enslavement of all to all with them in the dictator’s seat. If you destroy science, you destroy applied reason. If you destroy high-energy production, you destroy freedom. If you destroy freedom, you destroy the ability to apply reason.


  • #
    Rod Smith

    Tel, I don’t know how we determine that private, backyard weather stations, whether expensive or cheap, are accurately measuring and consistently reporting weather. I am of the opinion that the lack of periodic inspection, accuracy checks, and operator certification is a problem. In the case of US Coop reports it is a disgrace, and I maintain heads should roll.

    In the case of data being changed before publishing, I am led to believe by some posts on this site that “peer review” catches such chicanery, but I have yet to see any convincing evidence. (Rant off.)

    And while we are at it, we need to clarify just what raw data might be changed at collection/aggregation sites.

    Since I was in that “business” from about 1962 to 1971, I think few can imagine how expensive and labor-intensive it was. Yes, we used computers even in the early ’60s, and the amount of software needed to do the job was very large and considerably more complex than most folks would imagine.

    Let me explain a valid reason to change raw data after collection and before aggregation. During much of the Cold War, all data from the Soviet Union violated WMO time-coding standards by encoding all times as Moscow Standard Time, without identifying them as such in the transmission. This was confusing.

    Many (most) sources were various WMO and non-WMO relay sites. Some were collected by our CW intercept, and we found out that Moscow didn’t know a number of those reporting sites even existed. My sources were selected to ensure that ALL the Soviet data was received and processed; I had considerable overlap and duplication of coverage.

    Some of the input centers corrected the MST data time to Zulu time as specified by the WMO, and logically so, to accommodate the processing of synoptic data. Others did not. As I remember, a Scandinavian source (Oslo maybe?) always made that change while Delhi didn’t. (This was a long time ago, so don’t be surprised if someone says I’ve got those sites mixed up.)

    My editors (ever heard of weather editors?) routinely made the changes before distributing the data, to standardize it for all our users.

    You may be aware of the synoptic code problem in those days that confused the encoding of temperatures below -50C. I won’t get into that problem, or the dew points from Mexico. Still, I’ve seen reports of positive temperatures when it was obvious that a negative indicator had been dropped. We fixed those before relay too. I think some real-time QC is important. For example, an outlier temperature is sometimes easy to spot (remember the below-zero temperature reading from South Florida very recently?). Who would perform QC on all these ad-hoc weather reports in real or near real-time?
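    A minimal sketch of that kind of real-time check; the bounds here are invented, and a real system would use per-site climatology:

    ```python
    def qc_flag(temp_c, low=-10.0, high=45.0):
        """True if the reading falls inside the plausible window for this site."""
        return low <= temp_c <= high

    readings = [24.1, 23.8, 54.0, 24.3, -24.0]   # 54.0 and -24.0 look like coding errors
    for t in readings:
        verdict = "ok" if qc_flag(t) else "REJECT: outside climatological bounds"
        print(f"{t:6.1f} C  {verdict}")
    ```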

    Oh, and finally, we were processing reports for operations that had strong safety implications. Public safety is usually not involved in a temperature reported from the shore of Lake Lunker! But we used highly trained and experienced people for the job, around the clock, every day of the year, and I would argue we did the right thing in changing some raw data. It was not an easy job.

    Frankly, I think encrypting weather observations is a very bad idea. Rather, we need to demand and ensure honest handling by government of all records, and not just those in weather/climate reports.


  • #
    Denny

    Breaking News, People!

    Hadley CRU Hacked! Hundreds of Files Released!

    http://www.globalwarminghoax.com/e107_plugins/forum/forum_viewtopic.php?1249.last


  • #
    MattB

    Wow – quick as a flash Denny rocks up with breaking news so old even my mum has emailed me about it:)


  • #

    MattB: Wow – quick as a flash Denny rocks up with breaking news so old even my mum has emailed me about it:)

    Is THAT your best shot?

    Rather feeble. Where is your denial of its validity? Where is your charge that the files are all made up? Where is your claim that nothing in the files can be trusted? Clearly, the files have not been peer reviewed so they cannot constitute scientific evidence, can they?

    I looked at just three emails, chosen haphazardly, and found very damning evidence in one. A sample of three is far too small to say much statistically, but if that rate held, a sizeable fraction of the emails would contain similarly damning material. Evidence that AGW is a total fraud and that data has been “corrected” to tell a story that has little to nothing to do with actual climate. If the files are found to be genuine, they are clear evidence of scientific malpractice. I would not be surprised if they are also found to be evidence of a monumental crime of fraud and theft by deception in progress.
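    To put a number on how little three samples can tell you, here is a standard Clopper–Pearson 95% interval for one hit in three draws (scipy does the arithmetic); the honest statement is only that the true proportion lies somewhere in a very wide range:

    ```python
    from scipy.stats import beta

    k, n, alpha = 1, 3, 0.05   # one "hit" in three sampled emails
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    print(f"point estimate {k/n:.0%}, 95% interval ({lower:.1%}, {upper:.1%})")
    # point estimate 33%, 95% interval (0.8%, 80.6%)
    ```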


  • #

    […] The BOM (Bureau of Meteorology) today has come under increased attack as it’s revealed (yet again, but this time by the unfolding of Climategate) the sorry state of its climate data. Of the code commented on as ‘false’ and ‘a bloody mess’ in the Climategate files, the BOM says “It was unlikely to have come directly from the bureau’s centre because unchecked, raw data was rarely requested for climate analysis“. Hmmm, unusual – normally scientists want the raw data without all the artificial BOM adjustments so they CAN analyse the data. I would say a lot of scientists would be curious as to whether the BOM also didn’t ‘hide the decline‘. Like the American stations, the Australian stations seem a mess; Anthony Watts at Surface Stations has been auditing the American stations and what he has found isn’t pretty. Warwick Hughes comments on BOM problems here. Warwick also says the BOM is a national disgrace! Even the BOM acknowledges that “the majority of stations do not have a complete unbroken record for any given element”. Doesn’t seem to stop them issuing climate alarmism! […]


  • #
    BLouis79

    Later versions of the Global Energy Budget (Trenberth, Kiehl, Fasullo) highlight more problems.
    (2008 preprint published 2009: http://www.cgd.ucar.edu/cas/Trenberth/trenberth.papers/10.1175_2008BAMS2634.1.pdf)

    “There is a TOA (top of atmosphere) imbalance of 6.4 W m⁻² from CERES data and this is outside of the realm of current estimates of global imbalances that are expected from observed increases in carbon dioxide and other greenhouse gases in the atmosphere. The TOA energy imbalance can probably be most accurately determined from climate models and is estimated to be 0.85±0.15 W m⁻² by Hansen et al. (2005)[…].”

    Brilliant! The computer model is used to estimate the global energy budget numbers that are then used as a basis for estimating the CO2 warming effect.
