And so begins the list of errors. The climate models, it turns out, are declared 95% certain, yet they rest on partial derivatives of dependent variables with 0% certitude, and that’s a no-no. Let me explain: climate models effectively model a hypothetical world where everything freezes in a constant state while one factor doubles. But in the real world many variables are changing simultaneously, and the rules are different.
Taking partial derivatives of dependent variables is a wildcard: it may produce an OK estimate sometimes, but other times it produces nonsense, and ominously, there is effectively no way to test which. If the climate models had predicted the climate, we’d know they got away with it. They didn’t, but we can’t say whether they failed because of a partial derivative. It could have been something else. We just know it’s bad practice.
To see an example of how partial derivatives can produce paradoxical contradictions in a normal, simple situation, see what happens when they are applied to the Ideal Gas Law in this PDF from MIT.
Partial derivatives are useful when there are only a few, independent variables. But in the climate system there are a lot of variables, and most of them are dependent. Partial derivatives might work, or they might blow up. For them to make sense we’d need to live in a world that can be held in a constant steady state while one factor makes a step change. That situation probably hasn’t happened in the last 4.5 billion years.
The field of climate science draws on maths, but rarely draws on the leading mathematical minds. This first error of the three illustrates how people who may be well trained in geography or oceanography (or divinity) can miss points that professional mathematicians and modelers would not.
The big problem here is that a model built on the misuse of a basic maths technique, one that cannot be tested, should not ever, as in never, be described as 95% certain. Resting a theory on unverifiable and hypothetical quantities is asking for trouble. But hey, it’s only the future of the planet that’s at stake. If it were something more important, climate scientists would have brought in some serious maths guys.
This error is fairly easy to describe; the harder, bigger errors are coming soon, as we try to roll out the points as fast as we can. — Jo
4. Error 1: Partial Derivatives
There are three significant errors with the conventional basic climate model (which was described in the basic climate model core, Part I, and the basic climate model in full, Part II). In this post we discuss the first error, the misapplication of the mathematical technique of partial derivatives, because it is the easiest of the three to describe.
By the way, noting that there are problems with the conventional model is hardly new, even in establishment circles, but apparently itemizing them is a little unusual. For example, Sherwood et al. said in 2015: “While the forcing–feedback paradigm has always been recognized as imperfect, such discrepancies have previously been attributed to variations in “efficacy” (Hansen et al. 1984), which did not clarify their nature.”
The basic model relies heavily on partial derivatives. A partial derivative is the ratio of the changes in two variables, when everything apart from those two variables is held constant. When applied to the climate, this means everything about the climate must be held constant while we imagine how much one variable would change if the other was altered.
For example, how does changing the surface temperature affect how much heat is radiated to space (the outgoing longwave radiation, or OLR), if everything else — including humidity, clouds, gases, lapse rates, the tropopause, and absorbed sunlight — stays the same? (This particular partial derivative is the Planck sensitivity, central to the conventional model.)
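To make the dependent-variable trap concrete, here is a toy numeric sketch of our own (not the conventional model’s actual equations, and not the MIT worked example) using the Ideal Gas Law:

```python
# Toy sketch: a partial derivative answers a question about a world
# where everything else is frozen, which can differ wildly from the
# answer in a world where the other variables respond.
# Ideal Gas Law: P = n*R*T / V.

R = 8.314    # gas constant, J/(mol*K)
n = 1.0      # moles
T = 300.0    # K
V = 0.025    # m^3

# Partial derivative of P with respect to T, holding V (and n) constant:
dPdT_partial = n * R / V        # about 333 Pa per kelvin

# But if V is a dependent variable -- say the gas expands freely at
# constant pressure, so V = n*R*T/P -- warming changes P not at all:
dPdT_actual = 0.0

print(round(dPdT_partial, 1), dPdT_actual)
```

Both derivatives are mathematically legitimate; they just describe different worlds. The partial derivative is only a good guide when the “held constant” assumption roughly matches reality.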
Keep reading →
Wait till you see what Lance Pidgeon has found. He was looking at the BOM website’s temperature archive maps of Australia for early last century (using AWAP data), wondering how the Bureau of Meteorology could possibly create maps this detailed for specific days that long ago. He was especially curious about the remote, vast areas where there were no thermometers, yet there were wiggly jiggly temperature lines on the map, shaded as if they had meaning. I’ve heard that more people have visited the South Pole than have stood at the point in central Australia where the three large western and central states meet.
Then he noticed something positively strange — April 14th in 1915 and one year later in 1916 looked almost identical, as did the same day in 1917. The more he looked, the weirder things got. He plodded, year after year, all the way from 1911 to 1917, then through Jan, Feb, March, and so on. Worse, he tells me he could keep going right through to 1956 without seeing much change (though there are interesting exceptions). After that, temperatures of the area start to vary from year to year, like the “weather” we’d expect if we had multiple thermometers in the site with the black square, which we still don’t have. Even in the modern era there are only two sites.
It is not just “April 14ths” each year that are suspiciously similar; it’s pretty much all days of the same month. In the blockbuster graph below, he looks at one spot in central Australia, about the size of Tasmania (which is 65,000 km²), and tracks the temperature profile of that same area, on the same day, year after year. The BOM tells us they have good temperature records. They tell us the AWAP analyses are based on “in situ” surface observations, and they make much of AWAP trends being “unadjusted”. Yet here in an AWAP map, presumably derived from the same data as those “unadjusted trends”, there’s an area with no thermometers before 1940 (when Warburton opened), and we see detailed temperature lines that are identical year after year.
Do AWAP maps and AWAP data matter?
AWAP maps are used in press releases and in the news. The detailed wiggles send a message to the world that “we have very accurate data”. But when the BOM tells us we set an “area averaged” record across the whole of Australia since 1910, they don’t mention that it’s compared to “calculations” of estimated wiggles over hundreds of thousands of kilometers where there are barely any roads, let alone thermometers. Nor do they mention that suspiciously, magically, in the early part of the AWAP record the temperatures in remote central Australia appear to be the same year after year — or at least they are in the maps*.
Significantly, the BOM use trendlines from the AWAP data as justification that their all-homogenized ACORN data is virtually the same as the “unadjusted” data. It’s their excuse for why their massive adjustments in the ACORN set look neutral (when other analysis of ACORN shows that adjustments which warm the trend are much more common). The AWAP maps are created from this data too (though obviously the maps themselves aren’t “raw”, since an area-weighting algorithm must be run over the data, plus elevations, plus who knows what else). The question, then, is: what is the state of the AWAP “unadjusted” data? The maps generated from it suggest that quality control is awful, that weekly data disagrees with daily data, and that the program used to do the area-weighting and generate the maps is not producing credible results. How “unadjusted” are the trendlines that are called “unadjusted”?
Below Lance Pidgeon has graphed the squares that fit in the black box in central Australia (shown in the map below this) from Jan 14th each year, then Feb 14th, then March 14th… you get the picture. Astonishing. Thank citizen science for telling you what the BOM doesn’t mention.
Year after year on the same day of the year, the temperature patterns are identical?
Guest Post by Lance Pidgeon
Building the past — BoM style
To produce an area-averaged temperature for Australia, a fine matrix of squares on a map can be used: add up the values in the squares, weight each square for latitude (lat-long squares cover less area toward the poles), and divide by the total weight. Anomalies and trends can then be produced over time. Simple, right?
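A minimal sketch of that arithmetic (the grid values and latitudes below are invented for illustration; the BOM’s actual AWAP algorithm is more elaborate):

```python
import math

# Sketch of an area-weighted average over a lat-long grid.
# Each cell is weighted by cos(latitude), since lat-long cells
# shrink in area toward the poles.

def area_weighted_mean(temps, lats):
    """temps[i][j] is the value in the cell at latitude lats[i] (degrees)."""
    total, weight = 0.0, 0.0
    for row, lat in zip(temps, lats):
        w = math.cos(math.radians(lat))
        for t in row:
            total += w * t
            weight += w
    return total / weight

# Toy 3x2 grid spanning 20S to 40S:
temps = [[30.0, 31.0], [25.0, 26.0], [20.0, 21.0]]
lats = [-20.0, -30.0, -40.0]
print(round(area_weighted_mean(temps, lats), 2))
```

Note the weighted mean leans slightly toward the low-latitude (larger-area) cells, which is the whole point of the latitude correction.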
The BoM “Australian Water Availability Project” (AWAP) maps have a fine lat-long grid over many years of daily data. But in parts of Australia thermometer sites are hundreds of kilometers apart, especially in the first half of last century. To make a complete picture, gaps need to be filled — but with what, exactly? Between the thermometers there MUST be derived values. Does the fill come from raw data, estimates and future averages, or a desired outcome? Is the gridded data that was used to generate this map called “raw” data by the BOM?
In the map below, we’ll take a close look at the area marked by the black square. Horizontal lines drawn from the same island in W.A. show two mapping methods.
Pasted squares in the graph pictures have been chosen to show a problem that would affect the whole map to some degree. Tasmania is about the same size as Ireland, Switzerland, or the state of West Virginia in the USA.
Keep reading →
Read the post to see it properly.
A feast. A feast! For those who want the meat, the math and the diagrams (don’t miss the diagrams). As far as we know, this is the first time the architecture of the basic climate model has been laid out in one place on the Net. This is the most math-heavy post in this series, but it has to be done, and properly. This is where the 1.2 °C direct effect of doubling CO2 gets amplified to 2.5 °C with fairly basic physics. If the equations are not your forte, look at “the Establishment Case” below the equations to get some idea why establishment scientists find it mind-bendingly hard to imagine how climate sensitivity could possibly be much different.
For commenters who know there are problems with this model (don’t we all), one of the points of doing this is to get through to the establishment leaders — to speak their language instead of having separate conversations. Of course, for some minds it will not matter what skeptical scientists say, but for other, key people, it will. We expect that seeing the flaws laid out so clearly will undercut the implacable confidence of those leaders, though they may not say so.
Again, this ties for Most Uncontroversial Post on this blog. Everything here, the IPCC would agree with (except maybe the last sentence).
Do admire the diagram, Figure 2. It’s no accident that it resembles an electrical circuit diagram. Modeling and feedbacks have been tested to the nth degree by thousands of electrical engineers around the world, building things we use every day. Thinking of the climate model this way is a useful technique for figuring out where it goes wrong.
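As an illustration of how feedback arithmetic amplifies a direct response (the feedback fraction below is an assumed value for the sketch, not a quantity taken from the post’s equations), here is the standard closed-loop gain formula familiar from amplifier design:

```python
# Standard feedback amplification arithmetic from amplifier design.
# The feedback fraction f below is assumed for illustration;
# the model's own equations define the real quantities.

def closed_loop_gain(direct_response, f):
    """Amplify an open-loop response by a feedback fraction f (f < 1)."""
    return direct_response / (1.0 - f)

direct = 1.2    # deg C: no-feedback response to doubled CO2
f = 0.52        # assumed net positive feedback fraction

print(round(closed_loop_gain(direct, f), 1))   # 1.2 / 0.48 = 2.5
```

With no feedback (f = 0) the gain is 1; a net feedback fraction near 0.5 roughly doubles the direct response, which is how a 1.2 °C direct effect becomes about 2.5 °C in the conventional account.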
In a sense this is where silicon chip wisdom starts to scrutinize the climate maze.
3. The Conventional Basic Climate Model — In Full
Dr David Evans, 24 September 2015, Project home, Intro, Previous, Next, Nomenclature.
Here is the conventional basic climate model, in full. It builds on the previous post in the series, which explained how the model works in the case of no feedbacks and only a CO2 input. This post uses the same terminology, notation, and ideas, without necessarily explaining them again. This is the “heaviest” post in the series.
Readers who don’t want to see details of the mathematical modeling might want to skip straight to the diagram and the calculation of ECS at the end — a general understanding of the previous post and of the model diagram below is sufficient to make sense of the ensuing posts in this series.
The Set-Up: Multiple Influences and Multiple Feedbacks
Consider the hypothetical situation where the only things that can change are:
Keep reading →
Imagine the crime of trying to audit the BOM?
Last year, Graham Lloyd wrote in The Australian about how the BOM had made whopping two-degree adjustments to data which turned cooling trends into warming trends, and which, instead of improving the data, created discontinuities. The BOM’s eventual explanation lamely claimed that the stations “might” have moved. (And they might not have, too. Who knows? But remember, this is what 95% certainty looks like.) Lloyd wrote about how historical records of extreme heat at Bourke had effectively been thrown in the trash. Who cares about historical records?
In response to the embarrassment and revealing questions, Tony Abbott wanted an investigation. But Greg Hunt, and The Dept of Environment opposed the investigation and opposed doing “due diligence”. What are they afraid of? Instead, Hunt helped the BOM set up a one-day-wonder investigation with hand-picked statisticians that wasted another nine months before admitting that the BOM methods would never be publicly available or able to be replicated. If it can’t be replicated, it isn’t science.
The BOM’s defense is always that their mystery method is considered “best practice” by other agencies around the world — who share the same incentives to exaggerate warming, and who also use unscientific and undisclosed (though different) adjustments.
It’s clear Greg Hunt doesn’t want good environmental data. Nor does the ABC, which is already talking about how they hope to get money from the Turnbull government.
4 Sept: Guardian: Daniel Hurst:
Abbott considered investigation into ‘exaggerated’ Bureau of Meteorology temperature data
Documents show former PM was briefed on setting up a taskforce into whether the Bureau of Meteorology exaggerated records – as claimed in the Australian
But the environment minister, Greg Hunt, pushed for the then prime minister to drop the idea.
The documents, obtained by the ABC under freedom of information laws, show the Department of the Prime Minister and Cabinet (PM&C) prepared a brief for Abbott in September 2014 noting that recent articles published by the Australian had “accused [the bureau] of altering its temperature data records to exaggerate estimates of global warming”.
The brief said the bureau’s climate records were “recognised internationally as among the best in the world” and used “a scientific approach that has been peer-reviewed”. “Nevertheless, the public need confidence information on Australia’s, and the world’s, climate is reliable and based on the best available science,” the then secretary of PM&C, Ian Watt, wrote…
PM&C subsequently prepared a new brief for Abbott suggesting he agree to amending the terms of reference for the taskforce so that it would merely provide “coordination and advice” on “quality assured climate and emissions data for Australia”. The brief said Bishop had “agreed to the removal of reference” to the bureau…
ABC’s Jake Sturmer reports on Greg Hunt:
One Department of Prime Minister and Cabinet bureaucrat described a Department of Environment official as being “on a campaign” to get the references to BoM removed from the taskforce’s responsibilities.
Further documents appear to show Mr Hunt convinced senior cabinet members to remove any references of “due diligence” or “quality assurance”.
Sturmer doesn’t seem to think that actively avoiding due diligence was important enough to warrant getting opinions from people who did want due diligence. Nor does he notice that the one-day forum was never going to look for evidence that the BOM adjustments were unjustified. Like most ABC journalists, he gullibly accepts that a lack of evidence from a non-investigation is worth something, and he doesn’t contact skeptical scientists to get a different view:
Keep reading →
This is the most uncontroversial post ever put on this blog — it’s everything the IPCC would agree with and the key to their unshakable confidence.
This post is for the independent thinkers, the brains that want to know exactly where the famous, core, 1.2 °C figure comes from. That’s the number of degrees of warming that a doubling of CO2 would bring; it’s a figure that underlies decades of research, and the figure that the big models are built around. Here, as far as we know, is the simplest accurate reference to that reasoning and its maths. We have always assumed the 1.2 °C figure is correct, and focused on questioning the feedbacks that are assumed to drive that base figure up to 3 °C (or 6 °C, or molten-Venus-here-we-come!).
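As a rough sketch of where that number comes from (our own back-of-envelope using standard textbook constants, not the post’s full derivation):

```python
import math

# Back-of-envelope sketch of the no-feedback warming from doubled CO2.
# All numbers are standard textbook values, used here as assumptions.

sigma = 5.67e-8             # Stefan-Boltzmann constant, W/(m^2 K^4)
T_e = 255.0                 # effective emission temperature of Earth, K
dF = 5.35 * math.log(2.0)   # radiative forcing of doubled CO2, ~3.7 W/m^2

# Linearize OLR = sigma*T^4 around T_e: dOLR/dT = 4*sigma*T_e^3.
planck = 4.0 * sigma * T_e**3       # ~3.76 W/(m^2 K)

dT = dF / planck                    # warming needed to restore balance
print(round(dF, 2), round(dT, 2))   # ~3.71 W/m^2, ~0.99 K
```

That is roughly 1 °C at the emission level; spectral detail and translating the response to the surface give the conventionally quoted figure of about 1.2 °C.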
We are not criticizing the estimate here, but this is so key and central to the whole climate-clean-green industry, and the models, it has to be laid out. This is the source of “implacable confidence” among the leading thinkers of establishment climate science. It is long past time that skeptical scientists put these details — warts and all — out in public. Dr David Evans is laying out the foundation for a complete and systematic analysis. The heaviest stuff is here and in the next post. The exciting stuff starts after that. For those who want to follow the unfolding science pay attention to the acronym table (and diagrams) below. The notations like ASR and OLR are the language we need to speak.
For those who are not number-heads, stay tuned, I will be spacing other news and stories among the serious science posts, often doing more than one post a day. It’s going to be very busy as I feed in the cutting edge research to the usual mix of politics, citizen science, quirky news and satire. – Jo
2. The Conventional Basic Climate Model — Simple
Dr David Evans, 23 September 2015, Project home, Intro, Next, Nomenclature.
The conventional basic climate model is moderately complicated when all the bits and pieces are included, so we are going to present it over two blog posts. In this post we’ll just consider the simplest case, direct warming, where the only input is the change in carbon dioxide level, and there are no feedbacks.
For those not accustomed to mathematical modelling, this post might seem complicated. To others it may appear rudimentary and insubstantial—“surely there is more to it”. It is ruthlessly logical, squeezing as much information as possible from clues and relationships, but be aware as you read through it of the vast number of climate factors that are ignored.
I’ve tried to make it easier to follow by supplementing many equations with explanations, aiming for a broader audience than just STEM people. With enough diligence and patience (yes, it does take time to think about it), most readers will be able to at least get the general idea.
In any case, it might satisfy your curiosity about how models are made and why some people believe in the carbon dioxide theory despite all the contrary evidence.
Keep reading →
For those of you who are die-hard puzzle solvers here to spar about cutting-edge research: good news, here’s where we begin the long-awaited update to Dr David Evans’ climate research. There are a few surprises: sacred cows we did not expect we would need to challenge, like the idea of “forcings”.
Government science is stuck in a rut, strangled – trying to capture the creative genius of discovery and force it through a bureaucratic formula, like it can work to a deadline or be judged by the number of papers, or pages, or citations, or by b-grade officials. Blogs are new, but this form of independent scientific research, done for the thrill of discovery, outside institutions and funded by philanthropists, is the way science was mainly done before WWII.
For the first time we are going to explain the architecture of the inner core of the climate models, the small model at the center that the big GCMs are built around. It is mostly a physics model, and it’s mostly “basic” and mostly right. It’s the reason for the implacable confidence of the establishment in the climate debate. But there are a couple of big problems… and we’ll get to them. Mathematical analysis found the problem, but once we explain it, it will seem obvious even without any maths. There will be a moment when people will say “Wow, they really did that?”
Evans takes his experience as a heavyweight expert modeler, with a PhD in Fourier maths, and works down through the basics and the key papers to expose this flaw in climate model architecture. It will turn this debate upside down. There are now some things I used to say that I need to say differently, and some points we used to concede that we now question.
To build a solar climate model David had to unpack the current CO2 driven model. As far as we know, this is the first time an independent modeling expert has gone through the climate architecture, looking at the key equations, assumptions, and structure.
What follows is a big shift forward, for us personally as well as for the debate. We have always criticized the big spaghetti GCMs and their feedbacks. But within the big model there is a core basic physics model. That core is the foundation of the idea that CO2 causes 1.2 °C of warming. Think Hansen 1984, the Charney Report 1979, then think Arrhenius 1896. (No, this is not about disputing whether the “Greenhouse effect” is real, or the second law — we have outspokenly disagreed with such disputation, and still do.)
What was curious last time was the way skeptics led the debate. Fans of the establishment came up with no criticism themselves (bar the usual namecalling), but were left to copy skeptical lines. Disappointingly some skeptics resorted to personal attacks, which we expect from alarmists. We have less patience for that timewasting now.
In the end we are all on the same team, we’d rather work together.
Thanks to the supporters and philanthropists (largely readers of this blog) who make it possible. After seven years of blogging here, and three years of David working full time unpaid, these offices, this household, are entirely dependent on donations. It’s a strange, hard path: exciting, but full of potholes. We are determined to improve climate research, for the sake of farmers, for families, and for the environment. A talent finds an outlet, somehow; a gift must be used. Thanks to all those who have faith that research for its own sake is worth pursuing.
Come on a journey with us…
1. Introducing a Series of Blog Posts on Climate Science
Dr David Evans, 22 September 2015. Project home, Next post.
Breaking the Intellectual Standoff
There is an intellectual standoff in climate change. Skeptics point to empirical evidence that disagrees with the climate models. Yet the climate scientists insist that their calculations showing a high sensitivity to carbon dioxide are correct — because they use well-established physics, such as spectroscopy, radiation physics, and adiabatic lapse rates.
How can well-accepted physics produce the wrong answer? We mapped out the architecture of their climate models and discovered that while the physics appears to be correct, the climate scientists applied it wrongly. Most of the projected warming comes from two specific mistakes.
Given all the empirical evidence against the carbon dioxide theory, there had to be problems in the basic sensitivity calculation. Now we’ve found them.
Series of Blog Posts
We are going to explain this and more in a series of blog posts. To build a better model, we had to understand the conventional basic climate model, the core model used to compute the high sensitivity to carbon dioxide. We unpack it, show the errors, then fix it. We calculate the sensitivity to carbon dioxide using this alternative model — which shows that the sensitivity to carbon dioxide is much lower.
If carbon dioxide didn’t cause much of the recent global warming, what did? The series continues with the revamped notch-delay solar theory (the previous problem concerning causality of notch filters has been resolved). This finds evidence that albedo modulation involving the Sun is the likely cause of global warming, and produces a falsifiable prediction for the climate of the next decade.
In its complete form this work has evolved into two scientific papers, one about the modelling and mathematical errors in the conventional basic climate model and fixing them (carbon dioxide isn’t the culprit), and another for the revamped notch-delay solar theory (it’s the Sun). Both are currently undergoing peer review. These posts are useful in airing the ideas for comments, and testing the papers for errors.
The Basic Model is Crucial to Climate Alarmism
Our understanding of the effect of carbon dioxide rests on the conventional basic climate model.
That model is used to calculate the sensitivity of surface temperature to the concentration of atmospheric carbon dioxide. It dates back to Arrhenius in 1896. It was updated in the 1960s and 1970s, and described in some detail in the otherwise-rather-brief Charney Report of 1979, the seminal document that ushered in the current era of concern about carbon dioxide. It is the cornerstone of the carbon dioxide theory of global warming. Predating computer simulations, it is often referred to as “basic physics”.
Despite the numerous mismatches between theory and climate observations to date, many climate scientists remain firm in their belief in the danger of carbon dioxide essentially because of this model, rather than because of huge opaque computer models. The basic model ignited concern about carbon dioxide; without it we probably wouldn’t be too worried.
There is no empirical evidence that rising levels of carbon dioxide will raise the temperature of the Earth’s surface as fast as the UN’s Intergovernmental Panel on Climate Change (IPCC) predicts. The predictions are entirely based on models and calculations.
Modern climate science is imbued with the ideas of the basic model. For instance, it is from this model that we get the notion that the effect of a climate influence can be encapsulated by the radiation imbalance or “forcing” it causes.
Keep reading →
It’s only been a week, and already the door is open to the emissions trading monster. The Nationals may have got Turnbull to agree in writing last Tuesday that he would not change the Abbott policies, but writing things on paper is not enough, apparently it needs to be carved in stone.
If the member for Goldman Sachs still wants the fake “free” market solution — the one he threw away his leadership for in 2009 — he can keep the current coalition plan but use foreign credits to meet the targets. The global carbon market is a $2 trillion scheme to enrich financial houses, crooks and bureaucrats. It’s a whole fiat currency, ready to corrupt. The vested interests are knocking at every door; they’d be mad not to. But what kind of world do we want to live in? We don’t have to reward the do-nothing unproductive sector and the corrupt.
A carbon tax is a pointless waste, and the worst kind of carbon tax is a global trading scheme.
If Australians don’t want to be sold out in Paris, they need to protest now. I suggest writing to The Nationals, Libs, Nick Xenophon and media outlets.
Six reasons a carbon trading tax is worse than a direct tax:
- It’s not a free market but a sick imitation. A free market cannot be based on a ubiquitous molecule produced by all life on Earth, the oceans, and a lot of its rocks. It breaks all the rules of free markets. Most of the players can’t pay, and their behavior won’t change. The government alone sets the supply and the demand; their market is built on loopholes, exemptions, and special rules from the start. Dark green crony capitalism. The Abbott Direct Action plan is much more a true free market.
- Fake markets feed corruption, bureaucrats, and bankers. Do we really need more of them? An ETS draws productivity out of the real economy and feeds parasites and cheats.
- We lose sovereign control. The price of the EU market is set by bureaucrats in Brussels. How many carbon credits will they give away to friends this year, and which pandering group qualifies to sell them? If we buy or sell EU carbon credits we give sovereign control over the cost of our energy (and economy) to people we have no say in electing, who are not accountable in any way, and do not have our best interests at heart.
- It’s forever. We create “property rights” worth billions. Most taxes can be voted out. A trading scheme tax can’t be unwound without paying massive dollars in compensation. See point 2.
- It’s not efficient. It has an overwhelming auditing burden. A global market based on items we can’t “see” or really want needs constant, overbearing auditing to stop cheats (e.g. see corruption in Russia and Ukraine, August 2015). Corruption is already rife in the EU carbon market; what chance does it have in the third world? Countries without rule of law, democracy, transparency and a free press are riddled with corruption, and are just not ready or capable of maintaining a fake free market. Hands up who wants to make the corrupt even wealthier and more powerful in those places? It’s yet another force working against honest businessmen in the third world. Say no for the sake of the poor.
- Money goes overseas for nothing. There is no useful product sent back to Australia in return. At least if we sequester carbon in our soils and plants we get something, even if it’s terrible value for money.
If we want solar panels and wind turbines to work, just fund the damn research ourselves. Then we own it, and might be able to sell something useful one day.
Here’s Turnbull testing the waters to see how much the Liberals, the Nationals and voters protest
Turnbull suggests changes to climate policy are possible
Turnbull, who has also sought to allay concerns from the Nationals and conservative members of the Liberal party that he would overturn the Coalition’s existing climate policies, repeated his support for the government’s post-2020 emissions reduction targets and the other measures Hunt had assembled “with great care”.
But he also kept the door open to tweaking the policies if, as many observers predict, they will be inadequate in the longer term. “There will be changes to policies if they don’t work as well as we think, or we think others can work better. None of this is written in stone, but I don’t have any plan to change those policies because everything we see at the moment suggests they’re working very well.”
Keep reading →
This example below shows the dangers of cherry picked and buried data. It shows how great news and joy can be reported from rancid results, and the only protection against this is open access. When the taxpayer funds research that is not fully and transparently public, and immediately available, the people are funding PR rather than science. “Peer review” does little to stop this, little to clean up the mess after it happens, and the truth can take years to be set free.
Ten percent of teenagers taking an anti-depressant harmed themselves or attempted suicide. This was ten times the rate of the teens on the placebo. The results of this clinical trial were published in 2001, but those alarming statistics were not reported. The drug went on to be widely used. A new reanalysis of the data, reported in the BMJ, revealed the dark and hidden dangers. The company that funded the research, Glaxo Smith Kline, has already faced record fines of $4.2 billion. The Journal of the American Academy of Child and Adolescent Psychiatry won’t retract the paper.
There are many ways to hide data. In this case, the results of the trial include 80,000 records which were provided in a form that could only be accessed one at a time.
There are many branches of science, and psycho-science, where the data is partially withheld, or unavailable, or provided in a difficult-to-use form. David Jones of the Australian BOM was caught in the Climategate emails saying “we have a policy of providing any complainer with every single station observation when they question our data (this usually snows them).” Stefan Lewandowsky did a survey and held back a quarter of the results. John Cook and the University of Queensland threatened to sue people who made their data public.
All publicly funded data must be made available upon publication; the public shouldn’t need to ask for it. If they have to FOI, the government funded system of “science” has already failed.
Data from a peer-reviewed antidepressant trial misleading:
An Australian-led study of a popular antidepressant has shown that it can tip young people into suicide.
In a rare re-analysis of a controversial clinical trial, the researchers found that the drug paroxetine — touted in 2001 as safe and effective for teenagers — was neither.
Poor climate scientists know they can’t win the science debate against the engineers, geologists, chemists and physicists who are better scientists, better informed, mostly unfunded and unleashed all over the Internet.
To avoid coughing up the "overwhelming evidence" the climate experts say they have, but can't seem to find, they are pulling out the Panzers and resorting to pleas for RICO investigations. Treat the skeptical scientists like racketeers, they say! And what's their evidence for this conspiracy of corruption? Oh lordy, these people are scientists, they must have emails, cheques, tapes and photos. Surely? But no, their evidence is pop smear-books where the deepest, darkest proof is the common use of "tobacco tactics"! But every activist group under the sun, including honest groups, uses at least some of the exact same tactics. How does anyone point out flaws without "seeding doubt" about them? Either the flaws are real or they're not, and that's what a scientist discusses, not "motives".
There is no law of science called “tobacco-tactics”. If man-made global warming is a dire threat, the evidence comes from instruments that measure the climate, not from smear-o-rama by association.
Indeed, the Team-Tobacco of climate are the believers, not the skeptics.
I looked up Tobacco Tactics for the first time:
A long-standing tactic of the tobacco industry and its supporters is to try to marginalise and denigrate its critics.
You mean like calling them "deniers", denigrating their qualifications, printing fantasies about their funding, attacking their religious beliefs, and inventing spurious links to... wait, something as black as the tobacco industry, or the Holocaust? How about stranding skeptics at airports, canceling their tickets, sacking them, removing climate skeptics' titles, and canceling their email accounts? What about using students to protest emotionally at universities to stop the research even starting (see Lomborg, Bjorn, UWA)?
The tobacco industry influenced scientists by commissioning research and funding scientists, which is similar to the one-sided government funding that pours money into climate models we know are failing, yet doesn't fund models built on solar factors and natural cycles. Then there is ghost writing, like, say, where Greenpeace writes a document for the IPCC, or where Greenpeace and WWF set the BBC's journalistic policy?
In arguments and reasoning the tobacco groups “shift the debate away from the health issues”, just as climate scientists shift the debate away from the science and towards the pitiful tiny funding available to a few skeptics. Climate experts don’t want to talk about the missing hotspot, the uncertainties of ocean heat content measurements, the wild variations of past climates, or the way models fail dismally at predicting the last 18 years.
Legal strategies of Big Tobacco include FOI requests, which both skeptics and unskeptics use. Skeptics want the scientific data, which they shouldn't even have to FOI for. Climate unskeptics want the funding details of skeptical scientists so they can smear them by association and scare off potential donors.
We can build on this theme: please readers — fish for inspiration through the Tobacco Tactics files, and suggest away.
The letter calling for a RICO investigation into skeptical scientists:
Dear President Obama, Attorney General Lynch, and OSTP Director Holdren,
As you know, an overwhelming majority of climate scientists are convinced about the potentially serious adverse effects of human-induced climate change on human health, agriculture, and biodiversity. We applaud your efforts to regulate emissions and the other steps you are taking. Nonetheless, as climate scientists we are exceedingly concerned that America’s response to climate change – indeed, the world’s response to climate change – is insufficient. The risks posed by climate change, including increasing extreme weather events, rising sea levels, and increasing ocean acidity – and potential strategies for addressing them – are detailed in the Third National Climate Assessment (2014), Climate Change Impacts in the United States. The stability of the Earth’s climate over the past ten thousand years contributed to the growth of agriculture and therefore, a thriving human civilization. We are now at high risk of seriously destabilizing the Earth’s climate and irreparably harming people around the world, especially the world’s poorest people.