
Wow! On sea-levels, NSW councils told to take “scientific” approach, not IPCC predictions

This is a big deal. Here’s a state government telling people to be more scientific, and not blindly follow the IPCC. This is a win we need to translate to other areas.

The former Labor government in NSW had told councils they had to plan for sea-level rise “according to the IPCC”, but that made seaside properties unsaleable, and was pretty painfully stupid compared with what the tide gauges were actually saying (in Sydney, for example, the rise is a tiny 6 cm a century). The new strategy says councils need to be scientific and look at the conditions on each beach separately.

In this case, the costs of following the IPCC plan were borne by those living on the coast (and property developers), and that pain motivated them to press the State government to get the IPCC out of the way. It’s a reminder that protesting is worthwhile, and that sane things do happen.

If we can get the citizens of the free West to appreciate the true cost of the IPCC, it would surely be gone by 2020. Now there’s a target…

Rob Stokes announces shake-up of council coastal management

In an interview with The Australian, Mr Stokes said he would be announcing “a much more scientific and evidence-based approach … it reflects recognition that what is happening on the coast is a product of what is happening to the sand off the coast,” he said.

“We will be integrating coastal management and planning with what is happening in the adjacent seabed.

The initiatives mark the second phase of the Coalition government’s demolition of the previous Labor government’s policy, which among other things directed local councils on the coast to enforce the climate change and sea level rise predictions of the UN Intergovernmental Panel on Climate Change.

Under that regime, councils in some cases included sea-level rise warnings on the planning certificates of some seaside properties based not on what was happening on the beaches concerned — including one that is acquiring sand naturally and pushing back the sea — but on IPCC predictions.

Many owners found that under this policy, their properties became almost unsaleable.

The Taree Council might be the smartest one of all — it plans to let the landowners figure it out themselves. They need to work out the risks, and how much action they should take. That will sort out the skeptics from the gullibles.

Congratulations to researcher Phil Watson, and Bob Carter, and to landholders on NSW coasts. I hear that some especially deserve credit: “Strong thanks are due to Batemans Bay residents Neville Hughes, Pat Aiken and other coastal NSW resident groups for their unwearying opposition to, and protests against, the former policy.”

More information

P. J. Watson (2011), “Is There Evidence Yet of Acceleration in Mean Sea Level Rise around Mainland Australia?”, Journal of Coastal Research, 27(2), 368–377. doi:10.2112/JCOASTRES-D-10-00141.1



Blockbuster: Are hot days in Australia mostly due to low rainfall, and electronic thermometers — not CO2?

Blame dry weather and electronic sensors for a lot of Australia’s warming trend…

In this provocative report, retired research scientist Bill Johnston analyzes Australian weather records in a sophisticated and very detailed way, and finds they are “wholly unsuitable” for calculating long-term trends. He uses a multi-pronged approach, looking at temperatures, historical documents and statistical step changes, and, in a novel step, studying the way temperature varies with rainfall as well.

His two major findings are that local rainfall (or the lack of it) has a major impact on temperatures in a town, and that the introduction of electronic sensors in the mid-1990s caused an abrupt step increase in maximum temperatures across Australia. There will be a lot more to say about these findings in coming months — the questions they raise are very pointed. Reading between the lines, if Johnston is right, a lot of the advertised record heat across Australia has more to do with equipment changes, homogenisation, and rainfall patterns than with a long-term trend.

Bill Johnston: On Data Quality [PDF]

“Trends are not steps; and temperature changes due to station changes, instruments and processing is not climate change”, he said. “The Bureau are pulling our leg”.

The years when more rain falls are more likely to be years without high maximums. Bill Johnston finds that for every 100mm of rainfall, the maximum temperatures were about a third of a degree cooler.
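As a rough illustration of the kind of fit involved (a minimal sketch on synthetic numbers, not Johnston’s code or BOM data; the 0.33 °C-per-100 mm slope is simply planted in the fake series), the cooling per 100 mm can be read straight off a least-squares line:

```python
# Minimal sketch: estimate the cooling per 100 mm of annual rainfall from a
# straight-line fit of annual maximum temperature against annual rainfall.
# Synthetic data only -- the "true" slope of -0.33 C per 100 mm is assumed.
import numpy as np

rng = np.random.default_rng(1)
rain_mm = rng.normal(600.0, 150.0, 60)                     # 60 years of annual rainfall totals
tmax = 24.0 - 0.0033 * rain_mm + rng.normal(0.0, 0.3, 60)  # Tmax with noise around the assumed relation

slope, intercept = np.polyfit(rain_mm, tmax, 1)            # ordinary least-squares line
print(f"Fitted cooling per 100 mm of rain: {abs(slope) * 100:.2f} C")
```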

Figure: On the left-hand side the step-ups in temperature are shown; electronic sensors were introduced in the mid-1990s. The right-hand graphs show how rainfall keeps maximum temperatures cooler. (Data are grouped “a, b, c” between the steps.)

Johnston uses these rainfall correlations as a tool to check the quality of temperature records. When combined with step-change analysis, he finds that unrecorded site moves or station changes are common. When the automatic sensors were introduced, temperatures suddenly jumped up, and their relationship with rainfall broke down.

“Fleeting parcels of hot air, say from passing traffic or off airport runways, are more likely to be sensed by electronic instruments than by thermometers”, he said. 

Automatic weather stations (AWS) were introduced across Australia’s network within a few years. Because so many stations made the switch around the same time, homogenization procedures don’t detect their bias, and assume a natural step up in warming occurred. Worse, the artificial warm bias is transferred to stations that are not automated, reinforcing trends that don’t exist!
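To see why a network-wide switch is hard for neighbour-based checks to catch, here is a toy simulation (my own illustrative sketch, not something from Johnston’s report): two hypothetical stations both step up by 0.3 °C in 1996, and the difference series between them, which is what pairwise breakpoint tests examine, stays flat.

```python
# Toy illustration: a step change that hits every station at once leaves no
# trace in a neighbour-difference series, so pairwise homogenisation keeps it.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1970, 2016)
step = np.where(years >= 1996, 0.3, 0.0)      # hypothetical +0.3 C instrument step at both sites

station_a = 25.0 + rng.normal(0.0, 0.3, years.size) + step
station_b = 24.0 + rng.normal(0.0, 0.3, years.size) + step

diff = station_a - station_b                  # the series a breakpoint test would examine
print("Mean difference before 1996:", round(diff[years < 1996].mean(), 2))
print("Mean difference after 1996: ", round(diff[years >= 1996].mean(), 2))
# Both absolute records step up together, yet the difference series shows no break.
```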

“Homogenisation is nonsense, and an open public inquiry into the Bureau’s activities is overdue”, he said.

The Bureau must be audited. Stations should not be homogenized until they are analyzed individually. And the analysis should start with site inspections and a detailed historical account of what is known about each site.

“The Bureau has scant knowledge about many important sites”, Bill said, “and some of what they claim cannot be trusted”.

Temperature is strongly related to local rainfall

Hot years are dry years, and wet years are not hot. It’s tritely obvious, yet kinda profound.  Johnston finds that from half to three quarters of the temperature variation is caused by changes in rainfall. When the land is bone dry, it heats up fast. But when soil moisture is high, the Sun has to evaporate the water first in order to heat the soil. The atmosphere above dry land has lower humidity and will rise and fall in temperature swings that are far larger than the atmosphere above moist landscapes. It varies somewhat with every town, but the relationship is consistent across the continent.

He also found that using rainfall to analyze temperature records reduces variation, while revealing aberrations in the data. What do we make of a town where the effect of rainfall on temperatures has a linear relationship for decades then suddenly changes to a random pattern? Homogenization can produce trends that don’t belong to the site.

New electronic sensors cause an artificial jump

Not only do electronic sensors pick up shorter spikes in temperature, they are also not linear instruments like mercury thermometers. “AWS don’t measure temperature linearly like thermometers do, which causes them to spike on warm days. This biases the record.”

There was also a wave of undocumented site and station changes around the time the Bureau took over weather observations from the RAAF in the 1950s. To mention a few, locations affected included Alice Springs, Norfolk Island, Amberley, Broome, Mt Gambier, Wagga Wagga and Laverton.

Disturbingly, he finds that after the many site and equipment changes have been accounted for, and the variability due to rainfall has been removed, no temperature trend remains.

Below Johnston discusses Cowra and Wagga in detail. He observed the weather at Wagga Wagga Research Centre, visited many other sites and discussed issues with the staff. He has worked with climate data, calibrated commercial automatic weather stations and used climate data in many of his peer-reviewed studies.

Bill is pushing for an open public inquiry into the Bureau’s methods, its handling of data, and biases in climate records.

Bill Johnston points out that local rainfall is useful for checking temperature data

Ignoring heat storage in the landscape, which is cyclical, the local energy balance partitions heat loss between evaporation (which cools when the local environment is moist) and sensible heat transfer to the atmosphere (advection) during the day plus radiation at night (which warm when the environment is dry). Thus the longer it’s dry, the hotter it gets.

Rainfall is (or should be) linearly related to temperature, especially Tmax, but independent of it.

If the local heat-load changes, forcing a significant base-level shift, the relationship is still linear but is offset by the impact of the change.
If data become grossly disturbed or dislocated from the site (if they are fabricated, for example, or implanted from somewhere else), variation around the relationship increases, the data become random relative to each other, and statistical linearity is lost. (Rainfall seasonality may also have an impact on this, but is not considered here.)

So linearity is expected, and we need a statistic that indicates how good (or bad) the relationship is.
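One obvious candidate (an assumption about how such a check could be coded, not necessarily Johnston’s exact method) is the R² of the straight-line fit of Tmax on rainfall, computed separately for the segments between suspected steps:

```python
# Sketch: use R^2 of the Tmax-vs-rainfall line as the "how good (or bad) is the
# relationship" statistic; a collapse of R^2 in one segment flags data that
# have become dislocated from the site.
import numpy as np

def linearity_r2(rain_mm, tmax):
    """R^2 of an ordinary least-squares straight-line fit of Tmax on rainfall."""
    slope, intercept = np.polyfit(rain_mm, tmax, 1)
    predicted = slope * rain_mm + intercept
    ss_res = np.sum((tmax - predicted) ** 2)
    ss_tot = np.sum((tmax - np.mean(tmax)) ** 2)
    return 1.0 - ss_res / ss_tot

# Usage: split the record at a suspected step change in year k and compare
#   linearity_r2(rain[:k], tmax[:k])  versus  linearity_r2(rain[k:], tmax[k:])
```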

Keep reading  →


Divestment protests are astroturf: university students are paid pawns to be political activists

Inside Divestment: The Illiberal Movement to Turn a Generation Against Fossil Fuels

The National Association of Scholars (NAS) study: Inside Divestment (pdf)

The Fossil Fuel Divestment Movement (FFDM) is running at more than 1,000 American colleges and universities, and about 30 of them have “divested”. It’s embedded in the tertiary sector: some 4,000 professors have signed supportive petitions, and some teach it in lectures.

The authors of a new 290-page study on divestment campaigns say that it’s not the grassroots movement it poses as, but is driven by professional political activists and funded by wealthy donors and foundations. The organization pushing the US campaign pays and trains students to be activists. Key dates and events are decided from the top down.

It’s all another charade — the holier-than-thou promises to sell fossil fuel stocks are often just empty PR stunts. Most divestment declarations are hollow: 66% of the universities say they are divesting while hanging on to some fossil fuel stocks, and four universities have not sold a thing — including Oxford. (Where are those protests?) But in the end it doesn’t matter whether the divestments actually happen, a mere detail, because if they did, they wouldn’t affect the share price anyway, and they certainly won’t cool the planet. Success is not about cutting CO2 emissions or investments; it’s about getting out a press release and then hyping up the students. (Though if the students also talk their uni funds into buying up solar and wind, I expect some renewables investors might be happy about that. Ka-ching. Ka-ching.)

The protests are — as is the norm in higher education — not about free, persuasive debate, but about bullying, intimidation and smears. Fools like you and me might think students would be grateful for the income from fossil fuel investments which funds their universities. Instead, impressionable pawns are taught to hate the energy companies which probably help them make it to their classes, and sit in a warm room when they get there. The whole idea is so nutty, anti-free-speech, and bad-mannered that naturally it’s most popular at places like Harvard, Stanford, Yale and Swarthmore. That’s where the most competitive students vie to be the most fashionable thinkers. Err, “congrats” to them.

I want to know where the student movement is that is protesting that universities are putting politics above students. How much extra do students have to pay to make up for the money burned on a symbolic protest? What are students missing out on because the investment board chose less profitable investments? Where are those placards…

Keep reading  →


New Science 19: The invisible nameless model that controls the whole field of climate science

Don’t underestimate the importance of the nameless basic model. It sounds small, but in the culture and philosophy of climate science it’s bigger and carries more weight than the massive hairy GCMs. Like an invisible gossamer web, it’s overarching. It spans and defines all the other models. When they produce “dumb” answers, the basic model holds them in, for thou shalt not stray too far from the climate sensitivity defined by the basic model. It defines what “dumb” is. (It’s just “basic physics” after all.) One model to bind them all. What could possibly go wrong?

A lot, apparently. The physics might be right, but the equations are calculating imaginary conditions. The answers might be arithmetically correct but useless at the same time. They miss the real route that energy flows through to space.

By definition, as long as the basic model is wrong, the GCMs can never get it right.

It’s not like climate scientists consult the oracle of the basic model every day, or even once a year — they don’t need to. They were taught it in their climate larval stage, often long before they’d written a single paper. The basic model shows that the warming of the 1980s and 90s was “caused” by CO2. Then the GCMs are tuned to that trend and that assumption.

The silent basic model is thus the whole nub of the problem. People have been arguing about the parameters when they should also have been talking about the equations that connect those parameters — the architecture. They added the separate warming forces before they did the feedbacks, but they should have done the feedbacks first and added everything together afterwards.

My favorite line below: “Once two numbers are added, there is no way of telling what the original numbers were just from their sum.” The models can’t figure out what was man-made and what was natural…

In the basic model the atmosphere responds the same way to all forms of warming — always the water vapor amplification, never simple rerouting.

Good news: we’re in a low-acronym zone today. This post is ASR- and OLR-free, and low on WVELs. No equations!

The solar model starts again soon…



19. Comments on Conventional versus Alternative

Dr David Evans, 11 November 2015, David Evans’ Basic Climate Models Home, Intro, Previous, Next.

This post completes the first two parts of this series — problems with the conventional basic climate model, and fixing them with the alternative basic climate model. Here are just some general comments, tying together some of the main ideas. There are hardly any acronyms, and no equations.

After this the series will embark on its third and final part, an hypothesis about the main cause of global warming. Kindly note that whether the third part of the series eventually proves to be right or wrong has no bearing on the correctness of these first two parts about climate model architecture.

This post is only about basic models, not GCMs, except where it specifically states otherwise.

Two Major Errors in the Conventional Architecture

Keep reading  →


Those policies will cause HOW much cooling exactly? The best-case fantasy is a mere 0.17°C by 2100

Suppose we give billions to the bureaucratic geniuses in Paris. Suppose they are right about how global warming works (though we know they are not). What do we get for all that money?

Combined, with all plans carried out and the best case successful, at a cost of hundreds of trillions or more: 0.17°C

More realistic, more pessimistic case: 0.05°C

If the infra-red reroutes through the atmosphere and climate sensitivity has been overestimated 5- to 10-fold: 0.02°C

The UN wants us to spend $89 trillion by 2030 to “green up” everything. For that we hope, in theory and if we’re lucky, to get a reduction of one sixth of a degree 70 years later. Rush, rush, buy that plan today! Order two, and don’t count the dead.

The real reduction, using the best empirical data, and a corrected basic model, is more likely to be in the order of one thousandth of a degree 9 decades from now.
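For what it’s worth, the arithmetic behind those headline figures is a single division (using the post’s own round numbers, so treat the result as illustrative only):

```python
# Back-of-envelope cost per degree of avoided warming, using the figures above.
spend_trillions = 89.0            # UN "green up" spending to 2030, in $ trillion
best_case_cooling_c = 0.17        # best-case avoided warming by 2100, in deg C
low_sensitivity_cooling_c = 0.02  # avoided warming if sensitivity is overestimated 5-10 fold

print(f"Best case:            ${spend_trillions / best_case_cooling_c:,.0f} trillion per degree avoided")
print(f"Low-sensitivity case: ${spend_trillions / low_sensitivity_cooling_c:,.0f} trillion per degree avoided")
# Roughly $520 trillion and $4,450 trillion per degree, respectively.
```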

Keep reading  →


Megadroughts in past 2000 years worse, longer, than current droughts

What hockeystick eh?

A new atlas shows droughts of the past were worse than those today — and they cannot have been caused by man-made CO2. Despite claims of “unprecedented” droughts, the worst droughts in Europe and the US were a thousand years ago. Cook et al. (2015)[1] put together an Old World Drought Atlas from tree-ring data as a proxy for summer wetness and dryness across Europe. They compare the severity and timing of European droughts with the North American Drought Atlas (NADA) released in 2004. Yes, it’s a tree-ring study, with all the caveats about how trees respond to several factors at once, etc. But at least the modern era is measured with the same proxy as the old eras.

Something else is causing droughts, something modern models don’t include:

“megadroughts reconstructed over north-central Europe in the 11th and mid-15th centuries reinforce other evidence from North America and Asia that droughts were more severe, extensive, and prolonged over Northern Hemisphere land areas before the 20th century, with an inadequate understanding of their causes.”

The worst megadrought in the California and Nevada regions was from 832 to 1074 CE (golly, 242 years). The worst drought in  north-central Europe was from 1437 to 1473 CE, lasting 37 years.

Climate models don’t predict any of the droughts below, and all of them occurred before 99% of our emissions were released.

The authors compare results from the new atlas and its counterparts across three time spans: the generally warm Medieval Climate Anomaly (1000-1200); the Little Ice Age (1550-1750); and the modern period (1850-2012).

The atlases together show persistently drier-than-average conditions across north-central Europe over the past 1,000 years, and a history of megadroughts in the Northern Hemisphere that lasted longer during the Medieval Climate Anomaly than they did during the 20th century. But there is little understanding as to why, the authors write. Climate models have had difficulty reproducing megadroughts of the past, indicating something may be missing in their representation of the climate system, Cook said.


Figure 3A Maps from a new 2,000-year drought atlas show rainfall conditions over the whole continent, and much of the Mediterranean. A chart for 1741 shows severe drought (brown areas) running from Ireland into central Europe and beyond. A chart for the year 1315 shows the opposite problem—too much rain (dark green areas), which made farming almost impossible. (Cook et al., Science Advances, 2015)

A large part of the Northern Hemisphere is included in the study.

Figure 4B: The North American Drought Atlas, 1000–2000 CE.

The worst droughts:

Keep reading  →


Weekend Unthreaded


Beautiful to watch: Davies UK M.P. quotes IPCC in Parliament as a reason to be skeptical

This video gives me hope. Finally we are starting to see more sane commentary in Western parliaments. David TC Davies MP shows how politicians can master enough of the scientific detail in this debate to crush the usual trite bumper-sticker “consensus” hogwash. He talks of Roman warming, the Medieval Warm Period, the Younger Dryas, and the age of the Earth. It’s high-school-science level, but more than enough to expose some of the silliness. He also counters the “climate change denier” tag. He cites just enough key numbers to back up each of his points; his skill here is in prioritizing the numbers that matter. Here’s hoping a few of the silent political skeptics will feel more confident to speak out. The bullying and name-calling break when enough people stand up to them. That’s coming.

No one I’ve ever met has ever suggested the climate never changes…

Even the IPCC is not saying that most of the warming [since industrialisation] is caused by humans …

It is absolutely certain that the more we rely on renewable energy the more we have to pay for it. No politician from any party should be running away from this. They should be willing to go out and make the argument if they think we should be paying more … but none of them are. Nobody thinks it’s a good idea to increase energy bills…

Mr Davies is a former steelworks worker and a former Special Police Constable.

The official David Davies MP webpage and Davies’ personal website.

h/t Willie Soon, and Lou M.


The mystery of a massive 9Gt of CO2 that came and went — could it be phytoplankton?

There is a mystery peak in global CO2 levels in 1990. For some reason, from 1989 global carbon levels suddenly jumped higher than they normally would — by 9,000 million tonnes of CO2 (equivalent to about 2,500 Mt of carbon)*. It’s only a little blip in an upward line, but as a deviation from the long steady norm it’s a dramatic change (see the second graph below). Within a few years the excess disappeared and the reasonably straight-line increase in CO2 resumed. The sudden jump is equivalent to nearly half of our total annual human fossil fuel emissions. Nothing about this peak fits with the timing of human-induced fossil fuel emissions. These were not big years in our output (indeed, it coincides with the collapse of the Soviet bloc, when inefficient Russian industry was shut down).
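The bracketed conversion is just the ratio of molar masses, since carbon makes up 12 of the 44 g/mol of CO2. For example:

```python
# Convert the ~9 Gt CO2 anomaly into tonnes of carbon using molar masses.
co2_mt = 9000.0                   # million tonnes of CO2
carbon_mt = co2_mt * 12.0 / 44.0  # million tonnes of carbon
print(f"{co2_mt:.0f} Mt CO2 is about {carbon_mt:.0f} Mt of carbon")  # ~2455 Mt, i.e. roughly 2,500 MtC
```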

The mystery of the massive CO2 bubble exposes how little we know about why CO2 levels rise and fall, and whether human emissions make much difference. The world is spending $4 billion a day trying to change this global CO2 level of 0.04% (400 ppm), but apparently other large forces are at work pushing up CO2 and then absorbing it, and we don’t even know what they are.

Tom Quirk has dedicated himself to isolating this quixotic spike and hunting for the cause. He wondered if the ocean was responsible. By looking at isotopes of carbon, he finds that it was not produced out of the dissolved CO2 from the ocean. Instead the CO2 was released from a biological source, mostly one in the Northern Hemisphere. With some great sleuthing he finds a remarkable paper from 15 years ago that documents a sudden drop in fish stocks at the same time. He suggests that there was a regime shift in the ocean then, and the currents stopped stirring up as many nutrients. This meant phytoplankton activity was lost, fish starved, and the lack of activity by marine biology meant CO2 accumulated in the air. Within the next few years the phytoplankton recovered and drew out that extra CO2.

It’s as if there were an extra three Chinas on Earth for a year, pouring out extra CO2. We see how fast biology absorbs that extra load, and wonder (yet again) what the fuss is all about. Tax the krill instead, eh?



Some inconvenient measurements

Guest Post by Tom Quirk

There is an unexplained atmospheric CO2 “bubble” centred around 1990. The apparently smooth and continuous rise in atmospheric CO2 concentrations is broken by an anomaly that can be seen in the figure below.


Average yearly CO2 concentrations at the South Pole and Point Barrow from Scripps measurements. The straight line is a best fit to the South Pole data, with an annual increase of 1.5 ppm per year.

Plotting the residual differences of the measurements from the straight-line fit shows that as the world cooled in the 1960s, excess CO2 accumulated at low annual rates. During the 1970s and 1980s CO2 was accruing at about 1.5 ppm per year, the average rate of the last 55 years. Then suddenly in 1989–1991 large amounts of CO2 were added to and withdrawn from the atmosphere. A further turning point occurred in 1995, when the annual rate of increase reached its highest level.


Residual differences from the straight-line fit to average yearly CO2 concentrations at the South Pole (see above), along with similar residuals for Mauna Loa and Point Barrow. Note the breaks in the trends in 1977 and 1995, at the times of phase changes in the PDO and ADO.
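The residuals in that figure are simply departures from a least-squares line. A sketch of the calculation (illustrative code only, not the actual Scripps data handling):

```python
# Sketch: residuals of annual mean CO2 from a straight-line fit, so that a
# "bubble" such as 1989-1991 stands out against the ~1.5 ppm/yr trend.
import numpy as np

def co2_residuals(years, co2_ppm):
    """Departures of annual CO2 (ppm) from an ordinary least-squares line."""
    slope, intercept = np.polyfit(years, co2_ppm, 1)  # slope ~1.5 ppm/yr for the South Pole fit
    return co2_ppm - (slope * years + intercept)

# Usage: residuals = co2_residuals(years, annual_co2)
# Positive residuals clustered around 1990 mark the excess; their later decline
# marks the withdrawal.
```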

Keep reading  →


New Science 18: Finally climate sensitivity calculated at just one tenth of official estimates.


Image: NASA

In years to come it may be recognized that this blog post produced the first accurate modeled figure for climate sensitivity. Equilibrium climate sensitivity sounds dry, but it’s the driving theme, the holy grail of climate science. The accurate figure (whatever it is) summarizes the entirety of carbon dioxide’s atmospheric importance. That number determines whether we are headed for a champagne picnic or a baking apocalypse.

To calculate a better estimate, David identified the flaws of the conventional basic model and rebuilt it. The basic climate model is the top-down approach, looking at inputs and outputs of the whole system. It defines the culture and textbooks of the modern global warming movement. GCMs (the big hairy coupled global models) are bottom-up approaches, doomed to failure by trying to add up every detail and ending up drowning in mile-high uncertainty bands. But the GCMs are ultimately tweaked to arrive at roughly the ballpark climate sensitivity that the textbook “basic physics” model dictates. Hence this core model is where the debate needs to be. (Everyone knows the GCMs are broken.)

For decades the world of conventional climate research has been stuck in a Groundhog Day, with major research overturning older ideas while somehow the upper and lower bounds of climate sensitivity stayed the same. It’s always 1.5–4.5°C (and their models never work). Their “best” estimates of climate sensitivity are relentlessly, slowly shrinking (they were around 3.5°C, now around 2°C). The new alternative model doesn’t rely on the bizarre idea that all feedbacks can only operate off the surface. The alternative model (we are going to have to come up with a better name) allows feedbacks to act differently for different warming influences, and thus energy can reroute from one outgoing “pipe” to another.

The river of energy is flowing to space. If we put a rock in the way, the flow just reroutes around it.

In the (still nameless ;-) ) alternative model, Evans uses the same physics but better architecture and a few more empirical data points, and we can finally estimate what the true climate sensitivity must be.


Because the “pipes” for outgoing radiation to space are elastic and can adapt to increases in energy, the climate sensitivity to CO2 could be very low. Indeed it is not possible to put a lower bound on the figure — it may be almost zero. It is possible to put an upper bound on it, which is about half a degree. The most likely estimate is around 0.25°C. Empirical estimates by Lindzen and Choi,[1][2] Spencer and Braswell[3][4] and Idso[5] suggest it is 0.4–0.7°C. We can argue the toss between 0.25 and 0.5, but no one can argue that we need to spend a single dollar to reduce our emissions if doubling CO2 only causes minor and beneficial warming.

What is striking is how small the changes need to be to compensate for “all that extra CO2”. The effect of the increased CO2 of the last few decades would have been neutralized if the height of the cloud tops or the water vapor emissions layer fell by just a few tens of meters. These changes are so small they are hard to detect, but there are empirical observations suggesting both may have occurred to some degree. The small changes required show just how easy it would be for the atmosphere to emit a constant energy flow, regardless of a blockage in one “pipe”.
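As a rough sanity check on the “few tens of meters” figure, here is an order-of-magnitude calculation. The lapse rate (~6.5 K/km) and effective emission temperature (~255 K) are standard textbook values, my assumptions rather than numbers taken from the post:

```python
# Order-of-magnitude check: extra outgoing radiation from a small drop in the
# height of an emission layer. All inputs are assumed round numbers.
sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
t_emit = 255.0    # approximate effective emission temperature, K
lapse = 6.5e-3    # lapse rate, K per metre
dh = 30.0         # hypothetical drop in emission-layer height, metres

dt = lapse * dh                      # the emitting layer warms by ~0.2 K
dolr = 4.0 * sigma * t_emit**3 * dt  # extra emission, ~0.7 W/m^2
print(f"A {dh:.0f} m drop adds about {dolr:.2f} W/m^2 of outgoing radiation")
# That is the same order as the CO2 forcing added over the last few decades
# (very roughly half a watt per square metre), which is the sense in which such
# a small shift could offset it.
```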

DEFINITION: The equilibrium climate sensitivity (ECS) is the surface warming ΔTS when the CO2 concentration doubles and the other drivers are unchanged. Note that the effect of CO2 is logarithmic, so each doubling or fraction thereof has the same effect on surface warming.
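Written out as formulas (the 5.35 coefficient is the widely used Myhre et al. logarithmic forcing approximation, added here for context rather than taken from this post):

```latex
% ECS definition and the logarithmic dependence on CO2 concentration
\Delta T_S = \mathrm{ECS}\,\log_2\!\frac{C}{C_0},
\qquad
\Delta F \approx 5.35\,\ln\!\frac{C}{C_0}\ \mathrm{W\,m^{-2}}
```

where C0 is the starting CO2 concentration and C the final one, so each doubling adds the same surface warming ΔTS.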

–  Jo



18. Calculating the ECS Using the Alternative Model

Dr David Evans, 5 November 2015, David Evans’ Basic Climate Models Home, Intro, Previous, Next, Nomenclature.

This post employs the alternative model to quantitatively analyze the climate of the last few decades, estimating the CO2 sensitivity (λC), the fraction of global warming due to increasing CO2 (μ), and the equilibrium climate sensitivity (ECS). The formulas for these quantities were derived in post 16.

Here is the spreadsheet with the alternative model and the numerical calculations in this post: download (Excel, 250 KB).

Simple Case

Keep reading  →
