
New Science 1: Pushing the edge of climate research. Back to the new-old way of doing science

For those of you who are die-hard puzzle solvers here to spar about cutting-edge research: good news, here’s where we begin the long-awaited update to Dr David Evans’ climate research. There are a few surprises: sacred cows we did not expect we would need to challenge, like the idea of “forcings”.

Government science is stuck in a rut, strangled by the attempt to capture the creative genius of discovery and force it through a bureaucratic formula, as if discovery could work to a deadline or be judged by the number of papers, or pages, or citations, or by b-grade officials. Blogs are new, but this form of independent scientific research, done for the thrill of discovery, outside institutions and funded by philanthropists, is the way science was mainly done before WWII.

For the first time we are going to explain the architecture of the inner core of the climate models, the small model at the center that the big GCMs are built around. It is mostly a physics model, and it’s mostly “basic” and mostly right. It’s the reason for the implacable confidence of the establishment in the climate debate. But there are a couple of big problems… and we’ll get to them. Mathematical analysis found the problem, but once we explain it, it will seem obvious even without any maths. There will be a moment when people will say “Wow, they really did that?”

Evans brings his experience as a heavyweight expert modeler with a Fourier maths PhD, and works down through the basics and the key papers to expose this flaw in climate model architecture. It will turn this debate upside down. There are some things I used to say that I now need to say differently. Some points we used to concede we now question.

To build a solar climate model, David had to unpack the current CO2-driven model. As far as we know, this is the first time an independent modeling expert has gone through the climate architecture, looking at the key equations, assumptions, and structure.

What follows is a big shift forward, for us personally, as well as for the debate. We have always criticized the big spaghetti GCMs and their feedbacks. But within the big model there is a core basic physics model. That core is the foundation of the idea that a doubling of CO2 causes 1.2 °C of warming. Think Hansen 1984, the Charney Report 1979, then think Arrhenius 1896. (No, this is not about disputing whether the “Greenhouse effect” is real or violates the second law of thermodynamics — we have outspokenly disagreed with such disputation, and still do.)

What was curious last time was the way skeptics led the debate. Fans of the establishment came up with no criticism of their own (bar the usual name-calling), and were left to copy skeptical lines. Disappointingly, some skeptics resorted to personal attacks, which we expect from alarmists. We have less patience for that time-wasting now.

In the end we are all on the same team; we’d rather work together.

Thanks to the supporters and philanthropists (largely readers of this blog) who make it possible. After seven years of blogging here, and three years of David working full-time unpaid, this office and household are entirely dependent on donations. It’s a strange, hard path. Exciting, but full of potholes. We are determined to improve climate research, for the sake of farmers, for families, and for the environment. A talent finds an outlet, somehow. A gift must be used. Thanks to all those who have faith that research for its own sake is worth pursuing.

Come on a journey with us…

Jo

1. Introducing a Series of Blog Posts on Climate Science

Dr David Evans, 22 September 2015.

Breaking the Intellectual Standoff

There is an intellectual standoff in the climate change debate. Skeptics point to empirical evidence that disagrees with the climate models. Yet the climate scientists insist that their calculations showing a high sensitivity to carbon dioxide are correct — because they use well-established physics, such as spectroscopy, radiation physics, and adiabatic lapse rates.

How can well-accepted physics produce the wrong answer? We mapped out the architecture of their climate models and discovered that while the physics appears to be correct, the climate scientists applied it wrongly. Most of the projected warming comes from two specific mistakes.

Given all the empirical evidence against the carbon dioxide theory, there had to be problems in the basic sensitivity calculation. Now we’ve found them.

Series of Blog Posts

We are going to explain this and more in a series of blog posts. To build a better model, we had to understand the conventional basic climate model, the core model used to compute the high sensitivity to carbon dioxide. We unpack it, show the errors, then fix it. We then calculate the sensitivity to carbon dioxide using the alternative model, and it turns out to be much lower.

If carbon dioxide didn’t cause much of the recent global warming, what did? The series continues with the revamped notch-delay solar theory (the previous problem concerning causality of notch filters has been resolved). This finds evidence that albedo modulation involving the Sun is the likely cause of global warming, and produces a falsifiable prediction for the climate of the next decade.

In its complete form this work has evolved into two scientific papers: one on the modelling and mathematical errors in the conventional basic climate model and how to fix them (carbon dioxide isn’t the culprit), and another on the revamped notch-delay solar theory (it’s the Sun). Both are currently undergoing peer review. These posts are useful for airing the ideas for comment and for testing the papers for errors.

The Basic Model is Crucial to Climate Alarmism

Our understanding of the effect of carbon dioxide rests on the conventional basic climate model.

That model is used to calculate the sensitivity of surface temperature to the concentration of atmospheric carbon dioxide. Dating back to 1896 with Arrhenius, it was updated in the 1960s and 1970s. The model is presented as the first argument in the Charney Report of 1979, the seminal document that ushered in the current era of concern about carbon dioxide [1]. It is the cornerstone of the carbon dioxide theory of global warming. Predating computer simulations, it is often referred to as “basic physics” (though somewhat inaccurately — the basic model is actually the application of basic physics to the climate).

Despite the numerous mismatches between theory and climate observations to date, many climate scientists remain firm in their belief in the danger of carbon dioxide essentially because of this model, rather than because of huge opaque computer models. The basic model ignited concern about carbon dioxide; without it we probably wouldn’t be too worried.

There is no empirical evidence that rising levels of carbon dioxide will raise the temperature of the Earth’s surface as fast as the UN’s Intergovernmental Panel on Climate Change (IPCC) predicts. The predictions are entirely based on models and calculations.

Modern climate science is imbued with the ideas of the basic model. For instance, it is from this model that we get the notion that the effect of a climate influence can be encapsulated by the radiation imbalance or “forcing” it causes.
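
To give a feel for how that bookkeeping runs, here is a minimal sketch of the textbook chain from carbon dioxide to forcing to warming. The 5.35 logarithmic coefficient and the roughly 0.3 and 0.8 K per W/m² sensitivity factors are the commonly quoted round numbers, used here purely as illustrative assumptions, not figures from the papers discussed in this series:

```python
import math

def co2_forcing(c_new_ppm, c_old_ppm):
    """Commonly quoted logarithmic forcing expression for CO2, in W/m^2.
    The 5.35 coefficient is a standard textbook value (illustrative)."""
    return 5.35 * math.log(c_new_ppm / c_old_ppm)

# The "forcing" from doubling CO2, e.g. 280 ppm -> 560 ppm:
delta_f = co2_forcing(560, 280)        # about 3.7 W/m^2

# The basic model then multiplies the forcing by a sensitivity factor:
# roughly 0.3 K per W/m^2 with no feedbacks, roughly 0.8 K per W/m^2
# once the conventional feedbacks are included (illustrative values).
no_feedback_warming = 0.3 * delta_f    # ~1.1 C
with_feedbacks      = 0.8 * delta_f    # ~3.0 C

print(f"Forcing for doubled CO2: {delta_f:.2f} W/m^2")
print(f"No-feedback warming:     {no_feedback_warming:.1f} C")
print(f"Warming with feedbacks:  {with_feedbacks:.1f} C")
```

The point to notice is the shape of the chain: a climate influence is reduced to a single number, its forcing, and the forcing is multiplied by a sensitivity. Later posts examine whether that chain is wired up correctly.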

Figure 1: The basic climate model applies basic physics in a simple calculation done by hand, on paper.

Figure 2: A general circulation model (GCM) is a big, opaque climate model expressed as many thousands of lines of code, run on big computers, that attempts to simulate the Earth’s climate in fine detail. It mirrors the ideas of the basic climate model and broadly gives the same results.

Most skeptics arrived at their doubts about the carbon dioxide theory of global warming by noting the discrepancies between that theory and empirical evidence. They are empiricists. Consequently, many skeptics are unaware of the basic model, or of its power, because it has been irrelevant to them.

But establishment scientists are well aware of the basic climate model, and that it unambiguously points to carbon dioxide as the cause of recent global warming. If you believe the conventional basic model, there is indeed good reason for being alarmed about rising carbon dioxide levels. So the first part of this series, on the errors in the conventional model, may mean more to the leading lights of the establishment than to skeptics.

The conventional basic climate model is superficially compelling. It is at the heart of the belief that carbon dioxide poses a dangerous threat. It must be compelling—otherwise why would many sensible scientists still support the theory in the face of ample confounding evidence? They are convinced the basic physics is correct, so they conclude that something must be wrong with any data that appears to contradict it.

A Rock and a Hard Place

Something has to give. Which is wrong, the theory or the empirical evidence?

That “overwhelming evidence” the establishment talks about but never shows? It’s basically the conventional basic climate model, and by extension the big computerized models (general circulation models, or GCMs). Some would try to obscure this by adding all the data they say supports the GCMs. But when you boil it all down, past the conflicting real-world data and the opaque computer models, what they mean is the basic climate model. That’s it. The basic model is why they are so confident they are correct. Of course, as they too know, models are not evidence, so they have been less than forthcoming about this publicly, at least on the PR front. Over the next two posts, we will present the basic model properly, in full detail, in keeping with the leading theorists and the dominant textbook.

Science

Science was established in the Enlightenment, as a method for finding the truth that was independent of anyone’s authority. It held replication of experiments or observations as its highest authority, not any person. Prior to the advent of science, the highest authority was usually taken to be a person, either living, such as the Pope, or dead, speaking via ancient texts. If these were deemed to have pronounced a ruling on some issue, then experiments weren’t even attempted. Very importantly, science is a means of finding the truth that is independent of the political power structure — and look how technical progress accelerated once the Enlightenment began.

Figure 3: Galileo’s Leaning Tower of Pisa experiment showed that objects fell at the same speed regardless of their mass (ignoring air resistance), proving Aristotle wrong. For over 1,900 years Aristotle’s theory went untested.

It is a sad feature of the climate fracas that the modus operandi for most journalists and politicians has been to ask the perceived experts — the ones with the highest positions in the political structure, the bureaucracy. They have reverted to pre-scientific behavior, making some humans their highest authority. It is the skeptics who have continued the scientific practice of making evidence their highest authority — taking data as the source of truth, believing data rather than experts when they are in conflict.

Science is about observers reporting what they saw. Nowadays peer review and one-sided funding feed a confirmation bias. Lost is the culture where repeated observations matter and where no authority, and no opinion, is higher than the data.

Feedbacks

Skeptics who are aware of the basic climate model have generally accepted it, but dispute the value of the feedback parameter used in the model. They say the feedback parameter, the sum of all the feedbacks to the warming caused by increased atmospheric carbon dioxide, must be quite negative, thereby dampening the overall warming due to the carbon dioxide. But the establishment says the feedbacks are strongly positive in toto, amplifying the effect of the carbon dioxide.

While some individual feedbacks are negative, the biggest one (water vapor) is strongly positive, and the sum of all the known (widely accepted) feedbacks is strongly positive. The main feedbacks are based on established physics in most cases, and in the remaining cases are at least arguably supported by the big computerized models.

Skeptical scientists say there must be a missing negative feedback, massive enough to turn the overall feedback parameter from strongly positive to quite negative, thereby bringing the model in line with the empirical evidence for a low sensitivity to carbon dioxide. Assuming the logical framework of the basic climate model is sound, there is no other possibility.

So the argument over whether carbon dioxide controls the climate became an argument over the sign and size of feedbacks, with alarmists saying we know them all and have them quantified correctly (roughly), and skeptics saying there must be a massive negative feedback missing.

In the past I too have made this argument, but no longer.

Carbon dioxide increases, the surface warms, and the warming causes feedbacks; but critics have not been able to demonstrate any omitted feedback of significance, despite searching for over 30 years. Perhaps there aren’t any. In any case, in this series we are going to proceed as if all significant feedbacks are known, and furthermore as if they are correctly quantified as per the latest IPCC Assessment Report (AR5).
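
To see how that conventional tally turns feedbacks into a sensitivity, here is a minimal sketch using approximate round-number values of the kind quoted in AR5-era summaries (illustrative assumptions only, not exact AR5 figures):

```python
# Conventional feedback tally, in W/m^2 per kelvin of surface warming.
# Approximate round numbers of the kind quoted in AR5-era summaries;
# treat them as illustrative assumptions.
PLANCK_RESPONSE = 3.2        # basic radiative damping, before any feedbacks
feedbacks = {
    "water vapor":    +1.6,  # the big positive one
    "lapse rate":     -0.6,  # partly offsets water vapor
    "surface albedo": +0.3,
    "clouds":         +0.6,  # the most uncertain
}
F_2XCO2 = 3.7                # forcing for doubled CO2, W/m^2

def warming_per_doubling(extra_feedback=0.0):
    """Sensitivity implied by the tallied feedbacks, plus any hypothetical
    extra feedback (negative values damp the warming)."""
    net_damping = PLANCK_RESPONSE - sum(feedbacks.values()) - extra_feedback
    return F_2XCO2 / net_damping

print(f"No feedbacks:          {F_2XCO2 / PLANCK_RESPONSE:.1f} C")   # ~1.2 C
print(f"Conventional tally:    {warming_per_doubling():.1f} C")      # ~2.8 C
# A hypothetical missing feedback of about -3 W/m^2/K would be needed
# to pull the sensitivity down to roughly 1 C:
print(f"With -3 W/m^2/K extra: {warming_per_doubling(-3.0):.1f} C")  # ~0.9 C
```

That is the arithmetic behind the standoff: within this framework, the only route to a low sensitivity is a large negative feedback that nobody has identified.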

Here I will instead argue that the structure of the conventional basic climate model is wrong. The known ingredients of the model are correct, or near enough, but the model is connected up the wrong way — as if a plumber connected up the pipes wrongly (anyone see the movie Brazil?). Basically it is going to come down to one connection. The basic physics is correct, but the climate scientists misapplied it. After fixing the plumbing, it all flows beautifully.

This argument potentially breaks the intellectual logjam. The empirical reality was measured correctly after all.

Model Evolution

The basic climate model, like any model, simplifies reality by making approximations that seem reasonable. The model relies on the same techniques used in countless physical science models, but note the survivorship bias: those models have been tested against reality and found to be useful.

In modelling, it is difficult to know before testing whether a model will happen to work well enough. It is often impossible to know the impact of the errors introduced by the inevitable approximations, or whether anything vital has been left out. In fields where experiments can be performed quickly, like chemistry or electronics, a model is tested within hours or days and quietly discarded if it turns out not to work. But climate changes slowly, so testing the basic climate model has taken decades.

The fundamental predictions of modern climate science are failing — there is the stubborn fact of the “pause”, the water vapor emissions layer did not ascend when the surface was warming in the 1980s and 1990s (the “missing hotspot”), and temperature does not follow carbon dioxide in the ice cores.

The Three Big Model Failures

First, no conventional model predicted the pause. We’ve had increasing carbon dioxide (a third of all human carbon emissions in history have occurred since 1998) but not the commensurate rise in global temperature predicted by the IPCC (it has not warmed significantly since the late 1990s). The First Assessment Report of the IPCC in 1990 predicted warming of 0.2 to 0.5 °C per decade for the ensuing decades, whereas it warmed at most 0.17 °C per decade since then (and that was in the ’90s) — this is not a matter of interpretation or ambiguity; it is simply a matter of downloading any of the five main global temperature series. A quick tally of what those rates imply follows the three failures below.

Second, all mainstream climate models predict a “hotspot”, a warming in the upper troposphere (about 10 km or 6 miles up, in the tropics) caused by an ascending water vapor emissions layer, during periods of warming such as the 1980s and 1990s. This is crucial, because two thirds of their predicted warming is from water vapor; only one third is directly due to increasing carbon dioxide. So no hotspot means not much cause for alarm. Our only suitable instruments for detecting the hotspot are weather balloons—thirty million of them since the 1950s, released from hundreds of locations, twice a day. They show no hotspot, and indicate that the water vapor emissions layer descended slightly instead. Satellites are unsuitable because they intrinsically aggregate information from several vertical kilometers into each data point, but the predicted ascent is only tens of meters.

Third, changes in temperature did not follow changes in carbon dioxide over the last half million years, as predicted by climate scientists in the 1990s, but rather the other way around. This is also significant because this fact was well known and universally acknowledged by 2003, yet Al Gore made his movie two years later in 2005, where he presented the ice cores as his only evidence that carbon dioxide drives temperature. Gore introduced this segment of his movie with some lawyerly weasel words.
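
Returning to the first failure, here is the quick tally promised above, using only the rates quoted there (purely illustrative arithmetic):

```python
# Cumulative warming implied by the quoted decadal rates, 1990 to 2015
# (roughly 2.5 decades).
decades = 2.5
predicted_low  = 0.2  * decades   # 0.50 C
predicted_high = 0.5  * decades   # 1.25 C
observed_upper = 0.17 * decades   # about 0.43 C, an upper bound

print(f"FAR (1990) prediction: {predicted_low:.2f} to {predicted_high:.2f} C")
print(f"Observed since 1990:   at most {observed_upper:.2f} C")
```

Even the upper bound of the observed warming sits below the bottom of the predicted range.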

Starting Point

I’m a professional modeler. The ever-inquisitive Lord Christopher Monckton has been plaguing me for years with questions about the basic climate model, ever since I showed an interest when we met in Bali in 2007. But because contradiction of the model is a necessary starting point for the notch-delay solar theory, I investigated it with renewed determination in 2013.

Some may recall that the starting point for the notch-delay theory we presented in June 2014 was to answer the question: “If the recent global warming was associated almost entirely with solar radiation, and had no dependence on carbon dioxide, what solar model would account for it?” When launching that theory, I noted:

“We also clear up a few theoretical befuddlements about the influence of CO2 that may have caused warmists to overestimate the potency of rising CO2. The fans of the CO2 dominant models are not going to be happy. It seems the climate is an 80-20 sort of thing, where there is a dominant influence responsible for 80% of climate change and a tail of 20% of other factors. It turns out that the CO2 concentration is not the 80% factor, but in the 20% tail.”

At the time, no one asked me where I got those figures from…

It’s taken a while to sort out the problems of the conventional model, neatly and simply. [“A while” means months of toil tracing through papers, following dead ends, to get back to the points that matter. – Jo] But now we are ready, finally.

Quantitative Reasoning

These blog posts employ quantitative reasoning — there will be some equations, but they’ve been kept to a minimum.

Significance?

The basic model architecture is wrong. Carbon dioxide causes only minor warming. The climate is largely driven by factors outside our control. Windfarms and solar panels are not just poor at reducing carbon dioxide — even if they did succeed in reducing carbon dioxide, they’d be useless at cooling the planet. It is only four billion dollars a day worldwide, wasted.

The findings here are unlikely to be popular with the establishment, and perhaps not with some established skeptics either. Rebuilding paradigms is always painful. As J.K. Galbraith said in The Affluent Society, “We face here the greatest of vested interests, those of the mind.” Some old dogs will resent learning the new tricks. Career skeptics have their theories, their track records and preferences, and not all of them are compatible with this new material, though most are. If they expend hours examining someone else’s ideas, or put their credibility on the line endorsing them, what’s in it for them? Nothing really. The usual quid pro quo of the academic scientific world does not apply. Some will be annoyed they didn’t think of it (and didn’t we see shades of that last year, when the notch-delay solar theory was introduced?). All contributions will be carefully acknowledged and credited. We’d rather do this as a team than battle alone.

We expect the usual brickbats à la Alinsky. We are used to it. The last thing we expect in the big wide world is polite, curious conversation. But we live in hope. Perhaps this time some skeptics will have the discipline to disagree with the ideas without also loading in emotional ad hominem or fact-free attacks?

(An historical aside, possibly apocryphal: In the 1920s a German newspaper held a contest for the most sensational headline possible. Most entries dealt with the end of the world, the second coming of Christ, and things of that nature, but the winner was “Arch-Duke Franz-Ferdinand Alive, World War Fought by Mistake!”. In the same vein, though of course much less momentous: “Modeling Errors Found: Climate Wars Fought by Mistake!”.)

So dear reader (and dear taxpayers of the first world), this series is proffered in the hope that perhaps some will appreciate it, and that truth and sanity will eventually prevail, somewhere down the track…

By the way, physicist Christopher Keating offered $10,000 to anyone who could “disprove climate change”. If the offer were still open I’d enter and expect to win, but unfortunately he closed it a year ago.

Now having got all that out of the way, hopefully we can stay technically focused for the remaining posts in this series.

 

ABOUT David:

Dr David Evans is an electrical engineer and mathematician, who earned six university degrees over ten years, including a PhD from Stanford University in electrical engineering (digital signal processing): PhD (E.E.), M.S. (E.E.), M.S. (Stats) [at Stanford]; B.E. (Hons, University Medal), M.A. (Applied Math), B.Sc. [University of Sydney]. His specialty is Fourier analysis and signal processing. He trained with Professor Ronald Bracewell, late of Stanford University.

References

[1] Charney, J., Arakawa, A., Baker, J., Bolin, B., Dickinson, R., Goody, R., et al. (1979). Carbon Dioxide and Climate: A Scientific Assessment. Washington, D.C.: National Academy of Sciences.
