The Climate Council calculate the “odds” that one warm year could be as hot as it was. But those “odds” depend on a logical fallacy, major, inexplicable adjustments and models we know are broken. There are invisible assumptions underlying that claim which are documentably untrue. The “odds” might as well be lotto results.
The fallacy is argument from ignorance, a failure of logic and reasoning like saying “X is true, because we can’t think of anything else”.
To estimate meaningful odds, scientists would have to understand the major driving factors of our climate well enough to assign probabilities to outcomes. But their models are hopelessly broken: they can’t predict a decadal average on a global or continental scale. They can’t hindcast past “bumps” without major adjustments to make the raw observations fit the models. They don’t know why the Medieval Warm Period was warm, or why the Little Ice Age was cool. They don’t know why the world started warming 200 years before we poured out industrial levels of CO2. They don’t know if the mystery factors driving our climate for the last 4.5 billion years are still operating. If we can’t predict the [...]
Two papers on ocean heat were released together today. The first says the missing heat is not in the deep ocean abyss below 2,000 m. The second finds the missing heat in missing data in the Southern Hemisphere instead. Toss out one excuse, move to another.
The first paper, by Llovel, Willis et al., looked at total sea-level rise as measured by adjusted satellites*, then removed the part of that rise due to thermal expansion of the warming upper ocean (0–2,000 m) and the part due to ice melting off glaciers and ice-sheets.** The upshot is that the bottom half of the ocean is apparently not warming — there was nothing much left for the deep ocean to do. This result comes from Argo buoy data, which went into full operation in 2005. (Before Argo, the uncertainties in ocean temperature measurements massively outweighed the expected temperature changes, so the “data” is pretty useless.)
Figure 2 | Global mean steric sea-level change contributions from different layers of the ocean. 0–2,000m (red), 0–700m (green), 700–2,000m (blue). The dashed black curve shows an estimate for the remainder of the ocean below 2,000m computed by removing the 0–2,000m estimate from the GRACE-corrected observed mean sea-level time series. [...]
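The budget arithmetic behind the first paper is simple subtraction: whatever sea-level rise is left after removing upper-ocean thermal expansion and ice-melt mass is attributed to the deep ocean. A minimal sketch of that residual calculation, with made-up round numbers (not the paper’s actual values):

```python
# Illustrative sea-level budget in mm/yr -- round hypothetical figures
# chosen only to show the arithmetic, not the values from Llovel et al.
total_rise     = 2.8   # satellite altimetry, GRACE-corrected (hypothetical)
steric_0_2000m = 0.9   # thermal expansion above 2,000 m, from Argo (hypothetical)
mass_from_melt = 2.0   # glaciers + ice sheets, from GRACE (hypothetical)

# Whatever is left over is assigned to the ocean below 2,000 m.
deep_ocean_residual = total_rise - steric_0_2000m - mass_from_melt
print(f"Deep-ocean (>2,000 m) residual: {deep_ocean_residual:+.1f} mm/yr")
```

With these illustrative inputs the residual comes out near zero, which is the shape of the paper’s claim: once the upper-ocean and melt terms are removed, there is little rise left for the abyss to explain.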
Finally, for only the 87th time, climate modellers have uncovered the definitive proof they’ve been finding in different forms every year since 1988.
ARC extreme unscience – corrected at no cost to the Australian taxpayer.
They seek, and find, the most excellent propaganda they can pretend is science. Look, this is the specific handprint of non-specific climate-change! Everything bar climate-sameness is proof the climate changes. How inane? The unscientific vagueness gives this poster away as being more about propaganda than about communication of science.
… in a special edition of the Bulletin of the American Meteorological Society, examining extreme events around the world during 2013, a series of papers home in on the Australian heat waves, and identify a human influence.
Using short, noisy records, with flawed and adjusted data, it is possible to run broken climate models and show “definitively” that current heat-waves and hottest years are due to man-made emissions. And if you believe that, you could be gullible enough to be a Guardian journalist.
That is, climate models that do not include solar factors like magnetic fields, solar winds, cosmic rays, solar spectral changes, or lunar effects are able to [...]
You won’t believe… Research shows surprise global warming ‘hiatus’ could have been forecast
[The Guardian] Australian and US climate experts say with new ocean-based modelling tools, the early 2000s warming slowdown was foreseeable. Australian and US researchers have shown that the slowdown in the rate of global warming in the early 2000s, known as a so-called “global warming hiatus”, could have been predicted if today’s tools for decade-by-decade climate forecasting had been available in the 1990s.
And I’ve got a model that would have predicted the 1987 stock market crash, the GFC, and the winner of the Melbourne Cup. What I would not have predicted is that lame excuses this transparent would be made by people calling themselves scientists (Gerald Meehl), and repeated by people calling themselves journalists. (That’s you, Melissa Davey.) Though I’m not surprised that research this weak had to be published by Nature. (Where else?)
Although global temperatures remain close to record highs, they have shown little warming trend over the past 15 years, a slowdown that earlier climate models had been largely unable to predict.
This has been used by climate change sceptics as evidence that climate change prediction models are flawed.
Imagine that, the stupid [...]
Remember how CO2 is supposed to cause warmer winters, and warmer nights? Well now CO2 also produces cold snaps. No matter what weather you get, there is a citation to blame CO2. Nature (the formerly great science journal) and Northeastern University have produced another permutation of outputs from models we know are broken.
The first line in the press release is false, and smugly so: “most scientists — 97 percent of them, to be exact — agree that the temperature of the planet is rising and that the increase is due to human activities….” Ten seconds on Google would have shown that 60% of geoscientists and engineers don’t agree.
If Kodra and co were trying to be accurate, they could have said “97% of anointed climate scientists agree…”. If they were trying to be scientific, of course, they wouldn’t mention a consensus at all. If they had good evidence, they’d talk about that instead.
They dug deep in The-Book-of-Cliches for the press release. Strip away the advertising spin and I think this is the nub of the work:
“While global temperature is indeed increasing, so too is the variability in temperature extremes. For instance, while each [...]
We could spend hours analyzing the new IPCC report about the impacts of climate change. Or we could just point out:
Everything in the Working Group II report depends entirely on Working Group I.
(see footnote 1, SPM, page 3).
Working Group I depends entirely on climate models and 98% of them didn’t predict the pause.
The models are broken. They are based on flawed assumptions about water vapor.
Working Group I, remember, was supposed to tell us the scientific case for man-made global warming. If our emissions aren’t driving the climate towards a catastrophe, then we don’t need to analyze what happens during the catastrophe we probably won’t get. This applies equally to War, Pestilence, Famine, Drought, Floods, Storms, and Shrinking Fish (which, keep in mind, could have led to the ultimate disaster: shrinking fish and chips).
To cut a long story short, the 95% certainty of Working Group I boils down to climate models, and 98% of them didn’t predict the pause in surface temperature trends (von Storch 2013). Even under the most generous interpretation, models are proven failures, 100% right except for rain, [...]
The backdown continues. Faced with the ongoing failure of their models, the search rolls on for any factor that helps “explain” why the official climate scientists are still right even though they got it so wrong. The new England et al paper endorses skeptics in so many ways.
The world might warm by only 2.1 degrees this century, not 4C. (Skeptics were right — the models exaggerate.) There has been, and is, a pause in warming which the 95%-certain models didn’t predict. (The science wasn’t settled.) What the trade-winds giveth, they can also taketh away. If they “cause cooling” after 2000, then they probably “caused warming” before that. How much less important is CO2? Ultimately, newer models are less wrong if they include changes in wind speed, but they don’t know what drives the wind. It’s curve fitting with one more variable.
As usual, the models still can’t predict the climate, but they can be adjusted post hoc with new factors to trim their overestimates back to within the error bars of some observations.
As I said nearly 2 years ago, Matthew England owes Nick Minchin an apology:
Nick Minchin: ” there is a major problem with the warmist argument [...]
Joint Post: Geoff Sherrington and JoNova
The IPCC Synthesis Report first order draft has been leaked (h/t Tallbloke). It is part of the big Fifth Assessment Report; see the parts already released here. The Synthesis Report supposedly summarizes the science. In the real world the topic du jour is the plateau, pause, or hiatus in warming, which the IPCC can no longer ignore. Instead the masters of keyword phrases test new bounds in saying things that are technically correct, while not stating the bleeding obvious. Luckily we are here to help them. :-)
“The rate of warming of the observed global-mean surface temperature has been smaller over the past 15 years (1998-2012) than over the past 30 to 60 years (Figure SYR.1a; Box SYR.1) and is estimated to be around one-third to one-half of the trend over the period 1951–2012. Nevertheless, the decade of the 2000s has been the warmest in the instrumental record (Figure SYR.1a).”
Translated: Yes, temperatures are not rising as fast as we predicted, even though more CO2 was pumped out faster than ever. Let’s ignore that this shows the models were wrong; the important thing is to [...]
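The “one-third to one-half” comparison in the quoted draft is just a ratio of two least-squares slopes computed over different windows. A sketch with a synthetic anomaly series (made-up numbers, not real temperature data), assuming plain OLS trends:

```python
import numpy as np

def ols_trend(years, temps):
    """Least-squares slope, converted to degrees per decade."""
    return np.polyfit(years, temps, 1)[0] * 10

# Synthetic series purely to show the arithmetic:
# 0.12 C/decade warming to 1997, then a much slower 0.04 C/decade.
years = np.arange(1951, 2013)
temps = np.where(years < 1998,
                 0.012 * (years - 1951),
                 0.012 * (1998 - 1951) + 0.004 * (years - 1998))

long_trend  = ols_trend(years, temps)               # 1951-2012
short_trend = ols_trend(years[-15:], temps[-15:])   # 1998-2012
print(f"short/long trend ratio: {short_trend / long_trend:.2f}")
```

On this contrived series the 1998–2012 slope lands between a third and a half of the 1951–2012 slope, which is all the draft’s sentence amounts to: two regressions and a division.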
Nicola Scafetta has a new paper (in a long line of papers) on a semi-empirical model which has a better fit than the General Circulation Models (GCMs) favored by the IPCC. We ought to be careful not to read too much into it, but nor to ignore the message in it about the grand failure of the GCMs. Scafetta used Fourier analysis to find six cycles, then uses those six cycles to produce a climate model he runs for as long as 2,000 years, which seems to match the best multiproxies. In terms of discovering the absolute truth about the climate, this is not an end-point way to use Fourier analysis, as it is just “curve fitting”. With six flexible cycle frequencies (plus an amplitude and phase for each) there are 18* tuneable parameters, more than enough to model any wiggly line on a graph, and there are scores of astronomical cycles to pick from. *[Nicola Scafetta replies to this below, pointing out he uses the “6 major detected astronomical oscillations”, and their phases are fixed. I am happy to be corrected. His model is more useful than I thought. Apologies for the misunderstanding. – Jo]
But Scafetta’s work suggests it’s madness not to [...]
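The curve-fitting caveat is easy to demonstrate: once cycle frequencies are chosen, each cycle’s amplitude and phase become linear unknowns (a·sin ωt + b·cos ωt), and ordinary least squares will happily fit even structureless data. A minimal sketch using illustrative period lengths (not Scafetta’s detected cycles) fitted to a pure random walk:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random-walk "temperature" series: no real cycles in it at all.
t = np.arange(0, 200.0)                     # 200 "years"
y = rng.standard_normal(t.size).cumsum()

# Illustrative cycle lengths only -- NOT Scafetta's detected periods.
periods = [9.1, 10.4, 20.0, 60.0, 115.0, 180.0]

# With frequencies fixed, amplitude and phase per cycle are linear
# unknowns via a*sin(wt) + b*cos(wt), so plain least squares applies.
cols = [np.ones_like(t)]                    # intercept
for p in periods:
    w = 2 * np.pi / p
    cols += [np.sin(w * t), np.cos(w * t)]
X = np.column_stack(cols)                   # 1 + 2*6 = 13 free parameters

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
fit = X @ coef
r2 = 1 - ((y - fit) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R^2 of a 6-cycle fit to pure noise: {r2:.2f}")
```

The point is not that any particular R² is high, but that a dozen-plus free parameters guarantee *some* fit to any wiggly line, so goodness-of-fit alone cannot validate the chosen cycles.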
And the public conversation finally starts to move on to discussing not whether the IPCC is wrong, but why it was wrong, and what we need to do about it. Credit to Judith Curry and the Financial Post. I’ve posted a few paragraphs here. The whole story is in the link at the top. – Jo
Judith A. Curry, Special to Financial Post
Kill the IPCC: After decades and billions spent, the climate body still fails to prove humans behind warming
The IPCC is in a state of permanent paradigm paralysis. It is the problem, not the solution
The IPCC has given us a diagnosis of a planetary fever and a prescription for planet Earth. In this article, I provide a diagnosis and prescription for the IPCC: paradigm paralysis, caused by motivated reasoning, oversimplification, and consensus seeking; worsened and made permanent by a vicious positive feedback effect at the climate science-policy interface.
In its latest report released Friday, after several decades and expenditures in the bazillions, the IPCC still has not provided a convincing argument for how much warming in the 20th century has been caused by humans.
We tried a simple solution for a [...]