Don’t underestimate the importance of the nameless basic model. It sounds small, but in the culture and philosophy of climate science it’s bigger and carries more weight than the massive hairy GCMs. Like an invisible gossamer web, it’s overarching. It spans and defines all the other models. When they produce “dumb” answers, the basic model holds them in, for thou shalt not stray too far from the climate sensitivity defined by the basic model. It defines what “dumb” is. (It’s just “basic physics” after all.) One model to bind them all. What could possibly go wrong?
A lot, apparently. The physics might be right, but the equations are calculating imaginary conditions. The answers might be arithmetically correct but useless at the same time. They miss the real route that energy flows through to space.
By definition, as long as the basic model is wrong, the GCM models can never get it right.
It’s not like climate scientists consult the oracle of the basic model every day, or even once a year… they don’t need to. They were taught it in their climate larval stage, often long before they’d written one paper. The basic model shows that the warming of the 1980s and 90s was “caused” by CO2. Then the GCMs are tuned to that trend and that assumption.
The silent basic model is thus the whole nub of the problem. People have been arguing about the parameters when they should also have been talking about the equations that connect those parameters — the architecture. They added the separate warming forces before they did the feedbacks, but they should have done the feedbacks first and then added them all together after.
My favorite line below: “Once two numbers are added, there is no way of telling what the original numbers were just from their sum.” The models can’t figure out what was man-made and what was natural…
In the basic model the atmosphere responds the same way to all forms of warming, always the water vapor amplification and never simple rerouting.
Good news here, we’re in a low-acronym-zone today. This post is ASR and OLR-free, low on WVELs. No equations!
This post completes the first two parts of this series — problems with the conventional basic climate model, and fixing them with the alternative basic climate model. Here are just some general comments, tying together some of the main ideas. There are hardly any acronyms, and no equations.
After this the series will embark on its third and final part, an hypothesis about the main cause of global warming. Kindly note that whether the third part of the series eventually proves to be right or wrong has no bearing on the correctness of these first two parts about climate model architecture.
This post is only about basic models, not GCMs, except where it specifically states otherwise.
Suppose we give billions to the bureaucratic geniuses in Paris. Suppose they are right about how global warming works (though we know they are not). What do we get for all that money?
Combined, all plans, carried out, successful best case, at a cost of hundreds of trillions + : 0.17°C
The UN wants us to spend $89 trillion by 2030 to “green up” everything. For that we hope, in theory and if we’re lucky, to get a reduction of one sixth of a degree 70 years later. Rush, rush, buy that plan today! Order two, and don’t count the dead.
A new atlas shows droughts of the past were worse than those today — and they cannot have been caused by man-made CO2. Despite the claims of “unprecedented” droughts, the worst droughts in Europe and the US were a thousand years ago. Cook et al 2015[1] put together an old world drought atlas from tree ring data as a proxy for summer wetness and dryness across Europe. They compare the severity and timing of European droughts with the North American Drought Atlas (NADA) released in 2004. Yes, it’s a tree ring study, with all the caveats about how trees respond to several factors at once, etc. But at least the modern era is measured with the same proxy as used in the old eras.
Something else is causing droughts, something modern models don’t include:
“megadroughts reconstructed over north-central Europe in the 11th and mid-15th centuries reinforce other evidence from North America and Asia that droughts were more severe, extensive, and prolonged over Northern Hemisphere land areas before the 20th century, with an inadequate understanding of their causes.”
The worst megadrought in the California and Nevada regions was from 832 to 1074 CE (golly, 242 years). The worst drought in north-central Europe was from 1437 to 1473 CE, lasting 37 years.
Climate models don’t predict any of the droughts below, and all of them occurred before 99% of our emissions were released.
The authors compare results from the new atlas and its counterparts across three time spans: the generally warm Medieval Climate Anomaly (1000-1200); the Little Ice Age (1550-1750); and the modern period (1850-2012).
The atlases together show persistently drier-than-average conditions across north-central Europe over the past 1,000 years, and a history of megadroughts in the Northern Hemisphere that lasted longer during the Medieval Climate Anomaly than they did during the 20th century. But there is little understanding as to why, the authors write. Climate models have had difficulty reproducing megadroughts of the past, indicating something may be missing in their representation of the climate system, Cook said.
Figure 3A Maps from a new 2,000-year drought atlas show rainfall conditions over the whole continent, and much of the Mediterranean. A chart for 1741 shows severe drought (brown areas) running from Ireland into central Europe and beyond. A chart for the year 1315 shows the opposite problem—too much rain (dark green areas), which made farming almost impossible. (Cook et al., Science Advances, 2015)
A large part of the Northern Hemisphere is included in the study.
This video gives me hope. Finally we are starting to see more sane commentary in western parliaments. David TC Davies MP shows how politicians can master enough of the scientific details on this debate to crush the usual trite bumper-sticker “consensus” hogwash. He talks of Roman warming, the Medieval Warm Period, the Younger Dryas, the age of the Earth. It’s high-school-science level, but more than enough to expose some of the silliness. He also counters the “climate change denier” tag. He cites just enough key numbers to back up each of his points. His skill here is in prioritizing the numbers that matter. Here’s hoping a few of the silent political skeptics will feel more confident to speak out. The bullying and name-calling breaks when enough people stand up to it. That’s coming.
No one I’ve ever met has ever suggested the climate never changes…
Even the IPCC is not saying that most of the warming [since industrialisation] is caused by humans …
It is absolutely certain that the more we rely on renewable energy the more we have to pay for it. No politician from any party should be running away from this. They should be willing to go out and make the argument if they think we should be paying more … but none of them are. Nobody thinks it’s a good idea to increase energy bills…
Mr Davies formerly worked at a steelworks, and is a former Special Police Constable.
There is a mystery peak in global CO2 levels in 1990. For some reason, from 1989 global carbon levels suddenly jumped higher than they normally would, by 9,000 million tonnes (that’s equivalent to 2,500 mT of carbon)*. It’s only a little blip in an upward line, but as a deviation from the long steady norm, it’s a dramatic change (see the second graph below). Within a few years the excess disappeared and the reasonably straight-line increase in CO2 resumed. The sudden jump is equivalent to nearly half of our total annual human fossil fuel emissions. Nothing about this peak fits with the timing of human-induced fossil fuel emissions. These were not big years in our output (indeed it coincides with the collapse of the Soviet bloc, when inefficient Russian industry was shut down).
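As a back-of-envelope check on that conversion (a sketch in Python; the only inputs are the molecular weights of CO2 and carbon):

```python
# Convert a mass of CO2 to the equivalent mass of carbon.
# CO2 weighs ~44 g/mol, of which the carbon atom contributes ~12 g/mol.
mt_co2 = 9000                      # the ~1990 excess, in million tonnes of CO2
mt_carbon = mt_co2 * 12.0 / 44.0   # ~2455 Mt of carbon
print(round(mt_carbon))            # consistent with the ~2,500 mT quoted above
```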
The mystery of the massive CO2 bubble exposes how little we know about why CO2 levels rise and fall, and whether human emissions make much difference. The world is spending $4 billion a day trying to change this global CO2 level of 0.04% (400 ppm), but apparently other large forces are at work pushing up CO2, and then absorbing it, and we don’t even know what they are.
Tom Quirk has dedicated himself to isolating this quixotic spike and hunting for the cause. He wondered if the ocean was responsible. By looking at isotopes of carbon he finds that it was not produced out of dissolved CO2 from the ocean. Instead the CO2 was released from a biological source, mostly one in the Northern Hemisphere. With some great sleuthing he finds a remarkable paper from 15 years ago that documents a sudden drop in fish stocks at the same time. He puts forward the suggestion that there was a regime shift in the ocean then, and the currents stopped stirring up as many nutrients. Phytoplankton activity was lost, fish starved, and the lack of activity by marine biology meant CO2 accumulated in the air. Within the next few years the phytoplankton recovered and drew out that extra CO2.
It’s as if there were three extra Chinas on Earth for a year — pouring out extra CO2. We see how fast biology absorbs that extra load, and wonder (yet again) what the fuss is all about. Tax the krill instead, eh?
–Jo
————————————————————————–
Some inconvenient measurements
Guest Post by Tom Quirk
There is an unexplained atmospheric CO2 “bubble” centred around 1990. The apparent smooth and continuous rise in atmospheric CO2 concentrations is broken by an anomaly that can be seen in the figure below.
Average yearly CO2 concentrations at the South Pole and Point Barrow from Scripps measurements. The straight line is a best fit to the South Pole data with an annual increase of 1.5 ppm per year.
Plotting the residual differences of measurements from the straight line fit shows that as the world cooled in the 1960s excess CO2 accumulated at low annual rates. During the 1970s and 1980s CO2 was accruing at about 1.5 ppm per year, the average rate of the last 55 years. Then suddenly in 1989 – 1991 large amounts of CO2 were added to and withdrawn from the atmosphere. A further turning point occurred in 1995 when the annual rate of increase reached its highest level.
Residual differences from the straight line fit to average yearly CO2 concentrations at the South Pole (see above). Also similar residuals for Mauna Loa and Point Barrow. Note the break in the trends in 1977 and 1995, at the times of phase changes in the PDO and AMO.
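For readers who want to reproduce that residual plot, the recipe is just a least-squares straight line through the annual averages, subtracted out. A minimal Python sketch (illustrative only — the stand-in series below is an assumption; substitute the actual Scripps annual data):

```python
import numpy as np

# Annual average CO2 (ppm) by year; stand-in numbers shaped like the South Pole series.
years = np.arange(1960, 2015)
co2 = 315.0 + 1.5 * (years - 1960)            # replace with the real measurements

slope, intercept = np.polyfit(years, co2, 1)   # least-squares straight-line fit
residuals = co2 - (slope * years + intercept)  # deviations from the long-term trend
print(f"fitted rise: {slope:.2f} ppm/year")    # ~1.5 ppm/year for the real series
```

Spikes and turning points, like the 1989–1991 bubble, stand out in the residuals even though they are nearly invisible in the raw upward curve.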
In years to come it may be recognized that this blog post produced the first accurately modeled figure for climate sensitivity. Equilibrium Climate Sensitivity sounds dry, but it’s the driving theme, the holy grail of climate science. The accurate figure (whatever it is) summarizes the entirety of carbon dioxide’s atmospheric importance. That number determines whether we are headed for a champagne picnic or a baking apocalypse.
To calculate a better estimate, David identified the flaws of the conventional basic model, and rebuilt it. The basic climate model is the top-down approach looking at inputs and outputs of the whole system. It defines the culture and textbooks of the modern global warming movement. GCMs (the big hairy coupled global models) are bottom-up approaches, doomed to failure by trying to add up every detail and ending up drowning in mile-high uncertainty bands. But the GCMs are ultimately tweaked to arrive at a similar ballpark climate sensitivity as the textbook model for the “basic physics” dictates. Hence this core model is where the debate needs to be. (Everyone knows the GCMs are broken.)
For decades the world of conventional climate research has been stuck in a groundhog day: major research keeps overturning older ideas, but somehow the upper and lower bounds of climate sensitivity stay the same. It’s always 1.5–4.5°C (and their models never work). Their “best” estimates of climate sensitivity are relentlessly, slowly shrinking (they were around 3.5°C, now around 2°C). The new alternative model doesn’t rely on the bizarre idea that all feedbacks can only operate off the surface. The alternative model (we are going to have to come up with a better name) allows feedbacks to act differently for different warming influences, and thus energy can reroute from one outgoing “pipe” to another.
The river of energy is flowing to space. If we put a rock in the way, the flow just reroutes around it.
In the nameless ;-) alternative model, Evans uses the same physics but a better architecture and a few more empirical data points, and we can finally estimate what the true climate sensitivity must be.
Because the “pipes” for outgoing radiation to space are elastic, and can adapt to increases in energy, the climate sensitivity to CO2 could be very low. Indeed it is not possible to put a lower bound on the figure — it may be almost zero. It is possible to put an upper bound on it — which is about half a degree. The most likely estimate is around 0.25°C. Empirical estimates by Lindzen and Choi,[1][2] Spencer and Braswell[3][4] and Idso[5] suggest it is 0.4°C – 0.7°C. We can argue the toss between 0.25 and 0.5, but no one can argue that we need to spend a single dollar to reduce our emissions if doubling CO2 only causes minor and beneficial warming.
What is striking is how small the changes need to be to compensate for “all that extra CO2”. The effect of the increased CO2 of the last few decades was neutralized if the height of the cloud tops or the water vapor emissions layer fell by just a few tens of meters. These changes are so small they are hard to detect, but there are empirical observations that suggest both may have occurred to some degree. The small changes required show just how easy it would be for the atmosphere to emit a constant energy flow, regardless of a blockage in one “pipe”.
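To get a feel for the sizes involved, here is a rough Python sketch (every number in it is an illustrative assumption, not one of David’s fitted parameters): treat the water vapor pipe as a graybody whose emission layer warms as it descends through the lapse rate, and ask how far it must descend to offset the CO2 forcing of roughly the last two decades.

```python
import math

LAPSE = 6.5e-3     # lapse rate, degC per meter
T_WVEL = 235.0     # assumed WVEL temperature at ~8 km, K
OLR_WVEL = 80.0    # assumed OLR escaping via the water vapor pipe, W/m^2

# For W = eps * sigma * T^4, the sensitivity is dW/dT = 4W/T (~1.4 W/m^2 per K here).
dW_dT = 4.0 * OLR_WVEL / T_WVEL

# CO2 forcing for roughly 370 -> 400 ppm, using the standard 5.35*ln(C/C0) formula.
forcing = 5.35 * math.log(400.0 / 370.0)   # ~0.4 W/m^2

# Descent needed so the (warmer) emission layer radiates that much more.
print(forcing / (dW_dT * LAPSE))           # ~50 m -- a few tens of meters
```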
DEFINITION: The equilibrium climate sensitivity (ECS) is the surface warming ΔTS when the CO2 concentration doubles and the other drivers are unchanged. Note that the effect of CO2 is logarithmic, so each doubling or fraction thereof has the same effect on surface warming.
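For example, if the ECS were 0.25°C, then raising CO2 from 400 ppm to 560 ppm (log2(560/400) ≈ 0.49 of a doubling) would eventually warm the surface by about 0.25 × 0.49 ≈ 0.12°C, other drivers unchanged. (A hypothetical illustration of the logarithmic rule, not a forecast.)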
— Jo
…
18. Calculating the ECS Using the Alternative Model
This post employs the alternative model to quantitatively analyze the climate of the last few decades, estimating the CO2 sensitivity (λC), the fraction of global warming due to increasing CO2 (μ), and the equilibrium climate sensitivity (ECS). The formulas for these quantities were derived in post 16.
Here is the spreadsheet with the alternative model and the numerical calculations in this post: download (Excel, 250 KB).
The devastating result of the latest CSIRO survey: 54% of Australians don’t believe the experts at the IPCC, and are not convinced that humans are the dominant cause of climate change. Starkly, only 28% of Liberal voters agree with Malcolm Turnbull. Amazingly 40% of Labor voters and even a quarter of green voters don’t accept the IPCC litany. Presumably they think humans have some effect but are not the major cause.
More importantly, even most of those who believe are not motivated. When it comes to spending money on the environment, 80% of Australians don’t voluntarily do it and 80% don’t care enough to change their vote because of this issue. Despite all the relentless propaganda, despite all the government funded groups being in lock-step, the trends are slowly falling for believers (from 2010-2014), though not “statistically significantly”. (Though longer term studies from 1990 to now show that falling trend).
Don’t miss below how a climate science professor reveals that most of the people he knows just follow their political team’s fashions. How telling. Also below, see how the ABC spins this to meaninglessness to support the religion. It’s all so predictable.
Most Australians disagree with IPCC experts
…
Left-leaning voters are less skeptical, and more gullible
This graph of political views could be re-titled “the gullibility index”. Three quarters of Green voters believe mankind controls the weather. But three quarters of coalition voters aren’t fooled. Those who are more likely to take entrepreneurial risks, who face real competition, and believe in true free markets have a better grounding in reality. What a surprise.
…
Most Australians won’t voluntarily pay money to environmental causes, nor will they change their vote
…
This survey is loaded, and still uses the ambiguous term “climate change” as if it means something. That said, it’s bigger and better than most others of its kind. But sadly it was still written as though man-made global warming is powerful, threatening, and worth doing something about. Usefully, they even surveyed the same individuals over the years, and as many as half of them changed their opinion on climate change at least once. In other words, half the population is neither here nor there, and their views may shift depending on the weather, or what they heard on the radio the day before. As I’ve said all along, we are only one good prime time documentary away from ending the meme. No wonder the believers are panicky about counter opinions. More of the same propaganda is not going to increase the number of believers, but if the skeptic talking points get a public airing it’s all over.
University, media and public servants just follow their political team … as revealed by Prof Pitman
Professor Andy Pitman reveals that most people around him just follow their masters, and don’t think for themselves. He projects this passive gullibility onto entrepreneurs, risk takers, self-employed people and small business owners, who are more likely to vote “right”, predicting they will be as passive as the public servants and media that he knows:
Andy Pitman, Director of ARC Centre of Excellence for Climate System Science at the University of NSW, predicted that many Coalition voters will take their cue from the new PM and shift their views.
“To a substantial degree, when asked, a significant fraction of the public say what they think their preferred party says” on issues such as climate change, Professor Pitman said.
“My experience of the public service and right the way through to some media outlets, they absolutely listen to the vibe from the top and respond to it,” he said.
The university and media elite believe they think for themselves, but narcissistically and arrogantly assume that most Australians don’t. The reality of course is that it’s the university “thinkers” who follow the fashions in thinking more slavishly than anyone else.
Thus do the claws of centralized big-government grow. It’s a committee, then a tribunal, then one day, a court?
An actual international court would end up being corruption-in-a-can — (think FIFA but with more money). The only way any legal system can work is with one law for all. As soon as countries can have individual targets, clauses and excuses, the wheeling and dealing begins. But a climate court won’t even pretend to start with an even playing field.
The United Nations may launch an International Tribunal of Climate Justice which could see states who fail to uphold the international deal to tackle climate change brought before a court.
The International Tribunal of Climate Justice has been suggested as one of three options to ensure that all parties follow what is agreed during the Paris summit next month.
In essence, this means that any UN member state who agrees to the international deal to tackle climate change and fails to honour its commitment can be brought before the tribunal.
Toothless, Trojan, or Effective Tool?
This tribunal of climate justice could well be a toothless tiger, with no enforceable authority — “nothing to be afraid of”. But if that’s the case, it’s just a junket-job reward for the “right” people and why have it at all? On the other hand, if it has the power to change domestic energy policies, or direct billions in fines or reparations, that’s real power over voters. Once started, how hard would it be to “add teeth”? Perhaps legal minded readers can help?
But even without enforceable power — the toothless institution becomes a tool in the hands of the domestic thought police. Just the threat of being taken to “court” will be enough to beat weak politicians into line. The tribunal could produce embarrassing declarations of how backward a politician is, and the local big-government-loving media buffs will pick and choose which reports to amplify to suit themselves. It’s just another chance for the big-gov-blob to score PR against their favourite targets. It’s leverage. Because of the Love-media effect the tribunal is primarily a tool against Western Democracies. Third world tyrants would hardly care about a slap on the wrist from a UN committee: they don’t want votes, and they own their media.
Real democracies don’t need international tribunals. They need accurate information, and freedom of speech.
The good voters will act (as they have already done) to protect species, to set up nature reserves, pollution controls, and to fund research as they see fit. The idea of an international court presupposes that the voters can’t deal with “a potential catastrophe”.
The possibility of an International Tribunal of Climate Justice has been criticised in the U.S., where some commentators have argued that it will lead to developing nations dragging President Obama into court and bypassing Congress.
‘Whatever they call it, countries who sign onto this agreement will be voting to expand the reach of the UN climate bureaucracy, cede national sovereignty, and create a one-way street along which billions will be redistributed from developed to poor nations,’ claims CFACT co-founder Craig Rucker.
‘Developed nations would be expected to slash their emissions while the “poor” [developing] countries expand theirs,’ he adds.
The only document any government should sign must have an out-clause. If the voters don’t like what Obama-Turnbull-Trudeau-Cameron-Key-et al sign up for, they must be able to opt out.
The Draft treaty is almost unreadable. But the ambition is clear. The “best available science” according to who? Not the voters.
Is there any better proof of how fragile the facade is? The evidence for the Big Scare Campaign is so pathetically weak that it cannot cope with one weatherman who writes a critical book. They are so afraid they did not even muddy the waters — it’s obvious why he was sacked. Verdier was suspended immediately after he launched his skeptical book. Now a month later, sacked.
“I put myself in the path of COP21, which is a bulldozer, and this is the result,” Verdier told RTL radio station in October.
Right now thousands of people in France are wondering why being a skeptical journalist (as he calls himself) is such a crime. They are also wondering how many other meteorologists are skeptical but too afraid to speak out.
One thing everyone knows for sure now is that France24 is definitely not telling them the whole story about global warming. (See Sputnik News, no paywall)
A French weatherman has broadcast the news of his own sacking, saying he had been fired for writing a book challenging climate change.
Philippe Verdier, arguably France’s most famous weatherman, took to the internet to read a letter from the state broadcaster informing him of his dismissal.
Things are hotting up. After all the hard work of the past few posts, the payoff begins. By solving the flaws inherent in the basic conventional model we solve some of its biggest missed predictions. And the clincher for conventional models has always been the missing hot spot. Without it, over half the projected warming just vanishes. And if it is telling the tale of a negative type of feedback instead of a positive one, then all bets are off — not three degrees, not even one degree, it’s more like “half” a degree. Go panic about that.
Here David gets into the empirical data — the radiosondes, the satellites — and shows how his model fits their results, whereas the establishment models have repeatedly been forced to deny them. Twenty-eight million radiosondes get the wrong results: how many ways can we adjust them? Tweak that cold bias, blend in the wind shear, change the color-scales, homogenize the heck. Smooth, sort, shovel and grind those graphs. The fingerprint of CO2 was everywhere in 2005; it gradually became the non-unique signal of any kind of warming, but it still wasn’t there. It kept being “found”, though it was never reported missing. Wash, rinse and repeat.
It’s the thorn that won’t go away, the key to the scary predictions. Without water vapor amplification, there is only beneficial balmy warming, and the threat of bountiful crops.
The models to the left of us, the measurements to the right… obviously the models expect a hot spot. Obviously, it isn’t there.
With the paradigm shift of adding separate warmings and allowing the rerouting feedback, we find that as CO2 warms the atmosphere, the air high in the troposphere just needs to dry out in the thinnest of layers and the water vapor emissions layer emits from a slightly lower altitude. The emissions are coming from a slightly warmer part of the sky, and thus more energy escapes to space. A substantial part of the effect of CO2 is neutralized.
The corollary is that for the first time we can start looking at the GCMs, though only through inference. But the GCMs predict the hot spot, which suggests they are treating extra warming from CO2 as if it were extra warming from the sun. They are making the mistake of applying the same feedbacks to CO2 as to solar warming. Hint hint hint…
We are not up to calculating the climate sensitivity yet, but that’s coming.
Enjoy, the pieces are starting to fall into place. I haven’t been harping on about that hot spot for seven years for nothing.
Before applying data to the alternative model of the last post, which we shall do in the next post, we first dwell on one crucial aspect of the data.
Some notes on terminology before we begin:
A concept of much concern in this post is the water vapor emission layer, which is such a mouthful that we frequently use its acronym WVEL (which we pronounce so as to rhyme with “bevel”).
The “solar response” means the “response of the surface temperature to increased solar absorbed radiation” — that is, the response to the Sun, not the response of the Sun.
The “CO2 response” is the “response of the surface temperature to an increased concentration of atmospheric carbon dioxide” — the response to increasing CO2.
The solar and CO2 responses are perhaps best explained by Fig. 1 of post 13.
Overview
The “hotspot” is the informal name for a warming of the upper troposphere, caused by an ascent of the water vapor emission layer (WVEL).
In the conventional model (see Fig. 2 of post 3), surface warming for any reason causes a hotspot. The water vapor amplification mechanism in the solar response causes a hotspot because it thickens the water vapor layer, causing the WVEL to ascend. The conventional model applies this solar response to every climate driver (post 9).
In the alternative model (see Fig. 1 of post 13), the solar response causes the water vapor layer to thicken and thus the WVEL to ascend, while the CO2 response could either thicken or thin the water vapor layer, causing the WVEL to either ascend or descend (at this stage, before considering the data, we don’t know which). The water vapor layer responds to both drivers, and both processes may be occurring simultaneously.
Given the apparent lack of a hotspot over the last few decades, coincident with a rapid rise in CO2 concentration, it would appear that (a) the CO2 response causes the WVEL to descend, which is consistent with the rerouting feedback, and (b) the descent of the WVEL due to the CO2 response outweighs the ascent of the WVEL due to the water vapor amplification caused by the increase in absorbed solar radiation (ASR) that also occurred over that time.
The Hotspot is Caused by an Ascending WVEL
The water vapor emission layer (WVEL) is effectively the upper optical boundary of the water vapor in the atmosphere, on average. It is at about one optical depth as seen from space, on the wavelengths at which water vapor absorbs and emits. Its average height is ~8 km or ~360 hPa (post 14), which is in the upper troposphere. The most important part of the WVEL is in the tropics, where it is warmest and most of the radiation to space is occurring, and the WVEL, like the tropopause, is somewhat higher — maybe 10 km.
If the WVEL ascends, it creates the hotspot. The air above the WVEL is dry, but the air below the WVEL is moist and therefore warmer, because water vapor is condensing and releasing its latent heat. The hotspot is the warming of a volume that was dry and cool when just above the WVEL, but which becomes moist and warmer as the WVEL ascends above it.
Conversely, a falling WVEL produces a volume of cooling (a “coolspot”?).
Because water vapor is quite dynamic in the upper troposphere, its upper boundary often moving up and down several kilometers over time at a given location, the “instantaneous WVEL” moves up and down. The (average) WVEL is the average of the instantaneous WVEL. Because of this dynamism, hotspot warming can extend for a couple of kilometers in height — as the WVEL moves up the instantaneous WVEL tends to move up, warming volumes over a couple of kilometers of vertical extent to some degree.
The traditional illustration of the hotspot comes in a diagram of atmospheric warming (color) by latitude (x-axis) and height (y-axis). If the instantaneous WVEL stayed very close to the WVEL (i.e. no dynamism), the hotspot would be a strip of strong warming in the upper troposphere somewhere around 300 to 400 hPa, and the height of the strip would be the amount by which the WVEL ascended. But the dynamism of the water vapor causes a cloud of instantaneous WVEL heights clustering around the WVEL, ensuring that the hotspot is smeared out over a couple of kilometers in height.
The hotspot is distinct from warming in the upper troposphere produced by a change in lapse rate. As the surface warms due to increased ASR, more evaporation causes a moister atmosphere and thus a lower lapse rate, which causes the atmosphere at a given height to warm. While this surface warming also causes the WVEL to ascend (next section), warming due to increased lapse rate is broadly and diffusely spread through the atmosphere with a shallow gradient, in contrast to the hotspot which is a smear of warming centered on the WVEL.
The Solar Response Causes the WVEL to Ascend
The conventional explanation for the hotspot is that surface warming causes more evaporation (70% of the surface is ocean), and the greater volume of water vapor in the atmosphere is assumed to push up the WVEL, which is the top layer of the water vapor. However, as noted by Paltridge, Arking, and Pook in 2009 [1], more water vapor in the atmosphere does not necessarily lead to more water vapor in the upper troposphere if the extra water vapor is mainly confined to a more stable lower troposphere with less overturning, as appears to be the case according to the better radiosonde data from 1973.
Note that the temperature at height h is

T(h) = TS − Γ h,        (1)

where Γ is the average lapse rate (about 6.5 °C per km).
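For example, with a surface temperature of 288 K (15 °C), Eq. (1) puts the temperature at the average WVEL height of ~8 km at roughly 288 − 6.5 × 8 ≈ 236 K.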
Within the solar response (Fig. 1 of post 13), the surface warming due to increased ASR (that is, ΔTS,A) is about twice the increase in radiating temperature (ΔTR) because of amplification by the non-albedo solar feedbacks (Eqs. (3) and (4) of post 13). Because the radiating temperature is roughly the average of the warmings of the various emission layers, and because all the emission layers warm by about the amount of surface warming before the effects of changes to lapse rates and emission layer heights, one or more of the main non-surface emission layers must warm significantly less than the surface. The height of the CO2 emissions layer is determined by the concentration of CO2, so its height cannot change. The lapse rate changes slightly, but that works against the amplification of warming (the lapse rate feedback is negative). That leaves just the heights of the WVEL and the cloud tops, one or both of which must ascend to cooler places in the atmosphere. Clouds are not well understood, but it is generally thought that the WVEL provides the bulk of the accommodation. Thus the WVEL ascends significantly in response to surface warming due to more ASR.
In the conventional model all climate drivers cause the solar response (Fig. 2 of post 3 or Fig. 2 of post 13), and all surface warming is due to the solar response, so the surface warming ΔTS is equal to the surface warming due to ASR ΔTS,A. Thus, the conventional explanation is that all warming influences cause the WVEL to ascend, thereby causing a hotspot. In the alternative model however, while extra ASR causes the WVEL to ascend as per the solar response in the conventional model (Fig. 1 of post 13), other climate drivers operating through their own responses might simultaneously cause the WVEL to descend.
The WVEL Has Not Ascended in the Last Few Decades
The only instruments with sufficient vertical resolution to measure the change in height of the WVEL over the last few decades (ΔhW) are the radiosondes. Satellites are not suitable because they aggregate information from several vertical kilometers into each data point.
Radiosonde-derived temperature and humidity data is used here. It is accepted that the latter especially must be treated with great caution, particularly at altitudes above the 500 hPa pressure level. Following the discussion in Paltridge, Arking, and Pook (2009), the humidity data is restricted to tropical and mid-latitude data where the specific humidity is at least ~0.5 g/kg, from 1973 onward. While the data is not good enough to estimate changes in the average height of the WVEL, ΔhW, it is sufficient to at least distinguish the direction of movement.
Surface temperatures here are the midpoints of UAH and HadCrut4, 5-year smoothed and centered.
– Temperature Data
The temperatures measured by the radiosondes are shown in Fig. 1 below, for 1979 to 1999 (the only image as a function of height and latitude ever publicly released, apparently).
Over those two decades surface warming was ~0.20 °C per decade, which would have caused a similar warming at all levels of the troposphere had there been no change in the lapse rate (Eq. (1)). Lapse rate changes warm the atmosphere even more. Surface warming causes more evaporation and a damper atmosphere, and thus a lower lapse rate (our lapse rate is a positive number, about 6.5 °C per km). So, at a given height, there is warming due to the change in lapse rate (Eq. (1)). Working through the details*, at the WVEL height of ~8 km the atmosphere warmed by ~0.27 ± 0.05 °C per decade over the period in Fig. 1, from a combination of surface warming (0.20 °C per decade) and the attendant lapse rate change (0.07 ± 0.05 °C per decade). At 10 km, it would have warmed slightly more.
The real story in Fig. 1 is hidden between the lines. This graph is for the two decades that saw the most rapid surface warming of the last 50 years. It does not show the lowest 1.5 km, which would have been a yellow-orange color because the surface warmed by ~0.20 °C per decade. Yet it shows barely any upper tropospheric warming. The graph is putting it in the nicest possible way, but actually the numbers are devastating for the conventional models — because the warming observed around the WVEL height of 8 to 10 km is less than the warming merely due to the surface warming and lapse rate changes.
This tells us there was active cooling at work in the upper troposphere, some factor that countered the warming coming up from the ground. It’s like the dog that didn’t bark. If the structure of the troposphere stayed the same (that is, the WVEL did not move), then the surface warming and lapse rate changes would have warmed the upper troposphere at 8 km by ~0.27 ± 0.05 °C per decade. But the observed warming was about 0.1 °C per decade, as shown in the radiosonde data of Fig. 1. Therefore there was a slight counteracting cooling, presumably due to structural changes in the upper troposphere. This suggests the WVEL fell slightly; it is not compatible with an ascending WVEL.
In other words, if you subtract out the warming in Fig. 1 that is simply due to surface warming and attendant lapse rate changes, what is left are the temperature changes for other reasons. This reveals a slight cooling in the upper troposphere around 8 km, or 10 km in the tropics. Again, this can only mean the WVEL descended, not ascended.
…
Figure 1: Atmospheric warming 1979 to 1999, as measured by the radiosondes. The horizontal axis shows latitude, the vertical axis height (km on the right, hPa on the left). From the US CCSP report of 2006, Fig. 5.7E in section 5.5 on page 116 (Santer 2006, [2]), see also Singer 2011 [3].
Dr Roy Spencer, who pioneered microwave sounding for measuring atmospheric temperatures from satellites, recently (May 2015) used a different mix of microwave channels to specifically look for the hotspot using the satellite data — see his graph of how broad the data collected is, or conversely, how low the vertical resolution is. He concludes: “But I am increasingly convinced that the hotspot really has gone missing. … I believe the missing hotspot is indirect evidence that upper tropospheric water vapor is not increasing, and so upper tropospheric water vapor (the most important layer for water vapor feedback) is not amplifying warming from increasing CO2.”
The cooling strips above 12 km are due to ozone depletion, and are too high to be of interest here.
– Temperatures Predicted by the Conventional GCMs
As something of an aside because this series is about basic climate models, the measured data in Fig. 1 is nothing like the picture predicted by the big computerized climate models, the general circulation models (GCMs).
The GISS Climate Model E, a prototypical GCM, makes many of its outputs public. From 1979 to 1999 the CO2 concentration went from 337 ppm to 368 ppm, an increase of 9%, or 13% of a doubling (ΔL=0.13). The nearest the GISS model will publicly simulate is for a 25% increase in CO2 (ΔL=0.32) with no change to solar irradiance, shown in Fig. 2. Obviously the intensity of warming will be different because the change in CO2 is different, but the pattern or quality of the heating is of interest here. Note the prominent heating in the tropics at ~10 km (250 hPa) — this is the “hotspot”. The GISS model would show roughly the same pattern, just with not as much warming, for a CO2 increase that was only 13% of a doubling, as occurred between 1979 and 1999.
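(ΔL here is the change in the base-2 logarithm of the CO2 concentration, i.e. the fraction of a doubling; a one-line check in Python:)

```python
from math import log2
print(log2(368 / 337), log2(1.25))  # ~0.13 and ~0.32 of a doubling, as quoted
```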
Figure 3 shows what the GISS model predicts for a 2% increase in solar irradiance, with no change in CO2. It is roughly what the model predicts for a full doubling of CO2 (Fig. 2 is only for 32% of a CO2 doubling, so the pattern is similar but the warming is not as intense as for a full doubling). Fig. 3 has the same pattern of atmospheric warming as Fig. 2, because conventional models, following the conventional basic climate model in Fig. 2 of post 9, essentially apply the solar response to all climate forcings — roughly the same sensitivity and the same feedbacks to surface warming, but with some smaller differences such as that incoming sunlight interacts with ozone (see post 9).
Figure 2: Atmospheric warming when the CO2 concentration increases by 25% (or 32% of a doubling) with no change in solar irradiance, as predicted by a typical conventional climate model (the GISS model E, 5/4 * CO2, 100 year response, Lat–Hgt). The “20”s on either end of the horizontal temperature scale are in error, so ignore them. Compare to reality in Fig. 1, but note that Fig.1 is for only a 9% increase in CO2 concentration (or 13% of a doubling). The dark red spot over the tropics at about 10 km (250 hPa, left vertical scale) is the hotspot; it amplifies the surface warming in the model, because it simulates the WVEL ascending and emitting less OLR, thereby requiring the surface to emit more OLR and thus be warmer than otherwise.
Figure 3: Atmospheric warming when the solar irradiance increases by 2% but CO2 is held constant, as predicted by the GISS model E (1.02 * solar irrad, 100 year response, Lat–Hgt). The “20”s on either end of the horizontal temperature scale are in error, so ignore them. Compare to Fig. 2: note the similar pattern of atmospheric warming, including the prominent hotspot, because conventional models roughly apply the solar response (similar sensitivity, same feedbacks to surface warming) to all climate influences. In the GISS model, a ~2% increase in solar irradiance produces the roughly same results as a full doubling of CO2 (Fig. 2 only shows 32% of a CO2 doubling).
Figures 2 and 3 are clearly nothing like Fig. 1. The supporters of the conventional model explain away this clash between GCMs and empirical evidence by ignoring or disputing the radiosonde data, and substituting vague satellite data instead — even though satellites, due to inadequate vertical resolution, are the wrong tool for the job. For example, see here or here or here.
A simpler explanation, that accords with the measured data in Fig. 1, lies in the alternative model presented in this series: simply don’t apply the solar response to the influence of CO2 (see post 13). Occam’s razor.
– Humidity Data
Consider the specific humidity data from the radiosondes, shown in Fig. 4. The more reliable data only goes to 400 hPa, but above 500 hPa the trend is one of drying. This agrees with the model in Figs. 1b, 2b, 3 and 5 of Paltridge, Arking, and Pook (2009). The same trends are shown by the earlier radiosonde data from 1948 to 1973. Like the temperature data, this suggests a descending WVEL, and is not compatible with an ascending WVEL.
Figure 4: The atmosphere near the average WVEL height (around 360 hPa) shows a drying trend since 1973.
– Conclusion
The WVEL has descended in the last few decades, but we cannot determine by how much.
Conclusion: The CO2 Response Causes the WVEL to Descend
In the last few decades there was surface warming yet the WVEL did not ascend — there is no hotspot. Therefore the conventional model is incorrect.
In the alternative model, the warming influences of ASR and CO2 are both considered. The albedo data discussed in post 10 indicates a small fall in reflected solar radiation from 1984 that is larger than the smoothed changes in TSI occurring in that period, so ASR presumably increased from 1984 — which caused some surface warming and invoked the solar response, thereby causing the WVEL to ascend. Yet the WVEL was observed to descend. Therefore the WVEL descended due to the CO2 response (to the increasing CO2), which outweighed the ascent due to the solar response (to the increased ASR). Hence the CO2 response to increasing CO2 causes the WVEL to descend. This is also supporting evidence for the rerouting feedback.
In other words, the strong rise in CO2 concentration and the lack of a hotspot together suggest that the effect of the CO2 response is to cause the WVEL to descend, and that this descent was only partly offset by the ascent caused by extra ASR and the solar response.
*Details of the effect of lapse rate change due to surface warming on the temperature at the WVEL height: Assuming that lapse rate change was uniform at all heights, the warming it caused can be estimated from the lapse rate feedback in AR5 (fLR, the increase in OLR per increase in surface temperature due to lapse rate change (∂R / ∂TS), whose value is -0.6 ± 0.4 W/m2 per °C — see post 3) and the parameter g from the OLR model (the increase in OLR per increase in lapse rate (∂R / ∂Γ), whose value is -13.5 W/m2 per °C/km — see Eq. 14 of post 15). The increase in lapse rate per increase in surface temperature (∂Γ / ∂TS) is thus about fLR / g, or 0.044 ± 0.03 °C/km per °C. Hence the change in lapse rate for the observed surface warming of ~0.20 °C per decade is 0.0088 ± 0.006 °C/km per decade. At 8 km, the warming due to the change in lapse rate is thus about 0.07 ± 0.05 °C per decade.
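The footnote’s arithmetic, restated as a short Python sketch (the inputs are the values quoted above; the signs cancel in the ratio):

```python
f_LR = -0.6    # lapse rate feedback, W/m^2 per degC of surface warming (AR5)
g = -13.5      # increase in OLR per increase in lapse rate, W/m^2 per (degC/km)

dGamma_dTs = f_LR / g        # ~0.044 degC/km of lapse rate change per degC of warming
dGamma = dGamma_dTs * 0.20   # ~0.0089 degC/km per decade, for 0.20 degC/decade warming
print(8.0 * dGamma)          # ~0.07 degC/decade of extra warming at the 8 km WVEL
```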
References
[1] Paltridge, G., Arking, A., & Pook, M. (2009). Trends in middle- and upper-level tropospheric humidity from NCEP reanalysis data. Theoretical and Applied Climatology, 98:351–359.
[2] Santer, B. D. (2006). US Climate Change Science Program 2006, Temperature Trends in the Lower Atmosphere: Understanding and Reconciling Differences.
[3] Singer, S. F. (2011). Lack of Consistency between Modeled and Observed Temperature Trends. Energy and Environment, Vol. 22, No. 4, pp. 375–406.
Bizarrely Russia may yet slow the trainwreck in Paris. The irony — the former communist state may be the one to thwart the ambitions of the freeloading global bureaucracy. We can’t relax though, because like the Chinese, Putin will have a price and if the equation changes so that Russia gains an advantage over the West (through generous exemptions or credits or some other trade deal) he’ll pay lip-service to the Climate extremists.
[NY Times (Reuters)] Russia’s official view appears to have changed little since 2003, when Putin told an international climate conference that warmer temperatures would mean Russians “spend less on fur coats” while “agricultural specialists say our grain production will increase, and thank God for that”.
The president believes that “there is no global warming, that this is a fraud to restrain the industrial development of several countries including Russia,” says Stanislav Belkovsky, a political analyst and critic of Putin. “That is why this subject is not topical for the majority of the Russian mass media and society in general.”
Putin’s scepticism dates from the early 2000s, when his staff “did very, very extensive work trying to understand all sides of the climate debate”, said Andrey Illarionov, Putin’s senior economic adviser at the time and now a senior fellow at the Cato Institute in Washington.
“We found that, while climate change does exist, it is cyclical, and the anthropogenic role is very limited,” he said. “It became clear that the climate is a complicated system and that, so far, the evidence presented for the need to ‘fight’ global warming was rather unfounded.”
Putin has been casting doubt on man-made global warming since the early 2000s, according to the Times. In 2003, Putin told an international climate conference warming would allow Russians to “spend less on fur coats,” adding that “agricultural specialists say our grain production will increase, and thank God for that.” — Michael Bastach, Newsbusters
The gullible journalists at the New York Times wrote this up thinking they are highlighting the poor state of journalism in Russia and how it largely ignores the Western fetish for invoking Industrial factors as Weather Gods. Instead, on some topics the Russian media is just telling it like it is. The economic screws are tight in Russia, and there is no time for frivolous fantasies about controlling the weather. Russians know that fires are caused by fuel loads and arsonists, not by car exhaust and air conditioners.
Galina Timchenko, former editor-in-chief of news site Lenta.ru, lays it out: “Unfortunately climate change is not very interesting to the public,” she says.
It used to be that scientists were supposed to publish their methods, discuss their reasoning, and point out the weaknesses of their work. Now, it’s confidential.
The House Science Committee in the US is demanding with a subpoena that NOAA release internal communications related to the Karl et al study (the one that tried to remove the “pause” in global temperatures). NOAA is refusing, saying:
“It is a long-standing practice in the scientific community to protect the confidentiality of deliberative scientific discussions.”
Yes. It’s been longstanding since morning tea on Tuesday.
The new post-modern science conversation:
SCIENTIST 1: So why did Karl et al adjust the ocean buoy readings by a figure that is so uncertain as to be meaningless? From Kennedy et al: 0.12 ± 1.7°C. What were you thinking?
KARL ET AL and co: snip [That’s confidential. Stop this now. We’re feeling harassed!]
What is the world coming to if Congress succeeds in exposing objective, rational discussion about thermometers?
In typical style I looked at this draft and told David that the second half of his post should be at the top (that’s where he discusses how his model solves so many problems). He replied that the equations were the most important part, and he wasn’t going to flip them around. So, for readers who don’t speak mathematica-lingua, all I can say is: don’t miss the second half below.
Also in typical style, David prefers this picture he’s just drawn in his diagramming software, to my cartoon in the intro to post 11:
In this post, David combines the two smaller models to make one basic climate model (that’s the sum-of-warmings and the OLR models). Unlike the mainstream conventional basic model that underlies the entire establishment culture and philosophy, the alternative model uses more empirical data (and from the real world too, not just the lab). It’s also less reliant on hypothetical partial derivatives. Plus, in the alternative model, different forcings can cause different responses. In the conventional model, the architecture assumes the climate responds to all forcings the same way.
CO2 has a warming effect on the atmosphere, rather than just on the surface, and architecturally speaking, this extra energy could be rerouted and escape through a different path (like water flowing through a different pipe). In the conventional basic model, remember, the only type of feedbacks allowed are responses to surface warming. The climate’s response is supposedly the same regardless of whether the Sun provides extra incoming energy or the atmosphere blocks a part of the outgoing flow. What were CO2 emissions shift to become water vapor emissions. If the outgoing flow just redistributes from one wavelength to another, the conventional models are up a creek full of manure, so to speak.
The rerouting of outgoing energy is potentially a massive negative feedback, but in Certified Climate Model Speak we can’t even say that, since the term “negative feedback” is defined as negative feedbacks to surface warming, not atmospheric warming. In the blind language of standard models there is no term to describe this. No wonder they can’t get out of the rut they are stuck in.
To reduce problems with partial derivatives, David switches from using the Planck sensitivity to the Stefan Boltzmann sensitivity. The Planck variant relies on holding “everything else constant”, which allows the tropospheric temperatures to change (in unison) but keeps the stratospheric temperature constant. It’s slightly different from the Stefan Boltzmann sensitivity (which applies under all circumstances), principally because much of the CO2 and ozone emission layers are in the stratosphere.
We can do better than in 1896 — apparently Arrhenius used something akin to the conventional basic model, though of course with no climate data to work with.
August 31, 2012. This coronal mass ejection just missed Earth, according to NASA.
There were two mysterious sudden spikes in carbon-14 in tree rings around a thousand years ago. Now some researchers at Lund University say they’ve matched those to beryllium layers in ice cores from the Arctic and Antarctic. Some wild event made these changes across continents all over the world at the same time, and about the only thing that could have done that was a massive solar storm (or two). There are estimates these extreme storms would have been ten times stronger than the biggest solar storms we have had in the last few decades. The two big bad storms are described as a few times bigger than even the largest solar storm in modern history, the Carrington Event in 1859. The radioactive spikes specifically show up in tree rings in 774/775 AD and 993/994 AD. It’s pretty cool that we can pin those years down so accurately, and as an aside, I imagine it makes a fairly handy calibration point for tree ring researchers now that we know it was global.
Unfortunately, if one of those happened now, it would not be fun. The stream of particles off such a major storm would play havoc with our electrical networks and equipment. We’d get only hours (or less) of warning, but the power blackouts that follow could last for months. Transformers, apparently, are particularly vulnerable to being destroyed, and the waiting list for new ones is five months. The Lloyd’s insurance report makes for rather ominous reading, and it was only describing the possibility of another Carrington event, not one of these bigger solar-bombs.
The Carrington event in 1859 turned the sky blood red in places and produced auroras as far south as Panama (18 degrees north of the equator). Many telegraph systems stopped working, while some operators even turned off their batteries and ran on “auroral current”.
If one of those geomagnetic superstorms was launched at us, and was widespread, it’s hard to imagine how it would not get really ugly. Ponder the anarchy of trying to operate cities of millions without electricity, without running water, with fuel pumps inoperable, and no fridges and freezers to store food. (A good movie script if ever there was…). The level of disaster would depend on whether there were many unscathed western regions that could help out.
“Solar storms and the particles they release result in spectacular phenomena such as auroras, but they can also pose a serious risk to our society. In extreme cases they have caused major power outages, and they could also lead to breakdowns of satellites and communication systems. According to a new study solar storms could be much more powerful than previously assumed. Researchers have now confirmed that Earth was hit by two extreme solar storms more than 1000 years ago.
“If such enormous solar storms would hit Earth today, they could have devastating effects on our power supply, satellites and communication systems,” says Raimund Muscheler at the Department of Geology, Lund University.
Today, the American Academy of Pediatrics (AAP) released a policy statement that links climate change with the health of children, urging pediatricians and politicians to work together to solve this crisis and protect children from climate-related threats including natural disasters, heat stress, lower air quality, increased infections, and threats to food and water supplies.
“Every child needs a safe and healthy environment and climate change is a rising public health threat to all children in this country and around the world,” said AAP President Sandra G. Hassink, MD, FAAP. “Pediatricians have a unique and powerful voice in this conversation due to their knowledge of child health and disease and their role in ensuring the health of current and future children.”
This is the climate change that children without cars and electricity dealt with:
This is the climate change babies who are 30 and under have to deal with:
UAH satellite data — the same range as the Vostok graph axis.
All those parents leaving their children out in floods and storms, be warned:
“Children are uniquely at risk to the direct impacts of climate changes like climate-related disaster–including floods and storms–where they are exposed to increased risk of injury, death, loss of or separation from caregivers and mental health consequences,” explained Samantha Ahdoot, MD, lead author of the policy statement
Why are paediatricians selling out their good reputation by pandering to odious political correctness? What parent hears this and thinks suddenly “I must act now — I thought climate change only happened to adults!”
And if children in poor countries are affected by climate change, then let’s get them some damn coal fired electricity.
The hard sciences are less and less fooled by the charade of sciencey fear mongering (unlike some psychologists). It is great to see scientific groups speaking out, though we know this PDF, which was first published on the 24th of August 2015, will be ignored by the ABC, BBC, and CBC science propaganda teams. Not the right message.
The Société de Calcul Mathématique SA in France has issued a long, in-depth white paper on climate change:
“The battle against global warming: an absurd, costly and pointless crusade”
The impact on the entire field of scientific research is particularly clear and especially pernicious.
“There is not a single fact, figure or observation that leads us to conclude that the world’s climate is in any way disturbed.”
“Conclusions based on any kind of model should be disregarded. As the SCM specializes in building mathematical models, we should also be recognized as competent to criticize them. Models are useful when attempting to review our knowledge, but they should not be used as an aid to decision-making until they have been validated.”
OLR — outgoing longwave radiation — is so key, so central to the climate debate that if we had top notch data on the radiation coming off the planet, we would have solved the effect of extra CO2 a long time ago. That we don’t have a specific satellite monitoring these changes in detail is like the dog that didn’t bark. Apparently a specialist OLR satellite was to be launched in 2015. More info on the RAVAN Satellite here (was supposed to launch in Sept 2015). (UPDATE: Planned for 2016) h/t siliggy.
There are four main pipes to space, and in David’s work each pipe is considered separately. The conventional model assumes that increasing atmospheric CO2 constricts the CO2 pipe, which warms the surface, causing more evaporation, which then constricts the Water Vapor pipe (this is the “water vapor amplification”: even more constriction of radiation to space by water vapor, forcing the surface to become yet warmer so it can emit more). But the missing hot spot tells us that this theory is wrong. In this OLR model, the water vapor pipe could either expand or constrict. An expansion means a drop in the height of the emissions layer, a descent to a warmer place in the troposphere. The top layer of any atmospheric “blanket” is where all the action is, and where the photons of infrared can finally escape to space. The gas below the top of the blanket or emission layer doesn’t matter that much for OLR because the photons can’t escape to space from there — there are just too many molecules to capture those rays of infrared before they can get far.
The amount of outgoing radiation emitted by an emission layer depends entirely on how warm it is, and as the “blanket” of humidity (or water vapor) thickens, the top of that layer presumably ascends, reaches higher, and becomes cooler, which means it emits less (the pipe constricts). Alternatively, if the blanket gets thinner near its top (even while getting thicker far below), the height of the emissions layer falls lower into warmer air, so it emits more (the pipe expands).
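A toy numerical version of that blanket-top logic (a Python sketch; the emissivity and temperatures are placeholders, not fitted values): model the pipe as radiating at the temperature of its emission height, then watch the emission rise when the layer descends.

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W/m^2/K^4
T_SURFACE = 288.0  # surface temperature, K
LAPSE = 6.5        # lapse rate, K per km
EPS = 0.3          # placeholder effective emissivity of the water vapor pipe

def pipe_emission(height_km):
    """OLR from an emission layer at the given height: lower means warmer means more."""
    temp = T_SURFACE - LAPSE * height_km   # temperature at the emission height
    return EPS * SIGMA * temp ** 4

# A 100 m descent of the emission layer expands the pipe slightly:
print(pipe_emission(7.9) - pipe_emission(8.0))  # ~ +0.6 W/m^2 more escapes
```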
The model estimates how much the radiation to space changes as various properties of the emission layers and the atmosphere change — the heights of the emission layers, the lapse rate, the surface temperature, the cloud fraction, and the CO2 concentration.
Lapse rate is the rate at which the temperature falls (or “lapses”) per kilometer as we go up vertically through the atmosphere. We calculate the temperature at a given height by working upwards from the surface: the surface temperature less the lapse rate (in °C per km) times the height (in km).
The cloud fraction splits the atmospheric window between the cloud tops and surface emissions.
Incidentally, now that we have the OLR model we can estimate the Planck sensitivity, under those hypothetical Planck conditions.
This post presents a model of outgoing longwave radiation (OLR), which we are soon going to join with the sum-of-warmings model to form the alternative basic climate model, as outlined in the post on modelling strategy.
The last post presented the basic parameters of the various layers that effectively emit OLR. In this post we model the sensitivity of OLR to changes in some of those parameters — for example, how much does OLR decrease if the water vapor emissions layer (WVEL) ascends by 100 meters? See Fig. 1 of that post to see how some of the parameters are defined.
The final results of this series are not very sensitive to the OLR model, which can be refined later if the alternative basic climate model is found to be useful.
If you want to skip the details, jump to the “The Model Equation” and “Diagram” sections near the end.