# Are transfer functions meaningless (the “white noise” point)? Beware your assumptions!

Some people are claiming that the transfer function is meaningless because you could use white noise instead of temperature data and get the same notch. It’s true, you could. But the argument itself rests on a surprisingly banal fallacy. It looks seductive, but it’s like saying that it is meaningless to add 3 oranges to 3 oranges because you could add 3 oranges to 3 apples and you’d still get six!

It is trivially obvious that the transfer function will find a relationship between entirely unrelated time series, as any mathematical tool will when it’s misapplied. The question that matters — as with any mathematical tool — is: has it been misapplied? What matters is whether the base assumption is valid, and whether the results give a useful answer to the question you’ve asked. If the assumption is that apples and oranges are both pieces of fruit, and the question you ask is “how many pieces of fruit do we have?”, then it is useful to add apples and oranges. But if you are trying to compare changes in fruit consumption, adding the two is mindless. So let’s look at the assumptions and the question being asked.
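Since this is the crux of the dispute, here is a minimal numerical sketch of the white-noise point itself. All series below are synthetic toys of our own invention, nothing from the actual spreadsheet: an input with a strong 11-step cycle (standing in for TSI) produces a “notch” in the empirical amplitude ratio even when the output is pure noise, because the dip is inherited from the input’s spectral peak.

```python
import numpy as np

# Toy check of the "white noise" point. An input with a strong 11-step cycle
# (standing in for TSI) yields a notch in the empirical amplitude ratio even
# when the "output" is pure noise, because the dip is inherited from the
# input's spectral peak, not from any physical link. All series here are
# synthetic assumptions, not the actual data.
rng = np.random.default_rng(0)
n = 4096
t = np.arange(n)

tsi = np.sin(2 * np.pi * t / 11.0) + 0.1 * rng.standard_normal(n)
noise = rng.standard_normal(n)          # "temperature" replaced by white noise

fx = np.fft.rfft(tsi - tsi.mean())
fy = np.fft.rfft(noise - noise.mean())
h = np.abs(fy) / (np.abs(fx) + 1e-12)   # empirical amplitude ratio

k = np.argmax(np.abs(fx))               # frequency bin of the 11-step peak
# h[k] sits far below the typical level of h: a "notch", with no causal link.
```

The notch appears for any noise realisation, which is exactly why the result is only meaningful if the assumed TSI–temperature link actually holds.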

### Assumptions first

Two assumptions were made before computing the transfer function. And before anyone complains that the whole project was a circular tautology — pay attention — the assumptions are temporary. They are a “what if” used to see if we get a meaningful answer. Later the assumptions are dropped and tested.

Assuming that:

1. Recent global warming was associated almost entirely with TSI.

2. The climate system is linear and time-invariant.

…then, the transfer function from TSI to temperature is of great interest and sinusoidal analysis is appropriate.
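For readers unfamiliar with the terminology, the transfer function of a linear, time-invariant system can be estimated empirically as the ratio of output to input Fourier amplitudes. A minimal sketch, with a toy system of our own (not the method used in the spreadsheet): a system that halves the amplitude should show a transfer amplitude of 0.5 at the driving frequency.

```python
import numpy as np

def transfer_amplitude(input_series, output_series):
    """Empirical amplitude of the transfer function: the ratio of output to
    input Fourier amplitude spectra (a simplified stand-in for the sinusoidal
    analysis described in the post)."""
    fin = np.fft.rfft(input_series - np.mean(input_series))
    fout = np.fft.rfft(output_series - np.mean(output_series))
    return np.abs(fout) / (np.abs(fin) + 1e-12)  # guard against divide-by-zero

# Toy linear, time-invariant system: the output is the input at half amplitude.
t = np.arange(2048)
x = np.sin(2 * np.pi * t / 11.0)   # input with an 11-sample cycle
y = 0.5 * x                        # the "system" halves the amplitude
H = transfer_amplitude(x, y)
k = np.argmax(np.abs(np.fft.rfft(x - x.mean())))  # bin of the input peak
```

If the system is linear and time-invariant, this per-frequency ratio fully characterises it; that is the assumption doing the work.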

David Evans has been explicit about both right from the start, but not all commenters seem to realize the implications.

The transfer function between TSI and Earth’s (surface) temperature will be meaningless if there is no causal link between TSI and Earth’s temperature. (Some people may need to read that twice).

This “link” could be an indirect one. It doesn’t mean that TSI itself is causing the change in temperature. It could, for example, mean that TSI is a leading indicator of other solar events that lag it by 11 years. It could be that those other events — say magnetic fields, solar wind, UV or other spectrum changes — are the ones actually causing the albedo changes that cause the temperature to change 11 years after the TSI changes.

By all means, if you have definitive evidence that changes in TSI cannot possibly be directly or indirectly associated with changes in Earth’s temperature, do let us know. It will save us a lot of time. Likewise, if you know of any reason why TSI cannot possibly be a leading indicator for some other solar factor which acts with an 11 year cycle, please let us know. Some people are willing to declare they know that TSI cannot be associated with changes in Earth’s temperature. Some of us have an open mind. The solar dynamo is not completely worked out. Fair?

As to the second part, what question was David Evans asking, and are the results useful? He made it explicit.

The initial aim of this project is to answer this question: If the recent global warming was associated almost entirely with solar radiation, and had no dependence on CO2, what solar model would account for it?

So is the discovery of a notch filter useful, and does it help to create a solar model? It certainly looks that way so far.

The model was constructed in the frequency domain. The main feature in the transfer function is the notch, so we tried building a similar notch in the model. The existence of the notch implies there has to be an accompanying delay (the timing seems unnaturally perfect, and people are understandably having trouble wrapping their heads around that). The delay was later found to likely be 11 years, which is not only the length of the major cycle of the solar dynamo but is borne out by other independent studies, such as Usoskin, Soon, Archibald, Friis-Christensen and Lassen, Solheim, Moffa-Sanchez, etc. (see Post III). Later, a model based on an 11 year delay was found to produce reasonable results (see the hindcasting post).
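The notch-implies-delay link is easier to see in the time domain. A toy illustration (this is our own sketch, not the actual model, whose form has not been fully released): subtracting an 11-step-delayed copy of a signal from itself has amplitude response 2|sin(πf·11)|, which is exactly zero at the 11-step period, so a delay and a notch can be two faces of the same filter.

```python
import numpy as np

# Toy illustration of why a notch and a delay go together (not the actual
# model): subtracting an 11-step-delayed copy of a signal from itself
# cancels any component with an 11-step period exactly. The built-in
# delay is what produces the notch.
def delay_difference(x, delay=11):
    y = np.zeros_like(x)
    y[delay:] = x[delay:] - x[:-delay]
    return y

t = np.arange(1100)
cycle = np.sin(2 * np.pi * t / 11.0)   # the 11-step "solar cycle"
trend = 0.001 * t                      # slow background warming
out = delay_difference(cycle + trend)

# The 11-step cycle cancels exactly; only the differenced trend (a constant
# step of 0.001 * 11 per sample) survives.
residual_cycle = np.max(np.abs(out[11:] - 0.001 * 11))
```

The same cancellation happens for any signal with an 11-step period, which is why an observed notch at the solar-cycle frequency suggests a delay mechanism somewhere in the chain.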

So the notch turned out to be very useful in building the model, giving two of the five elements in the model (the other three are the low pass filter, the RATS multiplier, and the immediate path for TSI, which were deduced by physical reasoning).

Would it be better if there were a known mechanism? Of course, but Rome wasn’t built in a day; steady on. We are working on it. If people already knew what force X was then presumably they would have noticed its correlation with temperature and the climate problem would already have been solved, wouldn’t it? Some commenters (those not focused on fallacies like argument from incredulity, or the mechanics of publication timetables) are being very helpful in gathering clues on force X — thank you!

PS: The release?

And for those who are impatiently awaiting the full working model, we’re working on it. There are a few last-minute things to sort out. The spreadsheet used data to August 2013 in the investigation, and was frozen months ago with that data. That’s the copy that is available to people who got advance notice. Now that we are releasing it, it would be nice to update the data while preserving the original calculations, so David is copying the Aug 2013 data and updating all the data. We are also figuring out the Creative Commons conditions that would be workable, and deciding how to manage suggestions, adaptations, and modifications. We suspect the normal open source software sites don’t deal with 20 MB Excel files which people can modify, but which are very difficult to track changes on (does anyone know of a similar project?). Right now the ScienceSpeak legal department, open science support team, human relations division and marketing arm are working flat tack. (That’s both of us. 😉 )

The biggest impediment at the moment is that some people still haven’t read the first posts we put up carefully enough. Even though we answered their questions personally in comments they still keep repeating the same points. Should we have kept the whole project secret until we had solved all these questions? Perhaps, but it’s been immensely helpful to get some feedback and help from some readers, and we didn’t know who would be the most useful beforehand. They have made themselves known.

On the other hand we’re being compared to Phil Jones and Michael Mann by one commenter, which we think is a tiny bit over-the-top, given that Jones and Mann are funded by the taxpayer and they spent years and used legal means to prevent their data being made public. To put a fine point on it: we got no income from taxes, and we owe the critics nothing. We also ask nothing of them (except, implicitly, patience and manners). Maybe that looks equivalent to a few people — we can’t see it. All the fuss, seriously, is flattering (if counter-productive).

Manners make no difference to the scientific method, but ultimately the human practice of the Scientific Method is only ever advanced by … humans, and manners do matter. Science is never advanced by namecalling, misquoting, strawmen and personal attacks. Please quote us exactly, eh?


### 342 comments to Are transfer functions meaningless (the “white noise” point)? Beware your assumptions!

• #
Popeye26

“Are transfer functions meaningless (the “white noise” point)? Beware your assumptions!, 1.0 out of 10 based on 1 rating”

Jo/David – Don’t know WHO gave you a score of 1 out of 10 when I first logged in but someone had.

Maybe a mistake??? or if it WAS given on purpose it’s VERY poor form IMHO!!

Waiting with bated breath for the spreadsheet.

Cheers,

• #

> Assuming … global warming was associated almost entirely with TSI … then, the transfer function from TSI to temperature is of great interest

Certainly. But it’s a very big assumption; in fact, likely wrong.

> the assumptions are temporary… Later the assumptions are dropped and tested.

It’s very hard to know what you mean by that (cue “but we haven’t released it all yet!”, which gets the reply: “yes, that’s why it’s hard to know what you mean”). You make your assumptions, you do your curve fitting / xfer function fitting, you make your predictions or whatevs; how then do you go about “dropping your assumptions”? You can’t; they’re fundamentally built into what you’ve done. If you’re going to drop the assumption of TSI-primality by simply feeding a pile of other series into the model you’ve built and finding they don’t match your desired output as well, then that won’t work.

Incidentally, I don’t know why you’re talking about white noise; the temperature data doesn’t look like white noise, as DE has already said.

• #
the Griss

I think the white noise issue was brought up by Willis, and it does need answering for those people not up with transfer functions. The electrical engineers, mathematicians, and audio guys will understand the underlying principles, but many will not.

As Richard has shown in other threads, there are regional (approx) 11 year signals found by several different people, but they are not found by David at the global scale.

That implies at least some sort of smearing or delay mechanism and therefore greatly reinforces David’s hypothesis.

It is highly unlikely you would get an 11 year frequency signal at any level if CO2 were the main driver of climate, because CO2 most definitely does not have an 11 year signal.

An 11 year signal in any temperature record, local or regional, immediately implies a strong solar influence. How can it possibly be regional and not also global?

• #
the Griss

that’s badly organised and worded, sorry……… it’s getting late.

• #

I think you, and JN, and quite possibly WE, are missing the point. Correcting “white noise” to “red noise” is trivial; let’s assume that’s done. I’m basing the red-noisiness on, say, fig 4 from http://joannenova.com.au/2014/06/big-news-part-ii-for-the-first-time-a-mysterious-notch-filter-found-in-the-climate/ where DE says “The temperature amplitude spectrum, the smooth orange curve in Figure 4, is essentially straight over more than three orders of frequency magnitude”, which at this scale makes it red noise. Whether DE is correct or not in that description and analysis is another matter (not being able to see ENSO is dodgy, IMHO).

Anyway, at that point, the xfer function is from TSI, which has a peak at 11y, to temperature, which looks in this analysis like red noise and of course doesn’t have a peak. So yes: if you replace one realisation of red noise with a different one, you’d get the same xfer function. But that isn’t interesting or remarkable.

• #
the Griss

“not being able to see ENSO is dodgy, IMHO”

ENSO is a transfer of energy WITHIN the system.

“are missing the point”

No, the point is that we are looking at an output that is expected to have a frequency signal, and it doesn’t. Why not? That is what is interesting.

The fact that an approx. 11 year signal has been found in several local and regional temperature data shows that the sun is a major driver of temperatures at that scale.

That makes it ludicrous to say that it is NOT a major driver at a global scale.

That is why I think David may well be onto something.

And certainly something worth giving an open ear to, rather than the crass negativity it has received from people who haven’t fully understood, and who should basically know better.

• #
Philip Shehan

If ENSO is the transfer of energy within the system, how do you get your proposed step up in temperature with el nino effects?

• #
the Griss

Go back and read from the start. !!

• #
Philip Shehan

Yes el nino is a transfer of energy within the system. (The ocean/atmosphere system).

You repeatedly claim that the 1998 el nino led to a step up in temperature (some 2 years after it concluded). Yet no such step occurred after the 2010 el nino.

And you do not propose that steps down occur with the other part of the ENSO energy transfer within the system, the la nina events.

• #
the Griss

“Yet no such step occurred after the 2010 el nino”

Precisely.. Now you need to figure out why. ! 🙂

No, la nina events go hand in hand with gradual cooling.

• #

Griss,
Reads a lot like some are actually beginning to realize the implications, but are still trying to divert attention back to CO2.

• #
bobl

My goodness William, I actually gave you a thumbs up on that.

Yes, you have got it: the 11 year signal is essentially being cancelled out, which leaves us with noise, that is, an output that is uncorrelated with sunspots. Good, you got that! The fact that substituting one data set that is poorly correlated to the input with another generates the same transfer function is unremarkable, yes. The mystery is WHY is TSI not correlated with temperature over the cycle; logic says it should be, after all it is for other solar cycles!

Your comment about ENSO: who says ENSO is not represented in the output? The model doesn’t test that; it would be possible to derive that transfer function too. If ENSO is uncorrelated to the sunspot cycle it appears as uncorrelated noise, but it will be to some extent correlated, and in my opinion, since ENSO is a lagged function of TSI and serves to smear energy across time, it’s likely a component of the low pass filter.

• #

ENSO shows up in the global temperature series (but how strongly I’m not sure) with a ~5 year quasi-periodicity, so yes DE’s model is capable of seeing that signature. But ENSO is a coupled ocean-atmosphere mode not forced by TSI (or CO2 for that matter) so won’t appear in DE’s model: there’s nothing in the model capable of generating it.

ENSO and TSI have different spectral peaks so one can’t be a lagged fn of the other.

• #
bobl

That’s quite possible; if ENSO is not modulated by TSI then that is the case. However my view is that there is every chance that ENSO is partially correlated to TSI, since it’s a thermal effect. Exactly how, well, that’s another matter.

• #
the Griss

“ENSO shows up in the global temperature series ..with a ~5 year quasi-periodicity”

That’s strange, Fourier series analysis doesn’t pick it up.

….please provide some proof of your assertion (not stoat or wiki, they are meaningless and worthless thanks to a certain somebody 😉 )

Again you fail to think, and realise what ENSO is part of.

• #

“How can it possibly be regional and not also global.?”

Latitudinally shifting jets and climate zones would do that.

Globally, the shifting acts as a negative system response, but the regional effects would be large, especially in locations near climate zone boundaries.

• #
the Griss

What I meant was that you can’t have solar forcing being a major player at regional levels, but not at a global level.

(I was tired last night, and didn’t express that very well)

• #
Philip Shehan

If the 11 year solar cycle cancels out over the long term, how do you explain the rise in temperature over the last century?

http://www.woodfortrees.org/plot/gistemp/from/mean:12/plot/gistemp/trend/from

• #
the Griss

You haven’t been paying attention again, Philip..

Stay at the bottom of the class.

• #

JN said: the assumptions are temporary… Later the assumptions are dropped and tested.

WC says: It’s very hard to know what you mean by that

JN responds: Read the posts carefully. I did mention it in the hindcasting one where I said: ” (This is the point where the solar assumption is dropped and tested.)”

Maybe the assumption is wrong, but you haven’t even tried to make a case.

If the assumption is wrong the notch is likely spurious. It’s harder, but not impossible, to make a model from spurious, non-existent relationships. The delay at least passes the initial test with the solar model — it makes reasonable hindcasts — which doesn’t prove it’s real, but lends support to the hypothesis that the sun might play a role through this indirect factor which operates at a delay from TSI. The delayed solar hypothesis also implies a one cycle delayed correlation with TSI, and other researchers found that delayed correlation.

Have you got any evidence that falsifies the hypothesis?

Regarding “white noise”: as important as you are, we are responding to other people. OK?

• #

If you’re claiming to have dropped the assumption already – in part 7 – then you’re simply wrong (I’ve already covered that in “If you’re going to drop the assumption of TSI-primality by simply feeding a pile of other series into the model you’ve built and finding they don’t match your desired output as well, then that won’t work.”). The model you’ve built there still relies on that assumption. Because the model you’ve built and used there relies on the notch filter, which relies on the xfer function, which relies on TSI-primacy.

> Later the assumptions are dropped and tested.

Re-reading this, it makes even less sense. If the assumptions have been dropped, then you don’t care about them, so why would you bother to test them?

What I think you mean is: your/DE’s idea for building this model comes from assuming TSI-primacy. Doing that, you’ve managed to build a model that (with certain other assumptions) postdicts (to a degree of accuracy that is unstated, but appears to satisfy you) the observed global temperature, leading you to think that your model is correct / interesting / plausible / whatever. But nothing in that process amounts to “dropping” TSI-primacy. It’s still built in.

> Maybe the assumption is wrong, but you haven’t even tried to make a case.

There’s the obvious one: that you end up with an unphysically large effect of TSI on climate. Svensmark etc. solve this by invoking some amplification in the atmosphere; you solve it by an unknown force X. All those suffer from a post-hoc, made-up-to-fit problem. Along the way, you end up with a massively overestimated effect of bomb tests; that’s another major hole you really need to work on.

• #
David Reeve

I’m a long time follower of your blog Jo, but I do think the issue WC raises here is critical to the logic of your case.

Way back at the start of this thing I commented that your adoption of a notch model is predicated on your stated assumption of global temperature being a linear function of the TSI input. Swapping out predicates for a proposition part way through development of the argument seems a good way to fool oneself.

If we in fact have a new proposition with new predicates, then this hasn’t been made clear.

• #

William,

Part 1 is using Fourier work to find out what happens — IF TSI is associated with Earth’s temperature, how is it associated? What kind of transfer function do we see between the systems? The answer is “the notch”, which implies a delay. The temporary assumption is used to discover something. But is it real? We test it in part II.

Part 2 (Post VII and more to come) asks — how do we figure out if that means something? It’s hard to answer that. The model is one kind of test for the theory. If the notch-delay is real, then it will produce some kind of meaningful results — for example, a model created with the same type of notch-delay filter will recreate temperature results, or at least it won’t produce unbelievable results which would disprove the theory. And the model hindcasts reasonably well. But models are only one kind of test; other researchers (as I keep saying, and you don’t seem to notice, see Part III) show that there is a lag in correlation of 10 years, or a correlation between the length of the cycle and the temperature of the next cycle. Then other things fit, like TSI peaking in 1987 and temperatures peaking 11 years later in 1998 (Lockwood and Fröhlich). None of this is proof. But can you disprove it? Evidently not, or you would say so, right?

There is no TSI primacy. Read the post on hindcasting again. This model hindcasts just as well with the 100%-CO2-driven assumption as the GCMs do. That’s 0% solar. It does reasonably well for the 20th century without any solar effect at all. David said that we can’t tell which one is right (or what the proportions are) based on the hindcasting alone.

Then, as I repeat, for the bomb tests: I agreed they look too high, and we made no attempt to hide that. But I also linked to 2 peer reviewed papers and one long, well-written document. One argued that the atmospheric tests might have made enough cooling to offset man-made warming at the time. There were 503 bombs with 440 Mt total yield, and everything over 1 Mt reaches the stratosphere. How much dust was produced? Each test was different. See Fujii for starters. There aren’t satellite records — we wish, we wish. This could be about more than fine dust up there anyhow; the bombs make radioactive particles which may seed clouds. And one radioactive marker that was measured — C14 — stayed above natural levels until the mid-1980s.

Have you got any reason to suppose that 440 Mt would not cool the planet somewhat? If not, then the debate we are having is not about whether the bomb tests should be in the model, but about how much cooling they cause.

Your argument against the model amounts to sheer incredulity. Care to put some data or reasoning behind it?

You seem happy enough to accept CO2-driven models that predicted upper tropospheric water vapor would increase, when 28 million weather balloons show it did not.

• #

Re bombs, clearly we’re not going to agree, and I’m not sure there’s much scope for meaningful discussion. I’ve responded over at http://joannenova.com.au/2014/06/big-news-viii-new-solar-model-predicts-imminent-global-cooling/#comment-1498273.

Re the model, errm, I don’t think that makes things any better for you. But I grow somewhat weary of trying to analyse a model that hasn’t been fully revealed; and you grow weary of people saying that; so I’ll drop it for now.

> CO2 driven models that predicted upper tropospheric water vapor would increase and

A new topic, what fun. Sadly you don’t link to what you mean. I’ll guess. AR5 says “Radiosonde humidity data for the troposphere were used sparingly in AR4, noting a renewed appreciation for biases with the operational radiosonde data that had been highlighted by several major field campaigns and intercomparisons. Since AR4 there have been three distinct efforts to homogenize the tropospheric humidity records from operational radiosonde measurements (Durre et al., 2009; McCarthy et al., 2009; Dai et al., 2011) (Supplementary Material 2.SM.6.1, Table 2.SM.9). Over the common period of record from 1973 onwards, the resulting estimates are in substantive agreement regarding specific humidity trends at the largest geographical scales. On average, the impact of the correction procedures is to remove an artificial temporal trend towards drying in the raw data and indicate a positive trend in free tropospheric specific humidity over the period of record. In each analysis, the rate of increase in the free troposphere is concluded to be largely consistent with that expected from the Clausius–Clapeyron relation (about 7% per degree Celsius)”.

That doesn’t support what you want. I could look in AR4 I suppose. http://www.ipcc.ch/publications_and_data/ar4/wg1/en/ch3s3-4-2-2.html looks relevant.

It says “To summarise, the available data do not indicate a detectable trend in upper-tropospheric relative humidity. However, there is now evidence for global increases in upper-tropospheric specific humidity over the past two decades, which is consistent with the observed increases in tropospheric temperatures and the absence of any change in relative humidity.”

That doesn’t support your assertion either.

• #

William, gosh, we’ve put out a fairly big target, ripe for criticism, yet already you retreat to … defending the missing hot spot? Really? See Paltridge 2009, Christy 2010, Fu 2011, Douglass 2007, McKitrick 2010; also those predictions in Karl et al (2006) CCSP 2006 Report, Ch 1, p 25; and the results: Karl et al (2006) CCSP Ch 5, Fig 5.7 in section 5.5 on page 116. We have discussed all these, including IPCC AR5, here.

• #

Actually, I realise I’ve missed a trick here: you start talking about water vapour, claiming it hasn’t increased. I point out you’re wrong: WV in the upper trop has increased, just as expected. Rather than defending your point, you switch to another one. That new conversation continues at #27.

• #
bobl

William, Jo, perhaps, if I may.

William, your point that the TSI-to-temperature dynamics are “baked into the model” is correct (they are), but this isn’t the point being made. Essentially the process correlates all the TSI cycles over the historical record and produces a relationship between TSI and temperature, but it does not preserve the dynamics of that data: the wiggles in the climate record are not preserved through to the model; just the various cycle times and amplitudes are extracted. Indeed the phase data has been discarded because it is too low in resolution. The various frequencies are extracted out of the whole 150 years at once. There are in fact an infinite number of actual input waveforms over 150 years that would give the same output. Once the model is built we can ignore how it was derived, and presume it’s a black box.

So how do we know that the filter produces the same output as the wave that produced it (i.e. simulates the climate), and not one of the other infinite number of alternative waveforms, when we stimulate it with an input signal? Well, we apply the data to the filter in the time domain and look at the time domain response of what comes out; this has wiggles that we can compare with actual temperature on a month to month, year to year basis. See how this is different from the way we derived the model?
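That time-domain check can be sketched in code. A hedged toy version, assuming a fitted model treated purely as a black box (the impulse response and all series below are made-up placeholders, not the actual spreadsheet model): drive it with the input series and correlate its output, wiggle for wiggle, against observations.

```python
import numpy as np

# Sketch of the time-domain validation described above: treat the fitted
# model as a black box, drive it with the input series, and compare its
# output against "observations". Everything here is a synthetic placeholder.
rng = np.random.default_rng(1)

def black_box(x, kernel):
    """The fitted model viewed purely as input -> output (a convolution)."""
    return np.convolve(x, kernel, mode="full")[: len(x)]

kernel = np.exp(-np.arange(30) / 5.0)  # hypothetical impulse response
kernel /= kernel.sum()

x = rng.standard_normal(500)           # driving series (the "TSI" input)
observed = black_box(x, kernel) + 0.05 * rng.standard_normal(500)  # + noise

# Time-domain check: correlate predicted wiggles with observed wiggles,
# skipping the start-up transient of the convolution.
predicted = black_box(x, kernel)
r = np.corrcoef(predicted[30:], observed[30:])[0, 1]
```

A high month-to-month correlation is a different, stronger test than the frequency-domain fit that produced the model, which is bobl’s point.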

• #

> Indeed the phase data has been discarded because it is too low in resolution

That assertion of DE’s didn’t make sense; of course phase can be resolved. However, it was so peripheral I didn’t bother with it.

> Once the model is built we can ignore how it was derived, and presume its a black box.

You suddenly jump to this sentence, but you can’t. The model, thus derived, carries the xfer function in it; and that xfer function comes from TSI-vs-Temp.

• #
bobl

You’re not getting it. Drop your prejudices for a moment, and read carefully.

For all intents and purposes, the model, once built, is considered to be independent of the data and is then tested for being representative of the system; that is, we drop the presumption that the model is representative of the climate.

That is all David is saying; he claims nothing with this statement. It’s engineering speak for “I’m going to pretend I know nothing about the black box, and I’m then going to test it on that basis.”

I’m sorry, but I can’t make this concept simpler than that for you.

• #
the Griss

Your lack of understanding of the process is shining through like a beacon, and exposing your agenda.

You have produced NOTHING that would counter what the models are saying.

You are just creating BROWN NOISE.. as a WC is wont to do. !!

• #
Philip Shehan

“Science is never advanced by namecalling, misquoting, strawmen and personal attacks.”

• #
the Griss

Speaking of strawmen.. !!

• #
Philip Shehan

As above.

• #
the Griss

“Because the model you’ve built ……. which relies on TSI-primacy.”

The IPCC climate model rely on CO2 primacy.. and have failed magnificently.

At least one that is based on the sun driving the system has something resembling common sense behind it, rather than agenda.

We all note that you are backing the agenda driven model, rather than the common sense one.

• #
Jaymez

William, you’ve been extremely active at this blog criticising anything which questions or refutes the belief that human CO2 emissions are causing dangerous global warming. So I was curious if you have made any attempt to criticise any of the UN IPCC orthodoxy. Given your ubiquitous presence on the internet I felt sure I would be able to find something.

I looked for perhaps a criticism by you of the outlandish claims made in IPCC climate reports. You know, maybe about the claim the Himalayan glaciers might all be gone by 2035. Or of the IPCC Chairman Rajendra K. Pachauri and his claim that Indian experts on Himalayan glaciers were practising voodoo science. I looked for something from you criticising those who were claiming the temperature is rising “even faster than predicted”, or those claiming that sea level rise is accelerating.

I earnestly looked for some criticism of Michael Mann’s temperature reconstructions which were used to create the ‘Hockey Stick’ graph, and make the MWP disappear. I thought maybe I would find something from you criticising the deleted portion of Keith Briffa’s temperature reconstruction. I checked to see if I could find where you highlighted the fact that many predictions by climate scientists and used in climate models have been based purely on assumptions of climate feedbacks which have no empirical basis.

I even looked for some disappointment in your writings about the behaviour of a number of climate scientists who acted ‘unscientifically’ as revealed by the climate-gate emails. I even thought you may have commented on the difference between what scientist wrote in the main body of the IPCC Climate reports versus what appeared in the Summary for Policy Makers.

I was interested to see if you had made a point about the claim the ‘science was settled’ followed by years of research and ‘adjustments’ to the science to find excuses for why the global average temperature wasn’t increasing at an accelerated rate even though human greenhouse emissions were.

I wondered whether you had shown any frustration that a best estimate for climate sensitivity, which is the most important climate change variable, was no longer provided in AR5, yet the IPCC claimed their confidence level that humans were responsible for most of the warming since 1950 had increased to 95% without any data to support that.

But I found nothing like that. Perhaps I missed it?

What came up most regularly in my searches was how you used your position at Wikipedia to edit anything which didn’t agree with the IPCC position, and that you even went out of your way to try to discredit scientists who had opposing views. I’m sure others have seen the figures bandied around:

“All told, Connolley created or rewrote 5,428 unique Wikipedia articles. His control over Wikipedia was greater still, however, through the role he obtained at Wikipedia as a website administrator, which allowed him to act with virtual impunity. When Connolley didn’t like the subject of a certain article, he removed it — more than 500 articles of various descriptions disappeared at his hand. When he disapproved of the arguments that others were making, he often had them barred — over 2,000 Wikipedia contributors who ran afoul of him found themselves blocked from making further contributions. Acolytes whose writing conformed to Connolley’s global warming views, in contrast, were rewarded with Wikipedia’s blessings. In these ways, Connolley turned Wikipedia into the missionary wing of the global warming movement.” http://blogs.telegraph.co.uk/news/jamesdelingpole/100020515/climategate-the-corruption-of-wikipedia/

I didn’t find anywhere that you had questioned the IPCC position or the extreme claims made by alarmist scientists. Perhaps there was an example here or there, but they have been clearly swamped by your total devotion to and faith in the IPCC orthodoxy. Yet I have easily found examples where well known climate skeptics have been ready to accept parts of the UN IPCC reports which have reasonable scientific justification and to criticise any claims by climate skeptics which do not.

In contrast I noted many well-known climate skeptics who readily criticised uninformed and poorly justified arguments, or positions they disagreed with on the ‘skeptical side’ of the debate. Those I looked at included Anthony Watts, Jo Nova, Judith Curry, Roy Spencer, Richard Lindzen, Bjorn Lomborg.

What does this tell me? You have chosen to support the side which believes humans are causing dangerous global warming through greenhouse gas emissions and you will champion that cause and fight any threat to it no matter what facts get in your way. You have certainly shown that religious fervour at this site and I have certainly grown tired of reading your comments just in case there was anything of value in them.

• #

WC,

Your emptiness is being exposed. The internet remembers. You cannot hide.

You can end it here and now by simply presenting something significant that is yours rather than the endless reflections of reflected reflections.

STAND AND DELIVER!

• #

[Snipped – relevance? – Mod]

• #
Lionell Griffith

Finally, an honest commentary from our very own Black Knight. Isn’t it interesting how every published word exposes the truth, even as he tries to hide behind a squid-like flood of words? Try as he might, he cannot hide.

• #

> [Snipped – relevance? – Mod]

You’re letting Lionell Griffith’s content-free postings through and snipping my brief replies. Your bias is showing, mod. And this site was doing so well, too.

> Try as he might, he cannot hide.

That’s rather ironic, no, since my witty one-line rebuttal (no flood of words) has been neatly “hidden” by the mods.

[As we don’t always moderate in realtime, it is sometimes easier just to chop off the head of the snake rather than go through and work out who is responding to the original O/T or irrelevant comment. Also a tip: if you are going to post a link to a video or anything, a few words of introduction on what it is about would be appropriate. – Mod]

• #
the Griss

Feel it, WC. Your own medicine. 🙂

• #
Rereke Whakaaro

As I see it William, you were asked, by Lionell, who is well respected here by the way, to state what your scientific contribution has been to the climate debate.

Now, I don’t know what your reply was, but since it was moderated, for not being relevant to Lionell’s question, I can let my imagination run free.

The moderators here have a very light touch, as a rule, so again, when they moderated your second comment, I can only assume the worst. “Witty” you say? Wit is close to humour. I have never witnessed you being intentionally humorous before. What a pity that we all missed it.

• #

Well, let’s hope that’s enough excuse-making from you.

[I will not snip this so everyone can see, it is totally irrelevant if they wanted to waste their time looking at a Monty Python-The Black Knight video. Now they’ll be left wondering whether you see yourself as a player in a comedy, or as one of the characters portrayed, or that you think your role here is to post irrelevant material, thus avoiding the questions you are asked. But don’t expect your future O/T irrelevant material to be posted or not snipped after the fact. – Mod]

• #
the Griss

“A few words of introduction would be appropriate..”

…..not that anyone here is going to give much credence to anything you link to.

The ONLY expectation is propaganda, lies and misinformation… so…

WHY BOTHER. !!!!

• #
Steven Mosher

Science is never advanced by namecalling, misquoting, strawmen and personal attacks.

• #
Truthseeker

Steven,

You have just outlined the four tactics used by the alarmists to promote their dogma. This is because they have no science to debate with.

• #
Philip Shehan

Tell it to Griss. I have.

• #
Steven Mosher

Science is never advanced by namecalling, misquoting, strawmen and personal attacks.

• #
Truthseeker

Steven,

You have just outlined the four tactics used by the alarmists to promote their dogma. This is because they have no science to debate with.

You did forget the main alarmist tactic – lying.

• #
David Reeve

So knowing this to be true, why are folks insisting on playing the man, not the ball? Surely WC and what he stands for are well known? If this is a science blog, then what gives with the wearisome trotting out of shibboleths?

WC raised a serious objection upthread that Jo has not yet answered bar a spot of hand waving. The physicality of the notch and delay model is predicated on an assumption of dependence of global temperatures on TSI. It is not possible to drop this predicate without also losing the physicality. What you are left with is just another curve fitting exercise.

• #
crakar24

David,

You do understand what a notch filter is? Just in case you don’t, a notch filter is an electronic circuit that blocks (or, in its band-pass variant, allows) frequencies in a narrow band. The frequency range depends on the components used: if you sweep a 10 dB signal from 100 kHz to 900 kHz and your notch filter operates from 300 to 600 kHz, then you get a notch/hole/gap/whatever in the swept output where it blocks; if it allows, then only those frequencies will pass.

Now looking at the sun: if TSI is constant and temps on the surface are constant, then there is no filtering (notch). If, however, TSI fluctuates but the surface does not, that would suggest the peaks and troughs of the TSI cycle are filtered in some way.

WC’s problems begin with his premise. There is no mere assumption of dependence of global temps on TSI, for without TSI there would be no global temps… well, there would be of course, but somewhere near 0 kelvin.

Hope this helps
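crakar24’s description can be sketched numerically. Below is a minimal 2-pole/2-zero digital notch filter (zeros on the unit circle at the notch frequency, poles just inside it); the centre frequency and pole radius are illustrative choices, not values from anything discussed in this thread:

```python
import numpy as np

def notch_response(omega, omega0, r=0.95):
    """Magnitude response of a 2-pole/2-zero digital notch filter.

    Zeros sit on the unit circle at +/- omega0, so the notch frequency
    is blocked completely; poles at radius r just inside the circle
    set the notch width (r closer to 1 = narrower notch).
    """
    z = np.exp(-1j * np.asarray(omega, dtype=float))
    num = 1 - 2 * np.cos(omega0) * z + z ** 2               # zeros
    den = 1 - 2 * r * np.cos(omega0) * z + r ** 2 * z ** 2  # poles
    return np.abs(num / den)

omega0 = 0.3  # notch centre, radians per sample (illustrative)
print(notch_response(omega0, omega0))  # ~0: the notch frequency is blocked
print(notch_response(0.01, omega0))    # ~1: low frequencies pass
print(notch_response(2.0, omega0))     # ~1: high frequencies pass
```

The closer the pole radius r is to 1, the narrower the rejected band, which is the “dependent on components used” point in circuit terms.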

• #
Jaymez

I don’t want to drag this issue out as Moderators have indicated it is O/T, but you have a funny idea of what constitutes ‘hand waving’. Jo Nova’s responses were pretty specific. If you think she has missed something, then how about pointing to what you think she missed, or what you think was wrong in her responses?

Re your comment: “Surely WC and what he stands for are well known?”

I really don’t know how well known – but I was genuinely interested in seeing if WC had made any attempt to approach climate science in any sort of balanced way – even an attempt to ‘seem’ balanced. My research on Google showed that he hasn’t.

• #
bobl

It’s a model. For the purposes of testing its behaviour, we ignore where it came from and treat it as a black box: we pump time-domain data into the model and see what comes out. If what comes out approximates the climate, we postulate that the model may be representative.

It’s not too hard to understand if you try.
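bobl’s black-box procedure can be sketched as follows; the moving-average “true system” and the two candidate models are hypothetical stand-ins, chosen only to illustrate the scoring step, not claims about the actual climate or the notch-delay model:

```python
import numpy as np

def black_box_test(model, driver, observed):
    """Score a candidate model, treated as an opaque black box:
    pump the driver time series in, compare what comes out with
    observations via root-mean-square error."""
    predicted = model(driver)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

# Hypothetical toy system: the "observations" are a 5-step moving
# average of the driver (illustrative only).
rng = np.random.default_rng(0)
driver = rng.normal(size=500)
observed = np.convolve(driver, np.ones(5) / 5, mode="same")

good_model = lambda x: np.convolve(x, np.ones(5) / 5, mode="same")
bad_model = lambda x: np.zeros_like(x)

print(black_box_test(good_model, driver, observed))  # 0.0: output matches
print(black_box_test(bad_model, driver, observed))   # large error
```

Where the model came from never enters the test; only the fit between its output and the observations does.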

• #
the Griss

namecalling..

you mean like “denier” ??

Science has not advanced at all under the CGW meme, because all debate is shut down or refused.

And YOU are part of that.

As soon as a possible advance appears that might undermine the meme.. its full on attack to discredit the people involved, using lies and misinformation.

Has happened time and time again

• #
the Griss

I meant to write “climate Science has not advanced at all under the CGW meme”.

Real science is making good steady progress.

• #

> I was curious if you have made any attempt to criticise any of the UN IPCC orthodoxy.

I, and I’m sure our hostess, would rather not turn this into a thread about my opinions. But since the mods have let your rather lengthy question stand, I assume they’ll permit a brief reply.

The “2035” stuff was all a bit silly and certainly overblown; I discuss it at http://scienceblogs.com/stoat/2010/01/21/ipcc-use-of-non-peer-reviewed/ wherein you will certainly find criticism of WG II. The “2035” is best described in wiki: https://en.wikipedia.org/w/index.php?title=Criticism_of_the_IPCC_AR4&oldid=339040892#Projected_date_of_melting_of_Himalayan_glaciers.3B_use_of_2035_in_place_of_2350

> voodoo

> I looked

Did you? That seems unlikely, because had you googled “stoat 2035” you’d have easily found the posting I’ve just found for you.

> I didn’t find anywhere that you had questioned the IPCC position or the extreme claims made by alarmist scientists

Because you didn’t actually look. Try for example http://scienceblogs.com/stoat/2012/03/17/arctic-methane-emergency-group/

> Connolley created or rewrote 5,428 unique Wikipedia articles

Sigh. This is the standard silliness from people who are clueless about wiki. That 5k articles includes reverts of vandalism, spelling corrections, and any number of minor edits, as well as a number of more significant ones. How many of the 5k articles were significant? No-one has ever bothered to find out. You’re bandying around meaningless numbers. http://scienceblogs.com/stoat/2010/01/04/a-childs-garden-of-wikipedia-p/ explains some of this (yes, yes, I know you won’t read it; I live in hope though).

> claim the ‘science was settled’

https://en.wikipedia.org/wiki/User:William_M._Connolley/The_science_is_settled

> I noted many well-known climate skeptics who readily criticised uninformed and poorly justified arguments, or positions they disagreed with on the ‘skeptical side’ of the debate

That’s no surprise. It’s a target-rich environment.

• #
Rereke Whakaaro

William,

I know it is traditional to use propaganda to try to justify other propaganda, but you have to be good at it, you see, or people start to notice.

We have noticed, so I am afraid to say that you seem to have lost your touch, old boy. Very sad and all that, because such a thing can tend to influence future prospects; a rather dismal prospect, wouldn’t you agree?

• #
the Griss

In fact, it highlights that you are here for one purpose, and it is not a constructive one.

The WC has been EXPOSED, and it’s not a pretty sight.

It hasn’t been flushed, and it’s full of ****. !

• #
Jaymez

Well William, if anyone bothers to read your ‘rebuttal’ they will see how flimsy it is. You have just underscored how sneaky and underhand your approach is by writing a response which pretends to address the issues I raised when it offers mere deflections and delusions.

1. Re “the 2035 stuff” – What you presented hardly criticised the stupid claim; in fact it basically tries to cover it up, implying that it was a transposition error and should have been 2350. But the WWF grey paper the 2035 figure was taken from had 2035, and the IPCC readily claimed it. The reference you pointed me to wasn’t about you pointing out how silly the claim was. After they were well and truly sprung, you simply agreed that grey material shouldn’t be used.

2. Re the ‘voodoo science’ claim, you have pointed to a reference where you are not criticising Rajendra K. Pachauri’s claim that Indians were practising voodoo science when disagreeing with the IPCC claim of 2035. You personally are trying to denigrate work done by physicist and former ISRO chairman U.R. Rao. In fact you make yourself seem even more foolish with:

” In fact, the contribution of decreasing cosmic ray activity to climate change is almost 40 per cent, argues Dr. Rao in a paper which has been accepted for publication in Current Science, the pre-eminent Indian science journal. The IPCC model, on the other hand, says that the contribution of carbon emissions is over 90 per cent.

So far so boring. If I wanted to believe that kind of stuff, I’d have believed Svensmark 10 years ago or whenever it was (blah blah blah).”

Before then you also wrote: “The report would have been quietly forgotten but for the wild excitement over the 2035/2350 slip-up in the IPCC WGII report (which was itself based on some dubious Indian work, but never mind).”

If my memory serves me, wasn’t Dr. Rao shortly afterwards appointed to do some well-paid work for one of the bodies Pachauri controls?

Anyway, you were again simply trying to claim the IPCC had just transposed the 2035/2350 figure, which you know isn’t in fact true because the WWF paper referenced has the 2035 figure in it.

Sure, there was a totally unrelated paper which uses a figure of 2350, which you can dig up from somewhere and which may have been used by the WWF, but that’s not the paper referenced by the IPCC. In any event, as you know, that 2350 date was a pure guesstimate based on no calculation, so it would have been just as silly in the IPCC report. But you couldn’t bring yourself to make the criticism it deserved.

3. What I actually wrote in context was: “I didn’t find anywhere that you had questioned the IPCC position or the extreme claims made by alarmist scientists. Perhaps there was an example here or there, but they have been clearly swamped by your total devotion to and faith in the IPCC orthodoxy.”

The best you can do is drag up an article by the “Arctic Methane Emergency Group” (AMEG), which I don’t think even Al Gore could take seriously, and use your criticism of their claim:

“We declare there now exists an extremely high international security risk* from abrupt and runaway global warming being triggered by the end-summer collapse of Arctic sea ice towards a fraction of the current record and release of huge quantities of methane gas from the seabed. Such global warming would lead at first to worldwide crop failures but ultimately and inexorably to the collapse of civilization as we know it. This colossal threat demands an immediate emergency scale response to cool the Arctic and save the sea ice. The latest available data indicates that a sea ice collapse is more than likely by 2015 and even possible this summer (2012). Thus some measures to counter the threat have to be ready within a few months.”

Clearly that is bleedingly obviously stupid, so you hardly get any ‘skeptic’ points for criticising that!

4. I still note you weren’t able to show any real criticism you made of the other important issues I mentioned:

– the IPCC’s claim that the Himalayan Glaciers would disappear by 2035,
– temperature is rising “even faster than predicted”,
– or those claiming that sea levels rise is accelerating.
– Michael Mann’s temperature reconstructions….the ‘Hockey Stick’ graph, and disappearing MWP.
– the deleted portion of Keith Briffa’s temperature reconstruction.
– the fact that many predictions by climate scientists and used in climate models have been based purely on assumptions of climate feedbacks which have no empirical basis.
– the behaviour of a number of climate scientists who acted ‘unscientifically’ as revealed by the climate-gate emails.
– the difference between what scientist wrote in the main body of the IPCC Climate reports versus what appeared in the Summary for Policy Makers.
– the claim the ‘science was settled’ followed by years of research and ‘adjustments’
– the global average temperature wasn’t increasing at an accelerated rate even though human greenhouse emissions were.
– a best estimate for climate sensitivity, is no longer provided in AR5, yet the IPCC claimed their confidence level that humans were responsible for most of the warming since 1950 had increased to 95% without any data to support that.

So your attempt to show some ability to critique the IPCC line was extremely feeble to say the least.

5. And then you refer us readers to your own excuses for your heavy-handed, biased, monopolistic approach at Wikipedia, which you try to pass off by claiming it is “the std silliness by people who are clueless about wiki”. You are the only one you are fooling with that line. There have been far too many individuals who had their edits changed, or eminent individuals who had their profiles altered, and far too many inconvenient facts which you made disappear, for you to pass it off as just the complaints of people who are clueless about Wiki. Which incidentally would have to include the Wiki people who voted to suspend your activities – I suppose they were clueless too?

Come on William, you may have the ability to re-write history elsewhere, but you can’t get away with it here. All those who care to read your comments must know that you will happily stray as far from the truth as you need to just to make sure your dangerous AGW beliefs are supported!

[OK Guys, I think we get the point, lets get back on topic. – Mod]

• #

> 2035… hardly criticised

Yes. Because the fluff was grossly overblown; that was the point I was making.

> there was a totally… paper which uses a figure of 2350… used by the WWF, but that’s not the paper referenced by the IPCC

Indeed. AFAIK, the WWF transcribed 2350 into 2035, erroneously, and this was used by WG II, who really should have gone back to the source (and then thrown it out, because as I pointed out, the original source was poor). Wiki explains this.

> the ‘Hockey Stick’ graph

Why would I criticise THS? You lot have a consensual delusion about it, but it’s pointless trying to discuss it here. It’s not as if those discussions haven’t already happened multiple times. Ditto “the climate-gate emails”.

> the Summary for Policy Makers

I agree that it’s a shame that the SPM is somewhat watered down from the science in the main report. But I’m at a loss to know why you’d advance that on “your side”.

> accelerated rate even though human greenhouse emissions were

Forcing is log(CO2), roughly. So with a 2%-ish per year rate of increase, the forcing increases roughly linearly. Try to get your memes right.
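The log arithmetic here is easy to verify: a constant percentage growth in concentration gives a constant annual increment in forcing. A sketch using the common simplified expression F = 5.35 ln(C/C0) W/m² (the coefficient, the 280 ppm baseline, and the 2%/yr rate are illustrative, not figures from this thread):

```python
import numpy as np

# If concentration grows by a fixed factor g each year, C(t) = C0 * g**t,
# so ln(C/C0) = t * ln(g): exponential concentration gives linear forcing.
C0, g = 280.0, 1.02            # illustrative baseline (ppm) and 2%/yr growth
years = np.arange(0, 51)
concentration = C0 * g ** years
forcing = 5.35 * np.log(concentration / C0)   # simplified expression, W/m^2

# The year-on-year forcing increment is the same every year:
steps = np.diff(forcing)
print(steps.min(), steps.max())  # both ~5.35 * ln(1.02), i.e. ~0.106 W/m^2
```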

> increased to 95% without any data to support that

Sure there’s data. It’s in the WG1 report, nicely presented. If you can’t read it, I can’t help you.

> critique the IPCC line

I pretty well agree with the IPCC line. As you’d expect, I’d expect. So why would you expect heavy critique from me?

[Mods: I hope I’m allowed a shortish reply to J’s rather long post above, which you’ve allowed through. I’ll ignore any more in this vein, though.]

(Ok but please no more and that goes for others here too) CTS

• #
the Griss

glug, glug, glug. Sinking into the mire of your own creation. 🙂

• #
PeterK

William Connolley: The Mensa Invitational has come up with some new words lately and I found this one that describes you to a “T.”

Bozone ( n.): The substance surrounding stupid people that stops bright ideas from penetrating. The bozone layer, unfortunately, shows little sign of breaking down in the near future.

• #
Philip Shehan

Just a couple of points on Jaymez criticisms:

“I earnestly looked for some criticism of Michael Mann’s temperature reconstructions which were used to create the ‘Hockey Stick’ graph, and make the MWP disappear.”

In response to the hockey stick controversy, the US Congress commissioned a thorough investigation of temperature reconstruction by the National Research Council of the National Academies. This comprehensive report weighed in at 160 pages.

http://www.nap.edu/openbook.php?record_id=11676&page=R1

Far from demolishing hockey sticks produced by a number of groups using a number of proxies the report broadly accepts their validity (Chapter 11):

“Large-scale surface temperature reconstructions yield a generally consistent picture of temperature trends during the preceding millennium, including relatively warm conditions centered around A.D. 1000 (identified by some as the “Medieval Warm Period”) and a relatively cold period (or “Little Ice Age”) centered around 1700. The existence of a Little Ice Age from roughly 1500 to 1850 is supported by a wide variety of evidence including ice cores, tree rings, borehole temperatures, glacier length records, and historical documents.”

“I wondered whether you had shown any frustration that a best estimate for climate sensitivity, which is the most important climate change variable, was no longer provided in AR5, yet the IPCC claimed their confidence level that humans were responsible for most of the warming since 1950 had increased to 95% without any data to support that.”

The sensitivity question is one of the more complex areas in evaluating the magnitude of AGW.

IPCC AR5 does not give a single number for this value but a range of 1.5 to 4.5 °C.

The difficulty in pinning down the magnitude of the rate of warming with increasing CO2 concentration has no direct bearing on the level of confidence that CO2 is causing warming.

• #

Philip, yes there was a medieval warm period and a little ice age, but why did you not show anything relevant to the hockey stick, which would now have a long pause at 90 degrees attached to the end of it?

Here is a more relevant bit you did not quote.

The graph illustrating the trend, often called the hockey stick curve (reproduced in Figure O-4), received wide attention because it was interpreted by some people as definitive evidence of human-induced global warming. The ensuing debate in the scientific literature continues even as this report goes to press (von Storch et al. 2006, Wahl et al. 2006).

Why is it that you warmists can never be honest?

• #
the Griss

Why is it that you warmists can never be honest?

This is Philip…

Say no more. !

He’s down there with Mann as an amateur statistical mal-nipulator.

(Darn, Germany just scored !)

• #
Philip Shehan

Griss, just because you have no understanding of statistical error margins and never supply them with your trend lines (which you and I agree on) does not justify your accusation that I manipulate them.

• #
Philip Shehan

“why did you not show anything relevant to the hockey stick that would now have a long pause at 90 degrees attached to the end of it?”

Because Silliguy, even IF I accepted the premise of your question, I am responding to Jaymez’ criticisms of Mann’s graph as published in 1999. Whatever happened after that date is irrelevant.

There is nothing whatsoever in the passage you quote which contradicts my statement:

“Far from demolishing hockey sticks produced by a number of groups using a number of proxies the report broadly accepts their validity (Chapter 11):”

And again, the fact that the graph “was interpreted by some people” in a particular way, and that there is a debate about that interpretation, is entirely irrelevant to the implication of Jaymez’ comment that the graph itself is deceptive or wrong.

And YOU left out this bit:

Despite the wide error bars, Figure O-4 was misinterpreted by some as indicating the existence of one “definitive” reconstruction with small century-to-century variability prior to the mid-19th century.

The suggestion that this misinterpretation of an aspect of the graph is more relevant than the report’s conclusion I quoted is complete nonsense:

“Large-scale surface temperature reconstructions yield a generally consistent picture of temperature trends during the preceding millennium, including relatively warm conditions centered around A.D. 1000 (identified by some as the “Medieval Warm Period”) and a relatively cold period (or “Little Ice Age”) centered around 1700.”

“Science is never advanced by namecalling, misquoting, strawmen and personal attacks.”

• #
the Griss

” Because Silliguy”

“because you have no understanding of statistical error margins “

Apparently this cannot be said too often here:

“Science is never advanced by namecalling, misquoting, strawmen and personal attacks.”

Like you were saying , Philip. !!! 🙂

• #
Philip Shehan

Griss, I have never seen you use error margins or acknowledge their importance in your many posts involving linear regression. (Although when I use linear regression you call me a one-trick linear monkey, but when I use non-linear fits where appropriate you are just as abusive.)

That was a fair call considering the stuff you routinely write about me.

Ditto for the person who you echoed when he accused me of being dishonest which constituted “namecalling, misquoting, strawmen and personal attacks” on the basis of a nonsense reading of my original comment. Inferring that he is a silly guy was utterly tepid in response.

• #
the Griss

The humorous farce is that you come here quoting something as being against science advancement, then do exactly that thing..

DOH !!! I think the word is HYPOCRITE !!!!!! roflmao.. !

Me, I don’t give a rat’s what an insignificant like you thinks or calls me. Its just random yapping. 🙂

Oh, and I assume you are referring to your cracked recording of junior-high linear trend junk? Correct?

A third order poly fit, wasn’t it.. joke of the century……. roflmao !

• #
Philip Shehan

Thank you Griss for confirming my description of your habitual mode of argument, and your habit of accusing others who use linear regression, which you habitually use, of being linear monkeys while abusing them for using nonlinear fits where appropriate.

My comment that you have no understanding of linear regression is based on the fact that “I have never seen you use error margins or acknowledge their importance in your many posts involving linear regression.” And I have seen quite a few of these comments.

This is not a personal attack (on your character, honesty etc.) but a statement of your demonstrated failure to deal with this essential feature of regression analysis in your many comments on this, including personal attacks on me.

If you have ever posted on error margins which show that I am incorrect, please direct me to that comment.

• #

If I were you I’d just ignore tG entirely. Everyone else does. “If you argue with a fool, the chances are that he is doing the same”.

• #
the Griss

Team game hey, dumb and dumber

but which is which ?? 🙂

It seems you still haven’t understood enough maths to see that fitting linear trends to a chaotic cyclic field is a monkey’s game. !! DOH !!!!!

Now.. let see if EITHER of you can actually contribute something POSITIVE to the discussion. !!!!!!!!!!!!!!!!!!

• #
the Griss

And if the little SkS trend calculator went out to 3 sd’s about a centre pivot…

… you might even be able to get to 99.7% certainty..

Wouldn’t that be FANTASTIC !!! 🙂

• #
the Griss

Anyway, why would I waste my time on such a puerile meaningless activity, when I have a monkey I can just prod. 🙂

• #
Philip Shehan

Sound advice William. I usually do ignore him.

There is another saying: “Never wrestle with a pig. You will both get dirty but only the pig will enjoy it.”

• #
the Griss

“I usually do ignore him.”

Ohh….. were you getting lonely rambling to yourself on that week old thread, and decided to come to this one to say nothing here instead. !

• #
the Griss

“but only the pig will enjoy it.”

Yes.. and here you are.

• #
the Griss

“Ignore tG entirely. Everyone else does.”

But you can’t ignore me, can you WC. or you PS…

I am in your minds. (plenty of empty space in there, f’sure) 🙂

• #
Philip Shehan

“A third order poly fit, wasn’t it.. joke of the century……. roflmao !”

I have explained to him and others that in this context an equation used to construct the line fit neither requires nor implies a theoretical explanation.

A third order polynomial fit to temperature data is no more ludicrous than a linear fit, for which there is also no theoretical justification.

Its purpose here is an aid to the eye in determining the general shape of the line.

The curves which have the temperature increase accelerating with time are clearly superior to a linear fit.

http://www.skepticalscience.com/pics/AMTI.png

It is true that you could eyeball a curve fit, drawing a line through the data points. And experiments have shown that the human eye is extremely good at this.

However, with a mathematically fitted curve, you can calculate the R values – how far the data deviates from the accelerating curve as opposed to the straight line – which can mathematically confirm which lineshape better follows the data.

And my one complaint about Robert Way’s curve is that he does not tell us what the equation is.

It may not be a third order polynomial, but some other pseudo-exponential function which matches the general shape of the time-temperature and time-CO2 concentration curves.

http://tinyurl.com/aj2us99

The correlation with CO2 is not mathematically constructed.

The calculations I made of the relationship between temperature and log CO2 concentration were in fact based on linear fits of the temperature data to shorter time periods (from 1958 and 1979) where a linear fit, (for which there is no theoretical justification) is a reasonable approximation.
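The comparison Philip describes (fit two lineshapes and let the goodness-of-fit statistic decide) can be sketched on synthetic data; the quadratic-plus-noise series below is purely illustrative, not a real temperature record:

```python
import numpy as np

def r_squared(y, y_fit):
    """Coefficient of determination: fraction of variance explained."""
    ss_res = np.sum((y - y_fit) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Synthetic accelerating series (illustrative assumption).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
y = 0.8 * t ** 2 + 0.05 * rng.normal(size=t.size)

linear = np.polyval(np.polyfit(t, y, 1), t)
cubic = np.polyval(np.polyfit(t, y, 3), t)

# Neither lineshape carries a theoretical justification here; R^2 just
# measures which one follows the data more closely.
print(r_squared(y, linear))  # decent, but worse
print(r_squared(y, cubic))   # higher: the accelerating shape fits better
```

Since the straight line is a special case of the cubic, the cubic’s R² can never be lower; the question is whether it is enough higher to matter.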

• #
Philip Shehan

Griss:

“Ohh….. were you getting lonely rambling to yourself on that week old thread, and decided to come to this one to say nothing here instead. !”

You say that you are not interested in what I write, yet torrents of comments from you follow mine.

If you had not been sneaking over there and looking at what I wrote, how would you know whether or not I was posting on “a week old thread”?

• #
the Griss

“how would you know whether or not I was posting on “a week old thread”

Because you kept appearing on the recent comments list on the right of the page.. you clueless clown!!

• #
Philip Shehan

And here you are again Griss responding to a post from someone you claim to ignore, but clearly have an obsession with.

• #
Dark Distardly

Seems this comment by George Orwell is rather applicable to this William character, doesn’t it?

“Do remember that dishonesty and cowardice always have to be paid for.

Don’t imagine that for years on end you can make yourself the boot-licking propagandist of the Soviet régime, or any other régime, and then suddenly return to mental decency.

Once a whore, always a whore.”

• #
Lionell Griffith

Propaganda is an attempt to rewrite reality for others. It will eventually fail on every level. This is especially true for the only instrument man has to perceive reality: his rational mind.

The fact is that years of generating propaganda destroys the mind’s ability to see the truth clearly. Only the propaganda remains. A propagandist thereby becomes trapped by his own attempts to rewrite reality. The mind begins to believe its own lies and reality remains what it is. Reality always wins.

• #
the Griss

Don’t imagine that for years on end you can make yourself the boot-licking propagandist of the Soviet régime,

Nice…. That describes the WC to a tee !!! 🙂

• #
Philip Shehan

Apparently this cannot be said too often here:

“Science is never advanced by namecalling, misquoting, strawmen and personal attacks.”

• #
Philip Shehan

“Do remember that dishonesty and cowardice always have to be paid for.”

At least William appears to have the courage to put his real name to his opinions.

• #
Backslider

At least William appears to have the courage to put his real name to his opinions.

And you always like to squark “strawman”…. ughhh!

• #
the Griss

“And you always like to squark “strawman”…. ‘

He does own a mirror, I assume. 🙂

• #
Philip Shehan

There is no “strawman” in pointing out, in response to “Dark Dastardly”’s comment reflecting on people’s courage, that William uses his real name.

That “Backslider” and “the Griss” are sensitive on this point comes as no surprise.

• #
Jaymez

In reality though Phil, WC has shown he loves the attention and notoriety, so I wouldn’t put it down to courage. For instance I remember reading his own bio somewhere and in it he declared “I am famous” or words to that effect.

• #

Jaymez, I’d kiss ya if ya was me sister; but you’re not, so I won’t.

But you get my drift!:)

Well done.

• #
the Griss

Scary !!!!! 🙂

• #
Scott

WC’s biography written perfectly by Jaymez. Well done.

• #
Rereke Whakaaro

Nicely done, Jaymez. I tip my hat to you.

• #
Backslider

they have been clearly swamped by your total devotion to and faith in the IPCC orthodoxy.

Quite correct. WC on numerous occasions states that you are a “denier” solely on the basis that you do not accept what the IPCC has to say.

• #
Hamish McDougal

William C (Queen of the Wiki)

And if the Evans/Nova model out-performs most of the GCMs in prediction against scenario projection, will you become a denier skeptic?
Or is that too scientific for you?

• #
justjoshin

Why aren’t you in a wikipedia edit war? Oh yeah, you aren’t welcome there any more. For exactly the reason you are always down-voted here, you refuse to acknowledge contrary arguments.

Very likely wrong? Why would that be?

Any person with an iota of scientific training knows that the strength of a model lies in its ability to predict future results. The GCMs to date have failed miserably in this regard. Shall we perhaps look at the predictive power of this new model before dismissing it out of hand?

• #
Philip Shehan

Oh please. Some of us get “down voted” here for anything.

I am getting the thumbs down for a comment which consists entirely of a quote from Ms Nova. Needless to say, they are unaccompanied by any explanation of why they do not like her statement.

• #
the Griss

Your very presence pollutes the forum……. and stops logical discussion……

A totally negative influence

…..so what do you expect !

• #
the Griss

There is one common aspect of all the forums you have been banned from….. YOU !!!!!!!

• #
Raven

Oh please. Some of us get “down voted” here for anything.

I am getting the thumbs down for a comment which consists entirely of a quote from Ms Nova. Needless to say, they are unaccompanied by any explanation of why they do not like her statement.

Well done Philip,
You’ve got the hang of these strawmen, I see . . . and perhaps therein lies the explanation you’re grappling for?

All you have to do is connect the dots.
Strawmen = red thumbs.

Too easy . . .

• #
Philip Shehan

“Raven” How is my comment a straw man and how is it that when someone else posts exactly the same quote they get multiple green thumbs?

• #
Mikky

I still think this is going to be a train crash, but I hope that all concerned will walk away from it unscathed. I hope these comments will help:

1. Absence of 11-year signal does NOT prove that a notch exists.
For example, the effective filter could just have a NULL (e.g. from thermal inertia averaging),
or there could be another influence (e.g. magnetic) that cancels by being 180 degrees out of phase,
so there you have two possible explanations that do NOT involve a notch.

2. I’ve downloaded the TSI reconstruction data and implemented what I think is your notch filter (2 poles, 2 zeroes).
The notch filtering made very little difference to the data,
suggesting that the original data actually has much less 11-year signal energy than you think it does,
e.g. general noise and harmonics of the 11-year oscillation.
What does have a big effect is the lowpass filter.

It would be a shame to be shot down for something that you don’t actually need.

3. Releasing all the data and code, is that really a good idea at this stage?
99% of people will ignore it, 99% of people who change it will break it, 99% of people who run it will misinterpret its outputs.
I’d much rather have a short and concise algorithm specification,
reverse engineering spreadsheets is not my idea of fun.
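
For anyone wanting to repeat Mikky’s experiment, here is a minimal sketch of a standard 2-pole/2-zero notch applied to a synthetic annual series carrying an 11-year cycle. It is my own construction, not Mikky’s code: `scipy.signal.iirnotch` and the Q value are assumed choices, not anything from David’s spreadsheet.

```python
import numpy as np
from scipy.signal import filtfilt, iirnotch

fs = 1.0               # one sample per year
f0 = 1.0 / 11.0        # notch centre: the 11-year cycle, in cycles per year
Q = 2.0                # notch width; an assumed value

# a standard 2-zero / 2-pole IIR notch design
b, a = iirnotch(f0, Q, fs=fs)

t = np.arange(150.0)                          # 150 "years" of annual data
x = 0.005 * t + np.sin(2 * np.pi * f0 * t)    # slow trend + 11-year cycle
y = filtfilt(b, a, x)                         # zero-phase filtering

# amplitude of the 11-year component before and after the notch
probe = np.exp(-2j * np.pi * f0 * t)
amp_in = 2 * abs(np.dot(x, probe)) / len(t)
amp_out = 2 * abs(np.dot(y, probe)) / len(t)
print(round(amp_in, 2), round(amp_out, 2))    # the cycle is largely stripped
```

A series with no 11-year content, as Mikky reports for the TSI data, would pass through this filter nearly unchanged; only the slow trend survives here because the notch is narrow.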

• #

Mikky
I do not think it will be a train crash. Having worked on R&D efforts, I know it is normal to stuff things up a lot at first and then refine it bit by bit to produce a result that works. David and Jo have brought us all in at the prototype stage and things are proceeding as per normal, with a lot of good minds on the job.

If this question alone is answered it will have been worthwhile.

what solar model would account for it?

While others are crediting Willis for bringing the noise problem to the surface, I think it may have been you and other electronics (as opposed to electrical) people. Slowly people are beginning to see noise, distortion and the LPF as independently adjusting the notch attenuation. It will not be hard to introduce other frequencies into the spreadsheet and see the difference it really makes. I suggest this will help to quantify the noise, distortion and LPF.

David, being an electrical guy, may not have faced the noise problem or dealt with filters as often as electronics comms people. So I have a model idea that could put it all into more familiar territory for him. That idea is to try a three-phase system analysis instead: to look at it like an induction motor with a variable load.

Perhaps like:
Reference phase = long-term average solar cycle period.
Phase 2 = actual TSI.
Phase 3 = actual magnetic.

What does have a big effect is the lowpass filter.

Yes, looking forward to trying three: air, sea and planet.

• #
Skeptikal

Releasing all the data and code, is that really a good idea at this stage?

They’ve made a promise to release it. It would be hard for them to justify not releasing it now, unless they’ve found a fundamental problem with it.

Yes, some people are going to download it with the sole intention of breaking it… but others will get a better understanding of the theory behind it when they’ve had a chance to pull it apart and see how it really works.

If the theory is right, then it will withstand whatever is thrown at it.

• #
Rereke Whakaaro

… some people are going to download it with the sole intention of breaking it …

Of course. That is just human nature. But then, if they are to retain their own credibility, they will have to publish their own data and code to demonstrate the flaw, and then David (or others) can address the cause.

This is not new. Computer software designers have always worked this way. It is only the ivory tower academics, who are too protective of their egos, and too precious, to play nicely.

• #
Philip Shehan

Strangely, this is not the attitude “skeptics” take when it comes to climate models.

• #
sophocles

I’ve downloaded the TSI reconstruction data and implemented what I think is your notch filter (2 poles, 2 zeroes).
The notch filtering made very little difference to the data,
suggesting that the original data actually has much less 11-year signal energy than you think it does,

… because it’s been removed by the solar notch filter? I would expect data which didn’t have any 11-year cycle information embedded in it to pass through an 11-year notch filter with no change. It’s an obvious prediction. I’m sure David is comforted that your results replicate his.

Absence of 11-year signal does NOT prove that a notch exists.

It’s a very good indicator there may be one. When there is an indisputable, strong, measurable 11-year cycle modulating a specific property of the sun, yet there is no detectable signal nor residue left by that signal down here, it means there is a high-Q filter somewhere in the system between input and output.

For example, the effective filter could just have a NULL (e.g. from thermal inertia averaging),
or there could be another influence (e.g. magnetic) that cancels by being 180 degrees out of phase,
so there you have 2 possible explanations that did NOT involve a notch.

If either of your suggestions does what you say it could do, then it is a notch filter. It’s the result which counts here, not the mechanism. If it wipes all trace of a known frequency from the data, it doesn’t matter how it does it; it matters that it is done.

If it looks like a duck, walks like a duck, and quacks like a duck, I wouldn’t call it a seagull.

• #

sophocles
“suggesting that the original data actually has much less 11-year signal energy than you think it does,”

…because it’s been removed by the solar notch filter?

Can you see that this lack of an 11-year signal in the TSI places your duck on the sun? More accurately, it places the duck at 1 AU from the sun’s barycentre. Neither we nor the sun are often exactly at either end of this data-averaging artifact distance.

• #
sophocles

Can you see that this lack of an 11-year signal in the TSI places your duck on the sun?

I can’t see it being anywhere else. It’s a very hot, yellow, non-rubber ducky. 🙂
I did write solar notch filter as a hint…

• #
londo

Yes, there are clearly some strange arguments being thrown around. The clever observation is that the periodicity of solar activity is not present in the temperature data although it clearly should be. The question is: is the notch unique? Is there any other transfer function that would suppress the humming of the TSI but give a different step response that could lead to different conclusions? What is the space of possible transfer functions that could equally well explain the lack of TSI “print through” onto the temperature record? I have read the material, but perhaps not sufficiently well to realize that this has been answered.
I do love the fact that here is a theory that makes (somewhat) bold predictions and is falsifiable within half a decade. If its 11-year lag prediction turns out to be correct then this theory is Nobel prize material, and not for its complexity but for its simplicity.

• #

> the periodicity of solar activity is not present in the temperature data although it clearly should

That’s an assertion (“it clearly should”), not an observation. Why “should” it be present? Of itself, it’s a small forcing.

You can see the solar periodicity in some atmospheric variables – as various people have pointed out – but only weakly.

> is falsifiable within half a decade

Not according to DE. He says a decade: “Here’s the criterion: A fall of at least 0.1°C (on a 1-year smoothed basis) in global average surface air temperature over the next decade.”

• #
the Griss

No, there has been an 11 year signal reported in many papers at a regional level.

That shows that the sun is a major driver of the climate, yet there is not a solar signal at the global level.

* Something is smoothing and delaying the effect of the solar signal at the global scale.

CO2 CANNOT produce such an 11-year signal at any level. It is a vanishingly small forcing at best, and there is NO CO2 signal in the GAT whatsoever.

——

*Something is smoothing and delaying the effect of the solar signal at the global scale.

This model predicts that this is happening.

It is therefore much better aligned with reality than the current climate models.

The CO2 based agenda driven models have failed badly over the last 15-20 years, and if the temperature drops as predicted by this model, and by many other real scientists, the divergence will make even more of a mockery of them.

• #
Hamish McDougal

You have not yet replied to my 2.4. The problem with alarmism is ‘prediction‘.

• #
Rereke Whakaaro

I doubt he ever will.

If presented with a possibility that would cause him to “lose face”, he metaphorically puts his hands over his ears, and sings, “la-la-la-la, I can’t hear you”.

• #
bobl

Because, William, TSI when modulated via geometry (orbital eccentricity, axial tilt, and incidence angle) always modulates temperature. The question is why sunspots are DIFFERENT. How come the 0.1°C that sunspots should theoretically cause doesn’t show up, even when we use noise reduction of about 10x? When we know that, we will understand more about the climate.

• #

If you’re talking Milankovitch-type timescales then the reply is obvious: those are very different timescales. On short (11-year) timescales, the TSI gets integrated out by the ocean. That argument doesn’t apply to millennial timescales.

• #
bobl

Gee, William, last time I looked night was colder than day, and winter was colder than summer, so clearly TSI has an effect at 24 hours, and at 1 year too, but the special case of 11-year periodicity doesn’t. Why, William, why? When you can answer that question, you can challenge for the world championship belt.

• #

Why? Why not pause for thought before writing?

The insolation variations between night and day, summer and winter, are large. If the day/night variation was 0.1 W/m2, it wouldn’t have much effect.

• #
bobl

How do you know William, prove it…

Given that these large effects cause large temperature effects, it is reasonable to assume that a small effect will have a small temperature impact. In spite of the fact that David’s method will reduce noise by a factor of about 10 (or alternatively you can see this as an amplification of the signal), the expected frequency doesn’t appear. Doesn’t that strike you as odd?

So, on that basis, it is reasonable to hypothesise that TSI from sunspots SHOULD have an effect on temperature, and to build a model on that basis. What David is doing is, on that basis, not unreasonable, despite your protests.

• #
bobl

PPS: remember the assumptions. For small excursions of TSI the climate is linear (an effect in one gives rise to a proportional effect in the other) and invariant (if I make the same excursion twice I get the same effect twice).

• #
the Griss

Its a cumulative thing 😉

One day you may, perhaps, understand…

nah… permanently brainwashed.!

• #
the Griss

“Why not pause for thought before writing”

Why? You never do!

Parroting from the given meme. !!!

• #
Rereke Whakaaro

He can’t answer that question. And he can’t admit that he cannot, so he just goes away. That fact alone, speaks volumes about his psyche.

• #
bobl

Rereke,
Will has been arguing a strawman, not that he actually knows it: he tries to discredit the model by asking what if the climate isn’t related to TSI, but of course that’s not within the hypothesis. That discussion belongs to a different model.

Having said that, there is plenty of evidence that TSI in fact does modulate temperature over all timescales. If a cloud floats past, does it get cooler? You bet. Is 3 pm cooler than 2 pm? Yes, usually, even though the sun traverses only 15 degrees and cos 15° is 0.966, and despite the fact that thermal capacity exists in the system. Is March a different temperature from February? Well, yes, despite the fact that the angle of incidence changes by just 8 degrees and cos 8° is 0.990. So it’s patently obvious that small changes in TSI can and do cause changes in temperature!

• #
Greg Cavanagh

bobl; You’ve finally explained with accuracy what we felt we knew intuitively.
Thanks for that.

• #
londo

It is an assertion, but it’s also a quantifiable one. Even if it is a small forcing, the number of periods that we need to observe can be computed, no matter how small the perturbation is. Having said that, it is a bit of a simplification: the spectrum of TSI must be sufficiently “clean”, or periodic, for this to be true, or it can be hidden in the noise anyway. Still, this is a mathematical exercise of finite duration with a definite answer. I don’t know if Dr. Evans has done it, but it is the first thing I would do before attempting to formulate a theory on this “assumption”, and I presume that he has done it as well.

I’m not sure what you mean by “DE”, but the prediction of his theory is very distinct from everything else we have seen. As always, we have to do the math, but if we see that distinct temperature drop, especially if it happens within one year from 2015, then I believe the math of signal-to-noise, and whatever else we need to do to distinguish this observation from pure chance, is reasonably straightforward.
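
That finite mathematical exercise can be sketched with the textbook coherent-detection result, SNR_out = N·A²/(2σ²) for a known sinusoid of amplitude A in white noise of standard deviation σ. This is my formula choice, and the 13 dB threshold below is an assumed detection criterion, not anything from David’s material.

```python
import math

def samples_needed(amp, sigma, snr_target_db):
    """Samples of a known sinusoid (amplitude amp) in white noise
    (standard deviation sigma) needed to reach a matched-filter
    output SNR of snr_target_db, using SNR_out = n * amp**2 / (2 * sigma**2)."""
    snr_target = 10.0 ** (snr_target_db / 10.0)
    return math.ceil(2.0 * sigma**2 * snr_target / amp**2)

# a cycle whose power sits 10 dB below unit-variance noise (amp^2 / 2 = 0.1),
# pushed to an assumed 13 dB detection threshold
n = samples_needed(math.sqrt(0.2), 1.0, 13.0)
print(n)   # 200 samples
```

With monthly data that is under 17 years of record, so the observation length needed is computable, and finite, exactly as argued above.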

• #
Peter Whale

Hi Jo, just an old retired layman here, but your 11-year notch has a simplicity and good feel about it. My only observation is this: if the sunspot cycle were regular within itself you could get a pretty good hold on the pattern, but within the sunspot cycle it is multivariate. If each individual sunspot has an eleven-year notch it would mean that there would be a lot of compensatory elements forcing and nullifying over the full cycle.
The fact that you have put in a null hypothesis to falsify your model is real science.
Thanks for making me think.

• #
Joe

Jo, I think I made a comment early on (either Part I or II) questioning whether you would still ‘find’ a notch even if the temp data were just random. That was genuinely just an enquiry (and not a criticism) after thinking about the basic maths of it.

If I input a steady-state signal plus a single frequency into any low pass filter, I am bound to get a notch in the transfer function. That would not strike me as a ‘mystery’ but just as an impulse or narrowish pulse convolved with an LP function. To me, the term ‘mystery notch’ in the headlines just sounds like a slightly emotive way to say that the sole frequency ripple at the input does not appear at the output, which seemed pretty obvious in just the time domain data anyway. I think the transfer function is perfectly valid, but I was not sure what it was indicating.

I know the model details are more complete now and I will look more carefully at them, as I know David models the LP part(s) as a separate element. So the notch, or force X, really just explains the ‘extra’ cancellation necessary for that 11-year component to be cut out, as that 11-year cycle is a ‘little’ under the LP roll-off at 5 years or so and would not be eliminated by the LP alone. Force X could very well be multivariate and be some function of, say, radiation frequency or magnetic field, as David speculates; our TSI figures don’t show spectral distribution. I think the real understanding must come with understanding the still-unknown force X element in the transfer function.

• #
jim2

The crux of the issue is this: is it reasonable to expect that the 0.1% variation of TSI will appear in the temperature record? 0.1% is very small and could easily be swamped by noise in the temperature record. This to me is the critical issue, because your method will work only if that signal has a chance of being detected. If that signal is simply getting lost in noise, then the notch is meaningless. If the signal is being cancelled by “force X”, then the notch is meaningful.

So, to sum up, the potential problem here is the very small magnitude of variation of TSI at TOA.

• #
bobl

Yes it is. The autocorrelation over 150 years of data will average out uncorrelated signals, and should reveal any periodicity in the temperature data that is buried up to about 10 dB below the noise floor.
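
A sketch of that claim (my own construction, using a periodogram, the Fourier twin of the autocorrelation via the Wiener–Khinchin theorem): an 11-year sinusoid buried about 10 dB below unit-variance white noise in 150 years of monthly samples is still recoverable.

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
fs = 12.0                  # monthly sampling, in samples per year
years = 150
n = int(years * fs)        # 1800 samples
t = np.arange(n) / fs      # time in years

f0 = 1.0 / 11.0            # the 11-year cycle, in cycles per year
amp = 0.45                 # power ~10 dB below the unit-variance noise
x = amp * np.sin(2 * np.pi * f0 * t) + rng.standard_normal(n)

f, psd = periodogram(x, fs=fs)
peak = f[1:][np.argmax(psd[1:])]    # skip the zero-frequency bin
print(round(peak, 3))               # close to 1/11 ~= 0.091 cycles per year
```

The averaging over many cycles is what lifts the buried tone above the per-bin noise floor; a shorter record, or a cycle whose period wanders, would weaken the peak.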

• #
jim2

So has the 11-year signal been proven to exist in the global temp records?

• #
bobl

It has been seen according to various papers, however it is not very clear, though it ought to be. So David invents (hypothesises) the existence of a counterphase forcing that offsets warming. This is perfectly valid science: for science to advance, a proposition has to be made; a “maybe it works this way” argument needs to be proposed and tested. The filter is David’s leap. If David’s model works then there is some evidence that he’s right about that, and climatologists can crawl over it for 50 years arguing about what force X might be, instead of lobbying governments to tax us to death. Arguably a more useful use of their time.

• #
Joe

At this point though, force X can still embody a CO2 effect, or any effect really. I think someone made the comment that there was no 11-year signal or mechanism in the CO2 component, but that is not really the case in the context of the transfer function. If there is some CO2 mechanism (part of force X), then if that is excited with an 11-year component it is likely we will see an 11-year component in the transfer function. There does not need to be a notch or resonance phenomenon within the CO2 mechanism itself; a two-pole low pass gives the necessary 180-degree phase shift once you are well beyond the poles. The associated (CO2-related) LP pole would only need to be lower than the 11-year figure. Force X even embodies any human-related forcing (not saying CO2), such as figure tweaking of the data; it embodies all things unknown.

• #
bobl

While technically true, CO2 effects are unlikely to be lagged 11 years, and even less likely to be synchronised to the solar cycle (except perhaps an outgassing component). CO2 is not likely to play a role in this; more likely it’s a confounding factor (part of the mismatch between the model and reality).

• #
Richard C (NZ)

jim2 #7.1.1

>”So has the 11-year signal been proven to exist in the global temp records?”

Yes, including GISTEMP global average.

David has yet to address the evidence (12 papers at least). See #20 downthread here:

http://joannenova.com.au/2014/06/are-transfer-functions-meaningless-the-white-noise-point-beware-your-assumptions/#comment-1498508

• #
Richard C (NZ)

>”including GISTEMP global average”

See Souza Echera et al (2012), Table 2, #73.4 here:

http://joannenova.com.au/2014/06/big-news-part-vii-hindcasting-with-the-solar-model/#comment-1496534

• #
Richard C (NZ)

Re #7.1.1.2.1

>”See Souza Echera et al (2012), Table 2″

Global: 2–2.8, 3.7–6.6, 7.7, 8.3, 9.1, 10.4, 11.5, 20.6, 26.3, 29.6 and 65 years.

The “11 year” cycle is just the nominal average periodicity; individual cycles run both shorter and longer than 11 years.

Souza Echera et al identify 10.4 and 11.5 years, i.e. the “11 yr” cycle has dual periodicity either side of the average.

And for the naysayers, they also identify 65 year periodicity in GISTEMP.

• #
Philip Shehan

I don’t like the word “proven” applied to empirical science but here is one example where it helps explain the temperature record:

http://www.sciencemag.org/content/326/5960/1646/F8.expansion.html

• #
Philip Shehan

Which brings me to ask why you need a filter to subtract the frequency associated with the 11-year solar cycle when you can simply subtract the temperature contribution from the temperature record.

• #
the Griss

Let me guess..

You have never done any electrical engineering analysis… correct?

• #
Philip Shehan

Griss,

Let me think a minute.

Notch filters. Electronics. Frequencies. Capacitors. Inductors. 4 dB attenuators. Transmitter/receiver coil. Tuning range 12.1 -13.6 MHz, Fourier transformations…

Hmmm.

Kind of rings a bell.

Oh yeah. I remember now.

Chapter 10

Construction and Testing of a High Sensitivity Transverse coil 95Mo probe.

PhD thesis, B.P. Shehan, 1985.

Also in

Applications of 95Mo NMR Spectroscopy. XIV. Construction of a Transverse Probe for the Detection of Broad 95Mo Resonances in Solution Studies On Binuclear Mo(II) Compounds

B.P. Shehan, M. Kony, R.T.C. Brownlee, M.J. O’Connor and A.G. Wedd

J. Magn. Reson. 63, 343, (1985)

http://www.sciencedirect.com/science/article/pii/0022236485903257

• #
Mark D.

Thumbs up earned Philip.

Griss you have to hand it to him that is a classic burn. 😉

• #
Philip Shehan

Thanks Mark. I said to William that ignoring Griss is good advice, but occasionally he bends over with a kick me sign on his rear end and I just could not resist.

• #
the Griss

Thanks for the confirmation that you DO actually know something about electronics.

I can only assume that you are PLAY-ACTING, PRETENDING you are ignorant, with your moronic posts showing zero understanding.

I have often suspected that, I mean no-one with the qualifications you SAY you have could be so dumb !!

• #
Eddie

Ever since Mann’s Hockey Stick function was shown to produce, well a Hockey Stick, from almost anything that was fed in including noise, the possibility has been topical.

Absolutely, a tool is to be used with its intended purpose in mind, and Mann’s was indeed a tool for a purpose.

• #
Philip Shehan

More from the report to US congress on hockey sticks:

http://www.nap.edu/openbook.php?record_id=11676&page=R1

With regard to the criticisms of McIntyre and McKitrick the report states:

“As part of their statistical methods, Mann et al. used a type of principal component analysis that tends to bias the shape of the reconstructions. A description of this effect is given in Chapter 9. In practice, this method, though not recommended, does not appear to unduly influence reconstructions of hemispheric mean temperature; reconstructions performed without using principal component analysis are qualitatively similar to the original curves presented by Mann et al.”

• #
A C Osborn

Jo, I am sorry that this is off topic, but I have discussed this with you before. The temperature data sets are inaccurate. There is a very important thread on WUWT where Anthony Watts has admitted that Stephen Goddard has been correct all along: the USHCN data has been manipulated by the keepers of the data, cooling the past and warming the current period.
Not enough climate forums are picking up this story, which is huge, as it has turned a currently cooling globe into a warming one by infilling data on very large numbers of stations, some of which are no longer even in use.
http://wattsupwiththat.com/2014/06/28/the-scientific-method-is-at-work-on-the-ushcn-temperature-data-set/

The comments of the defenders of the “faith” are also very enlightening.

• #

Similar issues can be found in Dr Marohasy’s new paper on Australian data: http://jennifermarohasy.com/wp-content/uploads/2014/06/Marohasy_Abbot_Stewart_Jensen_2014_06_25_Final.pdf and NIWA abandoned their NZ data when sprung by court action. It’s probably a world wide phenomenon.

• #
Philip Shehan

A C Osborn.

Hansen et al’s 1999 paper explains the adjustment process.

http://pubs.giss.nasa.gov/docs/1999/1999_Hansen_etal_1.pdf

Some of the adjustments decrease later temperatures relative to earlier ones.

For instance adjustments made to take into account increasing urban heat island effects. Fig 3 shows these adjustments for Tokyo and Phoenix. Also A2 for the US.

And:

“We also modified the records of two stations that had obvious discontinuities. These stations, St. Helena in the tropical Atlantic Ocean and Lihue, Kauai, in Hawaii are both located on islands with few if any neighbors, so they have a noticeable influence on analyzed regional temperature change. The St. Helena station, based on metadata provided with MCDW records, was moved from 604 m to 436 m elevation between August 1976 and September 1976.

Therefore assuming a lapse rate of about 6°C/km, we added 1°C to the St. Helena temperatures before September 1976.

Lihue had an apparent discontinuity in its temperature record around 1950. On the basis of minimization of the discrepancy with its few neighboring stations, we added 0.8°C to Lihue temperatures prior to 1950.”

Others are neutral adjustments, such as the exclusion of outlier temperature data, those that are greater than 5 sigma warmer or cooler.

Then there is the adjustment of discontinuities in overlapping data, where the adjustment is to the average of the difference:

“Figure 2. Illustration of how two temperature records are combined. The bias T between the two records is the difference between their averages over the common period of data. The second record is shifted vertically by T and T1 and T2 are then averaged.”

There is no evidence that the adjustment process is biased toward showing greater warming in later years.
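
The combining scheme in that Figure 2 quote can be sketched in a few lines; the station values below are invented purely for illustration.

```python
def combine_records(rec1, rec2):
    """Combine two overlapping records as in the quoted Figure 2:
    shift the second by the bias dT (the difference of the averages
    over the common period), then average the two where both exist."""
    common = sorted(set(rec1) & set(rec2))
    dT = (sum(rec1[y] for y in common) / len(common)
          - sum(rec2[y] for y in common) / len(common))
    shifted = {y: v + dT for y, v in rec2.items()}
    out = {}
    for y in sorted(set(rec1) | set(shifted)):
        vals = [r[y] for r in (rec1, shifted) if y in r]
        out[y] = sum(vals) / len(vals)
    return out

# hypothetical station records, year -> reading; the second runs 0.8 low
rec1 = {1948: 14.1, 1949: 14.3, 1950: 14.2}
rec2 = {1949: 13.5, 1950: 13.4, 1951: 13.6}
combined = combine_records(rec1, rec2)
print(round(combined[1951], 2))   # 14.4: record 2 shifted up by dT = 0.8
```

Note that the shift aligns the records relative to each other; it cannot, by itself, bias the combined trend either way, which is the point at issue.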

• #
A C Osborn

I suggest that you actually go and read the 3 WUWT Threads and Steve Goddard, Paul Homewood, SunshineHours and many others who have actually looked at the data before trying to waft away what is one of the scandals of the Century. NCDC are corrupting the data on a massive scale.
I have seen it with my own lying eyes, so don’t try to bullshit me with your “official line” of crap.

The current final temperature data bears no relationship with reality; any analysis done using it, or any other data set based on it, is also corrupted.

• #
Rereke Whakaaro

This is obviously another example of the pragmatic differences between practical engineering and theoretical science.

Engineers will know the nominal accuracy of the collection mechanism, and will have an idea about the tolerance ranges of the measurements. They will also know that what you have, is the best you can possibly get. So you work with what you have, and calculate the error bars.

Climate Scientists seem to think that you can improve the accuracy of measurement by fiddling with the numbers after the event. In fact they go as far as suggesting that you can express temperatures in hundredths of a degree, when most physical equipment can only be read with an accuracy of a tenth of a degree, or worse. Even electronic equipment will require periodic calibration.

So, you can fiddle with numbers if you like, but you cannot improve their accuracy. You could, however, adjust them to suit what you might expect them to say, in the belief that your judgement will be much more accurate than any empirical measurement.

The point that, “Some of the adjustments decrease later temperatures relative to earlier ones.”, is totally immaterial. The numbers have been fiddled with. The books have been cooked. And any suggestion that an independent audit should be conducted, is met with violent outbursts, and vitriolic abuse.

That last fact, in itself, tells you all you need to know about the veracity of the Climate Change Industry.

• #
Bernie Hutchins

You said:

“It is trivially obvious that the transfer function will find a relationship between entirely unrelated time series, as any mathematical tool will when it’s misapplied.”

If this tool (transfer function) is applied to “unrelated time series” it is prima facie already misapplied. It (“transfer function”) doesn’t simply mean (as David uses the term) a ratio of Fourier transforms. That ratio can always be computed, and may or may not have meaning. The term “transfer function” implies that a connecting mechanism between input/output is KNOWN to EXIST, and is being characterized.

If you claim this ratio to be the “frequency response” of some (Laplace-domain) transfer function, we should not have to assume this, even provisionally. Instead, the mechanism of the transfer-function connection should be stated alongside its first mention; otherwise it misleads. A few sentences of explanation should suffice initially.
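
Bernie’s distinction is easy to demonstrate. In this sketch (mine, not Bernie’s), the ratio of Fourier transforms of two entirely unrelated white-noise series computes without complaint; nothing in the arithmetic certifies a connecting mechanism:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 512

x = rng.standard_normal(n)   # "input": white noise
y = rng.standard_normal(n)   # "output": unrelated white noise

# this ratio can always be computed, for any pair of series...
H = np.fft.rfft(y) / np.fft.rfft(x)

# ...but it is only a transfer function if an x -> y mechanism is known
print(H.shape)   # (257,): one complex value per frequency, meaning not included
```

The hypothesis that TSI drives temperature is exactly the extra ingredient that promotes such a ratio from mere arithmetic to a transfer function, which is the point the post’s “assumptions first” section makes.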

• #
bobl

This point has been made over and over, The hypothesis is

That TSI is related to temperature (either directly or as a proxy for something else)
That for small excursion of TSI the climate ( the response) can be considered roughly linear and invariant.

If either of these conditions fail then the model won’t work, that is – it won’t forecast

This model is only valid if these two conditions are true, and David has clearly stated this. We are NOT discussing models outside this presumed reality. If you don’t believe one or the other, then off you go and argue about someone else’s model for which your presumed reality is valid.

• #
Bernie Hutchins

I guess I probably agree with you. You are using the term “proxy” in a sense that means an indirect linkage – right?

• #
bobl

Almost. In the broader sense, TSI might just be related to the real mechanism. For example, cosmic rays modulated by the sun’s magnetic field might be the real mechanism for which TSI is a proxy measure, through a common cause for both (the sun’s magnetic-field variation).

• #
bit chilly

jo, well done on maintaining your sense of humour. there have been some big chunks of humble pie consumed recently due to people who really do know better skim reading; it would seem human beings really do struggle to learn from prior mistakes. i do hope that despite some friction, friends will remain friends regardless of differing opinions.

i have no idea how this is going to pan out, but i really am glad there are people willing to commit their personal time, effort and money in the pursuit of a noble cause. the world really would be a very different place if people like your good selves and others took the option of an easy life.

• #
steverichards1984

People should not worry about the notch. Any system with a frequency-sensitive delay will exhibit a notch effect. Where could such sensitivity come from? The mass of the sun and earth, the distance between the two, and how quickly the earth reacts to any changes from the sun.

• #
Ross

I admire your patience Jo and I particularly like your last two paragraphs above. Some of the responses you are getting clearly indicate to me, you and David are ruffling a few over inflated egos and deep seated worries. Keep up the good work.

• #
CC Squid

NOAA’s Temperature Control Knob Makes 1936 The Hottest Month…

http://wattsupwiththat.com/2014/06/29/noaas-temperature-control-knob-for-the-past-the-present-and-maybe-the-future-july-1936-now-hottest-month-again/
Two years ago during the scorching summer of 2012, July 1936 lost its place on the leaderboard and July 2012 became the hottest month on record in the United States. Now, as if by magic, and according to NOAA’s own data, July 1936 is now the hottest month on record again. The past, present, and future all seems to be “adjustable” in NOAA’s world. See the examples below.

• #
TedM

Sorry, this is off topic, but a Sebastian Lüning presentation sees possible 0.2°C cooling by 2020.

Previous presentation with audio to WUWT here: http://www.youtube.com/watch?feature=player_detailpage&v=1kdxxbSmUwQ

• #
the Griss

A drop of 0.2ºC will just be a nuisance for crops, weather events, etc..

CO2 unfortunately doesn’t appear to help plants cope with the cold..

….and strong weather events are driven by atmospheric differences, which are exacerbated by a cooler climate….
(as is evident in the drop in extreme weather events over the slight warming period we may have had)

I’m a tad worried it may drop quickly by 0.5ºC, which would cause major issues for world wide food supplies.

A slight warming would be a far better scenario. But seems we may be out of luck 🙁

• #
gnomish

so the sun modulates a tide of cosmic dust and the dust nucleates clouds?

• #
sophocles

What sort of cosmic dust? There’s a lot of it out there!

• #
Steven Mosher

“And for those who are impatiently waiting the full working model, we’re working on it. There are a few last-minute things to sort out. The spreadsheet used data to August 2013 in the investigation, and was frozen months ago with that data. That’s the copy that is available to people who got advance notice. Now that we are releasing it, it would be nice to update the data, while preserving the original calculations. David is copying the Aug 2013 data and updating all the data. We are also figuring out the creative commons conditions that would be workable, and deciding how to manage suggestions, adaptations, and modifications. We suspect the normal open source software sites don’t deal with 20Mb Excel files which people can modify, but which are very difficult to track changes on (does anyone know of a similar project?). ”

huh? When you publish something open source you are giving people the right to do whatever they damn well please with your code and data, provided they give you attribution. Once you publish it open source, interested people will download it and make their own changes, modifications, improvements and derivative works.

I don't know what possessed you to use Excel. I imagine, when time permits, the first job will be to put it on a better, more open platform.

• #
the Griss

I don't know what possessed you to make such a meaningless comment!

• #
Steven Mosher

How do you choose a CC licence:

https://creativecommons.org/choose/

not rocket science. not even a transfer function.

• #
the Griss

So.. nothing that David and Jo would not be aware of… OK.

As I said… a meaningless comment.

• #
steven mosher

It's not that hard: answer two questions.
So again, why write a sentence in a post about a choice that one is puzzling over, when the choice is quick and simple?

In short, one can choose a licence in less time than it takes to write that one is puzzling over a licence.

On the other hand, if one is hunting for excuses for delay then any one will do.

Here is a thought. They have already released the material. Release it to everyone else under the same terms and conditions.

Or here is a radical thought: do what "data hiding" Mann did. Post it free and clear, public domain. I mean, if Mann can figure it out, surely Jo can.

• #
Scott

Funny Steve,

it didn't matter what software you used to create the so-called BEST data set (an oxymoron if ever there was one); it is still garbage in. Meddle with the garbage and, guess what, double garbage out.

• #
the Griss

That’s why they had to hire a salesman. 🙂

• #
crakar24

I used to think I knew a lot about transfer functions until I saw my son's 2nd-year mechanical engineering maths exam (past exams used for practice). I doubt many people here have this level of competency.

Also why use Excel? Well, they could have used MATLAB, I suppose. MATLAB is a far more powerful tool, can handle the larger file sizes much better, and you can create your own scripts to do whatever you want. Unfortunately, in order to use MATLAB proficiently you need to do a week or so of training and you need to buy the product, whereas Excel is simple to use and comes with a copy of Windows.

I suspect you are a dancing penguin fan (Linux) but I won't hold that against you.

Cheers

• #
Chuck Nolan

Steve, they used Excel to keep Phil Jones away.
cn

• #
sophocles

Steven Mosher says:

I don't know what possessed you to use Excel. I imagine, when time permits, the first job will be to put it on a better, more open platform.

I sort of agree with you on that point. I neither own, possess nor use any proprietary software, especially that emanating from Redmond.

crakar24 says:

Also why use Excel? Well, they could have used MATLAB, I suppose.

Also proprietary. Both Excel and Matlab impose a price/licensing barrier with attached conditions I disagree with.
So I don’t use them.

crakar24 also says:

I suspect you are a dancing penguin fan (Linux) but I won't hold that against you.

I’ve been a Microsoft(tm) Free Zone since 1994. Unix (and all its derivatives, like Linux, but I prefer OpenBSD) are far more productive than anything from MS. But that’s way OT and should stay there.

• #
Steven Mosher

Some people are claiming that the transfer function is meaningless because you could use white noise instead of temperature data and get the same notch.

Manners makes no difference to the scientific method, but ultimately the human practice of the Scientific Method is only ever advanced by … humans, and manners do matter. Science is never advanced by namecalling, misquoting, strawmen and personal attacks. Please quote us exactly, eh?

#############

It's rather funny to contrast the first sentence, which attributes an idea to a "someone", with the final plea for exact quotes.

Funny. That is all.

• #
the Griss

Science is never advanced by namecalling

Which is why climate science has not advanced one iota since the “denier” tag was first used. !!

Until now.

You and your ilk, with the “settled science” mantra, have stalled knowledge of the climate for 2 or 3 decades.

Do you feel proud ????

• #
the Griss

ps… yes I know you started out actually thinking… but your employment as mouthpiece for BEST has muddied and stalled your mind.

You are stuck in a pit of primordial ooze. !!

• #
steven mosher

err

"ilk" is name-calling in my book.
science is never settled.
I think we ought to frack with wild abandon so poor folks can have cheap energy.
The EPA needs to go
end subsidies for wind and solar.

What 'ilk' are you talking about?

But yeah, like David Evans I think that CO2 (and other GHGs) explains the warming.
He thinks more of the warming is explained by solar; I think less of the warming
is explained by solar.

Please define the bright line between his ilk and my ilk

1. If I say that CO2 explains less than 50% (like David), do I belong to his "ilk"?
2. If I say 45%, am I his "ilk"?

What bright line separates those who argue that CO2 explains 25% of the warming from those who argue it's really more like 50%?

Now of course, David is no sky dragon arguing that it's impossible for CO2 to have any effect, so he builds a model that attributes, say, 25%. Maybe 5 years from now, based on new data, he has to change his mind to 35%. Is he now a warmista? Maybe in 10 years his
model is just slightly out of bounds... and to spare it from falsification he tweaks it...
and argues that 45% of the warming is CO2. The data could drive you to that number; the science isn't settled.
Hell, look: Archibald is saying he may have to give up his solar model for David's solar model. Does Archibald argue that it's not the sun merely because his model is off? And Scafetta says something like 40% of the warming is CO2-based. In 10 years, if his data drives him to change this to 60%, is he suddenly my ilk?

There are those of us, lukewarmers, who accept a broad range of roles for solar and GHGs. A broad range. And as data comes in and models get tweaked or expanded, that range may contract (it's possible) or expand (it's possible). It may do this because the science isn't settled. And because the science isn't settled, you will have a hard time using our scientific position to define our ilk.

So I am David's ilk. I think the science isn't settled. I think solar plays a role and GHGs play a role. Of course we are the same ilk. Of course we are on the same team.
Oh, I disagree with his methods and his specific results, but from where I stand we are on the same team.

• #
the Griss

Nice to see you twisting and turning like a used car salesman. 🙂

• #
the Griss

Truly in your element ! 🙂

• #

Maybe if you made your views clear in an unambiguous way more often, you would avoid less careful people using the word "ilk"?
All I know about you is that you helped me learn R… so I'm certainly not a hater.

• #
Rod Stuart

Is the Draconic cycle a candidate for Force X, or at least a bit player?

• #

Rod. Thanks for that link. On the same page scroll down to …

two ways that the Jovian and Terrestrial planets can influence bulk motions in the convective layers of the Sun.

and ask yourself, while you read it, whether the notch may be stronger every second 11-year cycle.

• #
Richard C (NZ)

>”So is the discovery of a notch filter useful”

No, but the N-D is (see below).

It was a discovery based on inadequate analysis. David didn't find 11-year cyclicity in GISTEMP globally averaged surface temperature, but Souza Echera et al (2012) not only found 11-year cyclicity in the GISTEMP global average, they also found it in 9 of 11 latitudinal GISTEMP analyses.

I’ve presented 12 papers showing 11 year cyclicity in temperature including the global average of the GISTEMP series. David has been sidestepping this, latest at #59.2.1.1.1 in Part VIII here:

http://joannenova.com.au/2014/06/big-news-viii-new-solar-model-predicts-imminent-global-cooling/#comment-1498087

To which I’ve replied in 3 comments with the evidence (again) as above.

>”So is the discovery of a notch filter useful”

Yes, the N-D is useful because a non-causal (acausal) 11 year lag comes close to calculations of planetary thermal inertia and therefore (causal) thermal lag.

David has effectively mimicked, to a degree, planetary thermal inertia with an 11-year lag in the N-D model.

The thermal inertia of the ocean would indicate a lag longer than 11 years has a damping effect (N-D too sensitive to TSI) but that’s for time and climate to adjudicate.

The N-D model is incomplete however, without the introduction of 65 year (+/-) periodicity (PDO/AMO).

• #
bobl

There is barely enough data to extract 65-year periodicity; Nyquist has a say in this. The S/N on such a signal would be poor; however, if it's there the method would extract it, given enough data.
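bobl's resolution point can be made concrete with a small sketch (the 134-year record length is an assumed, illustrative figure, not taken from the post): the DFT of a record that short has very coarse frequency bins near a 65-year period.

```python
# A DFT of an N-year record resolves frequencies in steps of 1/N per year,
# so the resolvable periods are N/k for whole numbers of cycles k.
N = 134                    # record length in years (illustrative assumption)
target_period = 65.0
k = round(N / target_period)            # whole cycles in the record
neighbours = [N / (k - 1), N / k, N / (k + 1)]
print(k, [round(p, 1) for p in neighbours])
```

With only about two whole cycles in the record, the adjacent resolvable periods jump from 134 to 67 to roughly 45 years, so a 65-year signal cannot be sharply discriminated from its spectral neighbours, whatever tool is used.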

• #
Richard C (NZ)

bobl #20.1

>”There is barely enough data to extract 65 year periodicity,”

You don’t think H.Luedecke, A. Hempelmann, C.O. Weiss; Clim. Past. 9 (2013) p 447, achieved that?

• #
bobl

Richard,
I don't know if he did. I just know that there are barely 2 full cycles of 65-year periodicity, and so the method that David employs won't discriminate this from the noise very well, especially if there is wavelength variability in the 65-year signal.

• #
Greg Goodman

” Pope that the bible is a totally fictional document it would have a better chance.”

He probably knows better than most that it is. He’s head of the organisation that wrote most of it. 😉

I've already explained to you that the paper does nothing but model the step between the beginning and end of the data, wrapped around to the end.

64 is simply a DFT sub-multiple of the data length, not a natural feature of the data.

There probably is energy in climate at around 60 years, but most of the amplitude they fit is to reproduce the step.

That was a particularly inept attempt at Fourier analysis, unfortunately.

• #
bobl

OOPS, not a Nyquist issue actually.

• #
Greg Goodman

David has effectively mimicked, to a degree, planetary thermal inertia with an 11-year lag in the N-D model.

I generally agree with that.

• #
handjive

A US team has modelled the effects of a limited conflict in the light of new concerns over weapons proliferation.

For an exchange of 100 Hiroshima-sized bombs, the modelling suggested there would be millions of deaths, as well as climate cooling and ozone damage.

It was Professor Turco who first coined the phrase “nuclear winter” back in the 1980s to describe the climate shift that would result from a war using atomic weapons.

December 2006

• #
MikeO

I know WC of old, the one who took up with a religious fervor the mission to change or delete anything he did not like on Wikipedia. He even edited entries by authors about themselves to change the meaning, discrediting Wikipedia in the process. See Pointman's "the scorning of william connolley". Why aren't we doing that here, instead of wasting time on him? Let us instead try to convince the Pope that the bible is a totally fictional document; it would have a better chance.

The likes of WC are a real threat to this project. When the spreadsheet is distributed, how can the integrity of the data and source be protected? An army of opposition is out there waiting to destroy any attack on their position. What if they take your work, flood the internet with corrupted versions, and then use that to discredit the whole hypothesis? Please do not distribute in haste; this must be carefully considered. How do you publish it in a very hostile environment?

• #
David Reeve

For goodness sakes, what’s with the culture war? Science will advance because it is a construct that will eventually shrug-off this sort of BS. David, Jo and Connolley are mere players in the process of science. Where all eyes should be focussed is the ball.

• #
MikeO

The WC I talk of should not be allowed to be part of this. He censured every thing he could and used every possible tactic to shut down debate. Using up resources and trying to obfuscate are the tactics used. Until things are much further along the enemy has no part here, nor does your comment. It is politics, not science or culture, and realizing that is not BS. Most of the activist side wishes to hide behind the science but has little intention of applying a scientific method.

• #

> He censured [sic] every thing he could and used every possible tactic to shut down debate.

You’re making that up.

By contrast, we have a fine example up above – Pointman – who is doing exactly what you pretend to censure. Watts does the same. BishopHill does the same.

> Until things are much further along the enemy has no part here

You’re suggesting that the poor delicate darlings here can’t cope with a bit of constructive criticism? Fortunately our hostess is not the shrinking violet that you appear to be. And “the enemy” is showing your partisan side just a touch.

> hide behind the science

I’m absolutely happy to be aligned with the science. I’ve no idea what “hide behind the science” could possibly mean, though.

[Gawd, you’d think WC could learn how to use the b-quote tab after all these weeks?] ED

• #
the Griss

“By contrast, we have a fine example up above – Pointman – who is doing exactly what you pretend to censure. Watts does the same. BishopHill does the same.”

It is something you will have to live with the rest of your life 🙂

• #
Rereke Whakaaro

That is true … for the rest of his life … let us hope it is a long one.

• #
the Griss

I don't care if it's long or short……

Karma can be a right b****….. I hope. 🙂

• #
PhilJourdan

[Gawd, you’d think WC could learn how to use the b-quote tab after all these weeks?] ED

That presupposes he can learn.

• #

> use the b-quote tab

[You are a funny man WC! – Mod]

[Ha! Dishonest in what way? We edit to make reading easier. The b-quote function works very well to prevent misunderstandings.] ED

• #

william:
Your "work" speaks for itself… that's why you are reviled by many. You got it the old-fashioned way… you earned it.

• #

There are various strategies to protect the integrity of a mass of data. There are software version control systems that could be used. However, they tend to be software-oriented and not well suited for tasks of this kind. I recommend the following partly automated and mostly manual approach.

1. The data version should be identified by at least a date stamp and a multilevel serial number that uniquely identifies the specific data mass. (ie. ..)

2. Make sure any change in the data has been carefully checked and validated before it is committed to be the next version of data.

3. Have at least three backups on three different media stored in three widely dispersed locations. Also similarly keep a backup of at least two previous generations of data.

4. Provide along with the data, a data check sum that can be used to verify the data has not been modified or otherwise corrupted. This will be unique per data version.

5. Provide a central distribution point to which one can go to get the authorized version of the data.

6. Provide an audit trail of everyone who acquired the data such that you know who took the data and when.

7. Provide a time stamped log of all changes in the data describing the change in detail. Treat the log exactly as you would the data it tracks.

8. Periodically check your public access data to assure it has not been hacked. Re-install the correct data and note that fact in your data change log.

This process is not absolutely foolproof but it offers recovery possibilities in depth. Short of Armageddon, the desired version of the data should be able to be found or reconstructed.

I use a variation of this process to ensure I don't lose the content of my software development projects. It works, but it takes a bit of self-discipline to make it happen. It adds less than 1% overhead to my efforts.
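Point 4 of the list above, a published checksum per data version, can be sketched in a few lines. `file_checksum` is a hypothetical helper name, and SHA-256 is just one reasonable digest choice:

```python
import hashlib

def file_checksum(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 64 KiB chunks so large data files don't load into memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```

The author would publish the digest alongside each data release; anyone can recompute it on a downloaded copy, and any mismatch shows the copy has been modified or corrupted.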

• #
MikeO

These are all valid things to do, but until we see how the model is constructed it is unknown what is possible. One possibility is a binary, loadable executable that looks to the web for authentication, so that only authenticated users can run it. Typically there is a large level of insecurity associated with Excel spreadsheets.

• #

I think the goal here is open source science. There is no commercial or proprietary goal here. Hence no secrecy is wanted or necessary.

The insecurity of the Excel environment is irrelevant. The idea is to protect the original data module from modification by all except the author and to assure the original data can always be recovered and identified. Nearly every vendor of software does that. At least the ones who want to stay in business do. At the bit level, there is no difference between a software package and an Excel spreadsheet. Hence there is no need for a different method of preserving and assuring the data.

However, isn't the purpose of making the data available that it can be tested and modified to try out different ideas? Otherwise why provide it in computer-executable form? The data need not be, and should not be, made unmodifiable.

Once a person downloads and installs the data, he can use, modify, add to, or destroy the data at will. The check sum still stands and it can be used to detect the fact of a change. That way, one can distinguish data that came from its author from data that has been modified.

Someone who is trying to show the author made a mistake by changing the data can be discovered: either by doing a checksum check on the data, or by his refusal to provide the data so that it can be checked for fidelity to the original.

If there is more than one file that constitutes the data, it can be packaged in various ways. For example the software packaging tool I use for customer delivery is Inno Setup. I find it extremely effective. It is widely used freeware. See: http://www.jrsoftware.org/isinfo.php

Yes, there are issues left unspecified in my proposal. However all those issues have been dealt with by countless developers for decades. One simply needs to select the appropriate tools and use them consistently.

In other words, the relevant problems were solved a long time ago, and the solutions have been in use for decades by countless developers. All that remains is to fill in the details, select the appropriate tools, and use them consistently.

• #
the Griss

We use a thing called Tortoise SVN.

It's free; it might take a bit of learning though, and I'm not sure exactly what can be done with it.

I know the boss was able to keep track of my changes to programs though.. and fix them 🙂

• #

That package might be OK for a large development team but I think it is vast overkill for a small one. It attempts to enforce uniform policy on one and all so as to eliminate the effects of sloppy, incompetent, and undisciplined developers. To attempt to do science using that kind of developer is a contradiction in terms. The same is true for software development but that is a different matter.

The cost of using such a system is brain cracking complexity, constant “administration”, and road blocks to changes by the lead author. That cost is far in excess of simply having self discipline, honesty, and honor to do the right things in the right way all the time. All of which are vitally necessary for the doing of good science.

In this case, we have a team of two doing the authoring: Jo and David. The rest of us are the quality control team. We test their output and report the flaws we find. We simply need access to the things to be tested so we can test them. We DON’T need the ability to change the original source. That is up to Jo and David. Hence a rather simple computer aided manual system will do the task both effectively and efficiently.

• #
the Griss

There are only 3 or 4 of us, in different parts of Oz, using it to create stuff.

but yeah.. it can be a bit of a pain, and I don’t even have to look after it 🙂

Was just a suggestion 🙂

• #

Having a dispersed team working on the same modules makes using something like the SVN tool almost mandatory. Using it for a team of two who live together, with one doing the communicating and the other the development, is where its use is serious overkill.

As always, the nature of the problem and its context should determine the tools to be used. Selecting the tools without considering the problem and context almost always costs more and usually doesn’t work that well.

• #
Spitfire Hornet

Yes, the distributed version control systems such as Subversion, Git, Mercurial and so on are way overkill at present.

I would suggest using RCS (Revision Control System), which has a Windows version available.
…and make sure you have a decent backup policy as well.

• #
Dry Liberal

TortoiseSVN is a Windows GUI client for the Subversion version control system. It’s OK but has a rather centralised mode of operation – people have to be given permission to write to the repo. Newer and better “distributed version control systems” are available – Git, Mercurial, Bazaar, etc. People using these systems get a copy of the entire repository and can work on it themselves. Then if they wish they can merge their work back into the main repo (with the original owner’s permission of course). It works well for software development, not sure how it would work with data.

• #

On my projects, I am the team. My challenge is keeping enough history securely so that I can recover in case I do something stupid or simply change my mind about something. So simple and disciplined is much better for me. I have maintained, for decades, up to a gigabyte and more of data and software source by a method similar to the one I proposed here.

I haven’t looked into the packages you suggest. It is possible that one of them might be useful. That is if their complexity of use is manageable. Thanks for identifying the alternatives. I will look into them.

• #

I have downloaded and looked at Git, Mercurial, and Bazaar. They appear to me to be gross overkill for this project.

They could be setup to do the job but you have to be a super geek developer to configure them correctly. It would also be good to be at least an ordinary geek developer to use them.

Mere mortals are likely to be challenged, especially if they have a life they want to live outside of the project. There is simply too steep a learning curve and too much effort to use them for what little advantage they offer this project.

I suggest staying with a manual system similar to the one I proposed. The major burden of its complexity is on the author. The user need only understand how to use Excel. As such, it is a much better match to the task.

• #
Truthseeker

Griss,

We use Tortoise SVN at our work as well. Once you have it set up, it is quite seamless especially with the Visual Studio add-in that we use. I also recommend it.

• #
Jaymez

Thanks for that MikeO, it really was an hilarious read. Pointman has a way with words. I do recommend others have a look at this link http://thepointman.wordpress.com/2014/05/29/the-scorning-of-william-connolley/

I totally understand what Pointman was saying – William Connolley’s past behaviour makes him an undesirable who you wouldn’t invite to your house. He has lost his privileges and deserves a lifetime ban.

Given his past practices at Wiki and how readily ‘warmist/alarmist’ sites censor and ban skeptics, it shouldn’t come as a surprise to him. But Jo Nova has always allowed dissenting views at this blog as long as they are prepared to follow the rules of the site.

Though I wish Connolley would learn to use quotation marks or the ‘block-quote’ feature when he is quoting content from someone else’s comment. And I note his most common tactic seems to be to simply ignore the questions or points he doesn’t want to or can’t answer, yet demands every point he raises be addressed.

Personally I think I’d take Pointman’s position and ban Connolley because he lost his rights through his activities at Wiki and has shown absolutely no REMORSE. He continues to claim that anyone who thinks he acted inappropriately at Wiki simply does not understand how Wiki works!

• #
Yonniestone

That Pointman post still makes me laugh however I do think MikeO has a good reason to warn people not to be complacent about what they’re dealing with.

Extreme narcissism is a pathological state that personally I find to be a major trait in many types of psychopaths. If anyone has not had the pleasure of encountering such a person, then consider yourself lucky; these people will never take no for an answer if they believe they're not getting things their way.

• #
the Griss

“has shown absolutely no REMORSE”

He actually PRIDES himself on it..

It is his ONE achievement in life. lol !!!

Sad, really, to think that is all he has ever had to offer…

His ego must be so bruised to know how much he is mocked for what he has done. (suck it up, WC, you did it, you wear it).

You can see just how much it STINGS every time it is mentioned. 🙂

Perhaps eventually he may learn to make a positive contribution to the world.

nah… not going to happen. !

• #
Rereke Whakaaro

Jo Nova has always allowed dissenting views at this blog as long as they are prepared to follow the rules of the site.

Which is actually one of its strengths, when you think about it.

The problem is that she seems to attract serial pests who try to win by dominating threads, with comments that are no more than lists of references, and/or taking the thread totally off topic.

And here are we, having a meta-conversation, I will shut up.

• #
Greg Goodman

There's no need to ban him, just ignore him.

If he got banned he'd just go running off to tell his mummy how unfairly he's been treated. Ignoring him will hit his ego where it hurts.

He chose to abuse his position to remove other people's contributions, deny them a voice and ban them; now he wants to come round for a chat.

The time to engage in discussion was then. His choice was made. Let it remain that way.

• #
Rereke Whakaaro

And Pointman is still getting regular hits on that particular piece, and I guess he has got a real spike in hits today, with little willy spouting on here.

Delightful.

• #
crosspatch

Yes, if you feed “white” noise through a notch filter it looks like white noise that went through a notch filter. I am failing to understand why that is either surprising or significant.

• #
Bernie Hutchins

Because no one IS notching white noise.

Because the supposed input has a PEAK (not flat) and the supposed output is flat, a notch is being INFERRED (cancelling the peaking rather exactly).

Yes – it takes some study to understand what is proposed here. Note the “instead of TEMPERATURE”(my emphasis) in the quote just below.

Jo and/or David apparently concur with this white noise being an issue as they say: “Some people are claiming that the transfer function is meaningless because you could use white noise instead of temperature data and get the same notch. It’s true, you could. …..”
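The inference Bernie describes can be sketched numerically. All spectra below are synthetic stand-ins (the peak frequency and widths are arbitrary, not the actual TSI or temperature data): if the input spectrum has a peak and the output spectrum is flat, the empirical transfer function, output over input, necessarily shows a notch at the peak frequency.

```python
import numpy as np

freqs = np.linspace(0.0, 0.5, 2049)  # cycles per sample

# Synthetic input spectrum with a peak (standing in for TSI's solar-cycle
# line) and a flat output spectrum (a record with no such line).
input_spec = 1.0 + 9.0 * np.exp(-(((freqs - 0.09) / 0.005) ** 2))
output_spec = np.ones_like(freqs)

# Empirical transfer function magnitude: |H(f)| = output / input.
H = output_spec / input_spec
f_min = freqs[np.argmin(H)]  # the notch sits exactly at the input's peak
print(round(f_min, 3))
```

This is why "white noise in place of temperature" also yields a notch: the notch is forced by the peaked input against any flat-spectrum output, which is the point of contention about what the inferred transfer function means.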

• #
Rohan

Some people are willing to declare they know that TSI cannot be associated with changes in Earth’s temperature. Some of us have an open mind. The solar dynamo is not completely worked out. Fair?

Jo, how can TSI not be associated with earth's temperature?

I personally think it's better to generate a list of every possible contributing factor for each of the four categories of the universal balance equation. Establish that list, then put each one under the microscope, and then list the basis for its omission in the assumptions. It will then lend far more weight to the assumptions made when developing the model. The universal balance equation, for the uninitiated, is simply:

Accumulation = In – Out + Generation – Disappearance

Each of the above variables is a rate. And in this case we're measuring energy, so it's the "rate of accumulation" and so on.

Correct me if I’m wrong, but from reading some of Richard Lindzen’s papers and watching his lectures on Youtube, I would have thought that TSI is the only significant component for the rate of energy In.

Generation and Disappearance are so small as to be insignificant, so it's going to be a small list. Atmospheric CO2 doesn't generate or disappear energy in its own right, so it's not going to factor into this energy balance.
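The balance equation above can be written directly as code. This is a trivial sketch; the petawatt figures are purely illustrative inputs, not measured values:

```python
PW = 1e15  # 1 petawatt in watts

def accumulation(rate_in, rate_out, generation=0.0, disappearance=0.0):
    """Rate of accumulation = In - Out + Generation - Disappearance."""
    return rate_in - rate_out + generation - disappearance

# Illustrative only: energy-in exceeding energy-out by 0.5 PW, with
# generation and disappearance negligible, leaves 0.5 PW accumulating.
imbalance = accumulation(122.5 * PW, 122.0 * PW)
print(imbalance / PW)
```

The value of writing it out is that every candidate factor must be assigned to one of the four terms before it can be dismissed, which is exactly the audit being proposed.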

• #
Richard C (NZ)

Rohan #24

>Accumulation = In – Out + Generation – Disappearance

>Each of the above variables are rates. And in this case we’re measuring energy, so it’s the “rate of accumulation”

Exactly. And when energy-in falls from a high level where accumulation occurs to a medium level, the accumulation stops. When energy-in falls enough, accumulation reverses.

The ocean is the largest heat sink where accumulation (storage) of energy has been occurring. Trenberth and successive co-authors estimate the accumulation has been 0.4 PW (1 Petawatt = 10^15 watts) as shown in Figure 2 of (ignore the silly stuff):

‘Changes in the Flow of Energy through the Earth’s Climate System’

Fig. 2 shows the flows for the atmosphere in the ocean and land domains. Here the areas are accounted for and the units are Petawatts. Plus and minus twice the standard deviation of the interannual variability is given in the figure as an error bar. The net imbalance in the top of the atmosphere (TOA) radiation is 0.5±0.3 PW (0.9 W m−2) out of a net flow through the climate system of about 122 PW of energy (as given by the ASR and OLR). The fossil fuel consumption term is too small to enter into this figure. Hence the imbalance is about 0.4%. Most of this goes into the oceans [0.4 PW], and about 0.01 PW goes into land and melting of ice. However, there is an annual mean transport of energy by the atmosphere from ocean to land regions of 2.2±0.1 PW, primarily in the northern winter when the transport exceeds 5 PW.

So the “rate of accumulation” has been 0.4 Peta Joules per second (1 Watt = 1 Joule/sec) according to Trenberth.
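As a quick sanity check on the quoted figures, using only the numbers in the excerpt above:

```python
# A 0.5 PW TOA imbalance out of ~122 PW of total flow through the system.
imbalance_pw = 0.5
total_flow_pw = 122.0
fraction = imbalance_pw / total_flow_pw
print(f"{100 * fraction:.1f}%")  # about 0.4%, matching the quoted value
```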

Upper ocean heat content (OHC) has either reached (UKMO EN3) or is reaching (NODC) peak levels. The Pacific basin (the largest) is not showing accumulation. Therefore the accumulation metric to watch for an energy-in deficit is upper ocean heat content.

Because the air “sees” ocean heat, there will be a tropospheric response to OHC which is lagged considerably from solar energy-in.

• #
Richard C (NZ)

>”Because the air “sees” ocean heat, there will be a tropospheric response to OHC which is lagged considerably from solar energy-in.”

For this a solar-temperature transfer function is required at a thermal causal level. I don’t consider the N-D valid, but the resulting transfer function does the lagging task to a certain extent. But considering only the upper 100m or so of ocean, as David has done (5 yr time constant), results in a transfer function that is too sensitive to a fall in energy-in, IMO.

Consideration of lag due to at least the upper 700m (20 yrs say) pushes thermal lag out longer than 11 yrs and “damps” the atm response to solar decrease.
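The damping effect of these two time constants can be put in rough numbers. Assuming a simple single-pole (exponential) low-pass response (an assumption for illustration, not David's actual model), the amplitude of a cycle of period T surviving the filter is 1/sqrt(1 + (2*pi*tau/T)^2). A minimal sketch:

```python
import math

def lowpass_gain(tau_years, period_years):
    """Amplitude gain of a single-pole low-pass filter at a given period."""
    return 1.0 / math.sqrt(1.0 + (2.0 * math.pi * tau_years / period_years) ** 2)

# How much of an 11-year solar cycle survives the ocean's thermal inertia?
g_mixed_layer = lowpass_gain(5.0, 11.0)    # ~100 m mixed layer, tau ~ 5 yr
g_deep        = lowpass_gain(20.0, 11.0)   # upper 700 m, tau ~ 20 yr

print(f"tau =  5 yr: gain {g_mixed_layer:.3f}")  # roughly a third survives
print(f"tau = 20 yr: gain {g_deep:.3f}")         # less than a tenth survives
```

On these toy numbers, a 20-year time constant attenuates the 11-year cycle roughly four times more strongly than a 5-year one, which is the "damping" being described.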

• #

“Correct me if I’m wrong, but from reading some of Richard Lindzen’s papers and watching his lectures on Youtube, I would have thought that TSI is the only significant component for the rate of energy In.”

Solar irradiance seems to power all weather and climate on this planet, with an active control of radiant exitance to space. The alarmists insist on using some nonsense temperature average that is not climate and is not weather; at best such an average can only be an indication of the internal energy of the Earth.

Consider David’s graphs and his words (Force X seems to be 17 times that of Solar irradiance). The Sun is massive, the Earth wee, but appears to be in orbit about the Earth-Sun barycenter, inside the Sun’s photosphere. The Sun and that barycenter move in epicycles about the Solar system barycenter; that SSb itself is likely in orbit about something larger but still cyclical. All these spins and cycles must have angular momentum. Is there some Lorentz transformation for the conservation of angular momentum? How is the temperature of the wee Earth affected by that? Force X could also be electromotive, or magnetomotive. Will the alarmists ever admit they “know nothing”?

• #
Philip Peake

I hope Jo will forgive me for using this forum to express a personal opinion, and to deliver a personal message, but I am finding it difficult to restrain myself.

I am, in general, a great fan of yours. You approach subjects in a way that I like, you look for the simple answers, and you express your theories clearly and with humor. You are also willing to admit when you are wrong (mostly).

Mostly is ok, because you are human.

I have to admit to being somewhat nonplussed by your virulent and dare I say rude rejection of David’s work. His presentation style is definitely more engineering style than academic, which to me is a good thing. I was a little put off initially by the slow-drip release, but have come to appreciate that it’s not a bad thing. It gives time for the ideas to percolate in my mind. Now perhaps (likely) your mind works faster than mine, but speaking personally I find I appreciate the nuances of new ideas better having slept on them.

Speaking strictly for myself, I am prepared to wait until the end, then let the dust settle before coming to any hard and fast conclusions.

I am somewhat dismayed that you don’t do likewise. I think you are normally much better than that, and open to new ideas, and then taking your own spin on the ideas and verifying, or otherwise, in your own way using your own approach.

I never thought I would need to say this to you, but keep an open mind, and believe me when I say that I am one of your biggest fans, and think that you might have something to contribute to this, one way or another, in your usual carefully analytic and humorous style.

Philip

• #
Ross

Philip

I understand your frustration with Willis E on this issue but I think your message should be on WUWT not here.

• #
the Griss

I assumed .. BOTH. !

• #
the Griss

OT.

I’m in Sydney… is it time to move north ??????

• #
Yonniestone

It’s been a high of 6°C the last 2 days and feels like snow coming 🙂

On our Sunday walk on Mt Buninyong it was 4°C and some sleet; in summer it can be 40°C. Amazing how we survive such climate change.

• #
Annie

We went to Marysville yesterday and it was sleeting for a while. There were streams of cars heading up to Lake Mountain for the snow. Very different from summer temps but somehow humans keep going.

• #

> defending the missing hot spot? Really?

Yes, really. We started discussing this elsewhere http://joannenova.com.au/2014/06/green-climate-pornography-cheer-for-the-deaths-of-the-heretics/#comment-1486289 You said ” The CCSP with many of your favourite authors called it a fingerprint 74 times in 2006, yet you blame skeptics for elevating it?” I said “I presume you mean http://data.globalchange.gov/assets/51/56/6d7da49e1f93bef673d56ff6aa6a/sap1-1-final-all.pdf This is a perfect example of your lack of skepticism. All you’ve done is search for the word “fingerprint” in that document, or a similar one. But you haven’t read it, or you’d realise that most/all of those instances of “fingerprint” are talking about something else.” But the conversation seemed to end there.

Anyway, since you bring it up, it’s worth pointing out that the actual strong fingerprint of GW-from-GHGs in the atmos is stratospheric cooling (and tropospheric warming); and this is observed. And as we all know, the signal you expect to see from TSI variations is warming (or cooling) throughout the entire atmosphere. The Svensmark types tend to ignore that little matter; and so have you.

• #
bobl

Er, no. Any source of warming which affects the partitioning of energy between the lower and upper atmosphere will cause differential warming/cooling between atmospheric layers. For example, a low-level temperature inversion would do that nicely.

• #

bobl,

We have seen latitudinal climate zone shifting correlating to variations in solar activity.

To achieve that, force x must first alter the gradient of tropopause height between equator and poles by changing stratosphere temperatures differently at different latitudes or heights.

There is some recent data to that effect for the period 2004 to 2007 and I await an update.

• #
Terry

And as we all know, the signal you expect to see from TSI variations is warming (or cooling) throughout the entire atmosphere.

It is not necessarily the case if TSI is a proxy for a specific band in the spectrum. Differential heating will occur if TSI is a proxy for UV, and it will be different if it is a proxy for the IR bands. I haven’t looked at that stuff recently but I doubt that TSI variation (full spectrum equally varied by energy) will produce a uniform warming through the entire column. Someone will correct me if I’m wrong.

• #
the Griss

but I doubt that TSI variation…will produce a uniform warming through the entire column

Precisely. And that is why the approach that David has taken is so interesting.

He has effectively taken regional and atmospheric height differences out of the equation, by looking at the system at a whole-istic scale.

What happens inside the box can be discovered later if the box turns out to be accurate.

The use of electrical/audio/resonance methodologies is a very interesting way of approaching the problems.

That is why the alarmista troops are soooooooo ‘anti’ and so trying to find fault. (and the fact that most of them obviously just do not understand it 🙂 )

• #
Richard C (NZ)

Terry #27.2

>”I doubt that TSI variation (full spectrum equally varied by energy) will produce a uniform warming through the entire column”

Don’t know for above tropopause but for troposphere see:

‘Eleven-year solar cycle signal throughout the lower atmosphere’

K. Coughlin and K. K. Tung (2004)

http://onlinelibrary.wiley.com/doi/10.1029/2004JD004873/full

‘Observed Tropospheric Temperature Response to 11-yr Solar Cycle and What It Reveals about Mechanisms’

JIANSONG ZHOU AND KA-KIT TUNG (2012)

http://depts.washington.edu/amath/old_website/research/articles/Tung/journals/Zhou_and_Tung_2013_solar.pdf

• #
steven mosher

WC

‘Anyway, since you bring it up, its worth pointing out that the actual strong fingerprint of GW-from-GHGs in the atmos is stratospheric cooling ”

It kinda stunned me when I first saw this, and I wonder why this fingerprint has been so hard to communicate rather than the “hot” spot, which is not that good of a metric.

As I read it the strat cooling is a rather unique metric in that it rules out solar as a cause of the warming, whereas the trop hot spot doesn’t have this diagnostic power.

• #

You’re right. The trop hot spot is just a consequence of atmospheric dynamics as currently understood; there’s nothing GHG-specific in there; it’s a consequence of heating the sfc (together with a certain set of plausible changes or not-changes in the troposphere); solar-forced warming would have the same effect (see, e.g., the nice discussion at http://www.gfdl.noaa.gov/blog/isaac-held/2011/12/07/20-the-moist-adiabat-and-tropical-warming/). By contrast solar-forced warming would warm the entire atmosphere (actually with some slight caveats I should have put in earlier, but Eli reminds me: if anything, since TSI variations are more in the UV, more solar will get dumped in the upper atmos (and therefore warm it) than the sfc). The exact mechanism for GHG cooling the strat is complex though; every time I try to explain it I get it wrong. So I’d advise reading Science of Doom, not me: http://scienceofdoom.com/2010/04/18/stratospheric-cooling/

• #
Greg Goodman

The stratospheric cooling is an after effect of the two major eruptions.

http://climategrog.wordpress.com/?attachment_id=902

This is mostly a result of an apparent change in opacity and a reduction in reflected SW.
http://climategrog.wordpress.com/?attachment_id=955

Since the stratosphere is cooling, that extra SW must be being absorbed in the troposphere.

After the initial cooling effect, the long term effect of El Chichon and Mt Pinatubo was a warming which is being erroneously attributed to GHG.

I estimated a 1.8 W/m2 TOA change. That would go some way to explaining the paradox of the hiatus.

• #

You certainly are reading that wrong, as usual. It shows nothing of what the Sun may do. The decrease in stratospheric temperature with increasing CO2 shows what an effective “coolant” all IR-radiatively-active atmospheric gases are. In the troposphere the variable amount of water vapor controls all temperatures, quite independently of what stupid earthling governments may try to do!

• #

In my view this probably is, at least for the last 30 million years, the likely location of the Earth’s ‘global air/sea conditioner’ and ‘delay transformer’, the likely locus of ‘Force X’, and the reason both why regional signals with 11-year periodicity are more common deep in the NH continents and why AGW effects, in so far as they exist, are also predominant in the NH. It is also the reason why the Milankovitch Cycle started and finished the Tertiary and Quaternary ice ages from the NH.

‘Spin this up and down’ if you dare (the other side daren’t)!

http://www.nature.com/ngeo/journal/v7/n2/full/ngeo2037.html

• #
Mikky

Another possible non-notch reason for the absence of an 11-year signal in global temperatures: ICE & SNOW.

Some of the response to long term trends in TSI will be albedo changes from SNOW/ICE melting/growing.
Such positive feedback probably takes years to establish, so the 11-year oscillations would not get such feedback.

This additional suppression of 11-year oscillations is effectively a steeper roll-off of the low-pass filter.

• #
handjive

Figure 3:
Notch-Delay climate model and the CET record with a projection out to 2045.
“The hindcast match is good.
The interesting thing is that the projected temperature decline of 3.0°C is within the historic range of the CET record.
The low is reached about 2045, lining up with the projection from the Finnish tree ring study. ”
~ ~ ~
>Here is another test for the N-Dcm with one of the best historical records of climate change in the world (with a few twists):

September 2000
The Office of Legislative and Public Affairs (USA)
Tale of the Ice, Revealed
“All but one of the 39 ice records, which come from sites ranging from Canada, Europe, Russia and Japan, indicate a consistent warming pattern.
The average rate of change over the 150-year period was 8.7 days later for freeze dates, and 9.8 days earlier for breakup dates.

(note)- A smaller collection of records going well past 150 years also shows a warming trend, but at a slower rate.
*
One of the best historical records of climate change in the world
For example, Lake Suwa in Japan has a record dating back to 1443 that was kept by holy people of the Shinto religion.
The religion had shrines on either side of the lake.
Ice cover was recorded because of the belief that ice allowed deities on either side of the lake — one male, one female — to get together.”
~ ~ ~
>Better researchers might know where to look, but I found this graph:
Figure4.
Variations of winter (December to January) Temperature of Suwa region, Nagano Prefecture, Japan, over the last 500 years estimated from ice condition of the Lake Suwa
(Mikami and Ishiguro, 1998)

Some Quotes:
“The documenting of Lake Suwa in Nagano Prefecture represents one of the best historical records of climate change in the world.
The occurrence of a unique phenomenon known as Omiwatari, “the divinity’s pathway” was observed at Lake Suwa in winter.
The icy surface swells to make a long ridge if the temperature remains very low after the lake’s surface is completely frozen.
This natural process has been incorporated into a religious rite of the Suwa shrine, and records have been kept since 1443.
*
Omiwataris
2012– To the delight of local residents this winter, an elevated line of cracked ice appeared on the frozen surface of Lake Suwa in Nagano Prefecture for the first time in four years.

Suwa City Museum- The Icecracks of Lake Suwa
Historical photos & records: 2013 and back
. . .
Here is a conundrum for the doomsday global warmers.
You will notice that all claims above of Man Made Global Warming are from surface air temperature data.

14 June 2014:
“Now, Stephen Briggs from the European Space Agency’s Directorate of Earth Observation says that surface air temperature data is the worst indicator of global climate that can be used, describing it as “lousy”.
Climate scientists have been arguing for some time that the lack of rising temperatures is due to most of the extra heat being taken up by the deep ocean. A better measure, he said, was to look at the average rise in sea levels.

The science is settled for taxing? Tell ’em he’s dreaming.

• #
TdeF

“a leading indicator of some other effect coming from the Sun after a delay of 11 years or so”. What is the basis for this?

If the simple conclusion is that there is a delay between solar activity and global temperature, this is expected but what is this idea of a second effect?

You would expect that solar activity was the prime driver of temperature anyway on this hot ball of molten rock with a skin the thickness of a balloon. A delay between irradiance and temperature would be quite expected. If it matched the solar cycle, that may be a rough coincidence or an oscillation excited in a multitude of complex oscillating feedback systems driven by the first cycle.

Consider that most of the world is covered in water 4km deep which captures 2/3 of the incident sunlight, and this huge mass has to heat up: a huge buffer. Air is 1/400th of the mass (1 atmosphere per 10 metres), so the air temperature in the long term is determined by the water temperature, and the sunlight goes straight into the water, regardless of CO2. Further, as the water warms slowly, it releases CO2, and 98% of all CO2 is in the water. This is controlled by Henry’s law, and the CO2 elevator brings up new dissolved and compressed CO2 rapidly from the depths, regardless of water currents, so it all ties together. The sun heats the water and this takes time.

As Dr. Murry Salby demonstrated so clearly, while there is no correlation between temperature and CO2, there is an almost perfect match between CO2 and the integral of temperature. For the first statement he was fired, and no one has considered the second result. However, the integral of immediate temperature is total irradiance, which to first order would give you total energy in and thus the increase in water temperature, which in turn gives you the CO2, amplified x 50. (98% is in the ocean)
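The integral relationship claimed here is easy to sketch numerically. A toy illustration (every number below is made up; k is a hypothetical rate constant): if the rate of change of CO2 is proportional to the temperature anomaly, then CO2 follows the cumulative integral of temperature, so even a flat-but-positive anomaly produces a steady CO2 ramp:

```python
# Toy sketch (hypothetical numbers): if d(CO2)/dt = k * T_anomaly, then CO2
# tracks the cumulative integral of temperature, rising even while the
# anomaly itself stays flat.
k = 2.0                      # ppm per year per degree (made-up constant)
dt = 1.0                     # one-year time steps
temps = [0.4] * 10 + [0.4 + 0.02 * i for i in range(10)]  # flat, then warming

co2 = [280.0]                # made-up starting baseline, in ppm
for t in temps:
    co2.append(co2[-1] + k * t * dt)   # Euler step of the integral

print(co2[10], co2[-1])      # CO2 rises throughout, even over the flat decade
```

Whether this relationship holds in the real carbon cycle is of course the contested question; the sketch only shows what the "integral of temperature" claim means mechanically.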

So what may have been demonstrated is nothing more than the expected delay between water heating and cooling and irradiance. To propose there is some other, unknown effect coming from the sun seems unfounded and quite unnecessary. Fundamentally though, such Fourier analysis will pick up basic oscillations or harmonics in any complex stable system. That these would be driven by the sun is no surprise. Incidentally if the sun harmonic actually matched a physical process in balance, it could be disastrous with runaway oscillations like any mechanical system. Clearly it does not get excited and we get a faint echo of the driving cycle, nothing more. However that alone can change our weather noticeably when tiny parts of one degree are so significant to the IPCC.

• #

TdeF

I’ve put much the same point about the oceanic delay to David but he remains unconvinced, still suggesting that something in the sun causes the delayed thermal response on Earth to TSI variations.

One can only await further details of the deliberations that led him to that conclusion.

• #
Richard C (NZ)

TdeF #31

>”So what may have been demonstrated is nothing more than the expected delay between water heating and cooling and irradiance.”

Agreed. And irradiance comes in successive pulses (SCs), the most recent of which have been at historically high levels. The water heated.

In other words, the 11 yr cycle is an insignificant event in terms of water heating. Look to 2×210 years (2 de Vries cycles) for the significant event.

• #
Roy Hogue

…has it been misapplied? What matters is whether the base assumption is valid, and whether the results will be a useful answer to the question you’ve asked.

I wondered myself about this question. What the OFT does (as David explained it) is not exactly the obviously correct thing to do. It’s kind of akin, at least in terms of things you should question, to interpolation beyond the end of your data, which I’ve been asked to do several times and always questioned it. I was overruled but was also assured that the result would be close enough for the purpose. So I’m not unfamiliar with doing something that may look inadvisable.

I think it will take the skeptical on this point, including myself, time to see what the model actually predicts vs. what actually happens. In the meantime I remember that the other camp has demanded that we believe things far, far more questionable. So for myself, I’m still waiting for the spreadsheet and the math details and, above all else, the future predictive ability.

I wonder if I can live long enough to see the evidence myself and if my math background will be up to the job of going through the details of the model. But no one ever said life would be easy. 😉

• #
Roy Hogue

Jo and David,

I’m glad you’re the ones with money riding on what happens in the next decade or so and not me.

I’m still as anxious as ever to see the rest of this grand adventure into the unknown.

• #
Greg Goodman

“… to interpolation beyond the end of your data”

You cannot, by definition, interpolate beyond the end of your data.

inter + pole 😉

It’s called extrapolation. The uncertainty increases very rapidly the further out you get. This is fundamentally different from interpolation.

• #
Roy Hogue

Valid point. But it seems like a distinction without a difference. The same thing happens either way. 😉

• #
Bernie Hutchins

Oh Goodness Roy – Very Different. Extrapolation involves the future, not just the past. It’s Prediction!

We used to write on the board the quote: “Prediction is very difficult – especially about the future” and ask students to guess who said it: (1) the Ancient Chinese, (2) Niels Bohr, or (3) Yogi Berra. The answer is, of course, all three. [Is Yogi known in Oz? One can hope so.]

We know of perhaps a half dozen really GOOD ways of interpolating. Prediction involves questionable “modeling” and we know how well that can work out. As Greg suggests, even half a sampling unit beyond the present and you may be in disaster-territory (like polynomials).

• #
Roy Hogue

Oh goodness indeed. I think we’re getting overboard. 😉

I was given a set of data representing a curve that characterized a piece of hardware (curve unspecified) and asked to incorporate linear interpolation along that curve into my software based on those known points. Unfortunately the data didn’t cover the entire range over which I would need to interpolate. So I questioned that and was told that going outside the known data at either end to the extent required would yield an acceptable result.

I’d provide some code to show that the same C++ function could handle the job whether its argument was inside the legitimate range or outside of it without even needing to ask whether it was inside or outside. However, I think it would be a little too far off topic. But what would you call it under those circumstances? Maybe exterpolation or intrapolation?

In any case, it’s a practice that, as Jo said, you need to be sure applies to the situation BEFORE you do it, not after you find out you’ve bet the store on a mistake.

• #
Bernie Hutchins

Roy –

You can certainly use the same “model” and the same code inside or outside the range of the given points and still CALL it interpolation or extrapolation accordingly. Your use of “linear interpolation” is a polynomial model – straight lines being first-order polynomials. It should work fairly well as long as you don’t stray more than, say, 10% to 20% outside the range given. But try, for example, fitting a 10th-order polynomial to 11 points. Quite an impressive interpolator in the range of say the 3rd point to the 9th point. But if you try the 12th point (!), the polynomial is already rapidly heading for + infinity or – infinity. Signals are basically horizontal in time while polynomials are vertical. For extrapolation first try sinc interpolation math (bandlimited) or something like all-pole modeling, as they don’t feel obliged to rocket away when released, but are happy to decay back to zero. Not that you can trust anything all the time ;).

Bernie
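Bernie's warning about high-order polynomials is easy to demonstrate. A sketch in pure Python (the sample function and node spacing are made up for illustration): evaluate the unique 10th-order polynomial through 11 equispaced samples using the Lagrange form. Between nodes it tracks the curve; one node-spacing past the last point it has already run off to values a hundred times larger than any sample:

```python
# Toy demonstration (made-up data): the unique 10th-order polynomial through
# 11 equispaced samples, evaluated via the Lagrange form.
def lagrange_eval(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def f(x):
    return 1.0 / (1.0 + 25.0 * x * x)      # a smooth, bounded sample curve

xs = [-1.0 + 0.2 * k for k in range(11)]   # 11 points spanning [-1, 1]
ys = [f(x) for x in xs]                    # all samples lie between 0 and 1

inside  = lagrange_eval(xs, ys, 0.1)   # between nodes: close to f(0.1) = 0.8
outside = lagrange_eval(xs, ys, 1.2)   # one spacing past the end: about -146
print(inside, outside)
```

The same function, the same code, but interpolation gives an error of a few percent while extrapolation is off by orders of magnitude, which is exactly the distinction being argued here.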

• #
Mikky

Here is why the “notch” is not science:

It is based on a signal that is not observed.
You claim to have deduced something very specific from a signal that is not observed.
The ONLY thing you can deduce from a non-observed signal is that its amplitude falls below the noise.
The amplitude response you see is (noise/input), an inverted peak, nothing at all to do with the transfer function.
You say (rightly) that you can’t deduce the phase response: of course not, there is no signal present to measure the phase of.

You have deduced something from nothing, which is pure speculation, not science.

• #

I can see people giving Copernicus a similar sort of grief. All you have is where the planets are in the night sky. Where’s the evidence that they’re tethered to the Sun? (He was worried about that but he didn’t actually cop criticism until after he died.)

Speculation is where science starts. This is a blog rather than a textbook, so speculation is kosher. There are a few more steps before it makes it into a textbook, if it does, and one of those is people coming up with proper rebuttals.

• #
Mikky

Speculation is fine, when simpler explanations are ruled out.
The problem is that David has claimed that the “notch” is a discovery.
Which is why “mainstream” science will not take it seriously.

• #
bobl

Oh the naysayers,
Mikky

Consider E = mc^2

When Einstein first postulated this it was pie in the sky; it fitted the observations like David’s filter does, but he had no idea whether it was right or not. We still don’t know if it is right, but it seems to describe things well enough to be useful, so until someone comes along with a different equation that works better, E = mc^2 is it. Would you seriously say that this equation is not a discovery? Was it a discovery at the time he first proposed it, back when we didn’t have a clue if it was right?

• #
Philip Shehan

bobl, the acceptance of E = mc2 is based on things that were observed. The trinity test at Almagordo being hard to miss.

• #

So much for your nonsense “conservation of energy”!

• #
Philip Shehan

Will, If that is directed to me I do not understand what you mean.

The point is Vic, the experiment would not have worked at all if special relativity (E = mc2) was wrong. Nor would fusion, either in H-bombs, the sun or fusion reactors. (Admittedly the phenomenon in the latter is very short-lived due to technical problems.)

• #

For some strange reason, I don’t think that this experiment actually quantitatively confirmed that E=mc2. The calorimetric bomb wasn’t up to it and you should see what happened to the balance!

• #
Philip Shehan

Actually it was “confirmed” quantitatively (well, kind of) in that a pool was run among the physicists to see whose prediction of the yield would be closest to the measured yield in equivalent kilotons of TNT.

Forget who won (Oppenheimer was, I think, very much on the low side), but you could legitimately say that, given the wide range of predictions and that the prize went to the closest estimate, someone had to win.

But I think calculations have improved since then.

On the other hand, one of the first H-bomb tests, “Castle Bravo” I think, “ran away”, producing about five times the yield expected due to the failure to take into account some factor (again, can’t remember the details), leaving those observing it from what they had considered a safe distance with a few anxious seconds and possibly soiled underpants.
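For what it’s worth, the mass-energy bookkeeping behind these yield estimates is simple to check. Using the commonly quoted Trinity yield of about 21 kilotons of TNT and the standard joules-per-kiloton convention (both are reference values assumed here, not figures from this thread), E = mc^2 implies roughly a gram of mass was converted:

```python
# Rough sketch of the E = mc^2 arithmetic for the Trinity test.
c = 2.998e8                    # speed of light, m/s
kt_TNT = 4.184e12              # joules per kiloton of TNT (standard convention)

yield_kt = 21.0                # commonly quoted Trinity yield, ~21 kt
energy_j = yield_kt * kt_TNT   # total energy released, in joules

mass_converted_kg = energy_j / c**2
print(f"{mass_converted_kg * 1000:.2f} g")   # on the order of one gram
```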

• #
Philip Shehan

Yep, Castle Bravo.

http://en.wikipedia.org/wiki/Castle_Bravo

• #
bobl

That misses the point: was it a discovery at the point it was postulated? I say yes it was.

• #

Philip Shehan July 1, 2014 at 12:10 pm

“bobl, the acceptance of E = mc2 is based on things that were observed. The trinity test at Almagordo being hard to miss.”

Will Janoschka July 1, 2014 at 12:22 pm

So much for your nonsense “conservation of energy”!

Philip Shehan July 1, 2014 at 5:59 pm

Will, If that is directed to me I do not understand what you mean.

The actual equation for Yield is E = (alpha)mc^2, with no alpha above 0.5 ever being observed. The very fact that “energy” was “created” falsifies the first law of thermodynamics.
I know, there are nice Lorentz transformations that show the relationship between energy and mass. The first law, speaking only of energy, was useful for a while, but still “has been falsified” in NM, and crudely replaced by statistical mechanics nonsense that has no use whatsoever. Also the new zeroth law of thermodynamics has been falsified by radiometric (brightness) temperature, which has no relationship to thermometric temperature or thermodynamic temperature = energy/entropy.

• #
Philip Shehan

Will, I am unfamiliar with the “alpha” parameter.

As far as the first law of thermodynamics goes, that was formulated before special relativity, under which mass and energy were shown to be interconvertible.

So the law of conservation of energy becomes the law of conservation of mass and energy.

Of course in everyday situations the conversion of mass to energy and vice versa is undetectable, and the law of conservation of energy is a perfectly reasonable approximation.

Similarly, classical Newtonian physics with absolute time and space is acceptable for everyday uses even though strictly superseded by general relativity.

I am unfamiliar with the zeroth law of thermodynamics.

• #
Roy Hogue

That first flash of light at Alamogordo (the correct spelling) was hardly a proof that E = mc^2. They were not sure it would work. But it was sure convincing that Einstein was onto something. There has been much work done since then with much better measurements, and it appears to hold up. But what will the future bring?

What will David Evans have shown the world of 50 or 100 years from now? He’s stuck his neck out a long way and deserves our respect for being confident enough in his judgment to do that. And he deserves the benefit of the doubt until the theory is tested the only way it can be tested, by the passage of time.

• #
Roy Hogue

The actual bomb test was at White Sands, a considerable distance from the city of Alamogordo. There is no visible evidence of the bomb’s ever having been there but the site is marked with a small monument and you can drive to it quite safely. It’s no more radioactive than the cities of Hiroshima and Nagasaki are today.

• #

“You have deduced something from nothing, which is pure speculation, not science.”
Do you but wish that you had such skill?

• #
Rereke Whakaaro

It is based on a signal that is not observed.

That depends on your definition of “signal”.

A regular fluctuation, of anything, is a signal.

A deviation from a historically regular fluctuation is also a signal, although of a different type.

These signals exist, whether observed, or deduced through Fourier Analysis.

The ONLY thing you can deduce from a non-observed signal is that its amplitude falls below the noise.

Noise is simply the combination of all of the signals, plus their harmonics and products. Therefore all signals fall “below the noise” until they are extracted and identified with a real-world phenomenon.

As I understand it, David has observed a real-world cyclic phenomenon, and identified the single signal that correlates with, and could be the cause of, that cyclic phenomenon.

The rest of your comment is grandstanding.

• #
Greg Goodman

“The transfer function between TSI and Earth’s (surface) temperature will be meaningless if there is no causal link between TSI and Earth’s temperature. (Some people may need to read that twice).”

For most once would be enough.

“Two assumptions were made before computing the transfer function. ”

Well, I don’t find it in part II that you linked. So perhaps adding it now, without the sarcastic parenthesis, would have been more appropriate.

At least it is now stated clearly.

• #

You’re modelling the estimated global mean anomalies of temperatures. As Jennifer Marohasy pointed out, raw data shows that towns in Australia were cooling when the world was warming at the beginning of the century. I know that the HadCRUT data for NH and SH are almost identical but you need to take that with a pinch of salt (you might be modelling weather data and not the climate).

Both maximum and minimum temperatures are affected by humidity but in different ways. It would be interesting to see if the humidity data shows something. Maybe just the wet bulb readings?

• #
Greg Goodman

I think the aim is to model the temperature record accepted by the orthodoxy, without relying on CO2.

The aim is not necessarily to solve the riddles of climate, so whether that record is accurate is not germane to the exercise.

• #

I do realise what it is that they are trying to do but if the data is just garbage it is a pointless exercise.

• #
Greg Goodman

No, it’s not pointless.

The IPCC and Hadley Centre, amongst others, make the deceptive claim that models don’t work without CO2. This is on the basis of tuning the models’ parameters to work with CO2, then chopping it out, rather than attempting to make a non-AGW model work.

What DE is trying to do is just that. His aim is to counter that claim.

• #

Fair enough. I was about to write that asking what Factor X is isn’t important, as you don’t have good enough data to answer that. I guess another 10 years of satellite data might be good enough.

• #
Rereke Whakaaro

There is also a difference in approach.

Climate Science seems to be a bottom up approach based on a whole raft of “first principles,” and derivations thereof, that are put together in the models, in an attempt to simulate climate variances, in a virtual environment.

The Engineering approach is to start with the observed outputs, and then use Fourier Analysis (and other tools?), to identify the constituent frequencies. Each of those frequencies will resonate with one physical phenomenon, with the relative amplitude giving an indication of its importance. The challenge then, is to identify what each physical phenomenon might be.
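The engineering approach described here can be sketched in a few lines. This is a toy illustration on synthetic data, not any real temperature record: the two periods (11 y and 55 y), the amplitudes, and the noise level are all assumed, chosen so whole cycles fit the record and the spectral bins line up exactly.

```python
import numpy as np

# Synthetic "observed output": an 11 y and a 55 y cycle plus noise.
# All parameters are assumed for illustration; 165 years holds whole
# numbers of both cycles, so the FFT bins land exactly on them.
rng = np.random.default_rng(0)
t = np.arange(0, 165, 1 / 12.0)                    # 165 years, monthly steps
signal = (0.1 * np.sin(2 * np.pi * t / 11.0)
          + 0.3 * np.sin(2 * np.pi * t / 55.0)
          + 0.05 * rng.standard_normal(t.size))

# Fourier analysis of the output: amplitude spectrum and frequency axis.
amplitude = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1 / 12.0)        # cycles per year

# The two strongest non-DC bins recover the constituent periods.
top = np.argsort(amplitude[1:])[-2:] + 1
periods = sorted(round(float(1 / f), 1) for f in freqs[top])
print(periods)                                     # [11.0, 55.0]
```

With real, shorter, noisier records the peaks smear across neighbouring bins, which is exactly where the identification step gets hard.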

A lot of the “animated discussions” we are observing on these threads are from people who are used to using only one approach or the other.

This blog, in itself, is a fantastic source for psychological study. Always assuming, of course, that we can ignore those who are overly evangelist or jihadist, and are thus past rational thought on every topic.

• #
the Griss

“and are thus past rational thought on every topic”

Hey.. leave me alone.. you bully !! 🙂

• #

I dislike this idea that physical models are better than curve fitting. If there weren’t huge uncertainties in many parameters there might be a point but as it is, just looking for interesting trends might give better insights.

I have absolutely no time due to work and the WC, but I’m putting together something interesting that was missed because the trends in the data are ignored. I suspect that the Global Mean Surface Temperature Anomalies are just the weather.

• #
Greg Goodman

“Assuming that:
1. Recent global warming was associated almost entirely with TSI.”

No, you are going much further than that to conclude that dividing output by input gives you the transfer function of the system.

You are assuming that ALL change in the output is due to changes in the input, not just “recent warming”, all of it: 11y, 13y, 3y, 2.7 … everything.

This may be a reasonable assumption in an electronic blackbox test but its application to climate is taking the analogy too far without proper consideration.

You are assuming that there is no significant noise, measurement error or variability in the output that is not related to the input. That alone seems rather improbable.

There are four possible reasons for the 11y peak not being present in the output.

1. A. 11y “notch” response
2. A low-pass response cutting =11y
4. There is no discernible relation between input and output.

Of those, 2 and 4 seem to be the most readily explained physically.

You give no reason for rejecting the last three; indeed, you give no indication of having considered them.

In that context the notch is not a discovery, it is an assumption.

The lag of just over 10y can be found from cross-correlation with SST:
http://climategrog.wordpress.com/?attachment_id=958

• #

The low pass is rejected because of the response at 5yrs.

So you have 1 & 3 (which is called “4”).

• #
Greg Goodman

That would be correct if there were no noise, sampling errors, or ANY variability not due to TSI in the surface data.

That is a pretty ridiculous starting point.

“3 (which is called “4″).” 4 was called 4, see correction below.

Of the list 1…4, 2 and 4 seem to be the most readily explained physically.

• #
Greg Goodman

Oh crap, I used > and < signs and I’ve just noticed it messed up the post and option 3 disappeared altogether.

I’ll try again.

1. A. 11y “notch” response
2. A low-pass response cutting <=11y
3. A high-pass response cutting >=11y
4. There is no discernible relation between input and output.
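To see why the mere absence of an 11 y line cannot, by itself, distinguish options 1–3, here is a sketch of the gain of three simple responses. The filter forms (first-order low/high-pass, second-order band-stop) and the Q value are illustrative assumptions, not David Evans’ fitted model:

```python
import math

def lowpass(period, cutoff=11.0):       # first-order: attenuates periods <= cutoff
    return 1 / math.sqrt(1 + (cutoff / period) ** 2)

def highpass(period, cutoff=11.0):      # first-order: attenuates periods >= cutoff
    return 1 / math.sqrt(1 + (period / cutoff) ** 2)

def notch(period, centre=11.0, q=5.0):  # second-order band-stop centred on 11 y
    f, f0 = 1.0 / period, 1.0 / centre
    return abs((f0**2 - f**2) / (f0**2 - f**2 + 1j * f * f0 / q))

# Gains at 5 y, 11 y and 50 y: each filter suppresses the 11 y line to
# some degree, but they differ sharply away from 11 y.
for p in (5.0, 11.0, 50.0):
    print(p, round(lowpass(p), 2), round(highpass(p), 2), round(notch(p), 2))
```

Only the notch passes both shorter and longer periods while removing 11 y, so discriminating between the options requires examining the response away from 11 y, not just the missing peak.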

• #
Philip

Greg (#36): There are other possibilities also:

5. A high-gain negative feedback loop that counteracts any TSI changes.
6. The earth having a different response to different wavelengths of radiation from the sun, and the spectrum changing significantly with TSI over its 11 year cycle.

There is no reason to reject any, as you say. I don’t think that David has done that.
He looked at TSI and saw a cyclic variation, then looked at temperatures and didn’t see any corresponding variation and asked himself why.

There have been enough “climate scientists” writing off the sun as the cause of warming principally because that 11 year cycle doesn’t show.

David asked himself how he could model this — not that he could give you the physics of why, but how could you build a model — leaving the mapping of the model to physical causes until the model appears to be valid.

He chose one solution. Looking at a cyclic input and a flat output, some form of filtering is the first thing that might spring to mind for an electronics engineer. He chose that, for whatever reason, and constructed his model on that basis.

Now you, or anyone else, is perfectly free to choose a different model and build your own.
For example, I think Willis might have fun exploring a high-gain negative feedback model. He already has a pretty good physical explanation for that.

The one thing that David’s model has going for it is the ~11 year delay. That is a phenomenon already observed by others, and as it turns out, is a necessary component of his model. The bits begin to fit together. Exactly what causes this, the mysterious “Force X”, isn’t an issue at this point. It doesn’t become one until the model can be demonstrated to hindcast and forecast with acceptable accuracy.

No-one says that it has to map component by component to physical processes, just that if it reasonably accurately models the observed phenomena, and is a better model than the GCM/CO2 models, then, a priori, it begins to look like the sun may be the major driver of temperature, and not CO2.

At THAT point, you start to look for physical processes.

Some people are being a bit too pedantic about the scientific process. Theories are not born fully formed overnight. They evolve, and in many cases evolve from simplistic models, in the mind, on paper or in a computer. At this stage, David is in the process of developing his theory. He wants to see if it holds water, and constructing the simplest model that allows validation is a useful step. It is not in itself a full-blown theory, but it has aspects of the scientific process. It allows a model to hindcast and forecast. The forecast is a short enough period for validation (unlike the current models).

Yes, the notch is an assumption, and as you say it can be replaced with other components and very similar results achieved. Those are just alternate models. There is no “one true model” of any process, just variants which prove to be somewhat better than others.

• #
Mark D.

Some people are being a bit too pedantic about the scientific process. Theories are not born fully formed overnight. They evolve, and in many cases evolve from simplistic models, in the mind, on paper or in a computer.

Well said Philip!

• #
Philip Shehan

Very true Mark and the other Philip.

• #
Greg Goodman

“5. A high-gain negative feedback loop that counteracts any TSI changes.”

Same as my #4.

“6. The earth having a different response to different wavelengths of radiation from the sun, and the spectrum changing significantly with TSI over its 11 year cycle.”

Yes, if you want to have several different responses to TSI at the same time, one could pick a mix of the 1 to 4 options.

I have suggested elsewhere that Willis’ tropical feedback is primarily a surface one and would highly attenuate changes that have shallow penetration.

Shorter wavelengths that penetrate deeper will bypass the negative feedback.

• #
Richard C (NZ)

Greg #36

>”The lag of just over 10y can be found from cross-correlation with SST”

What lag considering 100m depth? David says 5 yr time constant, 5 yrs less than SST. Doesn’t make sense.

And lag considering 700m? 2000m? Must be well in excess of 10 yrs out to around 20, 40. Trenberth says 10 – 100.

• #
Greg Goodman

Deeper reservoirs have larger capacities and longer time constants, leading to greater lag if you take a relaxation model.

I did not see David’s 5y for 100m so I can’t comment out of context but it seems odd.
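The scaling Greg describes can be sketched with a one-box energy balance, where the time constant is heat capacity over the feedback parameter. The feedback value `LAM` below is an assumed illustrative number, chosen so that a 100 m layer gives roughly the 5 y figure under discussion; it is not a measured or fitted quantity.

```python
# One-box energy balance: tau = C / lam, with C = rho * c_p * h the heat
# capacity (per m^2) of a mixed layer of depth h, and lam the feedback
# parameter. lam is an assumption picked so 100 m gives ~5 y.
RHO, CP = 1000.0, 4186.0   # sea water density (kg/m^3), specific heat (J/kg/K)
LAM = 2.7                  # W/m^2/K -- illustrative assumption only
YEAR = 3.156e7             # seconds per year

def tau_years(depth_m):
    """Relaxation time constant, in years, for a mixed layer of given depth."""
    return RHO * CP * depth_m / LAM / YEAR

for h in (100, 700, 2000):
    print(h, round(tau_years(h), 1))   # roughly 5, 34 and 98 years
```

Because tau is proportional to depth, the same model that gives ~5 y for the top 100 m gives decades for 700 m and about a century for 2000 m, which is the sense of “larger capacities and longer time constants” above.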

• #
Richard C (NZ)

>”I did not see David’s 5y for 100m so I can’t comment out of context but it seems odd”

David Evans
June 25, 2014 at 1:35 am

There is a low pass filter, but its time constant is about 5 years.

Many others have found it, e.g. as Stephen Schwartz at Brookhaven said in “Determination of Earth’s transient and equilibrium climate sensitivities from observations over the twentieth century: Strong dependence on assumed forcing” in 2012, “The time constant characterizing the response of the upper ocean compartment of the climate system to perturbations is estimated as about 5 years, in broad agreement with other recent estimates, and much shorter than the time constant for thermal equilibration of the deep ocean, about 500 years.”

There’s a little more to his comment but that’s the important bit.

My first of several responses was at #79.2.1 (abbreviated too):

Richard C (NZ)
June 25, 2014 at 8:55 am

Schwartz quote #79.2

>“The time constant characterizing the response of the upper ocean compartment of the climate system to perturbations is estimated as about 5 years”

Huh? 0 – 700m OHC (indicative):

http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/heat_content55-07.png

To which David replied at #79.2.1.3:

David Evans
June 25, 2014 at 2:27 pm

Schwartz’s upper compartment is more like 100m deep (the only indication I can find in his paper). He doesn’t make it clear; I understand it is a functional division between upper and lower.

# # #

Trenberth concurs with 5 yrs (he says 6), but then says 10 – 100 yrs. SST at 10 yrs. I don’t think 5 yrs is the prime characteristic of oceanic lag. I think 20 yrs +/- (Abdussamatov) at least.

• #
• #
Richard C (NZ)

Re #38.1.1

>”I don’t think 5 yrs is the prime characteristic of oceanic lag. I think 20 yrs +/- (Abdussamatov) at least”

Between 20 – 30 years.

Global: 2–2.8; 3.7–6.6; 7.7; 8.3; 9.1; 10.4; 11.5; 20.6; 26.3; 29.6 and 65

From Table 2:

‘On the relationship between global, hemispheric and latitudinal averaged air surface temperature (GISS time series) and solar activity’

Souza Echer et al (2011)
[Search title for PDF on web]

• #
Richard C (NZ)

Excerpts, Souza Echer et al (2011)

Page 5

“…by spectral analysis we found clearly the 11 years signal in the temperature series (also detected by the decomposition).”

And,

“Although we have found the 22 and 11 years oscillations in the surface air temperature series and the same oscillations have been observed in several climatic records, the spectral properties are different for different geographical regions. This might reflect the complexity of the climatic system, which depends on both external forcing and local conditions.”

• #
Greg

I’ve shown that ex-tropics are more sensitive to rad. changes than tropics. If they are doing regional analysis, this may make the signal clearer.

• #
the Griss

I’m beginning to think of these lags of different time periods as a sort of “smearing” of the extra TSI and other solar energies across a reasonably wide time span.

The effect of these combined lags is likely to peak around the same period as the solar TSI cycle due to resonance transfer from the main driver.

Probably not normally distributed, but more like a chi-squared distribution, i.e. a peak around 5-15 years and a long tail.

The effect being that when there is a series of strong solar peaks (like the second half of last century), things warm up gradually…

…but when we get a couple of very weak solar cycles, things level off, then start to cool, gradually at first and then more quickly.

• #
Richard C (NZ)

Griss #38.1.1.2.2

>”I’m beginning to think of these lags of different time periods as a sort of “smearing” of the extra TSI…”

“Smearing” definitely (seems to be a legit climate science term – John Christy uses it in regard to ENSO)

>”when there is a series of strong solar peaks”

How does that then translate to:

“peak around the same period as the solar TSI cycle”

And,

“a peak around 5-15 years and a long tail” ?

Don’t you then need to add SC to SC?

The most recent strong peaks were SCs 21, 22, 23. The period of “smearing” then being 20 years (Dec 1979 – Jul 1989 – Mar 2000 by SSN), not 10 (5-15).

Prior to that was the highest SSN at SC 19 peak Mar 1958.

List of solar cycles
http://en.wikipedia.org/wiki/List_of_solar_cycles

The “smearing” period from 1958 is then 42 years (1958 – 2000).

The lag of those 2 smeared periods to be detected in dodgy OHC somehow (see Steve #40.1.1 downthread) or a Souza Echer et al type GAT analysis (#38.2.1.1 upthread).

Or something.

• #
the Griss

Don’t you then need to add SC to SC?

The long tail will do that, but not all of it.

What fraction…….. depends on the “k” of the distribution

• #
the Griss

“k” as in the shape in this chart

(yes, I know its wiki, but this is just the easiest graph to find, and not liable to WC tampering)

• #
Richard C (NZ)

>“k” as in the shape in this chart

OK, but isn’t one SC smearing the k = 1 profile?

If you were to add SC + SC + SC, wouldn’t that be the k = 9 profile (which is how I see the smeared and lagged effect of multiple SCs)?

• #
the Griss

Multiply the numbers along the bottom by say 3, to get years, then use a ‘k’ of around 4

Maybe that might be somewhere in the ball park for how all the delays might add together.

Each SC adds less as time progresses. So something like SC(n) + 0.7*SC(n-1) + 0.5*SC(n-2) … etc. as time progresses.

Each one having a peak response then a decay. (treat the italics as subscripts)

Pure supposition, of course… just been looking at all the different possible delays in relation to David’s hypothetical model.
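The Griss’s supposition can be sketched by convolving successive cycle pulses with a gamma-shaped (chi-squared-like) kernel. The shape k, the scale, and the 11 y pulse spacing are all assumed illustrative values, not fitted ones:

```python
import numpy as np

# "Smearing" each solar cycle's energy forward in time with a gamma
# kernel (a chi-squared density is a gamma with k = dof/2). All values
# here are assumptions for illustration only.
k, scale = 4.0, 3.0                        # shape, scale (years) -- assumed
t = np.arange(0.0, 60.0, 0.1)              # years
kernel = t ** (k - 1) * np.exp(-t / scale)
kernel /= kernel.sum()                     # normalise: unit total response

# Three successive solar-cycle "pulses", 11 years apart.
pulses = np.zeros_like(t)
pulses[[0, 110, 220]] = 1.0
response = np.convolve(pulses, kernel)[: t.size]

# Each cycle's smeared response sits on the decaying tails of earlier
# ones, which is how successive cycles can add (the long-tail carryover).
print(round(float(t[np.argmax(kernel)]), 1))   # kernel peak at (k-1)*scale = 9 y
```

With this shape the bulk of each cycle’s effect lands within roughly a decade but the tail carries a diminishing remnant into the next cycle or two, matching the SC(n) + 0.7·SC(n−1) + 0.5·SC(n−2) picture in spirit.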

• #
Richard C (NZ)

>”Each SC adds less as time progresses. So something like SC(n) + 0.7*SC(n-1) + 0.5*SC(n-2) … etc. as time progresses”

Nice explanation (thnx). But if each SC pulse of energy is a similar amount over 3 SCs say, why would the delay be less for SC 3 than for SC 1?

Wouldn’t the effect be more like 3 k = 4 curves in succession?

Once an SC pulse is a lesser amount (e.g. SC 24), there’s less energy to smear so the profile is not as tall or as long (continuing the supposition on my part).

• #
the Griss

I’m trying to add curves in my head. Not working well after a dinner with drinks 😉

Try taking the blue line and copy paste but translate each successive curve to the left by 4 or so units.. then add the curves.

Maybe a k = 3 curve … get my drift? (If not I’ll try again when not a tad tiddly.) 🙂

• #
Greg

Thanks for the link, I thought you meant it was part of his presentation. I have not trawled the now thousands of comments across parts I … VII.

David Evans
June 25, 2014 at 1:35 am

“There is a low pass filter, but its time constant is about 5 years.”

There seems to be some confusion between a time constant (which is a parameter of a relaxation response) and a cut-off frequency (or period) of a low-pass filter.

A relaxation has some low-pass characteristics, but the two parameters are unrelated values from two different things.

It’s interesting that there is quite some literature supporting the 5 y relaxation time-constant that I just threw together as an illustration, with just very cursory hand ‘optimisation’.

Look at the relaxation response to Mt.Pinatubo. The best fit time-constant was 8mo but results in a peak response about 13mo after the peak disturbance.

http://climategrog.wordpress.com/?attachment_id=884

That 13mo lag is also shown in Spencer and Braswell’s lag correlation reproduced in that discussion.

In conclusion, it seems like a lag of around 10-11 years may be in agreement with a time-constant of about 5 y. There is not the contradiction you (and apparently David) thought.

• #
Richard C (NZ)

>”seem like a lag of around 10-11 years may be in agreement with time-constant of about 5y”

Thought it was something like that but wasn’t sure (still not).

Thing is, it’s the depth consideration that is determining this figure. They’re only looking at the top 100m.

The longer the time period of consideration, the deeper the depth consideration i.e. I agree with your “deeper reservoirs have larger capacities and longer time constants leading to greater lag”.

This is what Glassman is getting at with “pattern of delays”:

“The ocean’s complex patterns of circulation across the surface, and between the surface and the deeper ocean, produce a pattern of delays, with some cycle times exceeding a millennium.”

And,

“…temperature might be best modeled as a set of relatively narrowband accumulators of solar energy”

100 m depth is just one of those accumulators and it’s associated with a lesser lag.

• #
Greg

”seem like a lag of around 10-11 years may be in agreement with time-constant of about 5y”

Thought it was something like that but wasn’t sure (still not).

====

A 5 y exp response to a step fn will be 95% of the way there in 15 y. That is in one way a lagged response, though it is not a time shift like David’s hypothesised delay.

When the input is not a step but variable ups and downs, there will be similar but damped changes in the output. They will be lagged and of a rather different form due to the differing attenuation of different frequencies.

The actual lag will depend upon the content of the input; it is not simply a fixed time lag.

A fixed linear rise will output a fixed rise with a lag of one time constant. So it’s between 1 and 3 tau; probably 2*tau will provide a ball-park estimate.

That’s why I said tau = 5 y seems to agree with a correlation lag of 10 y.

The work I linked about the Mt. Pinatubo reaction is a fast, shallow reaction in the tropics, with tau = 8 months and a lag of 13 mo.
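Greg’s numbers can be checked with a minimal first-order relaxation simulation (simple Euler integration, tau = 5 y as in his illustration; the forcing units are arbitrary):

```python
import numpy as np

# First-order relaxation dT/dt = (F - T) / tau, Euler-integrated.
tau, dt = 5.0, 0.01
t = np.arange(0.0, 50.0, dt)   # years

def relax(forcing):
    T = np.zeros_like(forcing)
    for i in range(1, forcing.size):
        T[i] = T[i - 1] + dt * (forcing[i - 1] - T[i - 1]) / tau
    return T

# Step input: response reaches 1 - e^-3 (about 95%) after 3*tau = 15 y.
step = relax(np.ones_like(t))
print(round(float(step[t.searchsorted(15.0)]), 2))   # 0.95

# Ramp input: the output is the same ramp, shifted by one time constant.
ramp = relax(t.copy())
print(round(float(t[-1] - ramp[-1]), 2))             # 5.0
```

The step response is ~95% complete at 3 tau = 15 y, and the ramp comes out lagged by exactly one time constant, consistent with the 1–3 tau ball-park: a tau of 5 y is entirely compatible with an apparent lag of the order of 10 y for trending inputs.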

• #
Greg Goodman

Philip

The one thing that David’s model has going for it is the ~11 year delay. That is a phenomenon already observed by others, and as it turns out, is a necessary component of his model.

I find this even more confused.

It is not a component of his filter; it is an additional fudge he has to apply after his filter, along with the highly implausible ‘nuclear winter’ fudge.

A simple relaxation response seems much more parsimonious.

http://climategrog.wordpress.com/?attachment_id=981

• #
Rereke Whakaaro

Greg,

If you have two signals, one at 11 years (approximately), and one at 10 years (approximately) how do you differentiate them? An engineer will see the difference as a phase shift, but I am not sure a climate scientist would, because they are not used to working with frequencies high enough to make it obvious.

The term ‘Nuclear Winter’ is a little propaganda name.

During the time of the atmospheric and surface tests, we know that considerable amounts of dust and water (unfortunately unquantified) were thrown up into the stratosphere, and possibly higher. We do know for certain that the ionosphere was impacted, because long distance radio transmissions were severely disrupted, on a global basis. Several countries put their military on high alert, as a result. I understand that mean radio signal strengths are still below the level that was normal, prior to the tests.

That dust, and water vapour, presumably had the effect of blocking some of the sun’s rays, thus causing cooling, even though the effect was not quantified at the time.

• #
Greg

dust and water: yeah there _could_ be _some_ effect. It merits being looked into.

I did, and found nothing. Not even slight evidence of a change coinciding with any of the major tests. Maybe I missed it. It certainly was not confirmation bias, because I really expected there to be something. There wasn’t.

But to jump from a situation where no one has published anything showing a link, to 0.5 deg C, which is close to 100 y of “climate change”, without any more justification than that it fills a hole in the model is, frankly, unscientific.

• #

I knew we’d get around to this! Dr. Jeff Glassman of ‘once best’ Fast Fourier Transform fame has 6 – 10 years. His approach, which used the smoother, low-background Wang et al. 2005 TSI record (noting David used the more radical Steinhilber et al record, which Svalgaard hates, and also never bothered to check the whole ensemble of such records), gets:

Solar energy as modeled over the last three centuries contains patterns that match the full 160 year instrument record of Earth’s surface temperature. Earth’s surface temperature throughout the modern record is given by
EQ01

where Sn is the increase in Total Solar Irradiance (TSI) measured as the running percentage rise in the trend at every instance in time, t, for the previous n years. The parameters are best fits with the values m134=18.33ºC/%, m46=-3.68ºC/%, b=13.57(-0.43)ºC, and τ=6 years. The value of b in parenthesis gives T(t) as a temperature anomaly. One standard deviation of the error between the equation and the HadCRUT3 data is 0.11ºC (about one ordinate interval). Values for a good approximation (σ=0.13ºC) with a single solar running trend are m134=17.50ºC/%, m46=0, b=13.55(-0.45)ºC, and τ=10 years.

http://www.rocketscientistsjournal.com/2010/03/sgw.html

Frankly, I find this fairly parsimonious too. And it was done in March 2010 with absolutely no fanfare – just the quite humble musings of a very smart old scientist to whom every cell phone user owes a big debt….

• #
Richard C (NZ)

Steve #40

>”Glassman…..has 6 – 10″

Wouldn’t that be better stated 6 and 10?

I’ll use the PDF to reference by page:

http://library.crossfit.com/free/pdf/CFJ_JGlassman_SolarGlobalWarming.pdf

Page 1 Abstract:

“The parameters are best fits with the values…..τ=6 years”

“Values for a good approximation (σ=0.13ºC) with a single solar running trend are…..τ=10 years”

Those for a 160 year near-sfc atm temperature series.

Now page 16 (my emphasis):

“Oceans, because of their mass, their heat capacity, and their color, are the dominant mechanism of Earth’s energy balance between the Sun and space. The atmosphere as a reservoir plays a minute role, and is well-represented as a byproduct of the ocean. And the ocean is the distributor of the carbon cycle, the hydrological cycle, and the energy cycle. The ocean’s complex patterns of circulation across the surface, and between the surface and the deeper ocean, produce a pattern of delays, with some cycle times exceeding a millennium. These are evident in the concentration of CO2 cross-correlated with temperature. Consequently, temperature might be best modeled as a set of relatively narrowband accumulators of solar energy. An analog to this process in electronics and signal processing is the tapped delay line.”

The “pattern of delays” is not limited to 10 yrs, and a 10 year delay does not explain upper ocean heat accumulation over the last 60 years or so.
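Glassman’s tapped-delay-line analogy can be sketched as a small FIR filter. The tap delays and weights below are arbitrary illustrative choices, not fitted ocean parameters:

```python
import numpy as np

# A tapped delay line (FIR filter): the output is a weighted sum of the
# input at several discrete delays ("taps"). Delays/weights are assumed
# for illustration; the weights sum to 1 so the DC gain is unity.
taps = {0: 0.50, 6: 0.25, 10: 0.15, 20: 0.10}   # delay (years) -> weight

def tapped_delay(x):
    """Apply the tapped delay line to an annual series x."""
    y = np.zeros_like(x, dtype=float)
    for d, w in taps.items():
        y[d:] += w * x[: x.size - d]
    return y

# A unit step of forcing arrives in stages as each tap's delay elapses.
out = tapped_delay(np.ones(40))
print(round(out[0], 2), round(out[8], 2), round(out[25], 2))   # 0.5 0.75 1.0
</n```

With a step input the response builds in stages as each tap’s delay elapses, which is one way to picture a “pattern of delays” extending well beyond any single lag, rather than one fixed 10 y delay.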

• #

You are wrong Richard. Check out Figure 1 in:

http://onlinelibrary.wiley.com/doi/10.1029/2004GL020258/full

From observations of ocean interior temperature changes, Levitus et al. [2000, hereafter referred to as the “Levitus” results] have shown that ocean heat content has increased over the last fifty years, with significant decadal variability superimposed on an underlying trend. In comparisons of AOGCMs with observations, Levitus et al. [2001], Barnett et al. [2001, Figure 1], Reichert et al. [2002, Figure 2], and Sun and Hansen [2003, Figure 4a] demonstrated that their models were able to reproduce the trend but not the decadal variability.

If that (the Levitus results) doesn’t look suspiciously like a delay of the order of 10 – 11 years then I’ll eat my hat!

• #
Richard C (NZ)

Steve, you missed:

“superimposed on an underlying trend”

I said:

“a 10 year delay does not explain upper ocean heat accumulation over the last 60 years or so”

Figure 1 is the period beginning 1950, and the “underlying trend” is an increase over the entire period, i.e. “60 years or so”.

• #

I’m beginning to think you must be a lawyer!

I don’t deny (gainsay) the underlying trend at all (TSI was most recently maximal around 2000 and had been rising since at least 1940) but, like David, it is the creator of the notch, and the delay filter required for it, that I am interested in. Perhaps you can now explain to me why there is a 10 – 11 year periodicity IMPOSED on the trend? Is that not suggestive of a delay filter almost in sync with the 11-year Schwabe Cycle?

I find it bemusing that you have been bombarding us with numerous example papers ‘showing 11 year cyclicity in temperature’ (excluding Levitus) but when I point out one in the ocean (clearly heading towards a cyclical minimum at 2000) you get upset. As someone who frequently does, amongst other things, high-level catchment hydrology modeling, wherein lie all sorts of interesting delays, I think you just fell into a straw-man false inference.

With respect to your fond 65 year periodicity in the oceans (PDO etc.) I also think Bob Tisdale does it so very much better. That is my inference.

• #
Richard C (NZ)

>”I don’t deny (gainsay) the underlying trend at all (TSI was most recently maximal around 2000 and had been rising since at least 1940)”

Steve, the “underlying trend” is from your quote of Gregory et al (2004) in respect to Fig 1 OHC:

Figure 1. Five-year running means of world ocean heat content from Levitus….

Not TSI.

>”I point out one in the ocean (clearly heading towards a cyclical minimum at 2000) you get upset”

Not at all. Shaviv has analyzed this in respect to 11 yr solar cyclicity:

‘Using the Oceans as a Calorimeter to Quantify the Solar Radiative Forcing’

http://www.sciencebits.com/files/articles/CalorimeterFinal.pdf

Refer Figure 4, page 4: Net Oceanic Heat Flux.

Note the trend from 1950. The “underlying trend” is about ocean heat accumulation; 11 yr cyclicity isn’t.

Trenberth (and co-authors) estimate ocean heat accumulation at 0.4 PW but with SC 24 change he’ll have to do an update.

Lawyer? No. Just some law papers and years of adhering to Acts and Regulations. And some engineering science including heat.

• #
Richard C (NZ)

>”With respect to your fond 65 year periodicity in the oceans (PDO etc.)”

Did you notice Glassman replicated that with only TSI?

In other words, he introduced 65 year periodicity without an extra signal being inserted.

• #

The dominant surface temperature trend period Glassman identified is 134 years, with a secondary at 46 years. There is no periodicity (PDO or otherwise) at 65 years, and Jeff has not ‘introduced 65 year periodicity without an extra signal being inserted.’ You are playing out your fantasies.

• #
Richard C (NZ)

>”There is no periodicity (PDO or otherwise) at 65 years”

Yes there is Steve.

From #38.1.1.2 (“and 65” emphasized):

Global: 2–2.8; 3.7–6.6; 7.7; 8.3; 9.1; 10.4; 11.5; 20.6; 26.3; 29.6 and 65

From Table 2:

‘On the relationship between global, hemispheric and latitudinal averaged air surface temperature (GISS time series) and solar activity’

Souza Echer et al (2011)
[Search title for PDF on web]

• #
Richard C (NZ)

Souza Echer et al – GISTEMP

• #
• #

Glassman did NOT “introduce(d) 65 year periodicity without an extra signal being inserted”. You ‘miswrote’ (= lied).

• #
Greg

“The “pattern of delays” is not limited at 10 yrs and a 10 year delay does not explain upper ocean heat accumulation over the last 60 years or so.”

A fixed delay does not, but a relaxation response comes closer.

There is no reason for a fixed delay in climate; that is why they had to hypothesise something other than TSI, which looks like TSI with the same variability but offset from it by 11 y.

I’m sorry, but at every turn this model is being fudged and shoehorned by hypothesised, as yet undocumented, effects. Square peg meets round hole.

I think the idea of a non-GHG model is excellent but the notch interpretation is a mistake.

Now I’ve seen that there’s existing evidence for a 5 y lag, it seems that a simple, physically obvious relaxation response is the point to start from.

• #
Richard C (NZ)

Greg #40.1.2

>”Now I’ve seen that there’s existing evidence for a 5 y lag, it seems that a simple, physically obvious relaxation response is the point to start from.”

That’s the Schwartz approach:

‘Heat capacity, time constant, and sensitivity of Earth’s climate system’

Stephen E. Schwartz (2007)

http://www.ecd.bnl.gov/steve/pubs/HeatCapacity.pdf

Problem being it’s CO2-centric (GHG forcing in GCMs) so there’s a lot to leave behind but possibly some takeaways.

# On page 9 no mention of solar change or energy accumulation resulting from it (a trend in the fluctuations):

“The upward fluctuations in ocean heat content represent an increase in planetary heat content that is greater than the average over the time period, and correspondingly the downward fluctuations represent a loss in planetary heat content. What can give rise to such fluctuations that are evidently unforced by the surface temperature? Clearly the planetary heat balance must be fluctuating on account of changes in planetary coalbedo and/or effective planetary emissivity on these time scales, as these are the only means by which the heat content of the planet can change. As these changes are not forced by the surface temperature they must therefore be manifestations of internal variability of the climate system”

# On page 10 he establishes the “effective” heat capacity of the ocean:

“This coupled heat capacity is much less than the actual heat capacity of the ocean to the indicated depths, 23%, 16%, and 5%, for depths 300 m, 700 m, and 3000 m, respectively”

And,

“The effective heat capacity determined in this way is equivalent to the heat capacity of 106 m of ocean water or, for ocean fractional area 0.71, the top 150 m of the world ocean. This effective heat capacity is thus comparable to the heat capacity of the ocean mixed layer.”

# Page 13, time constant and lag:

“The increase in τ with increasing Δt would seem to be indicative of increased coupling to elements of the climate system having greater time constant; the leveling off of τ to a constant value of about 5 years at lag times as great as 15-18 years.”

# Page 14 on the time constant:

“Has the detrending, by imposing a high-pass filter on the data, resulted in a value of τ that is artificially short? To examine this I carried out the same analysis on the non-detrended data as on the detrended data. As expected, this analysis resulted in estimates of the relaxation time constant that were substantially greater than the estimate obtained with the detrended data. However these estimates differed substantially for different subsets of the data: 15-17 yr for each of the data sets as a whole, 6 to 7 yr for the first half of the time series (1880-1942), and 8-10 yr for the second half of the data set (1943-2004). This dependence of τ on the time period examined suggests that the time constant obtained with the non-detrended data is not an intrinsic property of the climate system but does indeed reflect the long term autocorrelation in the data that results from the increase in GMST over the time period for which the data exist. For this reason I proceed in the further analysis using only the time constant obtained with the detrended data set.”

# Page 16 in respect to GHG forcing (not solar change):

“It would thus appear that there is very little unrealized warming as a consequence of “thermal inertia” of the climate system, so for all practical purposes the climate system can be considered in steady-state (or “equilibrium”) with the applied forcing”

# Q&A starting page 17:

“Is the effective heat capacity that is coupled to the climate system, as determined from trends in ocean heat content and GMST, too low, or too high?”

And,

“Is the relaxation time constant of the climate system determined by autocorrelation analysis the pertinent time constant of the climate system?”

And,

“Finally, as the present analysis rests on a simple single- compartment energy balance model, the question must inevitably arise whether the rather obdurate climate system might be amenable to determination of its key properties through empirical analysis based on such a simple model. In response to that question it might have to be said that it remains to be seen. In this context it is hoped that the present study might stimulate further work along these lines with more complex models.”

# # #

I don’t think Schwartz (2007) can be regarded as definitive.

• #
Richard C (NZ)

Greg #40.1.2

>”Now I’ve seen that there’s existing evidence for a 5y lag, it seems that a simple, physically obvious, relaxation response is the point to start from.”

Along with Schwartz (2007), also Scafetta and West’s approach:

‘Phenomenological reconstructions of the solar signature in the Northern Hemisphere surface temperature records since 1600’

Scafetta and West (2007)

3. Phenomenological Thermodynamic Model

[18] PTM assumes that the climate system, to the lowest-order approximation, responds to an external radiative forcing as a simple thermodynamical system, which is characterized by a given relaxation time response τ. This should be a valid approximation for small variation of the input forcing. The model depends on only two parameters: the relaxation time τ and a factor α that has the purpose of phenomenologically transforming the irradiance units, W/m2, into temperature units, K. The physical meaning is that a small anomaly (with respect to the TSI average value) of the solar input, measured by ΔI, forces the climate to reach a new thermodynamic equilibrium at the asymptotic temperature value αΔI (with respect to a given temperature average value).

http://onlinelibrary.wiley.com/doi/10.1029/2007JD008437/full#jgrd13793-fig-0005
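The PTM quoted above is just a first-order relaxation toward the asymptote αΔI. A minimal numerical sketch of that behaviour (the α value and forcing step are illustrative assumptions, not Scafetta and West's fitted numbers):

```python
import numpy as np

# Euler integration of the two-parameter relaxation model described above:
#   dT/dt = (alpha * dI(t) - T(t)) / tau
tau = 9.0      # relaxation time, years (mid-range of the 6-12 yr estimate)
alpha = 0.1    # K per W/m^2 -- hypothetical scaling factor, for illustration
dI = 1.0       # step anomaly in TSI, W/m^2 (illustrative)
dt = 0.01      # integration step, years

t = np.arange(0.0, 50.0, dt)
T = np.zeros_like(t)
for i in range(1, len(t)):
    T[i] = T[i - 1] + dt * (alpha * dI - T[i - 1]) / tau

# The response relaxes toward the asymptote alpha*dI, passing through
# roughly 63.2% (1 - 1/e) of it at t = tau
i_tau = int(tau / dt)
print(T[i_tau] / (alpha * dI))
```

The printed ratio comes out very close to 1 − 1/e ≈ 0.632, which is the defining property of the time constant discussed next.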

>”the relaxation time τ”

Time constant τ (tau):

Physically, the constant represents the time it takes the system’s step response to reach 1 − 1/e ≈ 63.2% of its final (asymptotic) value.

http://en.wikipedia.org/wiki/Time_constant

S&W07 6. Conclusion

[41] Also, a reduced solar activity on climate would imply, according to our PTM, a reduction of the relaxation time constant τ. This time constant τ cannot be much smaller than what we estimated because the ocean requires several decades for reaching a thermodynamic equilibrium with a change in the forcing. We also observe that a recent independent study [Schwartz, 2006, 2007] confirms the existence of a climate time constant of 4 < τ < 17 a that is consistent with our estimate using MOBERG05 and WANG2005 (6 < τ < 12).

[42] In conclusion, if we assume that the latest temperature and TSI secular reconstructions, WANG2005 and MOBERG05, are accurate, we are forced to conclude that solar changes significantly alter climate, and that the climate system responds relatively slowly to such changes with a time constant between 6 and 12 a. This would suggest that the large-scale computer models of climate could be significantly improved by adding additional Sun-climate coupling mechanisms.

# # #

First characterization:

Time Constant

Range (yrs):
4 < τ < 17 (Schwartz)
6 < τ < 12 (Scafetta and West)

Midpoint (yrs):
10.5 (Schwartz)
9 (Scafetta and West)

Second characterization:

Lag

τ marks the point where the step response reaches approx 63.2% of its final (asymptotic) value, so the full lag is taken as τ/0.632. Therefore the respective lags are:

6.3 < lag < 26.9 Schwartz
9.5 < lag < 19 Scafetta and West

16.6 yrs Schwartz
14.3 yrs Scafetta and West
14 yrs Abdussamatov
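The arithmetic behind these "lag" figures simply divides τ by 1 − 1/e ≈ 0.632 (the convention used above, where τ marks ~63.2% of the full lag). A quick check of that conversion:

```python
import math

frac = 1 - math.exp(-1)   # ~0.632, fraction of final value reached at t = tau

def lag_from_tau(tau):
    """'Lag' in the sense used above: tau divided by ~0.632."""
    return tau / frac

# Schwartz: 4 < tau < 17, midpoint 10.5; Scafetta and West: 6 < tau < 12, midpoint 9
for tau in (4, 17, 10.5, 6, 12, 9):
    print(tau, "->", round(lag_from_tau(tau), 1))
```

The Schwartz midpoint gives 16.6 yr as quoted; the Scafetta and West midpoint comes out near 14.2 yr, close to the 14.3 yr figure above.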

Third characterization:

Equilibrium

The ocean requires several decades to reach thermodynamic equilibrium with a change in the forcing (20 yrs +/- oceanic lag, Abdussamatov).

• #

Steve Short July 1, 2014 at 11:31 am

“I knew we’d get around to this! Dr. Jeff Glassman of ‘once best’ Fast Fourier Transform fame has 6 – 10 years. His approach used the smoother, low background Wang et al. 2005 TSI record”
http://www.rocketscientistsjournal.com/2010/03/sgw.html

“Frankly, I find this fairly parsimonious too. And it was done in March 2010 with absolutely no fanfare – just the quite humble musings of a very smart old scientist to whom every cell phone user owes a big debt….”

Your Dr. Glassman did an excellent job of explaining that atmospheric CO2 is a function of oceanic cycles. These same oceanic cycles influence the surface temperature of all places on this Earth.
David’s conjecture concerns the Solar influence on the oceanic cycles, if these cycles are “not” driven by the poor Earthling concept of Total Solar Irradiance. Earthlings have no conceptual meaning for “total” anything.
Your oceanic cycles and Earth’s internal energy seem to appear as some Solar force with an 11 year transport delay. What might that be?

Back to this thread: the Fourier Transform of an impulse (white noise), divided by the Fourier Transform of something periodic, gives a notch at that period. Where the hell do these folk get their “impulse”? Was that the recent minor “Big Bang”, or some other fantasy?
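The point being argued over here is easy to sketch numerically: if the "output" series contains a strong periodicity that a white-noise "input" lacks, the raw ratio of their spectra shows a notch at that period even though the two series are completely unrelated. This is only an illustration of the objection, with made-up numbers, not anyone's actual analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
dt = 1.0                 # sample spacing, nominally years
f0 = 1.0 / 11.0          # a strong 11 "year" periodicity in the output only

noise_in = rng.standard_normal(n)                # white-noise "input"
t = np.arange(n) * dt
temp_out = 5.0 * np.sin(2 * np.pi * f0 * t) + rng.standard_normal(n)

freqs = np.fft.rfftfreq(n, dt)
H = np.abs(np.fft.rfft(noise_in)) / np.abs(np.fft.rfft(temp_out))

# The ratio dips sharply at the bin nearest f0 because the denominator is
# huge there -- a "notch" appears even though the two series are unrelated.
k0 = np.argmin(np.abs(freqs - f0))
print(H[k0], np.median(H))
```

Whether such a notch means anything therefore rests entirely on the causal assumption linking the two series, which is the point made at the top of the post.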

• #
James in Perth

Joanne and David,

Well done with your exploration into an alternative theory of climate change. Even though your patience may be wearing thin, keep up the good work. It’s been a fascinating read and adventure.

James

• #
gai

The Earthshine Project at Big Bear Observatory shows a marked change in albedo around 1997 (graph).

What is interesting is the following paper suggests a lessening in solar strength back in 1997.

DISCUSSION

…In this view the absence of pronounced 11-year temperature fluctuations (related to the unshaded area under the aa curve in Figure 3) is attributed to the damping effect of the thermal inertia of the oceans. Wigley and Raper [1990] have shown that such damping can reduce the impact of even a relatively strong solar cycle with ~0.1% peak-to-peak irradiance variation [Willson and Hudson, 1991] to a barely detectable temperature signal (~0.02 C). Thus it is the slow variation of the underlying solar signal, as revealed by the aa min time history, rather than the 11-year cycle in either aa or sunspots, that shows up most strongly in the temperature record.

The fact that the aa index at solar minimum retains a value proportional to its flanking sunspot maxima, rather than falling to near zero values like the sunspot number, is thought to be a reflection of the interchange of poloidal and toroidal (sunspot) magnetic fields via the solar dynamo… The point we wish to make here is that the aa index provides evidence for a long-term (low-frequency) component of solar variability that persists through sunspot minimum and may therefore affect Earth’s climate.

While we hypothesize that the changing aa baseline is somehow related to a long-term irradiance variation on the Sun, there is another possibility and that is that the solar wind itself influences climate…

Our study suggests that solar variability has contributed significantly to the long-term change of Earth’s climate during the past 350 years…

….we are reminded that there is evidence, albeit mixed…, for temperatures comparable to present day values during the interval 900-1250 A.D., well before the industrial age. The later part (1100-1250 A.D.) of this so-called Medieval Warm Period had inferred solar activity comparable to present levels….

As of this writing it appears that the average aa value of 1997 will be even lower (~16 nT) than that of 1996. Such leveling off or decline of the long-term solar component of climate change will help to disentangle its effects from that of anthropogenic greenhouse warming.

Perhaps your Factor X was also changing in 1997 during cycle 23. Then the 10 to 20 year window would start ~ 2009.

Stories like the following have been appearing rather regularly.

posted on February 10, 2010 Mongolia: The Disaster You Haven’t Heard Of

…Then there are years like this one, in which the Mongols and their animals endure what they call a dzud. I very roughly translate this as “a winter that’s atrocious even by Mongolian standards.” Dzud is actually a double-whammy. It refers to an unusually dry summer that stunts the growth of pasture grass, keeping animals thinner and limiting what can be gathered for hay. Then a brutal winter hammers down, exceptionally cold (often down to -55F/-48C) with blizzards that bury pasture grass under an impenetrable blanket of snow. Meager stores of fodder are soon exhausted, and the freezing sheep, goats, cattle, yaks, camels, and horses upon which the Mongolian herders depend start to die. Quickly….

As of this writing, Mongolian and international aid agencies estimate that more than 2 million domestic animals have perished so far in this dzud. Ten to twelve million died in the last disastrous episode ten years ago, and this dzud is regarded as far worse. Some fear that up to 20 million animals — half of Mongolia’s total herd — may succumb before tolerable weather arrives in late May….

Thermometers have been stuck below minus 20 degrees Celsius (minus 4 Fahrenheit) in Moscow – and below minus 50 degrees (minus 58 F) in some parts of Siberia – for a week.

Russian weather forecasters said temperature in the Khabarovsk region in eastern Russia had dropped to minus 43 Celsius, while Krasnoyarsk in Siberia reported minus 47.
…Other European countries hit hard by the extreme temperatures were counting the toll as temperatures gradually started to return to normal.

Authorities in Ukraine, which has been battling heavy snowfall for weeks, said 83 people had died of cold, with 57 of the victims found on the street.

• #
Greg

“I very roughly translate this as “a winter that’s atrocious even by Mongolian standards.” Dzud is actually a double-whammy. ”

double-whammy?? I think I prefer the term dzud.

Interesting though.

• #
Philip Shehan

Interesting gai.

• #
Bob_FJ

Gai,

That’s interesting

• #
Schrodinger's Cat

Leaving aside some of the earlier negative comments, there have been some very interesting contributions and suggestions. The dominance of the CO2 obsession over decades has left the “elephant in the room”, the sun, relatively poorly understood. The proposed model is now focussing the minds on solar effects and is showing much potential.

One of the difficulties, or perhaps an opportunity, is that a number of solar effects are clearly linked to the TSI cycle directly, or they are external to the sun but modulated by it.

Changes in the solar magnetic field and polarity, GCRs, the solar wind, the very high frequency end of the solar spectrum, UV: all of these could influence atmospheric chemistry.

• #
Bob_FJ

S.C,
yep

• #

http://data.giss.nasa.gov/modelforce/strataer/

• #

I would suggest that David Evans refer in future to the so called ‘X Factor’ as the ‘W’ (Weather) Factor, the term coined by eminent Geologist H J (Larry) Harrington in his discussions with Alex S. Gaddes before the publication of the work ‘Tomorrow’s Weather’ (Alex S. Gaddes, 1990).
The following corroborative extract is from the original publication.
“The W Factor”

“What particular phenomenon is known to exist in this zone which could have any possible relevance? Is it possible that it is the vital zone from which our Earth’s climate is controlled?

“Dr Harrington and I agreed that an (unknown) entity (he named it the ‘W’ (weather) factor) appeared to be emanating from the sun.

“He also agreed with me that, whatever is behind the emanation, is migrating in a retrograde direction relative to the sun’s rotation.

Tidal Activity

“The most likely candidate that I can think of at the moment is tidal activity, brought about by the gravitational influence of the planets; this influence would be quite capable of producing any number of different intensities of ‘pull’ on the Sun and the effect would always produce retrograde tidal waves therein; furthermore, it would accommodate all the required time scales.

“The above tidal effects would be reflected in the constant train of sunspot waves (the so-called 11 year cycles), the frequency of which I’ve used as No. 4 constant.
“It seems to me that the amount of consequent solar activity thus produced, would be proportional to the degree of tidal disturbance wrought by the gravitational ‘pull’ of the planets, from their given positions in their respective orbits.

“The reason that the overall sunspot wave frequency remains constant is that the solar rotation and the motion of the planets remain constant.

“It is noted that it is the amplitude of the sunspot waves that varies, but the fact that over the 297.76 year solar cycle the sunspot wave periods average out, makes the sunspot wave frequency constant at 11.028148 years (the new value, see Fig. 3). All of which seems to argue that whatever it is that is acting on the sun to produce the enigmatic sunspots, has a cyclic period of 297.76 years.

“NOTE: If one multiplies the metonic cycle (18.61 years) by a 26.75 ratio, the result is 497.8175 years; a discrepancy of about 4.65 years, which throws my calculations out of kilter; ie. One cannot fit three 167.49 year tree-ring sub-cycles into 497.8175 years, which must happen to be compatible with the formulae,
hence my preference for the 27 day ratio.

“Attached also is a copy of pages 318-19 of Climate Change and Variability (Ref No. 18,) carrying tables setting out the results of much research into climatic cultural change……..”

At least this name change would preclude any chance of ‘mistaken identity’ with the current ‘X Factor’ television singing contest!

The correlation between temperature change and Solar-induced ‘Dry’ Cycles may be found in the relevant sections of ‘Tomorrow’s Weather’ (Alex S. Gaddes, 1990).
An updated version of this work (with ‘Dry’ Cycle forecasts to 2055) is available as a free pdf from [email protected]