JoNova

A science presenter, writer, speaker & former TV host; author of The Skeptic's Handbook (over 200,000 copies distributed & available in 15 languages).



WARNING: Using a different computer could change the climate catastrophe

How bad are these global forecast models?

When the same model code with the same data is run in a different computing environment (hardware, operating system, compiler, libraries, optimizer), the results can differ significantly. So even if reviewers or critics obtained a climate model, they could not replicate the results without knowing exactly what computing environment the model was originally run in.
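To make the "same code, different answer" point concrete, here is a minimal sketch (Python, nothing to do with the GRIMs code itself): floating-point addition is not associative, so when a compiler optimization or an MPI reduction reorders the very same sum, the bits change.

```python
def naive_sum(values):
    """Left-to-right accumulation, as a plain serial loop would do it."""
    total = 0.0
    for v in values:
        total += v
    return total

values = [1e16, 1.0, -1e16, 1.0]
serial = naive_sum(values)             # the 1.0 is absorbed by 1e16, then cancelled
reordered = naive_sum(sorted(values))  # same terms, a different association
print(serial, reordered)               # two different answers from identical data
```

That is the whole mechanism in miniature: nobody typed a wrong number, the hardware did nothing wrong, and yet the result depends on the order the machine happened to add things up.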

This raises that telling question: What kind of planet do we live on? Do we have an Intel Earth or an IBM one? It matters. They get different weather, apparently.

There is a chaotic element (or two) involved, and the famous random butterfly effect on the planet’s surface is also mirrored in the way the code is handled. There is a binary butterfly effect. But don’t for a moment think that this “mirroring” is useful: these are different butterflies, and two random events don’t produce order, they produce chaos squared.
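A toy illustration of that binary butterfly (Python, using the textbook logistic map as a stand-in, not any real climate code): perturb the starting value by one part in ten trillion, roughly a rounding difference, and the two trajectories soon bear no resemblance to each other.

```python
def logistic(x, steps):
    """Iterate the chaotic logistic map x -> 4x(1-x); a stand-in, not a climate model."""
    out = []
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        out.append(x)
    return out

a = logistic(0.3, 100)            # one "planet"
b = logistic(0.3 + 1e-13, 100)    # the same planet, one rounding flap different
max_gap = max(abs(p - q) for p, q in zip(a, b))
print(max_gap)                    # the tiny perturbation has grown to order one
```

The perturbation roughly doubles each step, so after a few dozen iterations the rounding noise *is* the answer.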

How important are these numerical discrepancies? Obviously they undermine our confidence in climate models even further. We can never be sure how much the rising temperature in a model's forecasts might change if we moved it to a different computer. (Though, since we already know the models use the wrong assumptions about relative humidity, and are proven wrong, missing hot-spot an' all, this is like adding sour cream to a moldy cake. We were never going to eat it anyway.)

This is what 90% certain looks like.

The cheapest way to lower global climate sensitivity might be to switch operating systems in climate lab computers.

The monster complexity of climate models means they never had a chance of actually solving the climate in the foreseeable future, but that makes them ideal for issuing unverifiable pronouncements from The Mount. At the same time it cultivates the endless grant-getting cash cow, always in need of bigger computer arrays: more code, more conferences, more computers!

Abstract

This study presents the dependency of the simulation results from a global atmospheric numerical model on machines with different hardware and software systems. The global model program (GMP) of the Global/Regional Integrated Model system (GRIMs) is tested on 10 different computer systems having different central processing unit (CPU) architectures or compilers. There exist differences in the results for different compilers, parallel libraries, and optimization levels, primarily due to the treatment of rounding errors by the different software systems. The system dependency, which is the standard deviation of the 500-hPa geopotential height averaged over the globe, increases with time. However, its fractional tendency, which is the change of the standard deviation relative to the value itself, remains nearly zero with time. In a seasonal prediction framework, the ensemble spread due to the differences in software system is comparable to the ensemble spread due to the differences in initial conditions that is used for the traditional ensemble forecasting.

Some details

Computers set up in large parallel clusters can produce a different result through many means:

Massively parallel computing cluster computers, which contain many networked processors, are commonly used to achieve  superior computational performance for numerical modeling. The message-passing interface (MPI) is the most commonly used package for massively-parallel processing (MPP). The MPI parallel codes of an atmospheric model may not produce the same output if they are ported to a new software system which is defined as the computational platform that includes the parallel communication library, compiler, and its optimization level in this study.

At this frontier edge of numerical calculations, the finer details of how each system handles things like rounding and Fourier transform shortcuts can affect the end result:

Rounding error is another important consideration in atmospheric modeling that can arise due to various characteristics of the computer software implementing the model, including: 1) the rounding method; 2) the order of arithmetic calculations; 3) intrinsic functions; and 4) mathematical libraries, such as the fast Fourier transform (FFT).
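Point 2 on that list, "the order of arithmetic calculations", is easy to demonstrate. This little sketch (mine, not from the paper) shows the same three numbers giving two different sums depending purely on where the parentheses fall:

```python
# The same three constants, associated two different ways.
left = (0.1 + 0.2) + 0.3   # the first addition rounds to 0.30000000000000004
right = 0.1 + (0.2 + 0.3)  # here 0.2 + 0.3 happens to round to exactly 0.5
print(left == right)       # False: same numbers, different rounding path
```

A compiler is free to pick either grouping, and an optimizer may pick differently at -O2 than at -O0, which is exactly the "optimization level" dependency the paper reports.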

REFERENCE

Hong, S., Koo, M., Jang, J., Kim, J. E., Park, H., Joh, M., Kang, J., and Oh, T. (2013) An Evaluation of the Software System Dependency of a Global Atmospheric Model. Monthly Weather Review. doi: http://dx.doi.org/10.1175/MWR-D-12-00352.1

h/t to The Hockey Schtick




217 comments to WARNING: Using a different computer could change the climate catastrophe

  • #

    In response to a question on another blog: Could the much publicized climate catastrophe be due to a rounding error?

    My answer was:

    In any numerical (finite math) simulation run long enough, the rounding errors become the result. This is true even if the equations translated into computer code are technically correct. It was in all the texts on the subject over 50 years ago. It should have been obvious. Yet the kids who write such crappy code don’t know enough history or math to balance their checkbooks.

    To answer your question: Yes! The much publicized climate catastrophe IS due to a rounding error.

    Unfortunately, there still is the belief that if a computer says so, what it says is true. The combined wealth that the world could have generated in a decade has been squandered on the monumental fraud of climate catastrophe because of that belief.



    • #
      Rick Bradford

      That’s the technical part.

      The social part comes from Cipolla’s Five Laws of Stupidity, which state, in part:

      3. A stupid person is a person who causes losses to another person or to a group of persons while himself deriving no gain and even possibly incurring losses.

      ….

      5. A stupid person is the most dangerous type of person.

      …feed that into…

      1. Always and inevitably everyone underestimates the number of stupid individuals in circulation.

      … and add in the comment that modern society actively promotes a higher ratio of stupid people in positions of influence, and it is obvious why we are in the present situation with regard to AGW (and a lot else).



      • #

        There is even a more fundamental law: In any large administrative (aka bureaucratic) system, the managers are promoted inversely proportional to their competence to do the job. The more competent you are, the faster you leave the system. I call it The Inverse Peter Principle.

        I saw this law at work in several public education systems as a Science Teacher in the early 1960s. I have yet to find an exception. However, I have discovered that it really doesn’t make much difference. The reason is that all large top down power and control systems will fail to reach their own goals let alone any rational goal.

        The mean time between failures exponentially decreases as the system grows to have more than four people necessary to execute a task. Such systems always fail to take into account the necessity of requisite variety by substituting something called “decisions” for discovery and rational action based upon that discovery. The information used by the so called decision makers is indistinguishable from noise. Success is by accident and ultimate failure is assured.



        • #
          Rick Bradford

          There is even a more fundamental law: In any large administrative (aka bureaucratic) system, the managers are promoted inversely proportional to their competence to do the job. The more competent you are, the faster you leave the system. I call it The Inverse Peter Principle.

          Because the competent people dislike office politics; incompetent people rely on it.

          And in organisations which don’t need to worry about results (i.e. the public sector), the top jobs have been filled by the incompetent for many years.

          And so, the office politicians rise to the top, where their ilk reside, and the competent people, whose hearts are simply not into climbing the greasy pole, remain in relatively lowly positions. Or, as you say, they move to a more congenial atmosphere, one where their competence is appreciated.

          Brown-nosing and schmoozing seem to come naturally to many people who lack talent — a compensation mechanism, perhaps — and incompetent leaders feel much happier with incompetent lieutenants.

          Birds of a feather flock together.



        • #
          bananabender

          It’s a bit like the military. The least intelligent officers frequently end up as generals because all the smart blokes retire early due to frustration. Dwight Eisenhower, Bernard Montgomery and (Australia’s own) Thomas Blamey all reached 5-star rank despite being widely regarded as incompetent hacks.



    • #
      turnedoutnice

      Predicted by Edward Lorenz in the 1960s…



    • #
      Roy Hogue

      In response to a question on another blog: Could the much publicized climate catastrophe be due to a rounding error?

      My answer was:

      In any numerical (finite math) simulation run long enough, the rounding errors become the result. This is true even if the equations translated into computer code are technically correct. It was in all the texts on the subject over 50 years ago. It should have been obvious. Yet the kids who write such crappy code don’t know enough history or math to balance their checkbooks.

      Indeed, one of the very first lessons I had to learn early in my career when constrained to do integer arithmetic was to arrange to do the division last. Otherwise the whole result, no matter how accurate the input, became or could become worthless. That’s an extreme case of rounding error (actually truncation) but it’s the same problem.
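Roy's "division last" rule, sketched in Python with hypothetical numbers (scaling 7 by the ratio 355/113, an old approximation of pi):

```python
def scale_bad(value, num, den):
    return (value // den) * num   # truncates first; the error is then multiplied

def scale_good(value, num, den):
    return (value * num) // den   # divides last; one truncation at the very end

print(scale_bad(7, 355, 113))     # the early truncation destroys everything
print(scale_good(7, 355, 113))    # close to the true value of about 21.99
```

Same inputs, same "correct" formula, and one ordering returns zero while the other is off by less than one unit.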

      The idea that you can put a computer to work grinding away on numbers forever is a fantasy. Even with good data in you can turn it into garbage out with ill considered design or just plain too many steps building on each prior result.

      Ain’t computers fun? ;-)

      Anyone for error analysis? It’s not impossible to come up with numbers for how large the errors introduced by rounding to a fixed precision can get by the time you print the result. And if the result is different on two different computer systems it’s time to go back to the drawing board and find out why.
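For a taste of that error analysis, a small sketch (illustrative random data, not a climate run) that measures the rounding error a naive accumulation picks up, using a correctly rounded sum as the reference:

```python
import math
import random

random.seed(42)  # illustrative data, fixed so the run is repeatable
data = [random.uniform(-1e6, 1e6) for _ in range(100_000)]

naive = 0.0
for x in data:        # plain left-to-right accumulation
    naive += x

exact = math.fsum(data)  # correctly rounded sum, the reference for error analysis
err = abs(naive - exact)
print(err)               # the rounding error the naive loop accumulated
```

For a hundred thousand well-scaled numbers the error is tiny; the point is that it is measurable at all, and that it grows with the length and conditioning of the calculation.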



    • #
      Doug Proctor

      How MUCH of a difference? Significant or not, in particular for the CAGW crowd.



  • #
    Grant (NZ)

    Just a geeky technicality… the distinction is not between a Linux world and an Intel world. Linux runs on the Intel architecture.



    • #

      Ahhh. True. So this is more of an Intel vs IBM thing? Thanks… will fix.

      Table 1 shows the 20 computing environments including Fortran compilers, parallel communication libraries, and optimization levels of the compilers. The Yonsei University (YSU) Linux cluster is equipped with 12 Intel Xeon CPUs (model name: X5650) per node and supports the PGI and Intel Fortran compilers. The Korea Institute of Science and Technology Information (KISTI; http://www.kisti.re.kr) provides a computing environment with high-performance IBM and SUN platforms. Each platform is equipped with a different CPU: Intel Xeon X5570 for the KISTI-SUN2 platform, Power5+ processor of the Power 595 server for the KISTI-IBM1 platform, and Power6 dual-core processor of the p5 595 server for the KISTI-IBM2 platform. Each machine has a different architecture and approximately five hundred to twenty thousand CPUs.



      • #
        Rereke Whakaaro

        Reaching my memory back to a time when I was intimately engaged in systems engineering, I would surmise that the problems are less to do with the actual hardware, and more to do with variations between Fortran compiler macro libraries.

        A Fourier Transform on two different compilers may have identical source code, but at the macro level, the calculation method could be totally different.



      • #
        Roy Hogue

        I would not put it past some compiler to make an optimizing decision that doesn’t do the best job. It also depends heavily on the floating point word length. The more significant bits the better.



        • #
          crakar24

          The more significant bits the better.

          I don’t follow. If the answer is 14.657483 and I treat it as an integer I will get 14; as a FP number I will get 14.657483, depending on whether it is an 8-, 16- or 32-bit word. The bigger the word (bit count) the greater the accuracy (is this what you meant?).

          Let’s say it’s 16-bit: then bit 0 is the least significant bit and bit 15 is the most significant bit.



          • #
            Roy Hogue

            Crakar,

            The point is that as a general rule the more bits you have the longer it takes for small errors (rounding) to become significant to your end result. But this is true only for floating point calculations and isn’t automatically going to help you either. For instance, the IEEE double precision floating point word gives you 52 significant bits. But once your numbers reach the point where 52 bits do not preserve enough precision your following calculations are introducing possibly big errors and there’s nothing you can do about it.

            If you use the single precision floating point format instead of the double then your problem of running out of significant bits happens a lot sooner in a long string of number crunching.

            These days, as someone has already mentioned, the trend is to make use of the massive parallelism of graphics cards to do number crunching. But this has its pitfalls possibly worse than using the CPU. For instance, the graphic card I’ve been working with has only the single precision floating point format available and worse, I have a choice between fast or exact arithmetic (what shortcut does “fast” take with my numbers?). Fortunately I get the exact math by default and also very fortunately, the number crunching I do is simple so there’s no problem.

            What I’ve been doing is real time surface plotting of incoming data and all the calculation I need is just the positioning of each point on that graph, a numeric model of the graph that can be passed off to OpenGL for rendering. If I was really interested in just doing fast number crunching I would need a much more capable graphic card offering double precision.
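Roy's single-versus-double point can be sketched by emulating 32-bit rounding with the `struct` module (an emulation of single precision, not real FPU behaviour):

```python
import struct

def to_f32(x):
    """Round a binary64 Python float to binary32 and back (emulates single precision)."""
    return struct.unpack('f', struct.pack('f', x))[0]

tenth32 = to_f32(0.1)         # 0.1 as single precision sees it
double_acc = 0.0
single_acc = 0.0
for _ in range(1_000_000):    # add "0.1" a million times in each precision
    double_acc += 0.1
    single_acc = to_f32(single_acc + tenth32)

print(double_acc)  # close to 100000: double precision holds up
print(single_acc)  # far off: single precision ran out of significant bits long ago
```

Exactly the same loop, and the only difference is how many bits survive each addition; the single-precision accumulator drifts by hundreds by the end.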



            • #

              The primary saving grace of graphics calculations is the finite resolution of the display device. The calculation only needs to be accurate enough for the error to be less than one pixel.

              The secondary saving grace of graphics calculations is the short path through the math model to compute the display pixel and its color value. Rounding errors don’t have time to accumulate to swamp the pixel resolution.

              The third saving grace is that the calculations are repeated from scratch at 20 to 200 Hz depending upon the circumstances.

              Then there is the fact that the eye blends the jittering of the small errors in a rapidly changing image so the image looks good to the eye even though each image is total crap.

              Climate models are different: there must be a fine 3D mesh mapped over the planet. The climate calculation must be run on each node with a short interval so the results approximately match dF(C)/dt, where C is the climate model. Then at each delta t, the results from the previous node calculation are the boundary conditions for the next calculation cycle. Finally, the sum of delta t must accumulate to a significant fraction of a century.

              If you make delta t too large, the results will be wrong. If you make the 3D Mesh too large, the results will be wrong. If you make either of them adequately small, the rounding errors will soon swamp any accuracy existing in the starting conditions. Finally, it is next to impossible to know the correct starting conditions to a sufficient accuracy to start the simulation run in the first place.

              The bottom line, a climate simulation no matter how well designed nor how large of a precision, will rapidly fail to predict the future. Three days locally perhaps. A week maybe. A century? No way, no how!
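The "delta t too large" failure mode shows up even on the simplest model you can write down. A sketch (a toy decaying ODE, nothing like a real GCM):

```python
def integrate(u0, k, dt, steps):
    """Explicit Euler on du/dt = -k*u, i.e. u_next = u * (1 - k*dt)."""
    u = u0
    for _ in range(steps):
        u = u * (1.0 - k * dt)
    return u

print(integrate(1.0, 1.0, 0.1, 100))  # small dt: decays toward zero, like the real ODE
print(integrate(1.0, 1.0, 2.5, 100))  # dt too large: |1 - k*dt| > 1, the "solution" explodes
```

The true solution decays no matter what; pick the step too coarse and the numerical one blows up anyway. Real atmospheric solvers face the same stability constraint, just in thousands of coupled variables.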



            • #
              crakar24

              Roy,

              Thanks for the clarification, what you say makes more sense to me now. I just have one question: exactly how many decimal places do you need to make the declaration that the temps in 2100 will be somewhere between 2.1C and 6.7C? (note the sarcasm in that question)

              Cheers



              • #
                Roy Hogue

                Crakar,

                Sorry I didn’t get back to see your response until now. But the answer to your question is the same as the answer to, “How many significant digits does the value of pi have?” ;-) Sarcasm is good! That’s the only way to deal with some of life’s problems.

                By the way, I remember reading somewhere that someone has now computed pi out to several hundred thousand digits or some wild number like that (forgot the exact number but it boggled my mind) and guess what? Pi still doesn’t repeat or terminate. I have no idea how you would design a program that would keep expanding the series for pi out to so many places, much less how you would definitively state that it doesn’t repeat. But I suppose that in every field there are some gluttons for punishment. I’ve no idea how much computer time must have been spent on this either.



              • #
                crakar24

                Roy,

                Very good point. What is the real value of Pi vs a usable value of Pi… I think 3.14 is sufficient.

                cheers



              • #
                Roy Hogue

                Hi Crakar,

                Indeed, what is a usable value of pi? And it depends a lot on what you’re doing and how accurate the result has to be. For some work 3.14 will be fine but when you need something better here’s what you do. The Microsoft Calculator will give you pi as 3.1415926535897932384626433832795. That’s 31 decimal places. You copy that to the clipboard and paste it into your program as the value of something you name Pi and let the compiler handle converting it to floating point. The conversion will be as accurate as the available 52 bits will permit. Then you don’t have any worry that what you use for pi is doing anything serious to your calculations. Or at least you’re confident that any problem introduced by Pi is unavoidable.

                It’s simple and avoids the question of, “What is a usable value of pi?”

                The real problem in some of the work I’ve done is what happens when you need 10^x (base 10 exponential). x doesn’t have to get very large to overflow the double precision word and then you’re dead after that.

                Cheers back at you!
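Both of Roy's points here check out in a few lines (the 31-digit constant below is the Calculator value he quotes):

```python
import math

# 31 decimal places pasted in, as Roy describes; the interpreter rounds it
# to the nearest 53-bit double, which is exactly math.pi.
pi_31 = 3.1415926535897932384626433832795
print(pi_31 == math.pi)

# Roy's 10**x problem: the double format tops out just past 1.8e308.
try:
    y = 10.0 ** 309
except OverflowError:
    y = float('inf')   # "you're dead after that"
print(y)
```

So the pasted constant costs nothing beyond the unavoidable 53-bit rounding, while the exponential really does fall off a cliff at x = 309.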



              • #
                Geoff Sherrington

                Roy,
                I’m doing some follow up on matters like the accuracy of the physics of assigning the heating effect of IR on CO2, going right back to pre 1900 and the assumptions behind the Planck and Wien equations & ab initio derivation of Schrodinger for simple gases. There are physics errors but there are also likely to be computing errors that modern computers might not have rechecked. Do you have access for a few minutes of use on a large machine?
                sherro1 at optusnet.com.au



      • #
        Wally

        Not so much Intel vs IBM.

        More its related to small differences in how floating point numbers work on different processor types AND maths libraries.

        COMPETENT computer systems courses have been teaching the limits of floating point numbers for decades. You never rely on floating point numbers for long-running accumulation-type calculations because of the known behaviour of floating point errors.

        Essentially it is IMPOSSIBLE on a computer to have a perfectly precise floating point number – its all an illusion. Floating point numbers are an internal representation that approximates the actual number.
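Wally's "illusion" is easy to exhibit: ask Python for the exact value the machine actually stores when you type 0.1.

```python
from decimal import Decimal

# Decimal(0.1) reveals the exact binary64 value stored for "0.1":
print(Decimal(0.1))      # 0.1000000000000000055511151231257827021181583404541015625
print(0.1 + 0.2 == 0.3)  # False: each side carries its own representation error
```

The decimal literal you typed and the binary number the hardware holds are two different numbers; every calculation downstream works on the latter.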



    • #
      Tel

      Linux runs on the Intel architecture.

      And runs on everything else too, including your kitchen cheese grater… but mostly on Intel.

      Hmmm, actually these days mostly on AMD64 which is now also supported by Intel after Intel admitted that AMD was right and Itanium was a flop.

      So this is more of an Intel vs IBM thing?

      There’s a number of factors in play, floating point units are notoriously difficult to get 100% proven results out of when it comes to round off errors, then there are compiler issues, and even computer language issues. For example, the C language defines two types of floating point values, 32 bit and 64 bit, but neither of these are supported natively inside an Intel floating point unit which traditionally uses 80 bit floating point registers. Compilers making careful use of the stack-based FPU architecture are able to get 80 bit precision for quite a lot of intermediate calculations and still be a correct implementation of the language when it comes to storing that result back into memory as a 64 bit value (imposing round-off at some stage).

      Then there are the SSE extensions (which come in various flavours), which were designed for 3D applications favouring speed over accuracy; they support various sizes of register 16, 32, and 64 bits, but mostly 32 bits, which is what the game developers tend to prefer.

      If I haven’t lost anyone yet, the latest thing is AVX-512, an Intel extension supporting 512 bit wide registers, but these contain packed data (i.e. multiple floating point numbers, not 512 bit precision numbers).



    • #

      Not true, even the version of the operating system, even the software you use on the operating system, can make a difference.

      The reason is, software and operating systems have a choice of what math conventions to use – the processor supports more than one level of precision.

      So for example, there was a huge problem in banking (which I used to work in) reconciling Microsoft Excel spreadsheets to software written in C++ (then later in C#), because Excel uses a different floating point precision to just about everything else.

      Add to that other possibilities, such as high speed software which uses the computer’s graphics card (which is actually a very powerful parallel computation engine), lots of minor variations in configuration which could influence math conventions, the actual math libraries (which manage issues such as rounding errors – Macs for example use a different convention to PCs), and you have a large number of factors which could contribute to different outcomes.

      This whole issue is so funny – I’m kicking myself for not seeing it. Because I used to have to deal with this issue all the time.



      • #
        Tel

        I thought the banking industry had made a pact to always work in whole cents and keep right away from floating points. That’s been the case wherever I have worked on accounting software, and online web shopper stuff too.

        I guess there’s economic modelling, which is quite another animal entirely…



        • #

          Merchant banking – models to reconcile interest rates with foreign currency rates, models to detect instabilities in portfolio positions (volatility, Greeks – delta, gamma, vega, etc.).

          You rarely get the same answer from the Excel model and the software – its a real pain to figure out whether the deviation is caused by Excel’s internal oddities, or a mistake in the software (or sometimes a mistake in the Excel model).



        • #
          Backslider

          and online web shopper stuff too

          No, in ecommerce we use four decimal places for more accurate currency conversions (rounded to two after conversion, naturally). I’m sure that banking also would use more than two decimal places for the same.
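A sketch of that convention, using decimal arithmetic so every platform agrees bit for bit (the rate and amount are made up for illustration):

```python
from decimal import Decimal, ROUND_HALF_UP

rate = Decimal('0.6425')    # hypothetical four-place FX rate, as described above
amount = Decimal('19.99')   # price in the source currency

# multiply exactly, then round once, back to two places (cents)
converted = (amount * rate).quantize(Decimal('0.01'), rounding=ROUND_HALF_UP)
print(converted)
```

Because the representation is decimal and the rounding rule is stated explicitly, an Excel sheet, a C# service and a Python script can all be made to agree, which is precisely what binary floating point does not guarantee.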



  • #
    Tel

    Ohh Memory Lane

    Brian G Valentine, June 20, 2009 at 3:02 am

    It is known that the CGM must be periodically stopped and then restarted at the conditions of blow-up of solutions from mesh element to mesh element. This is done by means of flux corrections – wherein new initial conditions are generated by (some other) models for the calculations to begin again. Thus the model simulation at any given time is no better than the errors introduced by resetting the initial conditions, which, at any given time, are unknown (if future) or interpolated (from incomplete data of the past)

    Tel, July 19, 2009 at 11:11 am:

    In other words, the whole AGW thingy is at the very limit of the error margin and might quite reasonably be impossible to verify if the errors are only slightly larger than we currently believe. Hansen and friends know this very well. Edward Lorenz spent years studying and documenting the difficulty of making both measurements and predictions when dealing with chaotic time series. Lorenz is pretty close to retired now and quite sensibly doesn’t want to get too closely involved with the whole debate, but his old papers are easily available to download.

    John Blake, December 16, 2009 at 2:26 pm

    In 1960 Edward Lorenz of Chaos Theory fame asked, “Does the Earth have a climate? The answer, at first glance obvious, improves on acquaintance.”

    Due to “sensitive dependence on initial conditions” (the so-called “butterfly effect”), Lorenz asserted that any “complex system” –one with three or more interacting variables– will be non-random but indeterminate: Non-linear, neither amenable to detailed analysis nor capable of extrapolation in any form. In other words, long-term “climatic” changes in Earth’s ultra-complex atmospheric system are mathematically impossible either to model or predict.

    Andy G, January 24, 2010 at 4:23 am

    Another thing that I find odd is that Chaos theory came partly from observations (by Lorenz) that his very simple weather models gave some very unexpected and random results, yet now climatology has been fixed and is non chaotic and accurate…

    Neil Fisher, May 14, 2010 at 2:29 pm

    All things are not and will not be equal in any case. That is to say, it is not just temperature that will change, but also many other attributes including, but not limited to: albedo, precipitation, humidity, atmospheric circulation patterns, ocean circulation patterns. Please remember that climate (and even weather!) are highly complex coupled non-linear systems that appear to present as chaotic. Such systems are notoriously difficult to simulate accurately over long time frames, being extremely sensitive to initial conditions.

    BobC, October 18, 2010 at 3:20 pm

    Climate is predicted by Global Circulation Models (GCMs), which predict weather. While the serial prediction of weather does indeed predict a climate, it is fabulously unlikely that it will predict our climate (and you would not be able to tell if it was). This is because the weather is a chaotic system — its future track is sensitively dependent on very small differences in initial conditions.

    There is only one way to prove that a chaotic system can be predicted, and that is by demonstration. No amount of back-calibrating or theoretical analysis is relevant. So far, GCMs have demonstrated weather prediction capabilities out to about 10 days. They have not shown any skill at multiple-year climate prediction beyond simple linear extrapolation (curve-fitting).

    All cut-and-paste from this site. We don’t need to say any more.



    • #

      Terrific find Tel, well done mate and thanx for the memory lane.



    • #
      edwina

      Yes, chaos theory has been around long enough for any educated person to realise the futility of forecasting by computer something that involves a myriad of ever-changing inputs. The butterfly effect can be looked at two ways. That is, does a butterfly wing flapping in the Amazon cause a tornado in Kansas, or does a tornado in Kansas cause an Amazon butterfly to feel it should beat its wings? Impossible to tell.

      Then, as a USA Secretary of State once pointed out; there are the known facts and the unknowns…(or something like that.) The point is there are very likely many more factors influencing weather and climate than we know about yet and to be discovered. Thus, the “science is not in.”

      There is work on creating quantum computers where atoms or small particles are used as qubits, which can be 1 and 0, or both at the same time. Such computers may be able to do more than all the computers to this day combined, and in a non-linear way. Even so, I doubt these could cope with climate forecasting, because the old adage GIGO will still apply.



    • #
      Roy Hogue

      Ohh Memory Lane

      That nails it up on the barn door for all to see — at least for those who have eyes with which to see it. The rest will go on as usual I fear.



    • #
      Mark D.

      Holy crap, that is a scary memory……



  • #
    MadJak

    Range and Precision my friends, range and precision.

    They do follow good SCM practices, don’t they? After all, replicability should be a primary concern of any scientist’s work!

    Oh, but what am i saying, that’s right, we’re talking about undisciplined rabble who don’t even want to share their data – let alone even know or want to produce a configuration status accounting report!



  • #
    Louis Hissink

    This tends to happen when soft science types wander into the physical sciences and assume machines are perfect.

    Don’t wait for a mea culpa though.



  • #
    Antisthenes

    Very good post, i.e. it supports my prejudices and thus I like it. I’m a long-term user of computers – my first one was almost the size of my current girlfriend’s car. I use them and don’t try to love them or understand them. Even now, when something goes wrong and I can’t fix it within a reasonable time, say five minutes, I wait for a visit by one of my sons. They, of course, “fix” the problem by changing everything. But I noticed, throughout those years, that my carefully worked out ‘things’ suddenly make no sense when imported from Windows to Linux or vice versa. The mind boggles at what happens when the high science of climate predictions gets this treatment. Shouldn’t someone work it out on an abacus? Soon?



  • #
    Yonniestone

    Amazing it works!
    I just ran the GMP program through the combination of a Commodore 64 MOS system and the Atari 2600, the result is the planet will experience an atmospheric transition identical to the gases found in Uranus.
    Upon contacting the IPCC they were excited by the results but insisted I add “Intergovernmental” in the description, so now it’s the Global Intergovernmental Model Program or GIMP for short.

    So when evil deniers dare argue the relevance of the IPCC and gases in Uranus all they have to do is bring out the GIMP.



  • #
    Rereke Whakaaro

    I was just sitting down to validate some of these models by hand – the good old fashioned way.

    But then I got to thinking, would it matter if I used a wooden pencil? It might be a bit slow, I thought, so I decided to use a modern mechanical pencil to speed things up.

    Well having done my calculations with a Pentel 0.5 pencil, I thought I should verify it, before rushing to publication.

    So I redid the calculations using a Staedtler 0.7 pencil, and got totally different results.

    I have checked my workings thoroughly, and they are correct. All I can think of, is that it must be a hardware error. :-(

    Another bottle of Pinot, I think …



  • #
    AndyG55

    It is very strange that they are getting such huge differences between systems.

    We run simulation stuff on a Linux system and on PCs, and while we do occasionally get very slight differences, nothing anywhere near the scale that seems to be indicated here.

    I’m guessing it’s because they haven’t considered enough divergence issues in their coding.

    That would be because they are so full of themselves. Non-scientists are often like that.



  • #
    AndyG55

    Chuckle..

    Was just thinking how a building would turn out if it were designed with one of these programs, but with parts calculated on different systems.

    Guggenheim eat your heart out !!



    • #
      MemoryVault

      Didn’t something like this happen with the Airbus A380, where the Germans were using one software program for the wiring, and the British/French were using another?

      One was for aluminium wiring, and one for copper, and the compatibility problems caused months of delays.



      • #
        meltemian

        My nephew was sent to Toulouse to sort out the wiring for the A380 wings. When he got there he found they had just inverted the diagram from one wing to use on the other! He had to explain the error of their ways.



      • #
        Rereke Whakaaro

        Some of the early NASA engineering had similar problems – some of the structural mountings had a tendency to come loose.

        Eventually, it came down to the holes being drilled with an imperial drill, and the bolts being metric sizing (or the other way around).



        • #
          crakar24

          Hahaaa,

          I should not laugh, as I have borne witness to numerous aeronautical cockups in my time. The best would have to be watching the explosive bolts go off on the JAXA plane, separating it from the rocket whilst still on the launch pad (man, that was funny).

          I am sure we can all remember that space probe thingy drilling into the Martian surface due to incompatibility between metric and imperial measurements.



  • #
    John Brookes

    Wow these computer models are good. Just like the real thing they have uncertainty built in!



  • #
    AndyG55

    How does the saying go..

    “To err is human, to really stuff things up requires a computer.”

    or was it:

    “Computers allow you to multiply your errors very rapidly.”



  • #
    Andrew McRae

    Someone once told me that mathematics was an organised way of going wrong with confidence. It’s one of those jokes that is said only half-jokingly because it has a nugget of truth inside it.

    Even if the physical theory was a perfect model of reality, and even if the formulation of the simulation in math was sound, and even if the math could be implemented with infinite precision on a machine, the result after simulating 30 model years will be too vague to be useful if the initial inputs were not a perfect replica of the state of the entire Sun-Earth system last Tuesday at 9am.

    Kinda makes me wonder if they could dispense with all the expensive computers and just go back to using an abacus. The result would be no less precise but at least there’d be some sport in the whole activity:

    The modern Chinese abacus uses beads on rods. Its use requires some training, but once it is mastered it allows extremely fast and accurate calculations. This was demonstrated in 1946 in a competition between “the most expert operator of the electric calculator” of the United States army and the “champion operator of the abacus” of the Japanese postal service. The contest ended 4:1 for the abacus.

    Actually the result would be even more accurate than with computers. The simulation could proceed no faster than realtime, so after 30 years of flicking beads they could just look outside the window and tell you exactly what the weather was at the end of the simulation. Perfect accuracy! :D

    I jest of course. This is where one must distinguish between precision and accuracy. Precision is internal to a process and is a function of repeatability and a low variance about a mean. Accuracy is judged relative to an external standard. One way to remember it is with a shooting range analogy. If you aim carefully at the target and shoot many bullets in a tight grouping that is several centimetres off-centre, you have got high precision but low accuracy.

    The precision of the abacus and the electro-mechanical calculator relied mainly on operator proficiency. These days the precision of silicon CPUs is perfect aside from occasional cosmic ray impacts with memory registers (it does happen but it’s rare). The accumulated error in consecutive calculations is not really the CPU’s fault because the CPU will do it the same way every time, it’s just in the way floating point arithmetic is defined that the CPU is designed to intentionally throw away precision after each multiplication. It’s a popular design choice, and with good reason. Otherwise the space needed to store a single number would quickly grow to be too large for available memory. The rate of calculation would slow exponentially and grind to a halt.

    The Z3, built by Konrad Zuse in Germany in 1941, had an overflow indicator to tell the user that the calculation exceeded the tolerated inaccuracy of floating point.
    What can today’s climate models learn from Konrad Zuse? :)

    “Uh, Mr Pachauri, Sir, I know it’s only the second day of the first simulation run, but the computer has already stopped and told us it can’t calculate the climate any further ahead than seven years.”
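    The intentional rounding described above can be seen in a couple of lines of any language. A minimal Python sketch (the values are chosen purely for illustration):

```python
# Doubles keep 53 significand bits, so every arithmetic result is
# rounded to the nearest representable value.
a, b = 0.1, 0.2
print(a + b)              # 0.30000000000000004, not 0.3
print((a + b) == 0.3)     # False

# Rounding also breaks associativity: near 1e16 the spacing between
# adjacent doubles is 2.0, so adding 1.0 can vanish entirely.
x = 1e16
print((x + 1.0) - x)      # 0.0 -- the 1.0 was rounded away
print((x - x) + 1.0)      # 1.0
```

    The same program, compiled with a different optimiser that re-orders those operations, can therefore produce a slightly different answer from identical source code.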



  • #

    How the hardware (and software) environment affects precision is something I had drummed into me during my Comp Sci degree. Usually this is not something to get concerned about, as most programs do not feed the results of their previous run back into their next run.

    Trouble is, most models do exactly that – so in effect a model running a few cycles is, in mathematical terms, a bit like taking a load of complex formulas and feeding them back into themselves – complete with errors due to measurement and errors due to representation. Then doing this a few hundred or thousand times.

    In this situation, unless you are very, very careful, the errors from representation will keep accumulating (little bits of rounding adding up, complete with the odd ‘bug’ in your FPU or some ‘casting’ bug in a library or interface).

    The net result is that very quickly the output of the model is determined more by the sum of all the error factors in its environment of execution than by anything to do with the original starting data set.

    I think one of the real problems here is that most environmental types haven’t the background to understand the significance of errors and rounding when running models. It just wasn’t seen as something ‘significant’ to cover in enough depth to understand the dangers…
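    The feedback-of-errors effect described above can be sketched with a toy Python example. The logistic map stands in for a "model" here, and the parameters are illustrative, not from any GCM:

```python
# Two algebraically identical update rules for the same feedback
# "model". A compiler is free to pick either arrangement; each rounds
# differently, and iteration then amplifies the tiny difference.
def run(steps, update, x=0.5):
    for _ in range(steps):
        x = update(x)
    return x

r = 3.9  # chaotic regime of the logistic map

a = run(100, lambda x: r * x * (1.0 - x))   # r*x*(1-x)
b = run(100, lambda x: r * x - r * x * x)   # r*x - r*x^2
print(a, b)        # both stay within [0, 1], but are no longer close
print(abs(a - b))  # typically of order 1 after ~100 chaotic steps
```

    Nothing in either formula is "wrong"; the divergence is entirely the accumulated, amplified representation error the comment describes.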



  • #
    AndyG55

    The thing you also need to remember is that if the models hindcast to Hadcrud or Giss, they are ALWAYS going to overestimate any real future warming because of the false trend built into those 2 records.

    The models have ZERO chance of a correct prediction even if they weren’t written on toilet paper.



  • #
    Joe V.

    Never trust a computer.
    Get several of them, all of different designs, to come up with the same answer.
    Even then, they might all be wrong.
    Always remember to validate your tools against the real world, before relying on their output.
    Even then, they may still be wrong.
    A computer and its programs are just tools. Best used by skilled operators who understand their limitations. Not just by other tools.



  • #
    Bob Massey

    The truth is, Jo, computers can’t calculate mathematical functions without being programmed to do so. The errors come in when functions such as Maclaurin series, s-plane transforms and Fourier series, just to name a few, are used to do these calculations.

    This is reasonable, considering the functions are quite complex; even a simple series can lead you to consume a small forest if done by hand. A computer is the obvious choice, and by far a better one than the HP programmable calculator I had to use during my Electronic Engineering studies.

    These errors occur because most of these calculations take the form of a series, which means a large number of recursive operations with a different factor for each term. The programmer then has to truncate the series, for efficiency, at a point where he is happy with the result; but this is not the true value of the function, only a very close approximation. No computer can calculate pi exactly, for example; it would still be ticking away hundreds of years from now, with millions of decimal places but still no true result.

    So these huge calculators the IPCC call GCMs may be accurate to 20 decimal places, though I suspect not, and they are at the mercy of the programmer, not the climate scientist.

    I do recall one calculator which, when a very large number was divided by 9 and the result then multiplied by 9, gave a different answer from the starting figure. I hope these climate scientists take that into account when they do their calcs. Rounding errors can be very pronounced.

    I can even recall a huge fuss some years ago about a certain CPU that had errors in the maths processing unit integrated into the CPU. All sorts of legal ramifications, but it still had a problem.

    This is why the code and data for these GCMs must be specified when the various studies are released, so experienced programmers and scientists can verify the data and replicate the experiment. True peer review, instead of the pal review we currently have.

    Thanks for the article, Jo. I haven’t been back over this stuff for nearly twenty years and it was a challenge to recall as much as I could. Keep up the good work :)
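    A short Python sketch of the truncation being described here (the function, term counts and test value are illustrative, not from any GCM):

```python
import math

# Maclaurin series for sin(x), cut off after a fixed number of terms.
# The programmer chooses where to stop, so the result is always an
# approximation, never the "true" value of the function.
def sin_series(x, terms):
    total, term = 0.0, x
    for n in range(terms):
        total += term
        # each term is the previous one times -x^2/((2n+2)(2n+3))
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))
    return total

for terms in (2, 4, 8):
    err = abs(sin_series(1.0, terms) - math.sin(1.0))
    print(terms, err)   # the error shrinks as more terms are kept

# And the rounding the calculator anecdote hints at: ten additions
# of 0.1 do not make exactly 1.0.
total = sum([0.1] * 10)
print(total == 1.0)     # False
```

    Keeping more terms shrinks the truncation error but costs more arithmetic, which is exactly the trade-off the programmer, not the scientist, ends up making.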



  • #
    ROM

    After going through the rapidly increasing number of posts from the computer software engineering / coding world denizens on WUWT, plus here on Jo’s site, I am starting to ask whether this is the break point for the whole global warming / climate change / climate extremes belief system.

    Over the last few years I have read Steve McIntyre’s, Judith Curry’s, Roy Spencer’s, Richard Lindzen’s, Roger Pielke Sr’s and many others’ takes and comments on climate modelling, spread out over various blogs.
    Even a couple of years ago there were repeated claims from climate professionals of a skeptical bent that the climate models were far from what they were cracked up to be in terms of any predictive capability.

    But climate catastrophism then reigned supreme, and the more skeptical scientists, including a few lukewarmers in the list above, were ignored by the arrogant climate modellers who knew they knew it all.
    And if they didn’t know it, they made it up, safe in the knowledge that nobody would have the know-how or cojones to challenge their political and scientific power and influence.

    But this time around, the piling-on against the abject standards of the climate models by software engineering professionals is astonishing. It is almost as though they have been building up a big head of steam about the quality of the climate models and the incompetence of the climate modellers, and as professional coders and software engineers have finally reached breaking point and needed to get it off their chests.

    The origin of the Korean paper itself, An Evaluation of the Software System Dependency of a Global Atmospheric Model, is interesting, as it has come from outside the centres of the CAGW faith in western Europe, America and Australia.

    So it has looked at the global warming faith, its creation and origins and, most importantly, its basic theoretical underpinning, the backing of the entire catastrophism belief by climate models, through a fresh set of eyes from a different culture.

    The Korean paper has given the software engineers and coders the opening to tear down the entire climate modelling profession on a strictly professional basis, as modellers and coders. They are now in the process of showing just how abjectly bad those climate models, and the coding and software engineering behind them, really are.

    The consequences could be very severe for the climate modelling game, because those incompetently coded climate models and the predictions emanating from them are the entire basis on which the global warming faith is founded, and the entire basis and rationale on which all governmental so-called de-carbonisation policies have been built.

    Those climate models and their predictions are also the basis on which close to a trillion dollars of global wealth has been spent on totally ineffectual de-carbonisation programs; on which rests, in considerable part, the increasingly desperate financial situation of many western nations, through the immense subsidisation of the so-called renewable energies; and from which has followed the human suffering created by high-cost energy, with low-income earners in some countries, including Australia, having to decide whether to “heat or eat”, and energy costs immensely and unnecessarily inflated across the western world.
    All of these nefarious effects have been a consequence of the supposed predictions and outputs of what professional coders and software engineers are now telling us are incompetently coded climate models, effectively worthless in their predictions.

    Regardless of what happens from now on, the history of over-the-top and ultimately stupid claims of supposed climate-induced catastrophes, all based entirely on the output of climate models, together with the growing analysis of major and basic coding failures, has saddled climate modellers and their models for evermore with the reputation that they are basically incompetent at coding, and that their models have never been able, and possibly never will be able, to predict the future climate either globally or regionally. [Pielke Sr says regional climate modelling is even worse in its predictive capability than global climate modelling. CSIRO, anybody?!]

    Due to past rigidity, increasingly well-documented corruption in climate science, and a very, very large dose of arrogance and hubris on the part of the climate warming scientists and their incompetently modelled catastrophic predictions of a global warming armageddon in the not very distant future unless we “did something”, climate warming science is facing its own moment of truth. In all their arrogance and hubris, the cultists of the global warming faith and the watermelon believers never thought they themselves would ever be the ones who had to explain what went wrong, and why.
    And why the world and so many of its citizens had to pay such a high price, just because a bunch of very highly paid, warming-believing software coders didn’t really have a clue about the correct way to code the critical climate computer programs that were supposed to predict the climate of the world of the future.



    • #
      Streetcred

      Great discourse, Rom. “professional”



    • #
      John Brookes

      Those climate models and their predictions are the entire basis on which the expenditure of perhaps close to a trillion dollars of global wealth

      I know it’s rude to pick on an individual statement while ignoring the “broad brush” feel of what you are saying, but I’ll be rude.

      Of course the trillion dollars of global wealth is not based entirely on the climate models. Really simple models predict warming. Really ordinary thermometers are measuring the warming. And clever people measure the concentration of CO2 in the atmosphere. Satellites measure the declining summer sea ice minimum, and measure the sea level rising.

      If abandoning fossil fuel burning was not so difficult, we’d happily say that there was enough evidence to stop burning them. But it is difficult, and there are vested interests who don’t want us to stop burning fossil fuels. So we have to do more work to prove that there really is a problem – hence the more complex models.



      • #
        Rereke Whakaaro

        John,

        As usual, you miss the main point.

        The climate is varying — it always has, as far as we can tell, and it probably always will; and the level of atmospheric CO2 is varying, as it has in the past, and probably always will.

        Now this presents three possibilities: a) Changes to CO2 concentration drives temperature; b) Variations in global temperatures force changes in the level of atmospheric CO2; c) There is no causal relationship between the level of atmospheric CO2 and temperature, or temperature and the level of atmospheric CO2.

        Climate Science, and Political Science, have strongly supported option “a”, not because there is any empirical proof — there isn’t, but because of the Precautionary Principle. It is the only one of the three options that mankind can take some form of action on. And once you assume that there is a problem, and once the voting public think there is a problem, then politically you have to be seen to have a solution to that problem.

        Going back to the other two options: There is some evidence for option “b” in the ice core records, which show that the level of atmospheric CO2 tends to follow the temperature trend by around 700 years. But since mankind has, to date, found no way to control temperature (or any other significant part of the climate), there is no political mileage to be gained from following that line.

        Similarly, there is no political up-side with option “c”, only the endless need to respond to the vagaries of weather as and when they occur, which can never be done quickly enough to meet public expectations, and answer the question, “Why isn’t the Government doing more … ?”.

        So, once Hansen, et al, had created the possibility of a catastrophe — firstly with fears of a new ice age in the 1970′s, and then with a fear of the climate overheating — the politicians had no alternative, but to get on board and put more funding into climate research. In fact, they not only climbed on board, they positively jumped at the opportunity, because they had something to blame for their own lack of action. In return, the climate scientists got funding, upon which a few careers have been built.

        The reality is almost certainly option “b”, because that is where the empirical evidence points. But in the 20th and 21st centuries, public opinion, and the political alignment with public opinion, finds the idea of a catastrophe much more titillating than the knowledge that our 70-year lives are not really significant when compared to 700-year natural cycles.



  • #
    AndyG55

    This is why VALIDATION is so important..

    And NOT ONE of the climate models has EVER, EVER been validated.



    • #
      Manfred

      I was just about to mention this, AndyG55. In fact, there have been a number of articles highlighting the point – showing us that the models never acquired face validity – ever. Didn’t hindcasting confirm this? Now we know (again) that the natural chaotic melange that is climate defies long-term prediction, both by its nature and through the tools used to predict it – except, of course, for Rereke’s pinot-soaked chicken entrails.

      It has always been, is and will always be a sociopolitical, eco-theocratic belief driven, poverty creating totalitarianism. It should be spurned much as one would spurn a rabid dog.



      • #
        Grant (NZ)

        It makes the comments from some of the believers on this forum look even more desperate.

        “The models predicted… And they were right”.
        “According to the models… and this has proven so.”

        1. You need a hole in the head to believe any prediction these models have made (ever); and
        2. You need a second hole, at right angles to the first, to believe that reality has turned out to be anything like what the models predicted.



        • #

          One of the fundamental problems is that the real climate system was supposed to be controlled by the climate simulations. Unfortunately, the real climate system was so unimpressed with the quality of the simulations that it refused to cooperate with them and did its own thing, as it has done for nearly four billion years, ignoring man and his computational toys all the while.

          The modelers assumed the problem was an inadequate computer, so they asked for many billions of dollars for better computers. They got their better computers, but the climate system still refused to cooperate. So the modelers did what modelers do and asked for still better computers costing a lot more money. The modelers repeat this endlessly and get the same lack of cooperation from the climate system. This is so far past insane that there are no numbers large enough to measure the distance – not even a number as large as the US national debt.



  • #

    Having written (notionally much simpler) structural analysis programs, from first principles and without resorting to fancy libraries, I can vouch for the orders of magnitude more care required when doing any numerical analysis that relies on incremental changes.

    You find that (A + B – C) doesn’t exactly equal (A – C + B) if the magnitude of the values is substantially different … in my case for example, fractions of a millimetre and tens of metres.

    Compilers that turn the programmer’s expression into a sequence of machine instructions will optimise differently for a different processor. The number of floating point registers varies substantially from one system to the next – from as few as zero to many thousands (in array processors of e.g. graphics cards), even on desktop systems. Compilers will take into account what variables they dealt with previously and try to keep frequently-used variables loaded in the processor’s registers as long as possible, and will re-arrange the order of actual calculations, obeying the traditional rules of (e.g. commutative) arithmetic.

    My earlier remarks on the subject are over at WUWT.
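    That (A + B – C) versus (A – C + B) effect is easy to reproduce. A Python sketch with magnitudes like the ones mentioned above (the actual values are illustrative):

```python
# Mixing tens of metres with fractions of a millimetre, in metres.
A = 30.0      # tens of metres
B = 0.0004    # 0.4 mm
C = 30.0

left = (A + B) - C   # B is partly rounded away when added to 30.0
right = (A - C) + B  # the large terms cancel first, so B survives intact
print(left == right)  # False
print(left, right)
```

    A compiler re-ordering the expression between builds, or between processors, silently switches you from one of these results to the other.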



  • #
    blackadderthe4th

    Just how good are NASA climate/weather models?

    http://www.youtube.com/watch?v=SfN9F9dvjaQ



  • #

    Just goes to show that this hyped AGW is BS. See my take on the Pres’s climate speech: http://bit.ly/17MBPYk



  • #
    Escovado

    Being a software engineer for the past 30 years, I have to shake my head and chuckle. The previous posts in this thread by others in my profession have already covered anything I would have to add. The incompetence and dishonesty of these so-called climate “scientists” never ceases to amaze me.



  • #

    It was all bloody obvious from the get go that it was a baseless fraud. However the drive to “do something” overcame reason and the nose dive into the ground began decades ago. I wonder what inscription the ET’s will use on the historical marker beside our memorial crater.

    How about: “Here lies a terminally stupid and remarkably clever race of creatures who met their end because they had to DO SOMETHING in spite of not knowing what to do. This even though they knew that do nothing, wait and see, then adapt to circumstances was generally the better choice.”



  • #
    alan neil ditchfield

    I once calculated the Cartesian coordinates of 540 points of a geodesic dome with a FORTRAN program. I then proceeded to confirm them with an EXCEL spreadsheet. The computation involves thousands of complex trigonometric equations of spherical geometry, supposed to have exact results. I got unexpected random numbers. Further scrutiny showed there was different treatment of quadrant definition: in FORTRAN an arc of 0.00000001 degrees is interpreted as zero, and defined as the start of the first quadrant; in EXCEL the first quadrant is counted from 0.00000001 to 90 degrees; a world of difference in rounding errors. Once this was taken into account, EXCEL confirmed the FORTRAN calculation.
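    A sketch of this kind of quadrant-convention mismatch in Python (the snapping function and its tolerance are hypothetical, purely for illustration):

```python
import math

# One environment treats an angle within some tolerance of a quadrant
# boundary as lying exactly on the boundary; another does not. The
# same input then yields different trigonometric results.
def snap_to_quadrant(deg, eps=1e-7):
    """Hypothetical convention: angles within eps of a boundary snap to it."""
    for boundary in (0.0, 90.0, 180.0, 270.0, 360.0):
        if abs(deg - boundary) < eps:
            return boundary
    return deg

deg = 1e-8  # an arc of 0.00000001 degrees, as in the comment
print(math.sin(math.radians(snap_to_quadrant(deg))))  # 0.0: treated as zero
print(math.sin(math.radians(deg)))                    # small but non-zero
```

    Multiply either result through thousands of chained coordinate equations and the two conventions no longer agree.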



  • #
    Ed Caryl

    So, the GCMs only model the climate inside the computers, not in the outside world. Suspicions confirmed!



  • #
    graphicconception

    I have always thought that the greatest achievement of climate science was to locate the position of every butterfly on the planet.



  • #
    Ace

    They’ve got a long way to go before the Matrix comes true!



    • #
      Bulldust

      Skynet might be a tad closer … maybe one of the models will go rogue and take over a drone army! Hey, about as realistic as any of their predictions…



      • #
        Ace

        I think Metropolis was uncannily prescient.
        Those worker guys slaving to keep up with the machines. Ain’t that modern life as we know it?
        Actually far more true to life than the over-boasted Blade Runner.
        We just need a glamour-bot to save us.



  • #
    Bruce

    Rounding errors are a well-known problem in complex computer calculations.

    As expected, this is not well-known among climate scientists.

    But we knew we were dealing with a monkey farm.



  • #
    janama

    OT – just to keep you up to date: In my local paper here in Dubai:

    Seven residents have received tips from the former United States vice president, Al Gore, in how to convince people of the need to combat climate change.

    Timothy Paul, Allyson Sosa, Hema Shetty, Clay Gervais, Clara and Lisa Harter and Subind K joined more than 400 delegates from 77 countries at a three-day training session in Istanbul organised by the Climate Reality Project, which is chaired by the American environmental campaigner.
    All delegates spent a day in training with Mr Gore and received detailed advice from him on how to most effectively present the latest in climate science

    Read more: http://www.thenational.ae/news/uae-news/environment/uae-ambassadors-for-the-environment#ixzz2aNN3s5JA



  • #
    crakar24

    Well, you don’t need a computer model to understand this

    http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/seaice.recent.antarctic.png

    and we still seem to be circling the bowl at the other end with a slight comeback (maybe the wind blew the other way the last couple of days)

    http://www.ijis.iarc.uaf.edu/en/home/seaice_extent.htm



  • #
    WheresWallace

    As a programmer, “like der”!



    • #
      crakar24

      As a programmer, “like der”!

      WW, I seem to recall a number of computer models predicting the exact opposite, hence my comment. Would you care to engage me in meaningful debate on the subject, or are you merely interested in childish tantrums?



      • #
        WheresWallace

        My comment was chronologically after yours, not a reply to it. Stop being so insecure.

        But if it’s engagement you’re after.

        Arctic Sea ice: in decline for 30 years and accelerating beyond all model projections.
        Antarctic Sea ice, a slight incline over the same period, thought to be more related to changes in wind patterns according to the science.
        Antarctic Total Mass Balance also in decline.

        So what’s your point? Or by debate do you mean cherry picking a small bit of evidence?



        • #
          crakar24

          Oh come on, Wally, you and your ilk sit in the balcony and snipe all the time; your latest comment was another stupid cheap shot: “as a programmer, like der”.

          This statement actually implies you are a programmer; however, after several conversations with you I have come to the conclusion that you could not possibly have the intellect to write software programs, even at the elementary level required for climatology.

          Re your latest puerile purge:

          Arctic Sea ice decline for 30 years and accelerating beyond all model projections.

          So the models are wrong. No, don’t give me that “it’s worse than we thought” bullshit; the models are wrong, useless, a waste of time and money. If the models are wrong, then how do we know the programmers understand the concept? It is possible they just got it wrong because they have no idea what they are modelling, rather than having merely “erred on the conservative side”. Since when does a climatologist err on the conservative side?

          Antarctic Sea ice, a slight incline over the same period, thought to be more related to changes in wind patterns according to the science.

          I just posted a pic from NOAA proclaiming Antarctic sea ice is at an all-time high for this time of year; obviously you mentally blanked it out, so here it is again.

          http://arctic.atmos.uiuc.edu/cryosphere/IMAGES/seaice.recent.antarctic.png

          Note the extent here, Wally, and then reread the bullshit excuse you make for it, whilst simultaneously telling us:

          Antarctic Total Mass Balance also in decline.

          It’s time you came up with a new bag of tricks, Wally, maybe a few new excuses? Instead of just believing the shit you are fed. You have been given a brain; my God man, use the f*&^%ing thing.



          • #
            WheresWallace

            So the models are wrong

            Always will be – like der! That’s not good news, by the way. Accuracy in modelling would help us know the impacts of our actions better.

            I just posted a pic from NOAA etc

            Yes, I acknowledged that when I also repeated that “Antarctic Sea ice, a slight incline over the same period”.

            But I also pointed out that you are cherry picking one small piece of evidence and discarding all the others.

            When faced with the fact that Antarctic Mass, which includes land-based ice, is in decline, rather than “engage me in meaningful debate” you produce insults like “childish tantrums”.



        • #
          ghl

          “…accelerating beyond all model projections.”
          Rotten models all the way down.



  • #
    john robertson

    Surely this is the design feature of these vague, unverified exercises in programming.
    Accuracy is to be avoided when you are making stuff up.
    This has always been the joke: the absolute certainty, to thousandths of a degree, from made-up and assumed-linear depictions of natural cycles that we do not understand.
    So much more “on message” to just make the results up; an average of the models???
    More BS from the “maths is hard” crew.



  • #
    michael hart

    I guess all this might explain why Phil Jones doesn’t spend much time alone with Excel (not that I can brag about it).



  • #
    Scott Scarborough

    Yes, Willis pointed out on WUWT that the computer climate models can be replicated with a one-line equation.

    http://wattsupwiththat.com/2013/06/03/climate-sensitivity-deconstructed/

    That equation probably, in some way, just represents the rounding error of the hardware and software that the computer model was run on. Somewhat simpler than the actual climate.
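For readers who haven’t followed the link: the kind of one-line emulator Willis describes treats temperature as a lagged linear response to forcing. The sketch below is a generic one-box version; the functional form and the `lam`/`tau` values are illustrative assumptions, not his fitted equation or parameters.

```python
import math

def emulate(forcings, lam=0.5, tau=3.0):
    """One-box lagged response: T relaxes toward lam*F with time constant tau."""
    decay = math.exp(-1.0 / tau)
    T, out = 0.0, []
    for F in forcings:
        # Each step: keep a decaying fraction of the old state,
        # blend in the equilibrium response to the current forcing.
        T = T * decay + lam * F * (1.0 - decay)
        out.append(T)
    return out

# Toy forcing: a linear ramp from 0 to 3.7 W/m^2 over 50 "years".
ramp = [3.7 * i / 49 for i in range(50)]
temps = emulate(ramp)
print(f"response after 50 years: {temps[-1]:.2f}")  # lags behind lam*3.7 = 1.85
```

The point of the exercise on WUWT was that something this simple tracks the model outputs closely, which says more about the models than about the climate.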



  • #
    pat

    Tel -

    Tech sites should have tackled Climategate from Harry’s view. Instead, we got a tech blogger at the UK Tele, Ian Douglas, doing a bit of gate-keeping:

    December 2009: UK Telegraph: Ian Douglas: Harry_read_me.txt: the climategate gun that does not smoke
    I wonder how many of the harry_read_me cheerleaders have really gone through it, rather than just reading a few cherry-picked quotes from sceptical blogs. Plough through the source and one thing becomes clear: this is not someone trying to massage data or cheat the scientific community, this is Harry heroically manhandling a huge, diversely-formatted and randomly error-strewn set of numbers into a coherent and understandable form. That’s not a disgrace to science or an anti-social conspiracy, it’s a definition of model-making…
    Comment by starfiregps:
    Lord Monckton says it so much better –
    “Ian Douglas joined the Telegraph in 1999 ….. He is a bedwetter”.
    Actually, I think he’s looking for a job at the BBC…
    http://blogs.telegraph.co.uk/technology/iandouglas/100004334/harry_read_me-txt-the-climategate-gun-that-does-not-smoke/

    the above is not surprising, given this was Douglas just weeks before:

    September 2009: UK Telegraph: Ian Douglas: A grown-up response to climate change
    The decision to take the climate science for granted in this report might attract some criticism from followers of Ian Plimer, author of the million-selling but ludicrous Heaven and Earth, bible of the denial camp…
    http://blogs.telegraph.co.uk/technology/iandouglas/100003521/a-grown-up-response-to-climate-change/

    Ian Douglas is the Telegraph’s head of digital production. He’s the male cuckoo in the Telegraph Wonder Women nest, writes on science and technology then sneaks off to his beehive in north London and writes about that too.
    http://www.telegraph.co.uk/journalists/ian-douglas/



  • #
    Ian H

    Um guys. You are in danger of making too much of this. Really!

    Even a godlike computer with insane accuracy and a perfect description of the climate taking all factors into account would behave in this way. Tiny roundoff errors and small changes in initial conditions would lead to dramatically different predictions. It is like this because that is how the real climate behaves. It is a chaotic system. That is what a chaotic system does.

    The different roundoffs, although they cause very different looking predictions, are unimportant. You could get the same behavior via a tiny variation in the initial conditions (indeed there is a theorem which tells us this). And we don’t know (cannot know) the initial conditions accurately enough for the differences here to matter.

    This result isn’t at all surprising. It is expected. It doesn’t invalidate the models – the real climate also behaves this way – it is exquisitely sensitive to tiny perturbations.
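Ian H’s point about sensitive dependence is easy to demonstrate. A minimal sketch, using the logistic map as a stand-in for a chaotic system (an illustration of chaos in general, not any actual climate model code): a perturbation the size of a single rounding error eventually dominates the trajectory.

```python
def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the chaotic logistic map x -> r*x*(1-x) and return the path."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two runs differing by roughly one double-precision rounding error.
a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-12)

divergence = [abs(x - y) for x, y in zip(a, b)]
print(f"difference after 3 steps: {divergence[3]:.1e}")   # still tiny
print(f"largest difference seen:  {max(divergence):.1e}")  # grows to order one
```

The early steps agree to many decimal places; within a few dozen iterations the two runs are completely decorrelated, which is exactly what swapping compilers or hardware does to a chaotic model run.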

    The problem from my point of view is that the models have been oversold as accurate predictors of future climate. They have been used as black boxes to generate scary looking graphs. And the public has not been informed of their limitations.

    Actually this is funny. To those who understand what the models are actually supposed to do, this is simply expected behavior. But the public has been allowed to think that the models are godlike black boxes for predicting the future. Now those who write models have the unenviable task of properly explaining to everyone just how limited their models actually are. That will be fun to watch.

    The proper test of these models is to look at whether they predict the correct dynamic behavior. Do the models display El Ninos and La Ninas? Does the southern ocean oscillation appear spontaneously? What about the mini ice age, and warming periods like the 1930s? Do they predict episodes like these? Models that don’t display the correct dynamic behavior are invalid and should be discarded. Most models fail this test. Yet people are still using them to scare people by pretending the graphs they generate say something real.



    • #
      John Brookes

      At least you make sense!

      With weather forecasting, the bureau runs many different simulations and gets all sorts of different results. They take all the results into account when deciding what to actually forecast. I imagine they discard models that consistently perform badly.



      • #
        Streetcred

        By way of observation, jb, the “bureau” clearly does not “discard models that consistently perform badly.” Their weather predictions, even 24 hours out, usually suck. From what I read they use the same models for climate prediction … they should “discard models that consistently perform badly”, like all of them maybe?

        Dr Tisdale shreds the UKMO on many things including climate models:

        CLIMATE MODELS

        Let’s discuss another of the boldfaced quotes from the Executive Summary of the UKMO report. They wrote:

        The fundamental physics of the Earth system provides the basis for the development of numerical models, which encapsulate our understanding of the full climate system (i.e. atmosphere, ocean, land and cryosphere), and enable us to make projections of its evolution.

        We’ve illustrated in a number of posts over the past few months that the climate models prepared for the IPCC’s upcoming 5th Assessment show no skill at being able to simulate:

        * Global Land Precipitation & Global Ocean Precipitation
        * Global Surface Temperatures (Land+Ocean) Since 1880
        * Greenland and Iceland Land Surface Air Temperature Anomalies
        * Scandinavian Land Surface Air Temperature Anomalies
        * Alaska Land Surface Air Temperatures
        * Daily Maximum and Minimum Temperatures and the Diurnal Temperature Range
        * Hemispheric Sea Ice Area
        * Global Precipitation
        * Satellite-Era Sea Surface Temperatures

        And we recently illustrated and discussed in the post Meehl et al (2013) Are Also Looking for Trenberth’s Missing Heat that the climate models used in that study show no evidence that they are capable of simulating how warm water is transported from the tropics to the mid-latitudes at the surface of the Pacific Ocean, so why should we believe they can simulate warm water being transported to depths below 700 meters without warming the waters above 700 meters?

        The climate science community may believe they understand “the fundamental physics of the Earth system”, but the performance of their models indicate their understandings are very limited and that they have a long way to go before they can “make projections of its evolution”. If they can’t simulate the past, we have no reason to believe their projections of what the future might hold in store.



      • #
        AndyG55

        “I imagine they discard models that consistently perform badly.”

        Can’t be many left , then !



  • #
    pat

    heard a lengthy radio report last week (bbc i think) on the uncertainties inherent in Hawk-Eye technology, with calls for a disclaimer each time it is used. it’s amusing how seriously MSM takes sport reporting, yet i can’t recall such IN-DEPTH reporting over the uncertainties in the CAGW modelling:

    8 July: Guardian: Hawk-Eye at Wimbledon: it’s not as infallible as you think
    But how accurate is Hawk-Eye? A paper published in 2008 in a journal called Public Understanding of Science suggests that the way Hawk-Eye analyses are presented in sport may lead people to incorrectly assume that its output is definitely what happened. Hawk-Eye presents a great opportunity to discuss uncertainty, confidence intervals, and the joy of stats, so here’s a Monday morning maths class…
    The Hawk-Eye paper suggests that although stats are tricky to understand (and it should be pointed out that scientists can fall foul of misunderstanding or misinterpreting them too), it’s easier to understand uncertainty when there’s a burning interest in being able to do so…
    Cameras cannot record every moment of the ball’s flight, due to frame rate limitations, so between frames the trajectory of the ball must be estimated. With regards to cricket, where LBW calls are questioned, Hawk-Eye extrapolates beyond where the ball hits the pad, and predicts whether it would have hit the stumps or not.
    A model’s ability to predict the future path of a ball depends on a number of factors. The further a ball travels before it stops, the easier it is to predict where it would have carried on to. Therefore Hawk-Eye is likely to be less accurate the further towards the batsman the ball bounces, and the further away the batsman is from his stumps. Though Hawk-Eye technology takes some uncertainties into account, its purpose is to give a binary outcome: “out” or “not-out”…
    ***The article suggests that more information about that uncertainty should be reported to the television audience, to more honestly show the variation in the possible true paths of a ball. For example they suggest showing a ball’s predicted location, and the confidence intervals that surround it (if 95% this would mean there’s only a 5% chance the ball actually fell outside this larger area)…
    If Hawk-Eye could provide a measure of uncertainty around its prediction, it wouldn’t make its decision any more controversial, argues the article. The results could aid the umpire, even if the margin of error for the technology is reported and explained…
    http://www.guardian.co.uk/science/sifting-the-evidence/2013/jul/08/hawk-eye-wimbledon

    30 May: ABC: AFP: Scientific eyes may drop the ball
    In a commentary appearing in the journal Nature, science journalist Nic Fleming says it is misleading to promise fans that the technology would eliminate errors like the denial of an apparently legitimate goal by England midfielder Frank Lampard in the 2010 World Cup…
    Many people would believe they were seeing an accurate, real-time snapshot of what happened, when in fact the images are a computer reconstruction of an array of two-dimensional pictures, he says…
    ***He argues for a disclaimer on the graphics so that people will understand there was a scientific margin of error to consider…
    “This would help people gain a clearer understanding that science is usually based on probabilities, not definitive answers,” he says.
    “Based on this we could have much better discussions about the role of science in public debates around climate change, nuclear power and genetic modification, for example.”…
    http://www.abc.net.au/science/articles/2013/05/30/3771096.htm



    • #
      Ace

      “Many people would believe they were seeing an accurate, real-time snapshot of what happened, when in fact the images are a computer reconstruction of an array of two-dimensional pictures, he says…”

      You can apply that to brain imaging and amplify it by ten…or a hundred. Th phrase “blindd by scinc” (ie, bedazzld by pseudo-scinc) coms to mind. It should b called “brain scamming”.
      No brain image thatwas vr mad actually depicts th thing it represnts. Its purely an artificial way of represnting numerical data gatherd by th scanning procss. But here we ar bing shown thse things as supposdly photographs of the brain…neigh (down boy) …of the “mind” ven.

      (No “e’s” were forcd to participate that didnt want to in the above commnt).



  • #
    Roy Hogue

    On the other hand, don’t be too hard on those poor computers and their software. If all you had to work with were the numerals 0 and 1, you might have as much trouble getting the right result as the computers do.

    Remember, if god had intended man to use binary we’d all have only two fingers, one on each hand. No! Binary isn’t natural at all. So just keep that in mind and be grateful that you have computers to do all that hard work. ;-)



  • #
    crakar24

    I played Battle Field 3 the other night and all the roads had suddenly become a greenish color; a quick Google search told me the latest Windows 7 update was the cause.

    If a simple Windows update can cause a change in the color of a game, then it’s no wonder a climate model gives a different result depending on what PC you are using.

    When you consider our Dandy Handy calculator had to resolve the temps down to 3 decimal places so as to get a separation of results between IPCC scenarios, and now this… one has to question just how good these models are.



    • #
      Dave

      Crakar 24,

      I think BOM may be using Battle Field 3 for the climate models and predictions?



    • #
      Roy Hogue

      Crakar,

      There ain’t no such thing as a simple Windows update. Be glad that most of the time the updates don’t make your computer crash.

      I have learned to be very careful about letting Microsoft update hardware drivers in particular. They don’t always have it right. Case in point: I have an HP scanner for which Microsoft offered me an updated driver. I installed it and it promptly broke the scanner. Since it didn’t offer an uninstall option I had to go back and reinstall all the scanner stuff from scratch.

      Graphic drivers are another source of headaches. If you’re having no problem with what you have right now, then don’t go installing new drivers. The theory is, if it ain’t broke, don’t fix it.



  • #
    Safetyguy66

    I was at a project planning meeting on Friday, the meeting was about the construction of an irrigation scheme in NE Tas. One of the groups at the meeting was the planning authority responsible for environment considerations (State Gov.).

    We were discussing a proposal for a pump station on a river bank. They started talking about positions for the pump station with consideration being given to ensuring the station was positioned above the “100 year flood mark”. I couldn’t resist and asked the question “so is that the 100 year flood mark including or excluding climate change”?

    The answer kind of surprised and delighted me all at once. It was basically, “yes we have calculated the 100 year flood mark and also the 100 year flood mark including a factor for climate change, the difference was 600mm”. So even in the highest echelons of the Tas. State Gov. environment planning dept. (a dept. not known for its… um… even handed views on these matters) they have a maximum factor of 600mm over the next 100 years as their provision for rising water levels and whatever else they figure would raise the 100 year flood level.

    It’s a long way from Tim’s predictions of the “under water opera house”. Just interesting that it’s quite a realistic figure; it’s kind of “within the margin of error” and nothing more…

    I smugly sipped my Govt. tea and enjoyed the rest of the meeting in between drifting off on how boring it was lol.



  • #
    Peter Q

    The traditional programming languages on IBM mainframe systems had very clear definitions of how calculations were sequenced (parentheses and implicit parentheses) and how rounding was handled. You also had other ways to make sure you were minimising the extent of truncation and rounding.
    Regrettably, the modern programmer seems to just assume that the plethora of similar-but-different programming language implementations, running on a multitude of slightly different operating systems, all running on a disparate range of hardware, will magically produce the same results…
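Peter Q’s point about evaluation order can be seen in one line of arithmetic. A minimal sketch: floating-point addition is not associative, so a compiler or optimizer that regroups a sum can legitimately change the answer.

```python
# Floating-point addition is not associative: regrouping changes the result.
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c    # the big terms cancel first, then 1.0 survives
right = a + (b + c)   # 1.0 vanishes into -1e16 before the cancellation

print(left)   # 1.0
print(right)  # 0.0
```

Both expressions are algebraically identical; in IEEE double precision they differ by exactly the quantity a model is supposed to be resolving. This is the kind of difference that accumulates over millions of time steps when the same source is built with different compilers or optimization flags.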



  • #
    janama

    OMG – OT – the latest lefty dreamers’ wank – they are out in force with lines like “That’s the way to go!”

    http://www.solarreviews.com/blog/Ivanpah-Steams-Ahead-5-27-13

    “The giant Ivanpah solar electric generating system (SEGS) in the California desert is more than 92 percent complete as BrightSource continues construction on the behemoth concentrating solar power (CSP) system, which generates electricity via a steam turbine. Once complete, its three towers will provide a combined 377 megawatts of solar generation for Southern California Edison and Pacific Gas & Electric under long-term Power Purchase Agreements (PPAs).

    When Ivanpah is finished, it will be among the largest solar plants in the world. This solar system will likely be the first major CSP system to come online in the U.S., although SolarReserve’s 110 megawatt (MW) Crescent Dunes Solar Energy Plant located near Tonopah, Nev., also is slated for completion toward the end of 2013. The SolarReserve system has an added component, thermal storage, which will allow it to provide power on par with baseload power from a coal or natural gas-fired power plant.”

    sure, I’ll believe it when I see it.



    • #
      MemoryVault

      .
      SITUATION VACANT – MIRROR POLISHER

      Must be able to polish 180,000 mirrors per week.

      Polishing must be carried out without making physical contact with the mirrors, as this may affect their alignment.

      .
      Please Note: 149,546 of these mirrors are not accessible by vehicle, so cleaning will be limited to the use of hand-held tools.



      • #
        Heywood

        Aahhhh. One of the many ‘Green’ jobs that have been promised.

        At least AAD’s kids will have meaningful work.

        (AAD is Michael – Arrogant Annoying Dickhead)



        • #
          janama

          How is it that they always make two claims: 1. that it will power so many houses, which is total BS unless people can live with power only from 9am – 3pm; and 2. that it will create so many jobs (temporary labour jobs at most).
          For this waste of time and money the government has given them a $1.6 billion loan!



          • #
            Carbon500

            We get the same nonsense in the UK about how many houses a wind turbine will supposedly generate electricity for.
            At the moment, there are around 4,000 operational turbines which generate about 4% of all the electricity generated in a year over here.
            Not the sort of figures we’re likely to be given by the windmill operators.



      • #
        Streetcred

        Battery powered handheld tools ;)



    • #

      I am utterly sick and tired of slack journalism that doesn’t bother to check facts, but then again, they wouldn’t have the faintest idea of what to look for anyway, so they just spout gems like this, thanks janama:

      The SolarReserve system has an added component, thermal storage, which will allow it to provide power on par with baseload power from a coal or natural gas-fired power plant.

      What doesn’t help is that the info put out by the people who propose plants like this is obscure to those who read it, because they cover facts with meaningless jargon that says one thing and means another thing altogether.

      So then, let’s do a work up and actually show what some simple checking can actually find, shall we.

      You know how I say that everything works backwards from the generator, to the turbine which drives it, to the steam that drives the turbine, to the heat to make that high temperature high pressure steam.

      So, a monster nuke with a 1200MW generator needs a monster turbine to actually turn over the huge weight of the rotor in that generator at the required speed. Humungous amounts of steam drive that turbine, this steam from the immense heat generated by the nuclear reaction.

      The Chinese are now driving the same sized generators from coal fired sources, the new technology USC plants, and burning considerably less coal in the process of doing that.

      Old style 70′s technology plants (like the biggies here in Oz) drive 660MW generators, because that’s the biggest one (by power rating) they can attach to the turbine, because that’s linked to the steam that they can make to drive that turbine.

      See the point here?

      It’s all dependent upon the most steam they can make to drive the largest turbine for that steam, hence hooking up a matched generator to that turbine.

      So then, now we come to this Ivanpah Concentrating Solar plant that the journo says will compete with 24/7/365 coal fired power because it has heat storage.

      They add some blah blah blah to impress the gullible readers, who nod their heads and say well done you guys.

      Steam – Turbine – Generator.

      Okay then, here’s the turbine, the middle step of all that, based solely around the steam this plant can make.

      It’s the Siemens SST-900, info at this link. (pdf document 4 pages)

      Now, note at the top it says ….. up to 250MW, and in the intro under that, says this turbine can be used for all those types of plant, so that up to 250MW means this turbine can be used to drive a 250MW generator, provided the steam can be made for that purpose, eg fossil fuel plants. What you see at this link is just the turbine itself.

      So now let’s go and look at the blurb for the Ivanpah Concentrating Solar Plant, shown at this link.

      This indicates that the plant will be using that SST-900 turbine, and in this case it will be driving a 130MW generator, because that’s all the steam it can make.

      So, there will be three of them, hence 390MW.

      But hey, this plant can compete with coal fired power on a 24/7/365 basis.

      Yeah, right!

      So 390MW X 24 X 365.25 gives a theoretical (100% of Capacity) yearly output of 3,418,740MWH.

      Note at that blurb the Electricity Generation, where it gives the yearly total of 1,079,232MWH.

      That gives this plant a Capacity Factor of 31.56%. Now, as that is the total across the whole year, then that equates to a daily average of 7 hours and 34 minutes.

      Now, you tell me. How close is that to 24/7/365?

      Not within a bull’s roar.

      Keep in mind that’s the year round average so higher in Summer, and lower in Winter, probably down to around 5 hours on a bright and sunny Winter’s day.

      Competitive with coal fired power. Well, you be the judge.
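Tony’s arithmetic checks out. Here is the same calculation laid out, using the figures quoted from the fact sheet:

```python
# Ivanpah capacity-factor check, figures as quoted in the comment above.
nameplate_mw = 390                       # 3 towers x 130 MW generators
hours_per_year = 24 * 365.25
theoretical_mwh = nameplate_mw * hours_per_year   # 100% of capacity, all year
actual_mwh = 1_079_232                   # stated yearly generation

capacity_factor = actual_mwh / theoretical_mwh
avg_hours_per_day = capacity_factor * 24

print(f"theoretical output: {theoretical_mwh:,.0f} MWh")   # 3,418,740
print(f"capacity factor:    {capacity_factor:.2%}")        # ~31.6%
print(f"full-power hours:   {avg_hours_per_day:.2f} per day")  # ~7.6
```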

      Oh, and one last thing: where the journo claimed competitive because it has thermal storage, well, look at the last line on that blurb fact sheet where it says this.

      Thermal Storage
      Storage Type: None

      Oh dear! Fail.

      Also, there’s 173,500 heliostats, mirrors mounted on tables that track the path of the Sun across the sky, umm, every mirror table driven by an electric motor, so, umm sort of subtracting from the power the plant delivers. Spread over three separate plants covering 3,500 acres. (5.5 Square Miles)

      And all this at a bargain basement price of only, umm, $2.2 Billion.

      Need I add ….. only in California.

      Tony.



      • #
        Heywood

        Well written Tony.

        Thermal Storage
        Storage Type: None

        It is also worthy of mention that even if it did have thermal storage, then a fair chunk of the solar energy harnessed by the plant would need to be diverted into heating the molten salts required to sustain production after the sun goes down. This has the effect of reducing the output of the plant during the day.

        The stored heat would also slowly degrade during the night and thus, the output from the plant would slowly decline until it can no longer produce enough steam to drive the turbine at all. In the mean time, the coal fired plant will keep chugging along at the same rate, regardless of the time of day.

        How they can say that a plant such as this is capable of reliable base load is beyond me.

        If we hypothetically replaced 100% of Australia’s generation with these plants (unrealistic, I know, but indicative of how ludicrous it would be), a quick “back of the envelope” calculation reveals the following;

        In order to meet the Australia-wide MINIMUM base load of 18,000MW, and assuming that we are in full sunlight at that moment in time across the country, we would need to build a minimum of 47 of these plants, at a cost of US $103 Billion, taking up an area of 164,500 acres (665 sq km).

        Remember, this is based on the THEORETICAL maximum output of this plant. It doesn’t take into account maintenance, dusty mirrors, broken tracking motors etc., and it only refers to daylight hours at minimum base load.
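Heywood’s back-of-envelope numbers reproduce exactly from the Ivanpah figures quoted earlier in the thread:

```python
import math

# How many Ivanpah-sized plants to cover Australia's minimum base load
# (all inputs as quoted in the comments, not independently verified).
base_load_mw = 18_000
plant_mw = 390
plant_cost_usd = 2.2e9
plant_acres = 3_500

plants = math.ceil(base_load_mw / plant_mw)   # round up: partial plants don't exist
cost_bn = plants * plant_cost_usd / 1e9
acres = plants * plant_acres
sq_km = acres * 0.0040469                      # 1 acre = 0.0040469 km^2

print(plants, f"${cost_bn:.1f}bn", f"{acres:,} acres (~{sq_km:.0f} km^2)")
```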



        • #

          Thanks Heywood.

          Here’s a link to an image of a graph explaining how solar energy generates electrical power for a Concentrating Solar Plant with heat diversion.

          The actual total power generation is shown on the right-hand side Y axis (vertical). Note how that with heat diversion the maximum power is marked there at 50MW, shown by the very bottom solid orange line. This is theoretical, because it hasn’t been achieved yet. Also note the time frame for power generation ….. not the claimed 24 hours but barely 17 hours, again, theoretical.

          So far, the best achieved was that infamous Plant in Spain which managed a full 17MW for one full 24 hour period. This same plant has a Capacity Factor of 62%, which equates (on a year round average) to just under 15 hours.

          So then, let’s replace Bayswater, shall we.

          For the same Capacity, then, at 17MW, we would need 156 of these plants, and that’s, umm, $101.4 Billion, and you still only get power for 15 hours a day.

          For the same power output, we would now need 190 of them, hence $124 Billion, and you STILL only have power for 15 hours a day.
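Back-solving Tony’s Bayswater figures: Bayswater’s ~2,640 MW nameplate and the roughly $650M per-plant cost are inferred from his numbers rather than stated in the comment, so treat both as assumptions.

```python
import math

# Replacing Bayswater with Gemasolar-sized plants, per Tony's figures.
bayswater_mw = 2_640      # assumed nameplate, implied by "156 plants at 17 MW"
achieved_mw = 17          # best sustained output of the Spanish plant he cites

plants_needed = math.ceil(bayswater_mw / achieved_mw)    # 156
implied_cost_bn = 101.4 / plants_needed                   # ~$0.65bn per plant

print(plants_needed, f"${implied_cost_bn:.2f}bn each")
```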

          If they have trouble finding finance for ONE of these types of plant (Chinchilla, now folded, and with no heat diversion), even after the Federal and State governments GAVE them half the up front money ($750 Million of taxpayers’ money just given to them), then you tell me how soon ALL these plants will be up and running to replace just the ONE Bayswater, and there’s six plants around the same size as Bayswater here in Oz, so you do the math on that.

          So, just ballparking here, then by the required date of 2025, that’s having one of these plants up and operational at the rate of two of them a week from now until 2025 to replace coal fired power, and you still only get power for 15 hours a day. As you laugh loudly at that, consider that from proposal to the turning of the first sod is between five and seven years.

          So worry not dear friends, your money is safe, because this will NEVER happen.

          I just point out how ridiculous it all is.

          Look again at that THEORETICAL graph produced by a computer model.

          Tony.



          • #
            Heywood

            “So far, the best achieved was that infamous Plant in Spain which managed a full 17MW for one full 24 hour period”

            The Gemasolar plant has a whopping 19.9MW (capacity) generator which would be relatively easy to turn over using captured heat compared to a 130MW generator in the Ivanpah plant.

            I wonder if the thermal storage capacity vs heat output is a linear relationship?? ie. if the generator is 6 times the capacity, does the thermal storage have to be 6 times bigger?

            Any thoughts??



      • #
        Heywood

        Also worthy of note is the other line in that link.

        Fossil Backup Type: Natural gas

        So when the sun goes down (or behind a cloud), the fossil fuels kick in….



        • #

          I specifically didn’t mention it because I just KNEW someone would pick that up.

          So then, this type of plant has no chance to get up here in Australia, as it will be subject to the CO2 Tax, and each year will have a reduced power generation as the government lowers that cap.

          Imagine a much feted renewable plant having to be subject to the ETS. Oh, the bitter irony!

          Tony.



          • #

            Incidentally, that now failed Chinchilla Solar Dawn Project was for a 250MW plant without heat diversion. It would have run 5 X 50MW generators, and one of them would have had two drives, the solar power drive and a secondary Natural Gas (NG) fired turbine driving the one 50MW generator.

            However, information provided to me from Solar Dawn was that the NG fired turbine was restricted to just 15% operation (of this one generator). That was to keep CO2 emissions under the proposed Cap of Julia’s top 500 derdy polluders.

            However, under the ETS every CO2-emitting entity has to pay the current price on CO2, so this plant would be virtually inoperable, as the added impost of the CO2 cost would severely impact their profit margin.

            Academic now, as the proposal failed for want of private financing.

            Tony.



  • #
    Ace

    …who here, back in the day, could use them:
    At a guess, Tony from Oz and Rareke Whackero.
    and Eddie thingamajig as well.



  • #
    Brian G Valentine

    The whole argument is predicated upon the idea that “climate sensitivity” is a “parameter” with more physical significance than a welfare cheque for a Government-supported “researcher.”

    The most effective way to lower the “value” of “climate sensitivity” would be to pay people to prove that, if it has any significance at all, it is not distinguishable from natural variation within the limits of measurement.

    We would witness that value plummet like a stone.


    Report this

    50

  • #
    crakar24

    Well, here’s a turn for the worse (pun intended) for the warmbots: it looks like we may have a reprieve, and the death spiral is officially on hold until this time next year, when of course we will be bombarded with the next round of doomsday-cultist proclamations about an ice-free Arctic.

    http://www.ijis.iarc.uaf.edu/en/home/seaice_extent.htm


    Report this

    21

  • #
    MemoryVault

    Totally O/T but I hope Jo and the mods forgive me.

    The USS George Washington, one of the largest warships in the world, plus as far as I can tell, the entire “Forward Deployed Fleet” of some 20 plus ships, is currently in Brisbane, from yesterday until Friday.

    I mention this only to highlight what passes for “reporting” at our taxpayer-funded ABC. The only mention of this huge fleet being in Brisbane was buried in a story about four pelicans that got smeared in a slick caused by an oil leak in the harbour.

    The entire story was about the leak and the poor birds. Right near the bottom was a throw-away line by the QLD Minister for Environment, who remarked that he didn’t think the spill would affect the visiting American warships. That was it, and I can’t even quote it for you, because now the ABC have even removed that one, single reference from the story.

    .
    I appreciate that ABC reporters are not big fans of the US military, but surely one of the biggest, most advanced flotillas of warships in modern history parked on our doorstep for a week is at least worthy of a story and maybe a photo?

    I mean, it’s not as if people aren’t going to notice. Even if they miss all the bloody huge, grey ships in the harbour, bristling with weapons, aircraft and helicopters, I’m pretty certain the 10,000 plus sailors on R & R over the week will catch their attention around the place.


    Report this

    50

  • #
    pat

    U.N. rejects call to slash CDM fees, opens regional centres
    LONDON, July 29 (Reuters Point Carbon) – The U.N. last week rejected pleas from project developers to slash fees for emission reduction projects registered under the Clean Development Mechanism (CDM) and instead vowed to plough cash into regional centres to boost the number of schemes in some of the world’s poorest countries…
    http://www.pointcarbon.com/news/1.2487344

    30 July: Bloomberg: Ari Natter: Moniz Touts CO2 Capture, Says Coal in “our Energy Future for Decades”
    PHOTO CAPTION: Secretary of Energy nominee Ernest Moniz arrives for testimony before the Senate Energy and Natural Resources Committee on April 9, 2013 in Washington, DC. Moniz, a nuclear physicist, testified before a full committee hearing on his pending nomination…
    Coal and other fossil fuels will be “a major part of our energy future for decades,” Energy Secretary Ernest Moniz said July 29 during a speech in which he said research on the development of clean coal technologies is needed to combat climate change…
    As part of a broad plan to combat climate change laid out by President Obama in June, the administration announced July 1 it is offering up to $8 billion in Energy Department loan guarantees for carbon capture and other environmentally friendly fossil energy projects…
    ***Coal and other fossil fuels provide 80 percent of the country’s energy and 70 percent of its electricity, according to Moniz…
    http://www.bloomberg.com/news/2013-07-29/moniz-touts-co2-capture-says-coal-in-our-energy-future-for-decades-.html


    Report this

    20

    • #
      Ace

      Wouldn’t it be funny if, after seventy years of “capturing” carbon, some terrorist blew a hole in the sin bin where it’s kept and the whole lot escaped!

      No, I realise that wouldn’t be an actual possibility. BUT, what’s more likely is that seventy years on, fifty years after all this money’s been paid out and forgotten about, it’s decided there’s something else they want to do, and they deliberately release it all.

      Much like the burning of those forests the other week.


      Report this

      20

      • #
        Allen Ford

        Carbon dioxide panic is old hat! The latest scare is methane, which is zillions of times more effective than CO2 as a greenhouse gas (though I wonder how, when a zillion times zero is still zero!).

        I recall seeing an article in recent days, though I can’t recall where, about a massive eruption of methane, with the predictable predictions of climate catastrophe. One problem: as the eruption appeared to be natural, what relevance does it have to human antics?

        These guys are incurable worry warts!


        Report this

        30

        • #
          Ian H

          The problem with constructing stories of methane catastrophe is that methane doesn’t last long in the atmosphere. To engineer a catastrophe using methane you must propose that a truly humungous amount of it could be burped into the atmosphere over a very short period of time. It is the speed of release that is the problem for those attempting to spin fantasies of armageddon. While there is plenty of methane stored away, it is hard to see how enough of it could be released quickly enough to cause a problem.

          Warming of the arctic tundra (which is the most often mentioned scenario) is likely to cause a slow and steady release, which just won’t do the trick. A direct hit with a large enough meteorite in exactly the right place might do the job. But that wouldn’t really be our fault, and those who are looking to propose catastrophe are desperate that it all be our fault. What is the good of having a catastrophe if you can’t blame it on our wicked ways?


          Report this

          10

  • #
    Ace

    Can someone rcommend a snsibly pricd keyboard that actually works. Im on the third on this year. This latest dsoesnt like the letter “e” or shift or “x” or “c”. and if I dont correct this commnt you’ll get an idea how it looks if I dont painstakingly go over everything inserting missing letters and caps.

    Also, dos anyon hav any ida why…with any keyboard, my PV does not ever register a GBP “pound” sign?


    Report this

    10

  • #
    pat

    29 July: Washington Post: Laura Vozzella: Ad targets Cuccinelli fight with climate scientist
    Titled “Witch Hunt,” the commercial recalls a two-year effort by Cuccinelli, the GOP nominee for governor and a climate change skeptic, to obtain records from Michael Mann, then a U-Va. researcher…
    “It’s been called ‘Cuccinelli’s witch hunt.’ ‘Designed to intimidate and suppress,’ ” a narrator for the 30-second spot says, quoting newspaper editorials from the time. “Ken Cuccinelli used taxpayer funds to investigate a U-Va. professor whose research on climate change Cuccinelli opposed. Cuccinelli, a climate change denier, forced the university to spend over half a million dollars defending itself against its own attorney general. Ken Cuccinelli — he’s focused on his own agenda, not us.”…
    http://www.washingtonpost.com/local/virginia-politics/ad-targets-cuccinelli-fight-with-climate-scientist/2013/07/29/5d2b0be0-f87c-11e2-8e84-c56731a202fb_story.html

    29 July: ClimateCrocks: Climate Denial Crock of the Week
    VIDEO: New Politcal Ad – Cuccinelli’s Witch Hunt
    Further underlining Climate Change’s maturation as a potent political issue – a new ad in the Virginia Governmental race features GOP candidate Ken Cuccinelli’s crazed fixation on persecution of climate scientist Mike Mann…
    The story became national news as part of a broad campaign by right wing science denial groups to intimidate, spy on, and persecute scientists whose research ran counter to fossil fuel and other corporate interests…
    http://climatecrocks.com/2013/07/29/new-politcal-ad-cuccinellis-witch-hunt/


    Report this

    10

  • #
    Brian G Valentine

    I have seen this climate modeling stuff go on for 30 years.

    I have seen nothing of the past, present, or future projected or retroactively interpreted with any reliability. It is total bunk because the foundations of it are.


    Report this

    30

  • #
    pat

    Enron all over again:

    29 July: Bloomberg: Brian Wingfield & Dawn Kopecki: JPMorgan Accused of Energy-Market Manipulation by U.S. Agency
    JPMorgan Chase & Co. (JPM) manipulated power markets in California and the Midwest from September 2010 to June 2011, obtaining tens of millions of dollars in overpayments from grid operators, the U.S. Federal Energy Regulatory Commission alleged today…
    The New York-based bank has agreed to sanctions including a fine of about $400 million in a settlement that may be announced as early as tomorrow, according to a person familiar with the case who asked not to be identified because the terms aren’t yet public. Other sanctions may include forfeiting profits, this person said…
    JPMorgan said July 26 it may sell or spin off its physical commodities business including energy trading, three days after a congressional hearing examined whether banks are using their ownership of raw materials to manipulate markets.
    Commodities chief Blythe Masters oversees the unit, J.P. Morgan Ventures Energy Corp. The wholly owned subsidiary trades and holds physical commodities, including agricultural products, metals and energy, as well as derivatives…
    http://www.bloomberg.com/news/2013-07-29/jpmorgan-accused-of-energy-market-manipulation-by-u-s-agency.html

    24 July: Bloomberg: Joe Richter & Lynn Doan: Warren Says Banks Using Enron Commodity Model Face Risk
    JPMorgan, Morgan Stanley (MS) and Goldman Sachs Group Inc. are among lenders whose commodity trading is in jeopardy as the Federal Reserve reconsiders letting banks ship crude oil and run warehouses for industrial metals. Warren, a Massachusetts Democrat, spoke today at a Senate subcommittee hearing on bank ownership of metal and energy assets. Enron, once the world’s biggest trader of power and natural gas, collapsed in 2001…
    http://www.bloomberg.com/news/2013-07-23/warren-says-banks-using-enron-commodity-model-face-risk.html


    Report this

    10

  • #
    Leonard Lane

    There is also the issue of numerical methods (i.e. finite differences, etc., used to solve differential equations), which are, in and of themselves, difficult: part mathematics and part art, or skill gained from experience. One of the earliest homework problems I was given in a numerical analysis class in the ’60s was to program the “unstable scheme”, a finite difference method known to produce larger and larger errors as the delta x was decreased. Most students came back with nothing to show but programs that “blew up” and caused an overflow. A couple of students came back with a solution. The prof took stern measures with those two for cheating, and lectured us on searching the literature to find proven stable schemes, and then thoroughly testing and validating them before proceeding.

    Even most stable schemes are stable only subject to certain criteria, such as the Courant condition. If the modelers know little of error propagation (as discussed in previous comments), then they might know little about stability and calculation errors in numerical methods.
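    Leonard’s “unstable scheme” is easy to reproduce. Here is a minimal sketch (the function name and grid sizes are my own, not from any coursework or climate code) of the explicit FTCS update for the heat equation, which is stable only while r = Δt/Δx² ≤ 0.5:

```python
def ftcs_heat(r, steps=50, n=21):
    """Advance a unit spike with the explicit FTCS update
    u[i] += r * (u[i+1] - 2*u[i] + u[i-1]) and return max |u|."""
    u = [0.0] * n
    u[n // 2] = 1.0                      # unit spike in the middle
    for _ in range(steps):
        new = u[:]
        for i in range(1, n - 1):        # boundaries held at zero
            new[i] = u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
        u = new
    return max(abs(x) for x in u)

print(ftcs_heat(0.4))   # r <= 0.5: the spike diffuses away, max stays below 1
print(ftcs_heat(0.6))   # r > 0.5: the highest grid mode grows without bound
```

    Same code, same equation; only the step-size ratio changes, and the answer goes from physics to garbage.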


    Report this

    10

  • #
    Kim

    I just wanted to make a few points on computational accuracy:

    1) Binary (non-floating-point, i.e. integer) operations only range as high as the number of bits available; 64 bits provides a bigger range than 32 bits. However, scientific calculations are often done using a floating point processor/unit (FPP/FPU). These use a separate binary mantissa and exponent, with associated logic.

    These have a theoretical minimum representable number; however, if you try to load that number into them, you may well find that the number actually produced is not the same. The designers have, for purposes of speed, produced a circuit that ‘cheats’: it is accurate for larger numbers but not for very small ones (denormals may, for example, be flushed to zero).

    2) (a * b) / c can produce a different result than (a / c) * b, depending on the sizes of a, b and c and the registers used for the calculation.
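    A quick sketch of point 2 (the values are my own, chosen to make the effect obvious): with IEEE-754 doubles, multiplying first can overflow where dividing first does not.

```python
a = b = c = 1e308          # near the double-precision maximum (~1.8e308)

first = (a * b) / c        # a * b overflows to infinity before the division
second = (a / c) * b       # a / c == 1.0, so the result stays finite

print(first)               # inf
print(second)              # 1e+308
```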

    3) In any integer calculation, fractional amounts are discarded: 5 / 2 → 2. Fractions can only be handled by an FPU, and then only within the limits of that FPU.

    Calculate, using only integer operations (no FPU), ((((((23 / 2) / 2) / 2) * 2) * 2) * 2). If you thought the result would be 23, think again: it’s 16. My, my, that’s a 30% error!
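    In Python, that worked example looks like this, using integer (truncating) division throughout:

```python
x = 23
for _ in range(3):
    x //= 2            # 23 -> 11 -> 5 -> 2: each remainder is discarded
for _ in range(3):
    x *= 2             # 2 -> 4 -> 8 -> 16: the lost fractions never come back
print(x)               # 16, not 23
```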

    Now suppose you have a small number (well within the maximum and minimum range) and you are using the FPU to perform a massive number of computations on it. What will happen is that the fractional parts of the mantissa will repeatedly fall off, and this error will accumulate. If multiplication is being performed, the error will be multiplied as well.

    If you are calculating, for example, a sin (Taylor series) or a power/root, you will find that you need a minimum number of terms before you can produce an accurate forward and inverse operation, e.g. sin⁻¹(sin(a)). The more resolution (number of digits) you have, the more terms you require. If you are doing this using an FPU you can easily find an inaccuracy. This is why many hand-held calculators use the long method: they don’t need the speed, but they do need the accuracy.
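    The sine example can be sketched directly (the term counts here are my own illustration, not from any calculator design): a truncated Taylor series needs enough terms before it matches the library function.

```python
import math

def taylor_sin(x, terms):
    """Sum the first `terms` terms of the Taylor series for sin(x)."""
    total, term = 0.0, x
    for n in range(terms):
        total += term
        term *= -x * x / ((2 * n + 2) * (2 * n + 3))   # next odd-power term
    return total

print(abs(taylor_sin(2.0, 3) - math.sin(2.0)))    # noticeable error with 3 terms
print(abs(taylor_sin(2.0, 12) - math.sin(2.0)))   # near machine precision with 12
```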

    4) A compiler is just a program that converts the textual version of the program into the machine code used by the computer. There are many different compilers available for many different languages, including multiple compilers for the same language.

    The language being used should be chosen very specifically for the task. For high-speed computation, C should be used; C++ can be used under specific conditions. Script-based languages (Python, Perl, etc.) are far too slow and should not be used.

    5) RTLs (run-time libraries) may well be used for mathematical calculations. Their accuracy comes into question when they are used to calculate series (trig functions, for example).

    6) As with anything, a computer program must be tested extensively, and pass those tests, before it can be accepted as proven.


    Report this

    20

  • #

    [...] the results without knowing exactly what computing environment the model was originally run in. Keep reading » [...]


    Report this

    10

  • #

    [...] Funny. No comments. Are we living on planet IBM or planet Intel? That is the question Joanne Nova asks in her dazzlingly written blog about the new paper by Hong, S., Koo, M., Jang, J., Kim, J.E., Park, H., Joh, M., Kang, J., and [...]


    Report this

    10

  • #
    Snyder Lacher

    The first rule of models: All models are wrong. Some are useful.


    Report this

    10

  • #

    [...] WARNING: Using a different computer could change the climate catastrophe [...]


    Report this

    10

  • #
    Pooh, Dixie

    Many years ago, I stood inside the computer (yes, inside) at the BRL. It was intended to compute ballistic tables. http://en.wikipedia.org/wiki/Ballistic_Research_Laboratory

    It was not working properly; it dropped data.

    As it turned out, the problem was that a signal from one unit could not reach the destination unit in time. (Something about the speed of light.) At least that was the explanation I was told.


    Report this

    10

  • #
    Pooh, Dixie

    Human nature does not change much. I suggest these still apply:

    Parkinson, C. Northcote. Parkinson’s Law, and Other Studies in Administration. Cutchogue, N.Y.: Buccaneer Books, 1957

    Peter, Laurence J., and Raymond Hull. “Peter Principle.” Wikipedia, the Free Encyclopedia, July 3, 2013. http://en.wikipedia.org/w/index.php?title=Peter_Principle&oldid=560815577


    Report this

    10

  • #
    Mike

    I’m surprised that this kind of thing hasn’t been accounted for by the researchers involved. Then again, it’s a tad embarrassing to mention to the lab director that you forgot to use double-precision FP and now the climate model shows a completely different result.

    Years ago, I wrote code in FORTRAN (C was not an available option) on a machine that had no built-in FPU; your math routine choices were fixed-point (fast), software floating point (slow), and BCD (glacial). For critical stuff, BCD was used because one could set the precision to whatever was necessary and rounding behavior was predictable, but of course there was an enormous performance hit. However, when you’re doing calculations on stuff that can get lost in space if there’s a rounding error in the wrong spot, you tend to be a bit more careful. The difficulty now is that few programmers understand the mechanics of math in FPU hardware and the errors that can result. Many programmers now automatically assume that double or quad FP precision is good enough, and that the library routines don’t have bugs, or that rounding behavior is consistent across libraries.

    Inconsistencies between FPUs on different chips are also readily apparent in single precision when you are performing a large number of successive computations against a stored value (long-period double exponential moving averages in financial modeling systems for example); this sort of effect is experienced more often than many suspect. There’s a reason large financial institutions use hardware-supported BCD.
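    Mike’s accumulation effect can be sketched without special hardware by rounding every partial sum to single precision (the `struct` round-trip below is just a software stand-in for a 32-bit FPU; the figures are illustrative, not from any trading system):

```python
import struct

def to_f32(x):
    """Round a Python float (a double) to the nearest single-precision value."""
    return struct.unpack('f', struct.pack('f', x))[0]

n = 1_000_000
tenth32 = to_f32(0.1)              # 0.1 is not exactly representable in binary
s32 = s64 = 0.0
for _ in range(n):
    s32 = to_f32(s32 + tenth32)    # every partial sum rounded to 32 bits
    s64 += 0.1                     # ordinary double-precision accumulation

print(abs(s64 - 100_000))          # double: error far below one
print(abs(s32 - 100_000))          # single: drift of whole units or more
```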

    Graphics accelerators used as FP units aren’t always consistent about how they handle rounding, and there’s a world of difference in cost and speed between single and double-precision units. Budget-constrained researchers might not pay attention to the accuracy differences so long as the results seem “good enough”.

    I find it rather horrifying that politicians are using results from models that may be grossly inaccurate because no one bothered to check the internal consistency of the math functions. That smacks of gross negligence, especially considering the enormous effects of political decisions on people who have no choice in the matter. What do you say to someone who has lost their business because of a policy change based on what turned out to be an error in a climate model?


    Report this

    10

  • #

    [...] When the same model code with the same data is run in a different computing environment (hardware, operating system, compiler, libraries, optimizer), the results can differ significantly. So even if reviewers or critics obtained a climate model, they could not replicate the results without knowing exactly what computing environment the model was originally run in. Keep reading » [...]


    Report this

    10

  • #
    Bob Cormack

    The problem of rounding error taking over the answer has been known for far longer than (automatic) computers have existed. 150 years ago, “computers” was the term for the (mostly) women who did the number crunching at astronomical observatories. The problem of roundoff-error creep drove people to generate log and trig tables with ever greater numbers of significant digits. (In fact, this was one of the first uses to which ‘automatic computers’, i.e. machines, were put.)

    It can be mind-numbing to calculate when this problem will become critical. There is, however, a dead-simple way to deal with it: Use a language that allows a variable word length, and keep increasing the number of significant digits carried until the answer stops changing (at least to the accuracy desired). Of course, as you make the problem more complex, the number of digits needed increases.
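    Python’s `decimal` module makes a handy stand-in for that variable word length approach (the harmonic sum and the precision ladder here are my own toy example): recompute at increasing precision until two successive answers agree.

```python
from decimal import Decimal, getcontext

def sum_until_stable(terms, tol=Decimal('1e-10')):
    """Recompute sum(1/k) at increasing decimal precision until two
    successive precisions agree to within tol; return (value, digits)."""
    prev = None
    for digits in (8, 16, 24, 32, 40):
        getcontext().prec = digits
        total = sum(Decimal(1) / k for k in range(1, terms + 1))
        if prev is not None and abs(total - prev) < tol:
            return total, digits
        prev = total
    return prev, digits            # tol never met within the ladder

value, digits = sum_until_stable(10_000)
print(digits)                      # precision at which the answer stopped changing
```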

    This means that the climate community has given itself an impossible task: as they go to ever bigger and faster computers so they can run their simulations at ever finer scales, the number of digits needed to avoid ‘anything in, garbage out’ grows too, thus slowing down the computers (assuming they increase the word length to avoid junk output). It’s possible that they long ago reached the practical limit of the models (I would guess about 5 years’ forecasting range) no matter what computers they run them on.

    Since the whole ‘science’ of climate depends on a particular result (else funding would be cut), it would make more sense for ‘modelers’ to just program in the desired political result and quit pretending to deal with ‘data’ at all. Michael Mann seems to be ahead of the curve on this.


    Report this

    20

  • #
    Adir

    And the article and the discussion here do not even touch the huge errors due to discretization.
    Who on earth thinks that a derivative step or an integration step of 30 km (or whatever size the grid has; it is of that order) will not introduce a huge error?

    Numerical algorithms typically have an error that scales as some power of the grid spacing and the time step.
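    That scaling is easy to demonstrate in one dimension (the function and step sizes are my own illustration): a forward-difference derivative carries an error proportional to the step size h.

```python
import math

def fd_error(h):
    """Error of the forward difference (sin(1+h) - sin(1)) / h against
    the exact derivative cos(1); the leading term is h * sin(1) / 2."""
    approx = (math.sin(1 + h) - math.sin(1)) / h
    return abs(approx - math.cos(1))

for h in (0.1, 0.01, 0.001):
    print(h, fd_error(h))    # the error shrinks ~10x for each 10x smaller h
```

    Shrink a 30 km grid step to 3 km and a first-order error like this shrinks tenfold; keep it coarse and no amount of computing power removes it.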

    There is no royal road around these numerical-algorithm issues.


    Report this

    10

Leave a Reply

  

  

  

You can use these HTML tags

<a href="" title=""> <abbr title=""> <acronym title=""> <b> <blockquote cite=""> <cite> <code> <del datetime=""> <em> <i> <q cite=""> <strike> <strong>