Two earnest young chaps drive out to meet the slimy-ogre-from-the-hobbit who posts skeptical comments all over the web, and discover, shock, Hoyt is actually a nice guy. He’s not only normal looking, he’s well spoken, and he has good manners. He’s likeable.
The young fans of the climate scare get excited when Hoyt admits he doesn’t believe a Democrat. (Hmm, I suspect they are thinking “ideologue”.)
They also got excited when he said he was a prostate cancer survivor, not because he survived (though I say, Thank Goodness) but because it gave them an excuse to talk about “doctors”. There’s an attempt at a James Delingpole-BBC moment coming where they play up a logical fallacy as if it tells us something about the climate. You trust medical science, but not climate science? I do hope Hoyt responded that if his medical doctors had been hiding data and losing records he would have been skeptical of them too.
Naturally, there’s no real science discussed. That apparently comes in “Part III”. Though…wait and see. They’re pairing him up against Physics Girl (who we meet in Part II) who has a PhD, works with the Hadron Collider and has a black belt in Judo.
We all know how selective editing can bias the result, but so far at least it’s been a fun thing to watch. Let’s hope they keep the good natured theme, and let both sides speak. If they do, that would make it a first.
The odds are not good (to put it mildly). This comes from Climate Desk, where they enjoy their troll battles so much they don’t allow any comments at all. (Just tweets).
But lucky for them, they have a whole planet to find a bad-weather-day somewhere for a photo-op.
Hoyt, feel free to tell us your side of the story over here.
UPDATE: Lubos points out that Part III is up in another spot. And I can say Congrats to the guys at Climate Desk, against my expectations, they did keep it good natured. They didn’t actually discuss any science at all. But they didn’t do a character assassination.
PhysicsGirl is assuming that all scientists are the same, that confirmation bias doesn’t exist, and that when someone questions one climate model, an answer about an entirely different model has some meaning. She talks in such general abstract terms that I wonder if she’s been reading much on the climate debate. James doesn’t answer the most pointed question that survived the editing (would he keep his job?). And I’d like to see what Hoyt said that was edited out, but he comes across well in a video that goes nowhere. But if they’d caught him getting angry or making a mistake, it would have been a headline.
The mouth of the Amazon is the worst source of “pollution”.
Bad news for fans of The Amazon River. A new study shows that while the Amazon rain forest is the Lungs of The Planet, pulling down gigatonnes of CO2, the river undoes all the good the trees do, and pours all the CO2 back into the sky. Damn that river eh? Lucky it only discharges one fifth of the world’s freshwater.
Apparently most researchers thought bacteria couldn’t digest the tough woody lignin of tree debris fast enough to prevent it getting to the ocean*. Underestimating microbial life seems a common affliction, and we hear it was a big surprise that only 5% of the lignin actually ends up reaching the ocean where it might sink to the floor and be sequestered. The rest is broken down by bacteria and released into the air. The clues were there for years that the Amazon was giving off lots more CO2 than people expected, but the consensus was that it “didn’t add up”. So much for that consensus.
Yet another victory for observations over opinions.
Until recently, people believed much of the rain forest’s carbon floated down the Amazon River and ended up deep in the ocean. University of Washington research showed a decade ago that rivers exhale huge amounts of carbon dioxide — though left open the question of how that was possible, since bark and stems were thought to be too tough for river bacteria to digest.
A study published this week in Nature Geoscience resolves the conundrum, proving that woody plant matter is almost completely digested by bacteria living in the Amazon River, and that this tough stuff plays a major part in fueling the river’s breath.
The finding has implications for global carbon models, and for the ecology of the Amazon and the world’s other rivers.
“People thought this was one of the components that just got dumped into the ocean,” said first author Nick Ward, a UW doctoral student in oceanography. “We’ve found that terrestrial carbon is respired and basically turned into carbon dioxide as it travels down the river.”
Tough lignin, which helps form the main part of woody tissue, is the second most common component of terrestrial plants. Scientists believed that much of it got buried on the seafloor to stay there for centuries or millennia. The new paper shows river bacteria break it down within two weeks, and that just 5 percent of the Amazon rainforest’s carbon ever reaches the ocean.
“Rivers were once thought of as passive pipes,” said co-author Jeffrey Richey, a UW professor of oceanography. “This shows they’re more like metabolic hotspots.”
How do we stop that river, can we divert it underground, or cover it in white reflective plastic — do you think? Would it be better to dose it with megatonnes of antibiotics, and can we have a grant to study that? Does this mean flushing antibiotics down the toilet helps change the weather? Don’t send those old pills to landfill…
Remember there are global carbon markets running (limping) churning over $170 odd billion a year pretending that they can account for carbon flows in a meaningful way. Shame there are just a few odd gigatonnes still unaccounted for. Does this mean the parts of the Amazon that WWF “own” which were projected to net them as much as $60 billion may actually have a net worth of zero dollars when the “downstream” effect of their pollution is taken into account? We wouldn’t let the big mining corporations ignore their downstream pollution, so why let that naughty river get away with it?
Can someone calculate how many coal fired power stations this river is equivalent to?
Finally the modelers are catching up with the slow-down or pause in global warming over the last decade. Where the best estimate for the IPCC was 3.3 degrees C with a range of 2.0 to 4.5C, Otto et al have revised the best estimate to 2C, with a range of 1.2C – 3.9C.
Effectively the power of CO2 to warm just got 30% less, according to a team of researchers, many of whom have in the past published more alarming papers. Remember “there is no debate” and “global warming is a fact”, a lot “like gravity”. (And that gravitational constant g could be revised by a third soon, right?)
Otto et al looked at the global energy budget, which means adding up atmospheric, ice, continental, and ocean energy. They compare it decade by decade since 1970, and argue that the long term equilibrium temperature response needs to be revised down. The oceans are taking up more heat than anyone thought (or, says Jo, given how little we know about the oceans, could it be that sneaky energy is tripping off to outer space?).
Figure caption (Otto et al): Observations of the global mean temperature change plotted against change in forcing minus heat uptake (ΔF–ΔQ) for the equilibrium climate sensitivity (ECS) (a) and against ΔF for the transient climate response (TCR) (b), for each of the four decades 1970s, 1980s, 1990s and 2000s and for the 40-year period 1970–2009. Ellipses represent likelihood contours enclosing 66% two-dimensional confidence regions; best-fit points of maximum likelihood are indicated by the circles; and the curved thick and thin lines represent the 17–83% and 5–95% confidence intervals of the resulting one-dimensional likelihood profile in ECS (or TCR), respectively. All time periods are referenced to 1860–1879, including a small correction in ΔQ to account for disequilibrium in this reference period14. Straight contours show iso-lines of ECS (a) and TCR (b), calculated using a best-fit value of F2x of 3.44 W m−2 (also adjusted for fast feedbacks)10. Uncertainty in F2x is assumed to be correlated with forcing uncertainty in long-lived greenhouse gases10. To avoid dependence on previous assumptions16, we report results as likelihood-based confidence intervals.
For the number-junkies, one number that has been used ad lib as gospel for years appears to have changed. The hallowed forcing due to a doubling of CO2, long quoted as 3.7 W m−2, is being lowered to 3.44 W m−2.
F2x is the forcing due to doubling atmospheric CO2 concentrations.
We use a value of F2x of 3.44 W m–2 (with a 5–95% confidence interval of ±10%) from ref. 10. Using a higher estimate of 3.7 W m–2 would shift up our estimated ranges for ECS and TCR, but only by about 0.1 K
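As a back-of-envelope illustration of why the choice of F2x shifts the estimates only modestly: in the energy-budget approach, ECS is proportional to F2x, via ECS = F2x × ΔT / (ΔF − ΔQ). A minimal Python sketch, where the ΔT, ΔF and ΔQ values are illustrative round numbers and not the paper’s data:

```python
# Energy-budget estimate of equilibrium climate sensitivity (ECS),
# of the form used by Otto et al: ECS = F2x * dT / (dF - dQ).
# dT, dF, dQ below are illustrative round numbers, NOT the paper's data.

def ecs(f2x, dT, dF, dQ):
    """ECS in kelvin for a given CO2-doubling forcing f2x (W/m^2)."""
    return f2x * dT / (dF - dQ)

dT = 0.75   # observed warming vs the 1860-1879 reference (K), assumed
dF = 1.95   # change in total radiative forcing (W/m^2), assumed
dQ = 0.65   # system heat uptake, mostly oceans (W/m^2), assumed

print(round(ecs(3.44, dT, dF, dQ), 2))  # revised F2x, about 1.98 K
print(round(ecs(3.70, dT, dF, dQ), 2))  # traditional F2x, about 2.13 K
```

With these toy inputs, moving F2x from 3.44 to 3.7 W m−2 shifts ECS by roughly 0.15 K; the paper’s quoted shift of about 0.1 K is smaller, presumably because ΔF itself partly scales with F2x.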
The wide error bars on climate science knowledge (when it suits) mean that this 30% shift can be described as “in agreement”:
The oceans as measured by ARGO are warming, but that warming is not only far less than the models predicted, it is far less even than the instrument error.
The background of a crucial point
Everyone agrees: 90% of the energy in the Earth’s climate system is stored in the oceans. Rocks and sand don’t transmit the heat down, except at incredibly slow rates. The will-o’-the-wisp atmosphere hardly holds any energy. But water covers 70% of the surface, to an average depth of 3,700m, and it can store septillions of joules.
Climate models say the Earth’s energy balance is out of whack, and therefore 90% of the extra energy trapped by increasing greenhouse gases is stored in the ocean. The oceans are warming (probably), but the extra energy found in the top 700m of the world’s oceans is not enough. The modelers argued the heat was hiding below, in the 700m–2,000m layer. Skeptics argue the missing energy was flung out to space. This is the big enchilada, and as far as measuring oceans goes, everything changed in 2003 when we finally got the ARGO system, and that’s why it’s worth a closer look now.
David points out that the errors might be seriously miscalculated. A single ARGO buoy (which measures ocean temperatures down to 2000m) has an uncertainty of about 0.1C. But using 3,000 buoys doesn’t make that uncertainty dramatically smaller when all the data is combined. It would, if the 3,000 buoys were all measuring the same swimming pool. But each buoy measures a different piece of ocean, and the ocean does not have one global temperature. Averaging would only beat down the error if every ocean locality warmed by the same increment in each time period, and that would be a very brave assumption, because different parts of the world’s oceans probably warm at different rates due to global warming. So the measurement uncertainty is closer to the instrument error of 0.1C than to the 0.004C claimed by fans of a man-made global crisis, and since the oceans have only warmed by about 0.02C (if that) since we’ve been measuring them with ARGO, that tiny amount of warming might just be noise. Going back further, the pre-ARGO data is so bad that longer datasets have much larger uncertainties.
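The statistical point can be made concrete. Dividing the instrument error by √N is only valid if all N instruments measure the same quantity. A toy calculation, using the buoy count and instrument error from the paragraph above:

```python
import math

instrument_error = 0.1   # per-buoy uncertainty in deg C (from the text)
n_buoys = 3000

# If all 3,000 buoys measured the SAME temperature (one swimming pool),
# averaging would shrink the random error by a factor of sqrt(N):
same_pool_error = instrument_error / math.sqrt(n_buoys)
print(round(same_pool_error, 4))  # about 0.0018 deg C

# But each buoy samples a different patch of a heterogeneous ocean, so the
# sqrt(N) reduction applies only under the strong assumption that every
# locality warms by the same increment. Without it, the claimed warming
# sits inside a single instrument's error band:
claimed_warming = 0.02   # deg C over the ARGO era (from the text)
print(claimed_warming < instrument_error)  # True
```

Note the pure √N reduction gives roughly 0.002 °C, the same order as the 0.004 °C figure quoted in the paragraph; the exact value depends on how the averaging is partitioned.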
Overview of Measuring Ocean Temperatures
According to many neat diagonal graphs, the oceans are warming, it’s alarming, and we have to spend billions to stop it. But is that warming enough, and how accurately can we measure it anyway?
For some inexplicable reason NOAA publish graphs of ocean heat content (OHC) but not ocean temperatures — the latter are what the equipment measures, and what we relate to. So we need to convert OHC back to degrees C to find out the change in temperature.
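The conversion is straightforward: an average temperature change is the heat content change divided by the layer’s mass times the specific heat of seawater. A sketch with round-number ocean properties (the layer dimensions and the 10×10²² J heat-content rise below are illustrative assumptions, not NOAA’s published values):

```python
# Convert a change in ocean heat content (joules) into an average
# temperature change (deg C) for the 0-700 m layer.
# All inputs are round-number approximations, labeled as such.

ocean_area = 3.6e14    # m^2, roughly 70% of Earth's surface (assumed)
layer_depth = 700      # m, the commonly graphed upper layer
density = 1030         # kg/m^3, typical seawater (assumed)
specific_heat = 4000   # J/(kg K), approximate for seawater (assumed)

mass = ocean_area * layer_depth * density    # kg of water in the layer
delta_ohc = 10e22                            # J, illustrative OHC rise

delta_T = delta_ohc / (mass * specific_heat)
print(round(delta_T, 3))  # about 0.096 deg C
```

So a headline-grabbing rise of 10×10²² J in the top 700m corresponds to an average warming of only about a tenth of a degree, which is why OHC graphs look more dramatic than temperature graphs.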
Inspired by Willis Eschenbach, David Evans has done it with the ARGO data. (David doesn’t use non-ARGO data because he considers the XBT and buckets-off-boats stuff to be whimsical fantasy with error bars up where the unicorns fly, see below.) ARGO data only properly started in 2003, and any data at depth before then was sparse. Compare it to the detail in our atmospheric measurements. We’ve been releasing weather balloons twice a day from 800 sites around the Earth for five decades. To measure the global oceans we used 50 ships “of opportunity” (which happened to be in a shipping lane), and XBTs were fired down, not twice a day but once every few weeks. And vast tracts of the ocean hardly got measured, ever, especially the deepest parts. (XBTs only get down to about 800m.)
ARGO data is good, but short. It’s fair to ask whether such a short span is meaningful. Since thou shalt not create nor destroy energy, it is. The imbalance caused by extra CO2 has run day in and day out for nearly 3,000 days since ARGO began. That energy has to be somewhere.
Where are the error bars?
This is the standard diagonal scare-graph for ocean temperatures, but with added notes about which parts are reliable.
It looks like a convincingly big rise, but it’s not enough, and the error bars are huge. Even on the accurate ARGO buoys error margins are probably not a lot less than 0.1C.
David Evans separated the layers of ocean so we can compare the rate of warming in the deeper layer, 700m – 2,000m, with that from the top layer, 0 – 700m. (Bear in mind, the “top” 700 m is not exactly shallow — it is the hull-crush depth of a military submarine.) David found that the rate of warming in the deeper layer is the same as the rate in the top 700m.
Over this short period, there is no acceleration evident in either layer.
If the instrument error is 0.1C, the error bars would be off the scale. [Jo says, "look" the error bars are marked in white...]
You might almost think the slight appearance of warming matters, but let’s put it in perspective — it’s not remotely close to what the models predicted.
Though the funny thing about having wider error bars is that even though the results look so different from the models, the predictions could still fall within the error bars! Why haven’t the modelers announced it? [Prediction from Jo: Trenberth channels Santer 2008, "reconciles model and ARGO data by revisiting error estimations!" Coming soon, with 17 authors and a major press release... Researchers investigate errors and find the missing energy!].
Dr David Evans discusses the ocean data
These notes about the graphs above and the following explanation of how he got the graphs and why he chooses to use ARGO over the other data sets are thanks to David.
What does a study of 20 years of abstracts tell us about the global climate? Nothing. But it says quite a lot about the way government funding influences the scientific process.
John Cook, a blogger who runs the site with the ambush title “SkepticalScience” (which unskeptically defends the mainstream position), has tried to revive the put-down and smear strategy against the thousands of scientists who disagree. The new paper confounds climate research with financial forces, is based on the wrong assumptions, uses fallacious reasoning, wasn’t independent, and confuses a consensus of climate scientists for a scientific consensus, not that a consensus proves anything anyway, if it existed.
Given the monopolistic funding of climate science in the last 20 years, the results he finds are entirely predictable.
The twelve clues that good science journalists ought to notice:
1. Thousands of papers support man-made climate change, but not one found the evidence that matters
Cook may have found 3,896 papers endorsing the theory that man-made emissions control the climate, but he cannot name one paper with observations that shows that the assumptions of the IPCC climate models about water vapor and cloud feedbacks are correct. These assumptions produce half to two-thirds of the future projected warming in models. If the assumptions are wrong (and dozens of papers suggest they are) then the predicted warming is greatly exaggerated. Many of the papers in his list are from these flawed models.
In other words, he’s found 3,896 inconclusive, subsequently-overturned, or correct but irrelevant papers. What is most important about his study is that after thousands of scientists have pored over the best data they could find for twenty years, they still haven’t got any conclusive support.
2. Cook’s study shows 66% of papers didn’t endorse man-made global warming
Cook calls this “an overwhelming consensus”.
They examined “11 944 climate abstracts from 1991–2011 matching the topics ‘global climate change’ or ‘global warming’. We find that 66.4% of abstracts expressed no position on AGW, 32.6% endorsed AGW, 0.7% rejected AGW and 0.3% were uncertain about the cause of global warming.”
Perhaps the large number that are uncertain merely reflects the situation: climate science is complicated and most scientists are not sure what drives it. The relative lack of skeptical papers here is a function of points 4, 5, and 7 below. Though it’s irrelevant in any case. It only takes one paper to show a theory is wrong. Who’s counting?
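The two competing framings, “97% consensus” versus “66% expressed no position”, come from the same four percentages quoted above. The arithmetic:

```python
# The four categories from Cook's abstract, as percentages of all
# 11,944 abstracts examined.
pct = {"no_position": 66.4, "endorse": 32.6, "reject": 0.7, "uncertain": 0.3}

# Share of ALL abstracts endorsing AGW:
print(pct["endorse"])  # 32.6

# Cook's headline figure instead counts only the abstracts that
# expressed a position, dropping the 66.4% that took none:
position_total = pct["endorse"] + pct["reject"] + pct["uncertain"]
consensus = 100 * pct["endorse"] / position_total
print(round(consensus, 1))  # 97.0
```

Same data, two very different headlines, depending entirely on whether the no-position majority is counted in the denominator.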
3. Cook’s method is a logical fallacy: Argument from Authority. This is not science, it’s PR.
The thing that makes science different to religion is that only empirical evidence matters, not opinions. There are no Gods of Science. Data, not men, is the authority that gets the last say (there is no Pope-of-The-Papers). Cook turns that on its head. It’s anti-science. When scientists explain why they’re sure gravity keeps the Earth in its orbit, they don’t argue that “97% of geophysicists voted for it”.
Cook knows this (I do keep reminding him), but he pretends to get around it. Spot the delusion: “Scientists must back up their opinions with evidence-based analysis that survives the scrutiny of experts in the field. This means the peer-reviewed literature is a robust indicator of the state of the scientific consensus.” Cook assumes that scientists’ opinions are formed instantly and accurately, and only on the evidence, as if humans were Intel chips. He assumes that “peer review” is incorruptible (unlike every other human institution), that two unpaid anonymous reviewers constitute “scrutiny”, that climate-activist-scientists don’t work to keep skeptics out of the peer reviewed literature, and that ClimateGate never spilled out what really happened in climate science.
Don’t people who do psychological research need to understand the basics of human nature? Scientists can cling to the wrong notion for years — just look at those who thought humans would never fly (even two years after the Wright brothers’ first flight) or that x-rays in shoe stores were safe, or that ulcers weren’t infectious, or that proteins could not be contagious (then came BSE).
4. The number of papers is a proxy for funding
As government funding grew, scientists redirected their work to study areas that attracted grants. It’s no conspiracy, just Adam Smith at work. There was no funding for skeptical scientists to question the IPCC, or to test whether man-made climate science exaggerates the warming. More than $79 billion was poured into climate science research and technology from 1989 to 2009. No wonder scientists issued repetitive, irrelevant, and weak results. How hard could it be? Taxpayers even paid for research on climate resistant oysters. Let no barnacle be unturned.
Sheer quantity of abstracts endorsing man-made climate change has increased, but so has the funding.
Over the same era, $79 billion was poured into climate science and climate technology related research.
The problem with monopsonistic funding models is that there is little competition. Few researchers are paid to research angles that are likely to disagree with the theory. Volunteers who want to do their own research don’t have free access to journals, may have trouble getting the data (sometimes it takes years or FOIs to get it, and sometimes it never comes). Volunteers don’t necessarily have the equipment to do the analysis, and don’t have PhD or Honours students to help. They also don’t get paid trips to conferences and suffer the impediment of having to devote time to earn an income outside of their research. When they do find something there are no PR teams to promote their papers or send out the press releases.
In the financial world we have audits, in courts we have a defense, in Parliament we have an opposition, but in science we have… whatever the government feels like funding.
In the end, there is no government funding, whether through a grant or an institute, that actively encourages people to search for reasons the IPCC’s favoured theory might be wrong.
5. Most of these consensus papers assume the theory is correct but never checked. They are irrelevant.
The papers listed as endorsing man-made global warming include “implicit endorsement”, which makes this study more an analysis of funding than of evidence. Cook gives the following as an example of implicit endorsement: “. . . carbon sequestration in soil is important for mitigating global climate change”. Any researcher studying carbon sequestration has almost certainly not analyzed outgoing radiation from the upper troposphere or considered the assumptions about relative humidity in climate simulations. Similarly, researchers looking at the effects of climate change on lemurs, butterflies, or polar bears probably know little about ocean heat content calculations. These are “me too” researchers.
If a conservative government had spent billions analyzing the costs of the failed climate models and the impact of disastrous green schemes, skeptics would be able to quote just as many me-too papers as Cook quotes here. (But we wouldn’t, because analyzing the climate by doing keyword studies — it ain’t science).
6. Money paid to believers is 3500 times larger than that paid to skeptics (from all sources).
Cook seems to believe there are organized efforts running to confuse the public. Is that a projection of Nefarious Intent (NI) coupled with conspiratorial suggestions of mysterious campaigns?
Contributing to this ‘consensus gap’ are campaigns designed to confuse the public about the level of agreement among climate scientists.
Given that he is confused about what science is, he probably would think people are trying to confuse him when they give it to him straight.
His own personal bias means he is the wrong person to do this study (if it were worth doing in the first place, which it isn’t).
It has all the hallmarks of activist propaganda, not research. Cook tries to paint skeptics as doing it for the money, but blindly ignores the real money on the table. Governments have not only paid more than $79 billion in research, they also spend $70 billion every year subsidizing renewables (an industry which depends on researchers finding a link between carbon dioxide and catastrophic climate change). Carbon markets turn over something in the order of $170bn a year, and renewables investment amounts to a quarter of a trillion dollars. These vested interests depend entirely on a catastrophic connection — what’s the point of cutting “carbon” if carbon doesn’t cause a crisis? Against these billions, Cook thinks it’s worth mentioning a 20 year old payment of $510,000 from Western Fuels? And exactly what was Western Fuels’ big crime? Their primary goal was allegedly the sin of trying to ‘reposition global warming as theory (not fact)’ which as it happens, is quite true, except that technically, “global warming” is not even a theory, it’s a hypothesis, something with much less scientific weight.
Do you think if you had $79 billion you could get 3,896 papers published?
7. Keywords searches may miss the most important skeptical papers.
Keyword searches are more likely to turn up “consensus” papers. Many skeptical papers don’t use the terms “global warming” or “global climate change”: eg Svensmark (1998), Douglass (2007), Christy (2010), Loehle (2009), and Spencer (2011). Were they included? Perhaps they were, but they don’t appear to match the search terms in the methods. These are just a few seminal skeptical papers that might have been missed.
UPDATE: Lucia and JunkPschology in comments confirm that the papers listed above would not have made Cook’s list. So in only half an hour of random analysis I can easily turn up major papers by skeptics that fall outside Cook’s primitive keyword hunt. How many others did it miss?
8. Some of these abstracts are 20 years old — does two decades of new evidence change anything?
Twenty years ago the IPCC was predicting we’d get warming of 0.3 degrees C per decade. The warming trend came in significantly below their lowest possible estimate, no matter which major dataset you consult. Back then scientists didn’t know there was an 800 year lag in the ice cores (where temperatures rise centuries before carbon dioxide does). In 1992 scientists didn’t realize that warming would soon flatten out for 15 years. They didn’t know that 28 million radiosondes would show their models were based on flawed assumptions about water vapor. They didn’t know that 3000 ARGO buoys would finally measure the oceans adequately for the first time (starting in 2003) — and find the oceans were not storing the missing energy their models predicted they would be, or heating nearly as quickly as the models predicted. In other words, even if there was a consensus in 1992, it’s irrelevant.
Skeptics don’t issue press releases decreeing that this means anything scientific. It does mean that the media and IPCC are blindly ignoring masses of evidence, and that the term “denier” is well… marketing, not science. The people who deny these 1,100 papers exist are the ones calling other scientists names. When will journalists notice?
Given how much money has been paid to find evidence, the question real investigators ought to ask is “Is that all they found?”
Skeptics don’t issue press releases saying we outnumber and outrank the believers. Perhaps we should, but skeptics prefer to argue the evidence. Cook ignores the authorities that don’t suit him. Skeptics get Nobel Physics prizes, but believers only seem to get prizes for Peace. Phil Jones is one of the most expert of expert climate scientists, but he couldn’t create a linear trendline in Excel. Some skeptics, on the other hand, got man to the moon.
No wonder the public don’t think there is a consensus. There is no consensus among scientists.
Cook makes out that the public have been fooled by a deliberate campaign, but unless devious skeptics can cover continents in floods and snow, it could be that the public can see the failure of the models with their own eyes.
Is there a better argument to show why big government is a big-fail than just saying, E.U.? It takes real skill to start with 20 successful economies and combine them into one large bankrupt entity — and all this in only a decade-and-a-half.
For Australians, seven billion dollars is $320 per man, woman and child. That’s nearly $1,300 per household of four.
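The per-head figures are simple division. A quick check, where the population figure is an assumption of roughly 21.9 million (the value consistent with the $320 quoted, close to Australia’s population at the time):

```python
contribution = 7e9     # AUD, the $7 billion commitment (from the text)
population = 21.9e6    # assumed Australian population, circa 2012

per_person = contribution / population
print(round(per_person))       # about 320 per man, woman and child
print(round(per_person * 4))   # about 1,280 per household of four
```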
If someone had knocked on your door with a registered E.U. tin, would you have felt compelled to give $1,300 (of non-tax deductible income) to rescue the poor bankers and burdensome bureaucrats of Europe? Perhaps you might have chosen instead to get a new fridge, take the family on a holiday, or pay for private tutoring for your teenager. But thanks to Swan and the force of big government, you didn’t get that chance. As it happens, Australia doesn’t have the cash to pay it anyway, so the government borrowed $7 billion and we’ll pay the interest, as well as the debt. Good-o.
It’s OK though, Swan tells us “but it won’t involve donating cash into a relief fund.” I’m so relieved.
Instead, we are borrowing money we don’t have to rescue the regulators, bail out the bureaucrats, and keep the EU politicians, who made mistaken decisions we had no role in, in their cushy, overpaid, net-drain employment.
Meanwhile people in Papua New Guinea, Indonesia, Afghanistan, and Bangladesh will just have to make do with a bit less (and, if the Eurocrats get their way, no prospect of life-changing cheap power anytime soon). I’m sure they feel a warm glow knowing their new water-wells, schools, or vaccinations are delayed for a “good cause”.
The opposition estimates Gillard will leave us with a gross debt of more than $270 billion. The cost of five years of mismanaged big government is that, on top of the thousands paid in five years of tax, the average person owes another $12,000 — or nearly $50k for a family of four.
What would your household have done with an extra $50k?
When will voters realize the true cost of the parasitic-political class, which voted to rescue fellow parasites rather than the poorest of the poor? What do we call this group who use the goodwill and productivity of others to fund their own mistakes through regulations and coercion: polarasites, politisites?
Budget night in Australia tonight. More fun.
*Minor edits to make it clear the $7b decision was made in 2012.
Who knew? Sometimes you should feed goats to prevent floods and droughts. Other times you should shoot them to get the same outcome. Confused? In times gone by you would need to ask the tribal witchdoctor. Now, in the post-modern period, talk to a climate scientist.
Steve Goreham highlights some Global Stupidity in the Washington Times. In case you didn’t know, for writing things like this his books get burned. It could only happen in a centre for higher education.
Shepherd or Shoot Goats in the Name of Climate Change
O’Hare airport will finally get its goats. The Department of Aviation of the City of Chicago has awarded a contract to a private firm to provide 25 goats to munch vegetation at the city’s airport. These “green lawn mowers” will help reduce carbon dioxide emissions to sustain the planet.
Last fall, when the project was bid, Amy Malick, head of sustainability at the Department of Aviation, commented on the planned use of goats in hard-to-mow areas, “They may have steep slopes, very hard to get to with heavy machinery, and those machines also emit pollution. They’re burning fossil fuel. So as a sustainability initiative we’re looking to bring in animals that do not have emissions associated with them, at least to the same extent that heavy machinery would.”
A shepherd will herd the goats across 120 acres at four different sites on airport property. The 25 fuzzy critters are expected to clear vegetation each day from a square at least sixteen feet on a side.
Senator Ron Boswell (National Party) makes the case in The Australian today that Australia is losing manufacturing due to rising energy costs, while the US is gaining manufacturing back from China because it has a cheap energy advantage with shale gas.
Such is the “National Conversation” on these policies, it is impossible to say exactly how much The Carbon Farce has cost Australian business, due to the complexity of supply chain accounting and the rule that the government will put you in jail if you publish an estimate that is wrong. There are other factors at work, but there is no situation where higher energy costs make business easier, unless your business is one of the ones making expensive energy.
Energy taxes are the nail in the coffin of local manufacturing
Ron Boswell, The Australian, May 08, 2013

The collapse of the European carbon price has laid bare the absurdity of the Gillard government’s energy-based taxes on production.
The Australian carbon tax, at $23 a tonne and rising, and the almost-as-expensive renewable energy tax are killing the competitiveness of our businesses.
Boswell lists the businesses cutting production in Australia: Holden, Bluescope Steel, Boral, Penrice Soda, Pentair, Amcor, Goodman Fielder, Caltex…
So Holden will cut 500 jobs in South Australia and Victoria to reduce costs while Labor adds an estimated $550 to the manufacturing cost of every new Holden: $350 with a carbon tax and $200 with renewable energy targets. European car manufacturers gain an immediate advantage because of their far lower carbon price, not to mention the high Australian dollar. Why subsidise the Australian car industry on one hand and penalise it with energy taxes on the other?
Australia’s strong manufacturing sector was based on two great advantages: abundant food and mineral resources and cheap power from massive coal deposits.
By increasing the cost of power, Labor has thrown away one advantage and major Australian manufacturing operations are going out of business or overseas.
Businesses here are cutting jobs to save costs:
BlueScope Steel in Victoria, 170 jobs gone; Boral, 790 jobs gone; Penrice Soda in SA, 60 jobs gone; Pentair, a company that made steel pipes in western Sydney for 60 years, 160 jobs gone; and Amcor, 300 jobs gone. Goodman Fielder is shutting 15 factories, cutting 600 jobs, Caltex its Kurnell refinery, 330 jobs, and the Norsk Hydro aluminium smelter near Newcastle, 350 jobs.
Others are moving production overseas: Kerry Foods, 100 jobs gone; Kresta Blinds, 72 jobs; Cussons soaps, 75 jobs; Aerogard, 190 jobs; Harley-Davidson, 212 jobs; and Bosch, 380 jobs. Golden Circle has moved processing lines and jobs to New Zealand too.
In the US, manufacturing is growing:
General Electric is spending $US800 million ($784m) bringing back production of electrical appliances from China to Kentucky. Apple is spending $US100m on a US manufacturing line for Mac computers currently made in China. Other companies lured back to the US include Caterpillar, Whirlpool, Otis, Electrolux, Google, NCR and Zentech. Foreign companies plugging into the US’s cheap power include German chemicals giant BASF, which has already committed $US5.7 billion to North America since 2009.
US commentators are talking about an extra $US1.5 trillion increase in manufacturing production and 3.7 million manufacturing jobs by 2025. One of the main reasons for this resurgence is cheap electricity, generated by the shale gas now being tapped into by US power companies.
The carbon tax is placing an $8 billion a year burden on Australia…
One industrial user in Queensland told me it recently received a $244,000 electricity bill. Of that, $32,500 was attributed to the RET and $45,000 to the carbon price. This is an increase of almost 50 per cent on this user’s costs last year, due to the RET and the carbon price. A cost projection prepared for another Queensland industrial user shows that next year it will pay a $5.78 million electricity bill; $495,000 of that will be from the RET and about $1m from the carbon price.
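For readers who want to check the arithmetic, the shares implied by those quoted figures are easy to compute. A quick sketch; the only inputs are the numbers quoted above, nothing more:

```python
# Share of the first Queensland industrial user's $244,000 bill
# attributable to the RET and the carbon price.
bill = 244_000
ret = 32_500
carbon = 45_000

green_share = (ret + carbon) / bill
print(f"{green_share:.1%}")  # about 31.8% of this year's bill

# Projected bill for the second industrial user next year.
bill2 = 5_780_000
share2 = (495_000 + 1_000_000) / bill2
print(f"{share2:.1%}")  # about 25.9%
```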
Even companies that don’t use much electricity suffer from higher energy prices
All businesses in Australia hurt when Australian consumers have less to spend. The invisible opportunity cost robs them of customers:
Modelling done for NSW’s Independent Pricing and Regulatory Tribunal, the Climate Change Authority and other bodies all show that households would save about $270 a year with the RET and carbon tax gone.
That’s $270 from every household each year that could be used to create jobs in restaurants, holiday towns, local shops, gyms, sporting facilities… Instead of Australians enjoying memorable moments and getting fitter, it funds some bureaucrats, it provides jobs in China, Japan, Germany and other places where “renewable-energy” machinery is made, and it helps to reduce world temperatures by zero degrees.
But it’s true that part of that money props up a small local industry. It’s a group of people who install equipment they hope will change the weather.
We might as well be casting spells and stirring cauldrons.
Welcome to the land of witchcraft.
*Update: Added “Senator” and Nationals to the first line. h/t for proofreading to Themm.
How is the hallowed Green tech industry working out for China? Not so well.
Shi Zhengrong was called a “hero of the environment” by Time Magazine. He was a billionaire who ran the world’s largest seller of solar PV cells. But the glory days of 2008–2011 are gone. Another bubble bursts. Wiped out in two years. How fast was this fall?
In 2008, CNN named Shi “China’s Sunshine Boy.” In 2009, Fortune anointed him “China’s new king of solar.” That year, New York Times columnist Thomas Friedman also cited Shi and Suntech as models of China’s green leap forward — which he called “the Sputnik of our day” and a spur for U.S. clean energy policy.
Now, however, the Chinese Sputnik has crashed to Earth, and the Sun King has been toppled.
… a Chinese court declared the company bankrupt after a petition from eight Chinese banks. On Wednesday, the company announced that its 2012 revenue had plunged 48 percent from the previous year.
Suntech — which in 2011 was the world’s biggest seller of silicon-based photovoltaic modules — was once valued at $13 billion on the New York Stock Exchange; it is worth less than 1 percent of that today. A news report that Warren Buffett might be eyeing all or part of Suntech lifted the battered share price more than 80 percent in one week, but acquiring Suntech could be a risky bet. [Buffett denies this according to Bloomberg]
Quartz reports that just last week Suntech cancelled a major order: “This doesn’t look good. Suntech, the embattled Chinese solar panel maker, has canceled $1.3 billion in polysilicon orders…”
Shi was trained in Australia, under our best.
The rise of Suntech was a rags-to-riches story for its founder. Given up for adoption by destitute parents in a farming area, Shi excelled at school and obtained a master’s degree before moving to Australia in 1989 on an exchange program. He accepted a scholarship to do solar cell research at the University of New South Wales, quickly earned a PhD … [The Washington Post]
An old fashioned glut with new fangled technology. There were just too many solar panels in the world that nobody wanted:
China has about two-thirds of the world’s solar panel manufacturing capacity. China alone, theoretically, could supply all of the world’s solar demand and there would be no room for U.S., German, Japanese or Taiwanese companies. [The Washington Post]
And people said that solar was becoming more competitive with coal?
“Analysts said Suntech’s business model, deliberately pushing down prices to capture larger market share despite narrower profit margins, contained the seeds of its own destruction.” [The West]
The last profit was in 2010:
Suntech recorded a net loss of $1.0 billion in 2011, from a profit of $237 million in 2010, according to company filings. The firm has yet to report financial results for 2012. [The West]
What the government giveth, a bureaucrat can take away, and when combined with a rapid expansion at the wrong time, it became the perfect (solar) storm:
… just as Suntech became the world’s largest solar panel manufacturer, the European debt crisis hit. Profit margins collapsed, and customers’ unpaid bills piled up. In June 2012, a puzzled Citigroup analyst noted in a report that as major countries such as Germany and Italy sharply reduced subsidies, Chinese manufacturers led by Suntech boosted capacity 30 percent, “only exacerbating the excess supply conditions.” The report said that “module prices are likely to remain below Suntech’s production costs for the foreseeable future.” [The Washington Post]
If Shi had been doing his research properly and reading skeptical blogs he might have seen the writing on the wall.
Remember how we are supposed to be following China and how China is “going Green”?
Suntech is not unique. Most Chinese solar panel manufacturers have been losing money…
Thanks for the invite to assist with the crowd sourced online survey.
Unfortunately I just can’t see this working.
1. The survey is profoundly anti-science, it’s exactly the kind of thing I debunk on my blog. Consensus is the stuff of politics, not science. Science is not a democracy, and we don’t vote for the laws of physics, which are either right or wrong and not “97% popular”. Hence, any answer you get in this survey (and it appears you already have the answers) has got nothing to do with understanding the climate of Earth. It may possibly be helpful in psychosocial analysis of groupthink in modern science, or the effect of monopsonistic funding on scientific progress, but that brings me to problem 2, even if it were useful for that, you are not the researcher to study that. See point 2.
2. You still refer to us as “deniers” in much of your work. You admitted there was no such thing as a “climate denier” a few months ago (albeit after five years of using the term), but you have not adopted a more useful name, or apologized for abusing the English language. Clearly you think skeptics are run by their lizard brains, unable to think, prone to burying their heads in the sand. (Didn’t you put that on the cover of your book?) So in the end, if skeptics are deniers, and they are so incapable of thinking, it’s not rational for you to want deniers to “help” with this fallacy of a survey, because irrational dullards would be bound to do a bad job. It appears more likely that what you are trying to achieve is a new way to survey people who disagree with your scientific opinion. That would make this a form of confirmation bias for the purpose of PR and not science. (See also point 1.)
If this is what John Cook thinks of skeptics, why would he ask them to “help” his research?
3. Your recent work with Stephan Lewandowsky has rather vaporized any small remnant of trust or goodwill most skeptics feel for you. Hence, as commenters are already saying on threads related to this, what skeptic would bother giving you 10 minutes of their time? One term I heard was “barge-pole”.
4. Lastly there is the question of honesty. You’ve given each skeptic blogger a different survey link, so presumably you can track responses to certain blogs. Yet in the email you tell me that the responses are anonymous, and you don’t disclose that each site was given a trackable link. That’s dishonest. (It reminds me of the time you insisted you had hosted a professional research survey on your blog and, when you couldn’t find it, said, improbably, that you’d deleted it; except it was never on your blog in the first place, was it? There is a peer-reviewed science paper that justifies itself because of that imaginary post of yours; why don’t you correct it? Are you interested in the truth, or just “The Cause”?)
I think that’s got it covered: It’s anti-science, illogical, run by the wrong man, and you haven’t been honest.
But if you can fix all that, then send me an email.
Australia could be leading the world in research that will save lives and earn money. Instead we lead the world in gullible obedience to an absurd scheme to change the weather.
Making human body parts from off-the-shelf or customized cells is a vast improvement on waiting for healthy people to die and donate, and on a lifetime of immunosuppressants. Any delay in developing these techniques will surely cost lives and prolong disability. Those who do develop and patent these techniques are bound to profit and be able to afford even more research, and will improve national productivity as well as the quality of life. For those on waiting lists, spending $3 billion on solar panels to stop storms and hold back the sea is the cruelest waste.
This type of medical work could make waiting lists for organ-donation a thing of the past. The political vanity of supporting useless causes for the sake of symbolism has a ghastly price.
Read about what this poor little girl and her family have had to go through. She is two, and had never been home or tasted food until last month. She has not been able to speak, but might possibly one day. This is marvelous news, which we fervently hope continues to deliver. It is very early days…
Toddler is youngest to ever get lab-made windpipe
Tuesday Apr 30, 2013 | Lindsey Tanner for The Associated Press
South Korean 2-year-old has new windpipe made from her own stem cells; youngest patient ever
CHICAGO (AP) — A 2-year-old girl born without a windpipe now has a new one grown from her own stem cells, the youngest patient in the world to benefit from the experimental treatment.
Hannah Warren has been unable to breathe, eat, drink or swallow on her own since she was born in South Korea in 2010. Until the operation at a central Illinois hospital, she had spent her entire life in a hospital in Seoul. Doctors there told her parents there was no hope and they expected her to die.
The stem cells came from Hannah’s bone marrow, extracted with a special needle inserted into her hip bone. They were seeded in a lab onto a plastic scaffold, where it took less than a week for them to multiply and create a new windpipe.
About the size of a 3-inch tube of penne pasta, it was implanted April 9 in a nine-hour procedure.
Princeton scientists have made a bionic ear that extends human hearing into ranges humans can’t normally hear.
A bionic ear created with 3D printing. It contains an antenna that can pick up radio waves. Credit: Photo by Frank Wojciechowski
May 1, 2013 — Scientists at Princeton University used off-the-shelf printing tools to create a functional ear that can “hear” radio frequencies far beyond the range of normal human capability.
The researchers’ primary purpose was to explore an efficient and versatile means to merge electronics with tissue. The scientists used 3D printing of cells and nanoparticles followed by cell culture to combine a small coil antenna with cartilage, creating what they term a bionic ear.
“In general, there are mechanical and thermal challenges with interfacing electronic materials with biological materials,” said Michael McAlpine, an assistant professor of mechanical and aerospace engineering at Princeton and the lead researcher. “Previously, researchers have suggested some strategies to tailor the electronics so that this merger is less awkward. That typically happens between a 2D sheet of electronics and a surface of the tissue. However, our work suggests a new approach — to build and grow the biology up with the electronics synergistically and in a 3D interwoven format.”
Lüdecke, Hempelmann, and Weiss found that the temperature variation can be explained with six superimposed natural cycles. With only six cycles they can closely recreate the 240 year central European thermometer record. There is little “non-cyclical” signal left, suggesting that CO2 has a minor or insignificant effect.
The three German scientists used Fourier analysis to pick out the dominant cycles of one of the longest temperature records we have. The Central European temperature is an average of records from Prague, Vienna, Hohenpeissenberg, Kremsmünster, Paris, and Munich.
The dominant cycle appears to be about 250 years. There is also a cycle of about 60 years, corresponding to the Atlantic/Pacific decadal oscillations.
Data is, of course, always the biggest problem. If we had 10,000 years of high quality global records, we could solve “the climate” within months. Instead, we have short records, and Lüdecke et al. make the most of what we have. The European records are only 240 years long, or (darn) one dominant cycle, and cover only one region, so to check that the results are valid over longer periods they also analyze the 2000-year Spannagel Cave stalagmite proxy, where the dominant cycle of roughly 250 years is confirmed. To show that the results apply to other parts of the world, they look at the German Alfred Wegener Institut (AWI) Antarctica series.
Ominously, the temperatures of the dominant cycle (in Europe at least) peaked circa 2000 and if the six-driving-cycles do represent the climate then things are going to get cooler, quickly. Wait and see…
Fourier analysis can’t tell us what causes the cycles, but it can tell us the likely frequency, amplitude and phase of those cycles. If these are accurate, it can be used to rule out significant effects from man-made forces and ultimately to predict what will happen next.
Horst-Joachim Lüdecke, EIKE, Jena, Germany
Alexander Hempelmann, Hamburg Observatory, Hamburg, Germany
Carl Otto Weiss, Physikalisch-Technische Bundesanstalt Braunschweig, Germany
In a recent paper  we Fourier-analyzed Central-European temperature records dating back to 1780. Contrary to expectations the Fourier spectra consist of spectral lines only, indicating that the climate is dominated by periodic processes (Fig. 1 left). Nonperiodic processes appear absent or at least weak. In order to test for nonperiodic processes, the 6 strongest Fourier components were used to reconstruct a temperature history.
Fig. 1: Left panel: DFT of the average from 6 central European instrumental time series. Right panel: same for an interpolated time series of a stalagmite from the Austrian Alps.
Fig 2: 15 year running average from 6 central European instrumental time series (black). Reconstruction with the 6 strongest Fourier components (red).
Figure 2 shows the reconstruction together with the Central-European temperature record smoothed over 15 years (boxcar). The remarkable agreement suggests the absence of any warming due to CO2 (which would be nonperiodic) or other nonperiodic phenomena related to human population growth or industrial activity.
For clarity we note that the reconstruction is not to be confused with a parameter fit. All Fourier components are fixed by the Fourier transform in amplitude and phase, so that the reconstruction involves no free (fitted) parameters.
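For readers who want to try the method, here is a minimal sketch of that kind of six-component Fourier reconstruction, run on a synthetic series. This is not the authors’ code or data; the series, its cycles and its noise level are invented purely for illustration:

```python
import numpy as np

# Synthetic "temperature" series: a couple of sine cycles plus noise,
# standing in for a 240-year instrumental record (entirely made up).
rng = np.random.default_rng(0)
t = np.arange(240)
signal = (1.0 * np.sin(2 * np.pi * t / 240)    # one long cycle
          + 0.5 * np.sin(2 * np.pi * t / 60)   # a ~60 year cycle
          + 0.1 * rng.standard_normal(240))    # weak noise

# Discrete Fourier transform of the record.
spectrum = np.fft.rfft(signal)

# Keep only the 6 strongest components (by amplitude); zero the rest.
# Amplitude and phase are fixed by the transform -- no fitted parameters.
k = 6
keep = np.argsort(np.abs(spectrum))[-k:]
filtered = np.zeros_like(spectrum)
filtered[keep] = spectrum[keep]

# Reconstruct the series from those 6 components alone.
reconstruction = np.fft.irfft(filtered, n=len(signal))

# The residual measures whatever "non-cyclical" signal remains.
residual = signal - reconstruction
print(round(float(np.std(residual)), 3))
```

If the record really is dominated by periodic processes, the residual is small compared to the series itself, which is the shape of the argument the paper makes.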
However, one has to be cautious about artefacts. An obvious one is the limited length of the records. The dominant 250 year peak in the spectrum results from only one period in the data. This is clearly insufficient to prove periodic dynamics. Therefore, longer temperature records have to be analyzed. We chose the temperature history derived from a stalagmite in the Austrian Spannagel cave, which extends back 2000 years. The spectrum (Fig. 1 right) shows the 250 year peak in question. The wavelet analysis (Fig. 3) indicates that this periodicity is THE dominant one in the climate history. We also ascertained that a minimum of this 250 year cycle coincides with the 1880 minimum of the central European temperature record.
Fig 3: Wavelet analysis of the stalagmite time series.
Thus the overall temperature development since 1780 is part of periodic temperature dynamics prevailing already for 2000 years. This applies in particular to the temperature rise since 1880, which is officially claimed as proof of global warming due to CO2, but clearly results from the 250 year cycle. The 250 year cycle was driving the temperature drop from 1800 to 1880 (see Fig. 4), which in all official statements is tacitly swept under the carpet. The same general fall and rise shows in the high quality Antarctic ice core record in comparison with the central European temperature records (Fig. 4, blue curve).
ALMOST 150 suspected rorts of the Gillard government’s Renewable Energy Target scheme were reported to the regulator last year, with NSW and federal authorities assisting with the execution of two search warrants as a part of the probe.
The Clean Energy Regulator yesterday released its annual report to government on the administration of the RET — a scheme that provides certificates for both large and small-scale renewable energy generation as part of the bipartisan target of ensuring 20 per cent of Australia’s electricity comes from renewables by 2020.
The regulator’s audit report revealed that during 2012 it received 147 allegations of rorts, the majority of which related to the creation of dodgy certificates for rooftop solar panels.
So far three “monitoring warrants” have been executed by NSW and Australian Federal police. One matter is before the Federal Court as a civil prosecution. One criminal matter was heard last year.
…businessman John Testoni of Sydney Solar Eco Solutions pleading guilty to improperly creating $170,000 in RET certificates for 24 non-existent solar system installations in the Sydney area.
Fake markets just ask to be scammed. Who can forget the Spanish winter of late 2009, when 4,500 MWh of “solar” electricity was generated at night. That was 2.5 million euros of diesel-powered photovoltaic electricity. As Lubos points out, the regulators said they ended the scam, but slightly smarter crooks would just turn off the diesel at dinner time, wouldn’t they? Who was checking?
Higher feed in tariffs for solar power are a falsely high price. No one can tell a solar-powered electron from a coal-fired one, yet one artificially “sells” for more (and these are the same people who say we need a “free market”). The arbitrage position begs to be tested.
Alarmists concerned coal investors will lose money
The same Australian story notes at the end that The Climate Institute are worried about coal investors losing money. Shucks. Coal investments in the US are about $674bn a year and about $5.7bn in Australia. Coal is in an unsustainable bubble they say, “of climate denial, indifference or dreaming.”
Being the great investors they are The Climate Institute warns “It is increasingly clear there will be a lot of wasted investments or stranded assets,…”
But I suspect coal investors probably know something about their competition from The Famed And Glorious Renewables Revolution.
Speaking of which, Scott-the-trader wrote to me yesterday to point out that in the wind-favoured electricity market of South Australia the wind was stilled and electricity spot prices were up: “Pool prices at $200/MWh compared to the rest of the NEM at $60/MWh.” (NEM means National Electricity Market.) He went on, “5 minute pool price in SA has been as high as $12,000/MWh this morning.” I wondered how much power the turbines were making. He replied, “…the next 5 minute snapshot after the one attached was for a net draw by wind from the system of 1MW.”
“Installed capacity of wind in SA >1220MW.”
Running at minus 0.1% capacity then*.
Electricity spot prices across Eastern Australia Monday 29-April-2013
A few hours later things had improved (slightly): “Combined wind output totalled across Vic and SA is right at this moment 52MW from the installed capacity of 2107MW.”
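The capacity figures above reduce to one line of arithmetic. A quick sketch using only the numbers quoted (the `capacity_factor` helper is mine, not an AEMO quantity):

```python
# Capacity factor = actual output / installed capacity.
def capacity_factor(output_mw: float, installed_mw: float) -> float:
    return output_mw / installed_mw

# SA snapshot: wind drawing 1MW *from* the grid, 1220MW installed.
sa = capacity_factor(-1, 1220)
print(f"{sa:.1%}")      # roughly -0.1%

# A few hours later, Vic + SA combined: 52MW from 2107MW installed.
vic_sa = capacity_factor(52, 2107)
print(f"{vic_sa:.1%}")  # roughly 2.5%
```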
But hey. It’s only one day, right?
UPDATE: Commenters were asking about the screen, and whether that pricing information is available publicly on the web. It’s not. But Scott points to the government AEMO site (the Australian Energy Market Operator), and the Wind-Farm Performance site, where data is available for yesterday and for each wind farm individually. We can see that yesterday national wind-generated electricity was running at a bit less than 10% of capacity.
* UPDATE #2: John Hultquist points out that a “draw” on the system means it’s a negative. Scott confirms. The wind towers are using electricity trying to “hunt” for the right wind direction. The word “minus” has been added to that sentence.
Over Easter, psychologist Stephan Lewandowsky moved from Perth to Bristol (lucky UK). He’s the psychologist who is expert in an imaginary group of humans called “Climate deniers”. Neither he, nor anyone else has ever met one but he discovered their imaginary motivations by surveying the confused groups who hate them. As you would, right?
None of the so-called researchers can explain what scientific observations a climate denier, denies. It’s an abuse of English, profoundly unscientific, but has some success in shutting down public debate, if that’s what you want.
Can humans change the weather and stop the storms? If you know we can, Lewandowsky calls that “science”. If you wonder “how much”, you are a denier.
The Royal Society, possibly reaching a tipping point in its rush to abject scientific decay, has immediately awarded him the Royal Society Wolfson Research Merit Award. It’s effectively a top-up on his salary for the next five years, just in case the UK might lose him. While Australia is grateful, scientists everywhere cry. Hat tip to Geoff Chambers
The scheme provides up to 5 years’ funding after which the award holder continues with the permanent post at the host university.
The focus of the award is a salary enhancement, usually in the range of £10,000 to £30,000 per annum.
The Royal Society now “owns” Stephan Lewandowsky’s achievements, for they have decided he is talent from overseas “of outstanding achievement and potential.” We presume they have the internet, and bothered to google the obvious? Apparently, either The Royal Society has not much idea of what they now need to defend, or they have changed their definition of “outstanding achievement”.
I should think that now was an excellent time for concerned fellows to write to The Royal Society and politely ask what that definition is.
What does it take to be “outstanding”?
Does an outstanding Royal Society scientist base their work around name-calling? Hello “deniers”. Do they hail the chosen ones — anointed climate science experts (aka the Gods of Science) — and declare authority to be more important than observations? (How many fallacies-per-page does it take to qualify for RS recognition these days?)
Call me a cynic: Is “outstanding” success now assessed in a pragmatic spirit — say managing the unthinkable — like achieving journal publications and media headlines with only ten results from an online internet survey? It might not be outstanding maths, but it is outstanding PR-for-a-cause, especially given the data doesn’t remotely support the title of the paper or the headline.
Perhaps his real “achievement” is to get ethical approval for work done by researchers who hold their subjects in contempt and don’t bother trying to hide that? That would breach normal National Health and Medical Research Council (NHMRC) ethical standards. It would mean psychology could be used against personal enemies, and taxpayer funded grants be used against taxpayers. None of which is what “outstanding” science should be, or used to be.
The Royal Society was established near the start of the Enlightenment, to promote empirical science over political authority. Its motto: Nullius in verba, Latin for “Take nobody’s word for it”. Now it aids and abets the old Soviet tactic of the medicalization of dissent, to silence and discredit those who promote opinions or facts that the political establishment finds inconvenient. The Lewandowsky connection risks damning the Royal Society forever as just another corrupt political authority, and profoundly anti-science.
But sometimes it’s not some abstruse subtle bias. Sometimes it’s not a good-natured joke. Sometimes people might just be actively working to corrupt your data.
The paper’s thesis was that climate change skeptics are motivated by conspiracy ideation…
Unfortunately, it’s…possible Stephan Lewandowsky wasn’t the best person to investigate this? Aside from being a professor of cognitive science, he also runs Shaping Tomorrow’s World, a group that promotes “re-examining some of the assumptions we make about our technological, social and economic systems” and which seems to be largely about promoting global warming activism. While I think it’s admirable that he is involved in that, it raises conflict of interest questions. And the way his paper is written – starting with the over-the-top title – doesn’t do him any favors.
(if the conflict of interest angle doesn’t make immediate and obvious sense to you, imagine how sketchy it would be if a professional global warming denier was involved in researching the motivations of global warming supporters)
…This then devolved into literally the worst flame war I have ever seen on the Internet…
Voting closes on Tuesday. Look for “Jo Nova“. Click “next” until you get the chance to click Finish, or Submit. Thanks! Overseas votes welcome.
This week in stem cell news, one research group announced they’d accidentally figured out a way to easily convert human bone-marrow stem cells into brain cells, which could in future repair spinal or brain damage. Another group showed that if you happen to be a particular type of old mouse with memory problems, researchers can give you a transplant of stem cells that restores your learning and memory and helps you swim through water mazes faster. But seriously, these discoveries could help a lot of very needy people.
Meanwhile, Australia celebrated its one millionth rooftop solar panel, providing expensive, irregular electricity.
Ladies and Gentlemen — there is a revolution going on, and it’s not the Green one. How much could $2 billion of wasted dollars have achieved if spent wisely?
These two studies fit together quite well — the first shows it’s possible to use stem cells to restore brain function, the second suggests it might be easier to get the right stem cells than anyone thought.
Repairing damaged mouse brains
How’s this for odd, weird, exciting and worrying all at the same time: human embryonic stem cells were implanted into a strain of mouse that does not reject transplants from other species. The cells were cultured with chemicals that help them develop into nerve cells, and they apparently went on to become functional and useful.
[University of Wisconsin-Madison] “After the transplant, the mice scored significantly better on common tests of learning and memory in mice. For example, they were more adept in the water maze test, which challenged them to remember the location of a hidden platform in a pool.
The study began with deliberate damage to a part of the brain that is involved in learning and memory.
Three measures were critical to success, says Zhang: location, timing and purity. “Developing brain cells get their signals from the tissue that they reside in, and the location in the brain we chose directed these cells to form both GABA and cholinergic neurons.”
These cells were placed at the hippocampus, and grew out to connect with the damaged part of the mouse brains (the medial septum). They specialized and connected the right cells together.
This is Big Medicine, and there are big risks. Often, stem cells grow into tumors. This time, the research team say they got it right by coaching the stem cells to differentiate before they were injected.
[Science Daily] “Ensuring that nearly all of the transplanted cells became neural cells was critical, Zhang says. “That means you are able to predict what the progeny will be, and for any future use in therapy, you reduce the chance of injecting stem cells that could form tumors. In many other transplant experiments, injecting early progenitor cells resulted in masses of cells — tumors. This didn’t happen in our case because the transplanted cells are pure and committed to a particular fate so that they do not generate anything else. We need to be sure we do not inject the seeds of cancer.”
Brain repair through cell replacement is a Holy Grail of stem cell transplant, and the two cell types are both critical to brain function, Zhang says. “Cholinergic neurons are involved in Alzheimer’s and Down syndrome, but GABA neurons are involved in many additional disorders, including schizophrenia, epilepsy, depression and addiction.”
Though tantalizing, stem-cell therapy is unlikely to be the immediate benefit. Zhang notes that “for many psychiatric disorders, you don’t know which part of the brain has gone wrong.” The new study, he says, is more likely to see immediate application in creating models for drug screening and discovery.
Converting human bone-marrow into brain cells on demand
Ultimately, who wouldn’t prefer to stay outside the ethical quagmire, avoid embryonic stem cells, and generate your own perfect “transplants” as needed? Our bodies won’t reject our own cells, but it is expensive and difficult to generate a personal cell-line for every patient; this discovery may change that. A group directed by Richard Lerner at Scripps discovered that an antibody can be injected into bone marrow cells and transform them into brain cells. They thought the antibody they were injecting would stimulate the growth of the stem cells, but did not expect it would set off sweeping changes and induce the formation of neural cells. A serendipitous discovery.
[Scripps News] Current techniques for turning patients’ marrow cells into cells of some other desired type are relatively cumbersome, risky and effectively confined to the lab dish. The new finding points to the possibility of simpler and safer techniques. Cell therapies derived from patients’ own cells are widely expected to be useful in treating spinal cord injuries, strokes and other conditions throughout the body, with little or no risk of immune rejection.
“These results highlight the potential of antibodies as versatile manipulators of cellular functions,” said Richard A. Lerner, the Lita Annenberg Hazen Professor of Immunochemistry and institute professor in the Department of Cell and Molecular Biology at TSRI, and principal investigator for the new study. “This is a far cry from the way antibodies used to be thought of—as molecules that were selected simply for binding and not function.”
Lerner says: “With this method, you can go to a person’s own stem cells and turn them into brain cells that can repair nerve injuries.”
This group plans to work with another team who are trying to regenerate nerves in the eye.
What was especially ground-breaking about this was that a single type of molecule transformed the cells, instead of a long, risky series of steps:
Only five years ago Australia had a mere 20,000 solar systems installed on homes across the country. Now thanks to a Gonzo-Big-Daddy-Government we have over one million solar systems, almost all of them producing electricity that could have been made for something like a fifth of the price with coal.
The Clean Energy Regulator spins it as though wasting money on inefficient equipment in the hope of reducing world temperatures is a good thing for Australia.
“…the Clean Energy Regulator, which estimates that those solar energy systems provide power for around 2.5 million Australians. With a population of around 23 million, that means over ten percent of the country benefits from solar power.”
So 10% of Australians benefit from solar, and 90% pay for it?
“The regulator also says the installations have saved Australians about half a billion dollars on electricity bills!”
The regulator doesn’t say how much Australians had to pay to “save” a half a billion dollars.
At the prevailing REC prices, this effectively provided an up-front capital subsidy of $777 million to solar PV systems in 2010.
In October 2012, The Weekend Australian estimated a cost of $3 billion:
SUBSIDIES for the Gillard government’s rooftop solar scheme are threatening to blow out to $3 billion as households rush to install panels to beat price hikes related to the start of the carbon tax.
In NSW that was nearly $300 per household or business:
NSW Energy Minister Chris Hartcher said the combined impact of the carbon tax and the RET added $270 to NSW household and small business annual electricity bills and the state government wanted the RET closed.
The Victorian Auditor-General showed large-scale solar costs about 5.5 times as much as coal, and small-scale rooftop solar would cost even more. The part-time nature of solar power means large-scale baseload providers (like coal-fired power) run less efficiently. There are rarely any actual CO2 savings, and what savings there are, are not cheap. (The Productivity Commission estimated the RET scheme “abated” CO2 emissions at a cost of $177–$497/ton. Tariffs have been reduced since then, but then, on the EU market, CO2 credits sell for 3 Euro a ton.) Peak electricity use at home is before 9am and after 3pm (not the sunniest part of the day). Our baseload power consumption even on the quietest nights is still 60% of the peak. Solar just can’t do that.
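To see how lopsided that abatement price is, here is a minimal sketch. The $177–$497/ton range and the 3 Euro credit price are the figures above; the euro-to-dollar conversion (about 1.30) is my assumption, not a number from the report:

```python
# How many times the EU carbon price does the RET pay per ton "abated"?
ret_low, ret_high = 177, 497   # Productivity Commission range, dollars per ton
eu_price_eur = 3               # EU carbon credit price, euro per ton
eur_to_dollar = 1.30           # assumed exchange rate
eu_price = eu_price_eur * eur_to_dollar
print(f"RET pays {ret_low / eu_price:.0f}x to {ret_high / eu_price:.0f}x "
      f"the going EU price per ton")
```

At those numbers the scheme pays somewhere between roughly 45 and 130 times the market price of the very same “abatement”.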
As usual, fans of solar talk about the “capacity”:
The national installed capacity is now at 2.452 GW from 1,011,478 solar PV systems.
But when it’s dark their capacity is zero GW. The average output works out to be around 5–20% of the total installed capacity. (See also Wikipedia.)
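The gap between nameplate capacity and delivered power is one line of arithmetic. A quick sketch, using the 2.452 GW installed figure and the 5–20% capacity-factor range quoted above:

```python
# Average delivered power = installed capacity x capacity factor.
installed_gw = 2.452                     # national installed PV capacity
for capacity_factor in (0.05, 0.20):     # range quoted for solar PV
    avg_gw = installed_gw * capacity_factor
    print(f"at {capacity_factor:.0%}: {avg_gw:.2f} GW average output")
```

So a million rooftop systems average somewhere between about 0.12 and 0.49 GW, a small fraction of the headline “capacity”.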
If we have spent $3 billion on solar power we could have bought the same electricity from coal instead and had $2 billion to spare for medical research. Buying inefficient solar panels from China is not going to make us a world leader, and it isn’t going to cool the planet. Are we doing it in the hope that the profits from the fake solar market will help someone somewhere “invent” solar power that works? If so, why not just spend the money on research ourselves? We might even discover something worth using and selling.
The irony: the answer to “clean” energy might not be the gossamer sun or the lilting breeze, but an infectious germ.
[Science Daily] …a team from the University of Exeter, with support from Shell, has developed a method to make bacteria produce diesel on demand. While the technology still faces many significant commercialisation challenges, the diesel, produced by special strains of E. coli bacteria, is almost identical to conventional diesel fuel…
They’re not there yet, yields are … tragic.
[BBC] Professor Love said it would take about 100 litres of bacteria to produce a single teaspoon of the fuel.
“Our challenge is to increase the yield before we can go into any form of industrial production,” he said.
But speaking as someone who did microbiology, sooner or later, the bug solution is coming. I presume everyone knows the old exponential growth story where one bacterium weighing 10⁻¹² of a gram doubles every 20 minutes, and if Earth were a cheesecake, 2 days later you’ve converted it into E. coli (and 4000 times over)? (There’s more on this theme here.)
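The cheesecake arithmetic is easy to check. A quick sketch, assuming a cell mass of 10⁻¹² g and an Earth mass of roughly 6×10²⁷ g (both round figures):

```python
# One bacterium doubling every 20 minutes, unchecked, for 2 days.
cell_mass_g = 1e-12              # mass of a single bacterium, grams
earth_mass_g = 6e27              # rough mass of the Earth, grams
doublings = 2 * 24 * 60 // 20    # doublings in 2 days at 20 minutes each
final_mass_g = cell_mass_g * 2 ** doublings
print(f"{doublings} doublings gives {final_mass_g:.1e} g of E. coli, "
      f"about {final_mass_g / earth_mass_g:,.0f} Earth masses")
```

144 doublings later the colony would outweigh the planet a few thousand times over, in line with the “4000 times over” above.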
There is power in them efficient little biology machines. Our chemical factories are mere shadows of the curmudgeonly ‘Coli. Though in the end — even bacteria need to be fed, and these ones will be eating some kind of sugar. It has to come from somewhere.
To create the fuel, the researchers, who were funded by the oil company Shell and the Biotechnology and Biological Sciences Research Council, used a strain of E. coli that usually takes in sugar and then turns it into fat.
Using synthetic biology, the team altered the bacteria’s cell mechanisms so that the sugar was converted to synthetic fuel molecules instead.
Other biofuels are pathetic
Apparently biofuel made from vegetable oil is so bad it’s worse than fossil fuels. See the recent report by Chatham House.
For some inexplicable reason, the EU has decided the UK must use 5% biofuels in its fuel mix starting from last week. (And just who runs the country, eh?) But the fuels distilled from corn and rapeseed oil are not so environmentally friendly. Tropical forests get razed to grow palm oil, hungry people can’t eat the corn that’s fed to cars, it’s expensive (UK motorists need to pay out an extra £460m a year), and it isn’t very good at reducing CO2 (if that mattered). Basically it kills humans and trees, but protects underground rocks.
So some bright spark thought we ought to insist on people using “used cooking oil” — which would’ve been thrown away anyway. But apparently there are too many hungry cars on the road, and the price of “used cooking oil” rose so high that it was a sensible financial decision to buy new palm oil, fry a single dim sim in the vat, and sell the lot at a premium because it was now “used”.
Tony Abbott has a plan to try to convince China and the US to sign up for the “global climate change deal.” As if the world’s number one and two economies, with a combined population of 1.6 billion, would be waiting for instructions. And as if the global climate needed “a deal”. Hey, but we do have 22 million people. Squeak. Squeak.
To make matters worse, Greg Hunt — the opposition spokesman for the environment — said a Coalition Government might not wipe out the emissions reduction target but… wait, they might lift the target instead. Thus taking something useless, expensive and ineffective against a problem-that-doesn’t-exist, and making it more so.
It’s a mistake every which way. The Liberal Party could play them at their own green game and beat them, just by applying common sense. Instead it’s appeasing the politically correct name-callers (who wouldn’t vote for them anyway), and the price it pays is to look weak, irrational and lacking in conviction.
A true environmentalist would stop wasting money on schemes that don’t help the environment. (Why spend a cent cooling Australia by no degrees? There goes the carbon tax…)
If the Liberal Party were serious about protecting the environment, they would promise to drop funding for pointless fantasies and token do-gooder projects and get the science right first. A government that was serious about the environment would use some of the saved funds to set up an entirely new climate science research unit — one that aimed at predicting the climate (inasmuch as that is possible). Better climate models would help farmers, town planners, tourism operators, emergency services, dams and water catchments. It’s not just green, it’s a productivity thing too. Better than a wind farm…
The new unit could compete with the BOM and CSIRO and may the best scientists win.
At the end of the day, if the Greens cared about the air, the temperature and the trees, they would care about the data used to track these things. They would care about the outcomes. Anything less is just “seeming”.