In two separate admissions, two prominent U.S. government scientists have questioned the reliability of the climate models used to predict warming decades and even centuries into the future.
Gary Strand, a software engineer at the federally funded National Center for Atmospheric Research (NCAR), admitted climate model software “doesn’t meet the best standards available” in a comment he posted on the website Climate Audit.
“As a software engineer, I know that climate model software doesn’t meet the best standards available. We’ve made quite a lot of progress, but we’ve still quite a ways to go,” Strand wrote on July 5, 2009, according to the website WattsUpWithThat.com.
Strand’s candid admission prompted skeptical meteorologist Anthony Watts of WattsUpWithThat to ask the following question:
“Do we really want Congress to make trillion dollar tax decisions today based on ‘software [that] doesn’t meet the best standards available?’”
Watts also critiqued the current climate models, noting: “NASA GISS model E written on some of the worst FORTRAN coding ever seen is a challenge to even get running. NASA GISTEMP is even worse. Yet our government has legislation under consideration significantly based on model output that Jim Hansen started. His 1988 speech to Congress was entirely based on model scenarios.”
Another Government Scientist Admits Climate Model Shortcomings
Another government scientist — NASA climate modeler Gavin Schmidt — admitted last week that the “chaotic component of climate system…is not predictable beyond two weeks, even theoretically.”
Schmidt made his admission during a June 29, 2009 interview about the shortcomings of climate models. Schmidt noted that some climate models “suggest very strongly” that the American Southwest will dry in a warming world. But Schmidt also noted that “other models suggest the exact opposite.”
“With these two models, you have two estimates — one says it’s going to get wetter and one says it’s going to get drier. What do you do? Is there anything that you can say at all? That is a really difficult question,” Schmidt conceded.
“The problem with climate prediction and projections going out to 2030 and 2050 is that we don’t anticipate that they can be tested in the way you can test a weather forecast. It takes about 20 years to evaluate because there is so much unforced variability in the system which we can’t predict — the chaotic component of the climate system — which is not predictable beyond two weeks, even theoretically. That is something that we can’t really get a handle on,” Schmidt lamented. [Note: Schmidt has been under fire for his website’s recent scientific woes at RealClimate.org. See: Climate Depot Report: Real Climate Exposed! A Comprehensive Report on the ‘Real’ RealClimate.org – June 30, 2009 ]
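The chaotic sensitivity Schmidt describes can be seen in a small numerical sketch. The following is a hypothetical illustration only (no real climate code is involved): the Lorenz 1963 system, the textbook example of atmospheric chaos. Two runs that differ by one part in a billion in the initial state stay essentially identical at first, then diverge completely, which is why point forecasts of a chaotic component lose skill after a short horizon no matter how good the model is.

```python
# Hypothetical illustration (not any agency's actual model): sensitive
# dependence on initial conditions in the Lorenz 1963 system.

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one forward-Euler step."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

state_a = (1.0, 1.0, 1.0)
state_b = (1.0 + 1e-9, 1.0, 1.0)  # perturbed by one part in a billion

early_gap = 0.0
late_gap = 0.0
for step in range(1, 10001):       # integrate out to t = 50
    state_a = lorenz_step(state_a)
    state_b = lorenz_step(state_b)
    gap = abs(state_a[0] - state_b[0])
    if step == 200:                # t = 1: runs still essentially identical
        early_gap = gap
    if step > 8000:                # late in the run: fully decorrelated
        late_gap = max(late_gap, gap)

print(f"gap at t=1: {early_gap:.2e}, max gap near t=50: {late_gap:.2f}")
```

The tiny initial difference grows by many orders of magnitude, so beyond a short window the two runs are no more alike than two unrelated simulations.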
Climate models made by unlicensed ‘software engineers’
The credibility of these computer model predictions, used by governments to set global warming policy based on future climate risks, has been under increasingly intense scrutiny for years.
In June 2007, Dr. Jim Renwick, a top UN IPCC scientist, admitted that climate models do not account for half the variability in nature and thus are not reliable. “Half of the variability in the climate system is not predictable, so we don’t expect to do terrifically well,” Renwick conceded. (LINK)
Another high-profile UN IPCC lead author, Dr. Kevin Trenberth, echoed Renwick’s sentiments about climate models in 2007, referring to them as “story lines.”
“In fact there are no predictions by IPCC at all. And there never have been. The IPCC instead proffers ‘what if’ projections of future climate that correspond to certain emissions scenarios,” Trenberth wrote on the journal Nature’s blog on June 4, 2007.
Trenberth also admitted that the climate models have major shortcomings because “they do not consider many things like the recovery of the ozone layer, for instance, or observed trends in forcing agents. There is no estimate, even probabilistically, as to the likelihood of any emissions scenario and no best guess.” (LINK)
Climate researcher Dr. Vincent Gray of New Zealand, an IPCC expert reviewer on every single draft of the IPCC reports going back to 1990, author of more than 100 scientific publications, and author of The Greenhouse Delusion: A Critique of “Climate Change 2001,” declared in an April 10, 2007 article that “The claims of the IPCC are dangerous unscientific nonsense.” (LINK) & (LINK)
“All [UN IPCC does] is make ‘projections’ and ‘estimates’. No climate model has ever been properly tested, which is what ‘validation’ means, and their ‘projections’ are nothing more than the opinions of ‘experts’ with a conflict of interest, because they are paid to produce the models. There is no actual scientific evidence for all these ‘projections’ and ‘estimates,’” Gray noted.
In addition, atmospheric scientist Dr. Hendrik Tennekes, a scientific pioneer in the development of numerical weather prediction and former director of research at the Royal Netherlands Meteorological Institute, recently compared scientists who promote computer models predicting future climate doom to unlicensed “software engineers.”
“I am of the opinion that most scientists engaged in the design, development, and tuning of climate models are in fact software engineers. They are unlicensed, hence unqualified to sell their products to society,” Tennekes wrote on February 28, 2007. (LINK)
Climate Models Likened to Sony ‘PlayStation’ Video Games & ‘Tinker Toys’
In a 2007 New Zealand radio interview, the late atmospheric scientist Augie Auer ridiculed climate model predictions: “Most of these climate predictions or models, they are about a half a step ahead of PlayStation 3. They’re really not justified in what they are saying. Many of the assumptions going into [the models] are simply not right.” (LINK)
Atmospheric physicist James Peden ridiculed climate models in October 2008, calling them “computerised tinker toys with which one can construct any outcome he chooses.” (LINK)
In addition, top forecasting experts now say the models violate the basic principles of forecasting.
Prominent physicist Freeman Dyson has likewise referred to climate models as “rubbish.”
Dyson is Professor Emeritus of Physics at the Institute for Advanced Study in Princeton, New Jersey, a fellow of the American Physical Society, a member of the US National Academy of Sciences, and a fellow of the Royal Society of London.
“The fuss about global warming is grossly exaggerated,” writes Dyson in his 2007 book “A Many-Colored Glass: Reflections on the Place of Life in the Universe.” (See: Dyson: Climate models are rubbish – August 14, 2007)
Dyson is blunt in his criticism of climate models, mocking “the holy brotherhood of climate model experts and the crowd of deluded citizens who believe the numbers predicted by the computer models.”
“I have studied the climate models and I know what they can do. The models solve the equations of fluid dynamics, and they do a very good job of describing the fluid motions of the atmosphere and the oceans. They do a very poor job of describing the clouds, the dust, the chemistry, and the biology of fields and farms and forests,” Dyson wrote.
Small Sampling of Reports and Links Challenging Climate Models:
Scientists Claim Computer Model Predictions are ‘Useless Arithmetic’ – February 20, 2007 – Orrin H. Pilkey, a coastal geologist and emeritus professor at Duke University, and his daughter Linda Pilkey-Jarvis, a geologist in the Washington State Department of Geology, wrote a 2007 book entitled “Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future.” The book presents “an overall attack on the use of computer programs to model nature,” according to a February 20, 2007 New York Times book review.

The Times review explained how these models “may include coefficients (the authors call them ‘fudge factors’) to ensure that they come out right. And the modelers may not check to see whether projects performed as predicted.”

“Nature is too complex, they (the authors) say, and depends on too many processes that are poorly understood or little monitored — whether the process is the feedback effects of cloud cover on global warming or the movement of grains of sand on a beach,” the Times article explained. “And instead of demanding to know exactly how high seas will rise or how many fish will be left in them or what the average global temperature will be in 20 years, they argue, we should seek to discern simply whether seas are rising, fish stocks are falling and average temperatures are increasing. And we should couple these models with observations from the field. Models should be regarded as producing ‘ballpark figures,’ they write, not accurate impact forecasts,” the Times article continued.

The coastal models are so flawed, Pilkey argues, that you could dredge up a lot of sand and dump it on the beach “willy-nilly” and end up with the same result, minus the “false mathematical certitude.”
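The “fudge factor” complaint above can be illustrated with a hypothetical toy example (not drawn from any actual climate model): a fit with enough free coefficients can match the historical record perfectly and still fail badly out of sample. Here a degree-8 polynomial is forced through 9 noisy samples of a simple linear trend; it reproduces every observation exactly, then blows up when asked to “project” beyond the data.

```python
# Hypothetical toy example: perfect in-sample fit, useless extrapolation.

def lagrange_eval(xs, ys, x):
    """Evaluate the unique polynomial through the points (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

xs = list(range(9))                              # "observed" years 0..8
noise = [0.2, -0.3, 0.1, 0.25, -0.15, 0.05, -0.2, 0.3, -0.1]
ys = [2.0 * x + n for x, n in zip(xs, noise)]    # true trend is y = 2x

# In-sample: the tuned curve matches every observation exactly.
in_sample_error = max(abs(lagrange_eval(xs, ys, x) - y)
                      for x, y in zip(xs, ys))

# Out of sample: "project" to x = 12, where the true trend gives y = 24.
projection = lagrange_eval(xs, ys, 12.0)
print(f"in-sample error: {in_sample_error:.1e}, "
      f"projection at x=12: {projection:.1f}")
```

The fitted curve is off by thousands at x = 12, even though the underlying trend is a straight line; matching the past is no guarantee of predicting the future.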
Excerpt: Nearly two dozen prominent scientists from around the world have denounced a recent Associated Press article promoting sea level fears in the year 2100 and beyond based on unproven computer model predictions. […] Prominent scientist Professor Nils-Axel Morner declared “the rapid rise in sea levels predicted by computer models simply cannot happen.” […] Internationally known forecasting pioneer Scott Armstrong of the Wharton School at the Ivy League University of Pennsylvania and his colleague Kesten Green of Monash University in Australia: “As shown in our analysis experts’ forecasts have no validity in situations characterized by high complexity, high uncertainty, and poor feedback. To date we are unaware of any forecasts of sea levels that adhere to proper (scientific) forecasting methodology and our quick search on Google Scholar came up short,” Armstrong and Green explained. “Media outlets should be clear when they are reporting on scientific work and when they are reporting on the opinions held by some scientists. Without scientific support for their forecasting methods, the concerns of scientists should not be used as a basis for public policy,” they concluded. […] Ivy League geologist Dr. Robert Giegengack of the University of Pennsylvania explains that sea level is rising only 1.8 millimeters per year (0.07 inches), less than the thickness of one nickel. “Sea level is rising,” Giegengack said, but it’s been rising ever since warming set in 18,000 years ago, he explained. “So if for some reason this warming process that melts ice is cutting loose and accelerating, sea level doesn’t know it. And sea level, we think, is the best indicator of global warming,” he said.
Excerpt: President Obama’s Energy Secretary Steven Chu is at it again. Fresh off his May declarations citing computer model predictions as evidence of a certain climate catastrophe (see: Climate Depot Exclusive: Sec. Chu’s assertions ‘quite simply being proven wrong by the latest climate data’ – April 19, 2009), he has now grown bolder, confidently predicting a certain climate catastrophe by the year 2109. (When he and everyone who hears his warning today will be unable to verify his predictions because they will be conveniently DEAD!) Chu told a conference in California his latest prognostication. “At no other time in the history of science have we been able to say what the future will be 100 years from now,” Chu, the soothsayer, declared according to a June 28, 2009 article in Palo Alto Online News. The question looms: Shouldn’t Energy Sec. Chu be touting these scary predictions of the year 2100 on a boardwalk somewhere with a full deck of Tarot Cards? Chu continued: “For the first time in human history, science has shown that we are altering the destiny of our planet…It’s quite alarming. Every year looks more alarming. … An irony of climate change is that the ones who will be hurt the most are the innocent — those yet to be born.”
Scientists Write Open Letter to Congress: ‘Earth has been cooling for ten years’ – ‘Present cooling was NOT predicted by the alarmists’ computer models, and has come as an embarrassment to them’ – July 1, 2009