Nature article: $250 million should be spent on climate models able to skillfully simulate clouds & convection
http://hockeyschtick.blogspot.com/2014/11/nature-article-250-million-should-be.html
An article published today in Nature notes multiple and substantial uncertainties and deficiencies in climate models, which are "crucial for predicting global warming." These stem primarily from the low resolution of today's models, which is insufficient to skillfully simulate essential aspects of the climate such as clouds, ocean eddies, convection, the water cycle, thunderstorms, and "crucial components of the oceans" such as "the Gulf Stream and the Antarctic Circumpolar Current" [and ocean oscillations]. As the article mentions, typical climate models use a coarse resolution of about 100 km, but much finer resolutions of 1 km or better are required to skillfully model convection and clouds, far beyond the capability of current supercomputers. The author recommends that a quarter of a billion dollars be spent to create international supercomputing centres for climate modelling before the world spends trillions on mitigation, justified by the precautionary principle, that may or may not be necessary.

As climate scientist Dr. Roger Pielke Sr. has pointed out, and contrary to popular belief, climate models are not based purely on "basic physics"; rather, they rely heavily on parameterizations (fudge factors) for most of the critical aspects of climate, including convection and clouds. As the article below notes, "simulations of climate change are very sensitive to some of the parameters [fudge factors] associated with these approximate representations of convective cloud systems."

However, even if supercomputers capable of handling such high resolution are developed over the next decade, substantial doubt remains about the benefits for climate prediction, owing to the inherent limitations of chaos theory, the multiple flawed assumptions in model code, and the inadequate observations available to initialize such numerical models. These are some of the reasons why two recent papers instead call for a new stochastic approach to climate modeling.

Climate forecasting: Build high-resolution global climate models
Tim Palmer, 19 November 2014

International supercomputing centres dedicated to climate prediction are needed to reduce uncertainties in global warming, says Tim Palmer.

[Image caption: Local effects such as thunderstorms, crucial for predicting global warming, could be simulated by fine-scale global climate models.]

Excerpts:

The drive to decarbonize the global economy is usually justified by appealing to the precautionary principle: reducing emissions is warranted because the risk of doing nothing is unacceptably high. By emphasizing the idea of risk, this framing recognizes uncertainty in the magnitude and timing of global warming.

This uncertainty is substantial. If warming occurs at the upper end of the range projected in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report [1], then unmitigated climate change will probably prove disastrous worldwide, and rapid global decarbonization is paramount. If warming occurs at the lower end of this range, then decarbonization could proceed more slowly and some societies' resources may be better focused on local adaptation measures.

Reducing these uncertainties substantially will take a new generation of global climate simulators capable of resolving finer details, including cloud systems and ocean eddies. The technical challenges will be great, requiring dedicated supercomputers faster than the best today. Greater international collaboration will be needed to pool skills and funds.
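An aside, not from Palmer's article: the gap between the lower and upper ends of the projected warming range can be illustrated with a zero-dimensional energy-balance estimate, in which equilibrium warming is dT = F / lambda for a radiative forcing F and a net climate-feedback parameter lambda, the latter dominated by the uncertain cloud feedbacks discussed below. A minimal Python sketch, using round illustrative numbers rather than output from any climate model:

# Illustrative sketch only: equilibrium warming dT = F / lambda for an assumed
# range of the net feedback parameter. All numbers are round illustrative values.
F_2xCO2 = 3.7  # approximate radiative forcing from doubled CO2, W per m^2

# Smaller lambda (weaker net damping, e.g. stronger cloud amplification)
# means more warming for the same forcing.
for lam in (2.5, 1.9, 1.2, 0.8):  # net feedback parameter, W per m^2 per K
    dT = F_2xCO2 / lam
    print(f"lambda = {lam:.1f} W/m^2/K  ->  equilibrium warming ~ {dT:.1f} K")

Varying the assumed feedback parameter over this plausible range moves the answer from roughly 1.5 K to more than 4.5 K of warming, which is the sense in which projections are "very sensitive" to such parameters.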
Against the cost of mitigating climate change (conceivably trillions of dollars), investing, say, one quarter of the cost of the Large Hadron Collider (whose annual budget is just under US$1 billion) to reduce uncertainty in climate-change projections is surely warranted. Such an investment will also improve regional estimates of climate change, needed for adaptation strategies, and our ability to forecast extreme weather.

Grand challenges

The greatest uncertainty in climate projections is the role of the water cycle, cloud formation in particular, in amplifying or damping the warming effect of CO2 in the atmosphere [2]. Clouds are influenced strongly by two types of circulation in the atmosphere: mid-latitude, low-pressure weather systems that transport heat from the tropics to the poles; and convection, which conveys heat and moisture vertically.

Global climate simulators calculate the evolution of variables such as temperature, humidity, wind and ocean currents over a grid of cells. The horizontal size of cells in current global climate models is roughly 100 kilometres. This resolution is fine enough to simulate mid-latitude weather systems, which stretch for thousands of kilometres. But it is insufficiently fine to describe convective cloud systems that rarely extend beyond a few tens of kilometres.

Simplified formulae known as 'parameterizations' [i.e. fudge factors] are used to approximate the average effects of convective clouds or other small-scale processes within a cell. These approximations are the main source of errors and uncertainties in climate simulations [3]. Moreover, many of the parameters used in these formulae are impossible to determine precisely from observations of the real world. This matters, because simulations of climate change are very sensitive to some of the parameters [fudge factors] associated with these approximate representations of convective cloud systems [4].

Decreasing the size of grid cells to 1 kilometre or less would allow major convective cloud systems to be resolved. It would also allow crucial components of the oceans to be modelled more directly. For example, ocean eddies, which are important for maintaining the strength of larger-scale currents such as the Gulf Stream and the Antarctic Circumpolar Current, would be resolved.

[Image caption: Simulation of convective cloud systems in a limited-area high-resolution climate model.]

The goal of creating a global simulator with kilometre resolution was mooted at a climate-modelling summit in 2009 [5]. But no institute has had the resources to pursue it. And, in any case, current computers are not up to the task. Modelling efforts have instead focused on developing better representations of ice sheets and biological and chemical processes (needed, for example, to represent the carbon cycle), as well as on quantifying climate uncertainties by running simulators multiple times with a range of parameter values.

Running a climate simulator with 1-kilometre cells over a timescale of a century will require 'exascale' computers capable of handling more than 10^18 calculations per second. Such computers should become available within the present decade, but may not become affordable for individual institutes for another decade or more.

Climate facilities

The number of low-resolution climate simulators has grown: 22 global models contributed to the IPCC Fourth Assessment Report in 2007; 59 to the Fifth Assessment Report in 2014.
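A rough back-of-the-envelope scaling, not from the article but consistent with its exascale figure, shows why shrinking grid cells from 100 km to 1 km is so expensive: the horizontal cell count grows with the square of the refinement, and the time step must shrink roughly in proportion to the grid spacing. The baseline throughput assumed at the end is an illustrative guess, not a measured number:

# Back-of-envelope cost scaling for refining a global model from 100 km to 1 km
# cells. Assumptions (illustrative, not the article's): cost per cell per time
# step and the number of vertical levels stay fixed; the time step shrinks in
# proportion to the horizontal grid spacing (a CFL-type constraint).
dx_old, dx_new = 100.0, 1.0                  # horizontal grid spacing, km

horizontal_factor = (dx_old / dx_new) ** 2   # ~100x more cells in each direction
timestep_factor = dx_old / dx_new            # ~100x more time steps per century
total_factor = horizontal_factor * timestep_factor

print(f"horizontal cells: x{horizontal_factor:,.0f}")
print(f"time steps:       x{timestep_factor:,.0f}")
print(f"total cost:       x{total_factor:,.0f}")  # roughly one million

# If a century-long 100 km simulation were feasible at ~1e12 sustained
# calculations per second (an assumed, illustrative baseline), a million-fold
# increase lands near 1e18 per second, the 'exascale' regime cited above.
assumed_baseline = 1e12
print(f"implied requirement: ~{assumed_baseline * total_factor:.0e} calculations per second")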
European climate institutes alone contributed 19 different climate model integrations to the Fifth Assessment database (go.nature.com/3gu8co). Meanwhile, systematic biases and errors in climate models have been only modestly reduced in the past ten years [6]…

Even with 1-kilometre cells, unresolved cloud processes such as turbulence and the effects of droplets and ice crystals will have to be parameterized [fudge-factored] (using stochastic modelling to represent uncertainty in these parameterizations [9]). How, therefore, can one be certain that global-warming uncertainty can be reduced? The answer lies in the use of 'data assimilation' software: computationally demanding optimization algorithms that use meteorological observations to create accurate initial conditions for weather forecasts. Such software will allow detailed comparisons between cloud-scale variables in the high-resolution climate models and corresponding observations of real clouds, thus reducing uncertainty and error in the climate models [10].

High-resolution climate simulations will have many benefits beyond guiding mitigation policy. They will help regional adaptation, improve forecasts of extreme weather, minimize the unforeseen consequences of climate geoengineering, and be key to attributing current weather events to climate change.

High-energy physicists and astronomers have long appreciated that international cooperation is crucial for realizing the infrastructure they need to do cutting-edge science. It is time to recognize that climate prediction is 'big science' of a similar league.
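The excerpt above mentions stochastic modelling as a way to represent the uncertainty in whatever remains parameterized even at 1 km resolution. One common flavour of that idea multiplies the parameterized tendency by a slowly varying random factor, so that an ensemble of runs samples the uncertainty. The toy model, parameter values and function name below are illustrative assumptions, not the scheme of any particular climate model:

# A minimal sketch of stochastically perturbing a parameterized tendency
# (in the spirit of SPPT-type schemes). Everything here is a toy illustration.
import numpy as np

rng = np.random.default_rng(0)

def parameterized_tendency(x):
    """Stand-in for a convective parameterization: relax x toward a reference value."""
    return -0.1 * (x - 1.0)

n_members, n_steps, dt = 5, 500, 1.0
phi, sigma = 0.95, 0.3   # AR(1) memory and amplitude of the multiplicative noise

x = np.zeros(n_members)  # one toy state variable per ensemble member
r = np.zeros(n_members)  # red-noise multiplier for each member
for _ in range(n_steps):
    r = phi * r + np.sqrt(1.0 - phi**2) * sigma * rng.standard_normal(n_members)
    # resolved ('dynamical') tendency plus stochastically perturbed parameterized tendency
    x = x + dt * (0.05 * np.sin(x) + (1.0 + r) * parameterized_tendency(x))

print("ensemble states after 500 steps:", np.round(x, 3))

In this toy setting the five members end up at slightly different states; in a real model, that kind of spread, accumulated over many variables and decades, is one way to quantify how much the fudge factors matter.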
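The data-assimilation software Palmer describes is, at heart, an optimization that blends a model forecast with observations according to their respective error estimates. A minimal scalar sketch of that principle follows; real systems such as 3D/4D-Var or ensemble Kalman filters solve the same problem for millions of variables, and the numbers here are made up for illustration:

# Minimal data-assimilation sketch: combine a model background with an
# observation, weighted by their (assumed) error variances.
x_background = 2.0    # model first guess of some cloud-scale variable
B = 1.0               # background (model) error variance
y_observation = 3.0   # observed value
R = 0.25              # observation error variance

# Gain: how much to trust the observation relative to the model background.
K = B / (B + R)
x_analysis = x_background + K * (y_observation - x_background)

print(f"gain = {K:.2f}, analysis = {x_analysis:.2f}")  # analysis pulled toward the observation

The analysis lands between the model background and the observation, pulled furthest toward whichever has the smaller error variance; comparing such analyses with free-running model clouds is how high-resolution simulations would be confronted with real observations.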