
Sacrificing Scientific Skepticism: (Re)Discovering Disproof – Science Writer Talks About Lessons Learned About Climate Change

by Phil Berardelli

Read the previous installment of “Sacrificing Scientific Skepticism” here, and the first installment here.

Summary: A veteran science journalist looks back on more than 15 years of reporting on science and climate issues to offer key lessons, and warnings, about climatology. Journalists and scientists should remain the most skeptical of professionals, particularly in areas like climatology that we don’t understand well.

The Possibility of a “Little Ice Age”

For over a decade, solar scientists have been observing an unusual change in the sun’s activity. They know it’s unusual because the data record stretches back more than 400 years, beginning with Galileo’s telescopic observations in the early 1600s. The record shows that between 1645 and 1715, very few sunspots appeared; in fact, for long stretches the sun’s face was blank. Solar scientists call this period the Maunder Minimum. During approximately the same period, Europe experienced an era known as the Little Ice Age. Rivers such as the Thames in England froze solid for months at a time, something rarely seen before or since, and Europe suffered short summers coupled with long, very cold winters.

Why is this potentially important? Two reasons.

First, many climate scientists have asserted that the buildup of CO2 and other greenhouse gases in the atmosphere exerts a much stronger effect on global temperatures than any changes in solar activity.

Second, the solar behavior observed over the past decade mirrors what happened before the Maunder Minimum. After nearly three decades of studying the solar magnetic field (sunspots are explosive outcroppings of the sun’s magnetism), scientists believe that, perhaps within two decades, Europe will see another Little Ice Age.

In some ways, we could consider the onset of a “little” ice age, if it develops and lasts perhaps a half-century, to be fortunate. Yes, such a temperature shift would no doubt cause difficulty for many people and nations. But the change would also produce two significant benefits. One, it would prove or disprove assertions about the degree of the sun’s direct influence on global temperatures: if sunspots disappear and Earth cools, then there is no question that solar activity is the preeminent climate driver. But if a Maunder Minimum recurs without a corresponding and widespread cooling, then we will know that the extra atmospheric CO2 was sufficient to overcome a diminished sun.

Two, if a new Little Ice Age arrives, we should immediately abandon any thoughts of trying to cool the planet, because it would become obvious that doing so would be a terrible mistake. Instead, scientists and politicians could set their sights on either mitigating or preventing the next ice age, and, in the process, head off what surely would become a global catastrophe for humanity.

That’s yet another item on the list of large unknowns, climate-wise. Setting aside the apparent disagreement between the computer models and the temperature data, assume that global warming is happening and that human CO2 emissions are the primary cause. Could global warming finally disrupt the ice-age cycle, even though at least three supervolcanic eruptions apparently could not? And, if so, should we then regard the presumed driver of global warming—the burning of fossil fuels—as the savior of civilization?

It would be foolish, and potentially dangerous, to draw any such conclusions at this point. Some of the proxy data suggest that during the last interglacial period, about 120,000 years ago, the average temperature on Earth was 2 degrees Celsius higher than it is now. If true, the planet somehow achieved that degree of warmth without the aid of an industrial civilization.
