
Search Results for: lovejoy

Former Obama Scientist Dr. Steve Koonin rebuts attack from 12 climate activist academics: Cites Einstein: ‘If I were wrong, it wouldn’t take a dozen scientists to disprove me – one would be sufficient’

It took 12 climate activist academics to attempt to discredit former Obama scientist Dr. Steve Koonin.

That ‘Obama Scientist’ Climate Skeptic You’ve Been Hearing About: By Naomi Oreskes, Michael E. Mann, Gernot Wagner, Don Wuebbles, Andrew Dessler, Andrea Dutton, Geoffrey Supran, Matthew Huber, Thomas Lovejoy, Ilissa Ocko, Peter C. Frumhoff, Joel Clement

Koonin’s full response to the 12 academics in SciAm:

Koonin responds to a Scientific American article by Oreskes et al.

Scientific American has published a criticism of me and my recent book, Unsettled. Most of that article’s 1,000 words are scurrilous ad hominem and guilt-by-association aspersions from the twelve co-authors. Only three scientific criticisms are buried within their spluttering; here is my response to each of them.

The first criticism concerns rising temperatures: “A recent Washington Post column by conservative contributor Marc Thiessen repeats several points Koonin makes. The first is citing the 2017 National Climate Assessment to downplay rising temperatures, but the report’s very first key finding on the topic says temperatures have risen rapidly since 1979 and are the warmest in 1,500 years.”

In fact, Unsettled explicitly acknowledges a warming globe, but also the problems in comparing instrumental and proxy temperatures that weaken confidence in the “warmest in 1,500 years” claim. The book’s Chapter 5 criticizes in detail the 2017 report’s misleading and inaccurate representation of a different temperature metric, US extreme temperatures. To the surprise of many, the country’s warmest temperatures have not increased since 1960 and are no higher in recent years than they were in 1900.

The authors go on to offer: “The second is Thiessen quoting Koonin’s use of an outdated 2014 assessment on hurricanes to downplay climate concerns.”
But the newer 2017 report finds that human activity has “contributed to the observed upward trend in North Atlantic hurricane activity since the 1970s.”

In fact, Unsettled’s Chapter 6 discusses the description of hurricanes in the 2014 report, in the 2017 report, and in more recent research papers through 2020, including an authoritative 2019 assessment by eleven hurricane experts. None of those studies claims any detectable human influence on hurricanes.

Finally, we’re given: “A third point downplays sea level rise by portraying it as steady over time, cherry-picking reports from the Intergovernmental Panel on Climate Change. In fact, the rate of sea-level rise has quadrupled since the industrial revolution, as climate scientists pointed out years ago when Koonin made this same argument.”

In no sense does Unsettled portray sea level rise as “steady over time”. Rather, the book’s Chapter 8 does quite the opposite, describing the full decadal variability as portrayed in the IPCC reports and subsequent research literature, but somehow omitted from the 2017 National Climate Assessment. The IPCC statement that rates of rise between 1920 and 1950 were likely similar to those of recent decades complicates attribution of recent trends.

It is telling that these three criticisms cite Thiessen’s column rather than what I’ve written in Unsettled. That they are readily countered suggests the authors haven’t read the book or, if they have, that they aren’t acting in good faith. That’s precisely the same unprofessional behavior found in the easily rebutted “fact check” of, again, a review of Unsettled, not the book itself.

To paraphrase a statement attributed to Einstein: “If I were wrong, it wouldn’t take a dozen scientists to disprove me; one would be sufficient.” As I write in Unsettled, I welcome serious, informed discussion of any of the points I raise in the book. Unfortunately, the article by Oreskes et al. falls well short of that standard.

Steven E. Koonin is the author of the bestselling book Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters.

Click to access Koonins-response-to-SciAm.pdf

Related Links:

Former UN scientist Ben Santer outraged, cuts ties to Lawrence Livermore lab because Steve Koonin allowed to speak at seminar

Book Review: Unsettled: What Climate Science Tells Us, What It Doesn’t, And Why It Matters, by Steven E. Koonin

Watch: Obama Energy Dept. scientist Dr. Koonin dissents: Humans have not ‘broken’ the climate system – Climate alarmists ‘misinformed’

Watch: Bill O’Reilly interviews NYU Physicist Koonin on Global Warming and he had some interesting things to say

Obama’s Chief Energy Scientist Declares his Climate Dissent: Physicist Dr. Steven Koonin: Mr. Koonin is a Brooklyn-born math whiz and theoretical physicist, a product of New York’s selective Stuyvesant High School… He would teach at Caltech for nearly three decades, serving as provost in charge of setting the scientific agenda for one of the country’s premier scientific institutions… Served as chief scientist of the Obama Energy Department.

Watch: Former Obama-Biden federal scientist Dr. Steve Koonin declares his climate dissent – Served as former Energy Dept undersecretary

Fox Business Channel – Broadcast March 19, 2021 – ‘Kudlow’ w/ Larry Kudlow

Physicist Dr. Steve Koonin was undersecretary of energy for science during President Obama’s first term and is director of the Center for Urban Science and Progress at New York University.

Physicist Dr. Steve Koonin: “First of all, I think everybody agrees that the globe has warmed about a degree from 1900 until the present. And that warming is due to some combination of human influences and natural influences. But beyond that, almost no severe weather event shows any detectable trend.
There are no long-term trends in droughts or floods around the globe, or in severe weather events like thunderstorms. Sea level is rising at the spectacular rate of one foot per century and was doing it at about the same rate 80 years ago. In the US, record high temperatures are no more frequent than they were in the 1900s. I can go on and on. No detectable human influences on hurricanes. This is not Steve talking; this is what’s in those reports, often explicitly, but sometimes a little bit obscured, and you’ve got to read closely to find it.”

Larry Kudlow: “Actually, I want you to go on and on; I’ll give you 45 or 50 minutes on the show because this is so important. All these people, you know, they got the memo ‘existential threat’. You’re saying factually it’s not true.”

EU Heads Downgrade Climate Change To ‘A Footnote’
GWPF Newsletter, 09/05/19

The Truth About The Latest Mass Extinction Scare

With scientists warning that the window to avoid extreme climate chaos is closing fast and European efforts to cut greenhouse gas emissions still insufficient, protesters called on European leaders to make climate action a top priority and to increase the EU’s climate targets in line with the Paris climate agreement. (Greenpeace, 9 May 2019)

Heads of state and government from the EU-27 signed off on broad-brush ‘ten commitments’ for Europe’s next five years on Thursday (9 May), as they adopted a vague Sibiu Declaration. Greenpeace criticised the Sibiu Declaration for putting climate change as “a footnote to an afterthought in their statement on the future of Europe”. Executive Director Jennifer Morgan said she was “appalled” by the text. (EurActiv, 9 May 2019)

1) EU Heads Downgrade Climate Change To A ‘Footnote’ In Their Statement On Future Of Europe – EurActiv, 9 May 2019
2) The UN’s Extinction Warning Doesn’t Add Up – Toby Young, The Spectator, 9 May 2019
3) The Truth About The Latest Mass Extinction Scare – Ronald Bailey, Reason, December 2017
4) Matt Ridley: Biodiversity Alarmism Doesn’t Work – Reaction, 7 May 2019
5) Green Killers: Dams and Reservoirs Used for Renewable Energy Threaten World’s Rivers – Bloomberg, 9 May 2019
6) Merkel’s Partner CSU Calls For Lowering Taxes Instead Of New Carbon Tax – Clean Energy Wire, 9 May 2019
7) Nick Timothy: When Will Green Zealots Figure Out That Britain Cannot Fight Climate Change Alone?
The Daily Telegraph, 9 May 2019

1) EU Heads Downgrade Climate Change To A ‘Footnote’ In Their Statement On Future Of Europe – EurActiv, 9 May 2019

Heads of state and government from the EU-27 signed off on broad-brush ‘ten commitments’ for Europe’s next five years on Thursday (9 May), as they adopted a vague Sibiu Declaration during the opening stages of an informal summit dedicated to the bloc’s future. The leaders took only a few minutes to agree on the draft text, which was widely circulated last week and covers everything from defence and solidarity to the rule of law and the EU’s role on the global stage.

“We will continue to protect our way of life, democracy and the rule of law,” the declaration reads, concluding that the EU will “jointly tackle global issues such as preserving our environment and fighting climate change”. Less substantial and specific than traditional summit conclusions, the declaration outlines broad strokes on what the EU should focus on in the coming years, rather than suggesting actual courses of action. […]

Environmental activists descended on Sibiu to urge the leaders to keep climate change at the top of their agenda, though the Green Party revealed that the mayor of the city had not granted a permit for the march. Demonstrators still made their voices heard, though, amid heavy security that locals have said is a remnant of the communist period. Greenpeace also criticised the Sibiu Declaration for putting climate change as “a footnote to an afterthought in their statement on the future of Europe”. Executive Director Jennifer Morgan said she was “appalled” by the text.

Full story

2) The UN’s Extinction Warning Doesn’t Add Up – Toby Young, The Spectator, 9 May 2019

Anyone watching the BBC’s News at Ten on Monday would have been surprised to learn that economic growth poses a dire threat to the future of life on this planet.
We’re used to hearing this from climate change campaigners, but I’ve always taken such claims with a pinch of salt, suspecting that the anti-capitalist left is distorting the evidence. Apparently not. ‘One million species at risk of imminent extinction according to a major UN report,’ intoned the BBC. ‘It says the Earth’s ecosystems are being destroyed by the relentless pursuit of economic growth.’

So does this mean the Extinction Rebellion protestors are right? I decided to do some digging to see if one million species really do ‘face extinction in the next few decades’, as the BBC put it. That claim is based on a report by the UN’s Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), but it hasn’t been published yet. All I could find online was a press release put out by the IPBES and a ‘summary’ of the report ‘for policymakers’.

The press release states: ‘The report finds that around one million animal and plant species are now threatened with extinction, many within decades.’ It gives no source for this beyond the as-yet-unpublished report, but the summary makes it clear that it’s partly based on data from the International Union for Conservation of Nature (IUCN) Red List of Threatened Species. The IUCN’s Red List website says that ‘more than 27,000’ species ‘are threatened with extinction’.

So how did the IPBES arrive at the one million figure? The key passage in the summary for policymakers reads as follows: ‘An average of around 25 per cent of species in assessed animal and plant groups are threatened, suggesting that around one million species already face extinction, many within decades, unless action is taken.’ The word ‘suggesting’ is doing a lot of work there. On the Red List website, it says 98,500 species have been ‘assessed’, and the IPBES worked out what percentage 98,500 was of the total number of species and multiplied the 27,000 figure accordingly.
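As a rough check, the straight-line extrapolation Young describes can be reproduced from the figures quoted in the article. This is only an illustration of his description, not the IPBES's published method, which applies different threat rates to different taxonomic groups:

```python
# Back-of-envelope version of the extrapolation described above, using
# only the figures quoted in this article. Illustrative, not the IPBES's
# actual calculation.

assessed = 98_500          # species assessed on the IUCN Red List
threatened = 27_000        # of those, listed as threatened
total_species = 8_700_000  # widely cited estimate of all species (+/- 1.3 million)

threat_rate = threatened / assessed        # close to the "around 25 per cent" quoted
naive_total = threat_rate * total_species  # straight-line extrapolation to all species

print(f"threat rate among assessed species: {threat_rate:.1%}")
print(f"naive extrapolation to all species: {naive_total / 1e6:.1f} million")
```

On these raw numbers the naive extrapolation lands well above one million, which underlines the article's point that the headline figure cannot be reconstructed from the published material alone.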
That’s a difficult calculation to make, given that the number of species in the world is unknown. The most reliable estimate is 8.7 million (with a margin of error of plus or minus 1.3 million), but even the compilers of that stat acknowledge that 86 per cent of all species on land and 91 per cent of those in the seas have yet to be discovered, described and catalogued. So how exactly did the IPBES arrive at the magic one million number? It seems we’re just supposed to take it on faith, which the BBC duly did.

What about the IPBES’s claim that ‘around 25 per cent of species… are threatened’? That seems a little pessimistic, given that the number of mammals to have become extinct in the past 500 years or so is around 1.4 per cent, and only one bird has met the same fate in Europe since 1852. Not bad when you consider how much economic growth there’s been in the past 167 years.

So what does ‘threatened’ mean? The IPBES is using the IUCN’s definition, which is ‘at high risk of extinction in the wild’. Rather implausibly, the IUCN includes species in this category that it designates as ‘vulnerable’, which it defines as facing a ‘probability of extinction in the wild’ of ‘at least 10 per cent within 100 years’. About half the species the IPBES includes in its 25 per cent figure are in this ‘vulnerable’ category.

Full post

3) The Truth About The Latest Mass Extinction Scare – Ronald Bailey, Reason, December 2017

New predictions of animal population doom are likely exaggerated.

Stanford biologist Paul Ehrlich has made a gaudy career of prophesying imminent ecological doom. “In the 1970s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now,” he declared in his 1968 manifesto The Population Bomb. In the subsequent 50 years, as world population more than doubled, the proportion of chronically undernourished people in the world dropped from 33 percent in 1968 to 11 percent now.
Ehrlich is now predicting population doom for the world’s animals. The cause? Human overpopulation, naturally. Ehrlich and his colleagues Gerardo Ceballos and Rodolfo Dirzo describe the allegedly impending “biological annihilation” of about a third of all vertebrate land species in a paper for The Proceedings of the National Academy of Sciences. “The ultimate drivers of those immediate causes of biotic destruction [are] human overpopulation and continued population growth, and overconsumption, especially by the rich,” they argue. “All signs point to ever more powerful assaults on biodiversity in the next two decades, painting a dismal picture of the future of life, including human life.” The crisis supposedly results from “the fiction that perpetual growth can occur on a finite planet”; meanwhile, “the window for effective action is very short, probably two or three decades at most.”

Ehrlich and his colleagues reached those conclusions by taking the International Union for the Conservation of Nature’s data on populations of 27,600 species of mammals, birds, reptiles, and amphibians and overlaying those figures on a grid of 22,000 plots measuring 10,000 square kilometers across all of the continents. The goal is to identify areas where local populations of each species have been extirpated. They report that since 1900 nearly a third of known vertebrate species, 32% (8,851/27,600), are decreasing; that is, they have decreased in population size and range.

This is not the first time the alarm of mass extinction has been raised. In 1970, Dr. S. Dillon Ripley, secretary of the Smithsonian Institution, predicted that between 75 and 80 percent of all species of animals would be extinct by 1995.
In 1979, the Oxford biologist Norman Myers suggested that the world could “lose one-quarter of all species by the year 2000.” Also in 1979, the Heinz Center biologist Thomas Lovejoy chimed in, estimating that between a seventh and a fifth of global diversity would become extinct by 2000. None of those dire predictions came true.

Ehrlich and his dour colleagues are probably wrong too, thanks to human ingenuity and the very trends in “perpetual growth” that they think are threats to biodiversity. First, human population will peak this century at perhaps as few as 8.2 billion people. The United Nations projects that 80 percent of those will be living in cities by 2100, meaning that fewer than 1.6 billion people will be living on the landscape, down from 3.2 billion now. Humanity may already be at peak farmland. If biofuel subsidies are stopped, some researchers project that as much as 400 million hectares of land would be returned to nature by 2060; that is an area double the size of the United States east of the Mississippi River. Many countries have now gone through the forest transition, and their forests are expanding. More broadly, the global rate of deforestation has been declining. Furthermore, there is evidence of “dematerialization”: thanks to technological progress, humanity is using relatively less stuff to obtain more services. Current trends suggest that humanity is likely to withdraw increasingly from nature over the course of this century, thus relinquishing a great deal of territory in which our fellow creatures will be able to thrive.

In fact, a very different and much more positive story can be told about how biodiversity is faring around the world. In a forthcoming book, Inheritors of the Earth: How Nature Is Thriving in an Age of Extinction, University of York conservation biologist Chris Thomas points out that at reasonable scales (say, regions the size of Vermont) humanity has actually been enriching local biodiversity. How?
By moving species around and introducing them to areas where they were previously absent. New Zealand’s 2,000 native plant species have been joined by 2,000 from elsewhere, doubling the plant biodiversity of its islands. Meanwhile, only three species of native plants have gone extinct. In many cases, as I reported in my book The End of Doom, the newcomers may actually benefit the natives. In a 2010 review article in the Annual Review of Ecology, Evolution, and Systematics, the Rutgers ecologist Joan Ehrenfeld reports that rapidly accumulating evidence from many introduced species of plants and animals shows that they improve ecosystem functioning by increasing local biomass and speeding up the recycling of nutrients and energy. Similarly, as a 2012 review article in Trends in Ecology and Evolution notes, “ecological theory does not automatically imply that a global decline in species richness will result in impaired functioning of the world’s ecosystems.” In other words, the foundations of civilization are likely not imperiled even in the dubious event that Ehrlich’s mass extinction occurs.

Full post

4) Matt Ridley: Biodiversity Alarmism Doesn’t Work – Reaction, 7 May 2019

The threat to biodiversity is not new, not necessarily accelerating, mostly not caused by economic growth or prosperity, nor by climate change, and won’t be reversed by retreating into organic self-sufficiency.

Driven perhaps by envy at the attention that climate change is getting, and ambition to set up a great new intergovernmental body that can fly scientists to mega-conferences, biologists have gone into overdrive on the subject of biodiversity this week. They are right that there is a lot wrong with the world’s wildlife, and that we can do much more to conserve, enhance and recover it, but much of the coverage in the media, and many of the pronouncements of Sir Bob Watson, chair of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), are frankly weird.
The threat to biodiversity is not new, not necessarily accelerating, mostly not caused by economic growth or prosperity, nor by climate change, and won’t be reversed by retreating into organic self-sufficiency. Here are a few gentle correctives.

Much of the human destruction of biodiversity happened a long time ago

Species extinction rates of mammals and birds peaked in the 19th century (mostly because of ships taking rats to islands). The last extinction of a breeding bird species in Europe was the Great Auk, in 1844. Thousands of years ago, stone-age hunter-gatherers caused megafaunal mass extinctions in North and South America, Australia, New Zealand and Madagascar with no help from modern technology or capitalism. That’s not to say extinctions don’t still happen, but by far the biggest cause is still invasive alien species, especially on islands: it’s chytrid fungi that have killed off many frogs and toads, avian malaria that has killed off many of Hawaii’s honeycreepers, and so on. This is a specific problem that can be tackled and reversed, but it will take technology and science and money, not retreating into self-sufficiency and eating beans. The eradication of rats on South Georgia island was a fine example of doing this right, with helicopters, GPS and a lot of science.

We’ve been here before

In 1981, the ecologist Paul Ehrlich predicted that 50% of all species would be extinct by 2005. In fact, about 1.4% of bird and mammal species, which are both easier to document than smaller creatures and more vulnerable to extinction, have gone extinct so far in several centuries.

The idea that “western values”, or “capitalism”, are the problem is wrong

On the whole, what really diminishes biodiversity is a large but poor population trying to live off the land. As countries get richer and join the market economy, they generally reverse deforestation, slow species loss and reverse some species declines.
Countries like Bangladesh are now rich enough to be reforesting, not deforesting, and this is happening all over the world. Most of this is natural forest, not plantations. As for wildlife, think of all the species that have returned to abundance in Britain: otters, ospreys, sea eagles, kites, cranes, beavers, deer and more. Why are wolves increasing all around the world, lions decreasing and tigers now holding steady? Basically, because wolves are in rich countries, lions in poor countries and tigers in middle-income countries. Prosperity is the solution, not the problem.

Nothing would kill off nature faster than trying to live off it. When an African villager gets rich enough to buy food in a shop rather than seek bushmeat in the forest, that’s a win for wildlife. Ditto if he or she can afford gas for cooking rather than cutting wood. The more we can urbanise and the more we can increase our use of intensive farming and fossil fuels, the less we will need to clear forests for either food or fuel.

Full post

5) Green Killers: Dams and Reservoirs Used for Renewable Energy Threaten World’s Rivers – Bloomberg, 9 May 2019

Large dams and reservoirs built to provide renewable energy around the world are one of the biggest threats to global river health, a new study has found. Just one-third of the world’s longest rivers are free flowing, with 60,000 large dams used to provide hydropower to populations from Brazil to China blocking most main waterways, according to the research from McGill University and the World Wildlife Fund Inc. Long seen as an effective way to move on from burning fossil fuels, building the infrastructure for hydropower prevents rivers from flowing naturally, which harms agriculture, biodiversity, and access to water supplies. The report claims to be the first global assessment of river health and examined 12 million kilometers (7.5 million miles) of rivers.
The world’s only remaining free-flowing rivers can now be found in underpopulated areas of the planet, including the Arctic and the Amazon Basin, the researchers said. “Free-flowing rivers are important for humans and the environment alike, yet economic development around the world is making them increasingly rare,” said lead author Gunther Grill at Montreal’s McGill.

Climate change is also threatening rivers, as rising global temperatures affect water flow and quality. However, efforts to combat climate change by transforming energy systems to be less carbon intensive mean that the environmental impact of building dams needs to be factored into planning decisions. Some 3,700 hydropower dams are currently planned or under construction, the report said.

Full story

6) Merkel’s Partner CSU Calls For Lowering Taxes Instead Of New Carbon Tax – Clean Energy Wire, 9 May 2019

The Bavarian Christian Social Union (CSU) – part of Chancellor Angela Merkel’s alliance of centre-right parties – is sceptical regarding the introduction of a new CO₂ tax. The party is calling instead for a reduction in existing energy taxes, such as the electricity tax, to steer Germany towards a low-carbon future.

“We as CSU are not prepared to introduce a form of CO₂ pricing which puts an additional burden on citizens,” said Georg Nüßlein, deputy head of the CDU/CSU parliamentary group, at a press briefing in Berlin. Nüßlein said the CSU would prefer a European solution, such as expanding the EU Emissions Trading System (ETS), as its “instrument of choice.” However, should a national solution be necessary, “I propose to check all environment and energy taxes for their relevance to CO₂, and reduce taxes where needed,” said Nüßlein. “For the Union [CDU/CSU alliance], ecologic tax reform means tax reduction reform.” Nüßlein said a resulting economic stimulus and innovation drive would finance part of such a reform.
But climate action also “has to play a different role in federal budget planning in the coming years,” Nüßlein said. “Apart from the energy and climate fund, the current budget positions do not do justice to the issue.”

Full post

7) Nick Timothy: When Will Green Zealots Figure Out That Britain Cannot Fight Climate Change Alone? – The Daily Telegraph, 9 May 2019

Britain is about to take a dangerous leap into the unknown. It will cause a massive economic hit, damaging industry and jeopardising jobs. The Government has no policies to implement its plan. Parliament has barely considered it. MPs are under pressure from zealous activists, who don’t realise that Britain doesn’t lead the world anymore. Yes, the Climate Change Committee is back, and more crazily unilateralist than ever before. Their latest brainwave: to make Britain the first country to cut its carbon emissions to zero. Industry must be decarbonised, conventional cars banned, and a fifth of farmland given up for biomass production and peatland restoration. “Emissions from international aviation,” it warns, “cannot be ignored.” Enjoy your family holiday while you can.

All this can be achieved, the committee says, at no greater economic cost than the original target, set by the Climate Change Act, to reduce carbon emissions from 1990 levels by 80 per cent by 2050. Ignore the fact that the committee produced no credible impact assessment to justify its assertion. The Climate Change Act has already proved more expensive than was ever predicted, or admitted by ministers even today. And all for little reason.

The truth is that, alone, Britain cannot do much to slow or stop climate change. Last year, we emitted 352.9 million tonnes of carbon dioxide. In the past eight years, developing countries increased their emissions by 4,362 million tonnes. Yet Britain is wrecking its economic competitiveness. Industrial electricity prices have gone up by more than 160 per cent since 2004.
And they are getting increasingly uncompetitive: in 2010 they were about average for a western economy; now they are 28 per cent more expensive. We are also pushing up domestic bills. More than 2.5 million households in England – 11.1 per cent of the total – live in fuel poverty. A quarter of households on low and middle incomes struggle to pay their energy bills. So much for helping families who are “just about managing”.

And what are we achieving? Yes, Britain has reduced its emissions by 43.5 per cent compared to 1990 levels. But emissions relating to imports are 28 per cent higher than in 1997, when statistics were first collected. Emissions associated with imports from China are 276 per cent higher. However, it is not only to China and the rest of Asia that we have outsourced our industry and emissions. We’re outsourcing them – along with jobs and prosperity – to other European countries too.

This is because the Climate Change Act ignores the European Emissions Trading Scheme. Under the ETS, total carbon emissions are capped and companies are given emissions allowances. Companies can then trade their allowances, selling their spares to companies that need them. For every tonne of carbon not emitted in Britain as a result of the Climate Change Act, therefore, an extra tonne can be emitted elsewhere in Europe. Britain is going through the hardship and sacrifice of reducing its emissions, only to hand over its allowances, in effect, to European competitors.

Meanwhile, 39 per cent of Germany’s electricity production comes from coal, the dirtiest of all fossil fuels. Seven of Europe’s top 10 carbon emitters are German power stations. And while even America has reduced its carbon emissions by 16 per cent since 2000, Germany’s have fallen by only 10 per cent in that period. Critics of the Climate Change Act warned ministers against reckless unilateralism.
And the Act’s own impact assessment warned that “the economic case for the UK continuing to act alone where global action cannot be achieved would be weak”. Ministers said it was “a contribution to a worldwide effort”, but naively conceded that, “as yet we do not know what the worldwide effort is”. Eleven years on, we know that the worldwide effort has been less intense than Britain’s.

As Prof Dieter Helm, whom ministers asked to review the cost of energy, says: “The cost … is significantly higher than it needs to be.” He adds that the Committee on Climate Change has been too gung-ho in setting carbon budgets, arguing that “as technology moves on … it will be cheaper to reduce carbon tomorrow than today”. Sadly, ministers are keen to listen to a 16-year-old schoolgirl, but not to the esteemed economist and energy expert they themselves commissioned. And now, instead of learning from the mistakes of the Climate Change Act, they look set to repeat them.

Full post

Delingpole: Six Reasons Why You Should Ignore the UN’s Species Extinction Report – ‘It’s politics, not science’

JAMES DELINGPOLE, 7 May 2019

The United Nations has produced a report warning that a million species are threatened with extinction. Here is why you shouldn’t take it seriously.

It’s politics, not science

The Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), which produced the report, is a political organisation, not a scientific one. Just like its sister organisation the Intergovernmental Panel on Climate Change (IPCC), in fact. As Donna Laframboise notes here, both exist purely to give a fig leaf of scientific credibility to the UN’s ‘sustainability’ agenda. When the IPBES was established in 2010, we were informed point blank that its purpose was “to spearhead the battle against the destruction of the natural world.”

In other words, there’s all sorts of deception here. This is no sober scientific body which examines multiple perspectives and considers alternative hypotheses. The job of the IPBES is to muster only one kind of evidence, the kind that promotes UN environmental treaties. That’s how the United Nations works, folks. Machinations in the shadows. Camouflaging its political aspirations by dressing them up in 1,800 pages of scientific clothing.

This is the usual suspects crying wolf. Again

No one would dispute that habitat loss is a problem for plants and animals. But it’s a big stretch from there to suggest that a million species are ‘threatened’ with actual extinction. The ‘E’ word has long been overplayed by environmentalists because it’s so dramatic and final, and because everyone has heard of the dodo. There is no evidence whatsoever, though, that the world is heading for its so-called Sixth Great Extinction. As Willis Eschenbach once pertinently asked at Watts Up With That?: Where Are The Corpses?
Harvard ecologist EO Wilson once estimated that up to 50,000 species go extinct every year. How did he calculate this? Using the same method the IPCC uses for its junk-science prognostications on catastrophic climate change: computer models. Greenpeace co-founder Patrick Moore exploded this myth long ago: Moore said in 2000: “There’s no scientific basis for saying that 50,000 species are going extinct. The only place you can find them is in Edward O. Wilson’s computer at Harvard University. They’re actually electrons on a hard drive. I want a list of Latin names of actual species.” Moore was interviewed by reporter Marc Morano (now with Climate Depot) in the 2000 Amazon rainforest documentary: Environmental activist Tim Keating of Rainforest Relief was asked in the 2000 documentary if he could name any of the alleged 50,000 species that have gone extinct and he was unable. “No, we can’t [name them], because we don’t know what those species are. But most of the species that we’re talking about in those estimates are things like insects and even microorganisms, like bacteria,” Keating explained. R-i-g-h-t. So there are all these species going extinct. But we don’t know what they are because we haven’t yet discovered them. Hmm. Sounds terrible. Let’s cancel Western Industrial Civilisation right now, just in case. Seriously, these people are like a stuck record Here – h/t Dennis Ambler at Homewood’s place – is the Independent from 2006: Life on earth is facing a major crisis with thousands of species threatened with imminent extinction – a global emergency demanding urgent action. This is the view of 19 of the world’s most eminent biodiversity specialists, who have called on governments to establish a political framework to save the planet. Scientists estimate that the current rate at which species are becoming extinct is between 100 and 1,000 times greater than the normal “background” extinction rate – and say this is all due to human activity. 
Anne Larigauderie, executive director of Diversitas, a Paris-based conservation group, said that the situation was now so grave that an international body with direct links with global leaders was essential. The scientists believe that a body similar to the Intergovernmental Panel on Climate Change could help governments to tackle the continuing loss of species. They get away with presenting it as “news” every time because the mainstream media is so thoroughly compliant and dutifully bigs up each scare every time it appears. ‘Nature is in its worst shape in human history’ This is exactly the kind of scaremongering claim the report was designed to generate. It gives environmental correspondents from on-message outfits like the BBC and CBC the excuse to put in a call to their favourite eco-alarmists, who helpfully respond with hysterical drivel like this: “Humanity unwittingly is attempting to throttle the living planet and humanity’s own future,” said George Mason University biologist Thomas Lovejoy, who has been called the godfather of biodiversity for his research. Actually, as Patrick Moore notes, there have been many worse times for species extinction. Moore, in an interview with Climate Depot, refuted the claims of the species study. “The biggest extinction events in the human era occurred 60,000 years ago when humans arrived in Australia, 10-15,000 years ago when humans arrived in the New World, 800 years ago when humans found New Zealand, and 250 years ago when Europeans brought exotic species to the Pacific Islands such as Hawaii,” Moore explained. “Since species extinction became a broad social concern, coinciding with the extinction of the passenger pigeon, we have done a pretty good job of preventing species extinctions,” he continued. “I quit my life-long subscription to National Geographic when they published a similar ‘sixth mass extinction’ article in February 1999. This [latest journal] Nature article just re-hashes this theme,” he added. 
Moore left Greenpeace in 1986 because he felt the organization had become too radical. Polar Bears and Tigers By curious coincidence perhaps the two most overhyped of all doomed species are now enjoying a remarkable recovery, not least because – contrary to the claims of environmentalists – humans actually do care about flora, fauna and diversity and have made great strides in preserving them. It has been a century since the last species of any significance – the passenger pigeon – died out. Almost all the species extinctions that have occurred in the last two centuries have been on islands, the result of predation by invasive species such as rats or cats accidentally introduced by sailors. Polar bear populations have exploded from about 5,000 sixty years ago to around 26,000 now – making a mockery of their status as an emblem of man-made environmental catastrophe. Meanwhile, the number of tigers in India has risen dramatically in the last decade, according to the Irish Times: The estimated population of the endangered big cat has increased from 1,411 in 2006 to 2,226 in 2014, according to the report published by the Indian government’s National Tiger Conservation Authority. Read the small print When you get to the bottom of the scaremongering report, the authors show their true colours. Here is the BBC’s summary: The study doesn’t tell governments what to do, but gives them some pretty strong hints. One big idea is to steer the world away from the “limited paradigm of economic growth”. They suggest moving away from GDP as a key measure of economic wealth and instead adopting more holistic approaches that would capture quality of life and long-term effects. They argue that our traditional notion of a “good quality of life” has involved increasing consumption on every level. This has to change. 
Yes, we’re back to our old friends – Agenda 21 and sustainability – the UN’s code phrases for a new world order in which technocrats of the international elite impose their globalist agenda of wealth redistribution, regulation, enforced renewables, higher taxes and enforced rationing on sovereign nations in the name of ‘saving the planet.’ If the UN really cared about species extinction, of course, it would be doing the exact opposite. As Jo Nova points out: 1. The worst pollution is in countries with a low income per capita — when people are hungry they raze forests. The most polluted cities are in places like Ghana, Ukraine, Bangladesh, Zambia, Argentina, and Nigeria.  The most deforestation occurs in Brazil, Indonesia, Russia, and Mexico. The worst air is in India and China. 2. Only rich nations have the resources to save the environment. 3. Countries that produce more CO2 are richer. Ignore everything the UN tells you about the environment. It’s drivel – and dangerous drivel at that.

Reality Check: Mass extinction fears ‘have little support from science’ – Claimed losses ‘are absurdly large’ About the mass extinctions supposedly occurring now By Larry Kummer Summary: More fears fed by activists. (See: ‘Terror being waged on wildlife’, leaders warn) As the papers cited below show, they have little support from science – but are seldom rebutted by scientists. Each round of such exaggerated claims erodes away the public’s trust. The leaders of science institutions show little interest in fixing this problem. We cannot afford to have science’s credibility squandered for short-term political gains. “Biodiversity is decreasing at an alarming rate with more than 10,000 species disappearing each year.” — Opening speech at the 48th Plenary of the IPCC by Jian Liu (UN Environment’s Chief Scientist), 1 October 2018. This is a bad beginning for the rollout of the IPCC’s SR15, the special report “Global Warming of 1.5 ºC.” It is a claim from leftist advocates based on zombie science (much like Zombie Economics on the Right): a claim with mysterious origins that is impossible to kill. Since the numbers are made up, they vary widely. “Every hour, three species disappear. Every day, up to 150 species are lost. Every year, between 18,000 and 55,000 species become extinct. The cause: human activities. …Climate change is one of the major driving forces behind the unprecedented loss of biodiversity.” — Speech on 21 May 2007 by Ahmed Djoghlaf, then Executive Secretary of the Convention on Biological Diversity under the United Nations Environment Programme (UNEP). “If there are 100,000,000 different species on Earth, and the extinction rate is just 0.01% per year, then 10,000 species go extinct every year.” — Website of the World Wildlife Fund. The WWF’s claims are uncritically repeated (a lot) by journalists (e.g., in USA Today). Those numbers are absurdly large. Also, there are many causes of species extinction. 
Climate change is today a far smaller factor than habitat loss and pollution (to name just two). Three decades of exaggerations like this have eroded away much of scientists’ credibility. Whether these are noble lies or just excess enthusiasm, they decrease our ability to prepare for the severe environmental damage almost certain as the world’s population grows to 10 billion (or more). Here is a look at the state of the science at present. As usual, it is a debate about models — with the key facts uncertain and the key factors poorly understood. The science There is little support in the peer-reviewed literature for those wild numbers about current extinction rates. Let’s start with the hard facts as described by Endangered Species International — “More than 16,000 species are threatened to become extinct in the near future.” “Of the 44,838 species assessed worldwide using the IUCN Red List criteria, 905 are extinct {was 784 in 2006} and 16,928 are listed as threatened to be extinct.” Modeling can produce far larger estimates, though they vary by an order of magnitude. But they have only weak empirical foundations, as they rest on poorly understood dynamics with weak estimates of key factors. Worse, they are impossible to prove or disprove today. Here are a few of the papers saying that extinction rates are high and warning about future rates of extinction. Note they discuss extinction rates in terms of multiples of the “natural” or “background” rate, E/MSY (extinctions per million species-years), or per cent extinct at some future point. They do not give sensational numbers of “species going extinct every day.” “Using projections of species’ distributions for future climate scenarios, we assess extinction risks for sample regions that cover some 20% of the Earth’s terrestrial surface. 
Exploring three approaches in which the estimated probability of extinction shows a power-law relationship with geographical range size, we predict, on the basis of mid-range climate-warming scenarios for 2050, that 15–37% of species in our sample of regions and taxa will be ‘committed to extinction’.” — “Extinction risk from climate change” by Chris D. Thomas et al. in Nature, 8 January 2004. Gated; open copy here. Fourteen years later (30% of the way to their 2050 horizon), we see little evidence of the predicted mass extinctions. “The consensus of scientists is that the current global rate of species extinctions is on average somewhere between 100 and 1,000 times greater than pre-human levels (the natural background extinction rate) …” — Sustaining Life: How Human Health Depends on Biodiversity, edited by Eric Chivian and Aaron Bernstein (2008). This estimate remains the consensus ten years later. “In sum, present extinction rates of ~100 E/MSY and the strong suspicion that these rates miss extinctions even for well-known taxa, and certainly for poorer known ones, means present extinction rates are likely a thousand times higher than the background rate of 0.1 E/MSY.” — “The biodiversity of species and their rates of extinction, distribution, and protection” by S. L. Pimm et al., Science, 30 May 2014. Ungated copy. “If we follow our current, business-as-usual trajectory [representative concentration pathway (RCP) 8.5; 4.3°C rise], climate change threatens one in six species (16%).” — “Accelerating extinction risk from climate change” by Mark C. Urban in Science, 1 May 2015. Gated. Open copy here. That’s the high-end estimate, since RCP8.5 is the worst-case scenario in the IPCC’s AR5. So it is unlikely (details here). 
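The E/MSY figures above convert into annual counts with one line of arithmetic. A minimal sketch of that conversion, assuming the ~1.5 million named species cited later in this article (`extinctions_per_year` is a hypothetical helper, not from any cited paper):

```python
# Sanity-check the extinction-rate arithmetic quoted in the article.
# Assumption (from Costello et al., quoted later): ~1.5 million named species.
NAMED_SPECIES = 1.5e6

def extinctions_per_year(rate_emsy: float, n_species: float) -> float:
    """E/MSY = extinctions per million species-years, so the expected
    number of extinctions per year is rate * (n_species / 1e6)."""
    return rate_emsy * n_species / 1e6

background = extinctions_per_year(0.1, NAMED_SPECIES)  # Pimm et al.'s background rate
present = extinctions_per_year(100, NAMED_SPECIES)     # Pimm et al.'s present estimate

print(f"background rate: ~{background:.2f} extinctions/year")  # ~0.15
print(f"present rate:    ~{present:.0f} extinctions/year")     # ~150

# The '150 species per day' claim quoted earlier implies:
per_day_claim = 150 * 365
print(f"'150/day' implies {per_day_claim:,}/year, "
      f"{per_day_claim / present:.0f}x the high-end figure above")  # 54,750/year, 365x
```

Even the high-end 100 E/MSY estimate, applied to every named species, yields on the order of 150 extinctions per year — which is why, as noted above, the peer-reviewed papers do not produce the sensational “per day” numbers quoted in the speeches.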
“Even under our assumptions, which would tend to minimize evidence of an incipient mass extinction, the average rate of vertebrate species loss over the last century is up to 100 times higher than the background rate.” — “Accelerated modern human–induced species losses: Entering the sixth mass extinction” by Gerardo Ceballos, Paul R. Ehrlich, et al., Science Advances, 19 June 2015. They note that only 477 vertebrates have gone extinct since 1900. Papers giving a more optimistic perspective There are quite a few of these. Here are two. “Re-assessing current extinction rates” by Nigel E. Stork in Biodiversity and Conservation, February 2010. Gated. Open copy. He cites the overwhelming peer-reviewed research evidence that claims of mass extinctions occurring today are exaggerated or false, and explains the reasons for these errors. Conclusions … “So what can we conclude about extinction rates? First, less than 1% of all organisms are recorded to have become extinct in the last few centuries and there are almost no empirical data to support estimates of current extinctions of 100 or even one species a day. “Second, the most frequently used predictions for global extinction rates are still largely based on the species–area relationship and the fact that large areas of forests (in particular) are being converted. As Lewis (2006) suggested, these and the first models of the impacts of climate change on biodiversity are first passes, with subsequent more sophisticated analyses frequently reducing first estimates of extinction rates (see Table 1). With the increasing evidence that some species appear to survive in regrowth forests (Chazdon et al. 2009) the key question is how long are the time lags to extinction for those remaining species which are unable to survive in regrowth or fragmented forests? 
“Third, the evidence is so overwhelming that extinction threats vary for different groups of organisms and different faunas and floras that it is surprising that there are still some who seek to draw conclusions on global extinction rates for all organisms based on the knowledge of just a few very highly threatened groups. … “In contrast to the lack of evidence for mass global extinctions, there is considerable evidence for widespread loss of species at the local and regional level and I suggest that this and the consequences of such losses on ecosystem function should be key foci for future research.” “Species–area relationships always overestimate extinction rates from habitat loss” by Fangliang He and Stephen P. Hubbell in Nature, 19 May 2011. Gated. “Extinction from habitat loss is the signature conservation problem of the twenty-first century. Despite its importance, estimating extinction rates is still highly uncertain because no proven direct methods or reliable data exist for verifying extinctions. “The most widely used indirect method is to estimate extinction rates by reversing the species–area accumulation curve, extrapolating backwards to smaller areas to calculate expected species loss. Estimates of extinction rates based on this method are almost always much higher than those actually observed. This discrepancy gave rise to the concept of an ‘extinction debt’, referring to species ‘committed to extinction’ owing to habitat loss and reduced population size but not yet extinct during a non-equilibrium period. “Here we show that the extinction debt as currently defined is largely a sampling artefact due to an unrecognized difference between the underlying sampling problems when constructing a species–area relationship (SAR) and when extrapolating species extinction from habitat loss. 
The key mathematical result is that the area required to remove the last individual of a species (extinction) is larger, almost always much larger, than the sample area needed to encounter the first individual of a species, irrespective of species distribution and spatial scale. We illustrate these results with data from a global network of large, mapped forest plots and ranges of passerine bird species in the continental USA; and we show that overestimation can be greater than 160%. “Although we conclude that extinctions caused by habitat loss require greater loss of habitat than previously thought, our results must not lead to complacency about extinction due to habitat loss, which is a real and growing threat.” An important reminder from John C. Briggs (professor of marine science, University of South Florida) in Science, 14 November 2014. “Most extinctions have occurred on oceanic islands or in restricted freshwater locations, with very few occurring on Earth’s continents or in the oceans.” Dodo by Frederick William Frohawk from Walter Rothschild’s Extinct Birds. Non-technical articles about these questions Even the leftists at the BBC ask if “Biodiversity loss: How accurate are the numbers?” (2012). “Current estimates of the number of species can vary from, let’s say, two million species to over 30 or even 100 million species,” says Dr Braulio Dias, executive secretary of the Convention on Biological Diversity. “So we don’t have a good estimate to an order of magnitude of precision,” he says. “But if it’s really true that up to 150 species are being lost every day, shouldn’t we expect to be able to name more than 801 extinct species in 512 years?” For a good summary of the science, see “Global Extinction Rates: Why Do Estimates Vary So Wildly?” by Fred Pearce at YaleEnvironment360, August 2015 — “Is it 150 species a day or 24 a day or far less than that? Prominent scientists cite dramatically different numbers when estimating the rate at which species are going extinct. 
Why is that?” For a non-technical summary of these issues, and actual good news about extinction rates (i.e. that the news is bad but not catastrophic), see “Rethinking Extinction” by Stewart Brand at Aeon, April 2015 – “The idea that we are edging up to a mass extinction is not just wrong – it’s a recipe for panic and paralysis.” Brand edited the Whole Earth Catalog (1968-74). Now he is president of the Long Now Foundation and co-founder of the Revive and Restore project in San Francisco. Also see his articles about de-extinction. In their 2015 Report to the Club of Rome, On the Edge: The State and Fate of the World’s Tropical Rainforests, Claude Martin and Thomas E. Lovejoy walked back some of the WWF’s claims. Martin is a former director of WWF International. Lovejoy is “the Godfather of Biodiversity” and a professor of environmental science at George Mason University. “Ariel Lugo found that the massive deforestation in Puerto Rico, which lost up to 99% of the primary forest area, did not lead to the massive extinction expected by Myers et al. After 500 years of human pressure, only seven bird species became extinct, which corresponded to 11.6% of the indigenous bird fauna. Introduced species had even increased the number of species on the island. “Similar observations were made in El Salvador, which had lost more than 90% of its natural forest even before the war of the 1980s and since then experienced some forest resurgence. Despite massive forest cover loss, El Salvador seems to have preserved impressive levels of biodiversity …Of the 508 bird species …at the end of the 1990s those in danger of extinction numbered 117, but only 3 were then believed to be extinct.” Who are those extinct animals? Mostly bugs. For the most accurate list of extinct and endangered species, see the IUCN Red List of extinctions. Wikipedia posts this in a more easily viewed form. Seldom mentioned in the alarmist articles is the big fact: most Animalia are bugs. 
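The “reversed species–area curve” method that He and Hubbell criticise, and that the Puerto Rico numbers above contradict, can be written down in a few lines. Under the classic power-law SAR, S = cA^z, cutting habitat to a fraction f of its original area predicts that only f^z of the species survive. A minimal sketch (the exponent z = 0.25 is a commonly used textbook value, not a figure from the article, and the function name is mine):

```python
def sar_fraction_surviving(area_fraction: float, z: float = 0.25) -> float:
    """Backward species-area extrapolation: with S = c * A**z, reducing
    habitat to a fraction f of its original area predicts that f**z of the
    species survive; the rest are 'committed to extinction'."""
    return area_fraction ** z

# Puerto Rico-style scenario from the article: ~99% of primary forest lost.
surviving = sar_fraction_surviving(0.01)
print(f"SAR predicts {1 - surviving:.0%} of species lost")  # ~68%
# The article reports only ~12% of the indigenous bird fauna actually went
# extinct, illustrating He and Hubbell's point that the method overpredicts:
# the area needed to remove a species' last individual is much larger than
# the area needed to first encounter it.
```

Under these illustrative assumptions the backward SAR predicts roughly 68% species loss for 99% habitat loss, several times the ~12% actually observed in Puerto Rico — the kind of gap the He and Hubbell paper explains.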
“There is a story, possibly apocryphal, of the distinguished British biologist, J.B.S. Haldane, who found himself in the company of a group of theologians. On being asked what one could conclude as to the nature of the Creator from a study of his creation, Haldane is said to have answered, ‘An inordinate fondness for beetles.’” — “Homage to Santa Rosalia or Why Are There So Many Kinds of Animals?” by G. E. Hutchinson in The American Naturalist, May-June 1959. Hat tip to the Quote Investigator. “It has been suggested that we do not know within an order of magnitude the number of all species on Earth. Roughly 1.5 million valid species of all organisms have been named and described. Given Kingdom Animalia numerically dominates this list and virtually all terrestrial vertebrates have been described, the question of how many terrestrial species exist is all but reduced to one of how many arthropod species there are. With beetles alone accounting for about 40% of all described arthropod species, the truly pertinent question is how many beetle species exist. — “New approaches narrow global species estimates for beetles, insects, and terrestrial arthropods” by Nigel E. Stork et al. in PNAS, 16 June 2015. Can we even count the number of animals? Calculations of extinction rates require knowing the number of species. Estimates vary widely. But scientists are making progress. Slow progress. None of the estimates are remotely close to the WWF claim (above) that there are 100 million species. “8.7 Million: A New Estimate for All the Complex Species on Earth” by Daniel Strain, Science, 26 August 2011. Ungated copy here. “Can We Name Earth’s Species Before They Go Extinct?” by Mark J. Costello, Robert M. May, and Nigel E. Stork in Science, 25 Jan 2013. Gated. Open copy here. “Some people despair that most species will go extinct before they are discovered. 
However, such worries result from overestimates of how many species may exist, beliefs that the expertise to describe species is decreasing, and alarmist estimates of extinction rates. We argue that the number of species on Earth today is 5 ± 3 million, of which 1.5 million are named. New databases show that there are more taxonomists describing species than ever before, and their number is increasing faster than the rate of species description. …Extinction rates are, however, poorly quantified, ranging from 0.01 to 1% (at most 5%) per decade.” Despite the rebuttals, fears that “most species will go extinct before they are discovered” are still repeated, and the rebuttals ignored. As in “Species, extinct before we know them?” by Alexander C. Lees and Stuart L. Pimm in Current Biology. Also, some scientists are skeptical about the progress: “Global species richness estimates have not converged” by M. Julian Caley et al. in Trends in Ecology and Evolution, April 2014. Gated. Open copy here. Opening … “Our ability to estimate the total number of species that live on our planet, or in any of its major habitats, realms, or ecosystems, has immense practical and symbolic importance. Global species richness, whether estimated by taxon, habitat, ecosystem, or the entire planet, is a key metric of biodiversity. In the absence of agreed, and relatively certain, estimates of global species richness, we are unable to adequately understand the magnitude of what is at risk from global change during the Anthropocene, or our successes and failures in mitigating those risks and remediating impacts. “{I}f our ability to estimate species richness has been improving, our estimates should be converging, and the uncertainty around them progressively decreasing. Indeed, such convergence in global species richness estimates has recently been claimed. 
Here we review published estimates of global species richness, but argue instead that these estimates have failed to converge over more than six decades of research.” But there is progress on the bug front: “New approaches narrow global species estimates for beetles, insects, and terrestrial arthropods” by Nigel E. Stork et al. in PNAS, 16 June 2015 (the abstract’s opening is quoted above). “…Here we present four new and independent estimates of beetle species richness, which produce a mean estimate of 1.5 million beetle species. We argue that the surprisingly narrow range (0.9–2.1 million) of these four autonomous estimates – derived from host-specificity relationships, ratios with other taxa, plant:beetle ratios, and a completely novel body-size approach – represents a major advance in honing in on the richness of this most significant taxon, and is thus of considerable importance to the debate on how many species exist. “Using analogous approaches, we also produce independent estimates for all insects, mean: 5.5 million species (range 2.6–7.8 million), and for terrestrial arthropods, mean: 6.8 million species (range 5.9–7.8 million), which suggest that estimates for the world’s insects and their relatives are narrowing considerably.” More bug progress: “How Many Species of Insects and Other Terrestrial Arthropods Are There on Earth?” by Nigel E. Stork in the Annual Review of Entomology, January 2018. 
“In the last decade, new methods of estimating global species richness have been developed and existing ones improved through the use of more appropriate statistical tools and new data. Taking the mean of most of these new estimates indicates that globally there are approximately 1.5 million, 5.5 million, and 7 million species of beetles, insects, and terrestrial arthropods, respectively. Previous estimates of 30 million species or more based on the host specificity of insects to plants now seem extremely unlikely. With 1 million insect species named, this suggests that 80% remain to be discovered …” # Related Links: Next Eco-Scare is Here! ‘Biodiversity’: ‘The new Big Lie’: The green movement is ditching ‘Climate Change’ in favor of species extinction fears – ‘The independent platform will in many ways mirror the UN IPCC’ and ‘provide gold standard reports to governments.’ ‘Gold standard’, eh? Now where have I heard that phrase before? — Suddenly it becomes clear why they kept Pachauri on at the IPCC. Because the IPCC simply doesn’t matter any more’ — ‘Not only does the great big new Biodiversity scam already have its own IPCC but it even has its own pseudoeconomic, panic-generating Stern Report.’ Greenpeace Co-Founder mocks human extinction claim: ‘We are presently the most successful species on the planet’ – Greenpeace Co-Founder & Ecologist Dr. Patrick Moore challenges specious species claims: ‘That is so 1970s. Paul Ehrlich is pathetic and has been crying wolf for decades. While he pontificated doom for starving millions in the 1970s from his Ivory Tower at Stanford.’ … Moore bluntly mocked species extinction claims made by biologist Edward O. Wilson from Harvard University. Wilson estimated that up to 50,000 species go extinct every year based on computer models of the number of potential but as yet undiscovered species in the world. Moore said in 2000: “There’s no scientific basis for saying that 50,000 species are going extinct. 
The only place you can find them is in Edward O. Wilson’s computer at Harvard University. They’re actually electrons on a hard drive. I want a list of Latin names of actual species.” Moore was interviewed by reporter Marc Morano (now with Climate Depot) in the 2000 Amazon rainforest documentary: Environmental activist Tim Keating of Rainforest Relief was asked in the 2000 documentary if he could name any of the alleged 50,000 species that have gone extinct and he was unable. “No, we can’t [name them], because we don’t know what those species are. But most of the species that we’re talking about in those estimates are things like insects and even microorganisms, like bacteria,” Keating explained. … UK scientist Professor Philip Stott, emeritus professor of Biogeography at the University of London, dismissed current species claims in the 2000 Amazon rainforest documentary. “The earth has gone through many periods of major extinctions, some much bigger in size than even being contemplated today,” Stott, the author of a book on tropical rainforests, said in the 2000 documentary. “Change is necessary to keep up with change in nature itself. In other words, change is the essence. And the idea that we can keep all species that now exist would be anti-evolutionary, anti-nature and anti the very nature of the earth in which we live,” Stott said. 1972 Article Unearthed: ‘Worse than Hitler’: ‘Population Bomb’ author Paul Ehrlich suggested adding a forced sterilization agent to ‘staple food’ and ‘water supply’ – Warned of ‘Unpredictable climatic effects’ — Called on U.S. 
to ‘de-develop’ Flashback: Greenpeace Co-Founder Slams Species Extinction Scare Study as proof of how ‘peer-review process has become corrupted’ – Study ‘greatly underestimate the rate new species can evolve’ Time for Next Eco-Scare: ‘As the global warming bubble deflates, another scare is being inflated – species extinction’ ‘Warmist Ilya Maclean’s incredible hockey stick: 200 years after the dawn of the Industrial Revolution, he can’t name one CO2-induced species extinction’ – But ‘In 89 years, he suggests there may be over one million CO2-induced species extinctions’ Species Extinction Rates Grossly Overestimated — Species loss claims in UN IPCC report are ‘fundamentally flawed’ Species Extinction Rates Grossly Overestimated: ‘We don’t even know how many species actually exist’ Flashback: UN Species Scares Debunked: ‘Persistence of Species’: ‘Earth’s plant & animal species are not slip-sliding away – even slowly – into netherworld of extinction’ Another New Analysis: Plant and Animal Response to Global Warming Flat-Earthers Are Also Climate Alarmists Who Fear Human Extinction 3 New Papers: Permian Mass Extinction Coincided With Global Cooling, Falling Sea Levels, And Low CO2 ESA is ‘Federally funded fiction’ – Report on U.S. 
Endangered Species Act exposes federal deception – ‘The ESA is so ineffective that taxpayer dollars are used to fabricate successes—and many species now listed should be removed from the list as mistakes or extinct’ Greens Should Cheer! Earth’s problems solved! BBC: Sperm count drop ‘may lead to human extinction’ – Humans could become extinct if sperm counts in men from North America, Europe and Australia continue to fall at current rates, a doctor has warned. Oops: One Of Greatest Mass Extinctions Was Due To An Ice Age And Not Global Warming New study: Great mass extinction caused by ice age, not global warming – Scientists have long believed a mass die-off was caused by global warming. A new study shows differently. Yet Another Mistake For UN IPCC!? Claims About Extinction ‘confusing’ — IPCC Scientists ‘acknowledge there was inconsistency and flawed writing’ in extinction section – ‘In the Summary for Policy Makers of the report on climate impacts, there are different summations of extinction risk within a few pages’ Are 30 thousand species going extinct every year? – These exaggerations are typical of climate activists, making their massive project one of the most incompetent publicity campaigns ever. As the years roll by, people will wonder about those forecasts that tens of thousands of species are going extinct every year — while only a few score animals are added to the extinct list. Not only might this inconvenient truth wash away the credibility of the climate change campaign, but it might damage that of science as well. Also, decades of false warnings about certain disaster appear to have made the public skeptical of, or even uninterested in, the doom du jour, as in the Boy Who Cried Wolf. Remember the ending: eventually the wolf came, but nobody listened. And what’s the result of this risky campaign? Climate change ranks at the bottom of most surveys of what Americans see as our greatest challenges. (CEOs, too.) 
We can only guess if a more truthful campaign would have succeeded. If only activists had tried. It’s a problem seen throughout American politics. A closer grasp of the truth might be a necessary start for any reform program in America.

The Great Pause lengthens again: Global temperature update: The Pause is now 18 years 3 months (219 months)

Special to Climate Depot

[Also see: It’s Official – There are now 66 excuses for Temp ‘pause’ – Updated list of 66 excuses for the 18-26 year ‘pause’ in global warming – Surface Data: 2014 Officially the ‘Warmest Year on Record’ & Climatologist Dr. Roy Spencer: ‘Why 2014 Won’t Be the Warmest Year on Record’ (based on surface data) – ‘We are arguing over the significance of hundredths of a degree’ – Physicist analyzes satellite temperature data: ‘Please laugh out loud when someone will be telling you that it was the warmest year’ – For more explanation of how the ‘pause’ in global warming conflicts with the claims of 2014 being the ‘hottest year ever’ based on surface data, see related links below.]

# The Great Pause lengthens again

Global temperature update: the Pause is now 18 years 3 months

By Christopher Monckton of Brenchley

Since October 1996 there has been no global warming at all (Fig. 1). This month’s RSS [1] temperature plot pushes up the period without any global warming from 18 years 2 months to 18 years 3 months.

Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere temperature anomaly dataset shows no global warming for 18 years 3 months since October 1996.

The hiatus period of 18 years 3 months, or 219 months, is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. As the Pope unwisely prepares to abandon forever the political neutrality that his office enjoins upon him, and to put his signature to a climate-Communist encyclical largely drafted by the radical Prefect of the Pontifical Academy of Sciences, Mgr. Marcelo Sanchez Sorondo, the Almighty continues to display a sense of humor. We are now less than a year away from the Paris world-government conference. Yet the global warming that the IPCC had so confidently but misguidedly predicted 25 years ago has stopped altogether.

Figure 2.
Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), January 1990 to November 2014 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at less than 1.4 K/century equivalent, taken as the mean of the RSS and UAH satellite monthly mean lower-troposphere temperature anomalies.

A quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS [1] and UAH [2] monthly global mean lower-troposphere temperature anomalies – is 0.34 Cº, equivalent to just 1.4 Cº/century, or a little below half of the central estimate in IPCC (1990) and well below even the least estimate (Fig. 2).

The Great Pause is a growing embarrassment to those who had told us with “substantial confidence” that the science was settled and the debate over. Nature had other ideas. Though approaching 70 mutually incompatible and more or less implausible excuses for the Pause are appearing in the peer-reviewed journals and among proselytizing scientists, the possibility that the Pause is occurring because the computer models are simply wrong about the sensitivity of temperature to manmade greenhouse gases can no longer be dismissed, and is demonstrated in a major peer-reviewed paper published this month in the Orient’s leading science journal. Remarkably, even the IPCC’s latest and much reduced near-term global-warming projections are also excessive (Fig. 3).

Figure 3. Predicted temperature change, January 2005 to November 2014, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the observed anomalies (dark blue) and zero real-world trend (bright blue), taken as the average of the RSS and UAH satellite lower-troposphere temperature anomalies.

In 1990, the IPCC’s central estimate of near-term warming was higher by two-thirds than it is today.
Then it was 2.8 Cº/century equivalent. Now it is just 1.7 Cº/century equivalent – and, as Fig. 3 shows, even that is proving to be a substantial exaggeration. On the RSS satellite data, there has been no global warming statistically distinguishable from zero for more than 26 years. None of the models predicted that, in effect, there would be no global warming for a quarter of a century.

Key facts about global temperature

The RSS satellite dataset shows no global warming at all for 219 months from October 1996 to December 2014 – more than half the 432-month satellite record.
The global warming trend since 1900 is equivalent to 0.8 Cº per century. This is well within natural variability and may not have much to do with us.
Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.
The fastest warming rate lasting ten years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.
In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.
The global warming trend since 1990, when the IPCC wrote its first report, is equivalent to below 1.4 Cº per century – half of what the IPCC had then predicted.
Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº warming to 2100.
The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than ten years that has been measured since 1950.
The IPCC’s 4.8 Cº-by-2100 prediction is almost four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.
From September 2001 to November 2014, the warming trend on the mean of the 5 global-temperature datasets is nil. No warming for 13 years 3 months.
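The “per century equivalent” figures quoted in the key facts are simple linear rescalings of an observed change over a shorter period. A minimal sketch in Python, using the 0.34 Cº-over-a-quarter-century figure quoted above:

```python
# Rescale an observed temperature change over a given period into the
# "per century equivalent" rate used throughout the key facts above.

def century_equivalent(delta_c, years):
    """Warming of `delta_c` C over `years` years, expressed as C per century."""
    return delta_c * 100.0 / years

# The 0.34 C warming over the ~25 years since 1990 quoted in the text:
rate = century_equivalent(0.34, 25)
print(round(rate, 2))  # ~1.36, i.e. "just 1.4 C/century" after rounding
```

The same rescaling underlies the other key facts; only the observed change and the length of the period differ.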
Recent extreme weather cannot be blamed on global warming, because there has not been any global warming. It is as simple as that.

Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend.

But is the RSS satellite dataset “cherry-picked”? No. There are good reasons to consider it the best of the five principal global-temperature datasets. The indefatigable “Steven Goddard” demonstrated in the autumn of 2014 that the RSS dataset – at least as far as the Historical Climatology Network is concerned – shows less warm bias than the GISS [3] or UAH [2] records. The UAH record is shortly to be revised to reduce its warm bias and bring it closer to conformity with RSS.

Figure 4. Warm biases in temperature. RSS shows less bias than the UAH or GISS records. UAH, in its forthcoming Version 6.0, will be taking steps to reduce the warm bias in its global-temperature reporting.

Steven Goddard writes: “The graph compares UAH, RSS and GISS US temperatures with the actual measured US HCN stations. UAH and GISS both have a huge warming bias, while RSS is close to the measured daily temperature data. The small difference between RSS and HCN is probably because my HCN calculations are not gridded. My conclusion is that RSS is the only credible data set, and all the others have a spurious warming bias.”

Also, the RSS data show the 1998 Great El Niño more clearly than all other datasets.
The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing an independent verification that RSS is better able to capture such fluctuations without artificially filtering them out than other datasets. Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates appreciably below those that are published. The satellite datasets are based on measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking via spaceward mirrors the known temperature of the cosmic background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that the NASA anisotropy probe determined the age of the Universe: 13.82 billion years. The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them down from the text file, takes their mean and plots them automatically using an advanced routine that automatically adjusts the aspect ratio of the data window at both axes so as to show the data at maximum scale, for clarity. The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line via two well-established and functionally identical equations that are compared with one another to ensure no discrepancy between them. The IPCC and most other agencies use linear regression to determine global temperature trends. 
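The trend procedure the technical note describes – a least-squares linear-regression slope on the monthly anomalies, plus the search for the farthest-back start month that still yields a non-positive trend – can be sketched as follows. This is a minimal illustration on dummy data, not the actual RSS series or the plotting routine described above:

```python
# Least-squares linear trend on monthly anomalies, and the search for
# the farthest-back start month still giving a non-positive trend,
# as described in the technical note. Dummy data only, not RSS.

def trend_per_decade(anomalies):
    """Least-squares slope of a monthly anomaly series, in K per decade."""
    n = len(anomalies)
    x_mean = (n - 1) / 2.0          # mean of the month indices 0..n-1
    y_mean = sum(anomalies) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(anomalies))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return (num / den) * 120.0      # K/month -> K/decade (120 months)

def earliest_zero_trend_start(anomalies):
    """Index of the farthest-back start month whose trend to the end of
    the record is still non-positive, or None if no such month exists."""
    for start in range(len(anomalies) - 1):
        if trend_per_decade(anomalies[start:]) <= 0:
            return start
    return None
```

On the real data one would read the monthly anomalies from the RSS text file and apply the same two functions; the length of the Pause is then the record length minus the returned start index.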
Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little auto-regression. Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.

RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it in a post on the RSS website. Dr Mears’ results are summarized in Fig. 5:

Figure 5. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of El Chichón (1982) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998.

Dr Mears writes: “The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation. This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.”

Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph: “Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades. Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site.
Is this really your data?’  While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate.  … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.” In fact, the spike in temperatures caused by the Great el Niño of 1998 is largely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself. Replacing all the monthly RSS anomalies for 1998 with the mean anomaly value of 0.55 K that obtained during the 2010 el Niño and recalculating the trend from September 1996 [not Dr Mears’ “1997”] to September 2014 showed that the trend values “–0.00 C° (–0.00 C°/century)” in the unaltered data (Fig. 1) became “+0.00 C° (+0.00 C°/century)” in the recalculated graph. No cherry-picking, then. The length of the Great Pause in global warming, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed. IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded: “Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. 
… There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.” That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 is about half what the IPCC had then predicted. Is the ocean warming? One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean and, since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date. The ocean “missing heat” theory is chiefly advocated by a single group in the United States. Meehl, Arblaster, Fasullo, Hu and Trenberth [7] say, “Eight decades with a slightly negative global mean surface-temperature trend show that the ocean above 300 m takes up significantly less heat whereas the ocean below 300 m takes up significantly more, compared with non-hiatus decades. The model provides a plausible depiction of processes in the climate system causing the hiatus periods, and indicates that a hiatus period is a relatively common climate phenomenon and may be linked to La Niña-like conditions,” while Balmaseda, Trenberth and Källen [8] say, “In the last decade, about 30% of the warming has occurred below 700 m, contributing significantly to an acceleration of the warming trend. 
The warming below 700 m remains even when the Argo observing system is withdrawn although the trends are reduced,” and Trenberth & Fasullo [2013], repeated in Trenberth, Fasullo & Balmaseda [9], say, “An inventory of energy storage changes shows that over 90% of the imbalance is manifested as a rise in ocean heat content (OHC). … Global warming has not stopped: it is merely manifested in different ways.” The U.S. group is supported by a group at the Chinese Academy of Sciences [10]: “A vacillating global heat sink at intermediate ocean depths is associated with different climate regimes of surface warming under anthropogenic forcing. The latter part of the 20th century saw rapid global warming as more heat stayed near the surface. In the 21st century, surface warming slowed as more heat moved into deeper oceans. … Cooling periods associated with the latter deeper heat-sequestration mechanism historically lasted 20 to 35 years.” In [10] the academicians speculate that at some future date the hiatus may change its sign, leading to a further episode of perhaps accelerated global warming. Yet to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions that are relevant to land-based life on Earth. Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean. Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes.
It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere. Even if heat is reaching the benthic strata without warming the near-surface strata on the way, the transient near-surface response is rather insensitive to rising atmospheric CO2 concentration. For this reason, resolving ocean thermodynamics and circulation dynamics is not a prerequisite to the empirical study of climate sensitivity by way of our simple model. If the “deep heat” explanation for the hiatus in global warming is correct (and it is merely one among dozens that have been offered), then the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has. Since the complex models have failed in this respect, and since there are insufficient deep-ocean observations to provide reliable quantitative evidence of the putative heat accumulation below 2000 m, still less to determine the mechanism of the imagined heat transfer, still less again to apportion duly the respective contributions of anthropogenic, solar and subsea volcanic influences on the benthic heat accumulation, it is surely unreasonable for our simple model to be expected to do what the complex models have self-evidently failed to do – and what cannot be done by any model, simple or complex, unless and until measurements of far higher resolution than is now to hand become available at all points of the oceanic column. For instance, the 3500 automated Argo bathythermograph buoys have a resolution equivalent to taking a single temperature  and salinity profile in Lake Superior less than once a year: and before Argo came onstream in the middle of the last decade the resolution of oceanic temperature measurements was considerably poorer even than that, especially in the abyssal strata. The mean depth of the global ocean is 3700 m. 
As recently observed in [11], implicitly questioning the U.S. group’s assertions in [7-9], the resolution of samples at various depths and the length of the record are both insufficient either to permit reliable measurement of ocean heat content or to permit monitoring of oceanic radiative fluxes: “Some basic elements of the sampling problem are compiled in Table 2. About 52% of the ocean lies below 2000 m and about 18% below 3600 m. By defining a volume as having been ‘probed’ if at least one CTD station existed within a roughly 60 x 60 km2 box in the interval 1992-2011 … [a]bout 1/3 (11% of total volume) of water below 2000 m was sampled … Of the [region] lying below 3600 m, about 17% was measured. … [M]any papers assume no significant changes take place in the deep ocean over the historical period … The history of exploration suggests, however, that blank places on the map have either been assumed to be without any interesting features and dropped from further discussion, or at the other extreme, filled with ‘dragons’ invoked to explain strange reports [in G. de Jode, 1578, Speculum Orbis Terrarum, Antwerp]. … “[R]ecently, [60] offered estimates of abyssal changes with claimed accuracies of order of 0.01 W/m2 (0.0004°C temperature change equivalent over 20 years) below 700 m. If that accuracy has in fact been obtained, the sparse coverage, perhaps extended to the scope of WOCE hydrographic survey, repeated every few decades, would be sufficient.” Furthermore, almost all current analyses of ocean heat content and budget lack an accurate accounting of spatial, temporal and other systematic errors and uncertainties such as those identified in recent works by a group at the Chinese Academy of Sciences [12]: “In this study, a new source of uncertainties in calculating OHC due to the insufficiency of vertical resolution in historical ocean subsurface temperature profile observations was diagnosed. 
This error was examined by sampling a high-vertical-resolution climatological ocean according to the depth intervals of in situ subsurface observations, and then the error was defined as the difference between the OHC calculated by subsampled profiles and the OHC of the climatological ocean. The obtained resolution-induced error appeared to be cold in the upper 100 m (with a peak of approximately −0.1°C), warm within 100–700 m (with a peak of ~0.1°C near 180 m), and warm when averaged over 0–700-m depths (with a global average of ~0.01°–0.025°C, ~1–2.5 × 1022 J). Geographically, it showed a warm bias within 30°S–30°N and a cold bias at higher latitudes in both hemispheres, the sign of which depended on the concave or convex shape of the vertical temperature profiles. Finally, the authors recommend maintaining an unbiased observation system in the future: a minimal vertical depth bin of 5% of the depth was needed to reduce the vertical-resolution-induced bias to less than 0.005°C on global average (equal to Argo accuracy).” Again [13]: “… a new correction scheme for historical XBT data is proposed for nine independent probe-type groups. The scheme includes corrections for both temperature and depth records, which are all variable with calendar year, water temperature, and probe type. The results confirm those found in previous studies: a slowing in fall rate during the 1970s and 2000s and the large pure thermal biases during 1970–85. The performance of nine different correction schemes is compared. 
After the proposed corrections are applied to the XBT data in the WOD09 dataset, global ocean heat content from 1967 to 2010 is reestimated.” A forthcoming paper [14], after properly accounting for some of the sampling biases and instrumental errors and uncertainties in the ocean heat content data (i.e., applying the new global ocean temperature dataset from the Institute of Atmospheric Physics), describes a vertical profile of ocean temperature change from 2004-2013, reporting a warming hiatus above 100 m depth and from 300-700 m. The two layers that show warming are 100-300 m and 700-1500 m. These warming strata show their own distinctive horizontal spatial patterns when compared to the non-warming stratum at 300-700 meters. This observational fact leads to the following conclusion: “It is still unclear how the heat is transferring to the deeper ocean.” Furthermore, the suggestion that heat accumulation in the deep ocean explains why there has been no global warming at all for up to 18 years is far from generally accepted in the scientific literature. A remarkable variety of competing and often mutually exclusive explanations for the hiatus in global warming, chiefly involving near-surface phenomena, are offered in recent papers in the reviewed journals of climate science. 
In the literature, the cause of the hiatus in global warming is variously attributed to (1) coverage-induced cool bias in recent years [15], rebutted by [16] and, with respect to Arctic coverage, by [17]; (2) anthropogenic aerosols from coal-burning [18], rebutted by [19-20]; (3) decline in the warming caused by black-carbon absorption [20]; (4) emission of aerosol particulates by volcanic eruptions [21], rebutted by [22]; (5) reduced solar activity [23]; (6) effectiveness of the Montreal Protocol in controlling emissions of chlorofluorocarbons [24]; (7) a lower-than-predicted increase in methane concentration [24]; (8) a decrease in stratospheric water vapor concentration [25]; (9) strengthened Pacific trade winds [26] (previously, [27] had attributed weaker Pacific trade winds to anthropogenic global warming); (10) stadium waves in tropical Pacific circulation [28]; (11) coincidence [29]; (12) aerosol particulates from pine-trees [30]; (13) natural variability [31-32]; (14) cooler night-time temperatures in the Northern Hemisphere [33]; (15) predictions by those models that allowed for the possibility of a pause in global warming [34-35]; (16) the negative phase of the Pacific Decadal Oscillation [36-38]; (17) the Atlantic meridional overturning circulation [39]; (18) global dimming following the global brightening of 1983-2001 [40]; (19) relative frequencies of distinct el Niño types [41]; (20) surface cooling in the equatorial Pacific [42]; (21) Pacific cooling amplified by Atlantic warming [43]; (22) a combination of factors, including ENSO variability, solar decline and stratospheric aerosols [44]; (23) underestimated anthropogenic aerosol forcing [45]; (24) a new form of multidecadal variability distinct from but related to the ocean oscillations [46]; and (25) failure to initialize most models in order to conform with observation, particularly of oceanic conditions [47]. 
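The zettajoule-to-kelvin conversion applied to the ARGO ocean-heat-content figures in the next paragraph can be checked numerically. A minimal sketch, using the figures quoted in the text – 650 million km³ of ocean, 1.033 tonnes per cubic metre, 4 MJ per tonne per kelvin, and a 100 ZJ rise over 94 months:

```python
# Convert an ocean-heat-content change in zettajoules back to a mean
# ocean temperature change, using the figures quoted in the text.

OCEAN_VOLUME_M3 = 650e6 * 1e9   # 650 million km^3, expressed in m^3
DENSITY_T_PER_M3 = 1.033        # tonnes per cubic metre of seawater
SPECIFIC_HEAT_J = 4e6           # joules per tonne per kelvin (4 MJ)

def ohc_to_kelvin(delta_ohc_zj):
    """Mean ocean temperature change (K) implied by an OHC change in ZJ."""
    joules = delta_ohc_zj * 1e21
    heat_capacity = OCEAN_VOLUME_M3 * DENSITY_T_PER_M3 * SPECIFIC_HEAT_J
    return joules / heat_capacity

delta_k = ohc_to_kelvin(100)        # the 100 ZJ rise over 94 ARGO months
per_decade = delta_k * 120.0 / 94   # rescale 94 months to a decade
# delta_k is ~0.037 K and per_decade ~0.048 K/decade, matching the text
```

The same arithmetic gives the "less than 0.5 K per century equivalent" conclusion drawn below, since 0.048 K per decade is 0.48 K per century.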
Finally, though the ARGO buoys measure ocean temperature change directly, before publication the temperature change is converted into zettajoules of ocean heat content change, which makes the change seem larger. Converting the ocean heat content change back to temperature change is highly revealing. It shows how little change has really been measured. The increase in ocean heat content over the 94 ARGO months September 2005 to June 2013 was 10 × 10²² J = 100 ZJ (Fig. 6).

Figure 6. Ocean heat content change, 1957-2013, from NODC Ocean Climate Laboratory.

Conversion: 650 million km³ × 4 MJ per tonne per kelvin; each cubic metre is 1.033 tonnes. Then:

100 ZJ increase in OHC = 100,000,000,000,000,000,000,000 J
To raise 650,000,000,000,000,000 m³
× 1.033 te m⁻³ = 671,450,000,000,000,000 te
× 4,000,000 J te⁻¹ K⁻¹ = 2,685,800,000,000,000,000,000,000 J per kelvin

Then 100,000 / 2,685,800 = 0.037233 K in 94 months, equivalent to 0.0475 K per decade. Accordingly, even on the quite extreme NODC ocean heat content record, the change in mean ocean temperature in the upper 2000 m in recent decades has been less than 0.5 K per century equivalent.

References

RSS (2014) Satellite-derived monthly global mean lower-troposphere temperature anomaly dataset. Accessed 1 July 2014
UAH (University of Alabama at Huntsville) (2014) Satellite MSU monthly global mean lower-troposphere temperature anomalies. Accessed 1 July 2014
NCDC (2014) National Climatic Data Center monthly global mean land and ocean surface temperature anomalies, 1880-2013. Accessed 1 July 2014
Morice CP, Kennedy JJ, Rayner N, Jones PD (2012) Quantifying uncertainties in global and regional temperature change using an ensemble of observational estimates: The HadCRUT4 data set. J Geophys Res 117:D08101.
doi:10.1029/2011JD017187
GISS (2014) Goddard Institute for Space Studies monthly global mean land and sea surface temperature anomalies, 1880-2014. Accessed 1 July 2014
McKitrick RR (2014) HAC-robust measurement of the duration of a trendless subsample in a global climate time series. Open J Stat 4:527-535
Meehl GA, Arblaster JM, Fasullo JT et al (2011) Model-based evidence of deep-ocean heat uptake during surface-temperature hiatus periods. Nat Clim Change 1:360–364
Balmaseda MA, Trenberth KE, Källen E (2013) Distinctive climate signals in reanalysis of global ocean heat content. Geophys Res Lett 40:1754–1759
Trenberth KE, Fasullo JT, Balmaseda MA (2014) Earth’s energy imbalance. J Clim 27:3129-3144
Chen X, Tung KK (2014) Varying planetary heat sink led to global-warming slowdown and acceleration. Science 345:897–903
Wunsch C, Heimbach P (2014) Bidecadal thermal changes in the abyssal ocean. J Phys Oceanogr 44:2013–2030
Cheng L, Zhu J (2014) Uncertainties of the ocean heat content estimation induced by insufficient vertical resolution of historical ocean subsurface observations. J Atmos Oceanic Tech 31:1383–1396
Cheng L, Zhu J, Cowley R et al (2014a) Time, probe type, and temperature variable bias corrections to historical expendable bathythermograph observations. J Atmos Oceanic Tech 31:1793–1825
Cheng L, Zheng F, Zhu J (2014b) Distinctive ocean interior changes during the recent climate hiatus. Geophys Res Lett, submitted
Cowtan K, Way RG (2014) Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends. Quart J R Meteorol Soc 140:1934-1944
Fyfe JC, Gillet NP, Zwiers FW (2013) Overestimated global warming over the past 20 years. Nat Clim Change 3:767-769
Chung CE, Cha H, Vilma T et al (2013) On the possibilities to use atmospheric reanalyses to evaluate the warming structure of the Arctic.
Atmos Chem Phys 13:11209-11219
Kaufmann RK, Kauppi H, Stock JH (2011) Reconciling anthropogenic climate change with observed temperature 1998-2008. Proc Natl Acad Sci USA 108:11790-11793
Kühn T, Partanen A-I, Laakso A et al (2014) Climate impacts of changing aerosol emissions since 1996. Geophys Res Lett 41:4711-4718
Neely RR, Toon OB, Solomon S et al (2013) Recent anthropogenic increases in SO2 from Asia have minimal impact on stratospheric aerosol. Geophys Res Lett 40. doi:10.1002/grl.50263
Santer BD, Bonfils C, Painter JF et al (2014) Volcanic contribution to decadal changes in tropospheric temperature. Nat Geosci 7:185-189
Haywood J, Jones A, Jones GS (2014) The impact of volcanic eruptions in the period 2000-2013 on global mean temperature trends evaluated in the HadGEM2-ES climate model. Atmos Sci Lett 15:92-96
Stauning P (2014) Reduced solar activity disguises global temperature rise. Atmos Clim Sci 4:60-63
Estrada F, Perron P, Martinez-Lopez B (2013) Statistically derived contributions of diverse human influences to twentieth-century temperature changes. Nat Geosci 6:1050–1055
Solomon S, Rosenlof KH, Portmann RW et al (2010) Contributions of stratospheric water vapor to decadal changes of global warming. Science 327:1219-1223
England MH, McGregor S, Spence P et al (2014) Recent intensification of wind-driven circulation in the Pacific and the ongoing warming hiatus. Nat Clim Change 4:222-227
Vecchi GA, Soden BJ, Wittenberg AT et al (2006) Weakening of tropical Pacific atmospheric circulation due to anthropogenic forcing. Nature 441:73-76
Glaze Wyatt M, Curry JA (2013) Role for Eurasian Arctic shelf sea ice in a secularly varying hemispheric climate signal during the 20th century. Clim Dyn 42:2763-2782
Schmidt GA, Shindell DT, Tsigaridis K (2014) Reconciling warming trends. Nat Geosci 7:158-160. doi:10.1038/ngeo2105
Ehn M, Thornton JA, Kleist E et al (2014) A large source of low-volatility secondary organic aerosol.
Nature 506:476-479
Watanabe M, Shiogama H, Tatebe H et al (2014) Contribution of natural decadal variability to global warming acceleration and hiatus. Nat Clim Change 4:893–897
Lovejoy S (2014) Return periods of global climate fluctuations and the pause. Geophys Res Lett 41:4704-47
Sillmann J, Donat MG, Fyfe JC et al (2014) Observed and simulated temperature extremes during the recent warming. Environ Res Lett 9. doi:10.1088/1748-9326/9/6/064023
Risbey J, Lewandowsky S, Langlais C et al (2014) Nat Clim Change 4:835-840
Guemas V, Doblas-Reyes FJ, Andreu-Burillo I et al (2013) Retrospective prediction of the global warming slowdown in the past decade. Nat Clim Change 3:649-653
Maher N, Sen Gupta A, England MH (2014) Drivers of decadal hiatus periods in the 20th and 21st centuries. Geophys Res Lett 41:5978-5986
Trenberth KE, Fasullo JT, Branstator G et al (2014) Seasonal aspects of the recent pause in surface warming. Nat Clim Change 4:911–916
Dong L, Zhou T (2014) The formation of the recent cooling in the eastern tropical Pacific Ocean and the associated climate impacts: a competition of global warming, IPO and AMO. J Geophys Res. doi:10.1002/2013JD021395
Schleussner CF, Runge J, Lehmann J et al (2014) The role of the North Atlantic overturning and deep ocean for multi-decadal global-mean-temperature variability. Earth Sys Dyn 5:103-115
Rahimzadeh F, Sanchez-Lorenzo A, Hamedi M et al (2014) New evidence on the dimming/brightening phenomenon and decreasing diurnal temperature range in Iran (1961-2009). Int J Climatol. doi:10.1002/joc.4107
Banholzer S, Donner S (2014) The influence of different El Nino types on global average temperature. Geophys Res Lett 41:2093–2099
Kosaka Y, Xie SP (2013) Recent global-warming hiatus tied to equatorial Pacific surface cooling.
Nature 501: 403–40 McGregor S, Timmermann A, Stuecker MF, England MH, Merrifield M, Jin FF, Chikamoto Y (2014) Recent Walker circulation strengthening and Pacific cooling amplified by Atlantic warming. Nature Clim. Change 4:888-892. doi: 10.1039/nclimate2330 Huber M, Knutti R (2014) Natural variability, radiative forcing and climate response in the recent hiatus reconciled. Nat Geosci 7: 651–656 Hansen J, Sato M, Kharecha PK, et al(2011) Earth’s energy imbalance and implications. Atmos. Chem Phys 11:13421-13449. Maclas D, Stips A, Garcia-Gorriz E (2014) Application of the Singular Spectrum Analysis Technique to Study the Hiatus on the Global Surface Temperature Record. Plos One. doi: 10.1371/journal.pone.0107222 Meehl, GA, Teng H (2014) CMIP5 multi-model hindcasts for the mid-1970s shift and early 200s hiatus and predictions for 2016-2035. Geophys. Res. Lett. 41(5):17y11-1716 # Related Links:  Climatologist Dr. Roy Spencer: ‘Why 2014 Won’t Be the Warmest Year on Record’ – ‘We are arguing over the significance of hundredths of a degree’ Climatologist Dr. Pat Michaels debunks 2014 ‘hottest year’ claim: ‘Is 58.46° then distinguishable from 58.45°? In a word, ‘NO.’ Eco-Activists Warn 2014 Could Be Hottest Year On Record – Satellites Disagree ‘Hottest Year’ Update: NASA & NOAA ignore satellite data which reveal 2014 ‘well below’ hottest claims Even ignoring satellite data Year-to-date ‘record’ temps are 0.21C *below* climate model projections New paper finds excuse #66 for the ‘pause’: There’s no pause if you look at only at the warmest & coldest day of the year – Published in Environmental Research Letters 2014 might be 0.01C warmer than 2010! No Record Temperatures According To Satellites – BBC put up a deliberately apocalyptic picture while telling us the world is on course for the warmest year on record. What they failed to tell us was that the more accurate satellites, which monitor atmospheric temperatures over nearly all of the globe, say no such thing.  
Figures from UAH are out for November, and these show a drop from the October anomaly of 0.37C to 0.33C. This means that at the end of November, this year is only in a tie for 3rd with 2005, and well below the record years of 1998 and 2010.
Flashback: 1990 NASA Report: 'Satellite analysis of upper atmosphere is more accurate, & should be adopted as the standard way to monitor temp change.'
Study using dozens of models claims: 'Warming Climate Can Be Slowed in a Decade' by cutting CO2 – Climate Depot Note: If future temps continue to flatline or even cool, warmists can claim climate policy is responsible. They are already doing it! See: It's Official – Temperature 'Pause' Caused By Climate Policies?! Medieval witchcraft lives! UK Energy Minister: Government policies 'may have slowed down global warming'
AP's Seth Borenstein publishes pure propaganda: Climate change has made Earth 'hotter, weirder…downright wilder' – Climate Depot's Morano comment: 'AP's Borenstein can be trusted to shill for the UN's climate summit in Lima, Peru, which I will be attending and speaking at. Borenstein relies on Michael Oppenheimer (a UN scientist on the payroll of Hollywood stars) and Climategate's Michael Mann. Borenstein ignores tide gauges showing a deceleration of sea-level rise, and ignores satellite temperatures which show the Earth in an 18-year 'pause' or 'standstill' of global warming. Borenstein tortures data in order to claim more weather extremes. We are currently at or near historic lows in tornadoes and hurricanes. Even droughts are in long-term decline, and floods show no trend. We know not to expect more from Borenstein.' See: 'Long sad history of AP reporter Seth Borenstein's woeful global warming reporting'
Sea level claims debunked here:
Extreme weather claims debunked here:
Greenland ice claims debunked here:
Antarctica ice claims debunked here:
Overpopulation claims debunked here:
Analysis: Why '90% of the missing heat' cannot be hiding in the oceans
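The "hundredths of a degree" argument in the links above reduces to a simple check: is the claimed record margin larger than the measurement uncertainty of the annual values being compared? A minimal sketch of that check, with illustrative uncertainty values (the 0.05 C figure is an assumption for demonstration, not an official NOAA/NASA number):

```python
import math

def distinguishable(t1, t2, sigma1, sigma2, k=2.0):
    """True if |t1 - t2| exceeds k times the combined standard uncertainty
    of the two measurements (k=2 roughly corresponds to 95% confidence)."""
    combined = math.sqrt(sigma1 ** 2 + sigma2 ** 2)
    return abs(t1 - t2) > k * combined

# A 0.01 C margin with ~0.05 C per-year uncertainty is not distinguishable.
print(distinguishable(58.46, 58.45, 0.05, 0.05))  # False
# A much larger gap would be.
print(distinguishable(58.46, 58.00, 0.05, 0.05))  # True
```

This is the sense in which a year "0.01C warmer" than another is a statistical tie rather than a record.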

New paper claims the ‘pause’ is ‘not so unusual’ & ‘no more than natural variability’

He’s baack! Shaun Lovejoy has published a new paper which cites his prior claim of 99.9% confidence that one of the two temperature graphs below is your fault, and the other due to natural variability. Both graphs are half-century plots of HADCRUT4 global temperatures, using exactly the same time and temperature scales. Can you tell with 99.9% confidence which one is 1895-1945 (Nature’s fault), and which is 1963-2013 (your fault)?

[Graphs from Not A Lot Of People Know That, not from Lovejoy’s paper.]

FYI, according to Lovejoy’s dodgy statistics, the top graph is man-made and the bottom graph is due to natural variability. In Lovejoy’s new paper, he acknowledges a ‘pause’ in global warming since 1998, says it’s “not so unusual,” and concludes “the pause is no more than natural variability.” Indeed, the pause is due to natural variability that has not been accounted for by climate models, which invalidates attribution claims that the past 50 years of temperature variations are necessarily due to man-made CO2. Furthermore, prior work by NOAA and others has found that ‘pauses’ of 15 or more years are indeed unusual and would suggest the climate models are overly sensitive to CO2. According to RSS satellite data, the ‘pause’ has lasted almost 18 years.

Return periods of global climate fluctuations and the pause
S. Lovejoy

An approach complementary to General Circulation Models (GCMs), using the anthropogenic CO2 radiative forcing as a linear surrogate for all anthropogenic forcings [Lovejoy, 2014], was recently developed for quantifying human impacts. Using pre-industrial multiproxy series and scaling arguments, the probabilities of natural fluctuations at time lags up to 125 years were determined. The hypothesis that the industrial epoch warming was a giant natural fluctuation was rejected with 99.9% confidence.
In this paper, this method is extended to the determination of event return times. Over the period 1880-2013, the largest 32 year event is expected to be 0.47 K, effectively explaining the postwar cooling (amplitude 0.42 – 0.47 K). Similarly, the “pause” since 1998 (0.28 – 0.37 K) has a return period of 20-50 years (not so unusual). It is nearly cancelled by the pre-pause warming event (1992-1998, return period 30-40 years); the pause is no more than natural variability.
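The abstract's central move is converting the probability of a natural fluctuation of a given size into a return period: if such a fluctuation has probability p of occurring in any one window of a given length, its expected return time is the window length divided by p. A minimal sketch of that conversion, using a Gaussian tail as an illustrative stand-in for the scaling statistics of Lovejoy's actual analysis (the sigma value below is an assumption, not a number from the paper):

```python
import math

def exceedance_prob(amplitude, sigma):
    """P(fluctuation >= amplitude) for a zero-mean Gaussian with std sigma.
    Illustrative only: Lovejoy uses scaling (fat-tailed) statistics instead."""
    return 0.5 * math.erfc(amplitude / (sigma * math.sqrt(2)))

def return_period(amplitude, sigma, window_years):
    """Expected years between fluctuations of at least `amplitude`."""
    p = exceedance_prob(amplitude, sigma)
    return window_years / p

# Hypothetical: a 0.3 K fluctuation over 16-year windows, assuming
# sigma = 0.2 K for natural 16-year fluctuations.
print(round(return_period(0.3, 0.2, 16), 1))
```

Larger amplitudes give rarer events and longer return periods, which is how a "pause" of a given depth gets a return time of decades rather than centuries.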

Prof in NYTimes: Failure of climate models vs. reality not important since we use them to predict the future

Thomas Lovejoy writes in the New York Times: Does the leveling-off of temperatures mean that the climate models used to track them are seriously flawed? Not really. It is important to remember that models are used so that we can understand where the Earth system is headed. Read more…
