
Search Results for: 18 years

How Socialism Wiped Out Venezuela’s Spectacular Oil Wealth – ‘Gradual deterioration was 18 years in the making’

https://reason.com/video/2021/02/01/how-socialism-wiped-out-venezuelas-spectacular-oil-wealth/

By ANDRÉS FIGUEREDO THOMSON

Venezuela has the world’s largest proven oil reserves, and yet the country has run out of gasoline. The socialist government has lost the capacity to extract oil from the ground or refine it into a usable form. The industry’s gradual deterioration was 18 years in the making, tracing back to then-President Hugo Chávez’s 2003 decision to fire the oil industry’s most experienced engineers in an act of petty political retribution. The near-total collapse in the nation’s oil output in the ensuing years is a stark reminder that the most valuable commodity isn’t a natural resource, but the human expertise to put it to productive use.

“At this moment Venezuela is living through its worst nightmare,” said Luis Pedro España, a professor of sociology at Andrés Bello Catholic University in Caracas, who has studied the nation’s economic collapse. “We are witnessing the end of Venezuela as a petro-state.”

Gasoline shortages have crippled the economy, made travel within the country prohibitively expensive, and increased prices at grocery stores. Shortages and price restrictions have given rise to a vibrant black market.

“Drivers who operate gas-powered buses prefer to keep them parked so that they can suck out the gas and later resell it,” says Andrés, a public bus operator in Caracas, who asked that we use only his first name. “[My] bus runs on diesel. It uses 16 [or] 17 gallons daily. Nowadays, we have to wait in a long line to fill up,” he said. “The gas stations even have national guards who ask for bribes before they’ll fill up the tank, because the 40 liters that the government gives us isn’t enough.”

Andrés is allowed special access to fill up his tank because he provides an essential city service. But earning the equivalent of just $200 a month, he struggles to make ends meet. So he keeps his bus parked and extracts gas from the tank to resell on the black market, earning about $8 per gallon. To put that into perspective, the average Venezuelan subsists on less than $10 per day.

The little gas that is still available comes via periodic shipments from Iran. But the Venezuelan government doesn’t officially charge for gasoline at most gas stations; it uses a quota system instead, so filling a tank can mean waiting in line for days.

David is a mechanic living in Caracas. These days he’s making a living by waiting in line to fill up his tank and then extracting the gas to resell on the black market. “My business isn’t selling gas,” David says. “It is meeting the needs of my customers. A lot of the clients from my repair shop are elderly people—people who can’t be standing in line for eight hours, or two days, or three days, or a week. I am the person who is sacrificing my time. Clearly, I have to charge for my time. We all have to make a living.”

The Venezuelan oil industry turned a once-poor agricultural nation into an important geopolitical player and one of the region’s richest countries. In the mid-1970s, Venezuela nationalized its oil industry, and Petróleos de Venezuela, S.A. (PDVSA) was created to manage operations. Though state-owned, the company was given a high level of autonomy and was known as a “state within a state.” The nation’s oil wealth allowed for massive investment in infrastructure, and the industry attracted talent from all over the world.

“The current administration, in the last 20 to 25 years, destroyed what we built,” says Pedro España. What they had built was “economic independence.”

“PDVSA once had over 20 refineries around the world,” says José Toro Hardy, an economist and former director at PDVSA from 1996 to 1999. “We were able to move our oil from the nation’s subsoil into the tanks of American drivers. And the entire process was managed by Venezuelan entities with Venezuelan oil wells, pipelines, Venezuelan tankers… We had built something gigantic, but suddenly, we were faced with a costly historic accident: Hugo Chávez won the election.”

Chávez held power from 1999 to 2013, when he died of cancer. He dubbed his policy agenda “socialism of the 21st century.” It turned one of the most prosperous countries in Latin America into the site of a humanitarian crisis. Chávez rewrote the constitution, clamped down on the freedom of the press, nationalized over a thousand private companies, and destroyed the national currency through hyperinflation.

Chávez sought control of the nation’s oil wealth to fund his political ambitions, but first he had to dismantle the mechanisms that were put in place to protect PDVSA’s autonomy. In a move intended to begin that erosion, Chávez began appointing military leaders to PDVSA’s board. The conflict between PDVSA’s top management and Chávez culminated in a national strike, which took place from December 2002 to February 2003. Chávez proceeded to fire 18,000 state oil workers, including 80 percent of its top engineers, handing control of the industry to the military. The workers who were fired had “an average of 15 years of experience,” Toro says. “In a sense, he threw away 300 thousand years of experience.”

“Now, instead of producing five to six million barrels of oil [a day], which is the amount we should be producing, last month’s report from OPEC showed that our production, based on external sources, was 339 thousand barrels per day. After once having been a major player in the oil industry, we’ve become nothing. An insignificant exporter of oil,” he says.

“[T]he erosion of checks and balances and the restructuring of PDVSA allowed Chávez to convert the oil sector into, in essence, the regime’s checking account,” wrote the political scientists Javier Corrales and Michael Penfold in their 2011 book, Dragon in the Tropics: Venezuela and the Legacy of Hugo Chávez.

Andrés, the bus driver, believes that if the gas crisis doesn’t abate, Venezuela will experience rioting and public unrest. “If there’s no more diesel we can’t transport food,” he said. “Diesel is necessary for heavy shipping, including basic necessities. So people will be out in the streets protesting.”

The economic crisis has caused much of the nation’s educated middle class to flee the country, which will make rebuilding Venezuela’s human capital an even greater challenge. In an ironic twist, Chávez’s hand-picked successor, Nicolás Maduro, is now working to bring privately run foreign oil companies back in.

Produced by Andrés Figueredo Thomson; translation by María José Inojosa Salina.

Music Credits: Homeroad, Nothing, and Run by Kai Engel; Suspect Located by Scott Holmes

Photo Credits: JORGE SILVA/REUTERS/Newscom; Juan Carlos Hernandez/ZUMA Press/Newscom; KIMBERLY WHITE/REUTERS/Newscom

Meteorologist Joe Bastardi rips heatwave hysteria: ‘Our scorching global July is about .2C above 1981-2010 average and is only the 11th warmest in last 18 years’

https://twitter.com/BigJoeBastardi/status/1021768998898360320 Joe Bastardi: Our scorching global July is about .2C above 1981-2010 average and is only the 11th warmest in last 18 years There is almost as much cool if you actually look, I notice no running to the arctic where 6th straight summer below normal, or Greenland where nr record snow piling up

Two satellite datasets agree: The Pause lives on: ‘No warming for the last 18 years’

By Paul Homewood

http://data.remss.com/msu/monthly_time_series/RSS_Monthly_MSU_AMSU_Channel_TLT_Anomalies_Land_and_Ocean_v03_3.txt

RSS have also now released their temperature data for December, which, as with UAH, shows a big drop from the month before. Annually, RSS come to the same conclusion as UAH: that 2016 was 0.02C warmer than 1998. As Roy Spencer has pointed out, the margin of error is 0.1C, so statistically 2016 is tied with 1998 as the warmest year in the satellite record. The fact that there has been no warming for the last 18 years is a massive blow to the credibility of climate science.
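The "trend" figures quoted throughout these excerpts are ordinary least-squares linear fits to a monthly anomaly series. As a rough illustration of the arithmetic involved (the numbers below are synthetic placeholders, not RSS or UAH data, and the function name is mine):

```python
# Minimal sketch of a least-squares linear trend fit, the method these
# excerpts describe for RSS/UAH anomaly series. The data here are
# synthetic placeholders, not real satellite anomalies.

def linear_trend(anomalies):
    """Least-squares slope of an evenly spaced series, per month."""
    n = len(anomalies)
    mean_x = (n - 1) / 2.0                      # mean of 0, 1, ..., n-1
    mean_y = sum(anomalies) / n
    cov = sum((i - mean_x) * (y - mean_y) for i, y in enumerate(anomalies))
    var = sum((i - mean_x) ** 2 for i in range(n))
    return cov / var

# An essentially flat synthetic series: the fitted slope is near zero,
# which is what a "Pause" means in trend terms.
series = [0.10, -0.05, 0.02, -0.08, 0.04, 0.07, -0.03, 0.01, -0.06, 0.05]
slope_per_month = linear_trend(series)
slope_per_century = slope_per_month * 12 * 100  # convert to C per century
```

This also shows why a reported difference such as "2016 was 0.02C warmer than 1998" with a 0.1C margin of error counts as a statistical tie: the gap between the two annual means is smaller than the stated measurement uncertainty.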

June 2016 Update: ‘The Pause still refuses to go away’ – 18 years 8 months – No warming since Nov. 1997

Via: https://kenskingdom.wordpress.com/2016/07/08/the-pause-update-june-2016/

The Pause Update: June 2016

The complete UAH v6.0 data for June were released yesterday. I present all the graphs for various regions, as well as summaries for easier comparison. The Pause still refuses to go away, despite all expectations.

These graphs show the furthest back one can go to show a zero or negative trend (less than 0.1 +/- 0.1C per 100 years) in lower tropospheric temperatures. I calculate 12 month running means to remove the small possibility of seasonal autocorrelation in the monthly anomalies.

Note: The satellite record commences in December 1978 and is now 37 years and 7 months (451 months) long. 12 month running means commence in November 1979. The y-axes in the graphs below are at December 1978, so the vertical gridlines denote Decembers. The final plotted points are June 2016.

Globe: The 12 month mean to June 2016 remains at +0.46C and should stay at about this value for the next two months. If so, The Pause (now 1 month shorter) will continue to be an embarrassing reality! However, it may end soon after with a small positive trend. And, for the special benefit of those who think that I am deliberately fudging data by using 12 month running means, here is the plot of monthly anomalies, which shows that The Pause is over by my rather strict criterion: +0.3C/100 years since December 1997, not exactly alarming. The Pause will return sooner with monthly anomalies than 12 month means, of course.

Northern Hemisphere: The Northern Hemisphere Pause has ended as expected. Note the not very alarming warming of 0.21 +/- 0.1C per 100 years for half the record, compared with 1.37C for the whole period.

Southern Hemisphere: The Pause has shortened by 2 months. For well over half the record the Southern Hemisphere has zero trend.

Tropics: The Pause has shortened by another 3 months with the El Niño influence, but is still over half the record.

Tropical Oceans: The Pause has shortened by another 2 years, the El Niño now having a strong effect on the 12 month means.

Northern Extra Tropics: The Pause by this criterion has ended in this region; however, note that the slope since 1998 is +0.29 +/- 0.1C per 100 years, compared with +1.59C for the whole period. That’s still embarrassingly slow warming.

Southern Extra Tropics: The Pause has lengthened by another month.

Northern Polar: The Pause has decreased by 1 month.

Southern Polar: The South Polar region has been cooling for the entire record.

USA 49 States: No change.

Australia: The Australian Pause has not changed.

The next graphs summarise the above plots. First, a graph of the relative length of The Pause in the various regions. Note that the Pause has ended by my criteria in the Northern Extra Tropics and the Northern Hemisphere, but apart from the North Polar region, all other regions have a Pause of 18 years 8 months or longer (well over half the record), including the South Polar region, which has been cooling for the whole record.

Next, the variation in the linear trend for the whole record, 1978 to the present: note the decrease in trends from North Polar to South Polar.

And the variation in the linear trend since June 1998, which is about halfway between the global low point of December 1997 and the peak in December 1998: the only region to show strong warming for this period (18 years 1 month) is the North Polar region. The Northern Extra Tropics, Tropics, and the Northern Hemisphere have very mild warming, but all other regions (including the Globe as a whole and all of the Southern Hemisphere) are Paused or cooling. The imbalance between the two hemispheres is obvious. The lower troposphere over Australia has been strongly cooling for more than 18 years.

12 month means will continue to grow in some regions for the next few months, so the Pause as here defined may end in some regions shortly (probably North Polar, Tropics, and Tropical Oceans), and may not reappear until early 2018. The impact of the coming La Niña will be worth watching. Unless temperatures reset at a new, higher level and continue rising, very low trends will remain.

18 years – 10 months! The Global Warming ‘Pause’ refuses to go away, despite greatly exaggerated rumors of its death

Special to Climate Depot

Via: https://kenskingdom.wordpress.com/2016/04/02/the-pause-update-march-2016-preliminary/

The Pause Update: March 2016 (Preliminary)

Well, my last post certainly stirred up some Global Warming Enthusiasts who found it difficult to get their heads around the continued existence of The Pause. What will they make of this month’s update? The Pause refuses to go away, despite greatly exaggerated rumours of its death.

Dr Roy Spencer has just released UAH v6.0 data for March. This is a preliminary post with graphs only for the Globe, the Northern Hemisphere, the Southern Hemisphere, and the Tropics. Other regions will be updated in a few days’ time when the full data for March are released. (These preliminary figures may change slightly as well.)

These graphs show the furthest back one can go to show a zero or negative trend (less than +0.1C/100 years) in lower tropospheric temperatures. I calculate 12 month running means to remove the small possibility of seasonal autocorrelation in the monthly anomalies.

Note: The satellite record commences in December 1978 and is now 37 years and 4 months (448 months) long. 12 month running means commence in November 1979. The graphs below start in December 1978, so the vertical gridlines denote Decembers. The final plotted points are March 2016.

Except for the Tropics, where The Pause has reduced by three months, in all other regions it has remained at the same length.

Globe: Sorry, GWEs, The Pause is still an embarrassing reality! For how much longer, we don’t know. And, for the special benefit of those who think that I am deliberately fudging data by using 12 month running means, here is the plot of monthly anomalies, which shows that The Pause is over in monthly anomalies by my rather strict criterion. I will continue posting these figures showing these scary trends from monthly anomalies. The Pause will return sooner with monthly anomalies than 12 month means, of course. Meanwhile, shudder at the thought of 18 years and 4 months with a frightening trend of +0.15C +/- 0.1C per 100 years.

Northern Hemisphere: The Northern Hemisphere Pause refuses to go quietly and remains at nearly half the record. It may well disappear in the next month or two.

Southern Hemisphere: For well over half the record the Southern Hemisphere has zero trend.

Tropics: The Pause has shortened by three months, but is still well over half the record long.

Satellites: No global warming at all for 18 years 8 months

No global warming at all for 18 years 8 months

By Christopher Monckton of Brenchley

The Paris agreement is more dangerous than it appears. Though the secession clause that this column has argued for was inserted into the second draft and remained in the final text, the zombies who have replaced the diplomatic negotiators of almost 200 nations did not, as they should have done in a rational world, insert a sunset clause that would bring the entire costly and pointless process to an end once the observed rate of warming fell far enough below the IPCC’s original predictions in 1990. It is those first predictions that matter, for they formed the official basis for the climate scam – the biggest transfer of wealth in human history from the poor to the rich, from the little guy to the big guy, from the governed to those who profit by governing them.

Let us hope that the next President of the United States insists on a sunset clause. I propose that if 20 years pass without global warming, the IPCC, the UNFCCC and all their works should be swept into the dustbin of history, and the prosecutors should be brought in. We are already at 18 years 8 months, and counting. The el Niño has shortened the Pause, and will continue to do so for the next few months, but the discrepancy between prediction and reality remains very wide.

Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean surface temperature anomaly dataset shows no global warming for 18 years 8 months since May 1997, though one-third of all anthropogenic forcings have occurred during the period of the Pause.

It is worth understanding just how surprised the modelers ought to be by the persistence of the Pause. NOAA, in a very rare fit of honesty, admitted in its 2008 State of the Climate report that 15 years or more without global warming would demonstrate a discrepancy between prediction and observation.
The reason for NOAA’s statement is that there is supposed to be a sharp and significant instantaneous response to a radiative forcing such as adding CO2 to the air. The steepness of this predicted response can be seen in Fig. 1a, which is based on a paper on temperature feedbacks by Professor Richard Lindzen’s former student Professor Gerard Roe in 2009. The graph of Roe’s model output shows that the initial expected response to a forcing is supposed to be an immediate and rapid warming. But, despite the very substantial forcings in the 18 years 8 months since May 1997, not a flicker of warming has resulted.

Figure 1a: Models predict rapid initial warming in response to a forcing. Instead, no warming at all is occurring. Based on Roe (2009).

The current el Niño, as Bob Tisdale’s distinguished series of reports here demonstrates, is at least as big as the Great el Niño of 1998. The RSS temperature record is now beginning to reflect its magnitude. If past events of this kind are a guide, there will be several months’ further warming before the downturn in the spike begins. However, if there is a following la Niña, as there often is, the Pause may return at some time from the end of this year onward.

The hiatus period of 18 years 8 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. The start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate: the rate on the RSS dataset since it began in 1979 is equivalent to 1.2 degrees/century. And yes, the start-date for the Pause has been inching forward, though just a little more slowly than the end-date, which is why the Pause has continued on average to lengthen.

The UAH satellite dataset shows a Pause almost as long as the RSS dataset. However, the much-altered surface tamperature datasets show a small warming rate (Fig. 1b).

Figure 1b. The least-squares linear-regression trend on the mean of the GISS, HadCRUT4 and NCDC terrestrial monthly global mean surface temperature anomaly datasets shows global warming at a rate equivalent to 1.1 C° per century during the period of the Pause from May 1997 to September 2015.

Bearing in mind that one-third of the 2.4 W m–2 radiative forcing from all manmade sources since 1750 has occurred during the period of the Pause, a warming rate equivalent to little more than 1 C°/century (even if it had occurred) would not be cause for concern.

As always, a note of caution. Merely because there has been little or no warming in recent decades, one may not draw the conclusion that warming has ended forever. The trend lines measure what has occurred: they do not predict what will occur. The Pause – politically useful though it may be to all who wish that the “official” scientific community would remember its duty of skepticism – is far less important than the growing discrepancy between the predictions of the general-circulation models and observed reality.

The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, continues to widen. If the Pause lengthens just a little more, the rate of warming in the quarter-century since the IPCC’s First Assessment Report in 1990 will fall below 1 C°/century equivalent.

Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), for the 311 months January 1990 to November 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at just 1 K/century equivalent, taken as the mean of the RSS and UAH v.6 satellite monthly mean lower-troposphere temperature anomalies.

Figure 3.
Predicted temperature change, January 2005 to September 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v.6 satellite lower-troposphere temperature anomalies.

The Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century. In a rational scientific discourse, those who had advocated extreme measures to prevent global warming would now be withdrawing and calmly rethinking their hypotheses. However, this is not a rational scientific discourse.

Key facts about global temperature

These facts should be shown to anyone who persists in believing that, in the words of Mr Obama’s Twitteratus, “global warming is real, manmade and dangerous”.

• The RSS satellite dataset shows no global warming at all for 224 months from May 1997 to December 2015 – more than half the 444-month satellite record.

• There has been no warming even though one-third of all anthropogenic forcings since 1750 have occurred since 1997.

• The entire UAH dataset for the 444 months (37 full years) from December 1978 to November 2015 shows global warming at an unalarming rate equivalent to just 1.14 Cº per century.

• Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.

• The global warming trend since 1900 is equivalent to 0.75 Cº per century. This is well within natural variability and may not have much to do with us.

• The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.

• Compare the warming on the Central England temperature dataset in the 40 years 1694-1733, well before the Industrial Revolution, equivalent to 4.33 C°/century.

• In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.

• The warming trend since 1990, when the IPCC wrote its first report, is equivalent to little more than 1 Cº per century. The IPCC had predicted close to thrice as much.

• To meet the IPCC’s original central prediction of 1 C° warming from 1990-2025, a warming of 0.75 C°, equivalent to 7.5 C°/century, would have to occur in the next decade.

• Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº warming to 2100.

• The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.

• The IPCC’s 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.

• The oceans, according to the 3600+ ARGO buoys, are warming at a rate of just 0.02 Cº per decade, equivalent to 0.23 Cº per century, or 1 C° in 430 years.

• Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.

Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend. The fact of a long Pause is an indication of the widening discrepancy between prediction and reality in the temperature record.
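Several of the figures quoted above convert a temperature change over some period into an "equivalent per century" rate; the conversion is simple proportional arithmetic. A quick sketch (the helper name is mine, invented for illustration):

```python
# Simple unit arithmetic behind the "equivalent per century" rates quoted
# in the key-facts list above. The helper name is hypothetical.

def century_rate(delta_c, years):
    """Degrees C per century implied by delta_c degrees over `years` years."""
    return delta_c * 100.0 / years

# 1 C predicted for 1990-2025 (35 years) corresponds to roughly the
# IPCC's "2.8 C/century" near-term rate:
ipcc_1990 = century_rate(1.0, 35)

# 0.75 C needed in the next decade to catch up is 7.5 C/century:
catch_up = century_rate(0.75, 10)

# At the quoted ARGO rate of 0.23 C/century, one full degree takes
# 100 / 0.23 years, i.e. roughly the "430 years" in the list:
years_per_degree = 100.0 / 0.23
```

Note that 0.02 Cº per decade is 0.2 Cº per century by this arithmetic; the list's 0.23 Cº figure evidently carries an extra decimal place from the underlying data.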
The satellite datasets are arguably less unreliable than other datasets in that they show the 1998 Great El Niño more clearly than all other datasets. The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing an independent verification that the satellite datasets are better able than the rest to capture such fluctuations without artificially filtering them out.

Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking via spaceward mirrors the known temperature of the cosmic background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that the NASA anisotropy probe determined the age of the Universe as 13.82 billion years.

The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them down from the text file and plots them automatically, using an advanced routine that automatically adjusts the aspect ratio of the data window at both axes so as to show the data at maximum scale, for clarity. The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line. The IPCC and most other agencies use linear regression to determine global temperature trends.
Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little auto-regression, since summer temperatures in one hemisphere are compensated by winter in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend.

Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.

RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures. Dr Mears’ results are summarized in Fig. T1:

Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of El Chichón (1983) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998.

Dr Mears writes: “The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation. This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.”

Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph: “Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades. Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site. Is this really your data?’ While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate. … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.”

In fact, the spike in temperatures caused by the Great el Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself. The headline graph in these monthly reports begins in 1997 because that is as far back as one can go in the data and still obtain a zero trend.

Fig. T1a. Graphs for RSS and GISS temperatures starting both in 1997 and in 2001. For each dataset the trend-lines are near-identical, showing conclusively that the argument that the Pause was caused by the 1998 el Niño is false (Werner Brozek and Professor Brown worked out this neat demonstration).

Curiously, Dr Mears prefers the terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite data to calibrate its own terrestrial record. The length of the Pause, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed.

Sources of the IPCC projections in Figs. 2 and 3

The IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century.
The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded: “Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.”

That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted.

In 1990, the IPCC said this: “Based on current models we predict: under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade); this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii).

Later, the IPCC said: “The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv).

The orange region in Fig. 2 represents the IPCC’s medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025.
The IPCC’s predicted global warming over the 25 years from 1990 to the present differs little from a straight line (Fig. T2). Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii). Because this difference between a straight line and the slight uptick in the warming rate the IPCC predicted over the period 1990-2025 is so small, one can look at it another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be twice as much warming in the next ten years as there was in the last 25 years. That is not likely. But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3). Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990). Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date. True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted. Here, too, all of the predictions were extravagantly baseless. The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn. Figure T4. 
Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the least prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality. To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean surface temperature anomalies – is 0.27 Cº, equivalent to little more than 1 Cº/century. The IPCC’s central estimate of 0.71 Cº, equivalent to 2.8 Cº/century, that was predicted for Scenario A in IPCC (1990) with “substantial confidence” was approaching three times too big. In fact, the outturn is visibly well below even the least estimate. In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was 2.8 C/century equivalent. Now it is just 1.7 Cº equivalent – and, as Fig. T5 shows, even that is proving to be a substantial exaggeration. Is the ocean warming? One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean and, since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date. 
Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a 100,000-square-mile box more than 316 km square and 2 km deep. Plainly, the results on the basis of a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of trying to take a single temperature and salinity profile taken at a single point in Lake Superior less than once a year) are not going to be a lot better than guesswork. Unfortunately ARGO seems not to have updated the ocean dataset since December 2014. However, what we have gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is equivalent to just 0.02 Cº decade–1, equivalent to 0.2 Cº century–1. Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow). Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, which make the change seem a whole lot larger. The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming. All those “Hiroshima bombs of heat” of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize. Figure T6. Ocean heat content change, 1957-2013, in Zettajoules from NOAA’s NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in Kelvin that were originally measured. 
NOAA’s conversion of the minuscule warming data to Zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is. Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and that of the ARGO system. Over the period of ARGO data, from 2004-2014, the NOAA data imply that the oceans are warming at 0.05 Cº decade–1, equivalent to 0.5 Cº century–1, or rather more than double the rate shown by ARGO. ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution. What one can say is that, on such evidence as these datasets are capable of providing, the difference between underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way. On these data, too, there is no evidence of rapid or catastrophic ocean warming. Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth. Figure T7. Near-global ocean temperatures by stratum, 0-1900 m, providing a visual reality check to show just how little the upper strata are affected by minor changes in global air surface temperature. Source: ARGO marine atlas. 
Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean. Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere. If the “deep heat” explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has. In early October 2015 Steven Goddard added some very interesting graphs to his website. The graphs show the extent to which sea levels have been tampered with to make it look as though there has been sea-level rise when it is arguable that in fact there has been little or none. Why were the models’ predictions exaggerated? In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T8): Figure T8. Predicted manmade radiative forcings (IPCC, 1990). However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations. As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990 (Fig. 
T9): Figure T9: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013). Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropo-genic net forcing) is only 0.6 Watts per square meter (Fig. T10): Figure T10. Energy budget diagram for the Earth from Stephens et al. (2012) In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission. It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (go to scibull.com and click “Most Read Articles”). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC. Professor Ray Bates gave a paper in Moscow in summer 2015 in which he concluded, based on the analysis by Lindzen & Choi (2009, 2011) (Fig. T10), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both by Lindzen & Choi (1990) (Fig. T11) and by Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling. Figure T11. Reality (center) vs. 11 models. From Lindzen & Choi (2009). A growing body of reviewed papers find climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling that was first put forward in the Charney Report of 1979 for the U.S. National Academy of Sciences, and is still the IPCC’s best estimate today. On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions. 
Finally, how long will it be before the Freedom Clock (Fig. T12) reaches 20 years without any global warming? If it does, the climate scare will become unsustainable. Figure T12. The Freedom Clock edges ever closer to 20 years without global warming # Related Links:  It’s Official – There are now 66 excuses for Temp ‘pause’ – Updated list of 66 excuses for the 18-26 year ‘pause’ in global warming Flashback: 1990 NASA Report: ‘Satellite analysis of upper atmosphere is more accurate, & should be adopted as the standard way to monitor temp change.’ April 1990 – The Canberra Times: ‘A report Issued by the U.S. space agency NASA…’ ‘The [NASA] report’s authors said that their satellite analysis of the upper atmosphere is more accurate, and should be adopted as the standard way to monitor temperature change.’ Real Science website analysis: ‘Twenty-four years later, NASA and NOAA ignore the more accurate satellite data – and report only useless, tampered surface temperatures.’ Flashback 1974: ’60 theories have been advanced to explain the global cooling’ – In the 1970’s scientists were predicting a new ice age, and had 60 theories to explain it.: – Ukiah Daily Journal 0 November 20, 1974 – “The cooling trend heralds the start of another ice age, of a duration that could last form 200 years to several milenia…Sixty theories have been advanced, he said, to explain the global cooling period.” 20 Nov 1974, Page 17 – at Newspapers.com  

No global warming at all for 18 years 9 months – a new record – The Pause lengthens again – just in time for UN Summit in Paris

Special To Climate Depot

The Pause lengthens again – just in time for Paris

No global warming at all for 18 years 9 months – a new record

By Christopher Monckton of Brenchley

As the faithful gather around their capering shamans in Paris for the New Superstition’s annual festival of worship, the Pause lengthens yet again. One-third of Man’s entire influence on climate since the Industrial Revolution has occurred since February 1997. Yet the 225 months since then show no global warming at all (Fig. 1). With this month’s RSS temperature record, the Pause beats last month’s record and now stands at 18 years 9 months.

Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean surface temperature anomaly dataset shows no global warming for 18 years 9 months since February 1997, though one-third of all anthropogenic forcings have occurred during the period of the Pause.

The accidental delegate from Burma provoked shrieks of fury from the congregation during the final benediction in Doha three years ago, when he said the Pause had endured for 16 years. Now, almost three years later, the Pause is almost three years longer. It is worth understanding just how surprised the modelers ought to be by the persistence of the Pause. NOAA, in a very rare fit of honesty, admitted in its 2008 State of the Climate report that 15 years or more without global warming would demonstrate a discrepancy between prediction and observation. The reason for NOAA’s statement is that there is supposed to be a sharp and significant instantaneous response to a radiative forcing such as adding CO2 to the air. The steepness of this predicted response can be seen in Fig. 1a, which is based on a paper on temperature feedbacks by Professor Richard Lindzen’s former student Professor Gerard Roe in 2009. The graph of Roe’s model output shows that the initial expected response to a forcing is supposed to be an immediate and rapid warming.
But, despite the very substantial forcings in the 18 years 9 months since February 1997, not a flicker of warming has resulted. Figure 1a: Models predict rapid initial warming in response to a forcing. Instead, no warming at all is occurring. Based on Roe (2009). At the Heartland and Philip Foster events in Paris, I shall reveal in detail the three serious errors that have led the models to over-predict warming so grossly. The current el Niño, as Bob Tisdale’s distinguished series of reports here demonstrates, is at least as big as the Great el Niño of 1998. The RSS temperature record is beginning to reflect its magnitude. From next month on, the Pause will probably shorten dramatically and may disappear altogether for a time. However, if there is a following la Niña, as there often is, the Pause may return at some time from the end of next year onward. The hiatus period of 18 years 9 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. The start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate. And yes, the start-date for the Pause has been inching forward, though just a little more slowly than the end-date, which is why the Pause continues on average to lengthen. So long a stasis in global temperature is simply inconsistent not only with the extremist predictions of the computer models but also with the panic whipped up by the rent-seeking profiteers of doom rubbing their hands with glee in Paris. The UAH dataset shows a Pause almost as long as the RSS dataset. However, the much-altered surface tamperature datasets show a small warming rate (Fig. 1b). Figure 1b. 
The least-squares linear-regression trend on the mean of the GISS, HadCRUT4 and NCDC terrestrial monthly global mean surface temperature anomaly datasets shows global warming at a rate equivalent to 1.1 C° per century during the period of the Pause from January 1997 to September 2015. Bearing in mind that one-third of the 2.4 W/m² radiative forcing from all manmade sources since 1750 has occurred during the period of the Pause, a warming rate equivalent to little more than 1 C°/century is not exactly alarming. As always, a note of caution. Merely because there has been little or no warming in recent decades, one may not draw the conclusion that warming has ended forever. The trend lines measure what has occurred: they do not predict what will occur. The Pause – politically useful though it may be to all who wish that the “official” scientific community would remember its duty of skepticism – is far less important than the growing discrepancy between the predictions of the general-circulation models and observed reality. The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, continues to widen. If the Pause lengthens just a little more, the rate of warming in the quarter-century since the IPCC’s First Assessment Report in 1990 will fall below 1 C°/century equivalent. Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), for the 309 months January 1990 to September 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at just 1.02 K/century equivalent, taken as the mean of the RSS and UAH v.6 satellite monthly mean lower-troposphere temperature anomalies. Figure 3.
Predicted temperature change, January 2005 to September 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v.6 satellite lower-troposphere temperature anomalies.

As ever, the Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century. In a rational scientific discourse, those who had advocated extreme measures to prevent global warming would now be withdrawing and calmly rethinking their hypotheses. However, this is not a rational scientific discourse. On the questioners’ side it is rational: on the believers’ side it is a matter of increasingly blind faith. The New Superstition is no fides quaerens intellectum.

Key facts about global temperature

These facts should be shown to anyone who persists in believing that, in the words of Mr Obama’s Twitteratus, “global warming is real, manmade and dangerous”:

– The RSS satellite dataset shows no global warming at all for the 225 months from February 1997 to October 2015 – more than half the 442-month satellite record.
– There has been no warming even though one-third of all anthropogenic forcings since 1750 have occurred since the Pause began in February 1997.
– The entire RSS dataset for the 442 months December 1978 to September 2015 shows global warming at an unalarming rate equivalent to just 1.13 Cº per century.
– Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.
– The global warming trend since 1900 is equivalent to 0.75 Cº per century. This is well within natural variability and may not have much to do with us.
– The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.
– Compare the warming on the Central England temperature dataset in the 40 years 1694-1733, well before the Industrial Revolution, equivalent to 4.33 Cº/century.
– In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.
– The warming trend since 1990, when the IPCC wrote its first report, is equivalent to 1 Cº per century. The IPCC had predicted close to thrice as much.
– To meet the IPCC’s central prediction of 1 Cº warming from 1990 to 2025, a warming of 0.75 Cº, equivalent to 7.5 Cº/century, would have to occur in the next decade.
– Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº warming to 2100.
– The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.
– The IPCC’s 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.
– The oceans, according to the 3600+ ARGO buoys, are warming at a rate of just 0.02 Cº per decade, equivalent to 0.23 Cº per century, or 1 Cº in 430 years.
– Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.

Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend.
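That start-date calculation can be sketched as follows. This is an illustrative reconstruction run on synthetic data, not the author's actual algorithm or the RSS dataset: scan candidate start months and keep the earliest one from which the least-squares trend to the end of the series is zero or negative.

```python
import numpy as np

def trend_per_century(anoms):
    """Least-squares slope of a monthly anomaly series, in degrees/century."""
    months = np.arange(len(anoms))
    slope_per_month = np.polyfit(months, anoms, 1)[0]
    return slope_per_month * 12 * 100  # per month -> per century

def earliest_zero_trend_start(anoms, min_len=120):
    """Earliest start index from which the trend to the end of the
    series is zero or negative, or None if no such start exists."""
    for start in range(len(anoms) - min_len + 1):
        if trend_per_century(anoms[start:]) <= 0:
            return start
    return None

# Synthetic check: a 120-month warming ramp followed by 240 gently
# declining months. The earliest zero-trend start should fall near the
# end of the ramp rather than at an arbitrarily chosen date.
series = np.concatenate([np.linspace(0.0, 0.3, 120),
                         np.linspace(0.3, 0.29, 240)])
start = earliest_zero_trend_start(series)
print(start, trend_per_century(series[start:]))
```

Because the start index is the output of a search over the whole record, lengthening the flat tail of the series moves the computed start date earlier, which is the sense in which the text says the date is "calculated" rather than picked.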
The fact of a long Pause is an indication of the widening discrepancy between prediction and reality in the temperature record. The satellite datasets are arguably less unreliable than other datasets in that they show the 1998 Great El Niño more clearly than all other datasets. The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing an independent verification that the satellite datasets are better able than the rest to capture such fluctuations without artificially filtering them out. Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking via spaceward mirrors the known temperature of the cosmic background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that the NASA anisotropy probe determined the age of the Universe: 13.82 billion years. The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them down from the text file and plots them automatically using an advanced routine that automatically adjusts the aspect ratio of the data window at both axes so as to show the data at maximum scale, for clarity. The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line. 
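The y-intercept and slope mentioned above have a closed form via the normal equations, so no iterative fitting is needed. A minimal sketch, using a synthetic series rather than any real dataset:

```python
def least_squares_line(ys):
    """Closed-form least-squares fit y = a + b*x for x = 0..n-1:
    b = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2),  a = (Sy - b*Sx) / n."""
    n = len(ys)
    xs = range(n)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# A 240-month series rising by exactly 0.001 degrees per month recovers
# that slope; multiplying by 12 * 100 gives the per-century equivalent.
a, b = least_squares_line([0.001 * m for m in range(240)])
print(a, b, b * 1200)
```

The same per-month-to-per-century conversion (slope × 12 × 100) is what turns the fitted slopes into the "equivalent to N Cº per century" figures quoted throughout the article.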
The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little auto-regression, since summer temperatures in one hemisphere are compensated by winter in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend. Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat. RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures. Dr Mears’ results are summarized in Fig. T1: Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of Chichón (1983) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998. Dr Mears writes: “The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation.  
This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.” Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph: “Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades.  Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site.  Is this really your data?’  While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate.  … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.” In fact, the spike in temperatures caused by the Great el Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself. The headline graph in these monthly reports begins in 1997 because that is as far back as one can go in the data and still obtain a zero trend. Fig. T1a. Graphs for RSS and GISS temperatures starting both in 1997 and in 2001. For each dataset the trend-lines are near-identical, showing conclusively that the argument that the Pause was caused by the 1998 el Nino is false (Werner Brozek and Professor Brown worked out this neat demonstration). Curiously, Dr Mears prefers the terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite data to calibrate its own terrestrial record. 
The length of the Pause, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed. Sources of the IPCC projections in Figs. 2 and 3 IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded: “Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.” That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted. In 1990, the IPCC said this: “Based on current models we predict: “under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade), this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii). 
Later, the IPCC said: “The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv). The orange region in Fig. 2 represents the IPCC’s medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025. The IPCC’s predicted global warming over the 25 years from 1990 to the present differs little from a straight line (Fig. T2). Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii). Because this difference between a straight line and the slight uptick in the warming rate the IPCC predicted over the period 1990-2025 is so small, one can look at it another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be twice as much warming in the next ten years as there was in the last 25 years. That is not likely. But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3). Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990). Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date. 
True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted. Here, too, all of the predictions were extravagantly baseless. The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn. Figure T4. Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the lowest prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality. To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean surface temperature anomalies – is 0.27 Cº, equivalent to little more than 1 Cº/century. The IPCC’s central estimate of 0.71 Cº (equivalent to 2.8 Cº/century), predicted for Scenario A in IPCC (1990) with “substantial confidence”, was approaching three times too big. In fact, the outturn is visibly well below even the lowest estimate. In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was 2.8 Cº/century equivalent. Now it is just 1.7 Cº/century equivalent – and, as Fig. T5 shows, even that is proving to be a substantial exaggeration.

Is the ocean warming?
One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean and, since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date. Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a 100,000-square-mile box more than 316 km square and 2 km deep. Plainly, the results on the basis of a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of trying to take a single temperature and salinity profile taken at a single point in Lake Superior less than once a year) are not going to be a lot better than guesswork. Unfortunately ARGO seems not to have updated the ocean dataset since December 2014. However, what we have gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is equivalent to just 0.02 Cº decade–1, equivalent to 0.2 Cº century–1. Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow). Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, which make the change seem a whole lot larger. The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming. 
All those “Hiroshima bombs of heat” of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize. Figure T6. Ocean heat content change, 1957-2013, in Zettajoules from NOAA’s NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in Kelvin that were originally measured. NOAA’s conversion of the minuscule warming data to Zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is. Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and that of the ARGO system. Over the period of ARGO data, from 2004-2014, the NOAA data imply that the oceans are warming at 0.05 Cº decade–1, equivalent to 0.5 Cº century–1, or rather more than double the rate shown by ARGO. ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution. What one can say is that, on such evidence as these datasets are capable of providing, the difference between underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way. On these data, too, there is no evidence of rapid or catastrophic ocean warming. 
Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth. Figure T7. Near-global ocean temperatures by stratum, 0-1900 m, providing a visual reality check to show just how little the upper strata are affected by minor changes in global air surface temperature. Source: ARGO marine atlas. Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean. Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere. If the “deep heat” explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has. In early October 2015 Steven Goddard added some very interesting graphs to his website. The graphs show the extent to which sea levels have been tampered with to make it look as though there has been sea-level rise when it is arguable that in fact there has been little or none. Why were the models’ predictions exaggerated? 
In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T8): Figure T8. Predicted manmade radiative forcings (IPCC, 1990). However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations. As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990 (Fig. T9): Figure T9: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013). Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropo-genic net forcing) is only 0.6 Watts per square meter (Fig. T10): Figure T10. Energy budget diagram for the Earth from Stephens et al. (2012) In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission. It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (go to scibull.com and click “Most Read Articles”). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC. 
Professor Ray Bates gave a paper in Moscow in summer 2015 in which he concluded, based on the analysis by Lindzen & Choi (2009, 2011) (Fig. T10), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both by Lindzen & Choi (1990) (Fig. T11) and by Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling. Figure T11. Reality (center) vs. 11 models. From Lindzen & Choi (2009). A growing body of reviewed papers find climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling that was first put forward in the Charney Report of 1979 for the U.S. National Academy of Sciences, and is still the IPCC’s best estimate today. On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions. Finally, how long will it be before the Freedom Clock (Fig. T12) reaches 20 years without any global warming? If it does, the climate scare will become unsustainable. Figure T12. The Freedom Clock edges ever closer to 20 years without global warming

A new record ‘Pause’ length: Satellite Data: No global warming for 18 years 8 months!

Special to Climate Depot

The Pause lengthens yet again: a new record Pause length of no warming for 18 years 8 months

By Christopher Monckton of Brenchley

One-third of Man's entire influence on climate since the Industrial Revolution has occurred since January 1997. Yet for the 224 months since then there has been no global warming at all (Fig. 1). With this month's RSS (Remote Sensing Systems) temperature record, the Pause sets a new record at 18 years 8 months.

Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean surface temperature anomaly dataset shows no global warming for 18 years 8 months since January 1997, though one-third of all anthropogenic forcings occurred during the period of the Pause.

As ever, a warning about the current el Niño. It is becoming ever more likely that the temperature increase that usually accompanies an el Niño will begin to shorten the Pause somewhat, just in time for the Paris climate summit, though a subsequent la Niña would be likely to bring about a resumption and perhaps even a lengthening of the Pause. The spike in global temperatures caused by the thermohaline circulation carrying the warmer waters from the tropical Pacific all around the world usually occurs in the northern-hemisphere winter during an el Niño year. However, the year or two after an el Niño usually – but not always – brings an offsetting la Niña, cooling first the ocean surface and then the air temperature and restoring global temperature to normal.

Figure 1a. The sea surface temperature index for the Niño 3.4 region of the tropical eastern Pacific, showing the climb towards a peak that generally occurs in the northern-hemisphere winter.

For now, the Pause continues to lengthen, but before long the warmer sea surface temperatures in the Pacific will be carried around the world by the thermohaline circulation, causing a temporary warming spike in global temperatures.
The hiatus period of 18 years 8 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. The start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate. The UAH dataset shows a Pause almost as long as the RSS dataset. However, the much-altered surface tamperature datasets show a small warming rate (Fig. 1b). Figure 1b. The least-squares linear-regression trend on the mean of the GISS, HadCRUT4 and NCDC terrestrial monthly global mean surface temperature anomaly datasets shows global warming at a rate equivalent to a little over 1 C° per century during the period of the Pause from January 1997 to July 2015. Bearing in mind that one-third of the 2.4 W m–2 radiative forcing from all manmade sources since 1750 has occurred during the period of the Pause, a warming rate equivalent to little more than 1 C°/century is not exactly alarming. However, the paper that reported the supposed absence of the Pause was extremely careful not to report just how little warming the terrestrial datasets – even after all their many tamperings – actually show. As always, a note of caution. Merely because there has been little or no warming in recent decades, one may not draw the conclusion that warming has ended forever. The trend lines measure what has occurred: they do not predict what will occur. The Pause – politically useful though it may be to all who wish that the “official” scientific community would remember its duty of skepticism – is far less important than the growing discrepancy between the predictions of the general-circulation models and observed reality. The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, continues to widen. 
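The "calculated, not cherry-picked" start date can be reproduced mechanically: try every possible start month and keep the longest window ending at the latest month whose least-squares trend is non-positive. A minimal sketch in Python (the toy series below is illustrative only; the real calculation would run over the RSS monthly anomalies):

```python
def lsq_slope(y):
    """Least-squares linear-regression slope of y against month index 0, 1, 2, ..."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    num = sum((i - t_mean) * (v - y_mean) for i, v in enumerate(y))
    den = sum((i - t_mean) ** 2 for i in range(n))
    return num / den

def longest_zero_trend(anoms):
    """Length in months of the longest window ending at the latest month
    whose trend is non-positive -- the length of "the Pause"."""
    longest = 0
    for start in range(len(anoms) - 1):
        window = anoms[start:]
        if lsq_slope(window) <= 0:
            longest = max(longest, len(window))
    return longest

# Toy series: three clearly warming months, then four gently cooling ones.
toy = [0.0, 1.0, 2.0, 5.0, 4.9, 4.8, 4.7]
print(longest_zero_trend(toy))  # -> 4
```

Any window reaching back into the warming segment picks up a positive slope, so the start date falls out of the data rather than being chosen.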
If the Pause lengthens just a little more, the rate of warming in the quarter-century since the IPCC's First Assessment Report in 1990 will fall below 1 C°/century equivalent.

Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with "substantial confidence" in IPCC (1990), for the 307 months January 1990 to July 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at just 1 K/century equivalent, taken as the mean of the RSS and UAH v. 5.6 satellite monthly mean lower-troposphere temperature anomalies.

Figure 3. Predicted temperature change, January 2005 to July 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v. 5.6 satellite lower-troposphere temperature anomalies.

The page Key Facts about Global Temperature (below) should be shown to anyone who persists in believing that, in the words of Mr Obama's Twitteratus, "global warming is real, manmade and dangerous". The Technical Note explains the sources of the IPCC's predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century.

Key facts about global temperature

- The RSS satellite dataset shows no global warming at all for 224 months from January 1997 to August 2015 – more than half the 440-month satellite record.
- There has been no warming even though one-third of all anthropogenic forcings since 1750 have occurred since the Pause began in January 1997.
- The entire RSS dataset from January 1979 to date shows global warming at an unalarming rate equivalent to just 1.2 Cº per century.
- Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.
- The global warming trend since 1900 is equivalent to 0.75 Cº per century. This is well within natural variability and may not have much to do with us.
- The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.
- Compare the warming on the Central England temperature dataset in the 40 years 1694-1733, well before the Industrial Revolution, equivalent to 4.33 C°/century.
- In 1990, the IPCC's mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.
- The warming trend since 1990, when the IPCC wrote its first report, is equivalent to 1 Cº per century. The IPCC had predicted close to thrice as much.
- To meet the IPCC's central prediction of 1 C° warming from 1990-2025, in the next decade a warming of 0.75 C°, equivalent to 7.5 C°/century, would have to occur.
- Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº to 2100.
- The IPCC's predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.
- The IPCC's 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.
- The oceans, according to the 3600+ ARGO buoys, are warming at a rate of just 0.02 Cº per decade, equivalent to 0.23 Cº per century, or 1 C° in 430 years.
- Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.
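Several of the key facts above are simple rate conversions, which can be checked in a few lines. Note one assumption: the unrounded ARGO trend is taken as 0.023 C°/decade, the value implied by "0.23 Cº per century" (the rounded 0.02 C°/decade quoted elsewhere in the article would give 0.2 C°/century):

```python
def per_century(delta_c, years):
    """Convert a temperature change over `years` into a C/century-equivalent rate."""
    return delta_c * 100.0 / years

# IPCC (1990) central prediction: 1.0 C from 1990 to 2025 (35 years).
ipcc_1990_central = per_century(1.0, 35)     # ~2.9 C/century ("2.8")

# Observed trend since 1990: ~0.27 C over 25 years.
observed_since_1990 = per_century(0.27, 25)  # ~1.1 C/century ("a little over 1")

# ARGO ocean trend, assumed unrounded value of 0.023 C/decade.
argo_ocean = per_century(0.023, 10)          # 0.23 C/century
years_per_degree = 100.0 / argo_ocean        # ~435 years ("1 C in 430 years")
```

The small discrepancies (2.86 vs. 2.8, 435 vs. 430) are rounding in the article's quoted figures, not errors in the conversion.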
Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not "cherry-picked" so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend. The fact of a long Pause is an indication of the widening discrepancy between prediction and reality in the temperature record.

The satellite datasets are arguably less unreliable than others in that they show the 1998 Great el Niño more clearly than all other datasets. The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing an independent verification that the satellite datasets are better able than the rest to capture such fluctuations without artificially filtering them out.

Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking via spaceward mirrors the known temperature of the cosmic background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that the NASA anisotropy probe determined the age of the Universe: 13.82 billion years.

The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website.
A computer algorithm reads them down from the text file and plots them automatically, adjusting the aspect ratio of the data window at both axes so as to show the data at maximum scale, for clarity. The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line.

The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little auto-regression, since summer temperatures in one hemisphere are offset by winter temperatures in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend. Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.

RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures. Dr Mears' results are summarized in Fig. T1:

Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of El Chichón (1982) and Pinatubo (1991) are shown, as is the spike in warming caused by the Great el Niño of 1998.
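The least-squares fit described above (slope, y-intercept and correlation coefficient) takes only a few lines of Python. The flat toy series below behaves like the Pause: highly variable data, but a near-zero slope and therefore a near-zero correlation coefficient:

```python
from math import sqrt

def linear_fit(y):
    """Least-squares slope, intercept and correlation of y vs. month index 0, 1, 2, ..."""
    n = len(y)
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    s_ty = sum((i - t_mean) * (v - y_mean) for i, v in enumerate(y))
    s_tt = sum((i - t_mean) ** 2 for i in range(n))
    s_yy = sum((v - y_mean) ** 2 for v in y)
    slope = s_ty / s_tt                     # trend per month
    intercept = y_mean - slope * t_mean     # y-intercept at month 0
    r = s_ty / sqrt(s_tt * s_yy)            # Pearson correlation coefficient
    return slope, intercept, r

# Highly variable but trendless data: slope and r both come out ~0.
flat = [0.2, -0.2, 0.2, -0.2, 0.2, -0.2, 0.2, -0.2, 0.2]
slope, intercept, r = linear_fit(flat)
```

A strongly trending series (e.g. [0, 1, 2, 3]) would instead return a slope of 1 per month and r = 1.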
Dr Mears writes: “The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation.  This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.” Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph: “Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades.  Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site.  Is this really your data?’  While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate.  … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.” In fact, the spike in temperatures caused by the Great el Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself. The headline graph in these monthly reports begins in 1997 because that is as far back as one can go in the data and still obtain a zero trend. Curiously, Dr Mears prefers the terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite data to calibrate its own terrestrial record. 
The length of the Great Pause in global warming, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed.

Sources of the IPCC projections in Figs. 2 and 3

IPCC's First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, "How much confidence do we have in our predictions?" IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded: "Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results."

That "substantial confidence" was substantial over-confidence. For the rate of global warming since 1990 – the most important of the "broad-scale features of climate change" that the models were supposed to predict – is now below half what the IPCC had then predicted.

In 1990, the IPCC said this: "Based on current models we predict: under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade); this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors" (p. xii).
Later, the IPCC said: “The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv). The orange region in Fig. 2 represents the IPCC’s medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025. The IPCC’s predicted global warming over the 25 years from 1990 to the present differs little from a straight line (Fig. T2). Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii). Because this difference between a straight line and the slight uptick in the warming rate the IPCC predicted over the period 1990-2025 is so small, one can look at it another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be twice as much warming in the next ten years as there was in the last 25 years. That is not likely. But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3). Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990). Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date. 
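The "twice as much warming" arithmetic above is easily checked (figures from the article: a 1 K central estimate of warming from 1990 to 2025, and roughly 0.27 K observed over the 25 years 1990-2015):

```python
predicted_to_2025 = 1.0     # K: IPCC (1990) central estimate for 1990-2025
observed_25yr = 0.27        # K: outturn over 1990-2015, as quoted in the article

remaining = predicted_to_2025 - observed_25yr   # 0.73 K left for 2015-2025
ratio = remaining / observed_25yr               # ~2.7x the last 25 years' warming

required_rate = remaining * 100.0 / 10          # 7.3 K/century equivalent
observed_rate = observed_25yr * 100.0 / 25      # ~1.1 K/century equivalent
```

The remaining decade would in fact need more than twice – nearly three times – the warming of the preceding quarter-century, so "twice as much" is, if anything, an understatement.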
True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted. Here, too, all of the predictions were extravagantly baseless. The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn.

Figure T4. Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the least prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality.

To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean surface temperature anomalies – is 0.27 Cº, equivalent to little more than 1 Cº/century. The IPCC's central estimate of 0.71 Cº, equivalent to 2.8 Cº/century, predicted for Scenario A in IPCC (1990) with "substantial confidence", was approaching three times too big. In fact, the outturn is visibly well below even the least estimate.

In 1990, the IPCC's central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was 2.8 Cº/century equivalent. Now it is just 1.7 Cº/century equivalent – and, as Fig. T5 shows, even that is proving to be a substantial exaggeration.

Is the ocean warming?
One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean and, since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the "missing heat" has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date.

Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a 100,000-square-kilometre box more than 316 km square and 2 km deep. Plainly, results at a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of taking a single temperature and salinity profile at a single point in Lake Superior less than once a year) are not going to be a lot better than guesswork.

Unfortunately ARGO seems not to have updated the ocean dataset since December 2014. However, what we have gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is just 0.02 Cº decade–1, equivalent to 0.2 Cº century–1.

Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow).

Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, which makes the change seem a whole lot larger. The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming.
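The 200,000 km³-per-buoy figure can be reproduced from the number of floats and an assumed global ocean surface area of about 3.6 × 10⁸ km² (a standard estimate, not a figure from the article), profiled to 2 km depth:

```python
ocean_area_km2 = 3.6e8      # assumed global ocean surface area (standard estimate)
profile_depth_km = 2.0      # ARGO floats profile roughly the top 2 km
n_floats = 3600             # number of buoys quoted in the article

volume_per_float = ocean_area_km2 * profile_depth_km / n_floats  # km^3 per float
box_side_km = (volume_per_float / profile_depth_km) ** 0.5       # side of equivalent box
```

That works out to 200,000 km³ per float – a box about 316 km on a side and 2 km deep, i.e. roughly 100,000 square kilometres of sea surface per float.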
All those "Hiroshima bombs of heat" of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize.

Figure T6. Ocean heat content change, 1957-2013, in Zettajoules from NOAA's NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in Kelvin that were originally measured.

NOAA's conversion of the minuscule warming data to Zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is. Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA's data and that of the ARGO system. Over the period of ARGO data, from 2004-2014, the NOAA data imply that the oceans are warming at 0.05 Cº decade–1, equivalent to 0.5 Cº century–1, or rather more than double the rate shown by ARGO.

ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution. What one can say is that, on such evidence as these datasets are capable of providing, the difference between the underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the "missing heat" is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way. On these data, too, there is no evidence of rapid or catastrophic ocean warming.
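The conversion from zettajoules back to a temperature change is straightforward. The layer mass and specific heat below are assumed round values for the 0-2000 m stratum, not figures from the article:

```python
energy_j = 260e21                        # 260 ZJ of heat content change, 1970-2014
layer_mass_kg = 3.6e14 * 2000 * 1025     # area x depth x density: ~7.4e20 kg (assumed)
c_p = 4000.0                             # J/(kg K) for seawater (assumed round value)

delta_t_k = energy_j / (layer_mass_kg * c_p)   # ~0.09 K over the 44 years
rate_per_century = delta_t_k * 100.0 / 44      # ~0.2 K/century
```

Under these assumptions the headline 260 ZJ corresponds to roughly 0.09 K of warming of the upper 2 km over 44 years, i.e. about 0.2 K/century – consistent with the figure quoted above.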
Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata, or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth.

Figure T7. Near-global ocean temperatures by stratum, 0-1900 m, providing a visual reality check to show just how little the upper strata are affected by minor changes in global air surface temperature. Source: ARGO marine atlas.

Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean. Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere. If the "deep heat" explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has.

Why were the models' predictions exaggerated?

In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T8):

Figure T8. Predicted manmade radiative forcings (IPCC, 1990).
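The forcing figures discussed in this section can be turned into rough warming numbers with a zero-feedback (Planck-only) sketch. The Planck parameter of ~0.31 K per W m⁻² and the 3.7 W m⁻² forcing per CO2 doubling are standard textbook values, not figures from the article:

```python
planck_k_per_wm2 = 0.31    # zero-feedback sensitivity parameter (assumed textbook value)

f_1990_scenario_a = 4.0    # W/m^2: IPCC (1990) predicted industrial-era forcing
f_ar5_net = 2.3            # W/m^2: IPCC (2013) net anthropogenic forcing
f_2x_co2 = 3.7             # W/m^2: canonical forcing from doubled CO2 (assumed)

warming_1990 = planck_k_per_wm2 * f_1990_scenario_a   # ~1.2 K with no feedbacks
warming_ar5 = planck_k_per_wm2 * f_ar5_net            # ~0.7 K with no feedbacks
sensitivity_0fb = planck_k_per_wm2 * f_2x_co2         # ~1.1 K per CO2 doubling
```

The ~1.1 K per doubling with no feedbacks is the neighbourhood of the sensitivity the papers cited below argue for; the IPCC's 3 K central estimate comes from assuming strongly net-positive feedbacks on top of this Planck response.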
However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations. As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990 (Fig. T9): Figure T9: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013). Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropo-genic net forcing) is only 0.6 Watts per square meter (Fig. T10): Figure T10. Energy budget diagram for the Earth from Stephens et al. (2012) In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission. It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (go to scibull.com and click “Most Read Articles”). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC. Professor Ray Bates gave a paper in Moscow in summer 2015 in which he concluded, based on the analysis by Lindzen & Choi (2009, 2011) (Fig. T10), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both by Lindzen & Choi (1990) (Fig. 
T11) and by Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling. Figure T11. Reality (center) vs. 11 models. From Lindzen & Choi (2009). A growing body of reviewed papers find climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling that was first put forward in the Charney Report of 1979 for the U.S. National Academy of Sciences, and is still the IPCC’s best estimate today. On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions. Finally, how long will it be before the Freedom Clock (Fig. T11) reaches 20 years without any global warming? If it does, the climate scare will become unsustainable. Figure T12. The Freedom Clock edges ever closer to 20 years without global warming Related Links: It’s Official – There are now 66 excuses for Temp ‘pause’ – Updated list of 66 excuses for the 18-26 year ‘pause’ in global warming) Update: Scientists Challenge New Study Attempting to Erase The ‘Pause’: Warmists Rewrite Temperature History To Eliminate the ‘Pause’ ‘Deceptive hottest year temperature record claims’ – ‘Inconvenient fact that these records are being set by less than the uncertainty in the statistics’

A new record ‘Pause’ length: No global warming for 18 years 7 months – Temperature standstill extends to 223 months

Special to Climate Depot

The Pause draws blood

A new record Pause length: no warming for 18 years 7 months

By Christopher Monckton of Brenchley

For 223 months, since January 1997, there has been no global warming at all (Fig. 1). This month’s RSS temperature shows the Pause setting a new record at 18 years 7 months. It is becoming ever more likely that the temperature increase that usually accompanies an el Niño will begin to shorten the Pause somewhat, just in time for the Paris climate summit, though a subsequent La Niña would be likely to bring about a resumption and perhaps even a lengthening of the Pause.

Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean surface temperature anomaly dataset shows no global warming for 18 years 7 months since January 1997.

The hiatus period of 18 years 7 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. The start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate.

The Pause has now drawn blood. In the run-up to the world-government “climate” conference in Paris this December, the failure of the world to warm at all for well over half the satellite record has provoked the climate extremists to resort to desperate measures to try to do away with the Pause.

First there was Tom Karl with his paper attempting to wipe out the Pause by arbitrarily adding a hefty increase to all the ocean temperature measurements made by the 3600 automated ARGO bathythermograph buoys circulating in the oceans. Hey presto! All three of the longest-standing terrestrial temperature datasets – GISS, HadCRUT4 and NCDC – were duly adjusted, yet again, to show more global warming than has really occurred. However, the measured and recorded facts are these.
In the 11 full years April 2004 to March 2015, for which the ARGO system has been providing reasonably-calibrated though inevitably ill-resolved data (each buoy has to represent 200,000 km³ of ocean temperature with only three readings a month), there has been no warming at all in the upper 750 m, and only a little below that, so that the trend over the period of operation shows a warming equivalent to just 1 C° every 430 years.

Figure 1a. Near-global ocean temperatures by stratum, 0-1900 m. Source: ARGO marine atlas.

And in the lower troposphere, the warming according to RSS occurred at a rate equivalent to 1 C° every 700 years.

Figure 1b. The least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere temperature anomaly dataset since January 1997 shows warming at a rate equivalent to just 1 C° every 700 years.

Then along came another paper, this time saying that the GISS global temperature record shows global warming during the Pause and that, therefore, GISS shows global warming during the Pause. This instance of argumentum ad petitionem principii, the fallacy of circular argument, passed peer review without difficulty because it came to the politically-correct conclusion that there was no Pause. The paper reached its conclusion, however, without mentioning the word “satellite”. The UAH data show no warming for 18 years 5 months.

Figure 1c. The least-squares linear-regression trend on the UAH satellite monthly global mean surface temperature anomaly dataset shows no global warming for 18 years 5 months since March 1997.

For completeness, though no reliance can now be placed on the terrestrial datasets, here is the “warming” rate they show since January 1997:

Figure 1d.
The least-squares linear-regression trend on the mean of the GISS, HadCRUT4 and NCDC terrestrial monthly global mean surface temperature anomaly datasets shows global warming at a rate equivalent to a little over 1 C° per century during the period of the Pause from January 1997 to July 2015.

Bearing in mind that one-third of the 2.4 W m⁻² radiative forcing from all manmade sources since 1750 has occurred during the period of the Pause, a warming rate equivalent to little more than 1 C°/century is not exactly alarming. However, the paper that reported the supposed absence of the Pause was extremely careful not to report just how little warming the terrestrial datasets – even after all their many tamperings – actually show.

As always, a note of caution. Merely because there has been little or no warming in recent decades, one may not draw the conclusion that warming has ended forever. The trend lines measure what has occurred: they do not predict what will occur. Furthermore, the long, slow build-up of the current el Niño, which has now become strongish and – on past form – will not peak till the turn of the year, is already affecting tropical temperatures and, as the thermohaline circulation does its thing, must eventually affect global temperatures. Though one may expect the el Niño to be followed by a la Niña, canceling the temporary warming, this does not always happen. In short, the Pause may well come to an end and then disappear.

However, as this regular column has stressed before, the Pause – politically useful though it may be to all who wish that the “official” scientific community would remember its duty of skepticism – is far less important than the growing divergence between the predictions of the general-circulation models and observed reality. The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, continues to widen.
If the Pause lengthens just a little more, the rate of warming in the quarter-century since the IPCC’s First Assessment Report in 1990 will fall below 1 C°/century equivalent.

Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), for the 307 months January 1990 to July 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at just 1 K/century equivalent, taken as the mean of the RSS and UAH v. 5.6 satellite monthly mean lower-troposphere temperature anomalies.

Figure 3. Predicted temperature change, January 2005 to July 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v. 5.6 satellite lower-troposphere temperature anomalies.

The page Key Facts about Global Temperature (below) should be shown to anyone who persists in believing that, in the words of Mr Obama’s Twitteratus, “global warming is real, manmade and dangerous”. The Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century.

Key facts about global temperature

- The RSS satellite dataset shows no global warming at all for 223 months from January 1997 to July 2015 – more than half the 439-month satellite record.
- There has been no warming even though one-third of all anthropogenic forcings since 1750 have occurred since January 1997, during the pause in global warming.
- The entire RSS dataset from January 1979 to date shows global warming at an unalarming rate equivalent to just 1.2 Cº per century.
- Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.
- The global warming trend since 1900 is equivalent to 0.75 Cº per century. This is well within natural variability and may not have much to do with us.
- The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.
- Compare the warming on the Central England temperature dataset in the 40 years 1694-1733, well before the Industrial Revolution, equivalent to 4.33 C°/century.
- In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.
- The warming trend since 1990, when the IPCC wrote its first report, is equivalent to 1 Cº per century. The IPCC had predicted more than two and a half times as much.
- To meet the IPCC’s central prediction of 1 C° warming from 1990-2025, in the next decade a warming of 0.75 C°, equivalent to 7.5 C°/century, would have to occur.
- Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº warming to 2100.
- The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.
- The IPCC’s 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.
- The oceans, according to the 3600+ ARGO buoys, are warming at a rate of just 0.02 Cº per decade, equivalent to 0.23 Cº per century, or 1 C° in 430 years.
- Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.
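The rate equivalences quoted in the list above are straightforward arithmetic. A short Python sketch reproduces them using only figures stated in the text (the helper function and its name are ours, added for illustration):

```python
def per_century(delta_c, years):
    """Express a temperature change over a given period as Cº per century."""
    return delta_c * 100.0 / years

# IPCC (1990) central prediction: 1.0 Cº between 1990 and 2025 (35 years),
# quoted in the text as 2.8 Cº/century.
ipcc_1990 = per_century(1.0, 35)          # ~2.86 Cº/century

# If only ~0.25 Cº of that has occurred in the 25 years since 1990
# (a trend of ~1 Cº/century), the remaining 0.75 Cº in the decade to 2025
# implies the quoted 7.5 Cº/century.
catch_up = per_century(1.0 - 0.25, 10)    # 7.5 Cº/century

# Central England, 1694-1733: 4.33 Cº/century sustained over 40 years.
cet_total = 4.33 * 40 / 100.0             # ~1.73 Cº over the 40 years

# ARGO: 0.23 Cº/century corresponds to roughly the quoted "1 C° in 430 years".
years_per_degree = 100.0 / 0.23           # ~435 years
```

Note that the list's "0.02 Cº per decade, equivalent to 0.23 Cº per century" reflects the author's rounding of roughly 0.023 Cº per decade; the arithmetic above uses the per-century figure.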
Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend. The fact of a long Pause is an indication of the widening discrepancy between prediction and reality in the temperature record.

The satellite datasets are arguably less unreliable than other datasets in that they show the 1998 Great el Niño more clearly than the rest. The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing an independent verification that the satellite datasets are better able than the rest to capture such fluctuations without artificially filtering them out.

Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking via spaceward mirrors the known temperature of the cosmic background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that the NASA anisotropy probe determined the age of the Universe: 13.82 billion years.

The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website.
A computer algorithm reads them down from the text file and plots them automatically, using a routine that adjusts the aspect ratio of the data window at both axes so as to show the data at maximum scale, for clarity. The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line.

The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little auto-regression, since summer temperatures in one hemisphere are compensated by winter in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend.

Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.

RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures. Dr Mears’ results are summarized in Fig. T1:

Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of el Chichón (1982) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998.
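The two calculations described above – the least-squares trend on the anomaly series, and the search for the farthest-back start month that still yields a zero trend – can be sketched in a few lines of Python. This is an illustrative reconstruction under stated assumptions, not the author's actual algorithm; the anomaly series would be read from the RSS text file:

```python
import numpy as np

def trend_per_century(anomalies):
    """Least-squares linear-regression slope of a monthly anomaly series,
    converted from Cº/month to the Cº/century equivalent used in the text."""
    t = np.arange(len(anomalies))
    slope, _intercept = np.polyfit(t, anomalies, 1)
    return slope * 12 * 100

def pause_length_months(anomalies, min_months=24):
    """Length of the longest period ending now that shows a zero-or-negative
    trend -- the 'calculated, not cherry-picked' start of the Pause."""
    n = len(anomalies)
    for start in range(n - min_months):
        if trend_per_century(anomalies[start:]) <= 0:
            return n - start          # earliest qualifying start wins
    return 0
```

np.polyfit returns the slope and y-intercept of the least-squares fit; the correlation coefficient mentioned in the text could likewise be obtained with np.corrcoef(t, anomalies)[0, 1].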
Dr Mears writes: “The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation.  This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.” Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph: “Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades.  Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site.  Is this really your data?’  While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate.  … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.” In fact, the spike in temperatures caused by the Great el Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself. Curiously, Dr Mears prefers the terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite data to calibrate its own terrestrial record. The length of the Great Pause in global warming, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed. 
The el Niño may well strengthen throughout this year, reducing the length of the Great Pause. However, the discrepancy between prediction and observation continues to widen.

Sources of the IPCC projections in Figs. 2 and 3

IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded: “Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.”

That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted.

In 1990, the IPCC said this: “Based on current models we predict: under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade), this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii).

Later, the IPCC said: “The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030.
For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv).

The orange region in Fig. 2 represents the IPCC’s medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025. The IPCC’s predicted global warming over the 25 years from 1990 to the present differs little from a straight line (Fig. T2).

Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii).

Because this difference between a straight line and the slight uptick in the warming rate the IPCC predicted over the period 1990-2025 is so small, one can look at it another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be twice as much warming in the next ten years as there was in the last 25 years. That is not likely.

But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3).

Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990).

Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date. True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted.
Here, too, all of the predictions were extravagantly baseless. The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn.

Figure T4. Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the least prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality.

To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean surface temperature anomalies – is 0.27 Cº, equivalent to little more than 1 Cº/century. The IPCC’s central estimate of 0.71 Cº, equivalent to 2.8 Cº/century, that was predicted for Scenario A in IPCC (1990) with “substantial confidence” was approaching three times too big. In fact, the outturn is visibly well below even the least estimate.

In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was 2.8 Cº/century equivalent. Now it is just 1.7 Cº/century equivalent – and, as Fig. T5 shows, even that is proving to be a substantial exaggeration.

Is the ocean warming?
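The quarter-century comparison above is easy to verify. A minimal Python check, using only the figures quoted in the text (307 months from January 1990 to July 2015):

```python
# Observed vs. predicted warming over the 307 months January 1990 - July 2015.
months = 307
years = months / 12                      # ~25.6 years

observed = 0.27                          # Cº, mean of RSS and UAH trends (text)
predicted = 0.71                         # Cº, IPCC (1990) Scenario-A central estimate

obs_rate = observed * 100 / years        # ~1.06 Cº/century ("little more than 1")
pred_rate = predicted * 100 / years      # ~2.78 Cº/century (quoted as 2.8)
ratio = predicted / observed             # ~2.6 ("approaching three times too big")
```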
One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean and, since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date.

Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a 100,000-square-kilometre box more than 316 km square and 2 km deep. Plainly, results on the basis of a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of trying to take a single temperature and salinity profile at a single point in Lake Superior less than once a year) are not going to be a lot better than guesswork.

Unfortunately ARGO seems not to have updated the ocean dataset since December 2014. However, what we have gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is equivalent to just 0.02 Cº decade⁻¹, or 0.2 Cº century⁻¹.

Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow).

Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, which makes the change seem a whole lot larger. The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming.
All those “Hiroshima bombs of heat” of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize.

Figure T6. Ocean heat content change, 1957-2013, in zettajoules from NOAA’s NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in Kelvin that were originally measured.

NOAA’s conversion of the minuscule warming data to zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is. Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and that of the ARGO system. Over the period of ARGO data, from 2004-2014, the NOAA data imply that the oceans are warming at 0.05 Cº decade⁻¹, equivalent to 0.5 Cº century⁻¹, or rather more than double the rate shown by ARGO.

ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution. What one can say is that, on such evidence as these datasets are capable of providing, the difference between the underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way. On these data, too, there is no evidence of rapid or catastrophic ocean warming.
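The zettajoule-to-Kelvin conversion described above can be checked with round-number physical constants. The layer mass and specific heat below are our own textbook approximations, not NOAA's figures, so treat this as a back-of-envelope sketch:

```python
# Convert NOAA's 260 ZJ of 0-2000 m ocean heat-content change (1970-2014)
# back into the temperature change that was actually measured.
OCEAN_AREA_M2 = 3.6e14        # approximate global ocean surface area
LAYER_DEPTH_M = 2000.0        # the stratum the heat-content figure covers
DENSITY = 1027.0              # kg/m3, typical seawater
SPECIFIC_HEAT = 3990.0        # J/(kg K), typical seawater

layer_mass = OCEAN_AREA_M2 * LAYER_DEPTH_M * DENSITY       # ~7.4e20 kg
heat_capacity = layer_mass * SPECIFIC_HEAT                  # joules per Kelvin

delta_q = 260e21                                            # 260 ZJ in joules
delta_t = delta_q / heat_capacity                           # ~0.09 K in 44 years
k_per_century = delta_t * 100 / 44                          # ~0.2 K/century

# The ARGO sampling volume quoted earlier also checks out:
ocean_volume_km3 = OCEAN_AREA_M2 / 1e6 * (LAYER_DEPTH_M / 1000)   # top 2 km
km3_per_buoy = ocean_volume_km3 / 3600                      # ~200,000 km3 per buoy
```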
Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata, or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth.

Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean. Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere.

If the “deep heat” explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has.

Why were the models’ predictions exaggerated?

In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T7):

Figure T7. Predicted manmade radiative forcings (IPCC, 1990).

However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations.
As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990:

Figure T8: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013).

Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropogenic net forcing) is only 0.6 Watts per square meter (Fig. T9):

Figure T9. Energy budget diagram for the Earth from Stephens et al. (2012).

In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission.

It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (go to scibull.com and click “Most Read Articles”). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC.

Professor Ray Bates gave a paper in Moscow in summer 2015 in which he concluded, based on the analysis by Lindzen & Choi (2009, 2011) (Fig. T10), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both by Lindzen & Choi (2009, 2011) (Fig. T10) and by Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling.

Figure T10. Reality (center) vs. 11 models. From Lindzen & Choi (2009).

A growing body of reviewed papers find climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling that was first put forward in the Charney Report of 1979 for the U.S.
National Academy of Sciences, and is still the IPCC’s best estimate today. On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions.

Finally, how long will it be before the Freedom Clock (Fig. T11) reaches 20 years without any global warming? If it does, the climate scare will become unsustainable.

Figure T11. The Freedom Clock

Climate Depot Note: Above graphs courtesy of WattsUpWithThat.com (Also see: It’s Official – There are now 66 excuses for Temp ‘pause’ – Updated list of 66 excuses for the 18-26 year ‘pause’ in global warming)

Update: Scientists Challenge New Study Attempting to Erase The ‘Pause’: Warmists Rewrite Temperature History To Eliminate the ‘Pause’

Global warming standstill/pause increases to ‘a new record length’: 18 years 6 months

Special to Climate Depot

El Niño strengthens: the Pause lengthens

Global temperature update: no warming for 18 years 6 months

By Christopher Monckton of Brenchley

For 222 months, since December 1996, there has been no global warming at all (Fig. 1). This month’s RSS temperature – still unaffected by a slowly strengthening el Niño, which will eventually cause temporary warming – passes another six-month milestone, and establishes a new record length for the Pause: 18 years 6 months. What is more, the IPCC’s centrally-predicted warming rate since its First Assessment Report in 1990 is now more than two and a half times the measured rate. On any view, the predictions on which the entire climate scare was based were extreme exaggerations. However, it is becoming ever more likely that the temperature increase that usually accompanies an el Niño may come through after a lag of four or five months. The Pause may yet shorten somewhat, just in time for the Paris climate summit, though a subsequent La Niña would be likely to bring about a resumption of the Pause.

Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean surface temperature anomaly dataset shows no global warming for 18 years 6 months since December 1996.

The hiatus period of 18 years 6 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. Note that the start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate.

The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, continues to widen. For the time being, these two graphs will be based on RSS alone, since the text file for the new UAH v.6 dataset is not yet being updated monthly.
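The “more than two and a half times” figure above is simple arithmetic: the ratio of the IPCC’s 1990 central near-term prediction to the measured trend, both expressed as per-century equivalents. A minimal check, using the rates quoted elsewhere in the piece (2.8 and 1.1 Cº/century):

```python
# Ratio of predicted to measured warming rate since 1990.
# Both figures are the per-century equivalents quoted in the article.
predicted_rate = 2.8   # Cº/century, IPCC (1990) central near-term estimate
measured_rate = 1.1    # Cº/century, mean of RSS and UAH trends since 1990

print(round(predicted_rate / measured_rate, 2))  # ≈ 2.55, i.e. "more than two and a half times"
```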
However, the effect of the recent UAH adjustments – exceptional in that they are the only such adjustments I can recall that reduce the previous trend rather than steepening it – is to bring the UAH dataset very close to that of RSS, so that there is now a clear distinction between the satellite and terrestrial datasets, particularly since the latter were subjected to adjustments over the past year or two that steepened the apparent rate of warming.

Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), for the 305 months January 1990 to May 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at less than 1.1 K/century equivalent, taken as the mean of the RSS and UAH v. 5.6 satellite monthly mean lower-troposphere temperature anomalies.

Figure 3. Predicted temperature change, January 2005 to May 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v. 5.6 satellite lower-troposphere temperature anomalies.

The Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century.

Key facts about global temperature

The RSS satellite dataset shows no global warming at all for 222 months from December 1996 to May 2015 – more than half the 437-month satellite record.

The entire RSS dataset from January 1979 to date shows global warming at an unalarming rate equivalent to just 1.2 Cº per century.

Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.
The global warming trend since 1900 is equivalent to 0.8 Cº per century. This is well within natural variability and may not have much to do with us.

The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.

In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.

The warming trend since 1990, when the IPCC wrote its first report, is equivalent to 1.1 Cº per century. The IPCC had predicted two and a half times as much.

Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual centennial warming prediction of 4.8 Cº warming to 2100.

The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.

The IPCC’s 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.

The oceans, according to the 3600+ ARGO bathythermograph buoys, are warming at a rate of just 0.02 Cº per decade, equivalent to 0.23 Cº per century.

Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.

Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend.

The satellite datasets are arguably less unreliable than other datasets in that they show the 1998 Great El Niño more clearly than all other datasets.
The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing an independent verification that the satellite datasets are better able to capture such fluctuations without artificially filtering them out than other datasets.

Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking via spaceward mirrors the known temperature of the cosmic background radiation, which is 1% of the freezing point of water, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that the NASA anisotropy probe determined the age of the Universe: 13.82 billion years.

The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them down from the text file and plots them automatically using an advanced routine that automatically adjusts the aspect ratio of the data window at both axes so as to show the data at maximum scale, for clarity. The latest monthly data point is visually inspected to ensure that it has been correctly positioned.

The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line. The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails.
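The trend calculation described above – an ordinary least-squares regression on the monthly anomalies, with the slope reported as an equivalent per-century rate – can be sketched in a few lines. The anomaly series below is synthetic, purely to show the mechanics; real use would read the RSS text file instead:

```python
def least_squares_trend(anomalies):
    """Return (intercept, slope) of y = a + b*x for a monthly anomaly series,
    where x is the month index 0, 1, 2, ..."""
    n = len(anomalies)
    mean_x = (n - 1) / 2
    mean_y = sum(anomalies) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(anomalies))
    var = sum((x - mean_x) ** 2 for x in range(n))
    slope = cov / var                      # Cº per month
    return mean_y - slope * mean_x, slope

# Synthetic 437-month series rising at 0.001 Cº per month (illustrative only).
anoms = [0.2 + 0.001 * m for m in range(437)]
intercept, slope = least_squares_trend(anoms)

print(round(slope * 12 * 100, 2))   # slope converted to Cº/century: 1.2
print(222 / 12)                     # 222 months of zero trend = 18.5 years
```

The conversion factor is the only subtlety: a slope in Cº per month times 12 months times 100 years gives the “equivalent Cº per century” figures used throughout the article.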
The method is appropriate because global temperature records exhibit little auto-regression, since summer temperatures in one hemisphere are compensated by winter in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend. Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.

RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it at remss.com/blog/recent-slowing-rise-global-temperatures. Dr Mears’ results are summarized in Fig. T1:

Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of Chichón (1983) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998.

Dr Mears writes: “The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation. This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.”

Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph: “Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades. Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site.
Is this really your data?’ While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate. … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.”

In fact, the spike in temperatures caused by the Great el Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself. Curiously, Dr Mears prefers the much-altered terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite record to calibrate its own terrestrial record.

The length of the Great Pause in global warming, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed. It remains possible that el Niño-like conditions may prevail this year, reducing the length of the Great Pause. However, the discrepancy between prediction and observation continues to widen.

Sources of the IPCC projections in Figs. 2 and 3

IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded: “Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change.
… There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.”

That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted.

In 1990, the IPCC said this: “Based on current models we predict: under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade); this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii).

Later, the IPCC said: “The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv).

The orange region in Fig. 2 represents the IPCC’s less extreme medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025, rather than its more extreme Scenario-A estimate, i.e. 1.8 [1.3, 3.7] K by 2030. It has been suggested that the IPCC did not predict the straight-line global warming rate that is shown in Figs. 2-3. In fact, however, its predicted global warming over so short a term as the 25 years from 1990 to the present differs little from a straight line (Fig. T2).

Figure T2.
Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii).

Because this difference between a straight line and the slight uptick in the warming rate the IPCC predicted over the period 1990-2025 is so small, one can look at it another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be twice as much warming in the next ten years as there was in the last 25 years. That is not likely. Likewise, to reach 1.8 K by 2030, there would have to be four or five times as much warming in the next 15 years as there was in the last 25 years. That is still less likely.

But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3).

Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990).

Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because for all the talk about CO2 emissions reduction the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date.

True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted. Here, too, all of the predictions were extravagantly baseless. The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn.

Figure T4.
Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the least prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality.

To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean surface temperature anomalies – is 0.27 Cº, equivalent to less than 1.1 Cº/century. The IPCC’s central estimate of 0.71 Cº, equivalent to 2.8 Cº/century, that was predicted for Scenario A in IPCC (1990) with “substantial confidence” was two and a half times too big. In fact, the outturn is visibly well below even the least estimate.

In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was 2.8 Cº/century equivalent. Now it is just 1.7 Cº/century equivalent – and, as Fig. T5 shows, even that is proving to be a substantial exaggeration.

Is the ocean warming?

One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean and, since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date.
Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a 100,000-square-kilometre box 316 km on a side and 2 km deep. Plainly, results on the basis of a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of taking a single temperature and salinity profile at a single point in Lake Superior less than once a year) are not going to be much better than guesswork.

Unfortunately ARGO seems not to have updated the ocean dataset since December 2014. However, what we have gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is equivalent to just 0.02 Cº decade⁻¹, or 0.2 Cº century⁻¹.

Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow).

Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, which makes the change seem a whole lot larger. The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming. All those “Hiroshima bombs of heat” of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize.

Figure T6. Ocean heat content change, 1957-2013, in Zettajoules from NOAA’s NODC Ocean Climate Lab: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT, with the heat content values converted back to the ocean temperature changes in Kelvin that were originally measured.
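The conversion back from heat content to temperature, as in Fig. T6, is bulk thermodynamics: energy divided by (mass × specific heat). A rough check with round numbers for the 0-2000 m layer – the area, density and heat-capacity values below are textbook approximations, not NOAA’s exact constants:

```python
# Spread ~260 ZJ of heat (1970-2014) through the 0-2000 m ocean layer
# and see what temperature change it implies.
AREA = 3.6e14      # global ocean surface area, m^2 (approximate)
DEPTH = 2000.0     # layer depth, m
RHO = 1030.0       # seawater density, kg/m^3 (approximate)
CP = 3985.0        # specific heat of seawater, J/(kg K) (approximate)

heat_zj = 260.0                          # ocean heat content change, ZJ
mass = AREA * DEPTH * RHO                # kg of water in the layer
delta_t = heat_zj * 1e21 / (mass * CP)   # temperature change, Kelvin

years = 2014 - 1970
print(round(delta_t, 3))               # ≈ 0.088 K in total
print(round(delta_t / years * 100, 2))  # ≈ 0.2 K/century equivalent
```

On these round numbers the 260 ZJ works out to under a tenth of a Kelvin over 44 years, consistent with the 0.2 K/century figure quoted in the text.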
NOAA’s conversion of the minuscule warming data to Zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is.

Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and that of the ARGO system. Over the period of ARGO data, from 2004-2014, the NOAA data imply that the oceans are warming at 0.05 Cº decade⁻¹, equivalent to 0.5 Cº century⁻¹, or rather more than double the rate shown by ARGO. ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution.

What one can say is that, on such evidence as these datasets are capable of providing, the difference between the underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way. On these data, too, there is no evidence of rapid or catastrophic ocean warming.

Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata, or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth. Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean.
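The sparse-sampling argument earlier in this note rests on simple geometry: some 3,600 floats sharing the upper 2 km of roughly 3.6 × 10⁸ km² of ocean. A quick check of the quoted figures:

```python
# Geometry behind the "one buoy per 200,000 cubic kilometres" claim.
ocean_area_km2 = 3.6e8   # approximate global ocean surface area, km^2
depth_km = 2.0           # ARGO profiling depth, km
floats = 3600            # approximate number of ARGO floats

volume_per_float = ocean_area_km2 * depth_km / floats  # km^3 per float
side_km = (ocean_area_km2 / floats) ** 0.5             # side of each float's patch

print(round(volume_per_float))  # ≈ 200,000 km^3
print(round(side_km))           # ≈ 316 km
```

The 316 km square works out to about 100,000 square kilometres of surface per float, which is where the box dimensions quoted in the text come from.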
Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere. If the “deep heat” explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has.

Why were the models’ predictions exaggerated?

In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T7):

Figure T7. Predicted manmade radiative forcings (IPCC, 1990).

However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations. As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990:

Figure T8: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013).

Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropogenic net forcing) is only 0.6 Watts per square meter (Fig. T9):

Figure T9. Energy budget diagram for the Earth from Stephens et al. (2012).

In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission.

It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (go to scibull.com and click “Most Read Articles”). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC.

Professor Ray Bates will shortly give a paper in Moscow in which he will conclude, based on the analysis by Lindzen & Choi (2009, 2011) (Fig. T10), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both by Lindzen & Choi and by Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling.

Figure T10. Reality (center) vs. 11 models. From Lindzen & Choi (2009).

A growing body of reviewed papers find climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling that was first put forward in the Charney Report of 1979 for the U.S. National Academy of Sciences, and is still the IPCC’s best estimate today. On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions.
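The sensitivity dispute above turns on two standard relations: the simplified CO2 forcing expression ΔF = 5.35 ln(C/C₀) W m⁻² used in the IPCC reports, and ΔT = λΔF. The sketch below uses the zero-feedback (Planck) λ of roughly 0.31 K per W m⁻²; the feedback multiplier applied to that figure is precisely what separates estimates near 1 Cº per doubling from the IPCC's 3 Cº central estimate, and is not settled by this arithmetic:

```python
import math

def co2_forcing(c, c0):
    """Simplified radiative forcing from a CO2 change, in W/m^2."""
    return 5.35 * math.log(c / c0)

forcing_per_doubling = co2_forcing(2.0, 1.0)  # ≈ 3.71 W/m^2 per CO2 doubling
planck_lambda = 0.31                          # K per (W/m^2), zero-feedback value

print(round(forcing_per_doubling, 2))                  # 3.71
print(round(planck_lambda * forcing_per_doubling, 1))  # ≈ 1.1 K per doubling, before feedbacks
```

With no feedbacks at all, a doubling yields a little over 1 K; net-positive feedbacks scale that up and net-negative feedbacks scale it down, which is the entire substance of the disagreement described in the text.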
