Global warming standstill/pause increases to ‘a new record length’: 18 years 6 months

By: Climate Depot - June 3, 2015 5:37 PM - with 32 comments

Special to Climate Depot

El Niño strengthens: the Pause lengthens

Global temperature update: no warming for 18 years 6 months

By Christopher Monckton of Brenchley

For 222 months, since December 1996, there has been no global warming at all (Fig. 1). This month’s RSS temperature – still unaffected by a slowly strengthening el Niño, which will eventually cause temporary warming – passes another six-month milestone, and establishes a new record length for the Pause: 18 years 6 months.

What is more, the IPCC’s centrally-predicted warming rate since its First Assessment Report in 1990 is now more than two and a half times the measured rate. On any view, the predictions on which the entire climate scare was based were extreme exaggerations.

However, it is becoming ever more likely that the temperature increase that usually accompanies an el Niño may come through after a lag of four or five months. The Pause may yet shorten somewhat, just in time for the Paris climate summit, though a subsequent La Niña would be likely to bring about a resumption of the Pause.


Figure 1. The least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere temperature anomaly dataset shows no global warming for 18 years 6 months since December 1996.

The hiatus period of 18 years 6 months is the farthest back one can go in the RSS satellite temperature record and still show a sub-zero trend. Note that the start date is not cherry-picked: it is calculated. And the graph does not mean there is no such thing as global warming. Going back further shows a small warming rate.

The divergence between the models’ predictions in 1990 (Fig. 2) and 2005 (Fig. 3), on the one hand, and the observed outturn, on the other, continues to widen. For the time being, these two graphs will be based on RSS alone, since the text file for the new UAH v.6 dataset is not yet being updated monthly. However, the effect of the recent UAH adjustments – exceptional in that they are the only such adjustments I can recall that reduce the previous trend rather than steepening it – is to bring the UAH dataset very close to that of RSS. There is now a clear distinction between the satellite and terrestrial datasets, particularly since the latter were subjected to adjustments over the past year or two that steepened the apparent rate of warming.


Figure 2. Near-term projections of warming at a rate equivalent to 2.8 [1.9, 4.2] K/century, made with “substantial confidence” in IPCC (1990), for the 305 months January 1990 to May 2015 (orange region and red trend line), vs. observed anomalies (dark blue) and trend (bright blue) at less than 1.1 K/century equivalent, taken as the mean of the RSS and UAH v. 5.6 satellite monthly mean lower-troposphere temperature anomalies.


Figure 3. Predicted temperature change, January 2005 to May 2015, at a rate equivalent to 1.7 [1.0, 2.3] Cº/century (orange zone with thick red best-estimate trend line), compared with the near-zero observed anomalies (dark blue) and real-world trend (bright blue), taken as the mean of the RSS and UAH v. 5.6 satellite lower-troposphere temperature anomalies.

The Technical Note explains the sources of the IPCC’s predictions in 1990 and in 2005, and also demonstrates that, according to the ARGO bathythermograph data, the oceans are warming at a rate equivalent to less than a quarter of a Celsius degree per century.

Key facts about global temperature

  • The RSS satellite dataset shows no global warming at all for 222 months from December 1996 to May 2015 – more than half the 437-month satellite record.
  • The entire RSS dataset from January 1979 to date shows global warming at an unalarming rate equivalent to just 1.2 Cº per century.
  • Since 1950, when a human influence on global temperature first became theoretically possible, the global warming trend has been equivalent to below 1.2 Cº per century.
  • The global warming trend since 1900 is equivalent to 0.8 Cº per century. This is well within natural variability and may not have much to do with us.
  • The fastest warming rate lasting 15 years or more since 1950 occurred over the 33 years from 1974 to 2006. It was equivalent to 2.0 Cº per century.
  • In 1990, the IPCC’s mid-range prediction of near-term warming was equivalent to 2.8 Cº per century, higher by two-thirds than its current prediction of 1.7 Cº/century.
  • The warming trend since 1990, when the IPCC wrote its first report, is equivalent to 1.1 Cº per century. The IPCC had predicted two and a half times as much.
  • Though the IPCC has cut its near-term warming prediction, it has not cut its high-end business-as-usual prediction of 4.8 Cº warming by 2100.
  • The IPCC’s predicted 4.8 Cº warming by 2100 is well over twice the greatest rate of warming lasting more than 15 years that has been measured since 1950.
  • The IPCC’s 4.8 Cº-by-2100 prediction is four times the observed real-world warming trend since we might in theory have begun influencing it in 1950.
  • The oceans, according to the 3600+ ARGO bathythermograph buoys, are warming at a rate of just 0.02 Cº per decade, equivalent to 0.23 Cº per century.
  • Recent extreme-weather events cannot be blamed on global warming, because there has not been any global warming to speak of. It is as simple as that.

Technical note

Our latest topical graph shows the least-squares linear-regression trend on the RSS satellite monthly global mean lower-troposphere dataset for as far back as it is possible to go and still find a zero trend. The start-date is not “cherry-picked” so as to coincide with the temperature spike caused by the 1998 el Niño. Instead, it is calculated so as to find the longest period with a zero trend.
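
As a rough sketch of that calculation (not the author’s own code; the use of Python/NumPy and the variable names are illustrative assumptions), one can scan forward through the monthly series and report the earliest start month from which the least-squares trend to the present is still zero or below:

```python
import numpy as np

def earliest_zero_trend_start(anomalies):
    """Return the index of the earliest month from which the least-squares
    trend through the final month is zero or below, or None if none exists.

    `anomalies` is assumed to be a 1-D array of monthly global mean
    lower-troposphere anomalies in chronological order.
    """
    t = np.arange(len(anomalies))
    for start in range(len(anomalies) - 2):        # need at least 3 points for a trend
        slope = np.polyfit(t[start:], anomalies[start:], 1)[0]
        if slope <= 0:
            return start                           # the earliest qualifying start month
    return None

# e.g. start = earliest_zero_trend_start(np.loadtxt("rss_monthly_anomalies.txt"))
# (hypothetical local copy of the RSS monthly series, one value per line)
```

Applied to an RSS series ending in May 2015, such a scan would, on the figures given above, stop at December 1996.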

The satellite datasets are arguably less unreliable than other datasets in that they show the 1998 Great El Niño more clearly than the terrestrial records do. The Great el Niño, like its two predecessors in the past 300 years, caused widespread global coral bleaching, providing independent verification that the satellite datasets are better able than others to capture such fluctuations without artificially filtering them out.

Terrestrial temperatures are measured by thermometers. Thermometers correctly sited in rural areas away from manmade heat sources show warming rates below those that are published. The satellite datasets are based on reference measurements made by the most accurate thermometers available – platinum resistance thermometers, which provide an independent verification of the temperature measurements by checking, via spaceward mirrors, the known temperature of the cosmic background radiation, which is about 1% of the freezing point of water measured in kelvins, or just 2.73 degrees above absolute zero. It was by measuring minuscule variations in the cosmic background radiation that NASA’s anisotropy probe determined the age of the Universe: 13.82 billion years.

The RSS graph (Fig. 1) is accurate. The data are lifted monthly straight from the RSS website. A computer algorithm reads them down from the text file and plots them automatically, using an advanced routine that adjusts the aspect ratio of the data window on both axes so as to show the data at maximum scale, for clarity.

The latest monthly data point is visually inspected to ensure that it has been correctly positioned. The light blue trend line plotted across the dark blue spline-curve that shows the actual data is determined by the method of least-squares linear regression, which calculates the y-intercept and slope of the line.
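
A minimal sketch of that trend calculation, assuming the monthly anomalies (in Cº) are held in an array `anoms` (an illustrative name and a hypothetical input file, not the author’s):

```python
import numpy as np

# Hypothetical local copy of the RSS monthly anomaly series (one value per line, in Cº).
anoms = np.loadtxt("rss_monthly_anomalies.txt")

t = np.arange(len(anoms))                      # month index 0, 1, 2, ...
slope_per_month, intercept = np.polyfit(t, anoms, 1)

trend_line = slope_per_month * t + intercept   # the light blue trend line
rate_per_century = slope_per_month * 1200      # 1200 months per century
```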

The IPCC and most other agencies use linear regression to determine global temperature trends. Professor Phil Jones of the University of East Anglia recommends it in one of the Climategate emails. The method is appropriate because global temperature records exhibit little auto-regression, since summer temperatures in one hemisphere are offset by winter temperatures in the other. Therefore, an AR(n) model would generate results little different from a least-squares trend.
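
One quick way to probe the low-autoregression claim (a sketch only, not the verification described below) is to detrend the series and compute the lag-1 autocorrelation of the residuals:

```python
import numpy as np

# `anoms`: the monthly anomaly series, loaded as in the previous sketch (hypothetical file).
anoms = np.loadtxt("rss_monthly_anomalies.txt")

t = np.arange(len(anoms))
slope, intercept = np.polyfit(t, anoms, 1)
residuals = anoms - (slope * t + intercept)

lag1 = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]
# A lag-1 autocorrelation near zero supports a plain least-squares trend;
# a value near one would argue for an AR(n) model instead.
```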

Dr Stephen Farish, Professor of Epidemiological Statistics at the University of Melbourne, kindly verified the reliability of the algorithm that determines the trend on the graph and the correlation coefficient, which is very low because, though the data are highly variable, the trend is flat.

RSS itself is now taking a serious interest in the length of the Great Pause. Dr Carl Mears, the senior research scientist at RSS, discusses it on the RSS website.

Dr Mears’ results are summarized in Fig. T1:


Figure T1. Output of 33 IPCC models (turquoise) compared with measured RSS global temperature change (black), 1979-2014. The transient coolings caused by the volcanic eruptions of El Chichón (1983) and Pinatubo (1991) are shown, as is the spike in warming caused by the great el Niño of 1998.

Dr Mears writes:

“The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation.  This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.”

Dr Mears concedes the growing discrepancy between the RSS data and the models, but he alleges “cherry-picking” of the start-date for the global-temperature graph:

“Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades.  Because of this, we are getting a lot of questions along the lines of ‘I saw this plot on a denialist web site.  Is this really your data?’  While some of these reports have ‘cherry-picked’ their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate.  … The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.”

In fact, the spike in temperatures caused by the Great el Niño of 1998 is almost entirely offset in the linear-trend calculation by two factors: the not dissimilar spike of the 2010 el Niño, and the sheer length of the Great Pause itself.

Curiously, Dr Mears prefers the much-altered terrestrial datasets to the satellite datasets. The UK Met Office, however, uses the satellite record to calibrate its own terrestrial record.

The length of the Great Pause in global warming, significant though it now is, is of less importance than the ever-growing discrepancy between the temperature trends predicted by models and the far less exciting real-world temperature change that has been observed. It remains possible that el Niño-like conditions may prevail this year, reducing the length of the Great Pause. However, the discrepancy between prediction and observation continues to widen.

Sources of the IPCC projections in Figs. 2 and 3

IPCC’s First Assessment Report predicted that global temperature would rise by 1.0 [0.7, 1.5] Cº to 2025, equivalent to 2.8 [1.9, 4.2] Cº per century. The executive summary asked, “How much confidence do we have in our predictions?” IPCC pointed out some uncertainties (clouds, oceans, etc.), but concluded:

“Nevertheless, … we have substantial confidence that models can predict at least the broad-scale features of climate change. … There are similarities between results from the coupled models using simple representations of the ocean and those using more sophisticated descriptions, and our understanding of such differences as do occur gives us some confidence in the results.”

That “substantial confidence” was substantial over-confidence. For the rate of global warming since 1990 – the most important of the “broad-scale features of climate change” that the models were supposed to predict – is now below half what the IPCC had then predicted.

In 1990, the IPCC said this:

“Based on current models we predict:

“under the IPCC Business-as-Usual (Scenario A) emissions of greenhouse gases, a rate of increase of global mean temperature during the next century of about 0.3 Cº per decade (with an uncertainty range of 0.2 Cº to 0.5 Cº per decade), this is greater than that seen over the past 10,000 years. This will result in a likely increase in global mean temperature of about 1 Cº above the present value by 2025 and 3 Cº before the end of the next century. The rise will not be steady because of the influence of other factors” (p. xii).

Later, the IPCC said:

“The numbers given below are based on high-resolution models, scaled to be consistent with our best estimate of global mean warming of 1.8 Cº by 2030. For values consistent with other estimates of global temperature rise, the numbers below should be reduced by 30% for the low estimate or increased by 50% for the high estimate” (p. xxiv).

The orange region in Fig. 2 represents the IPCC’s less extreme medium-term Scenario-A estimate of near-term warming, i.e. 1.0 [0.7, 1.5] K by 2025, rather than its more extreme Scenario-A estimate, i.e. 1.8 [1.3, 3.7] K by 2030.

It has been suggested that the IPCC did not predict the straight-line global warming rate that is shown in Figs. 2-3. In fact, however, its predicted global warming over so short a term as the 25 years from 1990 to the present differs little from a straight line (Fig. T2).


Figure T2. Historical warming from 1850-1990, and predicted warming from 1990-2100 on the IPCC’s “business-as-usual” Scenario A (IPCC, 1990, p. xxii).

Because this difference between a straight line and the slight uptick in the warming rate the IPCC predicted over the period 1990-2025 is so small, one can look at it another way. To reach the 1 K central estimate of warming since 1990 by 2025, there would have to be twice as much warming in the next ten years as there was in the last 25 years. That is not likely.

Likewise, to reach 1.8 K by 2030, there would have to be four or five times as much warming in the next 15 years as there was in the last 25 years. That is still less likely.

But is the Pause perhaps caused by the fact that CO2 emissions have not been rising anything like as fast as the IPCC’s “business-as-usual” Scenario A prediction in 1990? No: CO2 emissions have risen rather above the Scenario-A prediction (Fig. T3).


Figure T3. CO2 emissions from fossil fuels, etc., in 2012, from Le Quéré et al. (2014), plotted against the chart of “man-made carbon dioxide emissions”, in billions of tonnes of carbon per year, from IPCC (1990).

Plainly, therefore, CO2 emissions since 1990 have proven to be closer to Scenario A than to any other case, because, for all the talk about CO2 emissions reduction, the fact is that the rate of expansion of fossil-fuel burning in China, India, Indonesia, Brazil, etc., far outstrips the paltry reductions we have achieved in the West to date.

True, methane concentration has not risen as predicted in 1990 (Fig. T4), for methane emissions, though largely uncontrolled, are simply not rising as the models had predicted. Here, too, all of the predictions were extravagantly baseless.

The overall picture is clear. Scenario A is the emissions scenario from 1990 that is closest to the observed CO2 emissions outturn.


Figure T4. Methane concentration as predicted in four IPCC Assessment Reports, together with (in black) the observed outturn, which is running along the bottom of the least prediction. This graph appeared in the pre-final draft of IPCC (2013), but had mysteriously been deleted from the final, published version, inferentially because the IPCC did not want to display such a plain comparison between absurdly exaggerated predictions and unexciting reality.

To be precise, a quarter-century after 1990, the global-warming outturn to date – expressed as the least-squares linear-regression trend on the mean of the RSS and UAH monthly global mean lower-troposphere temperature anomalies – is 0.27 Cº, equivalent to less than 1.1 Cº/century. The IPCC’s central estimate of 0.71 Cº, equivalent to 2.8 Cº/century, predicted with “substantial confidence” for Scenario A in IPCC (1990), was two and a half times too big. In fact, the outturn is visibly well below even the least estimate.
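
The per-century equivalences quoted here follow from straightforward scaling of the figures in the text (a worked check only):

```python
months = 305                     # January 1990 to May 2015
years = months / 12.0            # about 25.4 years

observed_warming = 0.27          # Cº, least-squares trend outturn quoted above
ipcc_central = 0.71              # Cº, Scenario-A central estimate over the same span

observed_rate = observed_warming / years * 100   # about 1.06 Cº/century
predicted_rate = ipcc_central / years * 100      # about 2.79 Cº/century
ratio = predicted_rate / observed_rate           # about 2.6, i.e. roughly two and a half times
```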

In 1990, the IPCC’s central prediction of the near-term warming rate was higher by two-thirds than its prediction is today. Then it was equivalent to 2.8 Cº/century. Now it is just 1.7 Cº/century – and, as Fig. T5 shows, even that is proving to be a substantial exaggeration.

Is the ocean warming?

One frequently-discussed explanation for the Great Pause is that the coupled ocean-atmosphere system has continued to accumulate heat at approximately the rate predicted by the models, but that in recent decades the heat has been removed from the atmosphere by the ocean. Since globally the near-surface strata show far less warming than the models had predicted, it is hypothesized that what is called the “missing heat” has traveled to the little-measured abyssal strata below 2000 m, whence it may emerge at some future date.

Actually, it is not known whether the ocean is warming: each of the 3600 automated ARGO bathythermograph buoys takes just three measurements a month in 200,000 cubic kilometres of ocean – roughly a 100,000-square-kilometre box more than 316 km square and 2 km deep. Plainly, results at a resolution that sparse (which, as Willis Eschenbach puts it, is approximately the equivalent of taking a single temperature and salinity profile at a single point in Lake Superior less than once a year) are not going to be a lot better than guesswork.
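
The per-buoy sampling volume quoted above can be reproduced from round figures (a sketch; the ocean surface area and the 2 km profiling depth are assumptions, not ARGO specifications):

```python
ocean_area_km2 = 3.6e8           # approximate global ocean surface area
profile_depth_km = 2.0           # depth range sampled by each buoy
n_buoys = 3600

volume_per_buoy_km3 = ocean_area_km2 * profile_depth_km / n_buoys   # about 200,000 km^3
box_side_km = (ocean_area_km2 / n_buoys) ** 0.5                     # about 316 km on a side
```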

Unfortunately, ARGO seems not to have updated the ocean dataset since December 2014. However, what we have gives us 11 full years of data. Results are plotted in Fig. T5. The ocean warming, if ARGO is right, is just 0.02 Cº per decade, equivalent to 0.2 Cº per century.


Figure T5. The entire near-global ARGO 2 km ocean temperature dataset from January 2004 to December 2014 (black spline-curve), with the least-squares linear-regression trend calculated from the data by the author (green arrow).

Finally, though the ARGO buoys measure ocean temperature change directly, before publication NOAA craftily converts the temperature change into zettajoules of ocean heat content change, a conversion that makes the change seem a whole lot larger.

The terrifying-sounding heat content change of 260 ZJ from 1970 to 2014 (Fig. T6) is equivalent to just 0.2 K/century of global warming. All those “Hiroshima bombs of heat” of which the climate-extremist websites speak are a barely discernible pinprick. The ocean and its heat capacity are a lot bigger than some may realize.
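
The equivalence between 260 ZJ over 1970-2014 and roughly 0.2 K/century can be checked with round numbers for the 0-2000 m layer (the area, depth, density and specific heat below are assumptions used for the sketch, not NOAA’s values):

```python
heat_gain_J = 260e21             # 260 ZJ, the figure quoted in the text
years = 2014 - 1970              # 44 years

ocean_area_m2 = 3.6e14           # approximate global ocean surface area
layer_depth_m = 2000.0           # 0-2000 m layer
density_kg_m3 = 1025.0           # seawater density (approximate)
specific_heat = 4000.0           # J per kg per kelvin (approximate)

layer_mass_kg = ocean_area_m2 * layer_depth_m * density_kg_m3
delta_T = heat_gain_J / (layer_mass_kg * specific_heat)    # about 0.09 K over 44 years
rate_per_century = delta_T / years * 100                   # about 0.2 K/century
```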


Figure T6. Ocean heat content change, 1957-2013, in zettajoules, from NOAA’s NODC Ocean Climate Lab, with the heat content values converted back to the ocean temperature changes in kelvin that were originally measured. NOAA’s conversion of the minuscule warming data to zettajoules, combined with the exaggerated vertical aspect of the graph, has the effect of making a very small change in ocean temperature seem considerably more significant than it is.

Converting the ocean heat content change back to temperature change reveals an interesting discrepancy between NOAA’s data and those of the ARGO system. Over the period of ARGO data, from 2004 to 2014, the NOAA data imply that the oceans are warming at 0.05 Cº per decade, equivalent to 0.5 Cº per century, or rather more than double the rate shown by ARGO.

ARGO has the better-resolved dataset, but since the resolutions of all ocean datasets are very low one should treat all these results with caution. What one can say is that, on such evidence as these datasets are capable of providing, the difference between the underlying warming rate of the ocean and that of the atmosphere is not statistically significant, suggesting that if the “missing heat” is hiding in the oceans it has magically found its way into the abyssal strata without managing to warm the upper strata on the way. On these data, too, there is no evidence of rapid or catastrophic ocean warming.

Furthermore, to date no empirical, theoretical or numerical method, complex or simple, has yet successfully specified mechanistically either how the heat generated by anthropogenic greenhouse-gas enrichment of the atmosphere has reached the deep ocean without much altering the heat content of the intervening near-surface strata, or how the heat from the bottom of the ocean may eventually re-emerge to perturb the near-surface climate conditions relevant to land-based life on Earth.

Most ocean models used in performing coupled general-circulation model sensitivity runs simply cannot resolve most of the physical processes relevant for capturing heat uptake by the deep ocean. Ultimately, the second law of thermodynamics requires that any heat which may have accumulated in the deep ocean will dissipate via various diffusive processes. It is not plausible that any heat taken up by the deep ocean will suddenly warm the upper ocean and, via the upper ocean, the atmosphere.

If the “deep heat” explanation for the Pause were correct (and it is merely one among dozens that have been offered), the complex models have failed to account for it correctly: otherwise, the growing discrepancy between the predicted and observed atmospheric warming rates would not have become as significant as it has.

Why were the models’ predictions exaggerated?

In 1990 the IPCC predicted – on its business-as-usual Scenario A – that from the Industrial Revolution till the present there would have been 4 Watts per square meter of radiative forcing caused by Man (Fig. T7):


Figure T7. Predicted manmade radiative forcings (IPCC, 1990).

However, from 1995 onward the IPCC decided to assume, on rather slender evidence, that anthropogenic particulate aerosols – mostly soot from combustion – were shading the Earth from the Sun to a large enough extent to cause a strong negative forcing. It has also now belatedly realized that its projected increases in methane concentration were wild exaggerations. As a result of these and other changes, it now estimates that the net anthropogenic forcing of the industrial era is just 2.3 Watts per square meter, or little more than half its prediction in 1990:


Figure T8: Net anthropogenic forcings, 1750 to 1950, 1980 and 2012 (IPCC, 2013).

Even this, however, may be a considerable exaggeration. For the best estimate of the actual current top-of-atmosphere radiative imbalance (total natural and anthropogenic net forcing) is only 0.6 Watts per square meter (Fig. T9):


Figure T9. Energy budget diagram for the Earth from Stephens et al. (2012).

In short, most of the forcing predicted by the IPCC is either an exaggeration or has already resulted in whatever temperature change it was going to cause. There is little global warming in the pipeline as a result of our past and present sins of emission.

It is also possible that the IPCC and the models have relentlessly exaggerated climate sensitivity. One recent paper on this question is Monckton of Brenchley et al. (2015), which found climate sensitivity to be in the region of 1 Cº per CO2 doubling (see the journal’s “Most Read Articles” list). The paper identified errors in the models’ treatment of temperature feedbacks and their amplification, which account for two-thirds of the equilibrium warming predicted by the IPCC.

Professor Ray Bates will shortly give a paper in Moscow in which he will conclude, based on the analysis by Lindzen & Choi (2009, 2011) (Fig. T10), that temperature feedbacks are net-negative. Accordingly, he supports the conclusion both by Lindzen & Choi and by Spencer & Braswell (2010, 2011) that climate sensitivity is below – and perhaps considerably below – 1 Cº per CO2 doubling.


Figure T10. Reality (center) vs. 11 models. From Lindzen & Choi (2009).

A growing body of peer-reviewed papers finds climate sensitivity considerably below the 3 [1.5, 4.5] Cº per CO2 doubling that was first put forward in the Charney Report of 1979 for the U.S. National Academy of Sciences and that is still the IPCC’s best estimate today.

On the evidence to date, therefore, there is no scientific basis for taking any action at all to mitigate CO2 emissions.


  • Ghostmaker

    Those satellites are bad and we all know that because that is what a liberal web site told me.

    • planet8788

      And if you try to argue with them, they will ban you.

  • Steve Case

    I hear that the so-called pause will be disappeared tomorrow.

  • docgee

    Congratulations Mark. And thank you Christopher.

    “Note that the start date is not cherry-picked: it is calculated.” And thank you again. Here’s what I had to say on this issue in my new book:

    “Yes, 1998 was a particularly warm year, and yes, there
    was an exceptionally strong El Niño beginning in 1997. However, if we study the Hadcrut4 data (or any of the other data from other sources) we see a steady uptrend beginning in the late 70s and climaxing in 1998. The exceptionally warm “El Niño year” was clearly part of the same trend. We have no way of knowing how much of the warming was produced by El Niño and how much was produced by whatever it was that prompted the unusual upswing during the years leading up to that El Niño. And since El Niños have occurred on a cyclic basis over many years in the past, and are one of many perfectly natural factors affecting climate, I see no reason why this one El Niño year should be discounted.

    As is evident from any of the temperature graphs, 1998 was an obvious turning point between a steep upturn in warming and the leveling
    off that followed. Cherry picking involves choosing certain data points arbitrarily simply to “show what you want . . . to show,” yes. But selecting a particular year because it clearly represents a turning point is the opposite of cherry picking because it is not arbitrary but meaningful.”

    Forgive me for promoting my book here, but if not here, then where?

  • Foctris

    Arguing with the Jesuits of AGW is pointless. ANYTIME you point out that their data is cooked they simply point out the data is not cooked and the data is perfect no matter how cooked the data obviously is.

    • warnerathey

      Unfortunately it looks like you are right. It is almost pointless to get into an argument (discussion) with a zealot. No matter what happens, to them it is proof of man-caused CO2 global warming. If there is a drought, it’s proof of man-made CO2 global warming. Now on the other hand if there is a flood, well there you go, more proof of man-made CO2 global warming. No matter what it is, it must be proof of global warming. Now on the other hand if there is no global warming for 15 years, that means nothing. You must be cherry-picking. Also they can do some adjustments to the data to make some global warming if they have to. So it is pointless to discuss anything with a zealot. You will not get a straight answer or any real information from them.
      Now on the other hand we need to have a discussion among normal people. We need to get to the bottom of it. If there is anything to global warming we need to find out. If it turns out to be a fraud we need to find that out also. Thanks.

      • Foctris

        And that is my point in all this too.

  • Matthew Shadle

    “The hiatus period of 18 years 6 months is the farthest back one can go
    in the RSS satellite temperature record and still show a sub-zero trend.
    Note that the start date is not cherry-picked: it is calculated. And
    the graph does not mean there is no such thing as global warming. Going
    back further shows a small warming rate.”

    Calculating your starting point so that you get the desired result- a zero trend- sounds like the definition of cherry-picking. Especially if, as you admit, when you extend it out just a bit there is actually a warming trend.

    What you don’t mention is that only in March of last year you were claiming that the global warming pause began in August of 1996, but now apparently it begins in December of 1996. If this was truly a pause with “no global warming,” it is hard to understand why the starting date would move forward. But the reason is that the past couple of years have actually been warmer, so you had to cut off the colder dates at the beginning of your original graph; otherwise it would show that during the period you had claimed was a pause, there has been warming after all.

    You have created a version of Zeno’s arrow paradox: Zeno concluded that since at any given moment, of short enough duration, an arrow appears motionless, the arrow was therefore always motionless.

    • Dave Randall

      You seem to be ignoring the essential fact that if global warming were caused by increases in CO2 concentrations, we should see a somewhat linear increase in global temperatures, regardless of what point we start measuring. This clearly is NOT taking place.

      • Tim Mantyla

        Not necessarily true that a “linear increase in global temperatures” would ensue–especially if other mechanisms like oceans taking up heat at different rates apply.

        Why are you trying to artificially (and unilaterally) simplify what’s already known to be a very complex process with many impinging factors?

        Are you such a renowned, credentialed, experienced climate scientist yourself that you can authoritatively pronounce and decide which factors are “essential facts?”

        Wouldn’t it be more scientific to look at trends–whether or not they corroborate known science or current theories–correlate ALL known facts and theories, then come up with a well reasoned theory to accommodate that?

        It’s what real scientists do.

        Where have you published your scientific findings?

        • Dave Randall

          Ah, an appeal to authority – always the hallmark of a third-rate mind.

      • flowirin

        because carbon dioxide increases entropic cooling

    • Uzi1

      Right, the bishops of cherry-picking are accusing the dissenters of cherry-picking? You’re charging Monckton et al with doing what you warmists have already done- ad nauseam/ad infinitum. The reason for renaming your pogrom “climate change” is because you know warming occurs in a cyclical pattern. Plus the definition of climate change is so broad it allows any variation in weather to support your “cause”—truly a fool’s paradise…………………………..

    • Tim Mantyla

      Nicely put, Matthew.

      • John Hart

        As soon as one resorts to ad hominem attacks, one loses the argument, and calling those who disagree with a theory “deniers” instead of addressing their argument is the truest sign of a useful idiot. I went to the source of your information, mediamatters, and found it had little factual information and was filled with personal attacks. Instead of explaining why someone might think CO2 would cool the Earth, not heat it, and logically explaining what is wrong with the theory, the mediamatters site was filled with crap, implying the idea was fabricated by the same people who lied about cancer, now working for coal companies.

        Questions the site needs to answer to have any credibility:
        How much heat leaves the Earth by convection?
        How much leaves by radiation?
        Does the solar wind shield the Earth from cosmic rays?
        How many absorption lines in the IR spectrum does CO2 have?

        Do cosmic rays affect cloud formation?
        Does ice core data show CO2 lagging or leading temperature?
        Does averaging different models reduce error?
        How much CO2 comes from the oceans at the equator?
        How much CO2 goes into the oceans at the poles?
        How many studies have been done to determine if CO2 might be beneficial?

  • Rad Matic

    Personally, I’d take it back 10,000 years since before the Holocene Optimum.

  • Tim Mantyla

    Despite the pseudo-scientific appearance of this article, its conclusions are demonstrably false.

    It fails to recognize the storage and subsequent heating of oceans at ever greater depths to soak up heat that’s not evidenced at the surface. Ignoring this huge temperature sink is like saying Miguel Cabrera is a poor hitter–while counting ONLY his singles. A real scientist could pick apart Monckton’s arguments in his sleep. Plus he’s corrupt, having accepted funding from sources historically hostile to science in general and the science of AGW in particular. Why not ask him who’s paying him now and did so in the recent past–and investigate the conflicts of interest among those sources. Very enlightening.

    But…I’m guessing this isn’t a popular hangout for qualified scientists. Is it?

    I trust real climate science over cherry-picking by rogue scientists and nonscientific self-promoting “experts” who lie, promote myths and disinformation about AGW.

    Best sources include:
    The IPCC


    “There is no standstill in global warming,” said World Meteorological Organization (WMO) Secretary-General Michel Jarraud. “The warming of our oceans has accelerated and at lower depths. More than 90 percent of the excess energy trapped by greenhouse gases is stored in the oceans. Levels of these greenhouse gases are at record levels, meaning that our atmosphere and oceans will continue to warm for centuries to come.”

    • Dave Randall

      Do you grow mushrooms for a living? “More than 90 percent of the excess energy trapped by greenhouse gases is stored in the oceans.” There’s simply no evidence of that. Call me back when the Maldives are underwater, Timmy, and then maybe I’ll listen to what you have to say.

  • Tim Mantyla

    I will ask:

    Who exactly has, recently and in the past, paid you, Mr. Monckton, to dispute established climate science by experts far more skilled in actual science than you?

  • Tim Mantyla

    “Christopher Monckton is a non-scientist AGW denier, who has had articles published in The Guardian and in a non-peer-reviewed newsletter[1] of the American Physical Society (whose Council subsequently disagreed with Monckton’s conclusions)[1] claiming that global warming is neither man-made nor likely to be catastrophic. Monckton has made various false claims in the past, such as that he is a member of the British House of Lords,[2] a Nobel Prize winner, inventor of a cure for HIV, winner of a defamation case against George Monbiot and writer of a peer-reviewed article. He was deputy leader of the far right United Kingdom Independence Party (UKIP) before being sacked from the party in 2013.”


    Kinda makes him more look like a zany attention seeker than a scientist, eh?

    If his work is soooo good, why won’t peer reviewed climate science journals publish it??

  • Tim Mantyla

    Monckton’s article–DEBUNKED (got the facts AND the math wrong, apparently):

    “Climate Myth…It hasn’t warmed since 1998: For the years 1998-2005, temperature did not increase. This period coincides with society’s continued pumping of more CO2 into the atmosphere. (Bob Carter)”

    “No, it hasn’t been cooling since 1998. Even if we ignore long term trends and just look at the record-breakers, that wasn’t the hottest year ever. Different reports show that, overall, 2005 was hotter than 1998. What’s more, globally, the hottest 12-month period ever recorded was from June 2009 to May 2010.

    “Though humans love record-breakers, they don’t, on their own, tell us a much about trends — and it’s trends that matter when monitoring Climate Change. Trends only appear by looking at all the data, globally, and taking into account other variables — like the effects of the El Nino ocean current or sunspot activity — not by cherry-picking single points.

    “There’s also a tendency for some people just to concentrate on surface air temperatures when there are other, more useful, indicators that can give us a better idea how rapidly the world is warming. Oceans for instance — due to their immense size and heat storing capability (called ‘thermal mass’) — tend to give a much more ‘steady’ indication of the warming that is happening.

    “Records show that the Earth has been warming at a steady rate before and since 1998 and there is no sign of it slowing any time soon (Figure 1). More than 90% of global warming heat goes into warming the oceans, while less than 3% goes into increasing the surface air temperature.

    “Even if we focus exclusively on global surface temperatures, Cowtan & Way (2013) shows that when we account for temperatures across the entire globe (including the Arctic, which is the part of the planet warming fastest), the global surface warming trend for 1997–2012 is approximately 0.11 to 0.12°C per decade.”

  • flowirin

    carbon dioxide is a cooling gas

  • nikhilsheth

    thanks for this. I wonder if we should also be looking at climate EXTREMES instead of just warming? At least in the part of the world where I live, there is testimony by our senior citizens about the rainfall pattern during rainy season : that earlier we had a more constant precipitation at low or moderate intensity (at times it would rain “nonviolently” but constantly for a whole week or so.. we no longer see rain like that now), but now we are having more dry days and intermittent rainfalls at higher volume. When we average it out, it’s still more or less equal. But that’s not what really matters, is it? Our crops need the constant moderate rainfall, not dry spells with intermittent violent showers that will damage the crops. For me it’s important to know if climate disruption is happening or not, and the most crucial data in this are wind speeds and rain intensity. That too not average, but rather actual day-to-day, hourly values and their variations. Overall, average warming or cooling are irrelevant. At best, the temperature-related data that is important is the DIFFERENCE between max and min temperatures in the day, and over the year, for a given location.

  • ElveTwelfe