
Search Results for: data manipulation – Page 3

Climate change models are vulnerable to manipulation. Biden should take note

http://co2coalition.org/2021/03/17/climate-change-models-are-vulnerable-to-manipulation-biden-should-take-note/

By Kevin Mooney

Climate change models should not be used as policymaking tools because they are based on assumptions that can be easily manipulated to produce desired outcomes, according to economists and data analysts. In January, President Biden issued an executive order that calls for federal agencies to account for the social cost of carbon, which is defined as the economic damages associated with a ton of carbon dioxide emissions over a specified time period. But key assumptions about the climate’s sensitivity to carbon dioxide emissions, future projections of climate impacts, and a mathematical concept known as the discount rate have all come under renewed scrutiny. That’s a problem for Team Biden since these assumptions all figure into the social cost of carbon.

Meanwhile, Biden’s executive actions have also prompted 12 states to file a lawsuit arguing that only Congress, and not the executive branch, can set regulations associated with social costs. “This quintessentially legislative policy has enormous consequences for America’s economy and people,” the lawsuit reads. “In theory, the Biden administration’s calculation of ‘social costs’ would justify imposing trillions of dollars in regulatory costs on the American economy every year to offset these supposed costs.”

There are three main climate change models that weigh the social cost of carbon: the DICE model, the FUND model, and the PAGE model. After examining the DICE and FUND models, researchers with the Heritage Foundation discovered that the social cost of carbon could change significantly based on very moderate changes to the assumptions. “One of the major takeaways we have is from our analysis of the FUND model, which showed that the social cost of carbon could be negative, meaning that the benefits of CO2 emissions could outweigh the costs,” Kevin Dayaratna, the principal statistician and data scientist for the Heritage Foundation, said in an interview. “While the DICE model does not account for potential benefits from climate change, the FUND model does this by pulling in positive agricultural feedbacks that could flow from carbon dioxide emissions.” Since there are “significant policy implications” attached to climate model calculations that show net benefits attached to carbon dioxide emissions, Dayaratna recommends against using the models to set public policy. “Someone could take the position based on these results that the government should be subsidizing rather than taxing CO2,” Dayaratna observes. “We don’t take either position, but we do recommend that models not be used to set regulatory policy since they are open to manipulation.”

On Feb. 26, the Biden administration issued an interim estimate that placed the social cost of carbon at about $51 per metric ton of carbon dioxide emissions, which is in line with what was in place during the Obama administration. By comparison, the Trump administration had placed the social cost of carbon between $1 and $7 per ton of carbon dioxide emissions.
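The sensitivity Dayaratna describes is easiest to see in a stripped-down present-value calculation. The sketch below is not the DICE, FUND, or PAGE model; it simply discounts an invented stream of per-tonne damages at several discount rates to show how the headline number swings on that one assumption alone.

```python
# Toy present-value calculation (not any of the actual SCC models). The damage
# stream below is invented purely to illustrate discount-rate sensitivity:
# assume $1 of damage per tonne of CO2 per year for 300 years.

def present_value(damages, discount_rate):
    """Discount a list of future yearly damages back to today and sum them."""
    return sum(d / (1 + discount_rate) ** t for t, d in enumerate(damages, start=1))

damages = [1.0] * 300  # hypothetical damages, $ per tonne per year

for rate in (0.02, 0.03, 0.05, 0.07):
    scc = present_value(damages, rate)
    print(f"discount rate {rate:.0%}: toy 'social cost' ~ ${scc:.2f} per tonne")
```

With the same invented damage stream, moving the assumed discount rate from 2% to 7% cuts the resulting figure by well over half, which is the kind of swing the Heritage researchers report when assumptions are varied.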
Since top Biden administration officials favor a carbon tax, David Kreutzer, a senior economist with the Institute for Energy Research, is concerned that the discount rate and the social cost of carbon can be “ginned up” by the Biden administration to produce costly regulations that do not generate any significant benefit for the climate. “They could run very sketchy models far out into the future to, say, 2300 and project huge damages and discount those damages at a percentage that extracts a huge cost for present-day taxpayers and consumers,” Kreutzer said. “Some government officials are talking about imposing a $200 per ton carbon tax. This would add $1.80 to the cost of each gallon of gasoline and subtract out another $2,000 per household just based on the impact this would have on gasoline prices. These extra costs would be added on top of the higher electricity prices that consumers would be forced to pay for heating and cooling their homes and other everyday items.”

Kreutzer has fashioned a real-world example of how the discount rate would work to help illustrate how costly and counterproductive climate change policies can be for current and future generations. For anyone familiar with the Little Women television series set in 1861, Kreutzer creates a scenario in which carbon dioxide emissions from that era (possibly from burning a lot of candles) cause climate damage in 2019 valued at $2,000. A discount rate of 1% over that time period yields a value of $415 in 1861. Under this scenario, the social cost of carbon demands that people in 1861 reduce their carbon dioxide emissions by a ton so long as the cost does not exceed $415.

“In a way, we are infantilizing the future,” Kreutzer observes. “We are essentially saying that our grandchildren are no more capable at age 50 than they are at age 5. There’s also a bizarre twist of reasoning at work here that says income transfers from poorer generations to richer ones are somehow more equitable.” If the technological improvements and increased wealth that have occurred since 1861 are any guide to what might materialize over the next 160 years, then now would be a good time to ditch the social cost of carbon.

This article appeared on the Washington Examiner website at https://www.washingtonexaminer.com/opinion/climate-change-models-are-vulnerable-to-manipulation-biden-should-take-note
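Both of Kreutzer's figures can be checked with a couple of lines of arithmetic. The sketch below reproduces the 1861 discounting example from the article and then back-of-envelopes the $1.80-per-gallon figure; the 8.9 kg of CO2 per gallon of gasoline is my assumption (a commonly cited combustion figure), not a number taken from the article.

```python
# Kreutzer's "Little Women" example: $2,000 of damage in 2019, discounted back
# to 1861 at 1% per year.
damage_2019 = 2000.0
years = 2019 - 1861                      # 158 years
pv_1861 = damage_2019 / 1.01 ** years
print(f"present value in 1861: ${pv_1861:.0f}")          # ~ $415, matching the article

# The $200-per-ton carbon tax figure, assuming ~8.9 kg of CO2 per gallon of
# gasoline (assumed here; not stated in the article).
tax_per_tonne = 200.0
kg_co2_per_gallon = 8.9
print(f"added cost per gallon: ${tax_per_tonne * kg_co2_per_gallon / 1000:.2f}")  # ~ $1.78
```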

Top NOAA scientist is removed from his position after he asked new Trump-appointed staff to adhere to the agency’s integrity policy that bans changing research data to fit political agenda

https://www.dailymail.co.uk/news/article-8888735/Trump-administration-fires-NOAA-scientist.html

NOAA acting chief scientist Craig McLean was removed from his position in September
McLean was forced to step down from his role after emailing new Trump appointees about the agency’s ethics policy, according to a NYT report
The NOAA scientific integrity policy prohibits fabrication, falsification, or the manipulation of research data to fit a political agenda
McLean’s email drew a sharp response from Erik Noble, the agency’s new Trump-appointed chief of staff
Noble informed McLean the following morning that he would no longer serve as the agency’s acting chief scientist
He was replaced by Dr Ryan Maue, a climate change critic and former researcher for the Cato Institute, a libertarian think tank

By KAREN RUIZ FOR DAILYMAIL.COM

The National Oceanic and Atmospheric Administration’s top scientist has been fired after he asked new Trump-appointed staff to acknowledge the agency’s scientific integrity policy, according to a new report. Craig McLean, the agency’s acting chief scientist, was dismissed from his role last month shortly after sending an email to the new political appointees, including former White House adviser Erik Noble, the New York Times reported. McLean had reportedly asked them to respect the NOAA’s scientific integrity policy, which prohibits fabrication, falsification and manipulation of research data to fit a political agenda. The message, however, did not sit well with Noble, a former Trump campaign data analyst who was recently appointed NOAA chief of staff.

‘Respectfully, by what authority are you sending this to me?’ Noble replied, according to the report. McLean responded that he was ensuring the agency’s rules were being followed – which was his responsibility as the acting chief scientist for NOAA. The next morning McLean received an email from Noble informing him: ‘You no longer serve as the acting chief scientist for NOAA’, the NYT said. He told McLean the agency had already found his replacement while adding: ‘Thank you for your service.’ A spokesperson for the NOAA told DailyMail.com McLean remains director of the Oceanic and Atmospheric Research branch, which he ran while serving as acting chief scientist.

The NOAA later appointed research meteorologist Dr Ryan Maue, a climate change critic and former researcher for the Cato Institute, a libertarian think tank. Maue is a well-known climate change skeptic who in 2018 wrote an op-ed casting doubt on global warming predictions from NASA scientists 30 years prior. He has also challenged connections between climate change and extreme weather events and most recently criticized Democrats for blaming the devastating California wildfires ‘solely on climate change.’ Maue joined fellow Trump-pick David Legates, who assumed the new role of NOAA deputy assistant secretary last month.
Legates, a former geography professor at the University of Delaware, is also known to be a climate change contrarian, having questioned the notion that human activity is behind global warming. The new political appointees have fueled speculation that the Trump administration could change the direction of the NOAA, which has mostly remained independent, and could undermine scientific research. Trump has clashed with climate experts over the last four years and has even imposed stricter controls on communications at the NOAA – which is within the Department of Commerce run by Wilbur Ross.

In 2019, the president was widely criticized by weather agencies after he tweeted false predictions about the path of Hurricane Dorian. Trump claimed Alabama would be among US states that would ‘most likely be hit (much) harder than anticipated’ by the hurricane, then one of the most powerful Atlantic storms on record. Within minutes, the National Weather Service (NWS) office in Birmingham, Alabama, responded by saying that Alabama would not see any impacts from Dorian. The controversy became known as ‘Sharpiegate,’ after Trump displayed a modified NOAA map, which had been drawn on to depict the storm threatening Alabama. The New York Times reported last year that Ross threatened to fire top employees after the Birmingham office contradicted Trump and that then acting White House chief of staff Mick Mulvaney had directed Ross to order the NOAA to disavow the NWS tweet.

Public Manipulation: German ARD Television Using Red Hot Weather Charts For Showing Cool Temperatures

https://notrickszone.com/2020/06/02/public-manipulation-german-ard-television-using-red-hot-weather-charts-for-showing-cool-temperatures/ By P Gosselin on 2. June 2020

German public television is splashing red on its weather forecast charts to make people think it’s hot. At Facebook, Akinom Dnagiew posted two fascinating side-by-side weather charts used by German ARD public television for weather forecasting: “Do you feel manipulated?” Image: ARD

Readers will notice how in 2009 a calm, neutral chart was used for showing the day’s high temperatures, in Celsius. But then 10 years later, in 2019, the ARD was using a more dramatic and aggressive color scheme to represent temperatures. Now it seems to be hot even when it’s relatively cool! In 2009, in northeast Germany, 27°C is shown without red hot colors. Then in 2019, even a relatively cool 22°C (72°F) gets shown as red hot. The German title of the above chart reads: “Fühlen Sie sich manipuliert?” (“Do you feel manipulated?”) Looking at the chart from 2009, one sees the temperature figures are high, yet we don’t get a hot visual impression. In 2019, on the other hand, a viewer way up in northern Germany might even start sweating a little bit from all the hot-looking red, even though the high is expected to be only 20°C (68°F).

Other examples

What follows is a forecast chart from 2014 on ZDF public television: in the video above, 25°C is designated by a light orange, and not dark red. What follows next is a screen shot from an ARD weather forecast made in 2008: here we see a low-key chart in plain green. No psychological color manipulation – just the numbers.

Taming the red?

Okay, but let’s be fair. This year it seems the ARD has tamed its red colors somewhat. What follows is an image from today’s weather forecast: temperatures in the mid-20s are now shown as light red. Image cropped from the ARD tagesschau.de.
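The complaint here is purely about the color ramp applied to the same numbers. The actual ARD palettes are not published in the post, so the two mappings below are hypothetical; they only illustrate how moving the threshold at which red begins changes how an identical temperature reads on screen.

```python
# Hypothetical color ramps (the real ARD palettes are not given in the post);
# the point is only that lowering the "red" threshold changes how the same
# temperature looks on the map.

def color_2009_style(temp_c):
    # older-style ramp: red reserved for genuinely hot days
    if temp_c >= 30: return "red"
    if temp_c >= 20: return "yellow-green"
    return "green"

def color_2019_style(temp_c):
    # newer-style ramp: red begins much lower
    if temp_c >= 20: return "red"
    if temp_c >= 10: return "orange"
    return "yellow"

for t in (20, 22, 27):
    print(f"{t} C  ->  2009-style: {color_2009_style(t):12s}  2019-style: {color_2019_style(t)}")
```

Under these assumed ramps, 22°C renders as neutral on the first scale and as red on the second, which is the visual shift the side-by-side charts in the post show.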

Watch: Morano exposes 97% climate consensus con testifying before Congress: ‘Pulled from thin air… tortured data’

Submitted Written Testimony of Marc Morano, publisher of CFACT’s Climate Depot, author of the best-selling “The Politically Incorrect Guide to Climate Change” and former staff of the U.S. Senate Environment & Public Works Committee.
House Natural Resources Committee, Subcommittee on Water, Oceans, and Wildlife: “Responding to the Global Assessment Report of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services.”
Date: Wednesday, May 22, 2019. Time: 10:00 AM. Location: Longworth House Office Building 1324.

Video & Submitted Written Congressional Testimony of Marc Morano – Examining UN Species/Climate Report – UN report is ‘authoritative propaganda’

Testifying before Congress, Climate Depot’s Marc Morano says the 97 percent consensus on catastrophic man-caused global warming is nothing but ‘a form of intimidation… pulled from thin air.’ Marc Morano: “It’s a form of intimidation… Before Congress you had a United Nations IPCC lead author testify that the 97 percent, as evidenced by John Cook, was quite simply ‘pulled from thin air’… They tortured the data until they came up with 75 or so anonymous scientists who agree with the 97 percent. We don’t know the scientists’ names, we don’t know their affiliation, we don’t know their bios, but that number in those studies is pounded on us daily, over and over, [against] anyone in this debate to silence dissent.”

Hearing: Responding to the Global Assessment Report of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services, US House Subcommittee on Water, Oceans, and Wildlife, May 22, 2019. References below:

Editor’s Note: The following is an excerpt from author Marc Morano’s new 2018 best-selling book, The Politically Incorrect Guide to Climate Change. The section below deals with the alleged 97% consensus on climate change.

Book Excerpt – Chapter 3: “Pulled from Thin Air”: The 97 Percent “Consensus” (Page 27)

A Harvard Consensus

In 2017 Princeton Professor Emeritus of Physics William Happer drew parallels to today’s man-made climate change claims. “I don’t see a whole lot of difference between the consensus on climate change and the consensus on witches. At the witch trials in Salem, the judges were educated at Harvard. This was supposedly 100% science. The one or two people who said there were no witches were immediately hung. Not much has changed,” Happer quipped.

Economists versus Climatologists

“You take 400 economists and put them in the room and give them exactly the same data and you will get 400 different answers as to what is going to happen in the economic future. I find that refreshing because it tells me that these guys don’t have an agenda. But if you take 400 climatologists and put them in the same room and give them some data about a system which they understand very imperfectly, you are going to get a lot of agreement and that disturbs me. I think that’s arguing with an agenda.” —geologist Robert Giegengack of the University of Pennsylvania.

Dubious Evidence for a Ubiquitous Number

The alleged “consensus” in climate science does not hold up to scrutiny. But what about the specific claim that 97 percent of scientists agree? MIT’s Richard Lindzen has explained the “psychological need” for the 97 percent claims.
“The claim is meant to satisfy the non-expert that he or she has no need to understand the science. Mere agreement with the 97 percent will indicate that one is a supporter of science and superior to anyone denying disaster. This actually satisfies a psychological need for many people,” Lindzen said in 2017. But what is the basis for this specific number, and what exactly is this overwhelming majority of scientists supposed to be agreeing on? In 2014, UN lead author Richard Tol explained his devastating research into the 97 percent claim. One of the most cited sources for the claim was a study by Australian researcher John Cook, who analyzed the abstracts of 11,944 peer-reviewed papers on climate change published between 1991 and 2011.35 Cook and his team evaluated what positions the papers took on mankind’s influence on the climate and claimed “among abstracts expressing a position on AGW, 97.1% endorsed the consensus position that humans are causing global warming.” The 97 percent number took off. This 97 percent claim was despite the fact that 66.4 percent of the studies’ abstracts “expressed no position on AGW” at all. “The 97% estimate is bandied about by basically everybody. I had a close look at what this study really did. As far as I can see, this estimate just crumbles when you touch it. None of the statements in the papers are supported by the data that is actually in the paper,” Tol said. “But this 97% is essentially pulled from thin air, it is not based on any credible research whatsoever.” Tol’s research found that only sixty-four papers out of nearly twelve thousand actually supported the alleged “consensus.” Tol published his research debunking the 97 percent claim in the journal Energy Policy. Meteorologist Anthony Watts summed up Tol’s research debunking Cook’s claims. The “97% consensus among scientists is not just impossible to reproduce (since Cook is withholding data) but a veritable statistical train wreck rife with bias, classification errors, poor data quality, and inconsistency in the ratings process,” Watts wrote. Andrew Montford of the Global Warming Policy Foundation had authored a critique of Cook’s claim the previous year. “The consensus as described by the survey is virtually meaningless and tells us nothing about the current state of scientific opinion beyond the trivial observation that carbon dioxide is a greenhouse gas and that human activities have warmed the planet to some unspecified extent,” Montford found. “The survey methodology therefore fails to address the key points that are in dispute in the global warming debate.” Climatologist Roy Spencer and Heartland Institute’s Joe Bast noted that even if a certain study accepts the premise of man-made global warming, that paper may not even study how CO2 impacts temperatures: The methodology is “flawed,” noted Spencer, adding, “a study published earlier this year in Nature noted that abstracts of academic papers often contain claims that aren’t substantiated in the papers.” In 2015, former Margaret Thatcher advisor Christopher Monckton also examined the 97 percent claim. Monckton’s analysis found that “only 41 papers—0.3% of all 11,944 abstracts or 1.0% of the 4014 expressing an opinion, and not 97.1%” had actually endorsed the claim that “more than half of recent global warming was anthropogenic.” As Monckton explained, “They had themselves only marked 64 out of 11,944 of the papers as representing that view of the consensus, and that is not 97.1% that’s 0.5%…. 
There is no consensus.” The 97 percent claim is “fiction. ‘97 percent’ was a figure that was arrived at many years ago by the people who’ve pushed this ‘agenda,’” Monckton noted. “They then realized that they needed some sort of support for it, so they did a couple of very dopey papers.” In 2013, climatologist David Legates from the University of Delaware and his team of researchers had also challenged Cook’s 97 percent claims. “The entire exercise was a clever sleight-of-hand trick,” Legates explained. “What is the real figure? We may never know. Scientists who disagree with the supposed consensus—that climate change is man-made and dangerous— find themselves under constant attack.” Another survey that claimed 97 percent of scientists agreed was based not on thousands of scientists or even hundreds of scientists …or even ninety-seven scientists, but only seventy-seven. And of those seventy-seven scientists, seventy-five formed the mythical 97 percent consensus. In other words, in this instance the 97 percent of scientists wasn’t even ninety-seven scientists. This was a 2009 study published in Eos, Transactions American Geophysical Union by Maggie Kendall Zimmerman, a student at the University of Illinois, and her master’s thesis advisor Peter Doran. As Lawrence Solomon revealed in the National Post, The number stems from a 2009 online survey of 10,257 earth scientists, conducted by two researchers at the University of Illinois. The survey results must have deeply disappointed the researchers—in the end, they chose to highlight the views of a subgroup of just 77 scientists, 75 of whom thought humans contributed to climate change. The ratio 75/77 produces the 97% figure that pundits now tout. The two researchers started by altogether excluding from their survey the thousands of scientists most likely to think that the Sun, or planetary movements, might have something to do with climate on Earth—out were the solar scientists, space scientists, cosmologists, physicists, meteorologists and astronomers. That left the 10,257 scientists in disciplines like geology, oceanography, paleontology, and geochemistry that were somehow deemed more worthy of being included in the consensus. This was “a quickie survey that would take less than two minutes to complete, and would be done online.” And still less than a third of those surveyed even sent in an answer! The questions, as Solomon noted, “were actually non-questions”: 1. When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant? 2. Do you think human activity is a significant contributing factor in changing mean global temperatures? As Solomon explained, those two points do not give a complete picture of what’s at issue. They don’t even mention carbon dioxide—which, as we’ll explore at length in the next chapter, is the heart of the climate change debate. “From my discussions with literally hundreds of skeptical scientists over the past few years, I know of none who claims that the planet hasn’t warmed since the 1700s, and almost none who think that humans haven’t contributed in some way to the recent warming—quite apart from carbon dioxide emissions, few would doubt that the creation of cities and the clearing of forests for agricultural lands have affected the climate,” Solomon pointed out. # End The Politically Incorrect Guide To Climate Change book excerpt. Marc Morano, author of “The Politically Incorrect Guide to Climate Change,” in Paris. 
(Photo Courtesy of Marc Morano)

Related Links: ‘83% Consensus’?! 285 Papers From 1960s-’80s Reveal Robust Global Cooling Scientific ‘Consensus’ – “If we were to employ the hopelessly flawed methodology of divining the relative degree of scientific “consensus” by counting the number of papers that agree with one position or another (just as blogger John Cook and colleagues did with their 2013 paper “Quantifying the Consensus…” that yielded a predetermined result of 97% via categorical manipulation), the 220 “cooling” papers published between 1965-’79 could represent an 83.3% global cooling consensus for the era (220/264 papers), versus only a 16.7% consensus for anthropogenic global warming (44/264 papers).” Flashback 1974: ’60 theories have been advanced to explain the global cooling’ – In the 1970’s scientists were predicting a new ice age, and had 60 theories to explain it. (Ukiah Daily Journal, November 20, 1974): “The cooling trend heralds the start of another ice age, of a duration that could last from 200 years to several millenia…Sixty theories have been advanced, he said, to explain the global cooling period.” Listen & Read: Q&A with Morano on his book: ‘The Politically Incorrect Guide to Climate Change’ Updates: Climate skeptics hijack House Dem hearing – Dominate discussion – Warmists lament: “How did these two dominate a hearing run by Democrats?” Greenpeace Co-Founder Dr. Patrick Moore’s testimony to Congress on UN Species Report: UN is using ‘extinction as a fear tactic to scare the public into compliance’ Climatologist Dr. Judith Curry: Morano ‘has prepared an extremely hard-hitting report for his written testimony’ at Congressional species hearing – “Morano was asked a question about ‘97% of scientists agree.’ Morano nailed it. Moore effectively chimed in on this issue also” Media laments: House GOP ‘called two prominent climate deniers’ Moore & Morano to testify – Morano: UN report ‘hypes and distorts biodiversity issues for lobbying purposes’ Climate skeptic Morano picks fight at Congressional hearing – ‘Verbally attacks the credibility of one of the UN scientists’ Opening Statement of GOP Ranking Member Tom McClintock at Species hearing: UN practicing ‘the antithesis of science’ House Democrats attack Morano: “It’s truly odd that @NatResources Republicans would invite Marc Morano to a serious hearing on the future of the planet. If this is who they listen to on policy, why should anyone take their arguments seriously? Or is extinction a joke to them?”
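For readers who want to check the arithmetic behind the ratios quoted in the excerpt and the related links above, the short sketch below simply recomputes them from the counts given in the text (75 of 77, 64 of 11,944, 41 of 11,944 or 4,014, and 220 or 44 of 264); it adds no data of its own.

```python
# Recomputing the percentages quoted above from the raw counts given in the text.
ratios = {
    "Doran/Zimmerman subgroup (75 of 77 respondents)":          75 / 77,
    "Tol: papers supporting the 'consensus' (64 of 11,944)":    64 / 11944,
    "Monckton: explicit endorsements (41 of 11,944 abstracts)": 41 / 11944,
    "Monckton: of the 4,014 abstracts expressing an opinion":   41 / 4014,
    "NoTricksZone 'cooling consensus' (220 of 264 papers)":     220 / 264,
    "NoTricksZone warming share (44 of 264 papers)":            44 / 264,
}
for label, value in ratios.items():
    print(f"{label}: {value:.1%}")
```

Running this reproduces the figures cited in the text: 97.4%, 0.5%, 0.3%, 1.0%, 83.3% and 16.7% respectively.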

Moving The Goalposts, IPCC Secretly Redefines ‘Climate’ by mixing existing and non-existing data

http://www.thegwpf.com/moving-goalposts-ipcc-secretly-redefines-climate/ by David Whitehouse

The IPCC appears to have secretly changed the definition of what constitutes ‘climate’ by mixing existing and non-existing data.

The definition of ‘climate’ adopted by the World Meteorological Organisation is the average of a particular weather parameter over 30 years. It was introduced at the 1934 Wiesbaden conference of the International Meteorological Organisation (WMO’s precursor) because data sets were only held to be reliable after 1900, so 1901-1930 was used as an initial basis for assessing climate. It has a certain arbitrariness; it could have been 25 years.

For its recent 1.5°C report the IPCC has changed the definition of climate to what has been loosely called “the climate we are in.” It still uses 30 years for its estimate of global warming and hence climate – but now it is the 30 years centred on the present. There are some obvious problems with this hidden change of goalposts. We have observational temperature data for the past 15 years but, of course, none for the next 15 years. However, never let it be said that the absence of data is a problem for inventive climate scientists. Global warming is now defined by the IPCC as a speculative 30-year global average temperature that is based, on the one hand, on the observed global temperature data from the past 15 years and, on the other, on assumed global temperatures for the next 15 years. This proposition was put before the recent IPCC meeting at Incheon, in the Republic of Korea, and agreed to as a reasonable thing to do to better communicate climate trends.

Astonishingly, this new IPCC definition mixes real, empirical data with non-existing, speculative data and simply assumes that a short-term 15-year trend won’t change for another 15 years in the future. However, this new definition of climate and global warming is not only philosophically unsound, it is also open to speculation and manipulation. It is one thing to speculate what the future climate might be; but for the IPCC to define climate based on data that doesn’t yet exist and is based on expectations of what might happen in the future is fraught with danger. This strategy places a double emphasis on the temperature of the past 15 years, which was not an extrapolation of the previous 15 years and was not predicted to happen as it did. Since around the year 2000, nature has taught us a lesson the IPCC has still not learned. With this new definition of climate, all data from more than 15 years ago are irrelevant, as they are part of the previous climate.

Let’s look at the past 15 years using Hadcrut4. The first figure shows 2003-2017. It’s a well-known graph that shows no warming trend – except when you add the El Nino at the end, which of course is a weather event and not climate. The effect of the El Nino on the trend is significant. With it the trend for the past 15 years is about 0.15°C per decade, close to the 0.2°C per decade usually quoted as the recent decadal trend. Before the El Nino event, however, the warming trend is a negligible 0.02°C per decade and statistically insignificant. The second graph shows the 15 years before the recent El Nino, i.e. 2000-2014. The trend over this period is influenced by the start point, which is a deep La Nina year. Without it the trend is 0.03°C per decade – statistically insignificant.
Note that there are minor El Ninos and La Ninas during this period but they tend to have a small net effect. So which does one choose? The El Nino version that leads to 0.6°C warming over the 30 years centred on the present, or the non-El Nino version that suggests no significant warming? The latter, of course, because the trend should be kept as free as possible from contamination by short-term weather events – in the same way that trends are kept free from the decreases caused by aerosols from volcanoes blocking out the sun and causing global cooling for a while.

The same problem can be seen in the IPCC’s 1.5°C report when it analyses the decade 2006-2015, which it does extensively. In this specific decade 2015 is significantly warmer than the other years, by about 0.2°C. NOAA said, “The global temperatures in 2015 were strongly influenced by strong El Nino conditions that developed during the year.” The temperature trend including the El Nino year of 2015 is 0.2°C per decade – that figure again. Without the El Nino the trend is statistically insignificant. To see the future temperature and climate the IPCC envisage in their report, consider Figure 1 of their Summary for Policy Makers.

The IPCC’s attempt to move the goalposts is highly questionable. Non-existing data extrapolated for assumed temperature trends over the next 15 years should not be part of a formal definition of what constitutes climate.
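The trend comparisons in this piece are ordinary least-squares slopes expressed per decade. The sketch below shows that calculation on invented anomaly values (these are not HadCRUT4 numbers), so the endpoint sensitivity the author describes is visible: including an El Nino-boosted peak at the end of the series sharply increases the fitted slope.

```python
# Decadal OLS trend with and without the final El Nino-boosted years.
# The anomaly values are invented for illustration only; they are NOT HadCRUT4 data.
import numpy as np

years = np.arange(2003, 2018)                       # 2003-2017 inclusive
anoms = np.array([0.50, 0.47, 0.52, 0.48, 0.49, 0.42, 0.50, 0.54,
                  0.46, 0.49, 0.51, 0.57, 0.68, 0.77, 0.68])   # last three: hypothetical El Nino bump

def trend_per_decade(x, y):
    """Slope of an ordinary least-squares linear fit, scaled from per-year to per-decade."""
    return 10 * np.polyfit(x, y, 1)[0]

print(f"full period 2003-2017:        {trend_per_decade(years, anoms):+.2f} C per decade")
print(f"excluding the 2015-2017 bump: {trend_per_decade(years[:-3], anoms[:-3]):+.2f} C per decade")
```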

The real story behind the famous starving polar-bear video reveals more manipulation

By Dr. Susan J. Crockford

New facts have emerged from the filmmaker behind the cruel and deliberate exploitation of a dying bear in a quest to advance a climate change agenda.

It was tragedy porn meant to provoke a visceral response — the gut-wrenching video of an emaciated polar bear struggling to drag himself across a snowless Canadian landscape made billions of people groan in anguish. Taken in August 2017 by biologist Paul Nicklen, a co-founder of the Canadian non-profit SeaLegacy, the video was posted on Instagram in December 2017, stating “This is what starvation looks like” as part of a discussion about climate change. Two days later, SeaLegacy’s media and communications partner, National Geographic, published the video with added subtitles that began: “This is what climate change looks like.” The SeaLegacy webpage published the video also, under a headline that claimed “This is the face of climate change.” The message was clear: Blame climate change for this bear’s fate. The National Geographic video went viral, with an estimated audience of 2.5 billion (it set a site record), and garnered international media attention that blamed global warming for the polar bear’s plight.

The backlash from the public was swift and fierce. Some were angry that nothing had been done to help the bear. Nicklen insisted he could not have saved it: “It would also have been illegal to feed him, to approach him, or to do anything to ease his pain.” The last bit is not quite true. Nicklen could have called the nearest conservation officer, who would have euthanized the bear and arranged a necropsy to determine the cause of the bear’s poor condition (which was likely one of the cancers known to cause muscle wasting in polar bears). But a necropsy result would have hamstrung Nicklen’s plan to use the starving-bear video to spur action on climate change, so he and his crew simply let the bear swim away without telling anyone, extending the animal’s suffering. Many other viewers were furious that an obviously sick animal had been deliberately exploited to advance a political agenda based on lies. Climate change was clearly not the cause of this bear’s plight: Sea-ice loss had not been exceptionally high that year and no other bears in the area were starving. Viewers felt manipulated.

The criticism continued for months until, out of the blue, some previously undisclosed facts about the incident were revealed online in an essay written by Nicklen’s SeaLegacy partner, Cristina Mittermeier, destined for the August 2018 print issue of National Geographic. Mittermeier admitted Nicklen was scouting for an image that could be used to “communicate the urgency of climate change” when he spotted the emaciated bear. She confessed that she and Nicklen knew the bear was probably sick or injured before they started to film but proceeded regardless. She also revealed that days passed between Nicklen’s first sight of the starving bear and the actual filming of it: He told no one about the suffering animal while he waited for his film crew to arrive.
In her essay, Mittermeier makes a number of excuses for the subsequent public outcry over the footage but ultimately blames National Geographic for subtitles on the video that missed the story’s “nuance.” Apparently, she thinks SeaLegacy’s caption that implicated climate change is materially different from National Geographic’s caption that implicated climate change. Oddly, National Geographic admitted culpability with an apology (embedded in Mittermeier’s essay) that begins: “National Geographic went too far in drawing a definitive connection between climate change and a particular starving polar bear in the opening caption of our video about the animal.” Why would it issue such a mea culpa? Had the public backlash and editorial criticism hurt its organization more than it was willing to admit? National Geographic might hope that Mittermeier’s essay and its apology will bring former supporters and donors back into the fold, but I suspect it’s done the opposite. The additional details make the actions of the SeaLegacy founders harder to forgive, not easier. They also raise the question of whether this cruel and deliberate exploitation of a dying bear violated strict Nunavut conservation laws for documentary filmmaking. We’ll see. Mittermeier claims the public should never have taken the video literally. However, it’s apparent people took it literally because it was presented as a simple message: Blame climate change for this bear’s suffering. Mittermeier and Nicklen still don’t understand why their efforts backfired but it’s simple. They tried to dupe people with obvious lies and allowed an animal to suffer for days in order to satisfy their agenda. Mittermeier says she’d do it again. Sadly, I believe her. Susan Crockford is a zoologist and adjunct professor at the University of Victoria. She blogs about polar bears at www.polarbearscience.com.

The Real Story Behind The Starving Polar-Bear Video Reveals More Manipulation

http://www.thegwpf.com/the-real-story-behind-the-starving-polar-bear-video-reveals-more-manipulation/ A GWPF repost of Susan Crockford’s article reproduced in the preceding result. Full post & comments

NASA GISS Practicing Political Science, Not Atmospheric Science; Why Doesn’t NASA Rely on Satellite Data?

https://co2islife.wordpress.com/2018/04/06/nasa-giss-practicing-political-science-not-atmospheric-science-why-doesnt-nasa-rely-on-satellite-data/

Now what we have always known to be true is official: NASA is essentially a Political Action Committee for the Political Left. How else can you explain NASA using highly flawed and “adjusted” NOAA data when they have their own state-of-the-art satellite temperature data? The answer: because the highly accurate satellite data doesn’t give them the results they want. The organization that once put a man on the moon clearly has people smart enough to understand just how wrong they are on this climate change issue, and the damage that unhinged activists like James Hansen and the scathing IG report do to their reputation. (I’m ignoring the missing $1.63 million and focusing on the politics.)

GISS’ prominent role in Earth science research – as a contributor to the Intergovernmental Panel on Climate Change’s (IPCC) Nobel Prize winning report on climate change in 2007 – coupled with on-duty public outreach and education, as well as off-duty advocacy by individual GISS staff about climate change, has raised the group’s public profile. At the same time climatologists debate the impact of man-made greenhouse gas emissions in predictive models, the issue has carried over into Government policy discussions and congressional hearings about the impact of human activity on global climate change.

What society needs to ask is, “can we really afford to politicize science and critical institutions like NASA that can influence public policy that can cost in the trillions of dollars?” The EPA, IRS, FBI, NASA and who knows what other critical institutions were politicized over the previous 8 years, and the results have been disastrous. Progressives behave as if the public treasury is their own private piggy bank, and politicize everything in an effort to circumvent the democratic process.

Why doesn’t NASA use NASA satellite data? Climate alarmists are certain to claim that there isn’t a long enough history. That is partly true, but the period it does cover includes the period of greatest CO2 growth. Using the ice core data, it wasn’t until 1950 that CO2 levels broke above 300 ppm into recent historic territory (note: that level is still very low on a geologic scale; CO2 has reached 7,000 ppm). The satellite data started in 1979 when CO2 was 330, so satellite data covers a 70 ppm change in CO2, or a 21% increase. During the previous 400,000 years, CO2 bounced around between 180 and 300, or a 67% swing. That data is fine to define natural variation, but the NASA data covers the period that best defines the man-made contribution of AGW.

Simply take a look at the bizarre construction of the “Hockeystick” graph. It cherry-picks proxies, uses “tricks” to “hide the decline,” excludes available instrumental data until 1902, mixes proxy and instrumental data between 1902 and 1980, and then drops proxies after 1980, causing distinct “dog legs” in the chart. Only a fool would accept that as an accurate reconstruction. The ground measurements aren’t any better and are legendary for their “adjustments,” yet the climate alarmists attack the highly accurate satellite data.
The reason is simple: the satellite data demonstrate an extremely strong relationship between global temperatures and ocean temperatures, and no relationship between CO2 and atmospheric temperatures. Ironically, if you control for H2O and the Urban Heat Island Effect, as in a real science, even the ground measurements debunk the CO2 and temperature relationship. Using the climate alarmist approach of combining data sets, the logical approach would be to use ice core data up until 1860, ground measurements between 1860 and 1979, and then satellite data after that. That would be a reconstruction using the most accurate data sets available at the time. If you do that and test the hypothesis “man does not cause climate change,” you quickly discover why climate alarmists rely on bizarre data manipulations. Using that dataset, there is absolutely nothing abnormal about the temperature variation over the past 150 and 50 years when compared to the entire Holocene.

Congress should have NASA GISS testify and explain why they use NOAA and HadCRU data and not NASA satellite data. They should also testify as to the political activities of their “scientists.” Lastly, NASA should be asked to defend the “Hockeystick” and the many “tipping points” predicted by James Hansen that have passed.

More on this topic:
Ceteris Paribus and Global Warming; Ground Measurements are Garbage GIGO
On the Validity of NOAA, NASA and Hadley CRU Global Average Surface Temperature Data
Greenland, Antarctica And Dozens Of Areas Worldwide Have Not Seen Any Warming In 60 Years And More!
Almost 300 Graphs Undermine Claims Of Unprecedented, Global-Scale Modern Warmth
Climate “Science” on Trial; Temperature Records Don’t Support NASA GISS
Climate “Science” on Trial; Cherry Picking Locations to Manufacture Warming
Climate Science Behaving Badly; 50 Shades of Green & The Torture Timeline
Climate “Science” on Trial; The Criminal Case Against the Alarmists
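The percentage figures in the passage above follow directly from the ppm values it cites; the quick check below recomputes them (the present-day value of roughly 400 ppm is my assumption, implied by the quoted 70 ppm satellite-era change, and is not stated explicitly in the post).

```python
# Checking the CO2 percentage changes quoted above.
satellite_start_ppm = 330          # ~1979, per the post
recent_ppm = 400                   # assumed; implied by the quoted "70 ppm change"
change = recent_ppm - satellite_start_ppm
print(f"satellite-era change: {change} ppm = {change / satellite_start_ppm:.0%} increase")

glacial_low_ppm, interglacial_high_ppm = 180, 300   # ice-core range cited in the post
swing = interglacial_high_ppm - glacial_low_ppm
print(f"glacial-interglacial swing: {swing} ppm = {swing / glacial_low_ppm:.0%} of the low value")
```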

Report: ‘Adjustments’ To Create Fake Sea Level Rise Have Now Infected PSMSL Tide Gauge Data

By Kenneth Richard on 4. December 2017 ‘Adjustments’ To Create Spurious Sea Level Rise Have Now Infected The PSMSL Tide Gauge Data In a new paper published in Earth Systems and Environment this month, Australian scientists Dr. Albert Parker and Dr. Clifford Ollier uncover evidence that Permanent Service for Mean Sea Level (PSMSL) overseers appear to have been engaging in the “highly questionable” and “suspicious” practice of adjusting historical tide gauge data to show recent accelerated sea level rise where no such acceleration (or rise) exists. Extensive evidence from “tide gauges, coastal morphology, stratigraphy, radiocarbon dating, archaeological remains, and historical documentation” all suggest that sea levels in the Indian Ocean have effectively been stable in recent decades. The authors expose how PSMSL  data-adjusters make it appear that stable sea levels can be rendered to look like they are nonetheless rising at an accelerated pace. The data-adjusters take misaligned and incomplete sea level data from tide gauges that show no sea level rise (or even a falling trend).  Then, they subjectively and arbitrarily cobble them together, or realign them.   In each case assessed, PSMSL data-adjusters lower the earlier misaligned rates and raise the more recent measurements.  By doing so, they concoct a new linearly-rising trend. This adjustment of tide gauge data to yield a rising sea level trend where none exists is not occasional or episodic.  Instead, for every adjustment of raw data analyzed, “the adjustments are always in the direction to produce a large rise in sea level.” The suspicious perpetuity of this pattern strongly suggests that there is an agenda driving these arbitrary and subjective realignments. From all appearances, the data-adjusters at PSMSL are attempting to “correct” the sea level rise data that do not support the conceptualization of a rapidly-rising sea level trend in response to rising human CO2 emissions. As Drs. Parker and Ollier conclude: “It is always highly questionable to shift data collected in the far past without any proven new supporting material.”
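As a toy illustration of the mechanism Parker and Ollier allege (not PSMSL’s actual procedure or data), the sketch below splices two perfectly flat tide-gauge segments after shifting the earlier one downward, then fits a linear trend to the result. The station values and the 40 mm offset are invented.

```python
# Toy example only: two flat tide-gauge segments (invented values), with the
# earlier segment shifted down before splicing, as the paper says happens
# during "realignment". The spliced series then shows a rising trend that
# neither segment contains on its own.
import numpy as np

years_a = np.arange(1950, 1980)
years_b = np.arange(1980, 2010)
seg_a = np.full(len(years_a), 7000.0)     # relative sea level in mm, flat
seg_b = np.full(len(years_b), 7000.0)     # flat as well

offset_mm = -40.0                         # hypothetical downward shift of the older record
spliced = np.concatenate([seg_a + offset_mm, seg_b])
years = np.concatenate([years_a, years_b])

slope_mm_per_yr = np.polyfit(years, spliced, 1)[0]
print(f"fitted trend after realignment: {slope_mm_per_yr:.2f} mm/yr")   # ~1 mm/yr, despite flat data
```

In this contrived setup, a single 40 mm shift at the join of two 30-year flat records yields a fitted trend of about 1 mm/yr across the composite, which is the kind of spurious rise the paper describes.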

New Study: Tide gages find no global ‘acceleration in sea level’ – But satellite data ‘manipulated’ to show acceleration

Via: http://www.ijesi.org/papers/Vol(6)8/Version-1/G0608014851.pdf Abstract: Sea level changes is a key issue in the global warming scenario. It has been widely claimed that sea is rising as a function of the late 20th’s warming pulse. Global tide gauge data sets may vary between +1.7 mm/yr to +0.25 mm/yr depending upon the choice of stations. At numerous individual sites, available tide gauges show variability around a stable zero level. Coastal morphology is a sharp tool in defining ongoing changes in sea level. A general stability has been defined in sites like the Maldives, Goa, Bangladesh and Fiji. In contrast to all those observations, satellite altimetry claim there is a global mean rise in sea level of about 3.0 mm/yr. In this paper, it is claimed that the satellite altimetry values have been “manipulated”. In this situation, it is recommended that we return to the observational facts, which provides global sea level records varying between ±0.0 and +1.0 mm/yr; i.e. values that pose no problems in coastal protection. Keywords: Manipulation, observational facts, satellite altimetry, sea level change, tide gauges ————————————————————————————————————————————— Date of Submission: 26-07-2017 Date of acceptance: 05-08-2017 … V. CONCLUSION Satellite altimetry is a new elegant tool to view the changes in sea level over the globe, maybe especially the spatial changes, which, indeed, verified the long-term notion that sea level change over the last 5000-6000 years are dominated by the redistribution of water masses [29]. The temporal changes, on the other hand, has always remained very questionable as they seem to over-estimate observed sea level changes by 100-400% [9-16]. It seems quite weird to claim that it would be the satellite altimetry that is right and that the true observations in the field are wrong (still this is what the people around the IPCC and the Paris agreement at COP21 continue to claim). Fig. 1 reveals what is going on. It is the satellite altimetry data, which have been “corrected” to give a rise in the order of 3.0 mm/yr. This “correction” [19-21] may, of course, be classified as a “manipulation” of facts, like the manipulation temperature measurements recently revealed [1-3]. In this situation, there are all reasons to return to solid observational facts [11-16].Those facts are controllable, and this is a key criterion in science.The global perspective is general stability to a minor rise with variations between ±0.0 and +1.0 mm/yr [16]. This poses no problem for coastal protection. Therefore, we should free the world from the horror issue that low-lying coasts and islands will become seriously flooded in this century. Up to the present, there has been no convincing recording of any acceleration in sea level, rather the opposite: a total lack of any sign of an accelerating trend. For full study see here. # End study excerpt Related Links: Sea levels have been rising since the last ice age ended more than 10,000 years ago. There is currently no acceleration in sea level rise. Sea level rise hysteria can be cured by looking at tide gauge data Sea Level 2000 years ago higher than today? Roman coastline discovered two miles inland Bjorn Lomborg About Those Non-Disappearing Pacific Islands – ‘Total land area of the islands has actually grown’ Former NASA Climatologist Dr. Roy Spencer in 2016: “Sea level rise, which was occurring long before humans could be blamed, has not accelerated and still amounts to only 1 inch every ten years. 
If a major hurricane is approaching with a predicted storm surge of 10-14 feet, are you really going to worry about a sea level rise of 1 inch per decade? Climatologist Dr. Judith Curry of Georgia Institute of Technology: ‘Sea level will continue to rise, no matter what we do about CO2 emissions.’ – ‘The IPCC figure 3.14 suggests that there is no acceleration, given the large rates of sea level rise in the first half of the 20th century.  Until we have an understanding of variations in decadal and multi-decadal sea level rise, we can’t make a convincing argument as to acceleration.’ Meteorologist Tom Wysmuller: ‘For the past 130 years there has been ZERO acceleration in sea-level rise as directly measured by tide gauges in tectonically inert areas (land neither moving up nor down), even as CO2 has risen almost 40% in the same period.’ Peer-Reviewed Studies Demolish Warmists’ Sea Level Rise Scares: ‘Decelerated 44% since 2004′ – ‘Global sea levels have been naturally rising for ~20,000 years and have decelerated over the past 8,000 years, decelerated over the 20th century, decelerated 31% since 2002 and decelerated 44% since 2004 to less than 7 inches per century. There is no evidence of an acceleration of sea level rise, and therefore no evidence of any effect of mankind on sea levels. Global sea level rise from tide gauges (1.6 mm/year) is half of that claimed from satellites (3.2 mm/year). Which is right? – ‘There is no acceleration of the increase’ – [Climate Depot Note: According to tide gauges, Sea Level is rising LESS than the thickness of one nickel (1.95 mm thick) per year or about the thickness of one penny (1.52 mm thick) a year. According to satellite info it is rising slightly more than two pennies a year (3.04 mm)]
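The rates quoted in this entry and its related links are easier to compare after a unit conversion; the sketch below simply converts between mm/yr, inches per decade or century, and the coin thicknesses used in the Climate Depot note.

```python
# Unit check on the sea level rates quoted above.
MM_PER_INCH = 25.4

print(f"1 inch per decade       = {MM_PER_INCH / 10:.2f} mm/yr")            # Spencer's figure
print(f"tide gauges, 1.6 mm/yr  = {1.6 * 100 / MM_PER_INCH:.1f} inches per century")
print(f"satellites, 3.2 mm/yr   = {3.2 * 100 / MM_PER_INCH:.1f} inches per century")

# Coin comparison from the Climate Depot note:
nickel_mm, penny_mm = 1.95, 1.52
print(f"nickel = {nickel_mm} mm vs tide-gauge 1.6 mm/yr; "
      f"two pennies = {2 * penny_mm:.2f} mm vs satellite 3.2 mm/yr")
```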
