It’s been an open secret, ever since Dr. Michael Mann used “Mike’s Nature Trick” to “hide the decline” by covering up some inconvenient tree ring data in the hockey stick climate graph, that climate alarmists will go to almost any length to only show the public the “crisis side” of climate data.
The National Interagency Fire Center (NIFC) has been the keeper of U.S. wildfire data for decades, tracking both the number of wildfires and the acreage burned all the way back to 1926. However, after making that entire dataset public for decades, NIFC has now, in a blatant act of cherry picking, “disappeared” a portion of it and shows only data from 1983 onward. You can see it here.
Fortunately, the Internet never forgets, and the entire dataset is preserved on the Internet Wayback Machine and other places, despite NIFC’s ham-handed attempt to disappear the data.
Why would they do this, you ask? The answer is simple: data prior to 1983 shows that U.S. wildfires were far worse in both frequency and total acreage burned. By disappearing all data prior to 1983, which just happens to be the lowest point in the dataset, all of a sudden we get a positive slope of worsening wildfire that aligns with increased global temperature, which is perfect for claiming “climate change is making wildfire worse”. See Figure 1 below for a before-and-after comparison of what the data looks like when you plot it.
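For readers who want to reproduce that before-and-after comparison from the archived figures themselves, here is a minimal Python sketch using pandas and matplotlib. The file name (nifc_archived.csv) and the column names (year, fires, acres_burned) are my own placeholders, not anything NIFC publishes; adapt them to however you saved the archived table.

```python
# A minimal sketch (not NIFC code): compare the full archived record with the
# truncated post-1983 view. Assumes a local CSV named "nifc_archived.csv" with
# columns year, fires, acres_burned -- file name and column names are
# placeholders you would adapt to your own copy of the archived data.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("nifc_archived.csv")

fig, (ax_full, ax_trunc) = plt.subplots(1, 2, figsize=(12, 4), sharey=True)

# Left panel: the record as NIFC originally published it, back to 1926.
ax_full.plot(df["year"], df["acres_burned"], color="tab:red")
ax_full.set_title("Full archived record (1926 onward)")
ax_full.set_xlabel("Year")
ax_full.set_ylabel("Acres burned")

# Right panel: only what remains on the current NIFC page, 1983 onward.
post83 = df[df["year"] >= 1983]
ax_trunc.plot(post83["year"], post83["acres_burned"], color="tab:red")
ax_trunc.set_title("Truncated record (1983 onward)")
ax_trunc.set_xlabel("Year")

plt.tight_layout()
plt.show()
```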
Clearly, wildfires were far worse in the past, and just as clearly, the data tells an entirely different story when only the post-1983 portion is shown. The new story told by the sanitized data aligns with the irrational screeching of climate alarmists that “wildfires are driven by climate change”.
This wholesale erasure of important public data stinks, but in today’s narrative control culture that wants to rid us of anything that might be inconvenient or doesn’t fit the “woke” narrative, it isn’t surprising.
Interestingly, the history on the Internet Wayback Machine shows how NIFC rationalized this erasure of important public data.
Back in June 2011, when NIFC first presented this data publicly, it was offered simply “as-is”. They said only this:
Figures prior to 1983 may be revised as NICC verifies historical data.
In 2018, they added a new caveat, saying this:
The National Interagency Coordination Center at NIFC compiles annual wildland fire statistics for federal and state agencies. This information is provided through Situation Reports, which have been in use for several decades. Prior to 1983, sources of these figures are not known, or cannot be confirmed, and were not derived from the current situation reporting process. As a result the figures prior to 1983 should not be compared to later data.
According to the Internet Wayback Machine, that caveat first appeared on the NIFC data page somewhere between January 14 and March 7 of 2018.
Curiously, that caveat appeared just a few weeks after I first drew wide attention to the issue in December 2017, with an article citing NIFC fire data titled Is climate change REALLY the culprit causing California’s wildfires?
It seems they received some blowback from the idea that their data, when plotted, clearly showed wildfires to be far worse in the past, completely blowing the global-warming-climate-change-wildfire connection out of the water.
Here is what NIFC says now:
Prior to 1983, the federal wildland fire agencies did not track official wildfire data using current reporting processes. As a result, there is no official data prior to 1983 posted on this site.
Not only is that a lie of omission, it is also ridiculous. Their agenda seems very clear. When the data was first published, they advised the public only that figures prior to 1983 might be “…revised as NICC verifies historical data”.
There was no published concern that the data might be invalid, or that we shouldn’t use it. Besides, the data is very simple: a count of the number of fires and the number of acres burned. How hard is that to compile and verify as accurate?
What’s worse is that this data has been trusted for decades in almost every news story about any wildfire that ever occurred in the U.S. In virtually every news story about a wildfire, the number of acres burned is THE NUMBER the press uses; without it, there is no sense of the severity of the fire. Similarly, for every story about “what a bad wildfire season we’ve had”, the press cites the number of fires as well as the acreage burned.
And now, after decades of providing that data to the press and the public, and nearly a decade of posting it on their own website, NIFC wants us to believe it is unreliable?
Seriously, just how hard is it to count the number of fires that have happened and the number of acres burned?
What NIFC is doing is essentially labeling every firefighter, every fire captain, every forester, and every smoke jumper who has fought wildfires for decades as being untrustworthy in their assessment and measurement of this critical, yet very simple fire data. I’ll take data from people on the fire scene over government bureaucratic doublespeak every day of the week and twice on Sundays.
This whole affair is outrageous. But what is even more outrageous is that NIFC isn’t at all transparent as to the reason for the change. They essentially say “The data prior to 1983 is no good, trust us”. There is no citation of a study, no methodology given, no rationale for the removal. That’s not science, that’s not statistics, that’s not even sensible, but that is what is happening.
Plotting the entire NIFC dataset (before it was partially disappeared) gives us some hints as to why this has been done, and how wildfire and weather patterns have been inextricably linked for decades. Note Figure 2 below, which combines the number of fires and the number of acres burned, along with the annotations I have added.
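If you want to build something like Figure 2 yourself, a dual-axis plot of the archived figures does the job. Here is a minimal sketch, again assuming the same hypothetical CSV and column names used in the earlier sketch.

```python
# A minimal sketch: put the archived fire counts and acreage on one chart with
# two y-axes, similar in spirit to Figure 2. Same hypothetical CSV as above.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("nifc_archived.csv")

fig, ax_fires = plt.subplots(figsize=(10, 5))
ax_acres = ax_fires.twinx()  # second y-axis sharing the same x-axis

ax_fires.plot(df["year"], df["fires"], color="tab:blue", label="Number of fires")
ax_acres.plot(df["year"], df["acres_burned"], color="tab:red", label="Acres burned")

ax_fires.set_xlabel("Year")
ax_fires.set_ylabel("Number of fires", color="tab:blue")
ax_acres.set_ylabel("Acres burned", color="tab:red")
ax_fires.axvline(1983, color="gray", linestyle="--")  # where the current NIFC page now begins
ax_fires.set_title("U.S. wildfires: counts and acreage, full archived record")

fig.tight_layout()
plt.show()
```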
Clearly, what NIFC has done by labeling data prior to 1983 “unreliable” and disappearing it is not just hiding important fire history; it is cherry picking a starting point that is the lowest in the entire record, ensuring that an upward trend exists from that point.
The definition of cherry picking is:
Cherry picking, suppressing evidence, or the fallacy of incomplete evidence is the act of pointing to individual cases or data that seem to confirm a particular position while ignoring a significant portion of related and similar cases or data that may contradict that position.
And by choosing the lowest point in the record for total fires, 1983, and making all data prior to that unavailable, NIFC ensures that any comparison between fires and climate change over the last 38 years always shows an upward trend and correlation with rising temperature.
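You can put a rough number on that effect by fitting a simple least-squares trend line to the acreage series from different starting years. The sketch below (same hypothetical CSV as before) shows how the computed slope depends entirely on where you choose to begin.

```python
# A minimal sketch: fit an ordinary least-squares trend line to acres burned
# starting from different years, to show how the choice of start point drives
# the slope you report. Same hypothetical "nifc_archived.csv" as above.
import numpy as np
import pandas as pd

df = pd.read_csv("nifc_archived.csv")

for start in (1926, 1960, 1983):
    sub = df[df["year"] >= start]
    # np.polyfit with degree 1 returns (slope, intercept) of the best-fit line.
    slope, _ = np.polyfit(sub["year"], sub["acres_burned"], 1)
    print(f"Linear trend starting in {start}: {slope:,.0f} acres per year")
```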
It seems to me that NIFC very likely caved to pressure from climate activists to disappear this inconvenient data. By erasing the past data, NIFC has become untrustworthy. This erasure is not just unscientific, it’s dishonest and possibly fraudulent.
For posterity, the entire dataset from NIFC (including pre-1983) is available here in an Excel (.xlsx) file:
UPDATE: Here is an analysis paper from 2015, hosted on the U.S. Forest Service website, that uses the same data:
https://www.fs.fed.us/research/sustain/docs/national-reports/2003/data/documents/Indicator%2015/Indicator%2015.pdf