Michael Tennant of The New American, and Anthony Watts, of the notable climate change site WattsUpWithThat, write that the U.S. National Interagency Fire Center (did I miss that in the Constitution?) appears to have been caught cutting off early years of fire stats, evidently to lend credence to claims of “more fires due to anthropogenic climate change.”
Created in 1965, the National Interagency Fire Center (NIFC) maintains statistics on annual wildfire counts and the number of acres burned in those fires. Until recently, the NIFC posted on its website wildfire statistics for every year since 1926, as evidenced by this Internet Archive screen capture. However, the agency now only posts statistics from 1983 to the present. Why?
Watts provides his answer:
The answer is simple; data prior to 1983 shows that U.S. wildfires were far worse both in frequency and total acreage burned. By disappearing all data prior to 1983, which just happens to be the lowest point in the dataset, now all of the sudden we get a positive slope of worsening wildfire aligning with increased global temperature, which is perfect for claiming ‘climate change is making wildfire[s] worse.’
Indeed, it certainly looks like Tennant and Watts are right.
To prove his point, Watts created graphs from both the original data and the now-scrubbed data. The graph of the complete dataset shows that from the 1920s to the early 1980s, there were far more wildfires covering far more acreage than there have been since. The graph of the current NIFC dataset, on the other hand, suggests an increase in both statistics over time.
And, just as when “scientists” involved with the Intergovernmental Panel on Climate Change (IPCC) did not adjust for the loss of Siberian temperature readings after the fall of the USSR (thus making it appear as if temperatures rose quickly after 1991), Tennant and Watts note that, of two major (and opposing) fire extremes, one has been eliminated from the “history” of U.S. wildfires… Again, why? Writes Tennant:
Another graph generated by Watts sheds further light on the complete dataset. The worst of the wildfires occurred during the 1930–1941 ‘Dust Bowl’ era and again during the 1976–1978 drought in the West. Meanwhile, 1982–1983 saw a ‘super El Nino’ that soaked the western states, causing 1983 to have the fewest and least-destructive wildfires on record. After that, wildfire and acreage counts naturally increased, but thus far they have seldom approached most of the pre-1983 counts and have been far below the counts from the peak years of that era.
Why? The reader can decide for himself or herself, but this dataset manipulation certainly helps to artificially prop up the seemingly endless and unproven claims that “man-made climate change is leading to more wildfire disasters.”
As I noted for MRCTV in 2019, much of the fire trouble experienced in the West is directly tied to the fact that the land is government-run and that, in many cases, the government (especially in California) allows the government-granted power oligopoly to run spark-producing electricity lines over the badly managed, tinderbox land. Here’s a portion of that piece, pertaining to Pacific Gas and Electric (PG&E):
…the core and majority of PG&E’s power lines run above government-owned and mismanaged land. As numerous commentators have observed, the lack of private property ownership, control, and real liability for management of land has led to a shocking history of fires in California and elsewhere in the US. The government won’t go out of business if it causes damage to other people’s property through the mismanagement of land it claims as its own.
And now it appears that government bureaucrats are fudging the data on fires that often have been far worse because government runs the land.
Is it any wonder that many Americans want at least a return to the U.S. Constitution, and ask for the elimination of these bureaus and the return of this government-run land to the private sector, where real liability can come into play to incentivize proper resource allocation and husbandry?