In 1994 the Harvard Business Review insightfully wrote, “The news media and the government are entwined in a vicious circle of mutual manipulation, mythmaking, and self-interest. Journalists need crises to dramatize news, and government officials need to appear to be responding to crises.” So it’s no surprise that NPR headlines hyped “Climate Change Is Driving Deadly Weather Disasters From Arizona To Mumbai,” claiming, “Heat waves. Floods. Wildfires…We know that climate change is to blame.” Nor is it surprising that the New York Times wrote that “climate change is a key culprit” for the American West’s wildfire disasters. And right on cue, the Intergovernmental Panel on Climate Change warns, “It’s code red for humanity.” Governments then promise to fix the “crisis” by controlling your energy and social policies.
Is the media honestly following the science, or once again engaging in “mutual manipulation, mythmaking, and self-interest”?
We ecologists know that wildfires and climate change are extremely complex issues with many contributing variables. Bigger, more intense, and more frequent wildfires, as well as a longer fire season, can each be produced by several different variables. Distinguishing the “key culprit” is not as easy as a NY Times opinion piece suggests. Thus, ecologists are trained to maintain multiple working hypotheses and determine which hypothesis best fits the evidence. This scientific process has grave social importance. Wrong analyses produce wrong remedies, and bad remedies can be worse than the problems they seek to fix. So here are some major competing hypotheses explaining recent upticks in wildfires.
1. Wildfire Suppression
In the 1800s the western USA landscape was a mosaic of open meadows and patchy forests. That mosaic was maintained by frequent wildfires, ignited either naturally or by Native Americans. The patchy mosaic created natural fire breaks that prevented the spread of megafires. Frequent fires also reduced both the ground fuels that cause more intense fires and the ladder fuels that carry fire into the canopies. All wildfire experts agree the policy of wildfire suppression from 1900 to 1970 a) allowed the explosive accumulation of ground fuels and ladder fuels and b) reduced patchiness, which allowed fuel continuity across larger swathes of forest. The increase in wildfires since 1970 coincides with the relaxation of fire suppression policies, when ecologists emphasized that frequent low-intensity fires are needed to maintain forest health and biodiversity.
A US Forest Service report provides photographic evidence of the landscape patchiness in 1909, before wildfire suppression began, versus the increased ground and ladder fuels and the dense, connected forests that had developed by 1979 due to fire exclusion.
2. Human Ignitions
Since 1900, California’s population has increased from 1.5 million people to more than 39 million. More people inevitably produce more accidental ignitions. Eighty-four percent of all USA fires are ignited by people. A larger population also requires a larger electrical grid. California’s 2nd largest fire (the Dixie Fire) was ignited by an electrical spark, as were the deadliest-ever Camp Fire and the 4th deadliest Tubbs Fire. Three of California’s largest fires (the Mendocino, Rim, and Carr fires) were caused by other human accidents.
While the natural fire season, ignited by lightning, extends from May through September, peaking in hot and dry July, human ignitions extend the fire season throughout the entire year. In California this is especially dangerous, as ignitions during the winter are rapidly spread by fierce Santa Ana and Diablo winds. These winds begin to ramp up in October, as increasingly cold seasonal temperatures in the high mountain deserts push dry air down across California toward a relatively warmer Pacific Ocean. A downed powerline in December ignited a fire, spread by the Santa Ana winds, that devastated southern California with its 8th largest fire (the Thomas Fire). California’s 4 deadliest fires (the Camp, Griffith Park, Tunnel, and Tubbs wine country fires) were all rapidly spread by October and November winds. The 2nd deadliest, the Griffith Park fire, was accidentally started in October 1933.
3. Altered Grasslands
Grasses and shrubs produce small-diameter “fine fuels” that rapidly dry within 1 to 10 hours of dry weather. As discussed in part 1, those 1-hour fuels are easily ignited and highly flammable even during freezing temperatures. As seen in Fig. 3 (from Keeley, 2015), grassland fires account for the largest burnt areas. Some fire experts argue the tremendous drop in wildfires during the early 20th century was due not only to fire suppression policies but also to severe overgrazing that reduced the grasslands’ ability to spread fire. Similarly, a recent NASA report determined wildfires globally had “declined by 24 percent between 1998 and 2015.” They attributed this global decline to a change in landscapes across the African savannahs. Previously, fires had been intentionally set to keep grasslands free from invading shrubs and trees in order to support grazing. As villages and homes intruded into the savannah and cultivation of permanent crop fields replaced grassland, the use of fire was reduced.
In 1992, ecologists from UC Berkeley and Stanford University wrote the seminal “Biological Invasions by Exotic Grasses, the Grass/Fire Cycle and Global Change.” Grasses create conditions that favor fire by producing a “microclimate in which surface temperatures are hotter, vapor pressure deficits are larger, and the drying of tissues more rapid than in forests or woodlands.” An invasion of alien grass species “provides the fine fuel necessary for the initiation and propagation of fire. Fires then increase in frequency, area, and perhaps intensity. Following these grass-fueled fires, alien grasses recover more rapidly than native species and cause a further increase in susceptibility to fire.” This cycle had long been well known to land managers, who seeded alien grasses to increase fire frequency and intensity in order to suppress woody species.
The grass/fire cycle has altered landscapes across the world. In Hawaii, alien grasses thoroughly filled the spaces between native shrubs, providing continuous layers of fine fuel. Prior to a 1960s invasion, only 27 fires were recorded in 48 years, each burning an area equivalent to 8 football fields. In the 20 years following the invasion, twice as many fires ignited, each burning an average of 400 football fields. In western North America, the invasion of cheatgrass increased the frequency of fires in Idaho shrublands from once every 60-110 years to every 3-5 years. In eastern Oregon, land dominated by cheatgrass is considered 500 times more likely to burn than other landscapes.
In the Great Basin deserts, due to low fuel abundance, sagebrush ecosystems historically burned just once every 60-100 years. Now cheatgrass-dominated sagebrush habitat burns every 3-5 years, up to 20 times more frequently than under historic natural conditions. Eleven of the USA’s 50 biggest fires in the last 20 years have been in Great Basin sagebrush habitats, where invasive cheatgrass is spreading. Nevada’s largest fire was the 2018 Martin Fire. Rapidly spreading through the cheatgrass, it burned 439,000 acres, a burned area rivaling California’s 4th largest fire in recorded history. Additionally, a greater frequency of fires, due to the combination of the grass/fire cycle and increased human ignitions, has caused large areas of California shrublands to convert to grasslands.
4. Natural Climate Cycles
Most media, like the NY Times, seek out researchers whose work focuses on finding a connection between a climate change crisis and wildfire crises. So typically the media quote Park Williams, John Abatzoglou, Daniel Swain, or Kevin Trenberth. Park Williams summarizes, “This climate-change connection is straightforward: warmer temperatures dry out fuels.” That statement is true, but so is the converse: drier conditions also dry out fuels and raise temperatures. So blaming warmer temperatures may be the tail wagging the dog. Furthermore, it is bad science to apply a 2°F rise in global temperatures to the measured local temperatures where fires ignite. In the USA, 36% of the weather stations with 70+ years of data report cooling trends. It’s also bad science to use average daily temperatures. Rising minimum temperatures are associated with growing populations and may still be below the dew point, which would moisten ground fuels. It’s the maximum temperatures that dry out fuels.
For example, in the region of the huge Mendocino Complex fire, which was accidentally ignited in dry grasses, annual maximum temperatures have cooled since the 1930s, as recorded in the US Historical Climatology Network. This cooling of maximum temperatures holds true throughout northern California.
Contrary to Williams’ suggestion, dryness does not depend on temperature, as witnessed by the warm, wet tropics and the cold, dry tundra. The Sahara Desert was driest during the depths of the Ice Age but converted to the moist Green Sahara as temperatures warmed. Dry conditions are mostly a function of how atmospheric circulation transports moisture from the oceans to the land, and it’s atmospheric circulation that makes the American west so dry and the eastern USA so moist. The most important modulators of that circulation are the natural El Niño cycles (ENSO) and the Pacific Decadal Oscillation (PDO), which shift rains and drought northward and southward. Researchers found that “combined warm phases (positive PDO during El Nino) co-occurred with large fires in the central and northern Rockies, while the combined cool phases (negative PDO during La Nina) appeared to promote large fires in the southern Rockies. Almost 70% of large fires in Rocky Mountain National Park burned during La Nina events that coincided with a negative PDO, although these phases co-occurred during only 29% of the 1700-1975 AD period.”
Fittingly, the recent uptick in forest fires coincides with the natural 21st-century shift of the PDO to its negative phase and the increased frequency of La Niña conditions. This natural cycle of droughts and fires forced Williams to find a narrative that synthesized the scientific evidence of La Niña effects with modeled speculation of climate-crisis-induced wildfires. In 2014 Williams wrote, “The southwestern United States (SW) experienced extreme drought in 2011, related at least in part to a La Niña event in the tropical Pacific Ocean. The 2011 SW drought event was accompanied by record breaking total burned area and record-size ‘megafires’ in the forests of eastern Arizona and northern New Mexico.” He then conflated this with model myth-making, writing, “Model projections developed for the fifth phase of the Coupled Model Intercomparison Project suggest that by the 2050s warming trends will cause mean warm-season vapor pressure deficit to be comparable to the record high VPD (dryness) observed in 2011.”
But climate models have done an extremely poor job of modeling drought. Michael Wehner published the graph below, which was also featured in a National Climate Assessment. The observed (red) fractional extreme drought area over the USA and Mexico was clearly greatest during the 1930s and attributed to landscape changes and natural cycles. The second-worst drought extent occurred in the 1950s. Although modelers already knew those results, their CO2-driven model results (blue) failed to even hint at those historical droughts and accompanying heat waves. All the climate models could do was project imagined disasters in the future. Rarely does the media mention the grass/fire cycle or the dryness of ENSO cycles. Instead, NPR prefers to blame climate change for worse wildfires. The media is indeed stuck in the “vicious circle of mutual manipulation, mythmaking, and self-interest.”