
Wednesday, December 18, 2019

Why Modern Famine Predictions Failed


 What's Natural column

published in the Pacifica Tribune, Wednesday, December 18, 2019

Victims of the 1876 Famine in India


When I graduated from high school in 1968, predictions of environmental collapse were rampant and eco-catastrophe theories flourished. The highly influential Stanford scientist Dr. Paul Ehrlich dominated the doomsday media, stating, “Most of the people who are going to die in the greatest cataclysm in the history of man have already been born.” Predicting global famine in 1970, he wrote, “The death rate will increase until at least 100-200 million people per year will be starving to death during the next ten years.”

Why were Ehrlich’s apocalyptic predictions so wrong? Ehrlich believed the promise of the “green revolution” via high-yield crops and better cultivation practices could never keep pace with the needs of a growing human population. Indeed, the early distribution of high-yield “miracle seeds” had failed to stave off famines during the cooler 1960s. But then the earth began to warm, a CO2 fertilization effect emerged, and the growing season lengthened in concert with great leaps forward in genetics, biotechnology and agricultural innovations.

From NASA: CO2 fertilization - change in leaf area across the globe from 1982-2015. Credits: Boston University/R. Myneni


To produce high-yield crops, early researchers simply cross-bred compatible plants possessing desirable traits. Genetic manipulation via selective breeding had been practiced for thousands of years as humans created a wide variety of farm animals, and dog breeds like Chihuahuas and Great Danes from their ancestral wolf. Similarly, selective breeding transformed a scraggly grass into modern corn. Later, “mutant breeding” evolved in the 20th century as seeds were exposed to gamma rays in a search for “hopeful monsters.” The public largely fails to recognize that they likely consume many of the 3,000 crop varieties created via mutant breeding, such as high-yield barley, oats and grains commonly used in making premium beers and whiskey. For chocolate lovers, mutant breeding created a cocoa tree resistant to a deadly fungus. At California’s UC Davis, scientists irradiated rice seeds to create a high-yield variety that reached supermarket shelves in 1976 as “Calrose” rice, which still dominates other varieties in many regions of the Pacific.

Next, the discovery of “restriction enzymes” in 1970 allowed scientists to engineer organisms by surgically removing useful genes from one species and placing them into another. Such genetically engineered plants now comprise most of today’s soybean, corn and cotton crops, as well as some varieties of potatoes, apples, papayas and sugar beets. Yields increased as engineered crops were better able to flourish under stressful conditions. Plants became more resistant to specific insect and fungal pests, producing greater yields without pesticides. Despite the tremendous benefits of genetically modified crops (GMOs), many people remained distrustful of possible detrimental health effects. More radical groups condemned GMOs simply because they generated profits for large businesses.


Golden Rice Adds Vitamin A


The strange battle between pro- and anti-GMO groups is best illustrated by the Golden Rice saga. Golden Rice was first engineered in the 1990s as a non-profit attempt to prevent blindness and premature deaths from vitamin A deficiency. That deficiency afflicted 250 million children, mostly in Asia, and killed more than 200,000 people a year. Because rice is those children’s primary food source, two German scientists, Ingo Potrykus and Peter Beyer, removed a gene from daffodils that produces beta-carotene and carefully inserted it into rice. Beta-carotene is the key building block for vitamin A and gives the rice its golden color. Later a more efficient gene from corn was used. Potrykus and Beyer also insisted the technology to create Golden Rice be donated freely. So, the biotech company Syngenta waived its right to commercialize the product. The humanitarian benefits of free Golden Rice could not be clearer.

Nonetheless, opponents of GMOs, led primarily by Greenpeace, vilified Golden Rice. Greenpeace lobbied countries around the world to prevent legalization of Golden Rice by generating as much fearful speculation as possible about “imagined” health repercussions. Greenpeace was also fanatical about fighting biotechnology companies. They justified their propaganda campaign against Golden Rice by claiming, “Corporations are overhyping golden rice benefits to pave the way for global approval of other more profitable genetically engineered crops.”

However, as Golden Rice continued to be proven safe, a letter signed by more than 100 Nobel laureates accused Greenpeace of leading a “fact-challenged propaganda campaign against innovations in agricultural biotechnology.” They demanded Greenpeace end its campaign against GMOs. Patrick Moore, a co-founder of Greenpeace, had previously left the organization because its original good intentions were being subverted by extremists. Moore bemoaned, “They're linking Golden Rice with death, which scares parents into not wanting the technology developed.” Such false propaganda so infuriated Moore that he created an alternative movement, Allow Golden Rice Now. As support grows, countries are increasingly moving towards legalizing Golden Rice.

Ehrlich’s doomsday predictions were incomprehensibly wrong. That environmental groups like Greenpeace vehemently propagandized against the very technologies that prevented Ehrlich’s doomsday is even harder to comprehend. But then again, there are many activists who see humanity only as the earth’s scourge, best eliminated.



Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism


Wednesday, December 4, 2019

Why Worse Wildfires? Part 2

What’s Natural?

Why Worse Wildfires?  Part 2

Fighting the Ranch Fire


Why worse wildfires? The short answer is more humans cause more wildfire ignitions in altered landscapes. Since 1970, California’s population doubled, adding 20 million people. As more human habitat was developed, the increasingly disturbed landscape quickly became covered in easily ignitable invasive grasses (see part 1). To protect human habitat, fires were suppressed and ground fuels increased. Development also expanded a vulnerable electric grid. Furthermore, more people increased the probability of careless fires and more innocent accidents. And sadly, a larger population added more arsonists.

During a typically warm and dry July day, a rancher was innocently driving a stake into the ground to plug a wasp’s nest. Surrounded by dry grass, the hammer’s spark ignited a devastating inferno named the Ranch Fire. Despite sensationalists’ hype, global warming had not made the grass drier. Grass becomes highly combustible after just a few hours of dry weather. And, as in most of northern California, there has been no warming trend in maximum summertime temperatures. Based on Western Regional Climate Center data, maximum summer temperatures in the Mendocino area have cooled by 3°F since the 1930s. The rapidly spreading Ranch Fire soon merged with another fire to form the Mendocino Complex Fire, California’s largest documented fire.

Similarly, a highway accident sparked roadside grasses that kindled northern California’s 7th largest fire, the Carr Fire.

Summertime cooling trend at Ukiah, Mendocino County, California


Careless fires cannot be considered accidents and offenders should be held accountable. A hunter’s illegal and improperly attended campfire caused the August 2013 Rim Fire, centered around Yosemite National Park. It was California’s 5th largest fire. 

Governments and utility companies should likewise be held accountable for carelessly maintained electric grids. An electric spark ignited California’s deadliest fire, the Camp Fire, which destroyed the town of Paradise and killed 85 people. A 2018 research paper estimates, “Since the year 2000 there’ve been a half-million acres burned due to powerline-ignited fires, which is five times more than we saw in the previous 20 years.”

More disturbing is the number of fires started by arson. According to the U.S. Fire Administration, nationally, as in California, one in every five brush, grass, or forest fires since 2007 was intentionally set. Arsonists have recently been charged for some of California’s 2019 fires. Arson accounted for 55% of Kentucky’s fires and is the leading cause of Florida’s fires. Because arson is so difficult to prove, arson statistics are probably underestimated. So, experts in Australia combine arson and “suspicious” fires to argue that half of Australia’s fires were likely intentionally set. That means each year 31,000 Australian bushfires are intentionally ignited. And as in the American west, Australia’s bush fires have been increasingly fueled by invasive grasses like buffel grass.

Wildfires caused by natural lightning ignitions peak during the summer months of July and August, and become virtually non-existent in the autumn and winter. In contrast, human ignitions have created year-long fire seasons. Counter-intuitively, California experiences its most dangerous fire weather during the cooler and wetter seasons. As seasonally cold air settles in over the high mountain deserts in autumn and winter, episodes of high winds, known as the Santa Ana and Diablo winds, flow downslope. Sinking air warms 5°F for every 1000-foot drop in elevation, so these downslope winds can raise lowland temperatures 25°F in just a few hours. That warming causes relative humidity to fall, so these winds rapidly suck moisture out of whatever vegetation they pass over. In combination with faster-spreading embers, fires burn 2 to 3 times more area during high wind events.
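For readers who want to check that arithmetic, here is a minimal sketch of the compressional warming calculation, assuming only the dry lapse-rate figure quoted above; the starting temperature and the 5000-foot descent are hypothetical examples.

```python
# Rough sketch of downslope (compressional) warming using the figure quoted
# above: sinking air warms about 5 F for every 1000 feet of descent.
# The starting temperature and elevation drop are hypothetical examples.

LAPSE_F_PER_1000FT = 5.0

def downslope_temperature(start_temp_f: float, drop_ft: float) -> float:
    """Air temperature after descending drop_ft feet."""
    return start_temp_f + LAPSE_F_PER_1000FT * (drop_ft / 1000.0)

# Cold high-desert air at 45 F sinking 5000 feet toward the lowlands:
print(downslope_temperature(45.0, 5000.0))  # 70.0 F, i.e., a 25 F rise
```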

Human vs Lightning Wildfire Ignitions from Balch 2017


Under natural conditions, seasonally extreme winds never coincided with the season of abundant lightning. But human ignitions have increased the probability that fires start during strong cool-weather winds. California’s 2nd biggest fire, the Thomas Fire, was ignited in December by a downed power line during high winds. The third largest fire, the Cedar Fire, was ignited in October by a lost hunter who carelessly lit a signal fire. California’s deadliest fire, the Camp Fire, was ignited by a powerline and fiercely spread due to a November high wind event.

Climate change does not ignite fires. Climate change does not affect how quickly dead grasses and bushes can dry. Climate change may affect the winds, but any warming, natural or human, would reduce those extreme winds. Regarding California’s worst fires, a US Geological Survey wildfire expert states, “Some will argue that it’s climate change, but there is no evidence that it is. It’s the fact that somebody ignites a fire during an extreme [wind] event.”

Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism



Friday, November 22, 2019

Why Worse Wildfires? Part 1


What’s Natural?

published in the Pacifica Tribune, November 20, 2019

Why Worse Wildfires? Part 1

SAGE GROUSE LEKKING



There are several theories trying to explain the recent uptick in wildfires throughout the western USA. Some scientists blame increased human ignitions. Others point to surface fuels accumulating after a century of fire suppression. Others argue landscape changes and invasive grasses have amplified the amount of easily ignited vegetation, while still others blame climate change. What’s the Sage Grouse connection? Like human communities, the Sage Grouse’s habitat is being threatened by fast-spreading wildfires, and that increase in bigger wildfires in sagebrush country is due to invading annual grasses, like cheatgrass.

Historically, hot, dry sagebrush habitat rarely burned (just once every 60-100 years) because slow-growing, patchy sagebrush provides only scant surface fuels, incapable of supporting large and frequent fires. But the invasion of introduced annual grasses, like cheatgrass, has changed all that. As one wildlife researcher lamented, “The color of Nevada has changed from a sagebrush silver gray to a cheatgrass tawny brown since the 1990s”. Likewise, in the 1800s California’s hills were covered with perennial grasses that stayed green during the summer. Now California’s hills are golden brown as highly flammable annual grasses have taken over.



Cheatgrass-dominated sagebrush habitat now burns every 3-5 years, up to 20 times more frequently than under historic natural conditions. Extensive research on the effects of cheatgrass found habitats with high cheatgrass abundance are “twice as likely to burn as those with low abundance, and four times more likely to burn multiple times between 2000-2015.” What makes cheatgrass such a problem?

Invading annual grasses germinate earlier in the season and deprive the later-germinating native grasses of needed moisture. These foreign grasses die after setting seed, leaving highly flammable fuels that can burn much earlier in the year and thus extend the fire season. Eleven of the USA’s 50 biggest fires in the last 20 years have been in Great Basin sagebrush habitats, where invasive cheatgrass is spreading. Nevada’s largest fire was the 2018 Martin Fire. Rapidly spreading through the cheatgrass, it burned 439,000 acres, a burned area rivaling California’s largest fires in recorded history.



Cheatgrass in Juniper Woodland




The 2012 Rush Fire was California’s 4th largest fire since 1932, burning 272,000 acres of sagebrush habitat in northeastern California. It then continued to spread, burning an additional 43,000 acres in Nevada. The 2018 Carr Fire was California’s 7th largest fire and threatened the town of Redding, California. It started when a towed trailer blew a tire, causing its wheel rim to scrape the asphalt. The resulting sparks were enough to ignite roadside grasses. Grass fires then carried the flames into the shrublands and forests, where burning grasses served as kindling to ignite less-flammable trees. Likewise, grasses were critical in spreading northern California’s biggest fires. In southern California, as humans ignite more and more fires, shrublands are being converted to more flammable grasslands.

Wildfire experts classify grasses as 1-hour fine fuels, meaning dead grass becomes highly flammable with just one hour of warm, dry conditions. When experts estimate impending fire danger, they calculate the extent of a region’s fine fuels to determine how fast a fire will spread. The amount of small-diameter fuels, like grasses that can dry out in an hour, as well as twigs and small branches that dry out within 10 to 100 hours of dry weather, determines how fast the winds will spread a fire. It does not matter if it was wet and cool, or hot and dry, during previous weeks or years. Just one hour of warm, dry fire weather sets the stage for an explosive grass fire. Decades of climate change are totally irrelevant.
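The hour ratings refer to a fuel’s “timelag”: roughly the time a dead fuel needs to lose about 63% (1 - 1/e) of the difference between its current moisture and the equilibrium moisture of the surrounding air. The sketch below only illustrates that idea with hypothetical moisture values; it is not the official fire-danger calculation.

```python
import math

# Illustrative timelag drying: dead-fuel moisture decays exponentially toward
# the equilibrium moisture of the surrounding air, with a time constant equal
# to the fuel's timelag (1 hour for grasses, 10-100 hours for twigs/branches).
# The moisture percentages below are hypothetical, for illustration only.

def fuel_moisture(m_start, m_equilibrium, hours, timelag_hours):
    """Fuel moisture (%) after `hours` of constant drying conditions."""
    return m_equilibrium + (m_start - m_equilibrium) * math.exp(-hours / timelag_hours)

# Fuels starting at 30% moisture in warm, dry air with ~5% equilibrium moisture:
for timelag in (1, 10, 100):
    print(timelag, round(fuel_moisture(30.0, 5.0, hours=3, timelag_hours=timelag), 1))
# After only 3 hours the 1-hour grass is close to the 5% equilibrium and highly
# flammable, while the 100-hour branch wood has barely dried at all.
```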

Some scientists point out that certain logging practices also spread “invasive grasses”. For that reason, California’s Democrat congressman, Ro Khanna, has been arguing that the U.S. Forest Service policy to clear cut after a wildfire is making California’s forest fires spread faster and burn hotter by increasing the forest floor’s flammable debris. Khanna warns, “Because we don’t have the right science, it is costing us lives, and that is the urgency of getting this right.” 

Bad analyses promote bad remedies, and blaming climate change has distracted people from real solutions. The cheatgrass problem will continue to cause bigger, fast-moving fires no matter how the climate changes. But there are several tactics that could provide better remedies. Holistic grazing that targets annual grasses before they set seed is one tactic. Better management of surface fuels via prescribed burns is another, as well as more careful logging practices. And re-seeding habitat with native perennial grasses or sagebrush could help shift the competitive balance away from cheatgrass. In combination with limiting human ignitions (see part 2), all those tactics may ensure healthy populations of Sage Grouse living alongside safer human communities.

Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism


Monday, November 18, 2019

Venice and Unenlightened Climate Fearmongering



Venice and Unenlightened Climate Fearmongering

 
Venice flooding 2019




Venice is composed of a hundred linked islands and sits in the center of the shallow Venice Lagoon. Island elevations are low and easily flooded during storms. The Great Flood of 1966 was the worst on record. Since then, Venice has been working to avert the next inevitable flood. But because its flood control projects were fraught with corruption and other difficulties, the government failed to prevent the 2019 flood. So, as Venice experienced its 2nd greatest flood, the mayor covered his political derriere and immediately blamed climate change. But that’s a tactic typical of politicians these days. In California, ex-governor Jerry Brown vetoed a bipartisan bill to secure the electrical grid. Shortly thereafter, power-line sparks ignited some of California’s biggest wildfires, so of course Brown blamed climate change to disguise his policy failures.

Figure 1. Venice Lagoon and its 3 inlets


As seen in Figure 1 above, sea level rise in the Venice Lagoon is modulated by how much water from the Adriatic Sea enters the lagoon via 3 inlets and how quickly it flushes out again. To prevent further flooding, Venice began designing the MOSE project, which would construct inflatable barriers that could be deployed when weather conditions predicted threatening inflows from the Adriatic Sea. High inflows from the Adriatic Sea are driven by the strength of the Sirocco and Bora winds that cause local sea level to surge. 

Increasing Venice’s vulnerability, the land has been sinking. Dwarfing the 1.4 millimeters per year of estimated sea level rise, from 1930 to 1970 Venice sank at a rate of 2.3 millimeters per year, largely due to groundwater extraction. After that problem was addressed, the rate of sinking slowed, but Venice continues to sink at about 1 millimeter per year. Furthermore, due to alterations of the lagoon’s basin, the amplitude of the tides has been changing, which accounts for 20% of the rise in extreme sea level events. That tidal effect was largely caused by altered flows through the inlets, a result of dredging for ship traffic and of construction for the MOSE project.
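To put those rates in perspective, relative sea level at Venice is roughly the land subsidence plus the rise of the sea itself. A minimal back-of-the-envelope sketch using only the rates quoted above for the 1930-1970 interval:

```python
# Relative sea-level rise at a sinking city is approximately land subsidence
# plus the rise of the sea surface. The rates are those quoted above for
# 1930-1970; this is a rough illustration, not a formal reconstruction.

subsidence_mm_per_yr = 2.3        # Venice sinking, largely from groundwater extraction
sea_rise_mm_per_yr = 1.4          # estimated sea level rise
years = 1970 - 1930

relative_rise_mm = (subsidence_mm_per_yr + sea_rise_mm_per_yr) * years
subsidence_share = subsidence_mm_per_yr / (subsidence_mm_per_yr + sea_rise_mm_per_yr)

print(round(relative_rise_mm))        # ~148 mm of relative rise over those 40 years
print(round(subsidence_share, 2))     # ~0.62 -> sinking land did most of the work
```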

Venice Great Flood of 1966


As has become typical for every catastrophe, media outlets shamelessly and mindlessly blame climate change for Venice’s flooding. Others, like Dr. Marshall Shepherd writing for Forbes, attempted to appear more objective by acknowledging that many factors had contributed to the flooding. But Shepherd’s real intent was to ensure that people would still blame climate change, at least in part, and to suggest that skeptics were biased by focusing only on Venice’s sinking land. There is much more to the skeptics’ arguments. Furthermore, Dr. Shepherd failed to provide any support for his climate change claims. But that is to be expected, as the evidence provides very little support for Venice’s mayor or Shepherd.


If climate change had really played a significant role, then we would expect the flooding to be worse in 2019 than during the more “natural flooding” in 1966. But a comparison of floodwaters inundating Doge’s Palace (see above) suggests the flooding was slightly worse in 1966. Official measurements likewise determined flood levels in the Venice Lagoon peaked at 74 inches, shy of the 1966 record of 76 inches. The climate change argument is weakened further when it is understood that the 1966 flood happened during a low tide, in contrast to the 2019 flood, which happened during an extreme high tide. Furthermore, there is no correlation with global warming, as the November 1966 flood happened when Venice experienced its coldest temperatures since 1924. Recent Venice temperatures are slightly lower than in the 1950s (Figure 2).


Figure 2 Venice Temperature Trend

The Venice Lagoon is situated at the northernmost end of the Adriatic Sea. The Adriatic Sea is bordered by mountains on both its eastern and western boundaries. That geography creates a funnel effect. Each autumn the Sirocco winds begin to intensify. These winds drive warm air from Africa northward, which in turn pushes Adriatic Sea water northward up the “funnel”. The end result is that sea water piles up in front of the 3 inlets and begins flooding the shallow Venice Lagoon. Stronger winds drive greater flooding. And if the winds are strong enough, they temporarily prevent sea water from exiting the lagoon, causing sea level to rise even higher.

Naturally, one would ask whether climate change has caused an increasing trend in the Sirocco winds. But there has been no trend.





We should also ask how much sea level rise has affected Venice. It could certainly be argued that rising sea levels since 1900 contributed about 100 millimeters (4 inches) to the 1966 Great Flood, as there was a steady rise in sea level between 1900 and 1970. But between 1970 and 2000 the Venezia (Venice) tide gauge shows sea level peaked around 7150 millimeters and then plateaued (Figure 3). Unfortunately, that tide gauge was then moved to a new location, designated Venezia II. There sea level began at a lower elevation and rose from 2001 to 2010, again plateauing just under 7150 millimeters (Figure 4).


Figure 3  Venezia (Venice) Sea Level 1910-2000


Figure 4 Venezia II (Venice) Sea Level 2001-2016


Because various parts of Venice are sinking at different rates, it is difficult to know how much the new tide gauge location affected the newer estimates of sea level change. Due to the uncertainty caused by Venice’s sinking land, researchers typically compare Venice sea level trends to those at neighboring Trieste, in the far northeast corner of the Adriatic Sea, where the land appears to be more stable. Surprisingly, the Trieste sea level trend has been declining since 2000 (Figure 5). So, it would appear impossible to attribute the 2019 Venice flood to rising Adriatic sea level and climate change.
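The comparison being made here amounts to fitting linear trends to the two monthly tide-gauge records and differencing them: because both gauges sample the same basin-wide sea level, the Venice-minus-Trieste difference approximates Venice’s local subsidence. A minimal sketch with synthetic monthly values (real monthly records for Venezia and Trieste are available from the PSMSL archive):

```python
import numpy as np

# Sketch: fit least-squares trends (mm/yr) to two monthly tide-gauge series and
# difference them. Both gauges sample the same Adriatic sea level, so the
# Venice-minus-Trieste trend difference roughly isolates local subsidence.
# The series below are synthetic placeholders, not the real PSMSL records.

rng = np.random.default_rng(0)
t = np.arange(2001, 2017, 1 / 12.0)                              # monthly time axis

venice  = 7100 + 2.5 * (t - 2001) + rng.normal(0, 40, t.size)    # mm, synthetic
trieste = 7000 + 0.5 * (t - 2001) + rng.normal(0, 40, t.size)    # mm, synthetic

def trend_mm_per_yr(time, sea_level):
    """Least-squares slope of sea level against time."""
    return np.polyfit(time, sea_level, 1)[0]

v, tr = trend_mm_per_yr(t, venice), trend_mm_per_yr(t, trieste)
print(round(v, 2), round(tr, 2), round(v - tr, 2))  # Venice, Trieste, implied subsidence
```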

Figure 5 Trieste Sea Level 2001-2016


However, there is another factor to consider. The winds in the northern Adriatic Sea cause sea levels to oscillate from east to west across the Adriatic’s northern basin. When sea levels fall around Trieste, they often surge around the Venice Lagoon. Thus, at least in part, higher sea levels in the Venice Lagoon are driven by an ocean oscillation that piles water up at its end of the basin. And when that oscillation coincides with strong Sirocco winds, a sinking Venice should expect more flooding.

In contrast, it’s unclear what effect is caused by global warming. Perhaps it’s negligible. Unfortunately for the public, that doesn’t stop media outlets from falsely hijacking the hardships in Venice to push a climate crisis. Alarmists continue to falsely suggest that every catastrophe has been partly driven by CO2 global warming.  Sadly, as savvy propagandists know, if you tell a big enough lie often enough, people will start believing the lie.



Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism


How Bad Science & Horrific Journalism Misrepresent Wildfires and Climate



How Bad Science & Horrific Journalism Misrepresent Wildfires and Climate 





As one wildfire expert wrote, “Predicting future fire regimes is not rocket science; it is far more complicated than that.” But regardless of accuracy, most people are attracted to very simple narratives such as: more CO2 causes global warming, which causes more fires. Accordingly, in the summer of 2019, CNN trumpeted the headline “California wildfires burn 500% more land because of climate change.” They claimed, “the cause of the increase is simple. Hotter temperatures cause drier land, which causes a parched atmosphere.” CNN based their claims on a scientific paper by lead authors Park Williams and John Abatzoglou, titled Observed Impacts of Anthropogenic Climate Change on Wildfire in California. The authors are very knowledgeable but appear to have hitched their fame and fortune to pushing a very simple claim that climate change drives bigger wildfires. As will be seen, their advocacy appears to have caused them to stray from objective scientific analyses.

If Williams and Abatzoglou were not so focused on forcing a global warming connection, they would have at least raised the question, ‘why did much bigger fires happen during cooler decades?’ The 1825 New Brunswick fire burned 3,000,000 acres. In Idaho and Montana the Great Fire of 1910 burnt another 3,000,000 acres. In 1871, the Great Michigan Fire burned 2,500,000 acres. Those fires were not only 6 times larger than California’s biggest fire, they occurred in moister regions, regions that don’t experience California’s Mediterranean climate with its guaranteed months of drought each and every summer. If those huge devastating fires occurred in much cooler times, what are the other driving factors of big wildfires? 

Bad analyses cause bad remedies, and here is why Williams and Abatzoglou’s latest paper exemplifies a bad scientific analysis. Analyzing changes in California’s burned areas from 1972 to 2018, they claimed, “The clearest link between California wildfires and anthropogenic climate change thus far, has been via warming-driven increases in atmospheric aridity, which works to dry fuels and promote summer forest fire.” But natural cycles of low rainfall due to La Niñas also cause dry fuels. The increase in burned area is also attributed to increases in human ignitions such as faulty electrical grids, to increased surface fuels from years of fire suppression, and to changes in vegetation that increased the abundance of easily ignited fine fuels like annual grasses. Furthermore, temperatures in some local regions experiencing the biggest fires have not been warming over the past 50 years (see temperature graphs in this essay’s last segment; data from the Western Regional Climate Center). All those factors promote rapid wildfire spread and greater burned areas. Although good science demands separating those contributing factors before analyzing a possible correlation between temperature and area burned, Williams and Abatzoglou oddly did not do so! That’s bad science.

Although Williams and Abatzoglou did acknowledge that other factors modulate the effects of warming on burned areas, they admitted their statistical correlations did not “control” for those effects. To “control” for all those contributing factors, they could have easily subtracted estimates of burned areas associated with those factors. For example, a 2018 research paper estimates, “Since the year 2000 there’ve been a half-million acres burned due to powerline-ignited fires, which is five times more than we saw in the previous 20 years.” Did Williams and Abatzoglou skip the needed subtractions of other well-established factors because doing so would weaken their global warming correlation?
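The suggested “control” is straightforward in principle: subtract the burned area attributable to a known non-climatic factor, such as powerline ignitions, from each year’s total before testing the temperature correlation. A minimal sketch of that mechanic, using entirely synthetic numbers rather than the real burned-area records:

```python
import numpy as np

# Sketch of controlling for a known non-climatic contributor: subtract the
# burned area attributed to (say) powerline ignitions from each year's total
# before correlating the remainder with temperature. Every number below is a
# synthetic placeholder, used only to show the mechanics of the adjustment.

rng = np.random.default_rng(1)
year = np.arange(47)                                        # e.g., 1972-2018
tmax = 0.03 * year + rng.normal(0, 0.5, year.size)          # warm-season Tmax (arbitrary units)
powerline = np.clip(10.0 * year + rng.normal(0, 20, year.size), 0, None)  # rising powerline acreage
other = 100 + 10 * tmax + rng.normal(0, 30, year.size)      # everything else
total = powerline + other

raw_r = np.corrcoef(tmax, total)[0, 1]
controlled_r = np.corrcoef(tmax, total - powerline)[0, 1]
print(round(raw_r, 2), round(controlled_r, 2))
# The raw correlation absorbs the rising powerline-ignited acreage; subtracting
# that attributed acreage first tests the temperature link on what remains.
```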

Similarly, CNN journalists were content to simply blame climate change. However, in light of the increasing devastation caused by powerline-ignited fires, good investigative journalists should have asked former California Governor Jerry Brown if he now regrets having vetoed the bipartisan bill crafted to secure the power grid, an action that could have saved so many lives and so much property. Instead, CNN simply promoted Brown’s persistent climate fearmongering, quoting, “This is only a taste of the horror and terror that will occur in decades.”

Ignoring the complex effects of human ignitions, CNN also parroted claims that global warming is causing fire season to last all year. But as seen in the graph below from a 2017 wildfire study, the United States’ natural, lightning-driven fire season dominates only during the months of July and August, when California’s high wind events are rare. In contrast, it is human ignitions that extend the fire season, dramatically increasing ignitions throughout the winter months when fuel moisture is higher, and into seasons when cooling desert air generates strong episodes of Santa Ana and Diablo winds. Those high winds cause fires to spread rapidly, burning 2-3 times more area than fires ignited during low winds, and California’s most destructive fires recently occurred during those high wind events. However, like other researchers, Williams and Abatzoglou reported no trend in those destructive California winds. Furthermore, climate models suggest a warming climate should cause weaker winds. So, without a change in California’s windy conditions, high winds can’t be blamed, directly, for the increased burned areas. However, because more human-caused ignitions occur during the winter, the probability increases that more fires will be amplified by those strong winter winds. As a US Geological Survey wildfire expert states, “Some will argue that it’s climate change but there is no evidence that it is. It’s the fact that somebody ignites a fire during an extreme [wind] event.”


Human Ignitions Extend Wildfire Season from Balch 2017


The timing of human ignitions is but one driver of more and bigger fires. Increased surface fuels are another huge factor. It is well known that past fire suppression has allowed surface fuels to accumulate in forests, leading to bigger and more devastating fires. But the changes in surface fuels are more complex. Some scientists point out that certain logging practices spread invasive grasses: “cheat grass, for example, and other ones that form this really thick mat across the area after logging and that grass just spreads flames very rapidly and fires burn very intensely through that.” California’s Democrat congressman Ro Khanna has been arguing that the U.S. Forest Service policy to clear cut after a wildfire is making California’s forest fires spread faster and burn hotter by increasing the forest floor’s flammable debris. Khanna says, “Because we don’t have the right science, it is costing us lives, and that is the urgency of getting this right.”

Controlling the spread of cheatgrass is urgently needed. Grasses are “fine fuels” that ignite most easily. The 2018 Carr Fire was California’s 7th largest fire and threatened the town of Redding, California. It started when a towed trailer blew a tire, causing its wheel rim to scrape the asphalt and create sparks that ignited roadside grasses. Those grasses carried the fire into the shrublands and forests. Grasses are classified as 1-hour fine fuels, meaning they become highly flammable in just one hour of warm, dry conditions. Climate change is totally irrelevant. It does not matter if it was wet and cool, or hot and dry, during previous days, weeks or years. Just one hour of warm, dry fire weather sets the stage for an explosive grass fire that then gets carried into the forests. Fire weather happens every year, and that partially explains why fires could burn 3,000,000 acres in the cool 1800s.

It was not human ignition but lightning that caused the 2012 Rush Fire. It was California’s 4th largest fire, burning 272,000 acres of sagebrush habitat before continuing to burn additional area in Nevada. Historically, because surface fuels are scarce, hot, dry sagebrush habitat rarely burned (once every 60-100 years). But invasions of non-native cheatgrass have now provided ample fuel to turn small lightning fires into huge conflagrations. Eleven of the USA’s 50 biggest fires in the last 20 years are in the Great Basin, where invasive cheatgrass is spreading. Nevada’s largest fire was the 2018 Martin Fire. Rapidly spreading through the cheatgrass, it burned 439,000 acres. Cheatgrass fires are a great concern for biologists trying to protect the threatened Sage Grouse, as cheatgrass-dominated sagebrush habitat now burns every 3-5 years. Habitats with high cheatgrass abundance are “twice as likely to burn as those with low abundance, and four times more likely to burn multiple times between 2000-2015.”

When experts estimate impending fire danger, they determine how fast a fire will spread. The Spread Component considers the effects of wind and slope and daily changes in the moisture content of the surface fuels. Large dead trees may become flammable after 1000 hours of warm, dry conditions, but such thick fuels only ignite if fast-burning surface fuels supply enough heat. Thus, the Spread Component only considers smaller-diameter fuels like grasses that can dry out in an hour, as well as twigs and small branches that dry out within 10 to 100 hours. Central and Southern California are dominated by shrubby habitat with small-diameter fuels that allow fire to spread rapidly. The December 2017 Thomas Fire was California’s 2nd largest fire. Its human ignition coincided with a Santa Ana wind event, resulting in the burning of 282,000 acres in southern California.
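The real Spread Component comes from the National Fire Danger Rating System’s spread-rate equations. The toy index below is not that formula; it is only a hedged illustration of the inputs the paragraph above describes, with arbitrary weights: stronger wind, steeper slope and drier fine fuels all push the expected spread rate up.

```python
# Toy spread index, NOT the official NFDRS Spread Component: just an
# illustration of how wind, slope and fine-fuel moisture combine. The weights
# and the example inputs are arbitrary, chosen only for illustration.

def toy_spread_index(wind_mph: float, slope_pct: float, fine_fuel_moisture_pct: float) -> float:
    # Treat ~30% moisture as too damp to carry fire; drier fuels scale the index up.
    dryness = max(0.0, 1.0 - fine_fuel_moisture_pct / 30.0)
    return dryness * (1.0 + 0.5 * wind_mph + 0.02 * slope_pct)

# A calm day with damp grass versus a Diablo-wind day with fully dried 1-hour fuels:
print(round(toy_spread_index(wind_mph=5, slope_pct=10, fine_fuel_moisture_pct=25), 2))   # ~0.62
print(round(toy_spread_index(wind_mph=40, slope_pct=10, fine_fuel_moisture_pct=5), 2))   # ~17.67
```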

Counter-intuitively, Williams and Abatzoglou found the correlation between climate and burned area in the hotter and drier Central and South Coast regions to be “relatively weak”. Accordingly, they reported, “Annual burned area did not change significantly in Central and South Coast.” That insignificant climate effect over half of California escaped the notice of journalists, who cherry-picked only the researchers’ more alarming climate narratives. Most interesting, Williams and Abatzoglou suggested the lack of a climate-change correlation with the Central and South Coast burned areas was because fires there were “strongly manipulated by humans via ignitions, suppression, and land cover change.”

Lightning is rare along California’s Central and South Coast, so nearly 100% of those fires are ignited by humans. As California’s population has doubled since the 1970s, adding 20 million people, the probability of more human-started fires has increased. Unlike forested areas, where fire suppression builds up deadly surface fuels, California’s Central and South Coast needs fire suppression. Due to more frequent fires caused by humans, shrublands are converting to grasslands. The increased fine fuels of the grasslands more readily ignite and spread fire. Furthermore, California’s natural climate undergoes wet El Niño years followed by dry La Niña years, and wet years make fine fuels more abundant. Thus, fire suppression is needed to counter the more frequent fires promoted by the conversion of shrublands to grasslands.

In contrast to the insignificant changes in burned areas in California’s southern half, Williams and Abatzoglou reported burned areas in the Sierra Nevada and the North Coast increased by more than 600%, which they attributed to human-caused climate change. They reported, “During 1896–2018, March–October Tmax [maximum temperature] averaged across the four California study regions increased by 1.81 °C, with a corresponding increase in VPD [Vapor Pressure Deficit, a measure of atmospheric dryness] of 1.59 hPa (+13%)… The observed trends in Tmax and VPD are consistent with trends simulated by climate models as part of the CMIP5 experiments, supporting the interpretation that observed increases in California warm-season temperature and VPD have been largely or entirely driven by anthropogenic forcing.”
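VPD is simply the gap between how much water vapor the air could hold at its temperature and how much it actually holds, which is why it climbs quickly as Tmax rises. A minimal sketch using the standard Tetens approximation for saturation vapor pressure; the example temperature and humidity are hypothetical and do not reproduce the paper’s March-October regional averaging.

```python
import math

# Vapor Pressure Deficit (VPD) = saturation vapor pressure at the air
# temperature minus the actual vapor pressure. Saturation vapor pressure uses
# the standard Tetens (FAO-56) approximation. The example temperature and
# relative humidity are hypothetical illustrations.

def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    """Tetens approximation, returned in hPa."""
    return 6.108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vpd_hpa(temp_c: float, rel_humidity_pct: float) -> float:
    return saturation_vapor_pressure_hpa(temp_c) * (1.0 - rel_humidity_pct / 100.0)

# Same 30% relative humidity, afternoon Tmax of 30.00 C versus 31.81 C (+1.81 C):
print(round(vpd_hpa(30.0, 30.0), 1))    # ~29.7 hPa
print(round(vpd_hpa(31.81, 30.0), 1))   # ~32.9 hPa -> warmer air is "thirstier" air
```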

But how can only half of California’s fires be due to global warming and the other half not? After all, all of California is “strongly manipulated by humans via ignitions, suppression, and land cover change.” Were Williams and Abatzoglou straying from objective science?

Part of the problem is their ill-advised use of a maximum temperature averaged over all of California. Several studies have reported that maximum temperatures in the northern half of California have not exceeded the high temperatures of the 1930s. Because early 20th century temperatures were deemed natural, human-caused warming is unlikely unless recent temperatures exceed those natural 1930s highs. Curiouser and curiouser, southern California has experienced temperatures that exceeded the 1930s, yet there Williams and Abatzoglou did not find a significant effect from climate change.
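The statistical point is easy to demonstrate: a statewide average can trend upward even when the sub-region where the fires burned shows no warming at all. A minimal sketch with invented temperature series, used solely to illustrate the averaging effect:

```python
import numpy as np

# Synthetic illustration: a statewide average Tmax can show a warming trend
# even when one sub-region ('north') has no trend at all. All values are
# invented solely to demonstrate the averaging effect.

rng = np.random.default_rng(2)
years = np.arange(1896, 2019)
north = 30 + rng.normal(0, 0.6, years.size)                           # flat, no trend
south = 32 + 0.015 * (years - 1896) + rng.normal(0, 0.6, years.size)  # warming trend
statewide = (north + south) / 2.0

def slope_c_per_century(t, series):
    return 100 * np.polyfit(t, series, 1)[0]

print(round(slope_c_per_century(years, north), 2))      # ~0.0 C/century
print(round(slope_c_per_century(years, south), 2))      # ~1.5 C/century
print(round(slope_c_per_century(years, statewide), 2))  # ~0.75 C/century, masking the flat north
```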

Regardless, Williams and Abatzoglou claimed, “The clearest link between California wildfire and anthropogenic climate change thus far, has been via warming-driven increases in atmospheric aridity, which works to dry fuels and promote summer forest fire.” Yet maximum temperatures, averaged from March through October, in the vicinity of California’s big fires do not indicate global warming. For example, the August 2013 Rim Fire, centered around Yosemite National Park, was California’s 5th largest fire and 2nd largest in northern California, burning 257,000 acres. It was started by a hunter’s illegal campfire that he let get away. Unfortunately, there is no cure for stupid. Nonetheless, Yosemite’s maximum temperatures were warmer in the early 1900s. However, an in-depth study of the Rim Fire found a strong correlation with the amount of shrubland interspersed with the trees.

Trend in Yosemite NP maximum summer temperatures


The November 2018 Camp Fire was California’s deadliest fire, destroying the town of Paradise. It was also its 16th largest, burning 153,000 acres. It was ignited by a faulty power grid during a strong Diablo wind event. Similarly, based on weather data from nearby Chico, CA, maximum temperatures were higher in the 1930s.

Trend in maximum summer temperature for California's deadliest Paradise/Camp Fire


The Mendocino Complex Fire was California’s largest fire (since 1932). In July of 2018 it burned 459,000 acres. The source of its human ignition is still under investigation. Still, those fires were centered around the town of Ukiah, which also reveals a cooling trend since the 1930s.


Trend in maximum summer temperatures for California's largest fire




In October 2017, the wine country’s Tubbs Fire was the 4th deadliest. It burned only 37,000 acres, but high winds drove embers into the dwellings of the heavily populated outskirts of Santa Rosa. Again, global warming was irrelevant, as Santa Rosa has experienced a cooling trend since the 1930s.



 
Trend in maximum summer temperature for deadly Santa Rosa/Tubbs fire


Still, some people are determined to link catastrophic fires with climate change. So, they will suggest delayed autumn rains allow more late-season ignitions or allow fall fires to burn longer. In their abstract, Williams and Abatzoglou claim, “In fall, wind events and delayed onset of winter precipitation are the dominant promoters of wildfire.” But their results found “no all-region trend in onset of winter precipitation or October–November wet-day frequency during 1915–2018.” As illustrated below by the October precipitation data for Santa Rosa, since 1900 there has been about a 10% chance that no rain falls in October. Furthermore, more zero-rainfall Octobers occurred in the early 1900s. A delay in autumn rains caused by global warming has not yet been detected.

So, doing my best Greta Thunberg imitation, I say to climate alarmists, “How dare you misrepresent the causes of wildfires. How dare you imply less CO2 will reduce human ignitions, reduce surface fuels and stop the spread of invasive grasses. Bad analyses lead to bad remedies! Your bad science is stealing Californians’ dreams and your false remedies distract us from the real solutions. Young and old alike must demand better science and better journalism!”

October precipitation for Santa Rosa, CA



Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism