
Friday, November 22, 2019

Why Worse Wildfires? Part 1


What’s Natural?

published in the Pacifica Tribune, November 20, 2019


SAGE GROUSE LEKKING



There are several theories attempting to explain the recent uptick in wildfires throughout the western USA. Some scientists blame increased human ignitions. Others suggest surface fuels have accumulated due to a century of fire suppression. Still others argue landscape changes and invasive grasses have amplified the amount of easily ignited vegetation, while yet others blame climate change. What’s the Sage Grouse connection? Like human communities, the Sage Grouse’s habitat is being threatened by fast-spreading wildfires, and that increase in bigger wildfires in sagebrush country is due to invading annual grasses, like cheatgrass.

Historically, hot, dry sagebrush habitat rarely burned (just once every 60-100 years) because slow-growing, patchy sagebrush provides only scant surface fuels, incapable of supporting large and frequent fires. But the invasion of introduced annual grasses, like cheatgrass, has changed all that. As one wildlife researcher lamented, “The color of Nevada has changed from a sagebrush silver gray to a cheatgrass tawny brown since the 1990s.” Likewise, in the 1800s California’s hills were covered with perennial grasses that stayed green during the summer. Now California’s hills are golden brown as highly flammable annual grasses have taken over.



Cheatgrass-dominated sagebrush habitat now burns every 3-5 years, up to 20 times more frequently than under historic natural conditions. Extensive research on the effects of cheatgrass found habitats with high cheatgrass abundance are “twice as likely to burn as those with low abundance, and four times more likely to burn multiple times between 2000-2015.” What makes cheatgrass such a problem?
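
The frequency figures above follow from simple division of fire-return intervals; a quick sketch using only the numbers quoted in this column (illustrative arithmetic, not a fire model):

```python
# Fire-return intervals quoted above (years between fires).
historic_interval_yr = (60, 100)    # sagebrush historically burned every 60-100 years
cheatgrass_interval_yr = (3, 5)     # cheatgrass-dominated habitat burns every 3-5 years

# Frequency ratio = historic fire-return interval / current interval.
ratios = [h / c for h in historic_interval_yr for c in cheatgrass_interval_yr]
# Ratios span 12x (60/5) to ~33x (100/3); pairing like endpoints (60/3, 100/5)
# yields the commonly cited "up to 20 times more frequently".
```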

Invading annual grasses germinate earlier in the season and deprive the later-germinating native grasses of needed moisture. These foreign grasses die after setting seed, leaving highly flammable fuels that can burn much earlier in the year and thus extend the fire season. Eleven of the USA’s 50 biggest fires in the last 20 years have been in Great Basin sagebrush habitats, where invasive cheatgrass is spreading. Nevada’s largest fire was the 2018 Martin Fire. Rapidly spreading through the cheatgrass, it burned 439,000 acres, a burned area rivaling California’s largest fires in recorded history.



Cheatgrass in Juniper Woodland




The 2012 Rush Fire was California’s 4th largest fire since 1932, burning 272,000 acres of sagebrush habitat in northeastern California. It then continued to spread, burning an additional 43,000 acres in Nevada. The 2018 Carr Fire was California’s 7th largest fire and threatened the town of Redding, California. It started when a towed trailer blew a tire, causing its wheel rim to scrape the asphalt. The resulting sparks were enough to ignite roadside grasses. Grass fires then carried the flames into the shrublands and forests, where burning grasses served as kindling to ignite less-flammable trees. Likewise, grasses were critical in spreading northern California’s biggest fires. In southern California, as humans ignite more and more fires, shrublands are being converted to more flammable grasslands.

Wildfire experts classify grasses as 1-hour fine fuels, meaning dead grass becomes highly flammable with just one hour of warm, dry conditions. When experts estimate impending fire danger, they calculate the extent of a region’s fine fuels to determine how fast a fire will spread. The amount of small-diameter fuels, like grasses that can dry out in an hour, as well as twigs and small branches that dry out within 10 to 100 hours of dry weather, determines how fast the winds will spread a fire. It does not matter if it was wet and cool, or hot and dry, during previous weeks or years. Just one hour of warm, dry fire weather sets the stage for an explosive grass fire. Decades of climate change are totally irrelevant.
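
The timelag classes described above map onto fuel diameter; a minimal sketch using the conventional dead-fuel diameter breakpoints from fire-danger rating (the exact thresholds are assumed here for illustration):

```python
def timelag_class(diameter_inches: float) -> str:
    """Classify a dead fuel particle by its moisture timelag,
    using the conventional diameter breakpoints (assumed for this sketch)."""
    if diameter_inches < 0.25:
        return "1-hour"       # grasses, needles, litter: respond to an hour of drying
    elif diameter_inches < 1.0:
        return "10-hour"      # small twigs
    elif diameter_inches < 3.0:
        return "100-hour"     # branches
    else:
        return "1000-hour"    # large logs: need weeks of warm, dry weather

# A grass stem is a 1-hour fuel; a thick branch needs far longer to dry out.
grass = timelag_class(0.1)
branch = timelag_class(2.0)
```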

Some scientists point out that certain logging practices also spread “invasive grasses”. For that reason, California’s Democratic congressman Ro Khanna has been arguing that the U.S. Forest Service policy to clear-cut after a wildfire is making California’s forest fires spread faster and burn hotter by increasing the forest floor’s flammable debris. Khanna warns, “Because we don’t have the right science, it is costing us lives, and that is the urgency of getting this right.”

Bad analyses promote bad remedies, and blaming climate change has distracted people from real solutions. The “cheatgrass” problem will continue to cause bigger, fast-moving fires no matter how the climate changes. But there are several tactics that could provide better remedies. Holistic grazing that targets annual grasses before they set seed is one tactic. Better management of surface fuels via prescribed burns is another, as are more careful logging practices. And re-seeding habitat with native perennial grasses or sagebrush could help shift the competitive balance away from cheatgrass. In combination with limiting human ignitions (see Part 2), all those tactics may ensure healthy populations of Sage Grouse living alongside safer human communities.

Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism


Monday, November 18, 2019

Venice and Unenlightened Climate Fearmongering




 
Venice flooding 2019




Venice is composed of a hundred linked islands and sits in the center of the shallow Venice Lagoon. Island elevations are low and easily flooded during storms. The Great Flood of 1966 was the worst on record. Since then, Venice has been working to avert the next inevitable flood. But because its flood control projects were fraught with corruption and other difficulties, the government failed to prevent the 2019 flood. So, as Venice experienced its 2nd greatest flood, the mayor covered his political derriere and immediately blamed climate change. But that’s a tactic typical of politicians these days. In California, ex-governor Jerry Brown vetoed a bipartisan bill to secure the electrical grid. Shortly thereafter, power-line sparks ignited some of California’s biggest wildfires, so of course Brown blamed climate change to disguise his policy failures.

Figure 1. Venice Lagoon and its 3 inlets


As seen in Figure 1 above, sea level rise in the Venice Lagoon is modulated by how much water from the Adriatic Sea enters the lagoon via its 3 inlets and how quickly it flushes out again. To prevent further flooding, Venice began designing the MOSE project, which would construct inflatable barriers to be deployed whenever weather forecasts predicted threatening inflows from the Adriatic Sea. High inflows from the Adriatic Sea are driven by the strength of the Sirocco and Bora winds, which cause local sea level to surge.

Increasing Venice’s vulnerability, the land has been sinking. Dwarfing the 1.4 millimeters per year of estimated sea level rise, Venice sank at a rate of 2.3 millimeters per year from 1930 to 1970, largely due to groundwater extraction. After that problem was addressed, the rate of sinking slowed, but Venice continues to sink at a rate of 1 millimeter per year. Furthermore, due to alterations of the lagoon’s basin, the amplitude of the tides has been changing, which accounts for 20% of the rise in extreme sea level events. That tidal effect was largely due to alteration of flows through the inlets caused by dredging for ship traffic and by the MOSE project’s construction.
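
Because subsidence and sea level rise add together at a tide gauge, the rates quoted above combine by simple addition; a sketch using only the figures in this paragraph:

```python
# Rates quoted in the paragraph above, in mm/yr.
eustatic_rise = 1.4           # estimated sea level rise
subsidence_1930_1970 = 2.3    # sinking rate during peak groundwater extraction
subsidence_recent = 1.0       # sinking rate after extraction was curbed

# Relative sea level rise seen at Venice = sea level rise + subsidence.
relative_early = eustatic_rise + subsidence_1930_1970    # 3.7 mm/yr, 1930-1970
relative_recent = eustatic_rise + subsidence_recent      # 2.4 mm/yr more recently

# Over the 40 years of 1930-1970, subsidence alone lowered the city by
# 40 * 2.3 = 92 mm, well over the contribution from sea level rise alone.
subsidence_total_mm = 40 * subsidence_1930_1970
```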

Venice Great Flood of 1966


As has become typical for every catastrophe, media outlets shamelessly and mindlessly blame climate change for Venice’s flooding. Others, like Dr. Marshall Shepherd writing for Forbes, attempted to appear more objective by acknowledging that many factors had contributed to the flooding. But Shepherd’s real intent was to ensure that people would still blame climate change, at least in part, and to suggest that skeptics were biased by focusing only on Venice’s sinking land. But there is much more to the skeptics’ arguments. Furthermore, Dr. Shepherd failed to provide any support for his climate change claims. But that is to be expected, as the evidence provides very little support for Venice’s mayor or Shepherd.


If climate change had really played a significant role, we would expect the flooding to be worse in 2019 than during the more “natural” flooding of 1966. But a comparison of floodwaters inundating Doge’s Palace (see above) suggests the flooding was slightly worse in 1966. Official measurements likewise determined flood levels in the Venice Lagoon peaked at 74 inches, shy of the 1966 record of 76 inches. The climate change argument is weakened further when it is understood that the 1966 flood happened during a low tide, in contrast to the 2019 flood, which happened during an extreme high tide. Furthermore, there is no correlation with global warming, as the November 1966 flood happened when Venice experienced its coldest temperatures since 1924. Recent Venice temperatures are slightly lower than in the 1950s (Figure 2).


Figure 2. Venice Temperature Trend

The Venice Lagoon is situated at the northernmost end of the Adriatic Sea. The Adriatic Sea is bordered by mountains on both its eastern and western boundaries. That geography creates a funnel effect. Each autumn the Sirocco Winds begin to intensify. These winds drive warm air from Africa northward, which in turn pushes Adriatic Sea water northward up the “funnel”. The end result is that seawater piles up in front of the 3 inlets and begins flooding the shallow Venice Lagoon. Stronger winds drive greater flooding. And if the winds are strong enough, they temporarily prevent seawater from exiting the lagoon, causing sea level to rise even higher.

It is natural to ask whether climate change has caused an increasing trend in the Sirocco Winds. But there has been no trend.





We should also ask how much sea level rise has affected Venice. It could certainly be argued that rising sea levels since 1900 contributed about 100 millimeters (4 inches) to the 1966 Great Flood, as there was a steady rise in sea level between 1900 and 1970. But between 1970 and 2000 the Venezia (Venice) tide gauge shows sea level peaked around 7150 millimeters and then plateaued (Figure 3). Unfortunately, that tide gauge was then moved to a new location, designated Venezia II. There sea level began at a lower elevation and rose from 2001 to 2010, again plateauing just under 7150 millimeters (Figure 4).


Figure 3. Venezia (Venice) Sea Level 1910-2000


Figure 4. Venezia II (Venice) Sea Level 2001-2016


Because various parts of Venice are sinking at different rates, it is difficult to know how much the new tide gauge location affected new estimates of sea level change. However, due to the uncertainty caused by Venice’s sinking land, researchers typically compare Venice sea level trends to those at neighboring Trieste, in the far northeast corner of the Adriatic Sea, where the land appears to be more stable. Surprisingly, the Trieste sea level trend has been declining since 2000 (Figure 5). So, it would appear impossible to attribute the 2019 Venice flood to rising Adriatic sea levels or to climate change.

Figure 5. Trieste Sea Level 2001-2016


However, there is another factor to consider. The winds in the northern Adriatic Sea cause sea levels to oscillate from east to west across the Adriatic’s northern basin. When sea levels fall around Trieste, they often surge around the Venice Lagoon. Thus, at least in part, higher sea levels in the Venice Lagoon are driven by an ocean oscillation, and when that oscillation coincides with strong Sirocco Winds, a sinking Venice should expect more flooding.

In contrast, it’s unclear what effect global warming has had. Perhaps it’s negligible. Unfortunately for the public, that doesn’t stop media outlets from hijacking the hardships in Venice to push a climate crisis narrative. Alarmists continue to falsely suggest that every catastrophe has been partly driven by CO2 global warming. Sadly, as savvy propagandists know, if you tell a big enough lie often enough, people will start believing the lie.



Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism


How Bad Science & Horrific Journalism Misrepresent Wildfires and Climate








As one wildfire expert wrote, “Predicting future fire regimes is not rocket science; it is far more complicated than that.” But regardless of accuracy, most people are attracted to very simple narratives such as: more CO2 causes global warming, which causes more fires. Accordingly, in the summer of 2019, CNN trumpeted the headline California wildfires burn 500% more land because of climate change. They claimed, “the cause of the increase is simple. Hotter temperatures cause drier land, which causes a parched atmosphere.” CNN based their claims on a scientific paper by lead authors Park Williams and John Abatzoglou titled Observed Impacts of Anthropogenic Climate Change on Wildfire in California. The authors are very knowledgeable but appear to have hitched their fame and fortune to pushing a very simple claim: that climate change drives bigger wildfires. As will be seen, their advocacy appears to have caused them to stray from objective scientific analyses.

If Williams and Abatzoglou were not so focused on forcing a global warming connection, they would have at least raised the question: why did much bigger fires happen during cooler decades? The 1825 New Brunswick fire burned 3,000,000 acres. In Idaho and Montana the Great Fire of 1910 burned another 3,000,000 acres. In 1871, the Great Michigan Fire burned 2,500,000 acres. Those fires were not only 6 times larger than California’s biggest fire, they also occurred in moister regions, regions that don’t experience California’s Mediterranean climate with its guaranteed months of drought each and every summer. If those huge devastating fires occurred in much cooler times, what are the other driving factors of big wildfires?

Bad analyses cause bad remedies, and here is why Williams and Abatzoglou’s latest paper exemplifies a bad scientific analysis. Analyzing changes in California’s burned areas from 1972 to 2018, they claimed, “The clearest link between California wildfires and anthropogenic climate change thus far, has been via warming-driven increases in atmospheric aridity, which works to dry fuels and promote summer forest fire.” But natural cycles of low rainfall due to La Niñas also cause dry fuels. The increase in burned area is also attributed to increases in human ignitions such as faulty electrical grids, to increased surface fuels from years of fire suppression, and to changes in vegetation that increased the abundance of easily ignited fine fuels like annual grasses. Furthermore, temperatures in some local regions experiencing the biggest fires have not warmed over the past 50 years (see the temperature graphs in this essay’s last segment; data from the Western Regional Climate Center). All those factors promote rapid wildfire spread and greater burned areas. Although good science demands separating those contributing factors before analyzing a possible correlation between temperature and area burned, Williams and Abatzoglou oddly did not do so. That’s bad science.

Although Williams and Abatzoglou did acknowledge that other factors modulate the effects of warming on burned areas, they admitted their statistical correlations did not “control” for those effects. To “control” for all those contributing factors, they could easily have subtracted estimates of the burned areas associated with those factors. For example, a 2018 research paper estimates, “Since the year 2000 there’ve been a half-million acres burned due to powerline-ignited fires, which is five times more than we saw in the previous 20 years.” Did Williams and Abatzoglou skip the needed subtractions of other well-established factors because doing so would weaken their global warming correlation?

Similarly, CNN journalists were content to simply blame climate change. However, in light of the increasing devastation caused by powerline-ignited fires, good investigative journalists should have asked former California Governor Jerry Brown if he now regrets having vetoed the bipartisan bill crafted to secure the power grid, an action that could have saved so many lives and so much property. Instead, CNN simply promoted Brown’s persistent climate fearmongering, quoting, “This is only a taste of the horror and terror that will occur in decades.”

Ignoring the complex effects of human ignitions, CNN also parroted claims that global warming is causing fire season to last all year. But as seen in the graph below from a 2017 wildfire study, the United States’ natural fire season is due to lightning and only dominates during the months of July and August, when California’s high-wind events are rare. In contrast, it is human ignitions that extend the fire season, dramatically increasing ignitions throughout the winter months when fuel moisture is higher, and into seasons when cooling desert air generates strong episodes of Santa Ana and Diablo winds. Those high winds cause fires to spread rapidly, burning 2-3 times more area than fires ignited during low winds, and California’s most destructive fires recently occurred during those high-wind events. However, like other researchers, Williams and Abatzoglou reported no trend in those destructive California winds. Furthermore, climate models suggest a warming climate should cause weaker winds. So, without a change in California’s windy conditions, high winds can’t be blamed, directly, for the increased burned areas. However, because more human-caused ignitions occur during the winter, the probability increases that more fires will be amplified by those strong winter winds. As a US Geological Survey wildfire expert states, “Some will argue that it’s climate change but there is no evidence that it is. It’s the fact that somebody ignites a fire during an extreme [wind] event.”


Human Ignitions Extend Wildfire Season (from Balch 2017)


The timing of human ignitions is but one driver of more and bigger fires. Increased surface fuels are another huge factor. It is well known that past fire suppression has allowed surface fuels to accumulate in forests, leading to bigger and more devastating fires. But the changes in surface fuels are more complex. Some scientists point out that certain logging practices spread “invasive grasses called cheat grass, for example, and other ones that form this really thick mat across the area after logging and that grass just spreads flames very rapidly and fires burn very intensely through that.” California’s Democratic congressman Ro Khanna has been arguing that the U.S. Forest Service policy to clear-cut after a wildfire is making California’s forest fires spread faster and burn hotter by increasing the forest floor’s flammable debris. Khanna says, “Because we don’t have the right science, it is costing us lives, and that is the urgency of getting this right.”

Controlling the spread of cheatgrass is urgently needed. Grasses are “fine fuels” that ignite most easily. The 2018 Carr Fire was California’s 7th largest fire and threatened the town of Redding, California. It started when a towed trailer blew a tire, causing its wheel rim to scrape the asphalt and throw sparks that ignited roadside grasses. Those grasses carried the fire into the shrublands and forests. Grasses are classified as 1-hour fine fuels, meaning they become highly flammable in just one hour of warm, dry conditions. Climate change is totally irrelevant. It does not matter if it was wet and cool, or hot and dry, during previous days, weeks or years. Just one hour of warm, dry fire weather sets the stage for an explosive grass fire that then gets carried into the forests. Fire weather happens every year, and that partially explains why fires could burn 3,000,000 acres in the cool 1800s.

It was not human ignition but lightning that caused the 2012 Rush Fire. It was California’s 4th largest fire, burning 272,000 acres of sagebrush habitat before continuing to burn additional area in Nevada. Historically, because surface fuels are scarce, hot, dry sagebrush habitat rarely burned (once every 60-100 years). But invasions of non-native cheatgrass have now provided ample fuel to turn small lightning fires into huge conflagrations. Eleven of the USA’s 50 biggest fires in the last 20 years are in the Great Basin, where invasive cheatgrass is spreading. Nevada’s largest fire was the 2018 Martin Fire. Rapidly spreading through the cheatgrass, it burned 439,000 acres. Cheatgrass fires are a great concern for biologists trying to protect the threatened Sage Grouse, as cheatgrass-dominated sagebrush habitat now burns every 3-5 years. Habitats with high cheatgrass abundance are “twice as likely to burn as those with low abundance, and four times more likely to burn multiple times between 2000-2015.”

When experts estimate impending fire danger, they determine how fast a fire will spread. The Spread Component considers the effects of wind and slope and daily changes in the moisture content of the surface fuels. Large dead trees may become flammable only after 1000 hours of warm, dry conditions, and such thick fuels ignite only if fast-burning surface fuels supply enough heat. Thus, the Spread Component only considers smaller-diameter fuels like grasses that can dry out in an hour, as well as twigs and small branches that dry out within 10 to 100 hours. Central and Southern California are dominated by shrubby habitat with small-diameter fuels that allow fire to spread rapidly. The December 2017 Thomas Fire was California’s 2nd largest fire. Its human ignition coincided with a Santa Ana wind event, resulting in the burning of 282,000 acres in southern California.
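
As a rough illustration of how such an index combines its inputs, here is a toy spread score. It is not the actual NFDRS Spread Component formula; the moisture threshold and scaling factors are assumptions chosen only to show the interaction the paragraph describes: fine-fuel dryness carries the fire, while wind and slope multiply its spread.

```python
def toy_spread_score(fine_fuel_moisture_pct: float,
                     wind_mph: float,
                     slope_pct: float) -> float:
    """Toy fire-spread score -- NOT the NFDRS Spread Component, just an
    illustration of how dryness, wind, and slope interact.
    Returns 0 when fine fuels are too moist to carry fire."""
    moisture_of_extinction = 30.0   # % above which fine fuels won't carry fire (assumed)
    dryness = max(0.0, 1.0 - fine_fuel_moisture_pct / moisture_of_extinction)
    wind_factor = 1.0 + wind_mph / 10.0     # wind multiplies spread (assumed scaling)
    slope_factor = 1.0 + slope_pct / 100.0  # upslope runs spread faster (assumed scaling)
    return dryness * wind_factor * slope_factor

# Dry grass in a Santa Ana-style wind spreads far faster than damp grass in calm air.
windy_dry = toy_spread_score(5.0, 40.0, 30.0)
calm_damp = toy_spread_score(25.0, 5.0, 0.0)
```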

Counter-intuitively, Williams and Abatzoglou found the correlation between climate and burned area in the hotter, drier climate of California’s Central and South Coast to be “relatively weak”. Accordingly, they reported, “Annual burned area did not change significantly in Central and South Coast.” That insignificant climate effect over half of California escaped the notice of journalists, who cherry-picked only the researchers’ more alarming climate narratives. Most interesting, Williams and Abatzoglou suggested the lack of a climate-change correlation with the Central and South Coast burned areas was because fires there were “strongly manipulated by humans via ignitions, suppression, and land cover change.”

Lightning is rare along California’s Central and South Coast, so nearly 100% of those fires are ignited by humans. As California’s population has doubled since the 1970s, adding 20 million people, the probability of more human-started fires has increased. Unlike forested areas, where fire suppression builds up deadly surface fuels, California’s Central and South Coast needs fire suppression. Due to more frequent human-caused fires, shrublands are converting to grasslands. The increased fine fuels of the grasslands more readily ignite and spread fire. Furthermore, California’s natural climate undergoes wet El Niño years followed by dry La Niña years. Wet years make fine fuels more abundant. Thus fire suppression is needed to prevent the more frequent fires caused by the conversion of shrublands to grasslands.

In contrast to the insignificant changes in burned areas in California’s southern half, Williams and Abatzoglou reported burned areas in the Sierra Nevada and the North Coast increased by more than 600%, which they attributed to human-caused climate change. They reported, “During 1896–2018, March–October Tmax [maximum temperature] averaged across the four California study regions increased by 1.81 °C, with a corresponding increase in VPD [Vapor Pressure Deficit, a measure of atmospheric dryness] of 1.59 hPa (+13%)…The observed trends in Tmax and VPD are consistent with trends simulated by climate models as part of the CMIP5 experiments, supporting the interpretation that observed increases in California warm-season temperature and VPD have been largely or entirely driven by anthropogenic forcing.”
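
Vapor Pressure Deficit is simply the gap between how much moisture the air could hold and how much it does hold; a minimal sketch using the standard Tetens approximation for saturation vapor pressure (a textbook formula, assumed adequate for illustration):

```python
import math

def vpd_hpa(temp_c: float, rh_pct: float) -> float:
    """Vapor Pressure Deficit in hPa, using the Tetens approximation
    for saturation vapor pressure."""
    es = 6.108 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # saturation vp, hPa
    ea = es * rh_pct / 100.0                                   # actual vp, hPa
    return es - ea

# Warmer air at the same relative humidity has a larger deficit,
# which is why warming "works to dry fuels":
vpd_warm = vpd_hpa(25.0, 40.0)    # roughly 19 hPa
vpd_warmer = vpd_hpa(30.0, 40.0)  # roughly 25 hPa
```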

But how can only half of California’s fires be due to global warming and the other half not? After all, all of California is “strongly manipulated by humans via ignitions, suppression, and land cover change.” Were Williams and Abatzoglou straying from objective science?

Part of the problem is their ill-advised use of a maximum temperature averaged for all of California. Several studies have reported that maximum temperatures in the northern half of California have not exceeded the high temperatures of the 1930s. Because early 20th-century temperatures were deemed natural, unless recent temperatures exceed those of the natural 1930s, human-caused warming is unlikely. Curiouser and curiouser, southern California has experienced temperatures that exceeded the 1930s. Yet there Williams and Abatzoglou did not find a significant effect from climate change.

Regardless, Williams and Abatzoglou claimed, “The clearest link between California wildfire and anthropogenic climate change thus far, has been via warming-driven increases in atmospheric aridity, which works to dry fuels and promote summer forest fire.” Yet maximum temperatures, averaged from March through October, in the vicinity of California’s big fires do not indicate global warming. For example, the August 2013 Rim Fire, centered around Yosemite National Park, was California’s 5th largest fire and 2nd largest in northern California, burning 257,000 acres. It was started by a hunter’s illegal campfire that he let get away. Unfortunately, there is no cure for stupid. Nonetheless, Yosemite’s maximum temperatures were warmer in the early 1900s. However, an in-depth study of the Rim Fire found a strong correlation with the amount of shrubland interspersed with the trees.

Trend in Yosemite NP maximum summer temperatures


The November 2018 Camp Fire was California’s deadliest fire, destroying the town of Paradise. It was also the state’s 16th largest fire, burning 153,000 acres. It was ignited by a faulty power grid during a strong Diablo wind event. Similarly, based on weather data from nearby Chico, CA, maximum temperatures were higher in the 1930s.

Trend in maximum summer temperature for California's deadliest Paradise/Camp Fire


The Mendocino Complex Fire was California’s largest fire (since 1932). In July of 2018 it burned 459,000 acres. The source of its human ignition is still under investigation. Still, those fires were centered around the town of Ukiah, which also reveals a cooling trend since 1930.


Trend in maximum summer temperatures for California's largest fire




In October 2017, the wine country’s Tubbs Fire was the 4th deadliest. It only burned 37,000 acres but high winds drove embers into the dwellings of the heavily populated outskirts of Santa Rosa. Again, global warming was irrelevant as Santa Rosa has experienced a cooling trend since the 1930s.



 
Trend in maximum summer temperature for deadly Santa Rosa/Tubbs fire


Still, some people are determined to link catastrophic fires with climate change. So, they will suggest delayed autumn rains allow more late-season ignitions or allow fall fires to burn longer. In Williams and Abatzoglou’s abstract they claim, “In fall, wind events and delayed onset of winter precipitation are the dominant promoters of wildfire.” But their results found “no all-region trend in onset of winter precipitation or October–November wet-day frequency during 1915–2018.” As illustrated below by the October precipitation data for Santa Rosa, since 1900 there has been about a 10% chance that no rain falls in October. Furthermore, October experienced more zero-rainfall months in the early 1900s. A global-warming-caused delay in autumn rains has not yet been detected.

So, doing my best Greta Thunberg imitation, I say to climate alarmists, “How dare you misrepresent the causes of wildfires. How dare you imply less CO2 will reduce human ignitions, reduce surface fuels and curb the spread of invasive grasses. Bad analyses lead to bad remedies! Your bad science is stealing Californians’ dreams and your false remedies distract us from the real solutions. Young people and old alike must demand better science and better journalism!”

October precipitation for Santa Rosa, CA



Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism





Friday, November 8, 2019

Protecting California’s Coast & Improving Our Environment

from What's Natural?  column

published in Pacifica Tribune November 6, 2019


Harbor built with Roman Cement



Imagine a seawall surviving 2000 years! Most modern seawalls have been built using Portland cement reinforced with metal rods. Counter-intuitively, the metal reacts with seawater, causing the concrete to corrode and crack within decades. But the Romans developed a different kind of cement. Roman cement needed no reinforcement and has lasted 2000 years. Where no natural harbors existed, Romans created breakwaters and protective artificial harbors. By analyzing the composition of Roman cement, materials scientists believe we can build similar long-lasting seawalls and breakwaters, and thus prevent coastal erosion that threatens human communities. Furthermore, scientists who worry about carbon dioxide emissions likewise promote Roman cement because its manufacture dramatically reduces emissions compared to that of Portland cement.

Currently, California’s Coastal Commission resists constructing seawalls. It argues that seawalls prevent local erosion that is essential to providing the sediments needed to maintain beaches. However, locally eroded sediments are often quickly removed to the deep ocean. It is the larger supply of sediments entering the ocean via rivers and streams that primarily determines the fate of a beach. The mining of San Francisco Bay sediments reduces that nourishing sediment supply. Furthermore, most California rivers and streams are dammed. That also reduces our sediment supply and blocks migrating salmon. Because California has historically suffered from natural 100-year droughts, some argue we need more dams to store water. But if there is no rain, there will be no water to store. There is, however, a promising solution: desalinization.

Developing technology has greatly reduced the cost of converting ocean water into fresh water. If desalinization supplies enough freshwater, there would be no need for more dams, and our communities would be resilient against inevitable natural droughts. With an abundant supply of reliable freshwater, we would not need to divert water from critical ecosystems. And possibly, we could remove a few dams that now block sediments and fish.

One criticism of desalinization has been that the process releases a plume of extremely salty water back into the ocean, which can have detrimental effects on local marine organisms. But that salty discharge can be diffused and diluted. More importantly, concentrated sea salt can be harvested and sold at competitive prices. That not only reduces the salty outflow but also reduces the cost of desalinization. Harvested salt could also replace supplies from salt ponds, hastening restoration of San Francisco Bay’s tidal marshes.

Of course, desalinization plants must be constructed on the coast. To protect their buildings, long-lasting seawalls are required. Currently, property owners located on fragile bluffs are dropping rip-rap boulders at the foot of eroding cliffs. They armor at their own expense, but not all owners can afford to do so. With a helter-skelter of armoring, waves get redirected and focused onto unprotected sections, which accelerates local erosion. But coastal property owners are the first line of defense against further erosion that could eventually undermine an entire community’s roads and infrastructure. The only reasonable solution is to build seawalls that protect coastal communities. Of course, property owners who enjoy their private coastal views should pay a greater share of that construction, but the community must also contribute its share to protect threatened community infrastructure.

Unfortunately, some people are urging coastal residents to “retreat” and “abandon the coast”. They’d rather spend billions moving homes and infrastructure inland. They’re petrified by highly unlikely speculation that suggests a 10-foot sea level rise if Antarctica slides into the ocean. But research reveals Antarctica’s glaciers had retreated far more between 4000 and 6000 years ago. Temperatures were warmer then, and still Antarctica remained stable. More importantly, during those warmer times the estimated rise in sea level was only 0.02 inches a year. More recently, research finds increasing snowfall is offsetting the ice lost by Antarctica’s glaciers that are vulnerable to underwater melting.

Armoring fragile coastal bluffs protects human habitat without harming local ecosystems. Beaches will survive if we maintain the natural sediment supply from local streams and rivers. The O’Shaughnessy seawall protecting San Francisco’s Golden Gate Park is a 90-year testament that beaches can still grow in front of seawalls. Creating breakwaters, as the Romans did, can further protect beaches and prevent currents from removing beach sand. Secured ocean coastlines can support construction of desalinization plants that secure our freshwater requirements, reduce the need for dams, and reduce water diversions, benefiting wetland ecosystems. Desalinization can also reduce the groundwater extraction that is causing coastal towns to subside below sea level. The sky is not falling. The sky is the limit for possible win-win solutions for both people and the environment.


O’Shaughnessy seawall on San Francisco's Ocean Beach, built in 1929



Jim Steele is director emeritus of the Sierra Nevada Field Campus, SFSU and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism