
Saturday, January 4, 2020

Hijacking Australian 2019 Bushfire Tragedies to Fearmonger Climate Change







As is customary now, whenever tragedy strikes, the internet buzzes with articles blaming climate change. Hijacking Australia’s tragic bushfires was to be expected. For instance, Microsoft’s MSN website just published “Climate deniers are cooking themselves — and everyone else”. They wrote, “Fires get worse when things are hot, dry, and windy, and climate change has provided all of those conditions in abundance. The continent has warmed by about 2 degrees Fahrenheit (a bit over 1 degree Celsius) since the 1970s, and in keeping with the predictions of climate models, Australia has experienced steadily worse droughts and heat waves over the last 30 years. The current drought may end up being the worst in history — this spring was the driest ever recorded on the continent, and back on December 18 it set a new record for the hottest day ever measured with an average temperature across the entire country of 105.6 degrees.”

How truthful is MSN? Indeed, Australia is experiencing hot, dry summer weather. The map below (Figure 1) shows that most of Australia experienced temperatures far above average for December 18, 2019. But curiously, the east and west coasts, as well as northern Australia, were experiencing temperatures several degrees below normal. If global warming were driving the extreme wildfire season, we would expect the worst fires to be located where temperatures were warmest. But as the map of wildfires reveals (Figure 2), the warmest regions had the fewest wildfires, while most fires were happening in the cooler regions. Averaging Australia’s temperatures to deceptively blame global warming for the wildfires only obscures these regional temperature effects.




Figure 1 Australia December 18, 2019 temperature anomalies.  




Figure 2 Locations of Australia's 2019/2020 bushfires. https://www.newsweek.com/australia-wildfire-map-update-bushfires-sydney-new-south-wales-1480207
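To see how a single national figure can hide that regional pattern, here is a toy calculation in Python. The anomaly values and area weights are invented for illustration only; they are not Bureau of Meteorology data.

# Toy example: how one national average can hide cooler regions.
# Anomalies (degrees C) and area weights are invented, not BoM data.
regions = {
    "hot interior": {"anomaly": +6.0, "weight": 0.55},
    "east coast":   {"anomaly": -2.0, "weight": 0.15},
    "west coast":   {"anomaly": -1.5, "weight": 0.10},
    "north":        {"anomaly": -1.0, "weight": 0.20},
}

national = sum(r["anomaly"] * r["weight"] for r in regions.values())
print(f"Area-weighted national anomaly: {national:+.1f} C")
# Prints a strongly positive national anomaly even though
# three of the four regions sit below normal.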

MSN’s climate fearmongers dishonestly claim “Australia has experienced steadily worse droughts.” Climate fearmongers argue warmer temperatures will evaporate surface moisture more quickly and exacerbate droughts. But they have the tail wagging the dog. The Australian Bureau of Meteorology’s rainfall record (Figure 3) shows the 1920s and 1930s experienced much worse droughts than recent decades. Furthermore, during periods of low precipitation, drought conditions CAUSE higher temperatures. Without normal soil moisture to evaporate, solar radiation is no longer consumed as latent heat of evaporation; instead, it rapidly raises land temperatures.



Figure 3 Australia average annual precipitation from 1900-2018. http://www.bom.gov.au/climate/history/rainfall/?fbclid=IwAR2fUMmwkIr9NvJaxaNWpB1h8vaP8aNP9Aim27yGJ6r8xxcHc-lmuxdIFJg
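The latent-heat argument above can be illustrated with a few lines of Python. This is only a back-of-the-envelope sketch with assumed round numbers (a 500 W/m2 midday radiation load and guessed evaporative fractions), not a land-surface model.

# Minimal sketch of the surface energy partition described above.
# All numbers are illustrative round values, not site measurements.

def partition_energy(net_radiation_w_m2, evaporative_fraction):
    """Split absorbed radiation into latent heat (evaporation) and sensible heat.
    evaporative_fraction is the assumed share of net radiation consumed by evaporation."""
    latent = net_radiation_w_m2 * evaporative_fraction
    sensible = net_radiation_w_m2 - latent
    return latent, sensible

net_radiation = 500.0  # W/m2, a typical clear midday value (assumed)

for label, ef in [("moist soil", 0.7), ("drought-dry soil", 0.1)]:
    latent, sensible = partition_energy(net_radiation, ef)
    print(f"{label}: latent {latent:.0f} W/m2, sensible {sensible:.0f} W/m2")
# With dry soil, far more energy goes into heating the surface and near-surface
# air, which is the "drought causes higher temperatures" mechanism in the text.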




 

The greatest 2019/2020 burned area is concentrated along the eastern coast in the states of New South Wales and Victoria. Both areas are known for habitat that is highly susceptible to extreme fire danger. But are the recent fires worse than ever? History says NO! In February 1851, the Black Thursday bushfires incinerated about five million hectares (roughly 19,000 square miles). Around 12 lives, one million sheep and thousands of cattle were lost. Temperatures reached record extremes of about 47°C (117°F) in the shade. In contrast, MSN attributes the 2019 December fires to a misleading average temperature across the whole country of 40.9°C (105.6°F).
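For readers who want to check the arithmetic, here is a quick conversion sketch in Python. The conversion constants are standard; the quoted figures come from the paragraph above.

# Quick unit checks for the figures quoted above.
HECTARES_PER_SQ_MILE = 258.999  # 1 square mile = 640 acres = ~259 hectares

def hectares_to_sq_miles(ha):
    return ha / HECTARES_PER_SQ_MILE

def c_to_f(celsius):
    return celsius * 9.0 / 5.0 + 32.0

print(f"5,000,000 ha = {hectares_to_sq_miles(5_000_000):,.0f} square miles")  # ~19,300
print(f"47 C = {c_to_f(47):.0f} F")      # ~117 F
print(f"40.9 C = {c_to_f(40.9):.1f} F")  # ~105.6 F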


If the increasing trend in wildfires cannot be attributed to temperature and precipitation, what other factors should be considered? As in California, Australia has experienced a tremendous increase in human ignitions. Arson is a huge problem. As government investigations reveal (Figure 4), deliberately set fires account for 64% of all ignitions, while only 11% of all wildfires are due to natural lightning ignitions.



Figure 4 Cause of wildfire ignitions. https://aic.gov.au/publications/bfab/bfab021?fbclid=IwAR0QhXu6Wpu_z4d1VPDTxkJoANeck7oZCqheKbf2z8Z7T2XIC646xHbDTbY


Furthermore, to the north, tropical and subtropical regions are being invaded by foreign grasses that are easily ignited and provide greater surface fuel continuity, allowing fires to spread over greater areas. Likewise, humans must manage forest-floor fuel loads. The easiest solution is prescribed burns. However, that solution is often resisted because people do not want to endure the accompanying smoke. But until prescribed burns can be judiciously applied, the public becomes increasingly vulnerable to the larger, more severe wildfires endured in 2019.

Bad analyses always promote bad remedies! Blaming rising CO2 concentrations and global warming only misdirects real efforts to minimize wildfire destruction. What Australia and the world need to address are 1) human ignitions, 2) invasive grasses and 3) fire suppression that allows surface fuels to accumulate and enables large, intense and destructive fires to wreak havoc like never before!


 

Jim Steele is director emeritus of the Sierra Nevada Field Campus, SFSU and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism.


Thursday, January 2, 2020

When Warming is Cooling!

published in the Pacifica Tribune, December 30, 2019


What’s Natural?

When Warming is Cooling!






The media has been awash with doomsday headlines, trumpeting a human-caused “climate crisis” simply because the estimated average global temperature has risen about 1.7°F since the end of the cold Little Ice Age 150 years ago. However, the Arctic has been the greatest climate outlier, and its much warmer temperatures, averaging 3.6°F higher than the 1951–1990 average, have shifted the global warming average upwards.

Furthermore, increased ventilation of stored ocean heat has contributed to much of the Arctic’s recent high temperatures. Ventilating heat warms the air but cools the ocean, and climate history reveals ventilating heat has caused several bouts of extreme warming. Over the past 100 thousand years, Greenland’s ice cores recorded 24 extreme warming episodes when air temperatures suddenly rose 14°F to 28°F in a few decades or less. More recently, over the past 4,000 years in the Canadian Arctic, decades of rapid ice loss accompanied air temperatures 3°F to 10°F warmer than today, quickly followed by centuries of more sea ice and colder temperatures. Although the physics of those dramatic warming events are still in play today, the good news is that warming climates minimize such dramatic warm events.

The most extreme Arctic warming episodes were named “Dansgaard-Oeschger events” in honor of the two scientists who discovered them. Although Greenland experienced the greatest warming, Dansgaard-Oeschger events affected climates globally. Ventilating warmth changed global atmospheric circulation, shifting European forests and altering California’s ocean currents. Counter-intuitively, each Dansgaard-Oeschger extreme warming event happened during the last Ice Age, when the northern hemisphere was covered with great ice sheets and global temperatures were 5°F to 14°F colder than pre-industrial times.

The climate dynamics responsible for these dramatic warm events are easily demonstrated, and a simple experiment might ease the troubled minds of students plagued with “climate crisis” anxiety. Heat a large covered pot of water. Measure the air temperature above the pot’s lid. If you don’t have a thermometer, simply hold your hand above the lid. Then remove the lid and feel the escaping heat. In a similar fashion, Arctic sea ice (and a surface layer of fresher water) acts like the pot’s lid. Remove the ice and the air dramatically warms. Conversely, extensive ice cover will cool the air but warm the underlying ocean.
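A rough calculation captures the “pot lid” analogy. The Python sketch below compares conductive heat loss through an ice slab with a textbook bulk formula for sensible heat loss from open water; all input values are assumed round numbers, not measurements.

# Rough "pot lid" comparison: heat escaping through sea ice versus open water.
# Textbook-style approximations with assumed round values; not a climate model.

def flux_through_ice(t_water, t_air, ice_thickness_m, k_ice=2.2):
    """Conductive flux (W/m2) through an ice slab; k_ice ~2.2 W/(m K)."""
    return k_ice * (t_water - t_air) / ice_thickness_m

def flux_open_water(t_water, t_air, wind_m_s, c_h=1.5e-3, rho_air=1.3, cp_air=1005.0):
    """Bulk sensible heat flux (W/m2) from open water to cold air."""
    return rho_air * cp_air * c_h * wind_m_s * (t_water - t_air)

t_water, t_air, wind = -1.8, -30.0, 8.0  # winter Arctic-like values (assumed)

print(f"Through 2 m of ice : {flux_through_ice(t_water, t_air, 2.0):.0f} W/m2")
print(f"Open water         : {flux_open_water(t_water, t_air, wind):.0f} W/m2")
# Removing the ice "lid" lets roughly an order of magnitude more ocean heat
# ventilate to the atmosphere (open water also loses heat by evaporation and
# longwave radiation), warming the air while cooling the ocean.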

Ocean currents constantly transport tropical heat towards the poles, making Arctic climates far warmer than possible if only the sun and greenhouse gases controlled the Arctic’s climate. Warm dense “Atlantic water” constantly enters the Arctic Ocean. Some heat ventilates but some is stored at intermediate depths. Currently there is enough stored heat to melt sea ice many times over. The extensive sea ice of the last Ice Age insulated the Arctic Ocean causing heat to increasingly accumulate. Eventually that great store of heat melted sea ice from below, allowing heat to vent and greatly warm the atmosphere. 

The same dynamics triggering past Dansgaard-Oeschger events provide insight into today’s Arctic warming. During the 1970s and 80s, measurements over an ice-covered Arctic recorded cooling air temperatures, prompting researchers to publish “Absence of Evidence for Greenhouse Warming over the Arctic Ocean in the Past 40 Years.” Those cooling air temperatures did not reduce modern Arctic sea ice. But other factors did.

Arctic winds can trap ice in the Arctic, causing ice to thicken. However, when the winds shift, thick sea ice is blown out and melts in warmer Atlantic waters. The resulting open water and thinner ice allow more Arctic subsurface heat to ventilate. In addition to wind-driven ice loss, satellites reveal the greatest area of open Arctic waters corresponds to the pathway where warm, dense tropical waters, via a branch of the Gulf Stream, enter the Arctic and melt sea ice from below. It is now widely believed that high inflows of warm Atlantic water helped melt sea ice and trigger the Dansgaard-Oeschger warm events, just as oscillating warm inflows melt Arctic sea ice today.

A new monitoring system, established off the coast of southern Florida in 2002, reports that warm water increasingly flowed towards the Arctic until 2008, but since then warm inflows have been decreasing. So, we will now witness a natural experiment that tests competing climate hypotheses. If the increasing inflows of warm water caused most of the Arctic’s recent sea ice loss, then reduced inflows should allow sea ice to recover within another decade. In contrast, the competing hypothesis is that warming from increasing CO2 concentrations will cause an ice-free Arctic Ocean. Early predictions of an ice-free Arctic have already failed, but many still predict an ice-free Arctic by 2030 or 2050. Of greater interest, if sea ice does recover, will the reduction of Arctic heat ventilating to the atmosphere also reduce global temperatures and avert the ballyhooed “climate crisis”?


Jim Steele is director emeritus of the Sierra Nevada Field Campus, SFSU and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism





Wednesday, December 18, 2019

Why Modern Famine Predictions Failed


 What's Natural column

published in the Pacifica Tribune Wednesday December 18, 2019

Victims of the 1876 Famine in India


When I graduated from high school in 1968, predictions of environmental collapse were rampant and eco-catastrophe theories flourished. The highly influential Stanford scientist Dr. Paul Ehrlich dominated the doomsday media, stating, “Most of the people who are going to die in the greatest cataclysm in the history of man have already been born.” Predicting global famine in 1970, this PhD wrote, “The death rate will increase until at least 100-200 million people per year will be starving to death during the next ten years.”

Why were Ehrlich’s apocalyptic predictions so wrong? Ehrlich believed the promise of the “green revolution” via high-yield crops and better cultivation practices would never offset the needs of a growing human population. Indeed, the early distribution of high-yield “miracle seeds” had failed to stave off famines during the cooler 1960s. But then the earth began to warm, CO2 fertilization took effect, and the growing season lengthened in concert with great leaps forward in genetics, biotechnology and agricultural innovation.

From NASA: CO2 fertilization - change in leaf area across the globe from 1982-2015. Credits: Boston University/R. Myneni


To produce high-yield crops, early researchers simply cross-bred compatible plants possessing desirable traits. Genetic manipulation via selective breeding had been practiced for thousands of years as humans created a wide variety of farm animals, or dog breeds like Chihuahuas and Great Danes from their ancestral wolf. Similarly, selective breeding transformed a scraggly grass into modern corn. Later, modern “mutation breeding” evolved in the 20th century as seeds were exposed to gamma rays in a search for “hopeful monsters.” The public fails to recognize that they likely consume many of the 3,000 crop varieties created via mutation breeding, such as the high-yield barley, oats and grains commonly used in making premium beers and whiskey. For chocolate lovers, mutation breeding created a cocoa tree resistant to a deadly fungus. At California’s UC Davis, scientists irradiated rice seeds to create a high-yield variety that reached supermarket shelves in 1976 as “Calrose” rice, which still dominates other varieties in many regions of the Pacific.

Next, the discovery of “restriction enzymes” in 1970 allowed scientists to engineer organisms by surgically removing useful genes from one species and placing them into another. Such genetically engineered plants now comprise most of today’s soybean, corn and cotton crops, as well as some varieties of potatoes, apples, papayas and sugar beets. Yields increased as engineered crops were better able to flourish under stressful conditions. Plants became more resistant to specific insect and fungal pests, producing greater yields without pesticides. Despite the tremendous benefits of genetically modified crops (GMOs), many people remained distrustful of possible detrimental health effects. More radical groups condemned GMOs simply because they generated profits for large businesses.


Golden Rice Adds Vitamin A


The strange battle between pro- and anti-GMO groups is best illustrated by the Golden Rice saga. Golden Rice was first engineered in the 1990s as a non-profit attempt to prevent blindness and premature deaths from vitamin A deficiency. That deficiency afflicted 250 million children, mostly in Asia, and killed more than 200,000 people a year. Because rice is those children’s primary food source, two German scientists, Ingo Potrykus and Peter Beyer, removed a gene from daffodils that produces beta-carotene and carefully inserted it into rice. Beta-carotene is the key building block for vitamin A and gives the rice its golden color. Later, a more efficient gene from corn was used. Potrykus and Beyer also insisted the technology to create Golden Rice be donated freely. So, the biotech company Syngenta waived its right to commercialize the product. The humanitarian benefits of free Golden Rice could not be clearer.

Nonetheless, opponents of GMOs, led primarily by Greenpeace, vilified Golden Rice. Greenpeace lobbied countries around the world to prevent legalization of Golden Rice by generating as much fearful speculation as possible about “imagined” health repercussions. Greenpeace was also fanatical about fighting biotechnology companies. It justified its propaganda campaign against Golden Rice by claiming, “Corporations are overhyping golden rice benefits to pave the way for global approval of other more profitable genetically engineered crops.”

However, as Golden Rice continued to be proven safe, a letter signed by more than 100 Nobel laureates accused Greenpeace of leading a “fact-challenged propaganda campaign against innovations in agricultural biotechnology.” They demanded Greenpeace end its campaign against GMOs. Patrick Moore, a co-founder of Greenpeace, had previously left the organization because its original good intentions were being subverted by extremists. Moore bemoaned, “They're linking Golden Rice with death, which scares parents into not wanting the technology developed.” Such false propaganda so infuriated Moore that he created an alternative movement, Allow Golden Rice Now. As the outcry of support grows, countries are increasingly moving towards legalizing Golden Rice.

Ehrlich’s doomsday predictions proved incomprehensibly wrong. That environmental groups like Greenpeace vehemently propagandized against the very technologies that prevented Ehrlich’s doomsday is even more unintelligible. But then again, there are many activists who see humanity only as earth’s scourge, best eliminated.



Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism


Wednesday, December 4, 2019

Why Worse Wildfires part 2

What’s Natural?

Why Worse Wildfires?  Part 2

Fighting the Ranch Fire


Why worse wildfires? The short answer is that more humans cause more wildfire ignitions in altered landscapes. Since 1970, California’s population has doubled, adding 20 million people. As more human habitat was developed, the increasingly disturbed landscape quickly became covered in easily ignitable invasive grasses (see Part 1). To protect human habitat, fires were suppressed and ground fuels increased. Development also expanded a vulnerable electric grid. Furthermore, more people increased the probability of careless fires and innocent accidents. And sadly, a larger population added more arsonists.

During a typically warm and dry July day, a rancher was innocently driving a stake into the ground to plug a wasp’s nest. Surrounded by dry grass, the hammer’s spark ignited a devastating inferno named the Ranch Fire. Despite sensationalists’ hype, global warming had not made the grass drier. Grass becomes highly combustible after just a few hours of dry weather. And like most of northern California, the region has seen no warming trend in maximum summertime temperatures. Based on Western Regional Climate Center data, maximum summer temperatures in the Mendocino area have cooled by 3°F since the 1930s. The rapidly spreading Ranch Fire soon merged with a different fire to form the Mendocino Complex Fire, California’s largest documented fire.

Similarly, a highway accident sparked roadside grasses that kindled northern California’s 7th largest fire, the Carr Fire.

Summertime cooling trend at Ukiah, Mendocino County, California


Careless fires cannot be considered accidents and offenders should be held accountable. A hunter’s illegal and improperly attended campfire caused the August 2013 Rim Fire, centered around Yosemite National Park. It was California’s 5th largest fire. 

Governments and utility companies should likewise be held accountable for carelessly maintaining our electric grids. An electric spark ignited California’s deadliest fire, the Camp Fire, which destroyed the town of Paradise and killed 85 people. As a 2018 research paper estimates, “Since the year 2000 there’ve been a half-million acres burned due to powerline-ignited fires, which is five times more than we saw in the previous 20 years.”

More disturbing is the number of fires started by arson. According to the U.S. Fire Administration, nationally, as in California, one in every five brush, grass, or forest fires since 2007 was intentionally set. Arsonists have recently been charged for some of California’s 2019 fires. Arson accounted for 55% of Kentucky’s fires and is the leading cause of Florida’s fires. Because arson is so difficult to prove, arson statistics are probably underestimated. So, experts in Australia combine arson and “suspicious” fires to argue that half of Australia’s fires were likely intentionally set. That means each year about 31,000 Australian bushfires are intentionally ignited. And as in the American West, Australia’s bushfires have been increasingly fueled by invasive grasses like buffel grass.

Wildfires caused by natural lightning ignitions peak during the summer months of July and August and become virtually non-existent in autumn and winter. In contrast, human ignitions have created year-long fire seasons. Counter-intuitively, California experiences its most dangerous fire weather during the cooler and wetter seasons. As seasonally cold air settles over the high mountain deserts in autumn and winter, episodes of high winds, known as the Santa Ana and Diablo winds, flow downslope. Sinking air warms about 5°F for every 1000-foot drop in elevation, so these downslope winds can raise lowland temperatures 25°F in just a few hours. That warming causes relative humidity to fall, so these winds rapidly suck moisture out of whatever vegetation they pass over. In combination with faster-spreading embers, fires burn 2 to 3 times more area during high wind events.
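The warming and drying described above follow directly from the dry adiabatic lapse rate (about 5.4°F per 1000 feet) and the temperature dependence of saturation vapor pressure. The short Python sketch below shows the magnitude involved; the starting temperature and humidity are assumed, and the small pressure correction during descent is ignored.

# Sketch of downslope (Santa Ana / Diablo) warming and drying.
# Dry adiabatic warming is ~9.8 C per km (~5.4 F per 1000 ft); a Tetens-style
# formula approximates saturation vapor pressure. Starting values are assumed.
import math

def saturation_vapor_pressure_hpa(t_c):
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def descend(t_start_c, rh_start, drop_ft):
    t_end = t_start_c + 9.8 * (drop_ft * 0.3048 / 1000.0)   # dry adiabatic warming
    vapor = rh_start / 100.0 * saturation_vapor_pressure_hpa(t_start_c)
    rh_end = 100.0 * vapor / saturation_vapor_pressure_hpa(t_end)  # same moisture, warmer air
    return t_end, rh_end

t_end, rh_end = descend(t_start_c=10.0, rh_start=50.0, drop_ft=5000)
print(f"After a 5000 ft descent: {t_end:.1f} C and {rh_end:.0f}% relative humidity")
# Roughly a 15 C (27 F) warm-up, with relative humidity falling from 50% to near 20%.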

Human vs Lightning Wildfire Ignitions from Balch 2017


Under natural conditions, seasonally extreme winds never coincided with the season of abundant lightning. But human ignitions have increased the probability of fires starting during strong cool-weather winds. California’s 2nd biggest fire, the Thomas Fire, was ignited in December by a downed power line during high winds. The third largest fire, the Cedar Fire, was ignited in October by a lost hunter who carelessly lit a signal fire. California’s deadliest fire, the Camp Fire, was ignited by a powerline and spread fiercely during a November high wind event.

Climate change does not ignite fires. Climate change does not affect how quickly dead grasses and bushes dry out. Climate change may affect the winds, but any warming, natural or human-caused, would reduce those extreme winds. Regarding California’s worst fires, a US Geological Survey wildfire expert states, “Some will argue that it’s climate change, but there is no evidence that it is. It’s the fact that somebody ignites a fire during an extreme [wind] event.”

Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism



Friday, November 22, 2019

Why Worse Wildfires? Part 1


What’s Natural?

published in the Pacifica Tribune, November 20, 2019

Why Worse Wildfires? Part 1

SAGE GROUSE LEKKING



There are several theories trying to explain the recent uptick in wildfires throughout the western USA. Some scientists blame increased human ignitions. Others suggest accumulating surface fuels due to a century of fire suppression. Others argue landscape changes and invasive grasses have amplified the amount of easily ignited vegetation, while still others blame climate change. What’s the Sage Grouse connection? Like human communities, the Sage Grouse’s habitat is being threatened by fast spreading wildfires, and that increase in bigger wildfires in sagebrush country is due to invading annual grasses, like cheatgrass.

Historically, hot dry sagebrush habitat rarely burned (just once every 60-100 years) because slow-growing, patchy sagebrush provides only scant surface fuels, incapable of supporting large and frequent fires. But the invasion of introduced annual grasses, like cheatgrass, has changed all that. As one wildlife researcher lamented, “The color of Nevada has changed from a sagebrush silver gray to a cheatgrass tawny brown since the 1990s.” Likewise, in the 1800s California’s hills were covered with perennial grasses that stayed green during the summer. Now California’s hills are golden brown as highly flammable annual grasses have taken over.



Cheatgrass-dominated sagebrush habitat now burns every 3-5 years, up to 20 times more frequently than under historic natural conditions. Extensive research on the effects of cheatgrass found habitats with high cheatgrass abundance are “twice as likely to burn as those with low abundance, and four times more likely to burn multiple times between 2000-2015.” What makes cheatgrass such a problem?

Invading annual grasses germinate earlier in the season and deprive the later-germinating native grasses of needed moisture. These foreign grasses die after setting seed, leaving highly flammable fuels that can burn much earlier in the year and thus extend the fire season. Eleven of the USA’s 50 biggest fires in the last 20 years have been in Great Basin sagebrush habitats, where invasive cheatgrass is spreading. Nevada’s largest fire was the 2018 Martin Fire. Rapidly spreading through cheatgrass, it burned 439,000 acres, a burned area rivaling California’s largest fires in recorded history.



Cheatgrass in Juniper Woodland




The 2012 Rush Fire was California’s 4th largest fire since 1932, burning 272,000 acres of sagebrush habitat in northeastern California. It then continued to spread, burning an additional 43,000 acres in Nevada. The 2018 Carr Fire was California’s 7th largest fire and threatened the town of Redding. It started when a towed trailer blew a tire, causing its wheel rim to scrape the asphalt. The resulting sparks were enough to ignite roadside grasses. Grass fires then carried the flames into the shrublands and forests, where burning grasses served as kindling to ignite less-flammable trees. Likewise, grasses were critical in spreading northern California’s biggest fires. In southern California, as humans ignite more and more fires, shrublands are being converted to more flammable grasslands.

Wildfire experts classify grasses as 1-hour fine fuels, meaning dead grass becomes highly flammable with just one hour of warm, dry conditions. When experts estimate impending fire danger, they calculate the extent of a region’s fine fuels to determine how fast a fire will spread. The amount of small-diameter fuels like grasses that can dry out in an hour, as well as twigs and small branches that dry out within 10 to 100 hours of dry weather, determines how fast the winds will spread a fire. It does not matter whether it was wet and cool, or hot and dry, during the previous weeks or years. Just one hour of warm, dry fire weather sets the stage for an explosive grass fire. Decades of climate change are totally irrelevant.
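Those fuel classes reflect a standard exponential “timelag” drying concept: fuel moisture relaxes toward the equilibrium set by the current weather, with a time constant of roughly 1, 10 or 100 hours depending on fuel diameter. A minimal Python sketch with assumed moisture values illustrates why grasses respond so fast.

# The timelag classes above reflect a simple exponential drying model.
# Moisture percentages here are illustrative, not field measurements.
import math

def fuel_moisture(m_start, m_equilibrium, hours, timelag_hours):
    """Fuel moisture (%) after 'hours' of constant weather for a given timelag class."""
    return m_equilibrium + (m_start - m_equilibrium) * math.exp(-hours / timelag_hours)

m_start, m_eq = 30.0, 6.0  # wet fuel exposed to warm, dry air (assumed percentages)

for timelag in (1, 10, 100):          # 1-hr grasses, 10-hr twigs, 100-hr branches
    after_3h = fuel_moisture(m_start, m_eq, hours=3, timelag_hours=timelag)
    print(f"{timelag:>3}-hour fuel after 3 dry hours: {after_3h:.1f}% moisture")
# Grasses (1-hour fuels) are near the flammable equilibrium within a few hours,
# regardless of how wet the preceding weeks or years were.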

Some scientists point out that certain logging practices also spread invasive grasses. For that reason, California’s Democratic congressman Ro Khanna has argued that the U.S. Forest Service policy of clear-cutting after a wildfire is making California’s forest fires spread faster and burn hotter by increasing the forest floor’s flammable debris. Khanna warns, “Because we don’t have the right science, it is costing us lives, and that is the urgency of getting this right.”

Bad analyses promote bad remedies, and blaming climate change has distracted people from real solutions. The cheatgrass problem will continue to cause bigger, fast-moving fires no matter how the climate changes. But several tactics could provide better remedies. Holistic grazing that targets annual grasses before they set seed is one. Better management of surface fuels via prescribed burns is another, as is more careful logging practice. And re-seeding habitat with native perennial grasses or sagebrush could help shift the competitive balance away from cheatgrass. In combination with limiting human ignitions (see Part 2), all these tactics may ensure healthy populations of Sage Grouse living alongside safer human communities.

Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism


Monday, November 18, 2019

Venice and Unenlightened Climate Fearmongering



Venice and Unenlightened Climate Fearmongering

 
Venice flooding 2019




Venice is composed of a hundred linked islands and sits in the center of the shallow Venice Lagoon. Island elevations are low and easily flooded during storms. The Great Flood of 1966 was the worst on record. Since then, Venice has been working to avert the next inevitable flood. But because its flood control projects were fraught with corruption and other difficulties, the government failed to prevent the 2019 flood. So, as Venice endured its 2nd greatest flood, the mayor covered his political derriere by immediately blaming climate change. That is a tactic typical of politicians these days. In California, ex-governor Jerry Brown vetoed a bipartisan bill to secure the electrical grid. Shortly thereafter, power-line sparks ignited some of California’s biggest wildfires, so of course Brown blamed climate change to disguise his policy failures.

Figure 1. Venice Lagoon and its 3 inlets


As seen in Figure 1 above, sea level rise in the Venice Lagoon is modulated by how much water from the Adriatic Sea enters the lagoon via 3 inlets and how quickly it flushes out again. To prevent further flooding, Venice began designing the MOSE project, which would construct inflatable barriers that could be deployed when weather conditions predicted threatening inflows from the Adriatic Sea. High inflows from the Adriatic Sea are driven by the strength of the Sirocco and Bora winds that cause local sea level to surge. 

Increasing Venice’s vulnerability, the land has been sinking. Dwarfing the estimated 1.4 millimeters per year of sea level rise, Venice sank at a rate of 2.3 millimeters per year from 1930 to 1970, largely due to groundwater extraction. After that problem was addressed, the rate of sinking slowed, but Venice still subsides about 1 millimeter per year. Furthermore, due to alterations of the lagoon’s basin, the amplitude of the tides has been changing, which accounts for 20% of the rise in extreme sea level events. That tidal effect was largely due to altered flows through the inlets caused by dredging for ship traffic and by construction for the MOSE project.
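The arithmetic is straightforward. The Python sketch below simply combines the subsidence and sea level rates quoted above into the relative rise Venetians actually experience; the 40-year windows are chosen only for illustration.

# Relative sea level at Venice = global (eustatic) rise + local land subsidence.
# Rates below are the ones quoted in the text, applied over illustrative periods.
def relative_rise_mm(years, eustatic_mm_per_yr, subsidence_mm_per_yr):
    return years * (eustatic_mm_per_yr + subsidence_mm_per_yr)

rise_1930_1970 = relative_rise_mm(40, eustatic_mm_per_yr=1.4, subsidence_mm_per_yr=2.3)
rise_recent    = relative_rise_mm(40, eustatic_mm_per_yr=1.4, subsidence_mm_per_yr=1.0)

print(f"1930-1970: ~{rise_1930_1970:.0f} mm of relative rise, "
      f"{40 * 2.3:.0f} mm of it from sinking land")
print(f"A comparable recent 40 years: ~{rise_recent:.0f} mm")
# Even at today's slower 1 mm/yr subsidence, sinking land remains a large share
# of the water-level change Venetians actually experience.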

Venice Great Flood of 1966


As has become typical for every catastrophe, media outlets shamelessly and mindlessly blamed climate change for Venice’s flooding. Others, like Dr. Marshall Shepherd writing for Forbes, attempted to appear more objective by acknowledging that many factors had contributed to the flooding. But Shepherd’s real intent was to ensure that people would still blame climate change, at least in part, and to suggest that skeptics were biased by focusing only on Venice’s sinking land. But there is much more to the skeptics’ arguments. Furthermore, Dr. Shepherd failed to provide any support for his climate change claims. That is to be expected, as the evidence provides very little support for Venice’s mayor or for Shepherd.


If climate change had really played a significant role, we would expect the 2019 flooding to be worse than the more “natural” flooding of 1966. But a comparison of floodwaters inundating Doge’s Palace (see above) suggests the flooding was slightly worse in 1966. Official measurements likewise determined flood levels in the Venice Lagoon peaked at 74 inches, shy of the 1966 record of 76 inches. The climate change argument is weakened further when it is understood that the 1966 flood happened during a low tide, in contrast to the 2019 flood, which happened during an extreme high tide. Furthermore, there is no correlation with global warming, as the November 1966 flood happened when Venice experienced its coldest temperatures since 1924. Recent Venice temperatures are slightly lower than those of the 1950s (Figure 2).


Figure 2 Venice Temperature Trend

The Venice Lagoon is situated at the northernmost end of the Adriatic Sea, which is bordered by mountains along both its eastern and western shores. That geography creates a funnel effect. Each autumn the Sirocco winds begin to intensify. These winds drive warm air from Africa northward, which in turn pushes Adriatic sea water northward up the “funnel.” The end result is that sea water piles up in front of the 3 inlets and begins flooding the shallow Venice Lagoon. Stronger winds drive greater flooding. And if the winds are strong enough, they temporarily prevent sea water from exiting the lagoon, causing sea level to rise even higher.
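A standard storm-surge approximation shows why a strong Sirocco matters so much: sustained wind stress over a shallow basin tilts the sea surface by roughly wind stress times fetch divided by (water density x gravity x depth). The Python sketch below uses assumed round values for the shallow northern Adriatic, so treat the numbers as order-of-magnitude only.

# Back-of-the-envelope wind setup over a shallow basin:
#   setup = wind_stress * fetch / (rho_water * g * depth)
# All inputs are assumed round values, not Adriatic measurements.
def wind_setup_m(wind_m_s, fetch_m, depth_m,
                 drag_coeff=2.0e-3, rho_air=1.2, rho_water=1025.0, g=9.81):
    wind_stress = rho_air * drag_coeff * wind_m_s**2          # N/m2
    return wind_stress * fetch_m / (rho_water * g * depth_m)

for wind in (10, 15, 20):  # m/s sustained Sirocco wind (assumed)
    setup = wind_setup_m(wind, fetch_m=250_000, depth_m=30)
    print(f"{wind:2d} m/s wind: ~{setup * 100:.0f} cm of setup at the downwind end")
# Setup grows with the square of wind speed, which is why an unusually strong
# Sirocco event can push water levels far above an ordinary autumn tide.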

Naturally, one might ask whether climate change has caused an increasing trend in the Sirocco winds. But there has been no trend.





We should also ask how much sea level rise has affected Venice. It could certainly be argued that rising sea levels since 1900 contributed about 100 millimeters (4 inches) to the 1966 Great Flood, as there was a steady rise in sea level between 1900 and 1970. But between 1970 and 2000 the Venezia (Venice) tide gauge shows sea level peaked around 7150 millimeters and then plateaued (Figure 3). Unfortunately, that tide gauge was then moved to a new location, designated Venezia II. There sea level began at a lower elevation and rose from 2001 to 2010, again plateauing just under 7150 millimeters (Figure 4).


Figure 3  Venezia (Venice) Sea Level 1910-2000


Figure 4 Venezia II (Venice) Sea Level 2001-2016


Because various parts of Venice are sinking at different rates, it is difficult to know how much the new tide gauge location affected the newer estimates of sea level change. However, due to the uncertainty caused by Venice’s sinking land, researchers typically compare Venice’s sea level trends to those at neighboring Trieste, in the far northeast corner of the Adriatic Sea, where the land appears to be more stable. Surprisingly, the Trieste sea level trend has been declining since 2000 (Figure 5). So, it appears impossible to attribute the 2019 Venice flood to climate-driven sea level rise in the Adriatic Sea.

Figure 5 Trieste Sea Level 2001-2016
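For readers curious how such trends are computed, the usual approach is an ordinary least-squares fit through a gauge's annual mean relative sea level. The Python sketch below uses made-up numbers in the same style of units the tide-gauge records report (millimeters relative to a local datum near 7000 mm); it is not the Venezia or Trieste record.

# How a tide-gauge trend like those discussed above is typically estimated:
# an ordinary least-squares line through annual mean relative sea level.
# The data here are made up for illustration, not actual gauge records.
import numpy as np

years = np.arange(2001, 2017)
annual_mean_mm = (7100 + 1.5 * (years - 2001)
                  + np.random.default_rng(0).normal(0, 15, years.size))

slope_mm_per_yr, intercept = np.polyfit(years, annual_mean_mm, 1)
print(f"Fitted trend: {slope_mm_per_yr:+.2f} mm/yr")
# Over a window this short, interannual noise of a centimetre or two can easily
# mask or mimic a millimetre-scale trend, one reason short records and gauge
# relocations (Venezia vs. Venezia II) make attribution so uncertain.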


However, there is another factor to consider. The winds in the northern Adriatic cause sea levels to oscillate from east to west across the northern basin. When sea levels fall around Trieste, they often surge around the Venice Lagoon. Thus, at least in part, higher sea levels in the Venice Lagoon are driven by an ocean oscillation that creates higher sea level surges. And when that oscillation coincides with strong Sirocco winds, a sinking Venice should expect more flooding.

In contrast, it is unclear what effect global warming has had; perhaps it is negligible. Unfortunately for the public, that doesn’t stop media outlets from hijacking the hardships in Venice to push a climate crisis. Alarmists continue to falsely suggest that every catastrophe is partly driven by CO2-caused global warming. Sadly, as savvy propagandists know, if you tell a big enough lie often enough, people will start believing it.



Jim Steele is Director emeritus of San Francisco State’s Sierra Nevada Field Campus and authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism