
Tuesday, August 24, 2021

Drying and Warming Our Earth

My short video "Drying and Warming of Our Earth" describes how enlightened land management can make landscapes more resilient to drying and heatwaves. It presents recent research showing that heatwaves are made worse not by rising CO2 but by alterations to the earth's surface. https://www.youtube.com/watch?v=nn2G-DZVm-s&t=47s

Friday, August 20, 2021

Documenting YouTube Manipulation

Dishonest manipulation of skeptical science has long been suspected. But without solid evidence, allegations of fraudulent manipulation are dismissed as “just paranoid speculation” or “conspiracy theory.” By documenting how “malleable” the YouTube stats are, I demonstrate how easily such manipulations are performed. I hope this can encourage a Congressional investigation into their tactics.

Background information: I am an ecologist who served as director of San Francisco State University’s Sierra Nevada Field Campus for 25 years. I studied ecosystems and wildlife and how they are affected by landscape changes, regional climate change, hydrology, and wildfires. On August 17, 2021 I gave a science-based talk to the Rotary Club of Novato, CA regarding Wildfires and Climate, which clearly showed that climate change is not a significant factor and that managing the ecosystem properly is the key to environmentally sound adaptation. I converted my PowerPoint presentation to a video, uploaded it to YouTube with the title Understanding Wildfires and Climate and How we Must Adapt, and posted that link to my blog, Facebook, and WUWT.

On day one of the video’s publication I noticed that the number of LIKEs suddenly dropped from 22 to 15 in just a few hours. Unsure whether I was imagining it or whether YouTube was trying to downgrade the video for debunking the bogus narrative that wildfires were due to a “climate crisis,” I began checking the stats every few hours and taking screenshots to document any changes (see below). I also changed the thumbnail title from “Understanding Wildfires and Climate and How we Must Adapt” to “Prevent Looming Megafires,” which was my main reason for this video. But a subsequent Google search for that title did not list the video at all, so I returned the thumbnail title to “Understanding Wildfires and Climate and How we Must Adapt,” which was always part of the video description.

I believe the number of Views and Likes determines how the YouTube/Google algorithms guide the public to any video via “YouTube searches” or “Other YouTube features,” which presently account for a total of 5% of my video’s views.

Here is the screenshot documentation showing undeniable dishonest manipulation.

1. screenshot August 18, 8:33 PM ; a) 741 Views b) 22 Likes

2. screenshot August 19, 7:33 AM ; a) 782 Views b) 20 Likes (Likes reduced by 2)

3. screenshot August 19, 9:51 AM ; a) 768 Views b) 20 Likes: (views reduced by 14, Likes still only 20)

At this point I tweeted about YouTube’s manipulation of video stats, and that Tweet had over 4000 views. Coincidentally, Likes shortly thereafter jumped up by 3 but with only 1 added view. Hmmm?

4. screenshot August 19, 10:07 AM; a) 769 Views b) 23 Likes: (1 more View, Likes up by 3)

5. screenshot August 19, 10:25 AM; a) 770 Views b) 24 Likes: (1 more View but still less than the 782 at 7:30 AM, Likes up by 1)

6. screenshot August 19, 10:37 PM; a) 850 Views b) 28 Likes (Views and Likes gradually increased throughout the day)

7. screenshot August 20, 5:47 AM; a) 863 Views b) 22 Likes. (Views up, Likes reduced by 6). So again, I tweeted about YouTube manipulation!

8. screenshot August 20, 8:12 AM; a) 860 Views b) 30 Likes. (Views down by 3, but Likes up by 8 in 2 hours to 30 total). Hmmm, do they monitor my tweets?

Overnight, August 29-30, they reduced the Likes by 20.

I continue to screenshot the persistent removal of Likes and Views; a small script for logging these stats automatically is sketched below.
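Such a log can be kept automatically: a small script can poll a video’s public statistics on a schedule and append each reading, timestamped, to a file. Below is a minimal sketch assuming the YouTube Data API v3 videos endpoint and a valid API key; the key and video ID are placeholders, and this is offered only as one way to automate the documentation, not as the method behind the screenshots above.

```python
# Minimal sketch: poll a video's public view/like counts at a fixed interval and
# append them, timestamped, to a CSV so any later reductions are on record.
# Assumes the YouTube Data API v3; API_KEY and VIDEO_ID are placeholders.
import csv
import time
from datetime import datetime

import requests

API_KEY = "YOUR_API_KEY"      # placeholder
VIDEO_ID = "YOUR_VIDEO_ID"    # placeholder
LOG_FILE = "video_stats_log.csv"


def fetch_stats():
    """Return (views, likes) from the video's public statistics."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"part": "statistics", "id": VIDEO_ID, "key": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    stats = resp.json()["items"][0]["statistics"]
    return int(stats["viewCount"]), int(stats.get("likeCount", 0))


def log_stats(interval_seconds=3600):
    """Append a timestamped row every hour (by default) to the CSV log."""
    while True:
        views, likes = fetch_stats()
        with open(LOG_FILE, "a", newline="") as f:
            csv.writer(f).writerow([datetime.now().isoformat(), views, likes])
        time.sleep(interval_seconds)


if __name__ == "__main__":
    log_stats()
```

A log like this, kept alongside the screenshots, makes it trivial to show exactly when a count went down rather than up.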

Tuesday, August 10, 2021

Part 2: National Public Radio’s Misinformation on Wildfires and Climate

In 1994 the Harvard Business Review insightfully wrote, “The news media and the government are entwined in a vicious circle of mutual manipulation, mythmaking, and self-interest. Journalists need crises to dramatize news, and government officials need to appear to be responding to crises.” So it’s no surprise that NPR’s headline hyped, “Climate Change Is Driving Deadly Weather Disasters From Arizona To Mumbai,” claiming, “Heat waves. Floods. Wildfires…We know that climate change is to blame.” Nor is it surprising that the New York Times wrote that “climate change is a key culprit” for the American west’s wildfire disasters. And right on cue, the Intergovernmental Panel on Climate Change warns “It’s code red for humanity.” But governments promise to fix the “crisis” by controlling your energy and social policies.

Is the media honestly following the science, or once again engaging in “mutual manipulation, mythmaking, and self-interest”?

We ecologists know that wildfires and climate change are extremely complex issues with many contributing variables. Bigger, more intense, and more frequent wildfires, as well as a longer fire season, can each be produced by several different variables. Distinguishing the “key culprit” is not as easy as a NY Times opinion piece suggests. Thus, ecologists are trained to maintain multiple working hypotheses to determine which hypothesis best fits the evidence. This scientific process has grave social importance. Wrong analyses always produce wrong remedies, and bad remedies can be worse than the problems they seek to fix. So here are some major competing hypotheses explaining recent upticks in wildfires.

1. Wildfire Suppression

In the 1800s the western USA landscape was a mosaic of open meadows and patchy forests. That mosaic was maintained by frequent wildfires, ignited naturally or by Native Americans. The patchy mosaic created natural fire breaks that prevented the spread of megafires. Frequent fires also reduced both the ground fuels that cause more intense fires and the ladder fuels that carry fire into the canopies. All wildfire experts agree that the policy of wildfire suppression from 1900 to 1970 allowed a) the explosive accumulation of ground fuels and ladder fuels and b) a loss of patchiness that created fuel continuity across larger swaths of forest. The increase in wildfires since 1970 coincides with the relaxation of fire suppression policies, when ecologists emphasized that frequent low-intensity fires are needed to maintain forest health and biodiversity.

A US Forest Service report provides photographic evidence of the landscape patchiness in 1909 before wildfire suppression began versus the increase in ground and ladder fuels and dense connected forests that evolved by 1979 due to fire exclusion.

2. Human Ignitions

Since 1900, California’s population increased from 1.5 million people to more than 39 million. More people inevitably produce more accidental ignitions. Eighty-four percent of all USA fires are ignited by people. A larger population also requires a larger electrical grid. California’s 2nd largest fire (Dixie) was ignited by an electrical spark, as were the deadliest-ever Camp Fire and the 4th deadliest Tubbs Fire. Three of California’s largest fires (the Mendocino, Rim, & Carr fires) were caused by other human accidents.

While the natural fire season, ignited by lightning, extends from May through September, peaking in hot and dry July, human ignitions extend the fire season throughout the entire year. In California this is especially dangerous, as ignitions during the winter are rapidly spread by fierce Santa Ana and Diablo winds. These winds begin to ramp up in October as increasingly cold seasonal temperatures in the high mountain deserts push dry air down across California towards a relatively warmer Pacific Ocean. A downed powerline in December ignited a fire, spread by the Santa Ana winds, that devastated southern California with its 8th largest fire (the Thomas Fire). California’s 4 deadliest fires (the Camp, Griffith Park, Tunnel, and Tubbs wine-country fires) were all rapidly spread by October and November winds. The 2nd deadliest, the Griffith Park fire, was accidentally started in October 1933.

3. Altered Grasslands

Grasses and shrubs produce small-diameter “fine fuels” that dry out within 1 to 10 hours of dry weather. As discussed in part 1, those 1-hour fuels are highly flammable, even during freezing temperatures, and easily ignited. As seen in Fig. 3 (from Keeley, 2015), grassland fires account for the largest burnt areas. Some fire experts argue the tremendous drop in wildfires during the early 20th century was due not only to fire suppression policies but also to severe overgrazing that reduced the grasslands’ ability to spread fire. Similarly, a recent NASA report determined wildfires globally had “declined by 24 percent between 1998 and 2015.” They attributed this global decline to a change in landscapes across the African savannahs. Previously, fires had been intentionally set to keep grasslands free of invading shrubs and trees and thus support grazing. As villages and homes intruded into the savannah and cultivation of permanent crop fields replaced grassland, the use of fire was reduced.

In 1992, ecologists from UC Berkeley and Stanford University wrote the seminal Biological Invasions by Exotic Grasses, the Grass/Fire Cycle and Global Change. Grasses create conditions that favor fire by producing a “microclimate in which surface temperatures are hotter, vapor pressure deficits are larger, and the drying of tissues more rapid than in forests or woodlands.” An invasion of alien grass species “provides the fine fuel necessary for the initiation and propagation of fire. Fires then increase in frequency, area, and perhaps intensity. Following these grass-fueled fires, alien grasses recover more rapidly than native species and cause a further increase in susceptibility to fire.” This cycle had been well known to land managers who seeded alien grasses to increase fire frequency and intensity to suppress woody species.

The grass/fire cycle has altered landscapes across the world. In Hawaii, alien grasses thoroughly filled the spaces between native shrubs, providing continuous layers of fine fuel. Prior to a 1960s invasion, only 27 fires were recorded in 48 years, each burning an equivalent of 8 football fields. In the 20 years following the invasion, twice as many fires ignited, each burning an average of 400 football fields. In western North America the invasion of cheatgrass increased the frequency of fires in Idaho shrublands from once every 60-110 years to every 3-5 years. In eastern Oregon, land dominated by cheatgrass is considered 500 times more likely to burn than other landscapes.

In the Great Basin deserts, due to low fuel abundance, sagebrush ecosystems burned just once every 60-100 years. Now cheatgrass-dominated sagebrush habitat burns every 3-5 years, up to 20 times more frequently than under historic natural conditions. Eleven of the USA’s 50 biggest fires in the last 20 years have been in Great Basin sagebrush habitats, where invasive cheatgrass is spreading. Nevada’s largest fire was the 2018 Martin Fire. Rapidly spreading through the cheatgrass, it burned 439,000 acres, a burned area rivaling California’s 4th largest fire in recorded history. Additionally, a greater frequency of fires, due to the combination of the grass/fire cycle and increased human ignitions, has caused large areas of California shrublands to convert to grasslands.

4. Natural Climate Cycles

Most media, like the NY Times, seek out researchers who focus on finding a connection between a climate change crisis and wildfire crises. So typically the media quote Park Williams, John Abatzoglou, Daniel Swain, or Kevin Trenberth. Park Williams summarizes, “This climate-change connection is straightforward: warmer temperatures dry out fuels.” That statement is true, but so is the converse: drier conditions also dry out fuels and raise temperatures. So blaming warmer temperatures may be the tail wagging the dog. Furthermore, it is bad science to apply a 2°F rise in global temperatures to the measured local temperatures where fires ignite. In the USA, 36% of the weather stations with 70+ years of data report cooling trends. It’s also bad science to use average daily temperatures. Rising minimum temperatures are associated with growing populations and may still be below the dew point, which would moisten ground fuels. It’s the maximum temperatures that dry out fuels.

For example, in the region of the huge Mendocino Complex fire, which was accidentally ignited in dry grasses, annual maximum temperatures have cooled since the 1930s, as recorded in the US Historical Climate Network. This cooling of maximum temperatures holds true throughout northern California.

Contrary to Williams’ suggestion, dryness does not depend on temperature, as witnessed by the warm wet tropics and the cold dry tundra. The Sahara Desert was driest during the depth of the Ice Age but converted to the moist Green Sahara as temperatures warmed. Dry conditions are mostly a function of how atmospheric circulation transports moisture from the oceans to the land, and it is atmospheric circulation that makes the American west so dry and the eastern USA so moist. The most important modulators of that circulation are the natural El Nino cycles (ENSO) and the Pacific Decadal Oscillation (PDO), which shift rains and drought northward and southward. Researchers found “combined warm phases (positive PDO during El Nino) co-occurred with large fires in the central and northern Rockies, while the combined cool phases (negative PDO during La Nina) appeared to promote large fires in the southern Rockies. Almost 70% of large fires in Rocky Mountain National Park burned during La Nina events that coincided with a negative PDO, although these phases co-occurred during only 29% of the 1700-1975 AD period.”
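A back-of-envelope check shows how strong that clustering is: if roughly 70% of the park’s large fires burned in phases that occupied only about 29% of the 1700-1975 record, fire years are over-represented in those cool phases by a factor of about 2.4. The few lines below just redo that arithmetic with the quoted percentages.

```python
# Back-of-envelope check on the quoted Rocky Mountain National Park statistic:
# how over-represented are La Nina + negative-PDO phases among large-fire years?
fires_in_cool_phases = 0.70   # fraction of large fires (quoted above)
time_in_cool_phases = 0.29    # fraction of the 1700-1975 period (quoted above)

enrichment = fires_in_cool_phases / time_in_cool_phases
print(f"Large fires are ~{enrichment:.1f}x over-represented in these phases")  # ~2.4x
```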

Appropriately, the recent uptick in forest fires coincides with the natural 21st century shift of the PDO to its negative phase and the increased frequency of La Nina conditions. This natural cycle of droughts and fires forced Williams to find a narrative that synthesized the scientific evidence of La Nina effects with modeled speculation of climate-crisis-induced wildfires. In 2014 Williams wrote, “The southwestern United States (SW) experienced extreme drought in 2011, related at least in part to a La Niña event in the tropical Pacific Ocean. The 2011 SW drought event was accompanied by record breaking total burned and record-size ‘megafires’ in the forests of eastern Arizona and northern New Mexico.” He then conflated this with model myth-making, writing, “Model projections developed for the fifth phase of the Coupled Model Intercomparison Project suggest that by the 2050s warming trends will cause mean warm-season vapor pressure deficit to be comparable to the record high VPD (dryness) observed in 2011.”

But climate models have done an extremely poor job of modeling drought. Michael Wehner published the graph below, which was also featured in a National Climate Assessment. The observed (red) fractional extreme drought area over the USA and Mexico was clearly greatest during the 1930s and attributed to landscape changes and natural cycles. The second-worst drought extent occurred in the 1950s. Although modelers already knew the results, their CO2-driven model results (blue) failed to even hint at those historical droughts and accompanying heat waves. All the climate models could do is project imagined disasters in the future. Rarely does the media mention the grass/fire cycle or the dryness of ENSO cycles. Instead, NPR prefers to blame climate change for worse wildfires. The media is indeed stuck in the “vicious circle of mutual manipulation, mythmaking, and self-interest.”

Tuesday, August 3, 2021

National Public Radio’s Misinformation on Wildfires and Climate: Part 1

Each day I attempt to synthesize curiously divergent views in the news. In the morning I listen to National Public Radio (NPR) and in the evening I watch Fox News. However, I’m increasingly disturbed by NPR’s unbalanced reporting on wildfires. With every wildfire report, NPR now adds climate crisis commentary but ignores wildfire science. I learned more about heat transfer and wildfires as a Boy Scout, and I expanded my wildfire science as an ecologist researching California’s Sierra Nevada ecosystems for 30 years. I must say an honest NPR would focus on the 3 major issues needed to minimize wildfire devastation:

1) Minimize human ignitions.

2) Improve ground fuel management.

3) Remove introduced annual grasses.

Politicians and media journalists who claim reducing CO2 will save us from bigger wildfires are only exacerbating public fears and promoting ineffective policies.

As a Boy Scout I was dropped off in a snow-covered Maine wilderness with temperatures below 32°F. My task was to build a life-saving campfire with only one match. Survival required finding enough flammable fuel. Fortunately, there was an abundance of small dead twigs on the lowest branches of most trees. With ample airflow, those small-diameter fuels (A) are reliably ignited with a single match. Wildfire experts refer to small twigs, pine needles, and dead grasses as 1-hour lag fuels because those fuels dry out sufficiently in an hour of dry weather, with or without global warming. Varying abundances of 1-hour fuels determine how readily wildfires ignite and spread. Thus, during fire season wildfire experts assess 1-hour fuels daily. Though easily ignited, twigs or dead grass have very little mass to sustain a fire. So, to build my survival fire I carefully added a layer of slightly larger sticks (B) that could be quickly ignited before the heat from my twigs was exhausted. The added sticks can’t be too large: larger branches and logs will absorb small amounts of heat without igniting.
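The “1-hour” label is not arbitrary: in the standard fuel time-lag convention, dead-fuel moisture relaxes exponentially toward the equilibrium moisture content of the surrounding air, and the lag class is the time constant of that decay. The sketch below uses purely illustrative starting and equilibrium moisture values (not measurements) to show why twigs and dead grass are fire-ready after an hour or two of dry weather while large branches barely respond.

```python
# Minimal sketch of the first-order "time-lag" drying model behind the 1-hour,
# 10-hour, and 100-hour fuel classes: moisture decays exponentially toward the
# equilibrium moisture content of the air, with the lag class as the time
# constant. The moisture values below are illustrative, not field measurements.
import math


def fuel_moisture(m_start, m_equilibrium, lag_hours, elapsed_hours):
    """Dead-fuel moisture (%) after elapsed_hours of constant dry conditions."""
    return m_equilibrium + (m_start - m_equilibrium) * math.exp(-elapsed_hours / lag_hours)


if __name__ == "__main__":
    m_start, m_eq = 25.0, 6.0    # illustrative: damp fuel drying toward dry air
    for lag in (1, 10, 100):     # twigs/grass, small branches, large branches/logs
        after_3h = fuel_moisture(m_start, m_eq, lag, elapsed_hours=3)
        print(f"{lag:>3}-hour fuel after 3 dry hours: {after_3h:.1f}% moisture")
```

Run with these illustrative numbers, the 1-hour fuels sit essentially at equilibrium after three dry hours while the 100-hour fuels have barely changed, which is exactly why daily assessments focus on the fine fuels.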

My one match generated temperatures exceeding 1000°F, but only for a short time. Yet with the right balance of 1-hour fuels and sticks for kindling, my survival fire was secured. Ample kindling is the key to igniting and sustaining every large fire. In California, accumulating ground fuels supplied the kindling that allowed a spark from a tire rim scraping asphalt to ignite dead roadside grasses, which then expanded into the large Carr Fire. A spark from a rancher driving a stake into the ground ignited the surrounding dead grass, starting California’s 2nd largest fire, the Mendocino Complex Fire, even though regional maximum temperatures had been 3°F cooler than the 1930s. Sparks from powerlines in November caused California’s deadliest Camp Fire, and powerlines likewise ignited California’s largest 2021 fire, the Dixie Fire.

All those conflagrations were caused by human ignitions, and experts calculate that 84 percent of America’s wildfires are started by humans. Good science cannot attribute an increase in wildfires to climate change unless it first accounts for increased human ignitions. Additionally, once ignited, fires produce such tremendous heat that ground fuels are readily dried out as the fire approaches. Swaths of burning grasses make great kindling, generating temperatures reaching 1400°F that easily ignite accumulated ground fuels and shrubbery. Those ground fuels then generate temperatures reaching 2000°F. Claims that an insignificant 2°F increase from global warming makes wildfires bigger and more intense appear to be politicized fear mongering. My survival fire started in below-freezing temperatures, and major conflagrations burned during the Little Ice Age. So it is best to prevent increased ignitions by burying powerlines and by removing 1-hour fuels from roadsides.

In NPR’s defense, gullible journalists are fed cherrypicked “science” from researchers seeking fame and fortune by promoting climate crises. Naïve journalists are provided graphs of “unprecedented” increasing wildfire trends, but only since 1970. The longer-term trend, as exemplified by the Oregon Department of Forestry, is rarely publicized. Furthermore, crafty researchers blame global warming by showing how it theoretically dries out fire fuels, but that simple, sound physics then gets grossly misapplied.

For example, widespread invasive annual grasses have also increased landscape aridity, the probability of ignitions, and the extent of burnt area. Non-native cheatgrass earned its name because it grows during the winter and early spring, consuming soil moisture and depriving native grasses of it. By growing in the west’s moister winter season, cheatgrass expanded into arid regions where extensive grasslands had previously been unable to establish. Added carpets of flammable cheatgrass engulfed sagebrush ecosystems, covering bare ground that had once limited large or frequent fires. And carpets of cheatgrass can easily carry fires into forests and other ecosystems.

In contrast, native perennial grasses, growing during the summer, maintained high moisture content and didn’t go dormant until late August. Their high moisture content reduced the probability of both grass fire ignition and its spread. But native grasses were soon outcompeted by moisture-hoarding cheatgrass, which dies, dries, and becomes highly flammable by April and May. As landscapes became dominated by cheatgrass, the fire season intensified and expanded by 2-3 months. Although NPR had once mentioned the destructive effects of invasive grasses, NPR now pushes climate change. But the drying out of western grasslands is not due to climate change. It’s due to natural weather cycles and the landscapes’ conversion to moisture-sucking invasive annuals. Finally, cheatgrass conversion correlates all too well with maps of the western USA’s most active fire regions. Restoring native grasslands would be another best practice to reduce the west’s larger wildfires.

Added August 7:

Here is a graph from USGS fire experts showing not only that the greatest amount of California's burnt acres occurs in 1-hour and 10-hour fuel habitat, but also that far more acres burned in the 1920s and 30s.

Wednesday, July 21, 2021

Interview with David Siegel

Among other issues, we discuss how the transport of warm Atlantic waters into the Arctic has caused Arctic warming despite cooling trends in North America and Eurasia. Arctic temperatures bias the global average, yet no similar warming is observed in the Antarctic.



Monday, July 19, 2021

Islands of Truth Emerging from the Murky Depths of “Sea Level Science”

During the 1980s, the media and a few scientists warned that island nations in the tropical Pacific would soon be wiped from the face of the earth, drowned by rising sea levels attributed to rising CO2. The Maldives’ Environmental Affairs Director warned that an 8 inch to a foot rise in sea level in the next 20 to 40 years would be catastrophic. The Guardian’s headlines wrote of islanders abandoning their island home to become environmental refugees. In 2002 supported by Greenpeace, Tuvalu threatened to sue the United States and Australia for excessive carbon dioxide emissions. The Smithsonian magazine asked, “Will Tuvalu Disappear Beneath the Sea?”. However, the Smithsonian also admitted “not all scientists agree that Tuvalu’s future is underwater. Some critics have branded island leaders as opportunists angling for foreign handouts…while people and organizations sympathetic to Tuvalu are “eco-imperialists” intent on imposing their alarmist environmental views on the rest of the world.”

 

Theoretically, the fear of devastating rising sea levels was quite legitimate given the onslaught of global warming narratives and the fact that the entire atoll nation of Tuvalu only averages 6.6 feet above sea level. So naturally, Tuvalu’s fund-seeking prime minister, Saufatu Sapo’aga, told the United Nations that global warming’s threat to his island was no different than “a slow and insidious form of terrorism”. In reality, the latest science published in 2018 combined past aerial photographs and satellite imagery to report a net increase in Tuvalu’s land area. Built on coral reefs, the atoll’s many islands change shape as coral debris and foraminifera are added to the land on the leeward side, while area is lost on the windward side. Nonetheless, 74% of the islands increased in size, with only 27% decreasing. Furthermore, Tuvalu’s growth is mirrored in other islands region-wide. A similar study examining “30 Pacific and Indian Ocean atolls including 709 islands, reveals that no atoll lost land area and that 88.6% of islands were either stable or increased in area, while only 11.4% contracted.” So how can coral islands grow in an era of accelerating sea level rise?

 
The answer may be that, in contrast to global warming theories, regional sea levels have not been rising by 3+ millimeters/year, nor has the rise accelerated. The scary misinformation is partly due to researchers using short-term sea level records that are incapable of accurately measuring the long-term trends. Pacific sea levels rise and fall over decades (see maps above). A single El Nino event can temporarily raise or lower island sea levels by 300 millimeters (mm). Thus, one study using just 6.8 years of data determined sea level at Tuvalu had fallen by 8.9 mm/year, in contrast to a 2008 study using 15.5 years of data claiming sea level rose by 5.9 mm/year.
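To see how easily a short record misleads, the toy example below fits linear trends to synthetic monthly sea levels built from a small underlying rise, a quasi-ENSO oscillation of a few hundred millimeters, and noise; all of the numbers are illustrative, not Tuvalu observations. The full multi-decade fit recovers the small underlying rate, while a window of only a few years can flip the sign or wildly inflate the trend.

```python
# Toy illustration (synthetic data, not observations): a several-hundred-mm
# ENSO-like swing dominates a trend fitted to a few years of record, while a
# fit over several decades recovers the small underlying rate.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1978, 2021, 1 / 12)                     # monthly time axis
true_rate = 0.8                                           # mm/year underlying rise
enso = 150 * np.sin(2 * np.pi * (years - 1978) / 4.4)     # quasi-ENSO swings (mm)
sea_level = true_rate * (years - years[0]) + enso + rng.normal(0, 20, years.size)


def trend_mm_per_year(t, h):
    """Least-squares linear trend in mm/year."""
    return np.polyfit(t, h, 1)[0]


print(f"Full ~43-year record: {trend_mm_per_year(years, sea_level):+.1f} mm/yr")
recent = years > years[-1] - 7                            # last ~7 years only
print(f"Last ~7 years only:   {trend_mm_per_year(years[recent], sea_level[recent]):+.1f} mm/yr")
```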

In Tuvalu Not Experiencing Increased Sea Level Rise (2004), the climate skeptic Eschenbach reported that climate scientists, averaging 27 regional stations with records longer than 25 years, had determined only a 0.8 mm/year trend with no acceleration. Eschenbach further detailed how Tuvalu’s erosion and the saltwater intrusion into the drinking water were not caused by rapid sea level rise. Among other issues, because over 10% of the island had been paved, the underground reservoir of freshwater was not being replenished; rainwater was instead shunted away to the sea. Meanwhile the mining of sand and reefs for construction had disrupted the natural dynamics that maintained the island’s shape.

In “The Scientific Basis” of the IPCC’s 2001 report, the authors determined from tide gauges that “mean sea level rise during the 20th century ranged from 1.0 to 2.0 mm/year,” with no detection of the predicted acceleration. Still, to bolster a theory unable to link rising CO2, warming, and accelerating sea level rise, Vermeer and Rahmstorf (2009) created a climate change model that projected a sea-level rise ranging from 2.4 ft to 6.2 ft for the period 1990-2100. In contrast, Australia’s NSW principal Coastal Specialist observed a consistent trend of weak deceleration at each gauge site throughout Australasia over the period from 1940 to 2000, and in 2011 published Is There Evidence Yet of Acceleration in Mean Sea Level Rise around Mainland Australia?


There were several more conflicting issues. Scripps’ esteemed oceanographer Walter Munk found the IPCC’s 20th century sea level rise estimates of 1.5 to 2.0 mm/year were too high. In 2013, Gregory et al. wrote Twentieth-Century Global-Mean Sea Level Rise: Is the Whole Greater than the Sum of the Parts? They solved the problem of an unbalanced sea level budget by simply increasing previously estimated contributions from ocean warming and meltwater. In contrast, Harvard’s Mitrovica solved the same problem by lowering 20th century mean sea level rise to just 1 mm/yr.


Meanwhile, new satellite-era estimates suggested the long-awaited theoretical acceleration had arrived, with sea level rising 3.5 mm/year from 1994-2002. Yet, like Tuvalu, such short periods of measurement are usually biased by natural variability and cannot accurately measure long-term trends. Accordingly, subsequent satellite measurements between 2003 and 2011 found global sea level suddenly decelerated to 2.4 mm/year. To resolve that conflict with global warming expectations, Cazenave et al. added the estimated water trapped on land during La Ninas to observed sea levels, plus another 0.3 mm/year glacial isostatic adjustment to account for ocean basin expansion. Thus, she raised global warming’s rising sea level trend to 3.3 mm/year for the past 20 years, a rate commonly cited today.
In addition to Cazenave’s misguided trend building from short-duration satellite measurements, calibrating satellite data to tide gauges can be distorted by the unaccounted-for land subsidence that typically biases tide gauges. It is well established that sinking land creates an illusion of sea level rise. China’s Huanghe Delta is sinking 10 inches/year (254 mm) and New Orleans is sinking 1.4 inches (36 mm) per year. Southern Florida experiences localized patches of 1-3 mm/year subsidence, undermining condos in urban areas built on reclaimed marshland. Similarly, San Francisco’s airport is sinking 0.4 inches per year. Between 1980 and 2014, despite San Francisco’s tide gauge experiencing large El Nino sea level spikes, there was no rising trend, suggesting falling sea levels have offset its subsidence.
Robustly evaluating the dangers of rising sea levels for Pacific islanders, in 2020 Alberto Boretti published Relative sea-level rise and land subsidence in Oceania from tide gauge and satellite GPS. Using only the 6 regional tide gauges with 100+ years of data, he found that, after subtracting subsidence effects from the +1.3 mm/year average relative sea level rise, the average absolute rate of rise came to an astonishingly low +0.125 mm/year, with no signs of acceleration or evidence of thermal expansion. In addition, Boretti’s special Tuvalu case study, also accounting for subsidence, determined an absolute rate of rise from 1977 to the present of just +0.157 mm/year. That rate of sea level rise from increased ocean volumes will only add about 0.6 inches in 100 years. The worrisome “sea-level rise of Tuvalu is due to subsidence rather than the increasing volume of the ocean waters” from thermal expansion or glacial meltwater. The best approach for Tuvalu’s prime minister would be to stop the ill-advised urbanization that leads to subsidence and threatens its fresh water supply, and to stop covering his political rump by shifting the blame to climate change. Still, stopping subsidence might not be possible. After all, in the late 1800s, Charles Darwin correctly surmised that atolls like Tuvalu were formed by fringing reefs attached to a naturally subsiding extinct volcano.
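For completeness, Boretti’s correction and the “about 0.6 inches in 100 years” figure are simple arithmetic on the rates quoted above: absolute rise is the tide-gauge (relative) rate minus local land subsidence. The implied average subsidence below is just the difference between the two quoted regional rates, not a number taken from the paper.

```python
# Worked arithmetic on the rates quoted above: absolute sea level rise is the
# relative (tide-gauge) rate minus land subsidence, and a century at Tuvalu's
# corrected rate amounts to well under an inch.
MM_PER_INCH = 25.4

relative_rate = 1.3      # mm/yr, regional average relative rise (quoted)
absolute_rate = 0.125    # mm/yr, regional average after subsidence correction (quoted)
implied_subsidence = relative_rate - absolute_rate
print(f"Implied average regional subsidence: {implied_subsidence:.3f} mm/yr")

tuvalu_absolute = 0.157  # mm/yr, Boretti's Tuvalu case study (quoted)
century_rise_mm = tuvalu_absolute * 100
print(f"Tuvalu rise over 100 years: {century_rise_mm:.1f} mm "
      f"= {century_rise_mm / MM_PER_INCH:.2f} inches")   # ~0.6 inches
```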









Jim Steele is Director emeritus of San Francisco State University’s Sierra Nevada Field Campus, authored Landscapes and Cycles: An Environmentalist’s Journey to Climate Skepticism, and is a proud member of the CO2 Coalition. Contact: naturalclimatechange@earthlink.net