We can all make a difference…

“Life can be much broader once you discover one simple fact. And that is: Everything around you that you call life was made up by people that were no smarter than you. And you can change it, you can influence it … Once you learn that, you’ll never be the same again.” — Steve Jobs

Image Credit: Halar Khan from Flickr

Are We Overestimating How Much Trees Will Help Fight Climate Change?

September 4, 2019 by Jan Ellen Spiegel

Bob Marra navigated his way to the back of a dusty barn in Hamden, Connecticut, belonging to the state’s Agricultural Experiment Station. There, past piles of empty beehives, on a wall of metal shelves, were stacks of wooden disks — all that remains of 39 trees taken down in 2014 from Great Mountain Forest in the northwest corner of the state.

These cross-sections of tree trunks, known as stem disks — or more informally as cookies — are telling a potentially worrisome tale about the ability of forests to be critical hedges against accelerating climate change. As anyone following the fires burning in the Amazon rainforest knows by now, trees play an important role in helping to offset global warming by storing carbon from atmospheric carbon dioxide — a major contributor to rising temperatures — in their wood, leaves, and roots. The worldwide level of CO2 is currently averaging more than 400 parts per million — the highest amount by far in the last 800,000 years.

But Marra, a forest pathologist at the Experiment Station with a Ph.D. in plant pathology from Cornell University, has documented from studying his fallen trees that internal decay has the capacity to significantly reduce the amount of carbon stored within.

His research, published in Environmental Research Letters late last year and funded by the National Science Foundation, focused on a technique to see inside trees — a kind of scan known as tomography (the “T” in CAT scan). This particular tomography was developed for use by arborists to detect decay in urban and suburban trees, mainly for safety purposes. Marra, however, may be the first to deploy it for measuring carbon content and loss associated with internal decay. Where there is decay there is less carbon, he explains, and where there is a cavity, there is no carbon at all.

“What we’re suggesting is that internal decay in trees has just not been properly accounted for,” says Marra.

While the first round of his research was a proof of concept that necessitated the destruction of 39 trees to show that tomography is accurate, his ultimate goal is a nondestructive technique to enable better assessments of carbon sequestration than those done annually by the U.S. Forest Service. Under the United Nations Framework Convention on Climate Change, ratified in 1994, governments are required to report annual estimates of carbon holdings in all their managed lands. The most recent Forest Service figures show that U.S. forests offset about 14 percent of the nation’s carbon emissions each year.

The Forest Service estimates that carbon makes up 48 to 50 percent of a tree’s biomass, so ones with decay will be less dense and therefore hold less carbon. But Marra contends that the visual signs monitored by the Forest Service, such as canopy and tree size, along with conspicuous problems such as lesions or cankers, don’t accurately reflect internal decay — a tree that looks healthy may have decay and one that appears problematic may be fine inside.
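To make that baseline concrete, here is a minimal sketch, with invented numbers, of the kind of solid-wood arithmetic a 48 to 50 percent carbon fraction implies (an illustration only, not the Forest Service's actual inventory model):

```python
# Illustrative only: a solid-wood carbon estimate of the kind a
# 48-50 percent carbon fraction implies.
# The numbers below are invented for the example, not from the article.

CARBON_FRACTION = 0.49   # midpoint of the 48-50 percent range

def solid_wood_carbon_kg(biomass_kg: float) -> float:
    """Carbon held in a tree, assuming its stem is solid, sound wood."""
    return biomass_kg * CARBON_FRACTION

# A hypothetical 800 kg hardwood would be credited with about 392 kg of
# carbon, whether or not its trunk is partly decayed or hollow inside.
print(solid_wood_carbon_kg(800.0))
```

The heart of Marra's critique is that an estimate of this style credits the whole stem with carbon even when part of it has decayed or hollowed out.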



In addition, he says, foresters typically use a mallet to hammer a tree to register a sound that might indicate it’s hollow. “You know that there may be a hollow, but you don’t know how big the hollow is,” Marra says. As a result, he believes the government’s baseline data used to estimate carbon storage are not accurate.

“There are a lot of ways to improve our estimates of carbon being stored above ground in forests, and this decay component could certainly prove to be important,” says Andrew Reinmann, an ecologist and biogeochemist with the City University of New York’s Advanced Science Research Center. But, he added, “We haven’t really had the technology to explore this before — it’s still a little bit of an unknown.”


Marra used a two-stage system for his research: sonic tomography, which sends sound waves through the tree, followed by electrical resistance tomography, which transmits an electric current. Both processes are necessary to fine-tune each other’s readings.

The system, which costs about $25,000 and fits in a backpack, is cheap and small by scientific equipment standards. Each reading takes no more than a few minutes and computerized visual renderings of the results appear instantly.

Marra experimented with three northern hardwoods — sugar maple, yellow birch, and American beech — and included more than two dozen of each, along with some control trees with no decay. The researchers analyzed the lower bole — the first two meters or so — of each tree, which is the oldest part and closest to the soil, where most decay-causing fungi would come from.

A dozen or so nails were tapped in a circle around the trunk and connected by cables to the tomograph; a sonic hammer then activated the system to get sound-wave measurements.

For the electric resistance tomography, a second set of nails was hammered between the first, and electrodes — plus and minus — were attached to each.

The various nail areas were painted in different colors to enable the computer renderings to be aligned later with photographs of the cookies after the trees were cut down.

The cookies, about 4 inches thick, which Marra called “the truth,” were taken only from where the measurements were made — the areas with the paint markings.

He analyzed 105 cookies from the 39 trees taken down. In the 11 cases where tomography found no decay, the cookies revealed only one small cavity. In the 32 cases where incipient, or early, decay was detected, the cookies showed one additional cavity. The cookies confirmed the tomography results in 36 cases where active decay was found, though they also revealed eight small cavities the tomography had missed. Tomography correctly identified cavities in the remaining 26 cookies, meaning that it missed a total of 10 cavities among the 105 cookies.

“One thing to sort of mitigate against this failure, if you want to call it that — these were very small cavities,” Marra says of the ones the tomography missed. “So they would have very little impact on a carbon budget.”

Then came the time-consuming process of measuring the actual amount of carbon in each tree. After air-drying the cookies for a year, the wood from 500 drilled holes was sent to a gas chromatography lab at the University of Massachusetts to determine the carbon levels.

The tomography and lab results were then combined to calculate how much carbon was stored in the lower boles and to contrast that with the levels if the trees had been solid wood. Those calculations took until 2017 to complete.

“You’re looking at anywhere from a 19 percent to a 34 percent carbon loss” for an actively decaying tree among those studied, Marra says. “But any place there’s a cavity you’ve lost all of your carbon.”
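As a rough sketch of that contrast, assuming a hypothetical breakdown of a bole's cross-section into sound wood, decayed wood, and cavity (the fractions and the reduced-density figure below are invented for illustration and are not Marra's published values), a decay-adjusted estimate simply discounts the decayed share and zeroes out the cavity share:

```python
# Illustrative sketch: comparing a solid-wood carbon estimate with one
# adjusted for internal decay mapped by tomography.
# All fractions and the density ratio are hypothetical, not study values.

def decay_adjusted_loss(sound_frac: float,
                        decayed_frac: float,
                        cavity_frac: float,
                        decayed_density_ratio: float = 0.6) -> float:
    """Return the fractional carbon loss relative to a solid-wood assumption.

    sound_frac, decayed_frac, cavity_frac: shares of the cross-sectional
    area classified by tomography (they should sum to 1.0).
    decayed_density_ratio: how much carbon-holding wood remains per unit
    volume of decayed wood (cavities hold none).
    """
    remaining = sound_frac + decayed_frac * decayed_density_ratio
    return 1.0 - remaining

# A hypothetical bole that is 70% sound, 25% decayed, and 5% cavity
# loses roughly 15% of the carbon a solid-wood estimate would assign it.
print(round(decay_adjusted_loss(0.70, 0.25, 0.05), 2))
```

With these made-up fractions the loss comes out around 15 percent, below the 19 to 34 percent range reported for the actively decaying trees in the study; the sketch is meant only to show the shape of the calculation, not its magnitudes.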


The upshot of his five years of research, says Marra, is that accurate tomographic readings are possible in just a few minutes. “And what our tomography tells us is the carbon content,” he says.

At the same time, Marra is aware that tomography is not a practical substitute for the Forest Service’s carbon estimate system — which itself is a clunky and labor-intensive slog. But it could provide a valuable way to augment those estimates.

“Those are very, very impressive results,” says Kevin Griffin, a tree physiologist at Columbia University and its Lamont-Doherty Earth Observatory. “They obviously have obtained a lot of precision in the techniques.”

“The results are important,” he adds, “but whether internal tree decay is the single most burning question? Probably not. There’s probably bigger fish to fry before we get there.”

Among them, he says are forest growth rates and overall tree health and age, as well as the impact of harvesting and other kinds of losses, including disease.

A tree’s architecture and height could also play large roles in carbon sequestration, says Reinmann of the City University of New York’s Advanced Science Research Center, as could the makeup of the forest landscape. His own research, for instance, found trees grow faster and have more biomass at the edge of fragmented forest.

“I think they’re making a good point that we’re probably over-estimating” carbon storage levels, says Aaron Weiskittel, director of the University of Maine’s Center for Research on Sustainable Forests.

Even so, Weiskittel and others — including Marra — say the research needs to be scaled up to many more tree types and full forests. For his part, Marra would like to sample forests randomly with many more trees, controlling for factors including species, age, and soil characteristics.

The goal, he says, is to develop a methodology for generating data to provide better carbon estimates for more than three tree types in one small part of the country.

“We need to use tomography to refine models so we’re more accurately assessing the role that forests are playing as sequesterers or climate change mitigators,” Marra says. “We don’t want to be over-estimating the roles that they play.”


Jan Ellen Spiegel is a freelance writer and editor based in Connecticut. Her work appears regularly in numerous local and national publications, including The Connecticut Mirror, InsideClimate News, Yale Climate Connections, and The New York Times.

This article was originally published on Undark. Read the original article.

Cars and their societal impact

There has been a long-held belief that the auto industry has a great deal to answer for in its impact on communities worldwide. Many Western societies enjoyed fairly capable public transport systems that served their communities well from the 1950s through the 1970s. That is, until the auto industry, in partnership with steel, rubber, and other mega-industrial conglomerates, orchestrated a buy-out and subsequent destruction of most networks around the world.

What followed has been an expansive grid of roads and motorways that carries fewer people, in many more polluting vehicles, with far greater environmental consequences than we had in the middle of the last century. Sadly, with ever-expanding infrastructure and a burgeoning new middle class in many populous countries (India and China among them), the environmental impact of the automobile is only beginning…

While this thoroughly researched and detailed article from The Atlantic is geared toward the US experience, it is an insightful analysis of what to expect (or what has already been experienced) in many other locales. More combustion vehicles mean more carbon-based emissions — simple as that. Read the article from The Atlantic — and let us know what YOU think.

Our environment = Our life

Thanks for visiting. CentroProEco provides education and exchange on ecology, sustainability, and the environment to inspire others to preserve, restore and protect our natural world. Working together… we CAN make a difference!

“It always seems impossible until it’s done.” ~ Nelson Mandela


Opinion – AEON: Nuclear power is not the answer in a time of climate change

The Woolsey Fire seen from Topanga Canyon in California. Photo courtesy of Peter Buschmann/USDA/Flickr

In November 2018, the Woolsey Fire scorched nearly 100,000 acres of Los Angeles and Ventura counties, destroying forests, fields and more than 1,500 structures, and forcing the evacuation of nearly 300,000 people over 14 days. It burned so viciously that it seared a scar into the land that’s visible from space. Investigators determined that the Woolsey Fire began at the Santa Susana Field Laboratory, a nuclear research property contaminated by a partial meltdown in 1959 of its failed Sodium Reactor Experiment, as well as rocket tests and regular releases of radiation.

The State of California’s Department of Toxic Substances Control (DTSC) reports that its air, ash and soil tests conducted on the property after the fire show no release of radiation beyond baseline for the contaminated site. But the DTSC report lacks sufficient information, according to the Bulletin of Atomic Scientists. It includes ‘few actual measurements’ of the smoke from the fire, and the data raises alarms. Research on Chernobyl in Ukraine following wildfires in 2015 shows clear release of radiation from the old nuclear power plant, calling into question the quality of DTSC’s tests. What’s more, scientists such as Nikolaos Evangeliou, who studies radiation releases from wildfires at the Norwegian Institute for Air Research, point out that the same hot, dry and windy conditions exacerbating the Woolsey Fire (all related to human-caused global warming) are a precursor to future climate-related radioactive releases.

With our climate-impacted world now highly prone to fires, extreme storms and sea-level rise, nuclear energy is touted as a possible replacement for the burning of fossil fuels for energy – the leading cause of climate change. Nuclear power can demonstrably reduce carbon dioxide emissions. Yet scientific evidence and recent catastrophes call into question whether nuclear power could function safely in our warming world. Wild weather, fires, rising sea levels, earthquakes and warming water temperatures all increase the risk of nuclear accidents, while the lack of safe, long-term storage for radioactive waste remains a persistent danger.

The Santa Susana Field Laboratory property has had a long history of contaminated soil and groundwater. Indeed, a 2006 advisory panel compiled a report suggesting that workers at the lab, as well as residents living nearby, had unusually high exposure to radiation and industrial chemicals that are linked to an increased incidence of some cancers. Discovery of the pollution prompted California’s DTSC in 2010 to order a cleanup of the site by its current owner – Boeing – with assistance from the US Department of Energy and NASA. But the required cleanup has been hampered by Boeing’s legal fight to perform a less rigorous cleaning.

Like the Santa Susana Field Lab, Chernobyl remains largely unremediated since its meltdown in 1986. With each passing year, dead plant material accumulates and temperatures rise, making it especially prone to fires in the era of climate change. Radiation releases from contaminated soils and forests can be carried thousands of kilometres away to human population centres, according to Evangeliou.

Kate Brown, a historian at the Massachusetts Institute of Technology and the author of Manual for Survival: A Chernobyl Guide to the Future (2019), and Tim Mousseau, an evolutionary biologist at the University of South Carolina, also have grave concerns about forest fires. ‘Records show that there have been fires in the Chernobyl zone that raised the radiation levels by seven to 10 times since 1990,’ Brown says. Further north, melting glaciers contain ‘radioactive fallout from global nuclear testing and nuclear accidents at levels 10 times higher than elsewhere’. As ice melts, radioactive runoff flows into the ocean, is absorbed into the atmosphere, and falls as acid rain. ‘With fires and melting ice, we are basically paying back a debt of radioactive debris incurred during the frenzied production of nuclear byproducts during the 20th century,’ Brown concludes. 

Flooding is another symptom of our warming world that could lead to nuclear disaster. Many nuclear plants are built on coastlines where seawater is easily used as a coolant. Sea-level rise, shoreline erosion, coastal storms and heat waves – all potentially catastrophic phenomena associated with climate change – are expected to get more frequent as the Earth continues to warm, threatening greater damage to coastal nuclear power plants. ‘Mere absence of greenhouse gas emissions is not sufficient to assess nuclear power as a mitigation for climate change,’ conclude Natalie Kopytko and John Perkins in their paper ‘Climate Change, Nuclear Power, and the Adaptation-Mitigation Dilemma’ (2011) in Energy Policy.

Proponents of nuclear power say that the reactors’ relative reliability and capacity make this a much clearer choice than other non-fossil-fuel sources of energy, such as wind and solar, which are sometimes brought offline by fluctuations in natural resource availability. Yet no one denies that older nuclear plants, with an aged infrastructure often surpassing expected lifetimes, are extremely inefficient and run a higher risk of disaster.

‘The primary source of nuclear power going forward will be the current nuclear fleet of old plants,’ said Joseph Lassiter, an energy expert and nuclear proponent who is retired from Harvard University. But ‘even where public support exists for [building new] nuclear plants, it remains to be seen if these new-build nuclear plants will make a significant contribution to fossil-emissions reductions given the cost and schedule overruns that have plagued the industry.’ 

Lassiter and several other energy experts advocate for the new, Generation IV nuclear power plants that are supposedly designed to deliver high levels of nuclear power at the lowest cost and with the lowest safety risks. But other experts say that the benefits even here remain unclear. The biggest critique of the Generation IV nuclear reactors is that they are in the design phase, and we don’t have time to wait for their implementation. Climate abatement action is needed immediately.

‘New nuclear power seemingly represents an opportunity for solving global warming, air pollution, and energy security,’ says Mark Jacobson, director of Stanford University’s Atmosphere and Energy Programme. But it makes no economic or energy sense. ‘Every dollar spent on nuclear results in one-fifth the energy one would gain with wind or solar [at the same cost], and nuclear energy takes five to 17 years longer before it becomes available. As such, it is impossible for nuclear to help with climate goals of reducing 80 per cent of emissions by 2030. Also, while we’re waiting around for nuclear, coal, gas and oil are being burned and polluting the air. In addition, nuclear has energy security risks other technologies don’t have: weapons proliferation, meltdown, waste and uranium-worker lung-cancer risks.’

Around the world, 31 countries have nuclear power plants that are currently online, according to the International Atomic Energy Agency. By contrast, four countries have made moves to phase out nuclear power following the 2011 Fukushima disaster, and 15 countries have remained opposed and have no functional power plants.

With almost all countries’ carbon dioxide emissions increasing – and China, India and the US leading the pack – the small Scandinavian country of Denmark is an outlier. Its carbon dioxide emissions are decreasing despite it not producing any nuclear power. Denmark does import some nuclear power produced by its neighbours Sweden and Germany, but in February, the country’s most Left-leaning political party, Enhedslisten, published a new climate plan that outlines a path for the country to start relying on its own 100 per cent renewable, non-nuclear energy for power and heat production by 2030. The plan would require investments in renewables such as solar and wind, a smart grid and electric vehicles that double as mobile batteries and can recharge the grid during peak hours.

Gregory Jaczko, former chairman of the US Nuclear Regulatory Commission and the author of Confessions of a Rogue Nuclear Regulator (2019), believes the technology is no longer a viable method for dealing with climate change: ‘It is dangerous, costly and unreliable, and abandoning it will not bring on a climate crisis.’

Heidi Hutner & Erica Cirino

This article was originally published at Aeon and has been republished under Creative Commons.

How do you feel about nuclear power? Please share your comments below…