At the rate things are going, the Earth in the coming decades could cease to be a “safe operating space” for human beings. That is the conclusion of a new paper published Thursday in the journal Science by 18 researchers trying to gauge the breaking points in the natural world.
The paper contends that we have already crossed four “planetary boundaries.” They are the extinction rate; deforestation; the level of carbon dioxide in the atmosphere; and the flow of nitrogen and phosphorus (used on land as fertilizer) into the ocean.
“What the science has shown is that human activities — economic growth, technology, consumption — are destabilizing the global environment,” said Will Steffen, who holds appointments at the Australian National University and the Stockholm Resilience Center and is the lead author of the paper.
These are not future problems, but rather urgent matters, according to Steffen, who said that the economic boom since 1950 and the globalized economy have accelerated the transgression of the boundaries. No one knows exactly when push will come to shove, but he said the possible destabilization of the “Earth System” as a whole could occur in a time frame of “decades out to a century.”
The researchers focused on nine separate planetary boundaries first identified by scientists in a 2009 paper. These boundaries set theoretical limits on changes to the environment, and include ozone depletion, freshwater use, ocean acidification, atmospheric aerosol pollution and the introduction of exotic chemicals and modified organisms.
Beyond each planetary boundary is a “zone of uncertainty.” This zone is meant to acknowledge the inherent uncertainties in the calculations, and to offer decision-makers a bit of a buffer, so that they can potentially take action before it’s too late to make a difference. Beyond that zone of uncertainty is the unknown — planetary conditions unfamiliar to us.
“The boundary is not like the edge of the cliff,” said Ray Pierrehumbert, an expert on Earth systems at the University of Chicago. “They’re a little bit more like danger warnings, like high-temperature gauges on your car.”
On November 2nd the Intergovernmental Panel on Climate Change (IPCC), which represents mainstream scientific opinion, said that it was extremely likely that climate change is the product of human activity. “Extremely likely,” in IPCC-speak, means a probability of over 95%. The claim forms part of its fifth assessment on the state of the global climate. In its first assessment, in 1990, the IPCC had said that “the observed increase [in air temperatures] could be largely due to natural variability.” Why have climate scientists become so much more certain that climate change is man-made, not natural?
Many factors influence the climate but perhaps the single most important is carbon dioxide (CO₂). CO₂ absorbs infra-red radiation far more readily than nitrogen and oxygen—the main constituents of the atmosphere—so the more CO₂ in the air, the more the atmosphere will tend to warm up. Scientists attribute climate change to human activity mainly because people have been responsible for large increases in CO₂. At the start of the industrial revolution, in about 1800, there were 280 parts per million (ppm) of CO₂ in the atmosphere. That had been the level for most of human history. This year, however, concentrations exceeded 400 ppm, the first time they had reached that level in a million years.
Most of the increase has been caused by people burning fossil fuels. In the United States, for example, 38% of the CO₂ produced in 2012 came from generating electricity and 32% came from vehicle emissions (the rest came from industrial processes, buildings and other smaller sources). People also produce CO₂ when they cut down forests for farmland and pasture.
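The figures above lend themselves to a quick back-of-envelope check. A minimal sketch, using only the numbers quoted in the text:

```python
# Back-of-envelope check on the CO2 figures quoted above.

preindustrial_ppm = 280   # atmospheric CO2 around 1800, per the article
current_ppm = 400         # level exceeded this year, per the article

rise_ppm = current_ppm - preindustrial_ppm
pct_increase = rise_ppm / preindustrial_ppm * 100
print(f"CO2 is up {rise_ppm} ppm, a {pct_increase:.0f}% increase since ~1800")

# US CO2 sources in 2012, per the article
electricity_share = 0.38
vehicles_share = 0.32
other_share = 1 - electricity_share - vehicles_share
print(f"Everything else (industry, buildings, etc.): {other_share:.0%}")
```

In other words, CO₂ is up about 43 percent since pre-industrial times, and just two categories, electricity and vehicles, account for 70 percent of US output.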
The Earth is locked on an “irreversible” course of climatic disruption from the buildup of greenhouse gases in the atmosphere, and the impacts will only worsen unless nations agree to dramatic cuts in pollution, an international panel of climate scientists warned Sunday.
The planet faces a future of extreme weather, rising sea levels and melting polar ice from soaring levels of carbon dioxide and other gases, the U.N. panel said. Only an unprecedented global effort to slash emissions within a relatively short time period will prevent temperatures from crossing a threshold that scientists say could trigger far more dangerous disruptions, the panel warned.
“Continued emission of greenhouse gases will cause further warming and long-lasting changes in all components of the climate system, increasing the likelihood of severe, pervasive and irreversible impacts,” concluded the report by the United Nations’ Intergovernmental Panel on Climate Change (IPCC), which draws on contributions from thousands of scientists from around the world.
The report said some impacts of climate change will “continue for centuries,” even if all emissions from fossil-fuel burning were to stop. The question facing governments is whether they can act to slow warming to a pace at which humans and natural ecosystems can adapt, or risk “abrupt and irreversible changes” as the atmosphere and oceans absorb ever-greater amounts of thermal energy within a blanket of heat-trapping gases, according to scientists who contributed to the report.
Meanwhile in Canada, we don’t care about climate change, only our own economy.
Canada’s hopes of securing an outlet for its landlocked oil wealth and pulling an end run around the eternally deadlocked Keystone XL project took a big step forward Thursday with the release of formal plans to build a U.S. $11 billion pipeline to the Atlantic.
TransCanada, the biggest Canadian pipeline company, submitted its application to Canadian energy regulators for a nearly 3,000-mile-long, million-barrel-a-day pipe running from oil-rich western Canada to refineries and shipping terminals in the east. The so-called Energy East Pipeline Project, which TransCanada officials hope could be in operation as soon as 2018, would provide an export outlet for huge volumes of current and future oil production that right now has no easy way to get to market.
The project wouldn’t replace the Keystone XL pipeline — Canada’s other high-profile, multibillion-dollar oil-transport project, which has been awaiting U.S. approval for years — but it could give Republican critics of U.S. President Barack Obama’s administration fresh fodder ahead of the midterm elections. Republicans have long argued that the White House’s refusal to sign off on the Keystone project would cost the United States tens of thousands of jobs. The Obama administration has finished reviewing the environmental merits of Keystone, but pushed back any decision until later this year or early 2015.
If the new Canadian route gets approved in 2016 by Canada’s National Energy Board, as TransCanada expects, it would give the eastern provinces a source of domestic oil — removing the need for some 700,000 barrels a day of oil imports — and would give producers in Alberta and Saskatchewan a direct route to big refineries that could turn the sludgy tar sands into valuable products such as diesel, gasoline, and jet fuel.
On any given day, Johannes van Bergen, director of the municipal utility Stadtwerke Schwäbisch Hall in southwestern Germany, conducts his team’s array of gas, heat, and electricity sources to meet the energy needs of several hundred thousand Swabians in the region, as well as more than 90,000 customers elsewhere in Germany. And every day — in fact, every hour — that energy mix is in flux.
Technicians at the town’s smart-grid center monitor and manage the utility’s roughly 3,000 regional energy suppliers: several thousand solar photovoltaic (PV) installations, two wind parks, one gas-and-steam power station, six small hydroelectric works, three biomass (wood-pellet) plants, six biogas plants, and 48 combined heat and power plants, as well as other conventional and renewable energy suppliers outside the municipality.
The population that this ballet of coordinated energy sources serves is admittedly modest, but it’s here that the future of Germany’s energy industry is being tested in full — and proven.
Which is of course a model that we could use here, but for whatever reason neither the province nor the country is willing to experiment.
Their output, and increasingly that of conventional plants too, is distributed through a tightly knit, cross-border smart grid. The composition of supply changes from minute to minute depending on weather, demand, and other factors from one corner of the country to the other. Increasingly, electricity is generated in and traded between localities, and even across the country (or countries), via intelligent networks much like the one in Schwäbisch Hall and other places in Germany.
No one predicted a locally driven, citizen-led energy boom on this scale when the Energiewende began. Just four years ago, nearly everybody involved in the Energiewende thought that big-ticket projects like the enormous offshore wind farms planned for Germany’s northern seas and Desertec, the mega-project to import solar energy across the Mediterranean from sprawling concentrated solar power arrays in the Middle East and Northern Africa, would be integral to Germany going renewable.
These projects, however, have flopped spectacularly.
Offshore wind has proven extremely pricey and technologically much trickier than originally assumed, which has led to billions in cost overruns and years-long delays. Germany’s seven operational offshore parks constitute a tiny fraction — just 0.6 percent — of the country’s renewably generated electricity, compared to onshore wind’s 34 percent. The offshore industry claims there’s smooth sailing for offshore wind just around the corner, but it’s been saying that for years.
There are several potential explanations for what’s going on here. The most likely is that some combination of increasingly infrequent summer snowstorms, wind-blown dust, microbial activity, and forest fire soot led to this year’s exceptionally dark ice. A more ominous possibility is that what we’re seeing is the start of a cascading feedback loop tied to global warming.
Box mentions this summer’s mysterious Siberian holes and offshore methane bubbles as evidence that the Arctic can quickly change in unpredictable ways.
This year, Greenland’s ice sheet was the darkest Box (or anyone else) has ever measured. Box gives the stunning stats: “In 2014 the ice sheet is precisely 5.6 percent darker, producing an additional absorption of energy equivalent with roughly twice the US annual electricity consumption.”
Perhaps coincidentally, 2014 will also be the year with the highest number of forest fires ever measured in the Arctic.
Box ran these numbers exclusively for Slate, and what he found shocked him. Since comprehensive satellite measurements began in 2000, never before have Arctic wildfires been as powerful as this year. In fact, over the last two or three years, Box calculated that Arctic fires have been burning at a rate that’s double that of just a decade ago. Box felt this finding was so important that he didn’t want to wait for peer review, and instead decided to publish first on Slate. He’s planning on submitting these and other recent findings to a formal scientific journal later this year.
From the CBC
A new study by researchers at Cornell University, the University of Arizona, and the US Geological Survey looked at the deep historical record (tree rings, etc.) and the latest climate change models to estimate the likelihood of major droughts in the Southwest over the next century. The results are as soothing as a thick wool sweater on a mid-summer desert hike.
The researchers concluded that the odds of a decade-long drought are “at least 80 percent.” The chances of a “mega-drought,” one lasting 35 years or more, stand at somewhere between 20 percent and 50 percent, depending on how severe climate change turns out to be. And the prospect of an “unprecedented 50-year megadrought”—one “worse than anything seen during the last 2000 years”—checks in at a non-trivial 5 percent to 10 percent.
It gets worse.
This (paradoxically) chilling assessment comes on the heels of another study (study; my summary), this one released in early August by University of California-Irvine and NASA researchers, on the Colorado River, the lifeblood of a vast chunk of the Southwest. As many as 40 million people rely on the Colorado for drinking water, including residents of Las Vegas, Los Angeles, Phoenix, Tucson, and San Diego. It also irrigates the highly productive winter farms of California’s Imperial Valley and Arizona’s Yuma County, which produce upwards of 80 percent of the nation’s winter vegetables.
The researchers analyzed satellite measurements of the Earth’s mass and found that the region’s aquifers had undergone a much-larger-than-expected drawdown over the past decade. Responding to drought-reduced flows from the Colorado River, the region’s farms and municipalities drilled wells and tapped almost 53 million acre-feet of underground water between December 2004 and November 2013—equal to about 1.5 full Lake Meads, drained off in just nine years, a rate the study’s lead researcher, Jay Famiglietti, calls “alarming.”
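A quick sanity check on the arithmetic here: the comparison to 1.5 full Lake Meads implies a full-pool capacity of roughly 35 million acre-feet, which is inferred from the article’s own numbers rather than taken from the study. A minimal sketch, using the standard acre-foot conversion:

```python
# Rough arithmetic behind the groundwater-drawdown comparison.

ACRE_FOOT_M3 = 1233.48      # cubic metres per acre-foot (standard conversion)
drawdown_af = 53_000_000    # acre-feet tapped, Dec 2004 - Nov 2013, per the article
years = 9

# The article equates the drawdown to ~1.5 Lake Meads, implying a
# full-pool capacity of about 35 million acre-feet (inferred, not sourced).
implied_lake_mead_af = drawdown_af / 1.5

per_year_maf = drawdown_af / years / 1e6
total_km3 = drawdown_af * ACRE_FOOT_M3 / 1e9

print(f"Implied Lake Mead capacity: {implied_lake_mead_af / 1e6:.1f} million acre-feet")
print(f"Drawdown rate: ~{per_year_maf:.1f} million acre-feet per year")
print(f"Total volume: ~{total_km3:.0f} cubic kilometres")
```

Almost six million acre-feet a year, or roughly 65 cubic kilometres in total — the pace alone makes “alarming” seem understated.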
Considering how much of the Colorado River Basin, which encompasses swaths of Utah, Colorado, California, Arizona, and New Mexico, is desert, it’s probably not wise to rapidly drain its aquifers, since there’s little prospect that they’ll refill anytime soon. And when you consider that the region faces high odds of a coming mega-drought, the results are even more frightening. (Just before Labor Day, over fierce opposition from farm interests, the California legislature passed legislation that would regulate groundwater pumping—something that has never been done on a statewide basis in California before. Gov. Jerry Brown is expected to sign it into law.)