Let’s assume that the Obama administration and Congress get their act together this year and make good on their pledge to enact meaningful climate legislation by establishing the nation’s first cap-and-trade system.
Let’s further assume, for the sake of argument, that the administration, working with its international partners, succeeds in drafting a robust successor to the Kyoto Protocol at the climate talks in Copenhagen later this year.
If we accept that the U.S. climate bill, known as the American Clean Energy and Security Act (ACES), will accomplish its goal of cutting emissions to 80 percent below 2005 levels by 2050 (nothing to sneeze at, considering that a substantial fraction of policymakers, including some Democrats, vehemently opposes the measure), then the question becomes: will it be enough to prevent the worst of climate change?
Having spent the better part of the last two decades predicting the severity of unconstrained climate change, many researchers are now shifting their focus to the aftermath of emission mitigation. The limited consensus so far has been sobering: even if we were to significantly ratchet down our current emission levels by midcentury, a partial recovery to safe levels, let alone a full one, could take many decades, if not centuries.
The scenarios become especially grim if we overshoot certain “dangerous” warming thresholds: around 1.7°C above pre-industrial levels, according to James Hansen, or the less stringent 2°C above pre-industrial levels adopted by the European Union.
Under certain worst-case scenarios, some researchers have predicted that we would need to keep emissions at near-zero, or even negative, levels to stabilize near-surface temperature—hardly realistic goals. Until now, however, few studies have attempted to examine the underlying reasons for the sluggish recovery rates.
In a new study detailed in the journal Environmental Research Letters, Jason A. Lowe of the Met Office Hadley Centre at the University of Reading and his colleagues did just that, using two climate models to scrutinize the accuracy of previous predictions and assess their relevance in a more policy-centric context: HadCM3LC, a complex coupled climate-carbon-cycle general circulation model (GCM) developed by the Hadley Centre, and MAGICC, a simple climate model.
They examined four scenarios, all of which followed identical historical emission estimates up to 2000 and SRES A2 emissions thereafter. In the first three, CO2 emissions were cut to zero in 2012, 2050, and 2100, respectively, and held there for the following century. The fourth scenario, which was meant to better approximate real-world conditions, also included forcing from other GHGs and pollutants, such as sulfate aerosol particles.
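For readers who like to see the mechanics, here is a minimal sketch, in Python, of how such pathways can be constructed. This is purely illustrative and not the authors’ code; the A2 anchor values are rough approximations of published emission figures.

```python
import numpy as np

# Illustrative sketch only (not the authors' code): build the study's first
# three pathways as annual CO2 emissions in GtC/yr. The SRES A2 anchor values
# below are rough approximations, used purely for demonstration.
years = np.arange(2000, 2301)
A2_YEARS = [2000, 2050, 2100]
A2_EMISSIONS = [7.0, 17.0, 29.0]  # approximate A2 fossil-fuel CO2 emissions

def pathway(cutoff_year):
    """Follow the A2 trajectory, then cut CO2 emissions to zero at the cutoff."""
    e = np.interp(years, A2_YEARS, A2_EMISSIONS)  # holds 29.0 beyond 2100
    e[years >= cutoff_year] = 0.0
    return e

scenarios = {year: pathway(year) for year in (2012, 2050, 2100)}
```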
Using only the CO2 component of the SRES A2 emissions scenario to force the complex GCM until the end of the 21st century, they found that atmospheric CO2 concentrations would likely exceed 1000 ppm in 2100. Cutting emissions to zero in 2012 and 2050 resulted in concentrations exceeding 404 ppm and 556 ppm, respectively. In all cases, the model simulated extremely low rates of decline in atmospheric CO2 once emissions ceased.
The predicted temperature rise was considerable: over 2°C by 2050, and even with emissions zeroed beginning that year, temperatures were projected to fall by only around 0.2°C per century thereafter, suggesting they could remain dangerously high for a long time. Furthermore, the 2050 and 2100 scenarios, by drastically altering precipitation levels and global temperatures, resulted in the terrestrial biosphere becoming a net source of carbon, emitting up to 50 GtC (gigatons of carbon) and 76 GtC, respectively, over the ensuing century. (The oceans, however, could potentially compensate by increasing their uptake.)
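Why is the recovery so sluggish? A back-of-the-envelope model helps build intuition. The sketch below is deliberately crude, nothing like HadCM3LC: it treats each year’s emissions as a pulse whose airborne fraction decays on a mix of fast and multi-century timescales (coefficients loosely based on published Bern carbon-cycle impulse-response fits), driven by a rough A2-like emissions ramp that is zeroed in 2050.

```python
import numpy as np

# Crude toy model (nothing like HadCM3LC or MAGICC) of why CO2 concentrations
# fall so slowly once emissions stop: a sizable fraction of each year's
# emissions effectively stays airborne for centuries. The decay coefficients
# are loosely based on published Bern carbon-cycle impulse-response fits; the
# emissions numbers are rough A2-like guesses.
years = np.arange(2000, 2301)
emissions = np.interp(years, [2000, 2050], [7.0, 17.0])  # GtC/yr, rough ramp
emissions[years >= 2050] = 0.0                           # zero emissions in 2050

def airborne_fraction(t):
    """Fraction of a CO2 pulse still airborne t years after release."""
    return (0.217 + 0.259 * np.exp(-t / 172.9)
            + 0.338 * np.exp(-t / 18.51) + 0.186 * np.exp(-t / 1.186))

GTC_PER_PPM = 2.12  # roughly 2.12 GtC of carbon per ppm of atmospheric CO2
co2 = np.empty(years.size)
for i in range(years.size):
    ages = np.arange(i, -1, -1)  # age of each past year's pulse
    co2[i] = 370.0 + emissions[: i + 1].dot(airborne_fraction(ages)) / GTC_PER_PPM

print(f"peak: {co2.max():.0f} ppm; year 2300: {co2[-1]:.0f} ppm")
```

Even this toy version shows concentrations sagging only slowly after emissions stop, because the long-lived fraction of past emissions lingers for centuries; stack the ocean’s thermal inertia on top, and the slow temperature recovery in the full models becomes less surprising.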
In the fourth scenario, with emissions of multiple GHGs peaking in 2015 before settling into a long-term reduction rate of 3 percent per year, the authors found a 55 percent chance that temperatures would overshoot the 2°C threshold. Worse, there was a 30 percent chance that temperatures would remain above this dangerous threshold for at least a century, and a 10 percent chance that they would exceed it for up to three centuries. And here’s the kicker:
“This particular scenario has a reduction in greenhouse gas emissions approaching 50% of the 1990 values by 2050, which we note is similar to the G8 statement in 2008 to consider ‘the goal of achieving at least 50% reduction of global emissions by 2050’.”
In other words, even adopting the emission targets set by the global community (which could be further watered down) may not be enough to prevent temperatures from staying dangerously high.
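A note on where probabilities like these come from: they are typically generated by running a simple climate model many times, with each run drawing its own values for uncertain parameters (climate sensitivity above all), and then counting how many runs cross the threshold and for how long. The sketch below illustrates that bookkeeping with invented numbers; it is not the authors’ ensemble, and its instantaneous-equilibrium “temperature” ignores ocean lag entirely.

```python
import numpy as np

# Generic sketch of how threshold-exceedance probabilities are estimated.
# All numbers are invented for illustration; this is not the authors' ensemble.
rng = np.random.default_rng(0)
N = 10_000

# Each ensemble member draws its own climate sensitivity (degC per CO2 doubling).
sensitivity = rng.lognormal(mean=np.log(2.8), sigma=0.35, size=N)

# Toy forcing path (W/m^2): rises to a peak around 2060, then declines only
# slowly, since CO2 concentrations persist even as emissions fall.
years = np.arange(2000, 2301)
ramp = 2.0 + 0.8 * np.clip(years - 2000, 0, 60) / 60.0
decay = 0.998 ** np.maximum(years - 2060, 0)
forcing = ramp * decay

F_2X = 3.7  # forcing from a doubling of CO2, W/m^2
temps = sensitivity[:, None] * forcing[None, :] / F_2X  # equilibrium response

above = temps > 2.0
print(f"P(ever above 2 degC):         {above.any(axis=1).mean():.0%}")
print(f"P(above 2 degC for 100+ yrs): {(above.sum(axis=1) >= 100).mean():.0%}")
print(f"P(above 2 degC for 300+ yrs): {(above.sum(axis=1) >= 300).mean():.0%}")
```

The point here is the method, not the numbers: such probabilities express our uncertainty about the climate system’s parameters, not randomness in the climate itself.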
Putting aside the usual list of caveats, this study should worry anyone who believes that passing a climate bill, even an ambitious one, would solve all of our problems. The basic message is that climate change is here to stay and, though governments need to do everything in their power to forestall the worst, we will have to live with its effects for many decades to come. As such, Lowe and his colleagues argue, the research community should devote more attention to the resilience of what they call “Earth system components,” such as the Greenland ice sheet and the thermohaline circulation.
John Holdren, the director of the White House Office of Science & Technology Policy, who oversaw the publication of the U.S. Global Change Research Program’s new 196-page report, and Energy Secretary Steven Chu are well aware of the risks of complacency, and they will presumably do their utmost to ensure that the administration keeps its eyes on the climate ball.