December 6, 2017
The climate models that project greater amounts of warming this century are the ones that best align with observations of the current climate, according to a new paper from Carnegie’s Patrick Brown and Ken Caldeira published by Nature. Their findings suggest that the models used by the Intergovernmental Panel on Climate Change, on average, may be underestimating future warming.
Climate model simulations are used to predict how much warming should be expected for any given increase in the atmospheric concentration of carbon dioxide and other greenhouse gases.
“There are dozens of prominent global climate models and they all project different amounts of global warming for a given change in greenhouse gas concentrations, primarily because there is not a consensus on how to best model some key aspects of the climate system,” Brown explained.
Raw climate model results for a business-as-usual scenario indicate that we can expect global temperatures to increase anywhere between 5.8 and 10.6 degrees Fahrenheit (3.2 to 5.9 degrees Celsius) over preindustrial levels by the end of the century — a difference of about a factor of two between the most- and least-severe projections.
Brown and Caldeira set out to determine whether the upper or lower end of this range is more likely to prove accurate. Their strategy relied on the idea that the models that are going to be the most skillful in their projections of future warming should also be the most skillful in other contexts, such as simulating the recent past. Brown and Caldeira’s study eliminates the lower end of this range, finding that the most likely warming is about 0.9 degrees Fahrenheit (0.5 degrees Celsius) greater than what the raw model results suggest.
The researchers focused on comparing model projections and observations of the spatial and seasonal patterns of how energy flows from Earth to space. Interestingly, the models that best simulate the recent past of these energy exchanges between the planet and its surroundings tend to project greater-than-average warming in the future.
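The study's actual statistics are more involved than this, but the basic logic of letting observational skill inform projections can be sketched numerically. Everything below is illustrative, not the authors' method or data: the per-model errors and warming values are invented, and simple inverse-error weighting stands in for the paper's regression-based approach.

```python
# Illustrative sketch (NOT the authors' actual method or data): weight each
# model's warming projection by how well it reproduces an observed quantity,
# so that better-matching models count for more. All numbers are made up.
import numpy as np

# Hypothetical per-model RMS error against observed top-of-atmosphere
# energy fluxes (W/m^2), and each model's projected end-of-century warming (C).
flux_rmse = np.array([2.5, 3.0, 4.0, 5.5, 7.0, 9.0])
warming   = np.array([5.4, 5.1, 4.8, 4.3, 3.9, 3.5])

# Inverse-square-error weights, normalised to sum to one.
weights = 1.0 / flux_rmse**2
weights /= weights.sum()

raw_mean = warming.mean()             # unweighted multi-model mean
constrained_mean = weights @ warming  # observation-weighted mean

print(f"raw mean: {raw_mean:.2f} C, skill-weighted mean: {constrained_mean:.2f} C")
```

Because the invented numbers pair low flux error with high projected warming — the qualitative pattern the study reports — the weighted mean comes out above the raw mean.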
“Our results suggest that it doesn’t make sense to dismiss the most-severe global warming projections based on the fact that climate models are imperfect in their simulation of the current climate,” Brown said. “On the contrary, if anything, we are showing that model shortcomings can be used to dismiss the least-severe projections.”
The uncertainty in the range of future warming is mostly due to differences in how models simulate changes in clouds with global warming. Some models suggest that the cooling effect caused by clouds reflecting the Sun’s energy back to space could increase in the future while other models suggest that this cooling effect might decrease.
“The models that are best able to recreate current conditions are the ones that simulate a reduction in cloud cooling in the future and thus these are the models that predict the greatest future warming,” Brown explained.
“It makes sense that the models that do the best job at simulating today’s observations might be the models with the most reliable predictions,” Caldeira added. “Our study indicates that if emissions follow a commonly used business-as-usual scenario, there is a 93 percent chance that global warming will exceed 4 degrees Celsius (7.2 degrees Fahrenheit) by the end of this century. Previous studies had put this likelihood at 62 percent.”
Materials provided by Carnegie Institution for Science.
- Patrick T. Brown, Ken Caldeira. Greater future global warming inferred from Earth’s recent energy budget. Nature, 2017; 552 (7683): 45 DOI: 10.1038/nature24672
Nature, 7 December 2017, vol. 552, p. 45. doi:10.1038/nature24672
Greater future global warming inferred from Earth’s recent energy budget
Many relevant impacts of global climate change are expected to scale with the change in global mean surface air temperature (GMSAT) and thus there is great scientific and societal interest in projections of future warming. The primary tools used to project GMSAT over the remainder of the twenty-first century are coupled atmosphere–ocean–land global climate models (hereafter referred to as models) but there is substantial uncertainty inherent in model projections.
CarbonBrief.org, Jan 2018
Scientists have presented a new, narrower estimate of the “climate sensitivity” – a measure of how much the climate could warm in response to the release of greenhouse gases.
The latest assessment report from the Intergovernmental Panel on Climate Change (IPCC) estimates that climate sensitivity has a “likely” range of 1.5 to 4.5C.
The new study, published in Nature, refines this estimate to 2.8C, with a corresponding range of 2.2 to 3.4C. If correct, the new estimates could reduce the uncertainty surrounding climate sensitivity by 60%.
The narrower range suggests that global temperature rise is “going to shoot over 1.5C” above pre-industrial levels, the lead author tells Carbon Brief, but “we might be able to avoid 2C”. Meeting either limit will likely require negative emissions technologies that can remove CO2 from the atmosphere, he says.
The new estimate is another “brick in the wall” of scientists’ understanding of climate sensitivity, another scientist tells Carbon Brief, and “the best-informed views will be reached by multiple lines of evidence”.
Climate sensitivity is the amount of warming that can be expected in response to the concentration of CO2 in the atmosphere reaching double the level observed in pre-industrial times.
The research makes a new estimate of the “equilibrium” climate sensitivity (ECS) – that is, the amount of warming expected to occur once the full impact of the extra greenhouse gases released has played out. This measure includes the impact of warming on long-term climate feedback loops, which can take decades, or even centuries, to materialise.
The value of ECS is one of the big climate change questions that scientists are still trying to address.
It is important because understanding how sensitive the Earth is to CO2 could help us to estimate how much the planet could warm in response to greenhouse gases, explains Prof Peter Cox, lead author of the new paper and a climate scientist at the University of Exeter. He tells Carbon Brief:
“The issue about the equilibrium climate sensitivity is the range that has been given in successive IPCC reports – 1.5 to 4C – is a range that is essentially ‘climate change we could probably adapt to’ at the 1.5C end and ‘climate change we probably can’t adapt to’ at the 4C end. So that uncertainty has a huge impact on impeding the focused effort to mitigate climate change and adapt.”
The new findings indicate that the value of ECS could be close to 2.8C, says Cox:
“We get a value with a ‘likely’ range, which means there’s a 66% probability that it’s in that range of 2.2 to 3.4C with a central estimate of 2.8C. That’s not so far from the central estimate of the IPCC which is 3C, but the range is much reduced, from 1.5 to 4C, to 2.2 to 3.4C. What that means is we can rule out very low climate sensitivities and we can rule out very high climate sensitivities.”
Capturing a signal
There are a number of techniques that scientists can use to work out what ECS could be.
One method is to look at how Earth has responded to natural greenhouse gas changes in its geological past to try to work out how it might respond to future global warming.
Another common technique involves using global climate models to estimate the theoretical effect of doubling the amount of CO2 in the atmosphere.
A third method used by scientists involves matching global surface temperatures with the global warming trend over the past century to try to work out sensitivity from how the planet is responding. (This is what is known as the “energy budget model” approach.)
The new study uses a similar method to the energy budget model approach. However, instead of matching the global temperature record to global warming, the new research attempts to match temperature records to natural, long-term fluctuations in temperature.
Looking at natural variability rather than the warming trend allowed the scientists to exclude a range of uncertainties associated with human-caused climate change, Cox explains:
“Normally the way this [research] is done is by looking at the historical record warming, which makes sense. We’ve seen 1C of warming, roughly speaking, and so you may think that must tell you how sensitive the climate is. But it doesn’t. The main reason it doesn’t is that we don’t know how much energy or heat we’ve put in the system in terms of radiative forcing – greenhouse gases.”
To understand how historical temperature fluctuations have changed over the past century, the researchers first removed the global warming trend from a set of observational temperature data.
They then compared this data to results from a series of 22 global climate models. Some models had lower climate sensitivity, while some models had higher climate sensitivity.
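The detrending step described above can be sketched in a few lines. This is a simplified illustration with made-up data, not the observational record or the exact procedure the study used: a plain linear fit stands in for whatever detrending the authors applied.

```python
# Minimal sketch of detrending (hypothetical data, not the study's records):
# remove the long-term warming trend so only natural fluctuations remain.
import numpy as np

years = np.arange(1880, 2017)
rng = np.random.default_rng(1)

# Made-up "observed" series: a slow warming trend plus natural variability.
temps = 0.008 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

# Fit and subtract a linear trend; the residuals are the natural fluctuations.
trend = np.polyval(np.polyfit(years, temps, deg=1), years)
fluctuations = temps - trend

print(f"std of detrended fluctuations: {fluctuations.std():.3f} C")
```

The residual series has zero mean by construction and a smaller spread than the raw series, since the trend accounts for much of the total variation; it is this residual variability that gets compared against the models.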
The results are shown on the chart below. On the chart, black dots show natural fluctuations in temperature from 1940 to 2020. Each line represents the results from one model, with magenta lines showing results from higher sensitivity models and green showing the results from models with lower climate sensitivity.
The chart indicates that higher sensitivity models generally predict a higher level of variability than has been observed over the past 50 years, while lower sensitivity models either closely match the observed variability or estimate a lower amount of variability.
Together, these results allowed the researchers to produce their narrower range.
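The “emergent constraint” logic behind that narrower range can be illustrated with a toy regression: across models, plot the variability metric against each model's ECS, fit a line, and read the line off at the observed metric value. The numbers below are contrived — deliberately linear, and chosen so the answer lands near the study's 2.8C — purely to show the mechanics, not to reproduce the paper's analysis.

```python
# Toy emergent-constraint regression (illustrative numbers only): across
# models, a variability metric correlates with ECS; evaluating the fitted
# line at the observed metric value yields a constrained ECS estimate.
import numpy as np

# Hypothetical per-model (variability metric, ECS) pairs, contrived to be
# exactly linear so the example is easy to follow.
metric = np.array([0.10, 0.12, 0.13, 0.15, 0.17, 0.20, 0.22])
ecs    = np.array([2.0,  2.4,  2.6,  3.0,  3.4,  4.0,  4.4])

slope, intercept = np.polyfit(metric, ecs, deg=1)

observed_metric = 0.14  # made-up "observed" value of the metric
constrained_ecs = slope * observed_metric + intercept
print(f"constrained ECS: {constrained_ecs:.2f} C")
```

The spread of the real models' scatter around such a fit, combined with observational uncertainty in the metric, is what sets the width of the constrained range.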
Understanding ECS could help scientists to work out how much the climate is likely to warm in the future, Cox says, which, in turn, could allow policymakers to estimate how easy it will be to meet the goals of the Paris Agreement.
Climate sensitivity is the amount of warming that will occur after CO2 concentrations become twice as high as they were in pre-industrial times. Pre-industrial CO2 concentration levels were about 280 parts per million (ppm) and levels are currently at around 404ppm.
This means that, if humans stopped releasing CO2 today, the world should expect to experience more than half of the warming dictated by the ECS. Cox explains:
“That means that if you’ve got an ECS of 4C, then you’ve pretty much already missed the 2C target of Paris. So the ECS value has a big impact on the feasibility of Paris.”
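The “more than half” figure can be checked with a back-of-envelope calculation (mine, not the paper's): CO2's radiative forcing grows roughly logarithmically with concentration, so the fraction of a full doubling's forcing already realised is ln(404/280)/ln(2).

```python
# Back-of-envelope check of the "more than half" claim: CO2 forcing scales
# roughly with the logarithm of concentration, so the fraction of a full
# doubling's forcing realised so far is ln(current/preindustrial) / ln(2).
import math

preindustrial_ppm = 280.0
current_ppm = 404.0

fraction = math.log(current_ppm / preindustrial_ppm) / math.log(2.0)
print(f"fraction of doubling forcing realised: {fraction:.2f}")  # ~0.53
```

At today's concentrations that fraction is just over one half, which is why an ECS of 4C would already imply roughly 2C of eventual warming from emissions to date.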
If the results are correct and the climate sensitivity is 2.8C, then it is likely that the world will fail to limit warming to 1.5C above pre-industrial levels, which is the aspirational goal of the Paris Agreement, Cox adds:
“Our numbers suggest that we’re going to shoot over 1.5C. We might be able to avoid 2C, but it will take a huge effort to do so. I think, to achieve 1.5C, you definitely have to think of negative emissions technologies and, if you want 2C, you need to think about it, too, even if it’s only a short-term stop gap.”
Negative emissions technologies are a group of techniques – many of which still remain hypothetical – that aim to remove CO2 from the air in an attempt to tackle climate change.
The study’s results “reduce the probability of very high climate sensitivity”, which should “reassure” those taking steps to meet the goals of the Paris Agreement, says Prof Gabi Hegerl FRS, a climate system scientist from the University of Edinburgh, who was not involved in the research. She tells Carbon Brief:
“It also emphasises that climate change won’t be small, so reducing climate change will continue to require very sharp reductions of emissions leading towards ceasing emissions.”
Reducing uncertainty surrounding climate sensitivity should help policymakers to refocus their efforts on tackling climate change, says Cox:
“If you can reduce the uncertainty, which I think we can, then you can focus your mind on what needs to be done. We can rule out very low values, where you might say, ‘don’t worry about it, we’ll adapt’ and you can rule out very high values that might lead you to a sort of hopelessness where you think, ‘it’s too late’. We are still in that zone where action is urgent, but not too late. But it is very urgent.”
‘Brick in the wall’
The new paper adds to the extensive research around the potential value for ECS.
Despite debate among scientists about the best way to estimate climate sensitivity, each new research paper can be seen as “brick in the wall” of our understanding, says Prof Andrew Dessler, an atmospheric scientist from Texas A&M University, who was not involved in the research. He tells Carbon Brief:
“I don’t think any single paper will by itself redefine what we think about ECS. Rather, the best-informed views will be reached by multiple lines of evidence, with care taken in relating the inferred ECS from different methods.”
The methods used in the study are “credible”, says Prof Steven Sherwood, deputy director of the Climate Change Research Centre at the University of New South Wales. However, it does have its limitations, he says.
For example, the paper does not discuss how natural events, such as El Niño, could impact temperature fluctuations, he tells Carbon Brief:
“The approach mixes up natural variability due to El Niño, decadal variations, volcanic eruptions and air pollutants, and we know that models have different biases with respect to each of these. There are also theoretical problems with applying their statistical approach in this way, even though it seems to work. So it is not clear whether to put more weight on this study, or the previous ones suggesting even higher sensitivity.”
In addition, the research may have made “significant” errors in its attempts to reduce uncertainty surrounding climate sensitivity, says Dr Patrick Brown, a climate scientist from the Carnegie Institution for Science in Stanford, California.
Last month, Brown was the lead author of a Nature paper which found that ECS could be higher than previous estimates have suggested – their central estimate was 3.7C. Brown tells Carbon Brief:
“They appear to be comparing the IPCC ECS ‘likely’ range of 1.5 to 4.5C to their constrained ECS model range. This is not an appropriate comparison because the 16 models that they use do not span the entire uncertainty range of ECS.
“For example, no model that they investigate has an ECS below 2.2C. Thus their claim that they reduced uncertainty in ECS by 60% comes partly from the coincidence of which models happened to be included in their study.”
However, writing in an accompanying News & Views article, Prof Piers Forster, director of the Priestley International Centre for Climate at the University of Leeds, who was also not involved in the research, argues that some published estimates for ECS have “depended on the researchers’ assumptions about ECS, rather than the evidence.” He writes:
“By contrast, Cox et al started from climate-model values that are at the upper end of the IPCC range and used evidence to effectively rule out catastrophically high values.”
Forster adds that the methods used in the present study are “enviably simple” and will leave climate scientists asking, “why didn’t I think of that?” He says:
“In my view, Cox and colleagues’ estimate and the estimates produced by analysing the historical energy budget carry the most weight, because they are based on simpler physical theories of climate forcing and response, and do not directly require the use of a climate model that correctly represents cloud.”
(Improving the representation of clouds in climate models should be a major priority for future research, scientists recently told Carbon Brief.)
Cox, P. M. et al. (2018) Emergent constraint on equilibrium climate sensitivity from global temperature variability, Nature, http://nature.com/articles/doi:10.1038/nature25450