It's a common refrain from those who question mainstream climate science findings: The computer models scientists use to project future global warming are inaccurate and shouldn't be trusted to help policymakers decide whether to take potentially expensive steps to rein in greenhouse gas emissions.
A new study effectively snuffs out that argument by examining how climate models published between 1970 - before such models were the supercomputer-dependent behemoths of physical equations covering glaciers, ocean pH and vegetation that they are today - and 2007 actually performed.
The study, published Wednesday in Geophysical Research Letters, finds that most of the models examined were uncannily accurate in projecting how much the world would warm in response to increasing amounts of planet-warming greenhouse gases. Such gases, chiefly carbon dioxide, the main long-lived greenhouse gas pollutant, hit record highs this year, according to a new UN report out Tuesday.
They are now higher than at any other time in human history.
The study does fault some of the models, including one of the most famous calculations by former NASA researcher James Hansen, for overestimating warming because they assumed there would be even greater amounts of greenhouse gases in the atmosphere than what actually occurred. These assumptions mostly involved non-CO2 greenhouse gases, such as methane.
Hansen's projection, says study lead author Zeke Hausfather, a researcher at the University of California at Berkeley, erred by about 50 percent because it did not foresee a significant drop in emissions of substances that deplete the stratospheric ozone layer.
Many of those gases are also powerful global warming agents. Hansen also didn't foresee a temporary stabilization in methane emissions during the 2000s, Hausfather says.
However, his model, like many of the others examined in the new study, got it right on the basic relationship between greenhouse gas emissions and the amount of warming they would cause. The errors came from poorly predicting the bigger wild cards: how societal forces such as economic growth and emissions reduction agreements would govern future emissions.
"The big takeaway is that climate models have been around a long time, and in terms of getting the basic temperature of the Earth right, they've been doing that for a long time," Hausfather said in an interview.
For the uninitiated, a brief explanation of computer models. Every model has two main components. The first is the structure of the model itself: what it depends on, and what the internal and external relationships are. The second is the inputs fed into that structure.

For example, suppose the world had listened when Hansen issued his global warming warnings and had moved to cut CO2 and methane emissions. Global temperatures wouldn't have risen nearly as much, but that wouldn't make the model wrong. Equally, if emissions had risen as fast as Hansen assumed, then actual temperatures would have matched what the model predicted. The model was fine; the forecast inputs turned out to be wrong. This is GIGO - "garbage in, garbage out," as they say in the computer industry.

That in turn means that forecasts of a 3-degree-plus rise by 2100 depend on whether we slash emissions or not. If we do, we will avoid that terrible situation. If we do not, we won't.
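To make the structure-versus-inputs distinction concrete, here is a deliberately crude toy model in Python. The logarithmic rule for CO2 is the standard first-order approximation for its forcing, but every constant and scenario below is an assumption chosen for illustration, not any published model:

```python
import math

def toy_model(co2_ppm, baseline_ppm=280.0, sensitivity=3.0):
    """Structure: warming (deg C) rises with the log of CO2 concentration.

    sensitivity is the assumed warming per doubling of CO2; 3.0 deg C is
    an illustrative value, not a number from the study.
    """
    return sensitivity * math.log2(co2_ppm / baseline_ppm)

# Inputs: two hypothetical concentration pathways for the same future year.
high_path = 560.0  # ppm, a doubling of the pre-industrial baseline
low_path = 400.0   # ppm, a world that curbed emissions early

print(f"high-emissions input: {toy_model(high_path):.2f} deg C")  # 3.00
print(f"low-emissions input:  {toy_model(low_path):.2f} deg C")   # 1.54

# Same structure, different inputs, very different answers. If the low
# path is what actually happens, a forecast made with the high path
# misses badly, yet the model itself is not at fault: garbage in,
# garbage out.
```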
What the models got right, on average, was the decadal rise in temperatures: they forecast warming of about 0.2 degrees Celsius per decade, and the observed rise turned out to be very close to that.
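This is the kind of check that separates the physics from the inputs: rather than comparing raw temperatures, compare warming per unit of radiative forcing, which is roughly the spirit of the study's implied-TCR comparison. A minimal sketch, with illustrative numbers that are not taken from the paper:

```python
# Separating physics error from input error in a model hindcast.
# All numbers below are made up for illustration, not from the study.

model_warming = 0.30      # deg C per decade the model projected
model_forcing = 0.60      # W/m^2 per decade of forcing the model assumed
observed_warming = 0.18   # deg C per decade actually observed
observed_forcing = 0.36   # W/m^2 per decade of forcing actually realized

# Comparing raw temperatures mixes physics and inputs together:
raw_error = (model_warming - observed_warming) / observed_warming
print(f"raw temperature error: {raw_error:+.0%}")            # +67%

# Normalizing by forcing isolates the physics:
model_response = model_warming / model_forcing               # deg C per (W/m^2)
observed_response = observed_warming / observed_forcing
physics_error = (model_response - observed_response) / observed_response
print(f"warming-per-forcing error: {physics_error:+.0%}")    # +0%
```

By that yardstick, a model like Hansen's, which overshot on raw temperature because its assumed forcings never materialized, can still score well on the underlying physics.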