SAN FRANCISCO – A group of researchers has outlined a “framework” for testing whether global warming has contributed to record-setting extreme weather events.
In the past, scientists typically avoided linking individual weather events to climate change, citing the challenges of teasing apart human influence from the natural variability of the weather.
However, noted Noah Diffenbaugh, a professor of Earth system science at Stanford University’s School of Earth, Energy & Environmental Sciences, “over the past decade, there’s been an explosion of research, to the point that we are seeing results released within a few weeks of a major event.”
In a study published in this week’s issue of Proceedings of the National Academy of Sciences, Diffenbaugh and his colleagues add the latest contribution to a burgeoning field of climate science called “extreme event attribution,” which combines statistical analyses of climate observations with increasingly powerful computer models to study the influence of climate change on individual extreme weather events.
To avoid inappropriately attributing an event to climate change, the researchers began with the assumption that global warming had played no role, and then used statistical analyses to test whether that assumption was valid.
“Our approach is very conservative,” Diffenbaugh said. “It’s like the presumption of innocence in our legal system: the default is that the weather event was just bad luck, and a really high burden of proof is required to assign blame to global warming.”
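The presumption-of-innocence logic Diffenbaugh describes resembles a classical null-hypothesis test: assume the climate is stationary, then ask how surprising the observed record would be under that assumption. A minimal sketch in Python (the annual-maximum data here are synthetic and the difference-in-means permutation test is an illustrative choice, not the study’s actual statistical method):

```python
import random
import statistics

def stationarity_p_value(series, n_perm=10000, seed=0):
    """Permutation test of the 'just bad luck' null hypothesis.

    Null: the series is stationary (no warming trend), so shuffling
    the years should not change the statistics. Test statistic: the
    difference in means between the later and earlier halves of the
    record. Returns the one-sided p-value."""
    rng = random.Random(seed)
    half = len(series) // 2
    observed = statistics.mean(series[half:]) - statistics.mean(series[:half])
    exceedances = 0
    for _ in range(n_perm):
        shuffled = series[:]
        rng.shuffle(shuffled)  # destroys any temporal trend
        stat = statistics.mean(shuffled[half:]) - statistics.mean(shuffled[:half])
        if stat >= observed:
            exceedances += 1
    return (exceedances + 1) / (n_perm + 1)

# Synthetic 60-year record of annual-maximum temperature anomalies (degrees C):
# stationary noise plus a gradual warming trend.
rng = random.Random(42)
record = [rng.gauss(0.0, 0.5) + 0.02 * year for year in range(60)]

p = stationarity_p_value(record)
print(f"p = {p:.4f}")  # a small p-value meets the 'high burden of proof'
```

A small p-value rejects the stationary null, mirroring the framework’s conservative default: the burden of proof lies with the claim that warming changed the odds.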
By applying their framework to the hottest, wettest and driest events that have occurred in different areas of the world, the researchers found that global warming from human emissions of greenhouse gases has increased the odds of the hottest events across more than 80 percent of the surface area of the globe for which observations were available.
“Our results suggest that the world isn’t quite at the point where every record hot event has a detectable human fingerprint, but we are getting close,” Diffenbaugh said.
The researchers also found that, for the driest and wettest events, human influence on the atmosphere has increased the odds across approximately half of the area with reliable observations.
“Precipitation is inherently noisier than temperature, so we expect the signal to be less clear,” Diffenbaugh said. “One of the clearest signals that we do see is an increase in the odds of extreme dry events in the tropics. This is also where we see the biggest increase in the odds of protracted hot events — a combination that poses real risks for vulnerable communities and ecosystems.”
The research team has also applied its approach to individual events, such as the 2012–2017 California drought and the catastrophic flooding in northern India in June 2013.
One high-profile test case was Arctic sea ice, which has declined by around 40 percent during the summer season over the past three decades.
When the team members applied their framework to the record-low Arctic sea ice cover observed in September 2012, they found overwhelming statistical evidence that global warming contributed to both the severity and the probability of that record low.
Their approach, the team said, can be used to study not only the weather conditions at the surface, but also the meteorological “ingredients” that contribute to rare events.
“For example, we found that the atmospheric pressure pattern that occurred over Russia during the 2010 heat wave has become more likely in recent decades, and that global warming has contributed to those odds,” co-author Daniel Horton, an assistant professor at Northwestern University in Evanston, Illinois, and a former postdoc in Diffenbaugh’s lab, was quoted as saying in a news release from Stanford.
“The question is being asked by the general public and by people trying to make decisions about how to manage the risks of a changing climate,” said Diffenbaugh. “Getting an accurate answer is important for everything from farming to insurance premiums, to international supply chains, to infrastructure planning.”