Projecting the future of extreme weather events and their impact on human life, the environment, and vulnerable ecosystems, both locally and across the globe, remains a complex task in climate research -- and one in which statisticians are increasingly playing key roles, particularly through the development of new models. The December issue of CHANCE examines the complexities of massive data collection and statistical analysis in climate research and features newly proposed statistical methodology that could be a "game changer" in understanding our climate system and in attributing extreme climatic events.
Changes in events related to atmospheric circulation, such as storms, cannot be characterized robustly due to their underlying chaotic nature. In contrast, changes in thermodynamic state variables, such as global temperature, can be relatively well characterized. "Rather than trying to assess the probability of an extreme event occurring, a group of researchers suggest viewing the event as a given and assessing to what degree changes in the thermodynamic state (which we know has been influenced by climate change) altered the severity of the impact of the event," notes Dorit Hammerling, section leader for statistics and data science at the Institute for Mathematics Applied to Geosciences, National Center for Atmospheric Research.
Climate models are complex physics-based numerical models, amounting to hundreds of thousands, if not millions, of lines of computer code, that simulate Earth's past, present and future climate. Statisticians can analyze these climate models along with direct observations to learn about Earth's climate.
"This new way of viewing the problem could be a game changer in the attribution of extreme events by providing a framework to quantify the portion of the damage that can be attributed to climate change -- even for events that themselves cannot be directly attributed to climate change using traditional methods," continues Hammerling.
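The framework described above can be illustrated with a toy calculation: treat the event as given, then compare its simulated impact under the observed thermodynamic state against a counterfactual state without the warming-driven shift. Everything below is hypothetical for illustration -- the damage function, the size of the warming shift, and the variability numbers are assumptions, not the researchers' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def damage(temperature_anomaly_c):
    # Hypothetical damage model: impact grows nonlinearly with the
    # thermodynamic state (e.g., warmer air holds more moisture).
    return 100.0 * (1.0 + 0.07 * temperature_anomaly_c) ** 2

warming_shift = 1.0  # assumed warming-driven shift in the state (deg C)

# Sample natural variability around each thermodynamic baseline.
natural = rng.normal(0.0, 0.5, size=100_000)
factual = damage(natural + warming_shift)   # world as observed
counterfactual = damage(natural)            # world without the shift

attributable = factual.mean() - counterfactual.mean()
fraction = attributable / factual.mean()
print(f"mean damage, factual world: {factual.mean():.1f}")
print(f"portion of damage attributable to the shift: {fraction:.1%}")
```

The point of the sketch is the framing: the event itself is held fixed, and only the portion of its impact driven by the well-characterized thermodynamic change is quantified.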
Another promising approach involves combining physics, statistical modeling and computing to derive sound projections for the future of ice sheets. The Greenland and Antarctic ice sheets span more than 1.7 million and 14 million square kilometers, respectively, and contain 90% of the world's freshwater ice supply; melting of these ice sheets could be catastrophic for low-lying coastal areas.
Murali Haran, a professor in the department of statistics at Penn State University; Won Chang, an assistant professor in the department of mathematical sciences at the University of Cincinnati; Klaus Keller, a professor in the department of geosciences and director of sustainable climate risk management at Penn State University; Rob Nicholas, a research associate at the Earth and Environmental Systems Institute at Penn State University; and David Pollard, a senior scientist at the Earth and Environmental Systems Institute at Penn State University detail how parameters and initial values drive an ice sheet model, whose output describes the behavior of the ice sheet through time. The model accounts for noise and biases and ultimately produces ice sheet data.
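The data-generating structure the authors describe -- parameters driving a model whose output is observed with bias and noise -- can be sketched in miniature. The "ice sheet model" below is a stand-in toy with a single hypothetical melt-rate parameter, not the actual ice sheet code, and the calibration step is a simple grid search rather than the authors' statistical machinery.

```python
import numpy as np

rng = np.random.default_rng(1)

def ice_sheet_model(melt_rate, years):
    # Toy dynamics: ice volume declines at a parameter-dependent rate.
    return 100.0 * np.exp(-melt_rate * years)

years = np.arange(0, 50)
true_melt_rate = 0.01
bias = 2.0                                # systematic model-data offset
noise = rng.normal(0.0, 1.0, size=years.size)

# Synthetic "observations": model output plus bias plus noise.
observations = ice_sheet_model(true_melt_rate, years) + bias + noise

# Simple calibration: pick the parameter whose output best matches the
# observations, using residual variance so a constant bias is ignored.
candidates = np.linspace(0.001, 0.05, 200)
def misfit(rate):
    return np.var(observations - ice_sheet_model(rate, years))

best = candidates[np.argmin([misfit(r) for r in candidates])]
print(f"calibrated melt rate: {best:.4f} (true value {true_melt_rate})")
```

Real ice sheet calibration faces the computational challenges the authors note: each model run is expensive, so statistical emulation replaces exhaustive evaluation of the parameter space.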
"Incorporating all of these uncertainties is daunting, largely because of the computational challenges involved," and to an extent, "whatever we say about the behavior of ice sheets in the future is necessarily imperfect," note the authors. "However, through such cutting-edge physics and multiple observation data sets that piece the information together in a principled manner, we have made progress."
Specific articles in this special issue of CHANCE include the following:
Story Source:
Materials provided by American Statistical Association.