The Key To Climate Models

AsianScientist (Jun. 05, 2024) – The unpredictable nature of the weather has long captured the imagination and fear of our ancestors, immortalized in the legends of gods. From the Greek Zeus and the Mesoamerican Quetzalcoatl to the Shinto Raijin and Fujin, deifications of the weather have emerged across cultures as humans struggled to impose a sense of order and control on the fickle moods of the winds and clouds.

Modern-day forecasting lifts some of the mysticism around weather with its ability to predict local meteorological conditions, albeit to varying accuracy levels and time horizons. For example, the Meteorological Service Singapore provides fairly accurate local projections up to a fortnight in advance, while commercial forecaster AccuWeather publishes estimates up to three months in advance. More accurate and extended predictions can help people and governments plan ahead, as well as mitigate property damage and loss of life.

Moreover, the irreversible environmental footprint of human activity has led to an increasing global push to understand how the climate changes over much longer timescales. In fact, according to the United Nations Intergovernmental Panel on Climate Change’s Sixth Assessment Report, compound weather events—combinations of destructive events—will become more frequent as global warming accelerates. The same report highlights that even typical weather events, like maximum daily rainfall and daily temperature extremes, have significantly intensified over the years.

THE RIGHT RESOLUTION

To be useful, climate simulations must be able to account for local weather events and fluctuations. For instance, a computer model with a spatial resolution of 10 km cannot identify the formation of small clouds or localized bursts of rain. Climate scientists must also consider the model’s time domain: an hourly weather prediction is more useful to the everyday pedestrian than a daily forecast.
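To make the resolution trade-off concrete, here is a minimal sketch (with hypothetical numbers, not figures from the article) of how cell counts grow as grid spacing shrinks, using the common rule of thumb that a grid cannot represent features much smaller than about two cells:

```python
# Illustrative sketch (hypothetical numbers): how many grid cells cover a
# 100 km x 100 km region at different horizontal resolutions, and the
# smallest feature each grid can plausibly represent (~2 cells across).

def grid_cells(region_km: float, resolution_km: float) -> int:
    """Number of cells in a square region at the given grid spacing."""
    per_side = region_km / resolution_km
    return int(per_side * per_side)

for res_km in (10, 2):
    cells = grid_cells(100, res_km)
    print(f"{res_km} km grid: {cells} cells, "
          f"smallest resolvable feature ~{2 * res_km} km")
```

At 10 km spacing the region is covered by only 100 cells and anything under roughly 20 km across is invisible; at 2 km the same region needs 2,500 cells but can capture features down to about 4 km.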

In Singapore, the Centre for Climate Research Singapore works with the National Supercomputing Centre (NSCC) Singapore to conduct climate studies that address both short- and long-term considerations. The Third National Climate Change Study (V3), which was recently launched, predicts rainfall, temperature, winds and relative humidity at resolutions of 8 km and 2 km up to the year 2100. On top of providing weather information to the public, such data supports the nation’s planning for sea levels, water resources, human health, biodiversity and food security.

Given this level of complexity and required resolution, it comes as no surprise that a highly detailed model combining multiple datasets across decades is needed to get an accurate picture of the weather. As models become more complex, they also gobble up more computational resources—there are entire supercomputing facilities dedicated to climate research alone. All over Asia, new climate-focused centers are being established as world leaders prepare for turbulent times ahead.

Early last year, Japanese technology giant Fujitsu announced a new supercomputer system provided to the Japan Meteorological Agency for forecasting linear rainbands. These slow-moving bands of cumulonimbus clouds bring heavy rain and thunderstorms, which increase the risk of landslides and floods. The new computer features hardware similar to Fugaku, Asia’s fastest supercomputer, and will provide more accurate and rapid forecasts.

In India, computing solutions company Eviden is working with the Indian Institute of Tropical Meteorology and the National Centre for Medium Range Weather Forecasting to deliver two new supercomputers. Higher computing capabilities would allow for better resolution in a virtual model.

“At present, we have a 4 petaFLOPS computer,” explained Ravichandran. “We have to achieve 18 petaFLOPS to move from a 12 km resolution to 6 km.”
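A back-of-envelope sketch (our assumption, not the agencies' actual accounting) shows why halving the grid spacing demands several times the compute: the number of horizontal cells quadruples, and stability (CFL) constraints typically halve the allowable timestep as well:

```python
# Rough scaling sketch (illustrative assumption): halving grid spacing
# quadruples horizontal cells; CFL stability limits also roughly halve
# the usable timestep, adding another factor of the refinement ratio.

def cost_factor(old_km: float, new_km: float, shrink_timestep: bool = True) -> float:
    """Approximate relative compute cost of refining from old_km to new_km."""
    ratio = old_km / new_km
    factor = ratio ** 2            # more horizontal grid cells
    if shrink_timestep:
        factor *= ratio            # more timesteps per simulated hour
    return factor

print(cost_factor(12, 6, shrink_timestep=False))  # 4.0: extra cells alone
print(cost_factor(12, 6))                         # 8.0: cells plus timestep
```

The quoted jump from 4 to 18 petaFLOPS—a factor of 4.5—sits between these two bounds, which is consistent with this kind of scaling once practical trade-offs are factored in.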

OF VIRTUAL EARTHS AND MACHINE LEARNING

In parallel with hardware upgrades for meteorological research, several organizations are looking into techniques used to model the intricacy of long-term climate patterns. Estimates of next month’s weather already require a massive effort—imagine predicting months, years or even centuries of climate. Harris underlines the Herculean effort necessary: “You literally would need to be running simulations, all day, every day, for years on end to really get a good view of that.”

To address this issue, researchers and companies are looking into using AI to augment current models. AI is well-known for taking in large amounts of data, spotting patterns and ultimately making fairly efficient and accurate predictions.

At the moment, researchers are confident that AI models will supplement current weather models rather than replace them entirely. Dubbed “digital twins”, current state-of-the-art weather models are numerical simulations that construct a virtual diorama of the Earth and its weather patterns.

“To train some of the AI models, you either require a numerical-based simulation to provide most of the data inputs, or you need to simulate some surrogate models,” Harris explained.

In fact, he shared that some scientists are looking into daisy-chaining ensembles of digital twins and AI predictions: running the simulation to provide data for AI, and then using the AI to give economical longitudinal predictions over extended time scales.
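The simulate-then-surrogate workflow described above can be sketched in miniature. In this toy example (all names and numbers are our illustrative assumptions, not any production system), an "expensive" numerical model generates training data, a cheap linear predictor is fitted to it, and the predictor is then rolled forward autoregressively for low-cost long-horizon predictions:

```python
# Minimal sketch of the simulate-then-surrogate workflow. The "numerical
# model" is a toy damped oscillator standing in for an expensive climate
# simulation; the "AI" is a linear one-step predictor fit by least squares.
import numpy as np

def run_simulation(steps: int, x0=1.0, v0=0.0, dt=0.1) -> np.ndarray:
    """Expensive 'digital twin': integrate a damped oscillator."""
    states = [np.array([x0, v0])]
    for _ in range(steps):
        x, v = states[-1]
        a = -x - 0.1 * v                  # spring force plus damping
        states.append(np.array([x + dt * v, v + dt * a]))
    return np.array(states)

# 1) Run the simulation once to generate training data.
traj = run_simulation(500)
X, Y = traj[:-1], traj[1:]                # (state, next-state) pairs

# 2) Fit a cheap surrogate mapping each state to the next state.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# 3) Roll the surrogate forward autoregressively: each prediction feeds
#    back in as the next input, giving long horizons at low cost.
state = traj[0]
for _ in range(500):
    state = state @ W
print("surrogate endpoint:", state)
print("simulation endpoint:", traj[-1])
```

Real weather surrogates such as the ones discussed below replace the linear map with deep neural networks and the toy oscillator with reanalysis or simulation data, but the division of labor is the same: the expensive model supplies the training set, and the cheap learned model supplies the long rollouts.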

However, some might wonder about the reliability of using simulated data to train a machine learning forecast model. Harris pointed to the stringent checks and balances in place: both the AI predictions and the numerical simulations are repeatedly compared against real-world events, and the results are used to further calibrate the models.

NVIDIA hosts its own data-driven weather model, dubbed the Fourier Forecasting Neural Network (FourCastNet). Accelerated on graphics hardware, the model can predict the weather a week in advance in a fraction of a second on a single NVIDIA graphics processing unit (GPU).

Another revolutionary model is Pangu-Weather, developed by Huawei Cloud. Published in Nature in 2023, Pangu-Weather broke new ground as the first AI-backed model to outperform traditional numerical methods.

The model has been extensively tested on various key events, with remarkable success in modeling the sweltering 40°C UK summer in 2022, as well as tracking the path of Storm Eunice in 2022 and Typhoon Doksuri in 2023.

In late 2023, Google released GraphCast, the newest contender in the field. Trained on four decades of data, GraphCast outperforms conventional numerical models, demonstrating precise tracking of Hurricane Lee as well as various extreme thermal events in the same year.

Although such AI-supported forecast models are built upon slightly different architectures, they demonstrate the edge AI can provide in the arena of weather prediction. All three models are open-source, with charts available to the public on the European Centre for Medium-Range Weather Forecasts’ website.

AN OPEN FORECASTING REVOLUTION

As a recent paper in Nature highlights, active collaboration and data sharing will be central to developing larger climate models. Experts are excited to see the rapid proliferation of AI among national weather and climate research centers.

With the integration of AI into climate models, a few GPUs can now run a weather model to a degree of accuracy that previously required a supercomputer to achieve. Current state-of-the-art models can also be run on a bulked-up personal computer, making weather forecasting more accessible than at any other point in history.

For governments, hardware-efficient models open the possibility of smaller regional forecasting centers. Without the huge investment of a new supercomputing facility, these centers can be set up at lower cost yet still deliver fairly accurate predictions, bringing better foreknowledge of the weather to the public. Taiwan is currently engaging with NVIDIA to gain a better understanding of the regional consequences of weather events.

At the same time, with climate models going open-source, researchers in the public domain, academia and industry can now work hand-in-hand to develop the next generation of climate models.

Harris highlights that NVIDIA has been working closely with the development community to ensure that its software runs quickly and efficiently. “We’re engaging with the broader community to help optimize and make sure that the models can be easily adopted and fine-tuned to meet reasonable use cases,” Harris said.

With the advent of faster and more accurate models, as well as upgrades to existing computing facilities for climate science, governments can now better understand the long-term consequences of countermeasures against climate change. The new accessibility of climate science and modeling forecasts clear skies ahead for meteorology.

This article was first published in the print version of Supercomputing Asia, January 2024.

Copyright: Asian Scientist Magazine.

Disclaimer: This article does not necessarily reflect the views of AsianScientist or its staff.
