NOAA’s powerful new weather forecasting supercomputers are now online

These supercomputers are three times faster than NOAA’s previous systems. Here’s what the agency plans to do with them.
[Image: NOAA's new supercomputers expected to yield better weather forecasts across the board. NOAA / Unsplash]

This week, the National Oceanic and Atmospheric Administration announced that the new supercomputers it is using for improved weather forecasting and modeling are now online. 

The supercomputers, named Dogwood and Cactus after plants native to their respective locations, are Hewlett Packard Enterprise Cray systems.

“These numerical models that we run on these supercomputers, they actually provide the foundation for the forecasts that our various stakeholders here—from the public, water managers, emergency managers in the case of hurricanes for example—use,” says Brian Gross, director of NOAA’s Environmental Modeling Center. “And there is a direct connection between computing capability and our modeling capability.”

[Related: The new Frontier supercomputer will be the fastest in the world]

Both Dogwood and Cactus will run at a speed of 12.1 petaflops, meaning each can perform about 12 quadrillion calculations per second. That’s three times faster than NOAA’s previous system, which had a capacity of about 4.2 petaflops. The storage capacity has doubled as well, to about 26 petabytes.

“On your normal laptop, you might have a quad-core processor. On these supercomputers, we have about 327,680 cores on the system,” says David Michaud, director of the office of central processing for the National Weather Service, which is an agency within NOAA.
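To put those numbers in context, the arithmetic is simple enough to check by hand. The short Python sketch below is purely illustrative and uses only the figures quoted above to compute the speedup and the core-count comparison.

```python
# Back-of-the-envelope arithmetic using the figures quoted above (illustration only).
new_flops = 12.1e15        # each new machine: 12.1 petaflops
old_flops = 4.2e15         # previous system: about 4.2 petaflops
laptop_cores = 4           # a typical quad-core laptop
system_cores = 327_680     # cores on each new supercomputer

print(f"Speedup over the old system: ~{new_flops / old_flops:.1f}x")  # ~2.9x, roughly three times
print(f"Core count vs. a quad-core laptop: {system_cores // laptop_cores:,}x")  # 81,920x
```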

Plans for the updated Weather and Climate Operational Supercomputing System were first unveiled in 2020. Back then, NOAA said that it was replacing its existing systems in Reston, Virginia, and Orlando, Florida, and awarding a contract to General Dynamics Information Technology to provide upgraded supercomputing products and services. The new pair of supercomputers (a main machine and a backup) is split between Manassas, Virginia, and Phoenix, Arizona. The 10-year contract with General Dynamics has a ceiling value of $505.2 million.

In the next several years, NOAA hopes to address a progressive series of tasks with Dogwood and Cactus. 

“The very first thing that we do is to make sure that everything that was running on the previous system can continue to run on the new system,” says Gross. “There’s a number of major modeling systems that we’re planning on upgrading over the next few years because of this enhanced capacity.”

First up is a new and improved hurricane prediction system, called the Hurricane Analysis and Forecast System, which they hope to have up and running for the next hurricane season in the summer of 2023. Then, they’ll revamp the Global Forecast System, which projects weather conditions out to 30 days or so, and the regional system used to forecast severe weather.

[Related: Open data is a blessing for science—but it comes with its own curses]

There are four main things NOAA scientists can do with these increases in computing power. They can run higher-resolution models to see smaller-scale features in the atmosphere, like thunderstorms. They can better capture the physical processes at work in the atmosphere, in the ocean, and on land and sea ice. “Think about how clouds form, and whether they will precipitate, and what form that precipitation will take. These are the things we try and improve on in our models,” Gross says. 
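To see why resolution is so computationally expensive, consider a standard rule of thumb (an illustration, not NOAA's actual cost model): refining the horizontal grid also forces shorter time steps, so the cost grows much faster than the resolution itself.

```python
# Rule-of-thumb cost of refining a weather model's horizontal grid (illustrative only;
# real operational models also adjust vertical levels, physics, and data output).
def relative_cost(refinement: float) -> float:
    cells = refinement ** 2   # 2 horizontal dimensions: 2x finer grid = 4x the cells
    steps = refinement        # numerical stability (CFL) roughly halves the time step at 2x
    return cells * steps

for r in (1, 2, 4):
    print(f"{r}x finer grid -> ~{relative_cost(r):.0f}x the compute")  # 1x, 8x, 64x
```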

Better computers will also help NOAA forecasters run models called ensembles, which start from multiple slightly different sets of initial weather conditions to predict a range of weather possibilities for a future time period, like tomorrow or next Thursday. 
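The idea behind an ensemble is easy to demonstrate on a toy system. The sketch below is illustrative only (operational ensembles perturb full atmospheric states, not a three-variable model): it runs the classic Lorenz-63 equations from 20 slightly different starting points, and the spread across members stands in for forecast uncertainty.

```python
# A toy ensemble forecast on the chaotic Lorenz-63 system (illustration only).
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz-63 equations
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

rng = np.random.default_rng(0)
base = np.array([1.0, 1.0, 1.0])
# 20 ensemble members: the same model started from slightly perturbed initial conditions
members = base + rng.normal(scale=1e-3, size=(20, 3))

for _ in range(2000):  # integrate every member forward in time
    members = np.array([lorenz63_step(m) for m in members])

# Chaos amplifies the tiny initial differences; the spread measures uncertainty
print("ensemble mean:  ", members.mean(axis=0))
print("ensemble spread:", members.std(axis=0))
```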

Lastly, more computing power means that NOAA can build better data assimilation systems, allowing it to ingest data from a wider range of sources, from new satellites to Saildrones. “Satellites can be used to directly observe things like sea surface temperature, or temperature profiles in the atmosphere. We’re also able to extract information from the way satellite signals bend as they propagate through the atmosphere. There’s a vast amount of information that we can take into our models to form our initial conditions,” says Gross. “The more data we can integrate over a longer period of time, the better the initial condition we’re going to have, the more accurate the forecast is going to be.”
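At its core, data assimilation blends a model's "first guess" with observations, weighting each by how trustworthy it is. The one-variable sketch below uses hypothetical numbers (real systems update millions of variables at once) to show the standard Kalman-style update.

```python
# A one-variable sketch of data assimilation (hypothetical numbers; illustration only).
def assimilate(background, bg_var, observation, obs_var):
    # Kalman-style update: weight each source by its confidence (inverse variance)
    gain = bg_var / (bg_var + obs_var)
    analysis = background + gain * (observation - background)
    analysis_var = (1 - gain) * bg_var  # blending always reduces uncertainty
    return analysis, analysis_var

# The model's first guess says 22.0 C; a noisier satellite retrieval says 24.0 C
analysis, var = assimilate(background=22.0, bg_var=1.0, observation=24.0, obs_var=3.0)
print(f"analysis: {analysis:.2f} C (variance {var:.2f})")  # 22.50 C, variance 0.75
```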

In addition to providing forecasts for the public, NOAA will also make some of its data sources and models available to researchers through various cloud providers (Microsoft, Google, Amazon Web Services, etc.) and open data programs.

“That’s been an interesting way to get different views of the data for different purposes other than just straight up making a forecast,” says Michaud. “One example is on AWS, we have all the radar volume scans back to the 1990s through to real-time…When we got the radar data out, people were using it to study bird migration. Unless you make that data open and accessible, you’ll never really know what the limits or potentials are for that particular dataset.”
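That radar archive lives in NOAA's Open Data program on AWS. Assuming the public noaa-nexrad-level2 bucket and its year/month/day/station key layout (both documented on the AWS Open Data registry), anyone can browse the files anonymously with boto3:

```python
# List a few NEXRAD Level II radar files from NOAA's public archive on AWS.
# Assumes the "noaa-nexrad-level2" Open Data bucket and its YYYY/MM/DD/STATION/
# key layout; no AWS account or credentials are needed.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
resp = s3.list_objects_v2(
    Bucket="noaa-nexrad-level2",
    Prefix="2022/06/28/KTLX/",  # June 28, 2022, KTLX radar (Oklahoma City)
    MaxKeys=5,
)
for obj in resp.get("Contents", []):
    print(obj["Key"])
```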