This climate model simulation showing sea surface temperatures was created by researchers at the National Center for Atmospheric Research with the help of NERSC's supercomputers. The image reveals the Gulf Stream meander pattern and the cooler tropical Pacific and Atlantic surface temperatures caused by upwelling of cold water. Also visible is cold water under the Arctic and East Greenland sea ice.

Climate Modeling


by Jon Bashor
Judging by the reliability of TV weather forecasters, figuring out what the weather will be like over the next few days or a week can be a real challenge. And if it's that difficult to get a handle on current weather conditions, then predicting the Earth's climate, which is shaped by myriad interacting forces, over years or decades is a problem of far greater dimensions.

But as the issue of climate change heats up, scientists are making progress. In the past 10 years researchers have achieved a better understanding of the Earth's climate, while simultaneously the computer industry has developed ever-more-powerful machines. These two factors are now resulting in increasingly accurate computer modeling tools for studying and predicting changes in the Earth's climate. When climate modeling experts at the National Center for Atmospheric Research (NCAR) in Colorado were looking for a scientific computing center as a research partner, they linked up with the National Energy Research Scientific Computing Center (NERSC). Located at Berkeley Lab, NERSC is home to some of the nation's most powerful supercomputers.

"For our work, we first targeted NERSC and the Cray T3E," said Warren Washington, the principal investigator on the project and a senior scientist at NCAR who has been working on climate modeling since 1960. "Access to this machine allowed us to really get going. Our model has hundreds of thousands of lines of code and large memory requirements. We couldn't load the entire model on smaller systems. We needed higher resolution than is usually used."

With access to NERSC's Cray T3E-900 massively parallel processor supercomputer, scientists at NCAR, the Naval Postgraduate School and Los Alamos National Laboratory are able to run their Parallel Climate Model with High Resolution Ocean and Sea Ice (PCM) in less time and with better results. For example, running the PCM on an older supercomputer, such as a 16-processor Cray C90, would require more than 10 hours to simulate one model year, or more than three months of computer time to model a century. By using half of the 512 processors of NERSC's T3E-900, one model year can be simulated in about 35 minutes, or 28 times faster.
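A quick back-of-the-envelope check makes the scale of that improvement concrete. The sketch below, in Python, uses only the figures quoted above; since the C90 timing is given only as "more than 10 hours," the implied per-year value is approximate.

```python
# Turnaround times implied by the figures quoted in the article.
# The 28x speedup and 35 minutes per model year are as reported;
# the Cray C90 rate per model year is inferred from them.

T3E_MIN_PER_MODEL_YEAR = 35   # minutes per simulated year on 256 T3E-900 processors
SPEEDUP = 28                  # reported speedup over the 16-processor C90

c90_min_per_year = T3E_MIN_PER_MODEL_YEAR * SPEEDUP
print(f"Implied C90 time per model year: {c90_min_per_year / 60:.1f} hours")

# A 100-year (century) simulation at each rate:
years = 100
t3e_days = T3E_MIN_PER_MODEL_YEAR * years / 60 / 24
c90_days = c90_min_per_year * years / 60 / 24
print(f"T3E-900: about {t3e_days:.1f} days per model century")
print(f"C90:     about {c90_days:.0f} days per model century")
```

At 35 minutes per model year, a century-long run finishes in roughly two and a half days instead of months.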

"The performance NCAR has achieved is unmatched by other climate codes in the United States," said NERSC Division Director Horst Simon. "This was accomplished by combining NCAR's modeling expertise with NERSC's high-performance computing platforms. The NCAR team has demonstrated that production climate research can be carried out on a parallel processing, distributed memory supercomputer. Not only does NCAR's code run well on our machine, but its design also allows it to be scaled up for creating an even more powerful model in the future."

Working with Washington's group at NCAR are teams at the Naval Postgraduate School, led by Albert J. Semtner, Jr., and at Los Alamos National Lab, led by Robert Malone.

Their work, funded by the Department of Energy and the National Science Foundation, is aimed at investigating the effect of greenhouse gas increases and sulfate aerosols on global warming. While increased greenhouse gases cause global warming, sulfate aerosols offset some of that warming and, in some cases, cause regional cooling.

It is the capability of running the ocean and sea ice components that makes the model so useful, according to Washington. Just as the Earth's climate is shaped by a wide range of factors, so a reliable climate model must account for them. The higher the resolution of such features as the Gulf Stream or the Kuroshio Current, the more accurately their role as transporters of heat and salt can be included. And as West Coast residents learned during the winter of 1998, the effects of El Niño cannot be discounted.

Although some may point to El Niño as evidence of climate change, Washington says El Niños have always existed, though neither at regular intervals nor with the same magnitude. But their very variability is an important reason to incorporate them into climate models.

"If we don't capture El Niņo in our models, we miss a lot of the natural climatic variability," says Washington. "To be able to simulate El Niņo, we need very high equatorial resolution in our models. In turn, this will give us better answers to regional climate change simulations."

And just as one year's weather can't be interpreted as evidence of climate change, one computerized model can't be expected to produce definitive answers. Instead, researchers need to run a series of modeling experiments, each one different, to take into account differing possible future climate scenarios. More powerful computers allow these simulations to be run at higher resolution, that is, in greater detail.

Two and a half years ago, Washington and his colleagues at NCAR and the Naval Postgraduate School began developing a flexible, coupled model to run on massively parallel processor supercomputers, such as NERSC's Cray T3E-900. Such machines can run larger, more detailed simulation models than conventional supercomputers.

Washington says there is a certain stochastic element to weather and climate in that the variation is not always predictable. For example, even though temperature records for California over the past 50-plus years show a warming pattern, not every year is warmer than the last: some are warmer, and some are colder. Because the issue of global warming could lead policymakers to make far-reaching decisions, it's important that the best information be available. Will greenhouse gas emissions produce a significant difference in our climate over the next 10 years? 30 years?

"To determine this, we need to run ensembles of experiments—four or five experiments at a time. A single experiment is not satisfactory," Washington says. "We need to know if the change is statistically significant in order to confidently answer the questions being asked by policymakers."

Although their model is state-of-the-art, Washington admits that the field is still working to develop better models. For example, current models don't have enough resolution to show in detail the geographic features between the Sierra Nevada and the Rocky Mountains—they treat the region as a continuous mountain range. "The Great Salt Lake Basin just isn't there," Washington says. "The same thing can be said for the land/vegetation, ocean and sea-ice components of the models, all of which demonstrates the need for even greater resolution and improved physical processes such as clouds."
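A rough scale analysis shows why the Great Basin disappears. Assuming a grid spacing of about 2.8 degrees, a typical value for global atmospheric models of that era (the article does not state the PCM's resolution), a grid cell at the basin's latitude is a couple of hundred kilometers wide:

```python
import math

EARTH_CIRCUMFERENCE_KM = 40_075
grid_deg = 2.8            # assumed grid spacing; not stated in the article
latitude = 40             # roughly the latitude of the Great Basin

# East-west width of one grid cell, shrinking with cos(latitude)
cell_km = EARTH_CIRCUMFERENCE_KM * grid_deg / 360 * math.cos(math.radians(latitude))
print(f"grid cell width at {latitude}N: ~{cell_km:.0f} km")

basin_km = 600            # approximate Sierra Nevada-to-Rockies distance
print(f"the Great Basin spans only ~{basin_km / cell_km:.1f} grid cells")
```

With only two or three cells between the Sierra Nevada and the Rockies, the smoothed topography merges the two ranges into one broad ridge.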

For more on this project, visit http://goldhill.cgd.ucar.edu/ccr/pcm/

