December 17, 1998

Science magazine names Supernova Cosmology Project "Breakthrough of the Year"

Lawrence Berkeley National Laboratory

Contact: Jon Bashor, (510) 486-5849, jbashor@lbl.gov

BERKELEY, CA -- When the National Energy Research Scientific Computing Center (NERSC) moved to Berkeley Lab in 1996, a computational science program was created to encourage collaborations between physical and computer scientists. The Supernova Cosmology Project's work, one of the first projects funded, demonstrates how high-performance computing can accelerate scientific discovery.

With Science magazine's recognition of the Supernova Cosmology Project's breakthrough, along with other collaborations, Berkeley Lab has established itself as home to one of the country's leading computational science centers.

"This summer we burned lots of time on the T3E," says team member Greg Aldering of NERSC’s contributions. "They gave us help developing our algorithms, and they gave us confidence in our methods."

The Cray T3E was particularly important, Aldering says, "because we spent a lot of time doing fits." To check their data from 40 supernovae for errors or biases, the team used the 512-processor Cray T3E-900 supercomputer to simulate 10,000 exploding supernovae at varying distances in model universes built on different assumptions about cosmological parameters. The simulated results were then plotted and compared with the real data to detect any biases affecting observation or interpretation.
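
To give a flavor of such a bias check, here is a minimal sketch in Python. It is not the project's actual code: the cosmology, magnitude scatter, and survey cut are all illustrative assumptions. It simulates noisy supernova magnitudes in an assumed flat universe and compares them with the model expectation, with and without a magnitude limit that preferentially keeps brighter events.

    # Minimal sketch of a Monte Carlo bias check: simulate noisy supernova
    # magnitudes in an assumed universe and compare them against the model.
    # All numbers (H0, densities, scatter, cut) are illustrative.
    import numpy as np

    C_KM_S = 299792.458   # speed of light, km/s
    H0 = 70.0             # Hubble constant, km/s/Mpc (assumed)

    def distance_modulus(z, omega_m, omega_lambda, n_steps=300):
        """Distance modulus for a flat universe (omega_m + omega_lambda = 1)."""
        zs = np.linspace(0.0, z, n_steps)
        integrand = 1.0 / np.sqrt(omega_m * (1 + zs) ** 3 + omega_lambda)
        # trapezoid rule for the comoving-distance integral
        comoving = (C_KM_S / H0) * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(zs))
        return 5.0 * np.log10((1 + z) * comoving) + 25.0   # luminosity distance in Mpc

    rng = np.random.default_rng(0)
    redshifts = rng.uniform(0.2, 0.8, 10_000)   # 10,000 simulated supernovae
    scatter = 0.17                              # assumed magnitude scatter

    model = np.array([distance_modulus(z, 0.28, 0.72) for z in redshifts])
    observed = model + rng.normal(0.0, scatter, model.size)

    # An unbiased survey should show a mean residual near zero; one that keeps
    # only brighter events picks up a Malmquist-type shift instead.
    kept = observed < np.percentile(observed, 60)   # keep the brightest ~60%
    print(f"all events:    mean residual {np.mean(observed - model):+.4f} mag")
    print(f"after mag cut: mean residual {np.mean((observed - model)[kept]):+.4f} mag")

The first residual should sit near zero, while the cut sample drifts brightward -- the kind of systematic effect such simulations are designed to expose.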

"One thing we needed to establish about our model -- and did establish -- is that the mass of the universe couldn’t go negative," says Aldering.

A completely separate line of inquiry, but one essential to the Supernova Cosmology Project's search method, was to study the characteristics of type Ia supernovae. To make meaningful comparisons of nearby and distant type Ias -- in other words, to affirm their usefulness as standard candles -- the light measurements from the more distant supernovae, with larger redshifts, were compared with those of closer ones. The measurements were then altered slightly to examine the effects of dust along the line of sight and to test slightly different explosion scenarios. The resulting simulations were compared with the team's observations to make sure observation and theory agreed.
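
As a simplified illustration of the dust test, the sketch below reddens a toy template spectrum with increasing amounts of line-of-sight dust and reports how a crude broadband color shifts. The Gaussian "spectrum" and power-law extinction curve are stand-ins, not the team's actual supernova or dust models.

    # Toy dust test: redden a template spectrum and watch a broadband
    # color shift. Spectrum and extinction curve are illustrative stand-ins.
    import numpy as np

    wavelengths = np.linspace(3500.0, 7500.0, 400)                   # angstroms
    template = np.exp(-0.5 * ((wavelengths - 5000.0) / 900.0) ** 2)  # toy spectrum

    def apply_dust(flux, a_v):
        """Dim the spectrum with a simple power-law extinction curve (toy model)."""
        a_lambda = a_v * (5500.0 / wavelengths) ** 1.1   # more extinction in the blue
        return flux * 10.0 ** (-0.4 * a_lambda)

    def color(flux):
        """A crude blue-minus-visual color from two broad wavelength bins."""
        blue = np.mean(flux[(wavelengths >= 3900) & (wavelengths <= 4900)])
        visual = np.mean(flux[(wavelengths >= 5000) & (wavelengths <= 6000)])
        return -2.5 * np.log10(blue / visual)

    for a_v in (0.0, 0.1, 0.3):
        shift = color(apply_dust(template, a_v)) - color(template)
        print(f"A_V = {a_v:.1f}: color shift = {shift:+.3f} mag")

Unaccounted-for dust would redden a distant supernova in just this way while also dimming it; comparing simulated and observed colors is one way to bound the effect.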

Because the real measurements involved readings taken from 40 supernovae many times over a 60-day period, making the comparisons "is a task you only want to send to a supercomputer," says Berkeley Lab postdoctoral fellow Peter Nugent.

Nugent, who ran all of the project's simulations and analyses on the T3E, says the Cray supercomputer was also used to make sure that the error bars presented in the research were reasonable. In addition to chi-square fitting, the researchers employed bootstrap resampling of the data. First they plotted the mass density of the universe and the vacuum energy density based on data from all 40 supernovae. Then they resampled the data, drawing random sets from the 40 supernovae with replacement and finding and plotting the minimum value for each parameter. The resampling procedure was repeated tens of thousands of times as an independent check on the assigned error bars.
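
Here is a minimal sketch of that procedure, with a simulated 40-supernova catalog standing in for the real measurements. The grid ranges, scatter, and 200 resamples are illustrative (the team ran tens of thousands), and spatial curvature is neglected for simplicity.

    # Sketch of chi-square fitting plus bootstrap resampling: resample the
    # 40-supernova set with replacement, refit (omega_m, omega_lambda) on a
    # coarse grid each time, and read error bars off the spread of the minima.
    import numpy as np

    C_KM_S, H0 = 299792.458, 70.0

    def distance_modulus(z, omega_m, omega_lambda, n_steps=300):
        # curvature term in the expansion rate neglected for simplicity
        zs = np.linspace(0.0, z, n_steps)
        integrand = 1.0 / np.sqrt(omega_m * (1 + zs) ** 3 + omega_lambda)
        d_l = (1 + z) * (C_KM_S / H0) * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(zs))
        return 5.0 * np.log10(d_l) + 25.0

    rng = np.random.default_rng(1)
    z_obs = rng.uniform(0.1, 0.8, 40)   # stand-in for the 40 real supernovae
    sigma = 0.2                         # assumed magnitude uncertainty
    mu_obs = np.array([distance_modulus(z, 0.28, 0.72) for z in z_obs])
    mu_obs += rng.normal(0.0, sigma, 40)

    # Precompute model distance moduli on the parameter grid so each
    # bootstrap refit reduces to an indexed chi-square sum.
    om_grid = np.linspace(0.05, 1.0, 20)   # mass density
    ol_grid = np.linspace(0.0, 1.2, 25)    # vacuum energy density
    mu_model = np.array([[[distance_modulus(z, om, ol) for z in z_obs]
                          for ol in ol_grid] for om in om_grid])

    def best_fit(idx):
        """Grid chi-square minimum for one resampled set of supernovae."""
        chi2 = np.sum(((mu_obs[idx] - mu_model[:, :, idx]) / sigma) ** 2, axis=2)
        i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
        return om_grid[i], ol_grid[j]

    fits = np.array([best_fit(rng.integers(0, 40, 40)) for _ in range(200)])
    for name, col in (("mass density", fits[:, 0]), ("vacuum energy", fits[:, 1])):
        lo, hi = np.percentile(col, [16, 84])
        print(f"{name}: median {np.median(col):.2f}, 68% interval [{lo:.2f}, {hi:.2f}]")

If the spread of the bootstrap minima agrees with the chi-square contours, the assigned error bars pass the independent check described above.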

"Currently this work takes about an hour using 128 processors on the T3E," Nugent says. "It's wonderful to be able to run six or seven of these in just one day and then compare the results."

Those results include Science's designation of the research as revolutionary in its field. In addition, Supernova Cosmology Project team leader Saul Perlmutter was invited to address the recent supercomputing conference SC98, sponsored by the Institute of Electrical and Electronics Engineers, where he discussed the melding of cosmology and computational science at Berkeley Lab.