When Japan’s Earth Simulator debuted early in 2002, the scientific computing world got a wake-up call. Reaching 87 percent of its theoretical peak performance and running five times faster than the world’s next-fastest machine, the Earth Simulator was a reminder of what supercomputers designed to do science could achieve.
A government study concluded that one reason the U.S. had fallen behind was “the 1990s approach of building systems based on commercial off-the-shelf components.” While the sophisticated processors used in personal computers are some of the most advanced microdevices in existence, a PC’s applications don’t access memory the way scientific applications do. When the same kinds of processors are clustered together to do science, they fall far short of their potential.
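The gap is easy to see in miniature. The sketch below (an illustration written for this article, not taken from the study) times the same reduction twice in Python: once through memory touched in order, the cache-friendly pattern PC applications favor, and once through scattered, indirect indexing of the kind scientific codes often generate; on commodity hardware the scattered pass runs markedly slower.

```python
# Illustrative timing sketch: the same arithmetic, two memory-access
# patterns. Commodity processors are tuned for the contiguous pattern;
# the scattered (gather) pattern defeats the cache.
import time
import numpy as np

n = 10_000_000
data = np.random.rand(n)
patterns = {
    "contiguous": np.arange(n),             # unit-stride access
    "scattered": np.random.permutation(n),  # indirect, cache-hostile access
}

for name, idx in patterns.items():
    t0 = time.perf_counter()
    data[idx].sum()                         # identical work either way
    print(f"{name:10s}: {time.perf_counter() - t0:.3f} s")
```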
Researchers in Berkeley Lab’s computing divisions soon joined with counterparts at Argonne and the IBM Corporation to create a strategy for a new class of scientific computers. They came up with a conceptual design for a system that would enhance the architecture of IBM’s Power series of processors.
Not all scientific problems are susceptible to the same mathematical approach, nor do they have the same needs for memory access and organization. The team identified specific problems in IBM’s existing multiprocessor design, such as “memory contention,” which severely degrades the performance of processors that share an interface with main memory. They devised the Virtual Vector Architecture (ViVA), a way to create very-high-performance virtual vector processors from groups of individual processors in a node. IBM has incorporated the results in its new generation of microprocessors.
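As a rough conceptual model of the idea (an assumption made for illustration; the report does not describe IBM’s hardware design), the sketch below gangs a handful of ordinary processors into a single virtual vector unit by splitting one long vector operation into per-processor chunks:

```python
# Conceptual ViVA-style sketch: several scalar processors cooperate on
# one vector operation so the node behaves like a single vector unit.
# The thread pool stands in for the node's individual processors.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

N_PROCESSORS = 4  # processors ganged into one virtual vector unit

def virtual_vector_axpy(a, x, y):
    """Compute a*x + y, chunked across the node's processors."""
    chunks = np.array_split(np.arange(len(x)), N_PROCESSORS)
    out = np.empty_like(x)

    def one_processor(idx):            # each worker handles one slice
        out[idx] = a * x[idx] + y[idx]

    with ThreadPoolExecutor(max_workers=N_PROCESSORS) as pool:
        list(pool.map(one_processor, chunks))
    return out

x, y = np.random.rand(1_000_000), np.random.rand(1_000_000)
assert np.allclose(virtual_vector_axpy(2.5, x, y), 2.5 * x + y)
```

The real design, of course, lives in hardware; the point of the sketch is only the ganging of scalar units behind a single vector interface.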
“The high-performance systems of the future have to be balanced in many ways” — neither off-the-shelf cluster machines nor expensive special-purpose machines will do — “since the scientific applications of the future will combine many different methods,” the team concluded. Berkeley Lab’s computer scientists and users are leading the way.
About the Image
Some scientific questions can only be answered when data is visualized in three dimensions; with others, 3-D visualization helps researchers “see their data in a way they may not have seen it before,” says Cristina Siegerist of the Computational Research Division’s Visualization Group.
“Visualization begins when the researchers come to us with a box of data,” Siegerist says. “I ask them, how did you write it? If I can read it, I know how to employ one of our standard programs or how to write my own.”
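Once the writer’s format is known, such a reader can be only a few lines. The layout below, a particle count followed by packed x, y, z coordinates, is hypothetical, chosen just to show the step from a raw file to arrays that standard tools can consume:

```python
# Hypothetical reader for a raw binary particle dump: an int64 count,
# then float64 x, y, z triples. The format is invented for illustration.
import numpy as np

def read_particles(path):
    with open(path, "rb") as f:
        n = int(np.fromfile(f, dtype=np.int64, count=1)[0])
        xyz = np.fromfile(f, dtype=np.float64, count=3 * n)
    return xyz.reshape(n, 3)  # one row per particle

# round trip with synthetic data to show the reader at work
demo = np.random.rand(1000, 3)
with open("particles.bin", "wb") as f:
    np.array([len(demo)], dtype=np.int64).tofile(f)
    demo.tofile(f)
assert np.allclose(read_particles("particles.bin"), demo)
```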
For particle-accelerator simulations performed by Rob Ryne’s group in the Accelerator and Fusion Research Division, in collaboration with Andreas Adelmann of the Paul Scherrer Institute, Siegerist developed sophisticated electron-cloud visualizations using AVS/Express software; she also built the custom PartView tool, whose simple interface gives the researchers access from their desktops.
In this way Siegerist and Visgroup leader Wes Bethel, John Shalf, and their colleagues help NERSC users, especially those pursuing SciDAC (Scientific Discovery through Advanced Computing) and INCITE (Innovative and Novel Computational Impact on Theory and Experiment) projects, to see what they want to see in their data — and also what they may be surprised to find.
“Sometimes the surprises may be as simple as errors that need correcting,” Siegerist remarks, “like a bug in the code that assigns a particle the wrong charge.” But visualization can also lead to fresh insight.
In his studies of carotenoids in photosynthesis (shown at left), William Lester of the Chemical Sciences Division used Quantum Monte Carlo calculations, which employ “random walkers” — in this case, 314 walkers, representing electrons — to feel out quantum-mechanical energies and wave functions. “Watching the walkers follow their trajectories was revealing, giving new perceptions about the molecules’ electronic structures,” Siegerist says.
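To make the random-walker idea concrete, here is a toy diffusion Monte Carlo sketch for a one-dimensional harmonic oscillator, an illustration of the technique only, not the carotenoid calculation itself, though it keeps the population of 314 walkers mentioned above:

```python
# Toy diffusion Monte Carlo: walkers diffuse at random, and a weighted
# birth/death step steers the population toward the ground state. For
# this 1-D harmonic oscillator the exact ground-state energy is 0.5.
import numpy as np

rng = np.random.default_rng(0)
dt, n_target = 0.01, 314              # 314 walkers, as in the text
walkers = rng.normal(size=n_target)   # initial walker positions
e_ref = 0.0                           # running estimate of the energy

for step in range(5000):
    # diffusion: each walker takes a random step
    walkers = walkers + rng.normal(scale=np.sqrt(dt), size=walkers.size)
    # branching: low-potential walkers multiply, high-potential ones die
    v = 0.5 * walkers**2
    copies = (np.exp(-(v - e_ref) * dt) + rng.random(walkers.size)).astype(int)
    walkers = np.repeat(walkers, copies)
    # feedback on the reference energy keeps the population near n_target
    e_ref += 0.1 * np.log(n_target / max(walkers.size, 1))

print(f"estimated ground-state energy: {e_ref:.2f} (exact: 0.5)")
```

The reference energy settles where births balance deaths, which is exactly the ground-state energy the walkers are feeling out.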
Siegerist’s background in physics and computer science, and her wide experience in academic and industrial settings in many countries, are typical of the broad reach and flexible approach the Visualization Group brings to its myriad challenges.