BERKELEY, CA -- As computers have become more powerful and more accessible, their role in scientific research has grown accordingly. Today, DOE researchers are using some of the world's most powerful computing resources to unlock new secrets in science. With computers capable of performing trillions of calculations per second, scientists are on the threshold of another age of discovery. The results of these efforts are expected to return large dividends for our nation's economy, environment, safety and health.
The National Energy Research Scientific Computing Center's procurement of an IBM RS/6000 SP system as the center's next-generation supercomputer was based on its ability to handle actual scientific codes and to pass tests designed to ensure its reliability as a full-production computing system at NERSC. When fully installed, the IBM system will provide four to five times the computational power currently available at NERSC, already one of the most powerful supercomputing sites in the world. The entire system will have 2,048 processors dedicated to large-scale scientific computing and a peak performance capability of more than 3 teraflops, or 3 trillion calculations per second.
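As a rough illustration of the figures above (the per-processor rate is a back-of-envelope inference from the quoted totals, not a published specification), the 3-teraflop peak spread across 2,048 processors implies each processor contributes on the order of 1.5 gigaflops:

```python
# Back-of-envelope check of the quoted system figures.
# Assumption: peak performance is spread evenly across processors.
processors = 2048
peak_flops = 3e12  # 3 teraflops = 3 trillion calculations per second

per_processor = peak_flops / processors
print(f"{per_processor / 1e9:.2f} gigaflops per processor")  # prints "1.46 gigaflops per processor"
```

Sustained performance on real scientific codes is typically a fraction of this peak figure.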
This resource will be allocated to DOE-sponsored scientists carrying out unclassified research on some of the most challenging scientific problems of our time. Here are examples of some of the areas of research expected to utilize NERSC's scientific computing resources.
Answering the Burning Questions of Combustion
The interiors of internal combustion engines and power plant boilers are hot, dirty and dangerous environments for conducting experiments. Unlocking the mysteries of combustion through numerical simulations requires a multidisciplinary approach that involves specialists in fields such as computational fluid dynamics and chemical kinetics. These areas are well suited to computer simulation.
With accurate combustion modeling, scientists can study the entire process and make incremental changes in the model system to test new ideas. The alternative is to actually build and modify engine components for testing, a time-consuming and more expensive approach. The same simulation tools developed for studying internal combustion engines can also be adapted for analyzing and improving the performance of industrial boilers, such as those used to generate electricity.
Scaling up to Better Global Climate Modeling
Judging by the reliability of daily weather forecasts, figuring out what the weather's going to be like for the next few days or a week can be a real challenge. And if it's that difficult to get a handle on current weather conditions, the task facing scientists trying to analyze the Earth's climate over years or decades is one of much greater dimensions.
Because energy and environmental policies have far-reaching effects, it's essential that the underlying information be as accurate and reliable as possible. To reduce the uncertainty in current models of long-term climate change, and to provide accurate regional information as part of the bigger picture, the United States needs to accelerate its work in climate change analysis and simulation. The DOE's Accelerated Climate Prediction Initiative aims to do this by speeding up development of simulation models, improving decade-to-century model-based projections, and increasing the availability of these models to the broader research community.
Making progress in these areas depends on advances in scientific knowledge (which depend, in part, on knowledge generated by models) and on the capabilities of computing infrastructure to process large amounts of data quickly. Advancing computer technology is essential to developing better climate models.
With the availability of faster, more powerful computers, such climate models will run more quickly, and will also be able to include more detail for regional modeling. The final outcome will be the wide availability of data and scientifically credible climate projections in usable formats. This will benefit the scientific community, policymakers, industry, and households seeking to understand climate change and its impacts.
Computing Challenges in High Energy and Nuclear Physics
The field of High Energy and Nuclear Physics (HENP) will soon be acquiring more experimental data, and with it more chances to observe new phenomena and make new discoveries. DOE projects nearing completion include the BaBar detector at the Stanford Linear Accelerator Center B-Factory, new experiments at Brookhaven National Laboratory's Relativistic Heavy-Ion Collider (RHIC), and upgraded experiments at the Fermilab Tevatron and at Jefferson Lab.
Answers to fundamental questions about the nature of the universe will not simply leap out from these laboratories. They will be subtly hidden in the raw data. Large-scale numerical simulations of theoretical models are needed to compare data with theory, to test the Standard Model, and to explore exciting new physics beyond it. These simulations will require computing power at the teraflops scale to allow theorists to discover new possibilities.
These experiments, as well as U.S. participation in the Large Hadron Collider at CERN in Geneva, have the opportunity to increase knowledge of the physical world in unprecedented ways, through unprecedented amounts of data. The volume of data, however, demands new computer science tools for data management at previously unheard-of scales. Beyond the upcoming generation of experiments, other facilities will be needed, and more powerful computers will enhance accelerator modeling, helping to design the compact, high-energy accelerators of the future. Components could be designed with significantly improved performance and cost effectiveness.
Computational Biology: Deciphering the Secrets of Life
The extraordinary advances in the biological sciences, begun after World War II with the infusion of physics and chemistry into microbiology and biochemistry (which led to the technology known as molecular biology), have accelerated in the past decade, due in large part to the pace of discoveries coming from the genome projects: the Human Genome, model organisms, and notably, microbial genomes. Along with the maturation of the molecular biosciences has come an equally dramatic explosion in computer and information science and technology. As a consequence, there is exceptional synergism to be gained from exploiting these twin scientific revolutions. Because living systems are characterized by complexity, biology is awash in large amounts of complex data. In contrast to other fields, these data are the unique information on which theory and simulation must be based, making computer technologies ideal and essential tools for the biology of the 21st century.
The Human Genome Program was undertaken, among other goals, to understand susceptibility to diseases, radiation and toxic substances in the environment. The results will provide a basis for future medical practice as well as advance fundamental biological understanding.
New algorithms and techniques for calculating molecular energies are the subject of intense research at this time, and their application using multi-teraflops computers will have a major impact on structural biology in the context of drug development and environmental effects.
Advanced computational tools will allow the genome project to bear fruit decades earlier in terms of individualized medical diagnosis and therapy, targeted drug design and development. Besides improving human productivity through enhanced public health, structural genomics will provide the basis for U.S. competitiveness in biotechnology.
Computational Materials Science: A Scientific Revolution Waiting to Materialize
From the Bronze Age to the silicon-driven Information Age, civilization has defined itself, and advanced itself, by mastering new materials. Advances in materials drive economic, social and scientific progress and profoundly shape our everyday lives. Today, thanks to increasingly powerful computers, the materials science community finds itself on the verge of another revolution. In this new era, extensive computational modeling will complement and sometimes even replace traditional methods of trial-and-error experimentation. With simulation, scientists will guide advanced materials development and will comprehend how materials form, how they react under changing conditions and how they can be optimized for better performance.
More powerful magnets will improve the performance of electric motors. Light-weight alloys will enhance the fuel economy of automobiles without compromising passenger safety. Advances in semiconductor materials will ensure that the rapid growth in computer power will extend well into the next millennium. The potential for computational science to accelerate materials development is enormous; the results will sustain our global economic competitiveness, keeping the United States preeminent in this strategic field.
Until recently, our knowledge of materials arose mainly from trial-and-error techniques. Only with information about atomic and molecular structures have scientists been able to comprehend materials at the most elemental level. Today, extensive computer modeling capabilities can complement and accelerate laboratory development. Computer simulation tools expected to be available in the near future could substantially reduce the amount of time required to take a new material from synthesis to product, a process that currently takes a minimum of 10 years, and as long as 25 years. In the United States economy, this time lag to market is generally the principal barrier to new materials development.
Computational Advances in Fusion Energy Research
A major part of the DOE mission is "to foster a secure and reliable energy system that is environmentally and economically sustainable." Recognizing the potential of fusion energy, the President's Committee of Advisors on Science and Technology (PCAST) recommended strengthening fusion research as a key component of the nation's long-term strategy for national energy security and climate change remediation. Better computer models could actually reduce the number of full-scale experimental facilities needed.
Predicting the properties of the plasma systems at the heart of fusion research requires scientists to integrate many complex physics phenomena, which cannot be determined solely from experiments. DOE will begin operating the National Spherical Torus Experiment magnetic fusion device in 1999 and, in 2003, the National Ignition Facility for conducting inertial confinement fusion. A greatly enhanced modeling effort, benchmarked against experimental results, will foster rapid, cost-effective exploration and assessment of alternate approaches in both types of fusion research and will be the catalyst for a rapid cycle of innovation and scientific understanding.
NERSC's predecessor, the Controlled Thermonuclear Research Computer Center, was established in 1974 to provide advanced computation and modeling capabilities for fusion energy research. Although NERSC's mission has since been broadened, fusion remains a primary focus. Recently, researchers at the Princeton Plasma Physics Laboratory utilized the full power of the half-teraflop Cray T3E supercomputer at NERSC to produce fully three-dimensional, general-geometry, nonlinear particle simulations of turbulence suppression by sheared flows. These results, which provide valuable new physics insights and correlate well with key experimental trends, are summarized in the Sept. 18, 1998, issue of Science magazine. The associated calculations, which typically utilized 400 million particles for 5,000 time-steps, would not have been possible without access to powerful present-generation massively parallel processor computers.
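A back-of-envelope sketch gives a feel for the scale of the simulations described above. The particle and time-step counts come from the article; the cost per particle update is a purely hypothetical illustration, and real codes sustain only a fraction of a machine's peak rate:

```python
# Rough scale of the turbulence simulation described above.
# Figures from the article: 400 million particles, 5,000 time-steps.
# Assumption: flops_per_particle_step is a hypothetical illustration,
# not a figure from the article.
particles = 400_000_000
time_steps = 5_000
flops_per_particle_step = 100  # hypothetical cost per particle update

particle_steps = particles * time_steps            # 2 trillion particle updates
total_flops = particle_steps * flops_per_particle_step

# Ideal runtime at the Cray T3E's half-teraflop peak (5e11 flops/s);
# actual runs take far longer, since sustained rates fall well below peak.
t3e_peak = 0.5e12
seconds = total_flops / t3e_peak
print(f"{particle_steps:.0e} particle updates, ideal lower bound ~{seconds:.0f} s at peak")
```

Even under these optimistic assumptions the calculation involves trillions of particle updates, which is why the article notes it required a massively parallel machine.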
Nevertheless, important additional physics features must be included in these as well as other models to produce realistic simulations of plasmas relevant to key applications areas such as fusion power generation.
For more information about NERSC and research conducted at the center, visit the website at http://www.nersc.gov.