Supercomputing: Berkeley's Beehive Of Science

by Jeffery Kahn

The whole is greater than the sum of its parts. Centuries after the effect was first observed, Buckminster Fuller realized the power of this idea, popularizing the concept of synergy. More and more, the science that emerges from Berkeley Lab is a product of this phenomenon.

Through original design and accident of history, the Laboratory has over the years become a rich environment for scientific interaction, developing teams of researchers across a broad spectrum of disciplines and evolving into what is known today as a "multiprogram" laboratory. But that description does not fully credit the dynamic assembly of minds and forces that make up Berkeley Lab. Virtually a United Nations of science, the Lab draws researchers from all over the world to share their expertise and to work at its internationally acclaimed user facilities. The collective body includes cosmologists and earth scientists, particle physicists and materials scientists, chemists and engineers, biologists and computer scientists. Basic science, applied science, theory, and experiment all take place in a realm catalytically connected to the University of California, Berkeley, and the high-tech industries blossoming in the surrounding San Francisco Bay Area.

Now, a supercomputing center will be added to the mix, further stimulating the synergistic forces already in play at Berkeley. In November 1995, the Department of Energy selected Berkeley Lab as the site for its High-Performance Computing Access Center.

The supercomputers coming to Berkeley can be considered a form of time machine. Like eyes that can look into the future, they make it possible to pose questions and resolve problems years before doing so would be feasible by any other approach. How can the internal combustion engine be made both more efficient and less polluting? What changes will occur to the Earth's climate? What shape of molecule should be engineered to perform a targeted biological function? Supercomputer centers attack questions like these, at the frontiers of science. And, as they do so, they pioneer the future of computing itself: hardware, software, and networks destined to be increasingly important to our daily lives.

Bill McCurdy, Berkeley Lab's new associate laboratory director for Computing Sciences, believes supercomputers have changed the very nature of science. The traditional interplay between theory and experiment has now been joined by a new mode of inquiry, the computational experiment.

"Over the past quarter-century," observes McCurdy, "a fundamental change has occurred in the way scientists and engineers view computation as a tool of research. In the 1960s, computation was a specialized tool whose application was largely limited to a few disciplines of physics, engineering, and chemistry, and which was widely considered to be merely an adjunct of theory. After a quarter-century of spectacular advances in computing hardware and numerical algorithms, we now commonly speak of experiment, theory, and computation as the three principal elements of modern scientific research. The change in our thinking is dramatically highlighted by discussions of large-scale computational experiments appearing in the scientific literature, side-by-side with the results of physical experiments."

In many cases, computer simulation or modeling is the only approach available to researchers. Physical experiments may not be possible because they are prohibitively large or small, unfold too quickly or too slowly, or because they cost too much. Using supercomputers, mathematical models of physical phenomena can be created that combine the known with the unknown to simulate and explore what can be otherwise off-limits.

Berkeley Lab's High-Performance Computing Access Center is devoted to the service of energy researchers, scientists with Energy Department grants who are involved in a broad range of energy research programs. The center is a mix of people, machines, and networks.

For starters, it includes the National Energy Research Supercomputer Center (NERSC), which is being moved to Berkeley from Lawrence Livermore National Laboratory. The supercomputers at the center are being upgraded to include a mix of new machines along with the existing Cray C-90, whose 16 central processing units each have a peak speed of one billion arithmetic operations per second. After a year, NERSC also expects to have a 512-processor Cray T3E, in which each processor has a peak speed of 600 million arithmetic operations per second; four Cray J90 shared-memory multiprocessors with 24 processors each; and more than two and a quarter terabytes of disk space. Currently, NERSC's computers are used by thousands of researchers nationwide. The range of energy research problems they work on includes global change; materials, combustion, and molecular modeling; quantum chromodynamics; particle accelerator design; structural biology; and fusion.
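A back-of-the-envelope calculation, sketched here in Python and using only the per-processor figures quoted above, gives a sense of the scale of the upgrade (these are theoretical peak rates, not sustained performance):

    # Rough aggregate peak speeds computed from the per-processor figures above.
    c90_peak = 16 * 1.0e9        # 16 CPUs, each ~1 billion operations per second
    t3e_peak = 512 * 600.0e6     # 512 processors, each 600 million operations per second

    print(c90_peak / 1e9)        # ~16 billion operations per second
    print(t3e_peak / 1e9)        # ~307 billion operations per second
    print(t3e_peak / c90_peak)   # the T3E's peak is roughly 19 times the C-90's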

The Department of Energy's major high-speed network, the Energy Sciences Network or ESnet, is a vital adjunct of the center. Like a virtual office hallway, ESnet connects users all over the world to Berkeley's supercomputers as well as to a number of other unique Energy Department facilities. Because of its need to move huge streams of information, ESnet is a prime shaper of the future face of the Internet. ESnet is the first national production network to make use of the new Asynchronous Transfer Mode (ATM) technology, which can transmit both voice and data. Some legs of ESnet's ATM network run at speeds of up to 155 million bits per second, compared with the roughly 28,000-bit-per-second modems now coming into use on PCs.
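Dividing the two figures quoted above shows the size of that gap (again a minimal Python sketch; both numbers are nominal link speeds, not measured throughput):

    # How much faster ESnet's 155-million-bit-per-second ATM legs are than a
    # roughly 28,000-bit-per-second PC modem, using the figures quoted above.
    atm_bps = 155000000
    modem_bps = 28000

    print(atm_bps / modem_bps)   # roughly 5,500 times faster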

Supercomputers, like PCs, run software. Unfortunately, Microsoft does not make a program that can persuade the Cray to numerically model the Earth's climate. Because researchers cannot go to the store and buy shrink-wrapped packages of programs, these applications must be custom-crafted. That's the job of the users of NERSC and of the Center for Computational Sciences and Engineering. The center's team of about 15 scientists has expertise in a range of scientific disciplines as well as in computing. Working closely with some of the research groups that use the center, they collaborate to produce not just effective applications but what are really new approaches to scientific problems. Berkeley Lab's expertise in computer science will be integral to the supercomputing center. Computer scientists will be responsible for the continued evolution of the "architecture" of the center, making sure that the interaction of hardware, software, and network is not only optimal for the present but also anticipates and promotes the future directions of computing.

Currently, the nation has some 20 different supercomputer centers. All of them perform vital roles, but several aspects distinguish the Berkeley center.

"If we are going to maintain high performance computing as a competency of this country," says McCurdy, "then we need every last one of these centers. There is enough diversity of use, there is enough demand, there is enough need for this kind of synergy between scientific computing and science to keep every one of these supercomputing centers busy."

NERSC, which is expected to complete its move from Livermore to Berkeley by April, prides itself on a tradition of service, provided 24 hours a day, 365 days a year. "The entire ethic of the operation is focused on service, on the needs of our users," says McCurdy. "We have a very close partnership with the scientists who use the center. Our consulting staff is exceptional in terms of their experience and knowledge. They work with users, resolving everything from small matters, say, code that runs one day and not the next, to more profound issues like figuring out why a particular parallelization doesn't work effectively on a given machine."

Moving to Berkeley will make the supercomputing center unique in another fashion. Says McCurdy, "It is true that computer networks connect people all over the world to us. But I've learned that networks are no substitute for working with people directly. To get people to collaborate with you, they must be physically close enough that you can share coffee with them. There are communities here in high energy physics, materials sciences, chemical sciences, and life sciences. Here, we are in a place where the level of scientific collaboration between the center and the disciplines it serves will be to an extent not equaled by any other supercomputer center."

Supercomputers alone (terabyte data sets accessed from a petabyte archive at gigabyte speeds) cannot manufacture great science. McCurdy says something else accounts for great science, something intangible that was best captured in an essay by the biologist and author Lewis Thomas. Thomas wrote that science cannot be made to produce brilliant new ideas in a businesslike, predictable fashion. It doesn't work that way. The scientific enterprise, wrote Thomas, is more like a beehive: "If you want a bee to make honey, you do not issue protocols on solar navigation or carbohydrate chemistry. (Rather) you put him together with other bees and you do what you can to arrange the general environment around the hive. If the air is right, the science will come in its own season, like pure honey."

This is exactly the way science operates, and it is the key to the future contributions to be made by the new High-Performance Computing Access Center. Says McCurdy, "The atmosphere must be right. Here in Berkeley, the air is right."