There is an old axiom that big problems are easier to tackle when divided into smaller tasks. Scientists in the Lab's Center for Computational Sciences and Engineering are solving problems in fluid dynamics by doing just that: using computer modeling algorithms to break large, complex problems into smaller pieces in order to focus computing power on the areas of greatest scientific interest.


Even for supercomputers, some phenomena are too complex to be modeled at uniform resolution. Using a technique called adaptive mesh refinement, researchers divide the region being studied into cells, covering areas of particular interest with a mesh of thousands of segments, each of which can then be analyzed individually in detail.


Using a technique called adaptive mesh refinement, researchers in applied mathematics can make the most of existing computer resources while solving bigger and more complicated problems than conventional methods allow. As a result, they are gaining new insights into such complicated processes as internal combustion, airplane flight, and weather prediction.

To accurately model the performance of an airplane in flight, for example, one must include a large region of air around the airplane (as in a wind tunnel). Adaptive mesh refinement allows scientists to focus on the details of the turbulent airflow around the wings without having to spend a large percentage of the available resources describing the relatively smooth flow in the much larger region of space away from the wings.


In the microseconds after an explosion, the most interesting scientific features are at the edge of the expanding materials. Adaptive mesh refinement (AMR) automatically tracks this area (shown by the yellow grids above, left and center). A 3D image of the explosion is at the far right.
The system works by covering the region of space being studied with a "mesh," which divides the region into individual segments, and then looks at each segment to determine its importance to the particular problem being addressed. Specific areas of interest are then covered with a finer mesh to allow scientists to gain even more detailed and more accurate information about the most important parts of the problem. If the region of interest moves over time, the fine mesh must move to follow it; in other words, the mesh must "adapt" to the solution.
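The tag-and-refine cycle described above can be sketched in a few lines of Python. This is an illustrative one-dimensional example, not the Center's actual code: cells where the solution changes rapidly are tagged and split, while smooth regions keep their coarser spacing.

```python
import numpy as np

def refine_where_interesting(x, values, threshold):
    """One level of adaptive refinement on a 1D mesh (illustrative sketch).

    Cells whose jump in `values` exceeds `threshold` are split in two;
    smooth regions keep their original, coarser spacing.
    """
    new_x = []
    for i in range(len(x) - 1):
        new_x.append(x[i])
        # Tag the cell for refinement if the solution changes rapidly here.
        if abs(values[i + 1] - values[i]) > threshold:
            new_x.append(0.5 * (x[i] + x[i + 1]))  # insert a midpoint
    new_x.append(x[-1])
    return np.array(new_x)

# A steep front near x = 0.5 gets extra resolution; the rest stays coarse.
x = np.linspace(0.0, 1.0, 11)
u = np.tanh((x - 0.5) / 0.05)
fine = refine_where_interesting(x, u, threshold=0.2)
```

If the front moves, the same test is simply rerun on the new solution, so the fine cells follow it: the mesh "adapts."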

Imagine wanting to accurately measure the temperature in an auditorium over the course of an afternoon. Taking temperature readings at eight locations in the auditorium every hour would be much less reliable than taking hundreds of measurements every minute. But taking more measurements takes not only more total effort, but also more storage space to record the measurements. This corresponds, in the computer modeling world, to more compute time and more computer memory usage. However, if scientists knew that the temperature was relatively constant except in one particular region of the room, they could focus their efforts on that one region, and take fewer measurements in the rest of the room. Using this type of local mesh refinement, scientists are able to focus their existing computer power on a narrower part of the overall problem, so as to get the most information, given a limited amount of computer time and memory.
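The arithmetic behind the auditorium analogy is easy to spell out. The numbers below are invented purely for illustration, but they show why refining only the hot spot is so much cheaper than refining everywhere:

```python
# Uniform fine sampling everywhere vs. refining only one hot spot
# (hypothetical numbers, echoing the auditorium analogy).
room_cells = 100       # coarse cells covering the whole room
refine_factor = 10     # each refined cell is split into 10 finer cells
hot_cells = 5          # coarse cells near the one interesting region

uniform_cost = room_cells * refine_factor               # refine everything
adaptive_cost = room_cells + hot_cells * refine_factor  # refine the hot spot
```

Here the uniform approach needs 1,000 fine cells, while the adaptive one needs only 150 cells in total, with the same resolution where it matters.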

Lab computer scientists have taken AMR to a new level by writing programs to run on distributed memory supercomputers such as the Cray T3E.

For example, a typical computer modeling program may provide a "big picture" image of how a diesel engine operates. This turbulent process, a strong interplay of chemistry and fluid dynamics, is complex and not fully understood even by the experts who design the engines. Better computer models will allow them to "see" inside the engine and gain a better understanding of the process. But researchers studying ways to make diesel combustion more efficient and less polluting want to look primarily at the point where fuel is burning. And not only do they want to look more closely at a particular location, they also want to take much smaller computational time steps in the regions where things change most quickly.
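Taking smaller time steps only where things change quickly is often called time subcycling: each finer level takes several smaller steps for every one step of the level above it. The recursion below is a minimal sketch of that bookkeeping, with the levels and ratio chosen for illustration; it is not the Center's actual scheme.

```python
def advance(level, t, dt, ratio, max_level, log):
    """Time subcycling across refinement levels (simplified sketch).

    Each level takes one step of size dt; then the next finer level takes
    `ratio` steps of size dt/ratio covering the same time interval.
    `log` records every (level, start, end) step taken, in order.
    """
    log.append((level, t, t + dt))  # one step on this level
    if level < max_level:
        fine_dt = dt / ratio
        for k in range(ratio):
            advance(level + 1, t + k * fine_dt, fine_dt, ratio, max_level, log)

log = []
advance(0, 0.0, 1.0, ratio=2, max_level=2, log=log)
# With ratio 2 and two finer levels: level 0 takes 1 step,
# level 1 takes 2 steps, and level 2 takes 4 steps.
```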

Mathematicians John Bell and Phil Colella are leading Lab efforts to apply this capability to real scientific puzzles. Creating workable algorithms to achieve this requires solving both mathematical and computer science problems. Mathematically, in designing the algorithms one has to make sure that they respect the physical laws they approximate. For example, certain quantities in nature are conserved. In the case of weather modeling, the amount of moisture in a cloud should stay the same over time, unless that moisture leaves the cloud by a physical process, such as rain. Making sure that the cloud has the same amount of moisture as it passes between coarse and fine meshes requires that the mathematical equations be written to maintain the balance and that the algorithms respect that property of the equations.
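The conservation requirement can be checked directly in the simplest case: replacing one coarse cell with several fine cells. This is an illustrative check, not the weather model's equations; as long as the fine values have the same average as the coarse value, the integral (the total "moisture") is exactly unchanged.

```python
import numpy as np

def total_amount(dx, values):
    """Integral of a conserved quantity (e.g., moisture) over a 1D mesh."""
    return float(np.sum(dx * values))

# Replace one coarse cell with four fine cells. A piecewise-constant
# reconstruction (or any reconstruction with the same average) leaves
# the integral -- the total amount -- exactly the same.
coarse_dx, coarse_value = 1.0, 3.0
fine_dx = coarse_dx / 4
fine_values = np.full(4, coarse_value)

before = total_amount(coarse_dx, np.array([coarse_value]))
after = total_amount(fine_dx, fine_values)
```

Real AMR codes must enforce the same property at every coarse-fine boundary and at every time step, which is where much of the mathematical care goes.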

Making these algorithms work right for adaptive meshes requires computer science as well as mathematical expertise. Keeping track of the data on the different meshes, and across the interfaces between coarse and fine meshes, is much more complex than keeping track of data on a single uniform mesh. New data structures are needed to store the data, and efficient algorithms are needed to effectively allocate appropriate portions of the computer architecture to do the job.
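One common way to organize such data is a hierarchy of rectangular patches, where each level is refined by a fixed ratio over the level below. The classes here are a hypothetical sketch of that bookkeeping, not the Center's actual data structures:

```python
from dataclasses import dataclass, field

@dataclass
class Patch:
    """A rectangular block of cells at one refinement level (illustrative).

    `lo` and `hi` are inclusive integer cell indices in that level's own
    index space.
    """
    level: int
    lo: tuple
    hi: tuple

@dataclass
class Hierarchy:
    """A nested set of levels, each holding a list of patches."""
    ratio: int = 2
    levels: list = field(default_factory=list)

    def add_patch(self, patch):
        while len(self.levels) <= patch.level:
            self.levels.append([])
        self.levels[patch.level].append(patch)

    def coarse_footprint(self, patch):
        """Index range the patch occupies on the coarsest level (level 0),
        used when transferring data across coarse-fine interfaces."""
        r = self.ratio ** patch.level
        return tuple(i // r for i in patch.lo), tuple(i // r for i in patch.hi)
```

Mapping fine-level indices back to coarse-level indices, as `coarse_footprint` does, is exactly the kind of interface bookkeeping that a single uniform mesh never needs.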

Fortunately, developing programs to tackle new problems is made easier by the library of algorithms already developed in the Center. In many cases, programs that solve specific pieces of a problem can be woven together into new programs that solve new, more complex problems, all with adaptive mesh refinement. The result is a powerful computing tool that lets scientists squeeze much better performance out of their computers. In some cases, adaptive mesh refinement allows researchers to arrive at answers five years ahead of others using conventional computing tools, and to solve problems that would otherwise not be solvable today.

- Jon Bashor