March 2, 2000

Berkeley Lab Science Beat


Keeping a transmission electron microscope aimed and focused over the Internet used to be considered impossible. Even on a good day the net encounters delays, while in just a fraction of a second a foil-thin sample, magnified to atomic scale and heated by an electron beam, can buckle and jump, requiring constant dial-fiddling by the operator.

On the other hand, astronomical telescopes have long been equipped to track any patch of sky automatically, and there's no problem keeping a telescope focused if the atmosphere is stable. Even if weather puts a telescope in one place out of business, another can do the job -- as long as it can be reached. There's the rub: what's needed to make remote astronomy practical is a common language for communicating with all the robotic telescopes scattered around the world, each with its own operating system.

To solve these very different kinds of problems, Berkeley Lab researchers have led the way in creating systems that can talk to different kinds of instruments and link remote users interactively, allowing them to do experiments and make observations online.

For microscopy, an Internet channel dubbed DeepView is poised to revolutionize collaborative experiments. In astronomy, after extensive experience with robotic telescopes, researchers based at the Lab are leading the effort to write RTML -- advanced Robotic Telescope Markup Language -- with the help of a wide range of researchers, telescope makers and operators, and educators.

DeepView remote microscopy


Berkeley Lab researchers first demonstrated remote, in-situ electron microscopy in August 1995, when, from the annual meeting of the Microscopy Society of America in Kansas City, they conducted a dynamic experiment with the 1.5-million-volt High Voltage Electron Microscope at the National Center for Electron Microscopy (NCEM), 1,500 miles away.

Their secret was to augment the remote experimenter's intentions with an automated controller near the microscope to track changes in the sample environment and make quick-as-a-wink interventions.

"What we were able to do then was to allow a single user to control the instrumentation from a remote location," says Bahram Parvin of the Information and Computing Sciences Division (ICSD), who with Mike O'Keefe of NCEM led the team that developed the revolutionary system. But last November at SC99, the high-performance computing conference in Portland, Parvin and O'Keefe, together with John Taylor of ICSD, demonstrated a new level of sophistication in remote microscopy -- a "microscopy channel" that can link numerous remote participants with half a dozen microscopes of widely differing characteristics. The growing set of instruments includes not only transmission electron microscopes but scanning electron microscopes and optical microscopes as well -- and, says Parvin, "Soon these will be augmented with the most sophisticated computational tools of microscopy."

They call the new program DeepView. To realize economies of scale, take advantage of standardization, and reduce maintenance costs, the new system -- based on the Common Object Request Broker Architecture (CORBA) developed by the Object Management Group consortium -- uses off-the-shelf technology to support a unique set of services for doing collaborative science online.

"DeepView's instrument services provide access for control of microscopes and are scalable to the point that different instruments can be plugged into the framework and accessed using a common interface," says Parvin. These instrument services were implemented by Gerald Fontenay of ICSD. Exchange services provide tools for trading information among experimenters at different locations, and computation services, developed by Parvin and ICSD's Ce Cong, provide for analysis such as image segmentation and 3-D reconstruction.
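The plug-in instrument model Parvin describes can be illustrated with a short sketch. The real system exposes instruments through CORBA interfaces; the Python class and method names below are hypothetical, invented only to show how very different microscopes could sit behind one common interface.

```python
from abc import ABC, abstractmethod

class Instrument(ABC):
    """Common interface: any microscope plugged into the framework
    implements these operations, so remote clients need not know
    whether they are driving a TEM, an SEM, or an optical scope."""

    @abstractmethod
    def set_stage(self, x, y, z): ...

    @abstractmethod
    def set_focus(self, value): ...

    @abstractmethod
    def acquire_image(self): ...

class TransmissionEM(Instrument):
    """One concrete instrument behind the common interface."""
    def __init__(self):
        self.stage = (0.0, 0.0, 0.0)
        self.focus = 0.0

    def set_stage(self, x, y, z):
        self.stage = (x, y, z)

    def set_focus(self, value):
        self.focus = value

    def acquire_image(self):
        # Stand-in for a detector readout.
        return [[0.0] * 4 for _ in range(4)]

def remote_session(instrument: Instrument):
    # A remote client scripts the same calls against any instrument.
    instrument.set_stage(1.0, 2.0, 0.0)
    instrument.set_focus(0.5)
    return instrument.acquire_image()
```

Because clients depend only on the abstract interface, a new instrument joins the framework by implementing those three operations, with no change to remote-user code.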

DeepView also integrates novel analytical tools. One that plugs into DeepView just as if it were a physical instrument is an image-simulation program for examining atomic-scale structures, based on a technique developed by Mike O'Keefe to achieve high resolution with medium-voltage microscopes.

"Electron microscope image components only have the same phase out to the microscope's resolution. But we wanted to get beyond the resolution, to use image components that have their phases scrambled," O'Keefe says.

To interpret beyond the microscope resolution, electron microscopists traditionally compare focal series of these scrambled images with series simulated from model specimens. This way images from a real specimen can be tested against a hypothetical model out to the microscope's information limit -- the limit to which it produces phase-scrambled information -- which may lie well beyond its traditionally defined, nominal resolution.

"Now we can use focal-series reconstruction software to combine information from many images," O'Keefe says. "Then a single image with resolution approaching the information limit can be achieved, with all the transferred information in phase."
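The kind of combination O'Keefe describes can be caricatured in a few lines. This is a heavily simplified, dimensionless sketch of a Wiener-style focal-series combination, not the Lab's actual reconstruction software: the transfer function below omits spherical aberration and damping envelopes, and every constant is illustrative only.

```python
import numpy as np

def ctf(k2, defocus):
    # Simplified phase-contrast transfer function sin(chi) with a
    # dimensionless reduced defocus; a real CTF also includes
    # spherical-aberration and envelope terms.
    return np.sin(np.pi * defocus * k2)

def focal_series_restore(images, defoci, eps=0.1):
    """Combine a defocus series in Fourier space, weighting each image
    by its transfer function, so information that arrives with its
    phase scrambled at one defocus is put back in phase in the result."""
    n = images[0].shape[0]
    k = np.fft.fftfreq(n)
    k2 = k[:, None] ** 2 + k[None, :] ** 2  # spatial frequency squared
    num = np.zeros((n, n), dtype=complex)
    den = np.zeros((n, n))
    for img, df in zip(images, defoci):
        t = ctf(k2, df)
        num += t * np.fft.fft2(img)
        den += t ** 2
    # eps regularizes frequencies no image transferred well.
    return np.real(np.fft.ifft2(num / (den + eps)))
```

The point of the sketch is only the structure of the idea: each member of the focal series transfers a different set of spatial frequencies well, and the weighted sum recovers a single image with all transferred information in phase.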

NCEM's One-Angstrom Microscope project already provides 1.7-angstrom resolution on its CM300 microscope to DeepView users, but "using image-processing software with the CM300, we can achieve 0.89-angstrom resolution," says O'Keefe. "Soon, we'll be able to control this reconstruction software through DeepView. Then we'll be able to offer the microscopy community sub-angstrom resolution at their personal terminals."

Image simulation software is also widely used to model anticipated atomic structures before they have been confirmed. DeepView will soon allow experimenters at remote locations to compare constantly updated simulated images with real observations and to discuss real and theoretical material structures and behavior, all without leaving their desks.

Robotic Telescope Markup Language


Photo by Mike Chin

Doing astronomy at remote locations over the Internet is already possible. When photographic plates were made obsolete by the advent of inexpensive, highly efficient CCDs (charge-coupled devices) in the 1970s, the observing power of even small telescopes was boosted dramatically; dozens of small to mid-size telescopes that were previously underused -- many because they were located in light-polluted urban environments -- were automated. Meanwhile hardware and information technology became cheaper and more reliable.

When Berkeley Lab researchers acquired a 30-inch telescope for supernova research, they designed and built the telescope control system and wrote observatory-scheduling, data-acquisition, image-analysis, and database-management software, making the telescope a robot. Currently located at the Leuschner Observatory in the hills east of Berkeley, the 30-inch telescope is used to take follow-up observations of nearby supernovae, images requested by precollege students, and real-time observations for the Tokyo Science Museum's "Live Universe" show.

The robot can start an evening's observing sequence, create the schedule of that night's targets based on requests from several sources, begin observing, and close down at the end of the night -- all without human intervention. The robot also knows when interactive sessions are scheduled and can start its list of observations afterwards.
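The scheduling step of that nightly sequence might be sketched as follows. The Request fields, priority scheme, and function names here are invented for illustration; they are not the Lab's actual scheduling software.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Request:
    priority: int                       # lower value = observe earlier
    target: str = field(compare=False)
    source: str = field(compare=False)  # e.g. supernova follow-up,
                                        # student request, museum show

def build_night_schedule(requests, interactive_sessions=()):
    """Merge requests from several sources into one night's target list.
    Scheduled interactive sessions take the telescope first; the robotic
    queue, ordered by priority, runs afterwards."""
    schedule = [("interactive", s) for s in interactive_sessions]
    schedule += [("robotic", r.target) for r in sorted(requests)]
    return schedule
```

A night with one interactive session and two queued requests would then open with the interactive block and work through the robotic targets in priority order, matching the behavior described above.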

But, like Berkeley Lab's 30-inch, virtually every telescope accessible this way has its own unique programs and observing language.

"What's needed is a high-level language that will allow observers to describe what they want to do in an uncomplicated way, and that can then talk to any telescope that can accomplish the task," said Carl Pennypacker of UC Berkeley's Space Sciences Laboratory and Lawrence Hall of Science, a longtime guest in the Lab's Physics Division, as he introduced a workshop last December for telescope manufacturers, observatory operators, astronomers, teachers, and Internet experts, convened under the auspices of the Hands-On Universe educational astronomy program.

The workshop's purpose was to lay the groundwork for a robotic telescope markup language, RTML, that will allow remote observers to use automated telescopes "in a manner that is independent of the underlying telescope system."

Programmer John Aymon described preliminary work on RTML. Using simple tags such as REQUEST and SCHEDULE and specifying TARGET, TIMERANGE, and other variables (CRPIX, for example, would remove glitches caused by cosmic rays hitting the CCD), a remote observer could use RTML to ask for a list of images from a specific telescope or from any suitable telescope.
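Since RTML was still preliminary at the time, the exact element names and nesting were not fixed. The sketch below simply assembles a request using the tags Aymon described, and should be read as a hypothetical illustration rather than the specification.

```python
import xml.etree.ElementTree as ET

def build_request(target, ra, dec, timerange, telescope=None):
    """Assemble an RTML-style request document.

    Element names (REQUEST, SCHEDULE, TARGET, TIMERANGE, CRPIX) follow
    the draft tags described above; their final form in RTML may differ."""
    req = ET.Element("REQUEST")
    if telescope:
        # Omit to let a central server pick any suitable telescope.
        req.set("telescope", telescope)
    sched = ET.SubElement(req, "SCHEDULE")
    tgt = ET.SubElement(sched, "TARGET", name=target)
    ET.SubElement(tgt, "RA").text = ra
    ET.SubElement(tgt, "DEC").text = dec
    ET.SubElement(sched, "TIMERANGE").text = timerange
    ET.SubElement(sched, "CRPIX").text = "true"  # remove cosmic-ray hits
    return ET.tostring(req, encoding="unicode")
```

The same request could be addressed to a specific observatory or, by leaving the telescope unspecified, handed to a server to route to any instrument that can accomplish the task.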

Central servers will receive and distribute user requests and, after the observations have been made, return an RTML file of images to the user specifying where they were taken and explaining errors, if any. Over time a substantial database of observations will be collected, accessible by RTML. The user's interface will be a simple web page, containing useful information such as sky conditions at various telescope sites.

One result of the December meeting was general agreement among the participating manufacturers and observatory operators that their telescopes could readily be made responsive to RTML commands. Many details need to be worked out -- for example, building security systems to prevent hackers from seizing control of telescopes -- but there was enough enthusiasm to set a target date of March 2000 for the first demonstration of an RTML request sent over the net and processed by remote automated telescopes. The hope is to present results later this year to meetings of the American Astronomical Society and the International Astronomical Union.

Educating tomorrow's scientists

While robotic telescopes are increasingly important in research, for example in the search for nearby supernovae and for the optical counterparts of gamma-ray bursters, perhaps the most eager potential users are astronomy teachers and their students.

"Kids want images -- with their names on them!" says Hands-On Universe teacher Vivian Hoette of Chicago. "Personal connection is the key to learning, but to make it work there has to be quick turnaround between asking the telescope to make an observation and getting the result."

Remote microscopy is playing a similar role. Among the earliest users of the DeepView "microscopy channel" are teachers of electron microscopy itself. Judy Murphy of the Microscopy Technology Center at San Joaquin Delta College in Stockton, California, uses DeepView to teach her students using the resources of NCEM.

"Students must learn real science, using up-to-date technology and methods for obtaining knowledge from the data," said Joe Stewart of the National Science Foundation, one of the attendees at the RTML workshop, who emphasized the need for resources to improve on typical textbooks. "These kids are going to be inundated with information. This is the time to teach them to master tomorrow's tools."

Additional Information: