Features

Japanese supercomputer beats top U.S. machine

By Matthew Fordahl The Associated Press
Tuesday July 23, 2002

LIVERMORE — U.S. supercomputers have been the world’s most powerful since the first high-performance machines analyzed virtual nuclear blasts, climate change and the makeup of the universe. 

Now, one built in Japan with an “old” design runs five times faster than the previous record holder, a machine that simulates nuclear tests at Lawrence Livermore National Laboratory. 

Japan’s Earth Simulator supercomputer hasn’t quite rattled the United States like the Soviet Union’s launch of Sputnik in 1957. But it does highlight some drawbacks of recent U.S. machines — and it has made more than a few scientists envious. 

“This machine is powerful enough that a researcher who uses it can do in one day what it takes a researcher in the U.S. to do in one month,” said Jack Dongarra, a University of Tennessee professor who tracks the world’s 500 speediest computers. 

More than national pride is at issue. Certain research is better suited to machines like the Earth Simulator, whose design was abandoned by most U.S. manufacturers in favor of one that melds better with the rest of their computing businesses. 

Supercomputers are built with thousands of processors that work in tandem to analyze the most complex problems — including nuclear test simulations, aircraft designs and drug development — for governments, research centers and corporations. 

Livermore’s ASCI White supercomputer, like most recent U.S.-built machines, uses off-the-shelf processors rather than custom parts specifically geared for high-performance scientific jobs. 

The trend took off in the 1990s, as such processors grew increasingly powerful, making it difficult to justify the cost of developing chips just for the small scientific supercomputer market. 

Now, supercomputers like ASCI White use the same chips — but thousands more of them — as do servers sold to businesses. 

Such machines now make up nearly 92 percent of the top 500 supercomputers worldwide. In 1993, they made up only about 27 percent. 

But some say the move away from custom processors places business concerns ahead of scientific needs. 

“The arguments are all based on strange economic theories — none of them are based on technical grounds,” said Burton Smith, chief scientist at Cray Inc., which still sells custom supercomputers. 

The NEC-built Earth Simulator, which will be used in climate and earthquake studies, is faster than all 15 of the biggest supercomputers in the United States combined, Dongarra said. 

It performs 35.9 trillion calculations a second with 5,104 processors. ASCI White, by contrast, performs 7.2 trillion calculations a second with its 8,192 microprocessors. 
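
Dividing those totals by the processor counts (a back-of-the-envelope comparison using only the figures quoted above) suggests each Earth Simulator processor handles roughly eight times as many calculations per second as one of ASCI White’s chips. A minimal sketch of the arithmetic in Python, with illustrative variable names:

    # Per-processor throughput, computed only from the figures quoted above.
    earth_simulator_total = 35.9e12  # calculations per second
    earth_simulator_cpus = 5104
    asci_white_total = 7.2e12        # calculations per second
    asci_white_cpus = 8192

    per_cpu_es = earth_simulator_total / earth_simulator_cpus  # about 7.0 billion per second
    per_cpu_aw = asci_white_total / asci_white_cpus            # about 0.9 billion per second

    print(f"Earth Simulator: {per_cpu_es:.2e} calculations/s per processor")
    print(f"ASCI White:      {per_cpu_aw:.2e} calculations/s per processor")
    print(f"Per-processor ratio: roughly {per_cpu_es / per_cpu_aw:.0f}x")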

“U.S. scientists want to use it,” said Kiyoshi Otsuka, leader of the Earth Simulator’s research exchange group in Yokohama, Japan. 

The high-performance title isn’t expected to stay in Japan forever. IBM, which built ASCI White, and other U.S. supercomputer makers, including Hewlett-Packard Co., say they are working on even more powerful machines. 

“We could do that in a heartbeat and we could do that for a lot less money,” said Peter Ungaro, vice president of high performance computing at IBM, which sells supercomputers to foreign nuclear governments, research centers and corporations. 

Officials at the Livermore Lab say their machine and others like it offer better price-performance than those designed like the Japanese supercomputer. Earth Simulator cost about $350 million, compared with ASCI White’s $110 million. 

“For global climate, (Earth Simulator) is a great advance,” said David Nowak, the ASCI program leader at Livermore. “It’s just a very expensive machine.” 

But supercomputers built with off-the-shelf parts can be more expensive in the long run, said Guy Robinson, research liaison at the Arctic Region Supercomputing Center, where scientists study everything from climate change to how galaxies form. 

Price isn’t the only factor, however. Scientists say certain research problems run better or are easier to program on one type of supercomputer than on the other. 

Custom supercomputers, for instance, have bigger data pipes, known as memory bandwidth, which is critical in climate modeling and some nuclear research. 

That can be a drawback for off-the-shelf processors. 

“In one sense, we have a tiny straw for data,” Dongarra said. “And we have the processor, which would like a fire hose of data to come at it.” 
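
As a loose illustration of that point (a sketch only, in Python with NumPy; it is not anything run on these machines): the first operation below does little arithmetic with each value it reads, so its speed is set by memory bandwidth, the “straw,” while the second reuses each value many times, so the processor itself is the limit.

    import time
    import numpy as np

    n = 20_000_000
    a = np.random.rand(n)
    b = np.random.rand(n)

    # Memory-bound: one addition per pair of values streamed from memory,
    # so speed depends on how fast data arrives (the "tiny straw").
    t0 = time.perf_counter()
    c = a + b
    t1 = time.perf_counter()
    print(f"streaming add:   {t1 - t0:.3f} s")

    # Compute-bound: a dense matrix multiply reuses each value many times,
    # keeping the processor (which wants a "fire hose" of data) busy.
    m = np.random.rand(2000, 2000)
    t2 = time.perf_counter()
    p = m @ m
    t3 = time.perf_counter()
    print(f"matrix multiply: {t3 - t2:.3f} s")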

But off-the-shelf machines work best on data analysis tasks, such as those in genetic research. 

It’s a case of finding the best fit for the research — and budget, said Fran Berman, director of the San Diego Supercomputer Center. 

“From a political and financial perspective, we’re not in an environment where we could build all these different kinds of tools with equal emphasis,” she said. 

J. David Neelin, an atmospheric scientist at the University of California, Los Angeles, said computing is ultimately just another tool, much as a laptop is for a student assigned to write a paper. 

A faster machine does not guarantee quality. 

“Are they going to get that essay out any faster or is it going to be any better? Well, their spell checker is going to run a little faster,” he said. “In the end, it’s the thought that goes into it that really makes the essay.”