
Text 9. SMALLER, SLOWER SUPERCOMPUTERS SOMEDAY MAY WIN THE RACE

LOS ALAMOS, NM, May 29, 2002.

The supercomputers of the future will never crash and will cost far less to run than today's machines. At least that's the vision of a scientist at the National Nuclear Security Administration's Los Alamos National Laboratory.

"Everyone's fixed on the mantra of performance at all costs," said Wu Feng of Los Alamos' Advanced Computing Laboratory. "What we've done is redefine the price-to-performance ratio to look at efficiency, reliability and availability, in other words, total cost of ownership."

Feng and colleagues Michael Warren and Eric Weigle developed the first of this new breed of high-performing, low-cost computers, which they named "Green Destiny." The machine has been operating with unprecedented stability and performance efficiency for more than eight months in a dusty warehouse where temperatures routinely reach 85 degrees Fahrenheit.

Feng, Warren and Weigle argue that the costs of computing should include electrical power, infrastructure, air conditioning, floor space, time lost to system failures and salaries for the people needed to keep finicky machines operating. Supercomputers of the future may very well be similar to Green Destiny, they say: small, extremely stable and miserly in their power use.

Green Destiny represents a new "flavor" of supercomputer, Feng said. The machine packs 240 Transmeta processors that operate at 667 megahertz, each mounted on a half-inch-thin compact motherboard, or blade. Twenty-four blades mount into an RLX Technologies System 324 chassis, and ten chassis, along with network switches, are mounted in a standard computer rack.

Currently computing at a peak rate of 160 billion operations per second, Green Destiny uses less than ten percent of the electricity and twenty-five percent of the space of the previous generation of so-called cluster computers while delivering comparable performance. More important is reliability, Feng said.
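
Those figures are easy to cross-check. The short Python sketch below is a back-of-the-envelope check only; it assumes one operation per processor clock cycle, which the article does not state but which makes the quoted peak rate come out:

    # Rough sanity check of Green Destiny's quoted peak rate (assumption:
    # one operation per clock cycle per processor, not stated in the article).
    processors = 24 * 10              # 24 blades per chassis x 10 chassis = 240 processors
    clock_hz = 667e6                  # 667 MHz Transmeta processors
    peak_ops = processors * clock_hz
    print(f"{peak_ops / 1e9:.0f} billion operations per second")   # prints 160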

"As the push for performance goes up, so does the power consumption. And system failure is directly proportional to power consumption," he pointed out. "If your machine isn't available all the time, then you can't do any computing.

In fact, unpublished empirical data from computer vendors indicate that as processor temperatures increase by 10 degrees Celsius, failure rates double. Typical computing-intensive businesses depend on hundreds or even thousands of identical servers to handle multiple requests for information simultaneously. When the servers go down, hourly losses can range up to $6.5 million for a large brokerage firm.
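
The vendors' rule of thumb quoted above can be written as a simple doubling law. The Python sketch below is illustrative only; the reference temperature and baseline rate are placeholder values, not vendor data:

    # Rule of thumb: failure rate doubles for every 10 degrees Celsius rise.
    # ref_temp_c and base_rate are placeholder values for illustration.
    def relative_failure_rate(temp_c, ref_temp_c=40.0, base_rate=1.0):
        return base_rate * 2 ** ((temp_c - ref_temp_c) / 10.0)

    print(relative_failure_rate(50))   # 2.0 -- twice the baseline at +10 C
    print(relative_failure_rate(70))   # 8.0 -- eight times the baseline at +30 C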

 


Green Destiny, whose processors give off roughly one-tenth the heat of market-leading chips, has been running continuously since September without air filtration or special cooling. In fact, it kept humming even with its fans removed. "It's absolutely rock solid," Feng said. "It's so reliable we only keep one spare blade around, and we have never needed it."

In a recent paper, available online at http://public.lanl.gov/feng/Bladed-Beowulf.pdf, Feng predicts, based on Moore's law, that the drive for increased performance will result in "the microprocessor of 2010 having over one billion transistors and dissipating over one kilowatt of thermal energy; this is considerably more energy per square centimeter than even a nuclear reactor."
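
As a rough illustration of what that prediction implies, the sketch below converts one kilowatt into a power density; the die area is a hypothetical figure chosen for illustration, not a number from the paper:

    # Power density implied by a ~1 kW microprocessor.
    chip_power_w = 1000.0     # "over one kilowatt" from the prediction above
    die_area_cm2 = 4.0        # assumed die area, for illustration only
    print(f"{chip_power_w / die_area_cm2:.0f} W per square centimeter")   # prints 250

Even with a generous assumed die area, the implied density is hundreds of watts per square centimeter, which is the point of the comparison quoted above.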

Beowulf clusters, developed at NASA in the early 1990s, group commodity processors with commercial switches and have attracted much attention because they're able to handle many computations simultaneously. A larger version of the Green Destiny Bladed Beowulf cluster would require far less space than a traditional Beowulf cluster. Putting 2,000 of the bladed machines together could yield an enormous savings in space, and in costs, with floor space in Silicon Valley renting for more than $150 a square foot.

Feng argued that with all these factors taken into account, the true price-to-performance rating for Green Destiny would be at least twice as good as that of other supercomputers.

Internet pioneer Gordon Bell, software guru Linus Torvalds and other guests visited Los Alamos Laboratory recently to learn more about Green Destiny and the Supercomputing in Small Spaces project, whose web site is at http://sss.lanl.gov.

Stephen Lee, acting deputy leader of Los Alamos' Computer and Computational Sciences Division, said Green Destiny represents a promising research advance, but emphasized the national need for large platforms that are uniquely able to move huge amounts of data in and out of memory rapidly, such as Los Alamos' Q machine, developed for NNSA's Advanced Simulation and Computing program, or ASCI.

"This could be the next important step in scalable supercomputing, but the challenge of maintaining the nation's nuclear stockpile in the face of aging weapons, eroding expertise and nearly a decade without nuclear testing demand three-dimensional, full physics computing on tera-scale computers today, while designers and engineers with weapon test experience are still available to validate the ASCI simulations." Lee said.

The best use for machines like Green Destiny might be in the inexpensive development of scientific codes for a wide range of applications, Feng and Lee said. Once the code has been developed and stabilized, it could move to an ASCI-style supercomputer.

Feng's team at Los Alamos originally bought the machine from RLX Technologies to host large volumes of data. After several delays in compiling the data, they decided to make a cluster instead and tested it with some high-performance applications, such as Warren's simulation of the beginnings of the universe and his three-dimensional model of supernovae. Among planned future jobs for Green Destiny are global climate modeling, large-scale molecular dynamics, computational fluid dynamics and bioinformatics.

"At first, we did not think that there was anything particularly novel about this," Feng said. "We showed it to fellow researchers at a supercomputing conference last November, and we saw more than 7,000 hits on our web site the following week. This project has taken on a life of its own."

The Transmeta Crusoe processor provides about 75 percent of the performance of similarly clocked chips from a major manufacturer used in other Beowulf clusters. So Green Destiny might be compared to the tortoise, eventual winner of the fabled race with the speedy hare.

Text 10. IBM'S TEST-TUBE QUANTUM COMPUTER MAKES HISTORY; FIRST DEMONSTRATION OF SHOR'S HISTORIC FACTORING ALGORITHM

SAN JOSE, California

Scientists at IBM's Almaden Research Center have performed the world's most complicated quantum-computer calculation to date. They caused a billion billion custom-designed molecules in a test tube to become a seven-qubit quantum computer that solved a simple version of the mathematical problem at the heart of many of today's data-security cryptographic systems.

"This result reinforces the growing realization that quantum computers may someday be able to solve problems that are so complex that even the most powerful supercomputers working for millions of years can't calculate the answers," said Nabil Amer, manager and strategist of IBM Research's physics of information group.

In today's issue of the scientific journal Nature, a team of IBM scientists and Stanford University graduate students report the first demonstration of "Shor's Algorithm" -- a method developed in 1994 by AT&T scientist Peter Shor for using the futuristic quantum computer to find a number's factors -- numbers that are multiplied together to give the original number. Today, factoring a large number is so difficult for conventional computers -- yet so simple to verify -- that it is used by many cryptographic methods to protect data.

A quantum computer gets its power by taking advantage of certain quantum properties of atoms or nuclei that allow them to work together as quantum bits, or "qubits," which serve simultaneously as the computer's processor and memory. By directing the interactions between qubits while keeping them isolated from the external environment, scientists enable a quantum computer to perform certain calculations, such as factoring, exponentially faster than conventional computers. When factoring large numbers using a conventional computer, each added digit roughly doubles the time to find the factors. In contrast, the quantum factoring time increases by only a constant increment with each additional digit.
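
Purely to illustrate the two growth laws stated above (doubling per added digit for a conventional computer versus a constant increment per digit for a quantum one), the Python sketch below tabulates both in arbitrary time units with placeholder constants:

    # Illustrative only: the two scaling behaviors described in the text.
    def conventional_time(digits):
        return 2 ** digits             # roughly doubles with each added digit

    def quantum_time(digits, increment=1.0):
        return increment * digits      # grows by a constant increment per digit

    for d in (8, 16, 32, 64):
        print(f"{d:2d} digits: conventional ~{conventional_time(d):.1e}, quantum ~{quantum_time(d):.0f}")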

The simplest meaningful instance of Shor's Algorithm is finding the factors of the number 15, which requires a seven-qubit quantum computer. IBM chemists designed and made a new molecule that has seven nuclear spins -- the nuclei of five fluorine and two carbon atoms -- which can interact with each other as qubits, be programmed by radio frequency pulses and be detected by nuclear magnetic resonance (NMR) instruments similar to those commonly used in hospitals and chemistry labs.

The IBM scientists controlled a vial of a billion billion (10^18) of these molecules so they executed Shor's algorithm and correctly identified 3 and 5 as the factors of 15. "Although the answer may appear to be trivial, the unprecedented control required over the seven spins during the calculation made this the most complex quantum computation performed to date," Amer said.
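
The classical arithmetic wrapped around that quantum step can be followed directly. In Shor's algorithm the quantum processor's job is to find the period r of a^x mod N; the factors then fall out of greatest common divisors. The Python sketch below does the period-finding by brute force, which is only feasible for a toy case like N = 15, and uses the base a = 7, a common choice for this example that the article does not specify:

    from math import gcd

    # Classical walk-through of the arithmetic behind Shor's algorithm for N = 15.
    # In the real algorithm the period-finding loop is the quantum computer's job;
    # everything else is classical post-processing.
    N, a = 15, 7                       # a must share no factor with N

    r = 1                              # find smallest r > 0 with a**r mod N == 1
    while pow(a, r, N) != 1:
        r += 1                         # here r = 4, since 7**4 = 2401 = 160*15 + 1

    # r is even and a**(r/2) is not -1 mod N (true here), so the factors follow:
    half = pow(a, r // 2, N)           # 7**2 mod 15 = 4
    print(gcd(half - 1, N), gcd(half + 1, N))   # prints 3 5

The period turns out to be 4, and gcd(4 - 1, 15) and gcd(4 + 1, 15) give 3 and 5, the factors reported above.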

"Now we have the challenge of turning quantum computation into an engineering reality," said Isaac Chuang, leader of the research team and now an associate professor at MIT. "If we could perform this calculation at much larger scales -- say the thousands of qubits required to factor very large numbers -- fundamental changes would be needed in cryptography implementations."

While the potential for quantum computing is huge and recent progress is encouraging, commercial quantum computers are still many years away. NMR-based quantum computers are laboratory experiments. The first quantum computing applications would likely be co-processors for specific functions, such as solving difficult mathematical problems, modeling quantum systems and performing unstructured searches. Word processing or simple problem-solving tasks are more easily handled by today's computers.


IBM's demonstration of Shor's algorithm also shows the value of quantum computing experiments using NMR, an approach pioneered independently in the mid-1990s by Chuang and Neil Gershenfeld of MIT and by David Cory and colleagues, also at MIT. "Our NMR experiments stimulated us to develop fundamental tools that can be used in many future types of quantum computers," said Chuang. "Most important of these was a way to simulate and predict the signal degradation caused by 'decoherence' -- unintended quantum fluctuations. This tool enabled us to minimize decoherence errors in our 7-qubit experiment."

While NMR will continue to provide a testbed for developing quantum computing tools and techniques, it will be very difficult to develop and synthesize molecules with many more than seven qubits. As a result, new experiments at IBM and elsewhere are aimed at developing new quantum computing systems that can more easily "scale" to the large numbers of qubits needed for practical applications. Strong candidates today include electron spins confined in semiconductor nanostructures (often called quantum dots), nuclear spins associated with single-atom impurities in a semiconductor, and electronic or magnetic flux through superconductors. Atomic and optical implementations continue to be evaluated.

When quantum computers were first proposed in the 1970s and 1980s (by theorists such as the late Richard Feynman of the California Institute of Technology, Pasadena, Calif.; Paul Benioff of Argonne National Laboratory in Illinois; David Deutsch of Oxford University in England; and Charles Bennett of IBM's T.J. Watson Research Center, Yorktown Heights, N.Y.), many scientists doubted that they could ever be made practical. But in 1994, Peter Shor of AT&T Research described a specific quantum algorithm for factoring large numbers exponentially faster than conventional computers -- fast enough to defeat the security of many public-key cryptosystems. The potential of Shor's algorithm stimulated many scientists to work toward making quantum computers a reality. Significant progress has been made in recent years by numerous research groups around the world.