IBM Blue Gene. Argonne National Labs

High-performance supercomputers can solve complex problems in climate science, aerospace design, biomedicine, and particle physics. But they are also used to develop new kinds of stealth technology, run complex ballistics models, and simulate nuclear weapon detonations.

Last month, China announced new restrictions on the export of both supercomputing technology and high-performance drones. The announcement came just a week after the Obama administration unveiled a new national initiative promising to deliver an exascale supercomputer 30 times more powerful than China’s Tianhe-2—currently the world’s fastest supercomputer—by 2025. And it came just months after the U.S. Department of Commerce blocked the shipment of tens of thousands of American-made Intel chips bound for Chinese supercomputers, including Tianhe-2, citing security concerns.

In December 2010, the President’s Council of Advisors on Science and Technology (PCAST) delivered a 100-plus-page series of recommendations to Congress and the executive branch. Titled “Designing a Digital Future,” the report outlined how the U.S. government could best leverage its resources to maintain its edge in IT infrastructure and computing. But amid the recommendations and research and development goals, the report offered a word of caution: Avoid an unproductive and potentially detrimental supercomputing “arms race” with China.

The most recent escalation in the U.S.-China race for supercomputing supremacy is tied to an emerging set of 21st-century threats: cyberwarfare, global terrorism, and state-sponsored hacking. And, as PCAST warned five years ago, that could prove counterproductive for everyone.

***

The competition for supercomputing supremacy between the United States and China reaches back at least a decade, but that race has largely been a contest for national prestige. For China, an obsession with owning the world’s fastest supercomputer is driven by its desire to show the world its technological prowess, says James Andrew Lewis, a senior fellow and director of the Strategic Technologies Program at the Center for Strategic and International Studies. The underlying national security implications of supercomputing have remained largely in the background as the two countries have jockeyed for the top spot on the TOP500 list of the world’s fastest supercomputers.

By refusing to allow China to take delivery of American-made Intel processors, the U.S. has pushed those national security issues back to the fore. On paper, the Commerce Department denied Intel’s application for export on the grounds that Tianhe-2 and other Chinese supercomputers had been used for “nuclear explosive activities” that are “contrary to the national security or foreign policy of the United States.”

Since the Comprehensive Nuclear-Test-Ban Treaty of 1996 prohibited live testing of nuclear weapons, simulating detonations on supercomputers has grown increasingly important. The U.S. Department of Energy owns four of the top 10 fastest supercomputers in the world, using them at least partially for this very kind of nuclear weapons modeling and research. But while concerns over China’s weapons modeling capability may be perfectly valid, previous boosts to China’s supercomputing power haven’t triggered a U.S. response like the one seen earlier this year.

So what’s changed?

“The Chinese have gotten better at computing for military purposes—military intelligence purposes—in the last year or two, and that’s probably of some concern,” Lewis says. “The U.S. after the Snowden revelations needs to rethink the way it does signals intelligence and that usually means bigger computers. So I think it’s those external events—in particular better Chinese performance—that’s driving some of this.”

Just as nuclear weapons drove the development of supercomputing technology during the Cold War, threats like cyber-espionage and cyberwarfare have emerged as driving forces in security policy over the past several years. Militaries and intelligence agencies now have access to more data than ever before, more data to protect than ever before, and more potential adversaries trying to breach or attack their data networks than ever before. It’s a massive big data problem—one specially suited to bigger and faster supercomputing platforms.

“All of the signals intelligence agencies—the NSA, the GCHQ in the UK—are very, very interested in big data, because if they can crunch this data in a quick enough fashion, that enables them to begin to make connections between nodes in these networks,” says Tim Stevens, a teaching fellow in the war studies department at King’s College London. “And that of course is the dream of signals intelligence.”

There are two principal realms in which superior supercomputing could make a huge difference on the national security front, Stevens says. The first is counterterrorism: using big data analytics to sift through mountains of data and find signals in the noise, identifying patterns of behavior or connections between individuals and events that are relevant to national security. The second, and more important, is cybersecurity, an area in which many analysts believe the U.S. has already fallen behind.

“Being able to process network data in real near time to see where threats are coming from, to see what kinds of connections are being made by malicious nodes on the network, to see the spread of software or malware on those networks, and being able to model and interdict and track the dynamics on the network regarding things that national security agencies are interested in,” Stevens says, “those are the realms in which supercomputing has a real future.”
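
To make that concrete: the connection-finding Stevens describes is, at bottom, a graph problem. The sketch below is a deliberately toy illustration, not a depiction of any agency’s actual tooling. It builds a handful of invented hosts and contacts, then uses a breadth-first search to pull out every machine transitively linked to a hypothetical known-bad node. The hard part in practice is running this kind of traversal over billions of edges in near real time, which is precisely where supercomputing-scale hardware comes in.

```python
from collections import deque

# Toy contact graph: each pair is a link between two hosts observed in
# (entirely invented) network traffic. Real workloads involve billions of edges.
edges = [
    ("host_a", "host_b"),
    ("host_b", "host_c"),
    ("host_c", "bad_node"),   # "bad_node" is a hypothetical known-malicious host
    ("host_d", "host_e"),     # an unrelated cluster, untouched by the search
]

# Build an undirected adjacency list.
graph = {}
for u, v in edges:
    graph.setdefault(u, set()).add(v)
    graph.setdefault(v, set()).add(u)

def reachable_from(start, graph):
    """Breadth-first search: return every node transitively connected to start."""
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

# Every host connected, however indirectly, to the known-bad node:
print(reachable_from("bad_node", graph) - {"bad_node"})
# -> {'host_a', 'host_b', 'host_c'} (in some order)
```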

Tianhe 2

A look down one of the corridors in the giant room housing the Tianhe-2, which has a processing power of 33.86 petaflops (double that of its nearest competitor, the Oak Ridge Lab’s Titan).

An exascale computer like the one envisioned by the Obama administration, capable of one exaflop, or one quintillion (a billion billion, or 10^18) floating-point operations per second, could lead to significant leaps forward in our understanding of climate change, disease, or the origins of the universe. But, like the first high-performance machines developed in the middle of the last century, the first exascale supercomputer will likely go to work in national security.
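
For a sense of scale, the numbers quoted in this story are easy to sanity-check with a few lines of arithmetic. The snippet below (plain Python; the machine figures are simply the petaflop counts cited in this article) shows where the administration’s “30 times more powerful” target comes from.

```python
# Back-of-the-envelope arithmetic on the figures cited in this article.
EXAFLOP = 1e18          # one exaflop: 10**18 floating-point operations per second
TIANHE_2 = 33.86e15     # Tianhe-2, current TOP500 leader: 33.86 petaflops
TITAN = 17.59e15        # Titan, the fastest U.S. machine: 17.59 petaflops

print(f"exascale / Tianhe-2: {EXAFLOP / TIANHE_2:.1f}x")  # ~29.5x, the "30 times" target
print(f"Tianhe-2 / Titan:    {TIANHE_2 / TITAN:.2f}x")    # ~1.93x, "double" its nearest rival
```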

A national security aspect to a scientific undertaking can help drive competition, and competition drives funding and the allocation of other resources. It can elevate a scientific pursuit to a national imperative. But it can also distort priorities, and that’s exactly what PCAST tried to warn against back in 2010.

“While it would be imprudent to allow ourselves to fall significantly behind our peers with respect to scientific performance benchmarks that have demonstrable practical significance, a single-minded focus on maintaining clear superiority in terms of FLOPs count is probably not in our national interest,” the authors wrote. Such an “arms race” would be costly, they said, and an obsession with FLOPs count could divert resources away from making fundamentally new boundary-pushing discoveries.

Moreover, the exaflop milestone is purely arbitrary, Stevens says. It’s simply the next numerical threshold beyond the current roster of machines (Tianhe-2 is a 33.86 petaflop [10^15 FLOPS] machine at last count). “It has a symbolic importance,” he says. “There’s clearly a very strong symbolism in terms of national prestige attached to these techno-science projects, exactly as there was during the Cold War.” But, he adds, “there’s no real conceivable reason for the U.S. to beat the Chinese. If they need more computing power they’ll just build it—who says you have to be the fastest?”

Still, the Obama administration has committed to a sprint toward exascale computing. China currently holds a commanding lead over the world’s second-fastest machine—at 17.59 petaflops, the U.S. Department of Energy’s Titan supercomputer is just half as fast as Tianhe-2. But if the U.S. maintains its commitment, Stevens thinks it can reach the exaflop first.

“The U.S. has this amazing track record of making things happen when it wants to,” he says. “And if it really perceives that exascale computing is something that needs to be done in the national interest, and it puts the money there, then I see absolutely no reason why the U.S. won’t come out ahead.”