The double-hinged door between astrophysics and the military

This is an excerpt from Accessory to War: The Unspoken Alliance Between Astrophysics and the Military by Neil deGrasse Tyson and Avis Lang.


Supernova remnants are visible in the top left of this 1999 image of the Tarantula Nebula. Whether the explosions are bombs or supernovae, both astrophysicists and the military have motivation to study them. Credit: NASA

One notable twentieth-century result of the countless alliances between astrophysics and the military is the thermonuclear fusion bomb, whose design principles arise in part from the astrophysicist’s investigations of the cosmic crucible that occupies the center of every star. A less explosive example, from our own century, is the ChemCam instrument (short for Chemistry and Camera) atop the Curiosity rover, which began trundling across Mars in August 2012. From its skybox position on the rover’s mast, ChemCam fires laser pulses at rocks and soil and then uses its spectrometer to analyze the chemical makeup of what got vaporized.

Who or what built ChemCam? The Los Alamos National Laboratory: birthplace of the atom bomb, originator of hundreds of spacecraft instruments designed for use by the military, and home to the Center for Earth and Space Science, a division of the National Security Education Center as well as a hub of support for astrophysics. Los Alamos Lab operates under the auspices of the National Nuclear Security Administration, whose mission is to maintain and protect America’s stockpile of nuclear weapons while simultaneously working to undercut the proliferation of such stockpiles elsewhere in the world. And the lab’s astrophysicists use the same supercomputer and similar software to calculate the yield from hydrogen fusion within the heart of a star that physicists use to calculate the yield of a hydrogen bomb. You’d have to look far and wide to find a clearer example of dual use.

Say you want to know what takes place during the explosion of a nuclear bomb. If you were to tabulate the many varieties of subatomic particles, and track the ways they interact and transmute into one another under controlled conditions of temperature and pressure—not to mention the particles that get created or destroyed in the process—you’d quickly realize you need more than pencil and paper. You need computers. Powerful computers.

A properly programmed computer can calculate crucial parameters for nuclear bomb design, ignition, and explosive yields, so it can predict what to expect from an experiment. Of course, “experiment” means the actual detonation of a nuclear bomb, either in a test or in warfare. During the Manhattan Project, in the 1940s, Los Alamos used mechanical calculators and early IBM punch-card tabulators to calculate atomic bomb yields. Decade by decade, as computing power increased exponentially, so too did the power to calculate and understand in detail what happens inside a nuclear explosion. And the needs of Los Alamos fostered the sustained quest to build the fastest computer in the world.

Second-generation computers of the 1960s, furnished with transistors that greatly accelerated their performance, helped make the 1963 Nuclear Test Ban Treaty possible. While later generations of computers didn’t stop the arms race, they did offer a viable way to test weapon systems without actually detonating anything. By 1998, the Los Alamos supercomputer Blue Mountain could run 1.6 trillion calculations per second. By 2009, the lab’s Roadrunner had increased that speed more than six hundredfold, to the milestone of one quadrillion calculations per second. And by late 2017, its Trinity supercomputer had racked up another factor of fourteen in computing power.
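Those figures imply specific speedup ratios, and the arithmetic is easy to check. The short Python sketch below is purely illustrative and uses only the numbers quoted above; it is not drawn from the book.

```python
# Back-of-the-envelope check of the supercomputer speedups quoted in the text.
blue_mountain = 1.6e12        # calculations per second, Blue Mountain, 1998
roadrunner = 1.0e15           # calculations per second, Roadrunner, ~2009 (one quadrillion)
trinity = 14 * roadrunner     # "another factor of fourteen," Trinity, late 2017

print(roadrunner / blue_mountain)   # ~625, i.e., "more than six hundredfold"
print(f"{trinity:.1e}")             # ~1.4e+16 calculations per second
```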

We know that stars generate energy in exactly the same way that hydrogen bombs do. The difference is that the controlled nuclear fusion that happens in the star’s core is contained by the weight of the star itself, whereas in warfare the nuclear fusion is positively uncontrolled—the precise objective of a bomb. And that is why astrophysicists have long been associated with Los Alamos National Lab and its supercomputers. Picture scientists working away on opposite sides of a classified wall. On one side, you have researchers engaged in secret projects that are “responsible for enhancing national security through the military application of nuclear science.” On the other side, you have researchers trying to figure out how stars in the universe live and die. Each side is accessory to the other’s needs, interests, and resources.

If you seek more evidence, search the SAO/NASA Astrophysics Data System for research published in 2017 whose coauthors are affiliated with Los Alamos National Laboratory. You’ll recover 102 papers. On average, that’s an astrophysics paper published every 3.6 days. And that’s the unclassified research. Next, peruse the titles of Los Alamos–affiliated papers over the years. Supernovas turn out to be a perennial favorite. Published in the year 2013, for instance, there’s “The Los Alamos Supernova Light-curve Project: Computational Methods.” In 2013–14 there’s a three-paper sequence: “Finding the First Cosmic Explosions. I. Pair-instability Supernovae,” “II. Core-collapse Supernovae,” and “III. Pulsational Pair-instability Supernovae.” For 2006 you’ll find “Modeling Supernova Shocks with Intense Lasers.” For earlier years, you’ll see titles such as “Testing Astrophysics in the Lab: Simulations with the FLASH Code” (2003) and “Gamma-Ray Bursts: The Most Powerful Cosmic Explosions” (2002).
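That publication rate follows from simple division. Here is a minimal sketch using only the paper count given in the text; the ADS query itself is not reproduced.

```python
# One year of unclassified, Los Alamos-affiliated astrophysics papers (count from the text).
papers_2017 = 102
days_per_year = 365

print(round(days_per_year / papers_2017, 1))  # 3.6 -- roughly one paper every 3.6 days
```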

Born in Cold War fear, the alliance between space and national security remains alive and well in the unstable geopolitical climes of the twenty-first century. And it swings on a double-hinged door.

Excerpted from Accessory to War by Neil deGrasse Tyson and Avis Lang. Copyright © 2018 by Neil deGrasse Tyson and Avis Lang. Used with permission of the publisher, W.W. Norton & Company, Inc. All rights reserved.