Last year, Nature and Science prepared to publish research describing how to mutate H5N1, a deadly bird flu, into more-contagious forms. The papers could help scientists create a treatment should a similar mutation occur in nature. But according to the U.S. government, the papers could also help terrorists create a weapon.
The federal National Science Advisory Board for Biosecurity is a group of scientists that helps the government identify such "dual use" research that could endanger the public. The journals submitted the work to the NSABB, and the board asked them to either not publish it or to strike details. The request not to publish was unusual. The request to redact was unprecedented.
The government wants to keep risky information secret so that terrorists can't use it to make weapons. Scientists seek to share such data publicly so they can quickly collaborate to counteract any risk. No one has yet found a middle ground. (The NSABB certainly hasn't. After failing to negotiate a "need to know" system, it reversed its position on one paper and recommended publishing parts of the second.)
The conflict between national security and open science is intensifying. Scientists who accept public funds (which is most of them) must increasingly keep their findings secret. It used to be that research only became classified if it was funded by a national-security-related agency like the Department of Defense, which spends millions on infectious-disease research each year. But now, under some of the 50 semiclassified designations, any federal agency can gag researchers. This approach doesn't just limit science. It makes us less safe. Viruses spread by nature are a greater threat than viruses spread by terrorists, and shared data is essential to fight both.
The use of shared data is already transforming virology. The field is moving from hypothesis-based research, which tests a single proposition, to systems-based research, which looks for patterns in all available data. More quickly spotting infectious diseases that might cross over from animals to humans (like HIV, SARS and H5N1 did) will require pooling databases and international samples.
Charles Chiu, head of the University of California at San Francisco's Viral Diagnostics and Discovery Center, says that a potentially disastrous virus like H5N1 occurs naturally every 10 years: "This is a time bomb." He thinks that with international cooperation and $100 million, systems-based research could identify every dangerous reservoir of animal-to-human infection on Earth. And for $100 billion it could characterize the features of 90 percent of those infections. "With the right amount of researchers and money, you'd get ahead of any dangers," he says—natural or man-made.
I read with great interest Jacob Ward's article "Communicable Science" in the June issue of PopSci. Creating a deadly virus is a very bad idea. Why on earth would anyone want to create a virus that has the potential to kill millions of people? In this day of rampant terrorism, it would be a matter of when, not if, it fell into the wrong hands. Yes, nature does this on its own, but I for one would rather take my chances with nature than be killed at the hands of some mad scientist. Charles Chiu speaks of money as if it grew on trees: $100 million here, $100 billion there. With money like this flying around, it makes one wonder whether Mr. Chiu and his colleagues have an ulterior motive for creating such a deadly disease.