Computers already do so much of our work that it seems natural to let them take care of our sabotage, too. This might have been the line of thinking that led to Stuxnet, the first known malware worm designed to disrupt industrial processes.
Stuxnet began spreading in early 2009. It’s unclear who wrote the worm, or why, but a consensus is gathering that a government (probably Israel’s) created the worm to sabotage the Iranian nuclear program. Stuxnet seeks out and silently hijacks factory control software written by Siemens. Once it infects the computers running that software, it can command uranium-enriching centrifuges to spin out of control, thereby destroying them.
Computers such as these are kept disconnected from the Internet, so getting Stuxnet to its target probably required someone to load the worm onto a USB stick used by a plant employee or contractor. Once that person plugged his USB stick in at work, Stuxnet probably began crawling through the plant’s local-area network, searching for the right computer to hijack. Exactly how much damage it did, we don’t know, but the Iranian nuclear program did appear to slow, and last November Iranian president Mahmoud Ahmadinejad confirmed that a malicious computer program caused “problems” with centrifuges.
But somehow Stuxnet also made it to the Internet, and by last fall, the worm had infected hundreds of thousands of computers in at least 155 countries. Siemens says that to date, only 15 industrial facilities worldwide have been infected. So far, none of those infections is known to have caused any damage, but the potential for mischief remains high.
The industrial control software that Stuxnet is designed to hijack runs pipelines, conveyor belts, boilers, alarm systems and access controls, among other things, and experts warn that it and similar worms could, in theory, blow up factory boilers, destroy gas pipelines, sabotage power plants, and disrupt power grids. Now that Stuxnet has given hackers a sophisticated blueprint for creating such worms, many cybersecurity wonks are thoroughly freaked out.
How We Can Do Better
Cybersecurity consultant Stephen Spoonamore says that we should immediately redesign all critical infrastructure to “fail open,” meaning that in the event of a cyber attack, the system will default to a basic operating mode and keep running.