When the FBI asked Apple to help crack the San Bernardino shooter’s iPhone, the agency was wading into dangerous waters. That’s according to an essay by Susan Landau, Professor of Cybersecurity Policy at Worcester Polytechnic Institute, published today in the journal Science. Landau, who worked as a privacy analyst for Google, writes that the FBI’s approach was shortsighted: it risked undermining smartphone protections for users across the board while inviting bad actors to exploit the resulting security weaknesses.
The real problem, Landau says, was the way the FBI wanted Apple to crack the phone. The agency had ordered Apple to create a special update that would disable a security feature that erases the phone’s data after ten failed passcode attempts. The agency claimed it would use the special update only once and delete it afterward. Apple refused. Chief executive Tim Cook said that the resulting software would amount to a “master key, capable of opening hundreds of millions of locks,” leaving all their customers vulnerable. (The FBI ultimately paid a third-party contractor $1.3 million to hack the phone.)
Using software updates to hack smartphones is dumb, Landau writes, and she doesn’t buy the FBI’s claim that the update wouldn’t be reused. In fact, as Apple lawyers revealed in February, the FBI had ordered Apple to provide access to eleven other iPhones since September 2015. Landau suggests that the FBI would be tempted to reuse the San Bernardino update, and perhaps even share the tool with other law enforcement agencies, such as the Manhattan DA’s office, which holds over 200 stubbornly locked iPhones. If the use of updates to hack smartphones were to become routine, she argues, it could open the door to misuse by bad actors.
“Some neglect in the process or the collaboration of a rogue employee would make it easy for false requests to be slipped into the update queue,” Landau writes. And consumers might begin to wonder whether new updates are actually surveillance tools. If distrust leads users to avoid updates, it “would have devastating security effects,” she writes.
But by far the greatest risk, Landau argues, would be the undermining of secure authentication. Two-factor authentication (a password plus a temporary ID number, typically generated on or delivered to a phone) depends on the smartphone itself being secure. With those protection features weakened, attackers could more easily impersonate the phone’s owner, which would be especially problematic if the target held sensitive information or had special access.
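The “temporary ID number” here is typically a time-based one-time password (TOTP). As a rough illustration of why the phone matters, here is a minimal sketch of the standard RFC 6238/4226 construction — the secret and parameters are hypothetical, and real authenticator apps add provisioning and clock-drift handling on top of this:

```python
import hmac
import hashlib
import struct
import time

def totp(secret, timestamp=None, step=30, digits=6):
    """Derive a short-lived numeric code from a shared secret (RFC 6238)."""
    if timestamp is None:
        timestamp = time.time()
    counter = int(timestamp) // step           # number of 30-second windows elapsed
    msg = struct.pack(">Q", counter)           # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                 # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The shared secret lives on the phone, which is exactly Landau’s point: if an attacker can push a compromised update to the device, the second factor is no longer independent of the first.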
At the heart of the issue is the so-called “going dark” problem: the FBI has legal authority to access certain information (such as the data on the San Bernardino shooter’s phone), but lacks the technological know-how or capability to get it. As FBI director James Comey put it in a 2014 speech: “Armed with lawful authority, we increasingly find ourselves simply unable to do that which the courts have authorized us to do, and that is to collect information being transmitted by terrorists, by criminals, by pedophiles, by bad people of all sorts—simply unable to do our jobs; unable to intercept, lawfully, data-in-motion.” He went on to compare the data held in a smartphone to the contents inside the trunk of a car. Naturally he’d like easy access to both.
While many frame the smartphone encryption issue as one of privacy, Landau argues that it’s really all about security. Rather than asking tech companies to weaken the security of their devices, she writes, the FBI should focus on strengthening its own tech skills. Her message to the FBI: DIY! She recommends a big boost in the staff and money the FBI dedicates to what she calls “lawful hacking.”
Landau concludes: