When Apple turned on certain forms of encryption as the default setting for the iPhone a year ago, and Google followed suit on Android, the companies were picking a public fight with the government.
Last fall, FBI director James Comey likened the encrypted phone to “a closet that can’t be opened,” full of terrorist chatter, child pornography, and conspiracies to distribute heroin. Comey raised the specter of the FBI “going dark,” blocked from intercepting the communications of dangerous suspects, even with a warrant. This summer, at the EPIC Champions of Freedom event, Apple CEO Tim Cook fired back at the cryptographic “backdoor” demanded by Comey, the digital equivalent of a skeleton key that would let the FBI listen in on mathematically garbled conversations. “If you put a key under the mat for the cops, a burglar can find it, too,” Cook said.
The closet and the key under the mat are different ways of describing the same thing, but with very different priorities. Are the encrypted phones an impregnable closet full of horrors? Or a solidly locked front door that forces the government to use one of its many other tools to find out what’s inside?
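To see why security experts side with Cook’s metaphor, it helps to make it concrete. Below is a minimal, purely illustrative sketch in Python, using the open-source `cryptography` library rather than any actual phone implementation: once a spare copy of a decryption key exists anywhere, the math treats every holder of that copy identically.

```python
# Illustrative sketch of the "key under the mat" problem. Uses the
# pip-installable `cryptography` library; this is not how any real
# phone implements its encryption.
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()   # the key that lives on your phone
escrow_copy = device_key             # the "key under the mat" for the cops

ciphertext = Fernet(device_key).encrypt(b"meet me at noon")

# The owner can read the message...
assert Fernet(device_key).decrypt(ciphertext) == b"meet me at noon"

# ...and so can anyone who finds the escrowed copy. The algorithm
# cannot distinguish a lawful investigator from a burglar.
assert Fernet(escrow_copy).decrypt(ciphertext) == b"meet me at noon"
```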
Apple and Google’s encryption-oriented policies and marketing strategies have pleased privacy advocates, since strong default encryption should prevent bulk surveillance by the NSA.
“Encryption will seriously interfere with the ability to routinely intercept communication, as our national intelligence seems to be doing domestically,” says Ryan Calo, who studies law and emerging technology at the University of Washington. “It’s been designed, tested with usability in mind, by people with three letters after their name. It’s usable, and mainstream. People don’t feel weird about using it.”
Apple has in fact criticized Google and other companies whose otherwise free services depend on mining customers’ communications. Because Apple makes money primarily by selling high-margin hardware like the iPhone, building privacy-friendly devices involves fewer business trade-offs for Apple than for its competitors. Apple doesn’t need to analyze the content of your data to be profitable. But Google’s “free,” advertising-supported model does make its devices a lot cheaper.
“Apple does get credit for making privacy a mainstream topic,” says Oliver Day, who heads a security-oriented non-profit called Securing Change. “But at the same time Apple is so expensive it’s impractical to advise activists to all purchase the latest iPhone to secure their communications with iMessage. Android seems behind in terms of security features, but I’m personally more interested in seeing advances in Android privacy technology, since it is more affordable.”
Whatever the platform, it’s the ease of use and the power of encryption algorithms that concern Comey: there are now phones in the U.S. that the FBI can’t access, even when it has a warrant and a potentially dangerous suspect.
The FBI Is Not Without Options
Calo points out that the government does have a powerful tool for compelling a suspect to decrypt his phone. It’s called “jail.” Assuming the government obtains a search warrant, a suspect must decrypt a phone or be thrown in jail for contempt of court.
“You can force people to decrypt,” he says. “That’s not perceived to be a Fifth Amendment violation, usually. You’re not self-incriminating by decrypting your phone.”
(The exception would be if decrypting a device were tantamount to self-incrimination, as when entering the passcode would itself amount to admitting possession of a tranche of child pornography.)
Still, Comey’s fears are not baseless. Whether it’s lone-wolf terrorists like the Tsarnaev brothers or the young couple from Mississippi who planned to honeymoon in Syria and join ISIS, there are dangerous suspects out there among the 300 million people in the United States. These are people the FBI has a legitimate need to place under surveillance.
“What I hear from people in law enforcement and the intelligence community is that they are facing obstacles right now in tracking some terrorist communications overseas,” says Shane Harris, author of @War: The Rise of the Military-Internet Complex. “That’s because ISIS, for instance, has been using simple apps that allow them to encrypt their messages or erase them soon after they’re sent.”
Once easy-to-use default encryption on smartphones is pervasive, placing the equivalent of a wiretap without the suspect’s knowledge becomes much more difficult. Even with a warrant, the government would need to fall back on other techniques, such as bugging the suspect’s home or sending in an informant, to gather intelligence secretly.
“It does make it impossible to do a surreptitious warrant — eavesdropping,” says Calo. “That analogue doesn’t exist when the phone is encrypted and only you have the key to it.”
“I think that the U.S. government has a legitimate point here that encryption in the hands of terrorists is making the job of law enforcement and intelligence agencies harder,” says Harris. “But it’s a big leap to go from that operational, foreign context into a broad statement that the FBI could be ‘going dark’ in a domestic setting. We need to see more data on this.”
Backdoors Are Worse Than Useless
Comey’s backdoors would strip millions of innocent people of a defense against mass surveillance by the NSA. They would also create a “key under the mat” for elite hackers like the People’s Liberation Army’s Unit 61398. Moreover, backdoors wouldn’t prevent criminals from using other forms of off-the-shelf encryption: tools such as Wickr, RedPhone/Signal, and Silent Circle are readily available, as any semi-competent terrorist knows. The criminals who benefit most from Apple and Google’s default encryption are the low-level players too ignorant to install such programs themselves.
“[The government] is no longer getting a windfall of unsophisticated criminals saying things out in the open,” says Calo.
And if the government were to attempt to backdoor or criminalize these add-on tools as well? Encryption software written in other countries would easily migrate into the U.S. via the dark web. The cryptographic horse left the barn some time ago. That’s why L. Gordon Crovitz’s Wall Street Journal op-ed “Why Terrorists Love Silicon Valley” misses the mark.
Crovitz argues that encryption conceals ISIS sympathizers in the United States, and that the geeks, if they really tried, could surely come up with an algorithm to break the code. In reality, sophisticated criminals will find ways to encrypt communication no matter what, and cryptographers insist there’s no easy technological fix to that problem. As security expert Bruce Schneier put it in an essay at Lawfare: “As long as there is something that the ISIL operative can move them to, some software that the American can download and install on their phone or computer, or hardware that they can buy from abroad, the FBI still won’t be able to eavesdrop. And by pushing these ISIL operatives to non-US platforms, they lose access to the metadata they otherwise have.”
In other words, the FBI can already access “metadata” about the time and location of two people communicating with encrypted tools, just not the content of the conversation. And by imposing a draconian, Chinese-style “great firewall” on non-backdoored software, the FBI would push ISIS to underground tools, losing the metadata as well.
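To picture that distinction, here is a toy, purely hypothetical example in Python, again using the `cryptography` library. The field names are invented for illustration and don’t correspond to any real messaging protocol: the routing information stays readable to the carrier, while the body is opaque to anyone without the key.

```python
# Toy example: encrypted content, plaintext metadata. Field names are
# hypothetical; real messaging protocols differ in the details.
import time
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # shared only by the two endpoints

message = {
    # Metadata the carrier (and, with legal process, the FBI) can see:
    "sender": "+1-555-0100",
    "recipient": "+1-555-0199",
    "timestamp": time.time(),
    # Content only the key holders can read:
    "body": Fernet(key).encrypt(b"the actual conversation"),
}

print(message["sender"], "->", message["recipient"])  # visible in transit
print(Fernet(key).decrypt(message["body"]))           # requires the key
```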
So How Big Is the Problem, Really?
When Comey demands backdoors from Apple and Google, what he’s really talking about is help for a very specific type of case that meets several conditions. This occurs when the FBI has no other way to put together its case, can’t get anyone else to flip on the person, and can’t compel the suspect to decrypt through the threat of arrest.
“Plus, it must be someone who, without Apple making it easy, wouldn’t have encrypted,” says Calo.
This past summer, the FBI apparently had such a case: it served Apple with a search warrant demanding that the company turn over specific iMessage conversations in a case “involving guns and drugs.” Apple responded that it did not have the ability to decrypt the messages, though it ultimately turned over data stored in the suspect’s iCloud account, which is not end-to-end encrypted. The question is how many cases like this actually crop up.
“The universe of this use case, with that threat model, may not be that big. I don’t know for a fact,” says Calo.
The problem is that the FBI doesn’t really know either. In July, Comey told the Senate Intelligence Committee that he didn’t know how many times the FBI had been unable to access an encrypted device even with a warrant. He added that the situation was occurring more frequently.
“They might start tracking that,” says Harris. “The FBI has got to do a better job persuading Americans that it doesn’t have any sinister motives for wanting some new policies around encryption. The bureau is going to have to do a better job making its case in this debate.”
And the government would do well to remember why Apple and Google started turning on encryption by default in the first place. A majority of citizens are at least somewhat concerned about the NSA’s dragnet surveillance practices, and American tech companies face a backlash from international customers who see U.S. gadgets and software as extensions of the American intelligence apparatus.
“I totally agree that [encryption] technologies do make targeted surveillance much harder,” says Day. “So does cash, bearer bonds, fake mustaches, hats, hair dye, blankets — see Snowden’s keyboard shroud — cars, horses, boats, and forests. I won’t agree that it makes targeted surveillance impossible or even untenable.”