Facebook wants your messages to be 100-percent private. The government has other ideas.

In a new letter, officials from WhatsApp and Messenger explain why they won’t give law enforcement a way to see encrypted messages.

Send someone a note on WhatsApp or iMessage, and the words you write—whether they’re silly or serious—are encrypted from end to end. Services like WhatsApp use powerful cryptographic tools to lock down your messages while they’re still on your phone, which means that anyone outside the conversation who tries to snoop simply can’t. If a Facebook employee tried to see a message’s contents as it zipped through the company’s servers, they’d be blocked. If the FBI tried to break into the tech company’s records because they thought you were a terrorist, they’d be blocked, too.
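The core idea is simple to sketch: if only the two phones hold the key, a server in the middle relays bytes it cannot read. Here is a deliberately toy illustration in Python—the function names and the keystream construction are invented for this example, and real apps like WhatsApp use the far more sophisticated Signal protocol, not anything like this:

```python
# Toy sketch of the end-to-end idea: only holders of the shared key
# can recover the plaintext. NOT a real protocol -- WhatsApp uses the
# Signal protocol; this just shows why a relay server sees gibberish.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Stretch the key into a pseudorandom keystream (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each plaintext byte with the keystream.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

shared_key = b"key negotiated between the two phones"
ciphertext = encrypt(shared_key, b"meet at noon")

# The relay (or anyone subpoenaing it) holds only ciphertext; without
# the key, decryption yields garbage instead of the message.
print(decrypt(shared_key, ciphertext))   # b'meet at noon'
print(decrypt(b"wrong key", ciphertext))  # unreadable bytes
```

The point of the sketch: nothing the server stores is useful without a key that never leaves the two “ends.”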

Of course, the person on the subway looking over your shoulder as you type on your phone (each person’s device is the “end” in end-to-end encryption) could still see what you’re saying, so the system is not perfect—but it’s still an incredibly strong tool that privacy experts say is a fundamental right. You don’t want a tech company or government body listening to what you say in your living room, and you have the same expectation of privacy when you message someone on a device.

But even though it’s the gold standard, encryption isn’t yet universal. Your basic SMS message isn’t encrypted, and neither is Facebook Messenger. But Messenger could change. In March, Facebook CEO Mark Zuckerberg said he wanted to bring encryption to all of the Facebook-owned messaging apps, including Messenger and Instagram. (WhatsApp has had it since 2016.) Facebook doesn’t have the best reputation for many reasons, but the move was generally seen as a good one for privacy.

There was a twist, though, earlier this year. On October 3, the Department of Justice released a letter co-signed by the Secretary of Homeland Security, the U.K. Home Secretary, and the Australian Minister of Home Affairs asking Zuckerberg to cancel this security upgrade. “We are writing to request that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services,” the document, first leaked to and posted by BuzzFeed, states.

But on December 9, two Facebook officials responded, saying that they will not provide law enforcement with a “backdoor” and that they will proceed with their plan to expand the use of end-to-end encryption. “The ‘backdoor’ access you are demanding for law enforcement would be a gift to criminals, hackers and repressive regimes, creating a way for them to enter our systems and leaving every person on our platforms more vulnerable to real-life harm,” wrote the heads of WhatsApp and Messenger. (Read the full letter here.) “That is not something we are prepared to do.”

Law enforcement’s argument

Put simply, government leaders see encryption as a roadblock to law enforcement. “Use of end-to-end encryption … leaves service providers unable to produce readable content in response to wiretap orders and search warrants,” the October letter explains. “This barrier allows criminals to avoid apprehension by law enforcement by limiting access to crucial evidence in the form of encrypted digital communications.” The main example it cites is Child Sexual Abuse Material, or CSAM, more commonly referred to as child pornography. In a deeply reported, often horrifying story, The New York Times delved into the flourishing and rapidly expanding world of CSAM online. The number of online photos and videos reported by tech companies added up to a staggering 45 million in 2018, according to the Times, which also notes that far more reports come from Messenger (which is not encrypted) than WhatsApp. Officials worry that WhatsApp’s encrypted exchanges could be shielding more instances of CSAM.

Catching the criminals who produce or distribute CSAM online, or otherwise abuse children, is a necessary (and underfunded) endeavor, but privacy experts argue that it’s not worth endangering the civil liberties of billions of others.

Privacy experts’ argument

The first thing to keep in mind is that end-to-end encryption protects the common user, whether it’s shielding a confidential message between a journalist and a source or just a routine financial transaction. “It is basic infrastructure to keep people safe and to give them control over their communications,” says Gennie Gebhart, an associate director of research at the Electronic Frontier Foundation. “Proposing to break that is quite alarming.”

The concern among privacy experts is that preventing encryption to stop CSAM, terrorism, or other criminal behavior gives governments the opportunity to expand their monitoring efforts. The “insidious” element, Gebhart says, “is that with these capabilities to scan, or intercept, or access encrypted communications, there’s no reason to believe that the uses would stop at finding, or pursuing, these kind of criminals who abuse children.”

Others see room for compromise. “I am fully supportive of delaying [further encryption efforts] until we have a more serious conversation about the consequences,” says Hany Farid, a professor who focuses on digital forensics at the University of California, Berkeley. In the 2000s, he and others created a tool for Microsoft called PhotoDNA, which compares the digital signatures of photos that are emailed or on servers with known images of CSAM. “I’m not going to say there is no room for end-to-end encryption,” he adds. “I’m not going to say that we should throw away privacy for the purpose of security. But we have got to have a more informed and honest conversation about what we are trading off.”
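Fingerprint-matching of the kind PhotoDNA performs can be sketched with a much simpler stand-in. PhotoDNA itself is proprietary, so the snippet below uses a generic “average hash”—every name in it is invented for illustration—to show the general shape of the technique: each image is reduced to a short fingerprint, and near-duplicate images (re-encoded, lightly edited) yield near-identical fingerprints that match by bit comparison:

```python
# Simplified perceptual-hash analogy to fingerprint matching
# (PhotoDNA's actual algorithm is proprietary; this "average hash"
# is a generic stand-in for illustration only).

def average_hash(pixels):
    """pixels: flat list of grayscale values, e.g. a downscaled image.
    Returns a bit string: 1 where the pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(a, b):
    """Count differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# Fingerprint of a known image in the database.
known = average_hash([10, 200, 30, 220, 15, 210, 25, 205])

# A re-encoded copy: slightly different pixel values, same structure.
candidate = average_hash([12, 198, 33, 219, 14, 212, 27, 202])

# A small Hamming distance flags a likely match to the known image.
print(hamming(known, candidate))  # 0
```

Crucially, this kind of scanning needs access to the image (or a fingerprint of it), which is exactly what end-to-end encryption denies to the server—hence the tension the article describes.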

Zuckerberg has waded into the debate himself; on October 3, during a company-wide Q&A session that was live-streamed to his Facebook page (scan to 37 minutes in), he pointed out that the company sent 12 million reports to the National Center for Missing and Exploited Children last year. He also noted that the company will increase investment in techniques that spot “patterns of activity” related to CSAM without looking at the actual content, before rolling out end-to-end encryption on Messenger.

“With all that said, I still think that the equities are generally in favor of moving towards end-to-end encryption,” Zuckerberg said during the Q&A. “The top messaging app in the United States is iMessage—it’s end-to-end encrypted—people want that.”

All of which suggests tech companies are searching for ways to shield the public both from criminals and from wanton surveillance by government bodies and others. The debate around encryption is frequently framed as a “battle between privacy and safety [of children],” Gebhart says. “I think that’s just wrong. It’s safety versus safety.”

Stan Horaczek contributed to this article.

This story has been updated. It was originally published on October 4.