This week, Apple announced a new “extreme, optional” security feature called Lockdown Mode that is aimed at a very small minority of users who are at risk of being deliberately targeted by cyberattacks “from private companies developing state-sponsored mercenary spyware.” It will launch with iOS 16, iPadOS 16, and macOS Ventura this fall.
Lockdown Mode is designed to block a category of hyper-targeted hacks generally used by governments (or private companies with government backing) against activists, dissidents, journalists, and high-level business people. The most infamous of these is the spyware called Pegasus, though there are presumably other, similar exploits that have not yet been exposed.
Pegasus, developed by the (now sanctioned) Israeli technology firm NSO Group, turns iPhones and Android smartphones against their users. It’s basically the stereotypical Hollywood hack: The attackers have access to pretty much everything on the device, can intercept calls and messages, and even use the microphone, camera, and GPS to record and track people. Crucially, Pegasus can be a “zero-click” exploit, meaning that it can be installed without the user doing anything; at one point, phones could even be infected through a missed WhatsApp voice call.
Cybersecurity typically involves a tradeoff between convenience and security. If you want your computer to be very difficult to hack, don’t connect it to the internet—lock it in a secure room in your house. No viruses! But also no email, Amazon, or Minecraft. Modern iPhones, iPads, and Macs come with loads of features that make them fast, convenient, and easy to use, but these same features also give hackers large “attack surfaces” to work with. Lockdown Mode turns off a lot of these features, or at least disables them by default, at some cost to the user experience.
Among the features Lockdown Mode disables, for example, are the speed and efficiency technologies in WebKit, the browser engine that powers Safari, including just-in-time (JIT) JavaScript compilation. Web pages that haven’t been flagged as “trusted” will take longer to load and may be jankier to use, but those web pages won’t be able to exploit any potential JavaScript bugs. Similarly, in Messages, most attachments other than certain trusted image types are disabled, as are link previews. And when the device is locked, wired connections to computers or accessories are blocked.
There are also features designed to limit who can contact you in an unsolicited manner, which should make zero-click exploits harder to pull off. Incoming FaceTime calls, for example, are blocked unless you have called that person within the last 30 days.
Another key feature is that once a device is in Lockdown Mode, it can’t be enrolled in (or removed from) an enterprise mobile device management (MDM) program, which is what large companies use to control the devices used by their employees. Nor can configuration profiles be installed, which college and enterprise networks use to manage the devices that connect to them. Both features have given hackers access to devices in the past, and presumably could still be abused.
And these are just some of the features at launch. Apple plans to continue to develop Lockdown Mode based on feedback from security researchers and other affected groups.
All in all, an iPhone in Lockdown Mode will be worse to use than one without it—but it will also be much more secure. This is why, as scary as attacks like Pegasus are, Apple is stressing that this is not a feature for most users. In the press release, Ivan Krstić, Apple’s head of security engineering and architecture, says, “Lockdown Mode is a groundbreaking capability that reflects our unwavering commitment to protecting users from even the rarest, most sophisticated attacks. While the vast majority of users will never be the victims of highly targeted cyberattacks, we will work tirelessly to protect the small number of users who are.”
As well as announcing Lockdown Mode, Apple announced that its Bug Bounty program rewards would be doubled—up to a maximum of $2 million—for any vulnerability researchers find that can bypass Lockdown Mode’s protections.
It also announced a $10 million grant (as well as any proceeds from its lawsuit against NSO Group) to “support organizations that investigate, expose, and prevent highly targeted cyberattacks.” Ron Deibert, director of the Citizen Lab, a research group at the University of Toronto that has uncovered a lot of information about Pegasus, said in a statement that accompanied Apple’s press release, “There is now undeniable evidence from the research of the Citizen Lab and other organizations that the mercenary surveillance industry is facilitating the spread of authoritarian practices and massive human rights abuses worldwide. I applaud Apple for establishing this important grant, which will send a strong message and help nurture independent researchers and advocacy organizations holding mercenary spyware vendors accountable for the harms they are inflicting on innocent people.”