Apple’s new Face ID system uses a sensing strategy that dates back decades

'Structured light' and artificial intelligence help power the iPhone X's flagship biometric feature.

On Tuesday, in addition to three shiny new iPhone models, Apple announced Face ID, a slick new way for people to biometrically unlock their phones by, well, showing their faces. The system relies not only on neural networks, a form of machine learning, but also on a slew of sensors that occupy the real estate near the selfie camera on the front of the handset.

The kind of facial recognition Apple is doing is different from what, say, Facebook does when it identifies a photo of you and suggests a tag. That takes place in the two-dimensional landscape of a photograph, while the latest iPhone considers the three dimensions of someone’s face and uses them as a biometric key to unlock (or not) the phone.

Alas, you’ll need to pony up the $999 for an iPhone X, as this feature works only on the company’s new flagship smartphone. Face ID is enabled by what the company calls the TrueDepth camera system, a cluster of sensors that includes an infrared camera and a dot projector. The projector throws a pattern of more than 30,000 infrared dots onto the user’s face when they want to unlock their phone, according to Phil Schiller, the Apple senior vice president who described the technology yesterday.

To identify a face, the TrueDepth camera system takes an infrared image while the projector paints those thousands of infrared dots across it, Schiller explained. “We use the IR image and the dot pattern, and we push them through neural networks to create a mathematical model of your face,” he said. “And then we check that mathematical model against the one that we’ve stored that you set up earlier to see if it’s a match and unlock your phone.”
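Apple hasn’t spelled out how that matching works under the hood, but in broad strokes it resembles a standard pattern in face recognition: the network boils a capture down to a fixed-length vector, and unlocking means checking whether a fresh vector lands close enough to the enrolled one. Here’s a minimal sketch of that final check in Python; the 128-number embedding size, the cosine-similarity measure, and the 0.8 cutoff are illustrative assumptions, not anything Apple has disclosed.

```python
import numpy as np

def unlock(enrolled: np.ndarray, candidate: np.ndarray, threshold: float = 0.8) -> bool:
    """Decide whether a fresh capture matches the enrolled face.

    Both vectors are assumed to be unit-length embeddings produced by the
    (unspecified) face network, so cosine similarity reduces to a dot product.
    The 0.8 cutoff is a placeholder, not a known Face ID parameter.
    """
    return float(np.dot(enrolled, candidate)) >= threshold

# Toy usage: a slightly noisy re-capture of the same face clears the bar,
# while an unrelated vector does not.
rng = np.random.default_rng(42)
enrolled = rng.standard_normal(128)
enrolled /= np.linalg.norm(enrolled)

recapture = enrolled + 0.01 * rng.standard_normal(128)  # same face, sensor noise
recapture /= np.linalg.norm(recapture)

stranger = rng.standard_normal(128)
stranger /= np.linalg.norm(stranger)

print(unlock(enrolled, recapture))  # True: similarity stays near 1.0
print(unlock(enrolled, stranger))   # False: random vectors are near-orthogonal
```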

Structured light

The technique of projecting a known pattern of light onto a three-dimensional object to help computer vision systems detect depth dates back decades, says Anil Jain, a professor of computer science and engineering at Michigan State University and an expert on biometrics. It’s called the structured light method.

Generally, Jain says, computer vision systems can estimate depth by using two separate cameras to get a stereoscopic view. The structured light technique replaces one of those two cameras with a projector that shines light onto the object; Apple is using a dot pattern, but Jain says other configurations, like stripes or a checkerboard, have also been used.

“By doing a proper calibration between the camera and the projector, we can estimate the depth” of the curved object the system is seeing, Jain says. Dots projected onto a flat surface would look different to the system than dots projected onto a curved one, and faces, of course, are full of curves.
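The geometry behind that calibration is the same triangulation used in stereo vision: the camera and projector sit a known distance apart, so the apparent offset between where a dot lands in the camera image and where it sits in the projector pattern reveals how far away the surface underneath it is. Here’s a rough sketch of the arithmetic; the focal length and camera-to-projector baseline below are made-up numbers, since Apple hasn’t published its specs.

```python
import numpy as np

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_length_px: float,
                         baseline_m: float) -> np.ndarray:
    """Triangulate a depth for each projected dot.

    disparity_px: per-dot offset, in pixels, between the dot's position in
    the camera image and its position in the projector pattern, measured
    after camera/projector calibration. Uses the stereo relation z = f * b / d.
    """
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        z = focal_length_px * baseline_m / d
    z[~np.isfinite(z)] = np.nan  # zero disparity: too far away to resolve
    return z

# Illustrative numbers: 1,500 px focal length, 2.5 cm baseline.
print(depth_from_disparity(np.array([50.0, 75.0, 100.0]), 1500.0, 0.025))
# [0.75  0.5   0.375] meters: dots that shift more sit on nearer surfaces
```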

During the keynote, Schiller also explained that Apple had taken steps to ensure the system can’t be tricked by ruses like a photograph or a Mission: Impossible-style mask, and had even “worked with professional mask makers and makeup artists in Hollywood.” Jain speculates that the system’s use of infrared light is what makes this possible; infrared, he says, can help distinguish materials like skin from a synthetic mask.

Finally, the system taps the power of neural networks to crunch the data it gathers during the face-identification process. A neural network is a common tool in artificial intelligence: in broad strokes, it’s a program that computer scientists teach by feeding it data. For example, a researcher could train a neural network to recognize an animal like a cat by showing it lots of labeled cat pictures; afterward, the system should be able to look at new photos and estimate whether or not they contain cats. Nor are neural networks constrained to images: Facebook, for example, uses multiple types of neural networks to translate text from one language to another.
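To make that train-then-predict loop concrete, here’s a toy version using NumPy: a one-hidden-layer network learns from labeled examples, then scores inputs it has never seen. The “images” here are synthetic eight-number feature vectors rather than real cat photos, and every size, seed, and learning-rate choice is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in dataset: 200 feature vectors, labeled 1 ("cat")
# whenever the features sum to a positive number.
X = rng.standard_normal((200, 8))
y = (X.sum(axis=1) > 0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 16 units, randomly initialized.
W1 = rng.standard_normal((8, 16)) * 0.1
b1 = np.zeros((1, 16))
W2 = rng.standard_normal((16, 1)) * 0.1
b2 = np.zeros((1, 1))

for step in range(2000):  # "teaching by feeding it data": gradient descent
    h = sigmoid(X @ W1 + b1)   # hidden activations
    p = sigmoid(h @ W2 + b2)   # predicted probability of "cat"

    # Backpropagate the cross-entropy loss through both layers.
    g_out = (p - y) / len(X)
    g_W2, g_b2 = h.T @ g_out, g_out.sum(axis=0, keepdims=True)
    g_h = (g_out @ W2.T) * h * (1 - h)
    g_W1, g_b1 = X.T @ g_h, g_h.sum(axis=0, keepdims=True)

    lr = 1.0
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

# After training, the network scores examples it has never seen.
X_new = rng.standard_normal((5, 8))
scores = sigmoid(sigmoid(X_new @ W1 + b1) @ W2 + b2)
print(scores.round(2))  # values near 1 mean "probably a cat"
```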

Other phones on the market already offer face identification, notably Samsung’s Galaxy S8 phones and its new Note8. Those rely on the handset’s front-facing camera, and the company cautions that the feature is less secure than the fingerprint reader; you can’t use it for Samsung Pay, for instance. Apple, by contrast, says its Face ID system can indeed verify Apple Pay transactions.

Apple’s biometric Face ID system “pushes the tech a notch higher, because not everybody can make a biometric neural engine,” says Jain, or train a face-recognition system on, as Apple says it did, more than one billion images. “So I think this will be a difficult act to follow by other vendors.”