How the latest iOS update made the iPhone 11’s camera even better

iOS 13.2 brings the Deep Fusion photography tech to the latest iPhone.



iPhone 11 Pro Max
The iPhone 11 Pro Max camera got an upgrade with the latest software update. Stan Horaczek

It’s hard to tell what’s going on inside smartphone cameras. Every time you push the button to take a picture, a whole computerized Rube Goldberg-style chain of events kicks off with the goal of capturing as much image information as possible. That way, powerful processors can cram it all together into a single cherished memory or hilarious snap of a turtle biting your buddy’s finger.

This week, Apple introduced its new Deep Fusion camera tech for the iPhone 11 as part of the iOS 13.2 software update. For the average user, photos taken under certain conditions will now look more detailed and slightly less noisy. There’s no indicator in the app like there is with Night mode, so you won’t even know it’s working. But those somewhat granular improvements took some serious engineering—and a whole lot of processing power—to achieve.

What is Deep Fusion?

Deep Fusion Leaf Comparison
The top image is from the iPhone XS Max (no Deep Fusion), while the bottom comes from the iPhone 11 Pro Max. The color is better in the bottom image and there’s more pronounced detail, as well. Stan Horaczek

When you have the iPhone’s camera app open, it’s taking photos before you push the shutter. It keeps a rolling buffer of images, and once you press that button or tap the screen, it captures a total of nine photos. Eight of them happened before you pushed the button. Four of those frames are short exposures, which keeps them as sharp as possible by fending off camera shake or motion blur from objects in the photo. The other four from the buffer are standard exposures to capture color and details. The ninth frame is a long exposure that brings in more light and gives the processor a brighter image to pull from.
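If you want a rough mental model of that rolling buffer, here’s a minimal sketch in Python. The class name, buffer size, and exposure labels are my own assumptions for illustration—Apple hasn’t published its capture pipeline.

```python
# A minimal sketch of the rolling-buffer idea described above, not Apple's
# actual capture code. Buffer size and exposure labels are assumptions.
from collections import deque

import numpy as np

BUFFER_SIZE = 8  # four short + four standard exposures kept before the shutter press


class RollingCaptureBuffer:
    """Keeps the most recent pre-shutter frames so a shutter press can
    grab images that were captured before the button was pushed."""

    def __init__(self):
        self.frames = deque(maxlen=BUFFER_SIZE)

    def on_new_frame(self, frame: np.ndarray, exposure: str) -> None:
        # Called continuously while the camera app is open.
        self.frames.append((exposure, frame))

    def on_shutter_press(self, long_exposure: np.ndarray):
        # Eight buffered frames plus one long exposure = nine total.
        shorts = [f for kind, f in self.frames if kind == "short"]
        standards = [f for kind, f in self.frames if kind == "standard"]
        return shorts, standards, long_exposure
```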

Up to this point, Deep Fusion and Smart HDR work basically the same way. From here, however, Deep Fusion takes the sharpest of the four short exposures and basically ditches the rest. It then fuses the four standard exposures with the long exposure to get the color and highlights that belong in the finished photo.
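Here’s a loose sketch of that selection-and-merge step, again in illustrative Python with numpy arrays standing in for frames. The sharpness metric (variance of a simple Laplacian) and the straight average are guesses on my part, not Apple’s actual math.

```python
# A rough, hypothetical version of "keep the sharpest short frame, blend the
# standard frames with the long exposure." Not Apple's algorithm.
import numpy as np


def sharpness(frame: np.ndarray) -> float:
    """Variance of a discrete Laplacian: higher means more fine detail."""
    gray = frame.mean(axis=-1) if frame.ndim == 3 else frame
    lap = (
        -4 * gray[1:-1, 1:-1]
        + gray[:-2, 1:-1] + gray[2:, 1:-1]
        + gray[1:-1, :-2] + gray[1:-1, 2:]
    )
    return float(lap.var())


def fuse(shorts, standards, long_exposure):
    # Keep only the sharpest of the four short exposures; ditch the rest.
    best_short = max(shorts, key=sharpness)
    # Blend the standard frames with the long exposure for color and highlights.
    tone_stack = np.stack(standards + [long_exposure]).astype(np.float32)
    tone_reference = tone_stack.mean(axis=0)
    return best_short, tone_reference
```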

This entire process happens seemingly instantaneously, and it exists to give the A13 Bionic chip all the raw material it needs to smash those frames into something that looks like real life. That reliance on Apple’s latest processor is the reason why only iPhone 11 users get Deep Fusion, at least for now.

Deep Fusion uses the A13’s neural engine to analyze the individual pixels in a scene, as well as their relationships to each other in the frame. It can pick and choose where to try to preserve highlights and where to add contrast to make details look more pronounced.

To my eye, the sharpening effect is by far the most noticeable difference between Deep Fusion and Smart HDR photos. When high-end retouchers work to smooth out skin in a photo, they use a technique called frequency separation, which allows them to separate the fine details and edges of the image from its colors and tones and manipulate them independently. So, if you wanted to take out blemishes or trouble areas in a picture of a face, you could do it without losing the skin’s natural texture or color.
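If you’ve never seen frequency separation in practice, here’s a stripped-down version of the idea in Python. The box blur and its radius are arbitrary stand-ins for the Gaussian blur retouchers typically use.

```python
# A bare-bones illustration of frequency separation, the retouching technique
# described above. Kernel size and the blur choice are arbitrary for the sketch.
import numpy as np


def box_blur(img: np.ndarray, radius: int = 4) -> np.ndarray:
    """Cheap separable box blur standing in for the usual Gaussian."""
    out = img.astype(np.float32)
    kernel = np.ones(2 * radius + 1, dtype=np.float32) / (2 * radius + 1)
    for axis in (0, 1):
        out = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode="same"), axis, out
        )
    return out


def frequency_separation(img: np.ndarray):
    low = img.astype(np.float32) - (high := img.astype(np.float32) - box_blur(img))
    return low, high  # low = colors and tones, high = fine detail and edges


def recombine(low: np.ndarray, high: np.ndarray) -> np.ndarray:
    # Edit `low` (smooth tones) or `high` (texture) independently, then add
    # them back together to rebuild the image.
    return np.clip(low + high, 0, 255).astype(np.uint8)
```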

Deep Fusion person test
The left image comes from the iPhone XS Max, while the right image is the iPhone 11 Pro Max using Deep Fusion. The sharpening is most evident in my beard hair and in the face lines that indicate seasonal allergies and not nearly enough sleep. Stan Horaczek

Deep Fusion does something similar. It’s pulling the details and sharp edges from one of the short images, while it pulls color and light detail from a mashup of the standard and long exposures. This allows the neural engine to tweak things individually. So, if you take a picture of a person with a beard standing in front of a blue sky, it can pull extra-sharp details and darker sky tones from the short image, and the brightness and color for the face itself from the longer exposures that brighten things up.
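Put the two earlier sketches together and you get something like the toy merge below: the detail layer comes from the sharpest short frame, the tone layer from the standard/long blend. It reuses the hypothetical frequency_separation helper from the retouching sketch above, and it’s my guess at the general shape of the process, not Apple’s algorithm.

```python
# Toy "details from the short frame, color from the long exposures" merge.
# Assumes frequency_separation() from the retouching sketch above; purely
# illustrative, not Apple's actual processing.
import numpy as np


def deep_fusion_style_merge(best_short: np.ndarray, tone_reference: np.ndarray) -> np.ndarray:
    # High-frequency layer (sharp edges, texture) from the sharpest short frame.
    _, detail = frequency_separation(best_short)
    # Low-frequency layer (color, brightness) from the standard/long blend.
    tones, _ = frequency_separation(tone_reference)
    # Recombining the layers yields a frame that is both sharp and well exposed.
    return np.clip(tones + detail, 0, 255).astype(np.uint8)
```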

What to look for

You may have heard people online refer to Deep Fusion as “sweater mode.” That nickname stemmed from Apple’s original Deep Fusion sample photos, which depicted people wearing sweaters. Those woolen garments actually make for fairly perfect Deep Fusion demonstrations because they have lots of fine texture and patterns crammed into small spaces. Deep Fusion excels at drawing out those little details.

Consider the picture of the underside of a house plant’s leaf earlier in the article. The intricate structures pop much more obviously with the iPhone 11 Pro camera than they do with the iPhone XS Max’s.

In terms of color and contrast, the Deep Fusion images look less flat than the XS Max’s photos. I’ve always found Apple’s Smart HDR a little overzealous when it comes to flattening shadows and highlights, so Deep Fusion is an improvement there.

The sharpening performance, however, is curious. It makes total sense for complicated nature scenes, where the extra detail does nice work in landscapes and other busy compositions. It sometimes looks somewhat oversharpened to my eye, but I also formed my tastes on 35mm film, which is a totally different—and not so crispy-looking—beast.

iPhone 11 Deep Fusion leaves sample
The top shot (iPhone XS Max without Deep Fusion) has plenty of sharpness and detail, but the bottom image (shot with an iPhone 11 Pro Max with Deep Fusion) cranks it up another notch. Stan Horaczek

Once you take a picture of a person, however, Deep Fusion’s effect seems more out of place. Most common face-filter apps apply skin-smoothing techniques—sometimes to a hilarious extent, which makes people look like sexy globs of Play-Doh. Deep Fusion goes the other way and actually seems to accentuate texture on faces. Clearly, Apple is taking faces into consideration as part of its AI processing, but any perceived skin imperfections only get a little more imperfect with the added sharpening.

The effect also really emphasizes texture in hair. Sometimes that works to the subject’s advantage, but in my case, for instance, the added emphasis makes my beard look even more like something you’d find on a grizzled old wizard. It’s similarly unflattering to the hair on top of my head in a way that makes me feel just great about posting an up-close selfie in this very public article.

Of course, Apple is using AI to manage most of this process, and future software updates could tweak its performance to drastically change the overall look.

Unlike Night Mode, there’s no indicator in the app to show you when you’re using Deep Fusion. It wouldn’t really matter anyway, since it’s the default camera mode in most lighting situations. It doesn’t work with the ultra-wide camera and, when you’re in a scene with too much light, it will kick back over to Smart HDR mode. When light gets dim enough, then Night Mode takes over.
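If you wrote that hand-off logic yourself, it might look something like the sketch below. The lux thresholds and camera labels are invented for illustration; Apple doesn’t publish the actual cutoffs.

```python
# A hypothetical decision sketch of the mode hand-off described above.
# The thresholds are made-up numbers, not Apple's real values.

LOW_LIGHT_LUX = 10      # assumption: below this, Night mode takes over
BRIGHT_LIGHT_LUX = 600  # assumption: above this, Smart HDR takes over


def pick_processing_mode(camera: str, scene_lux: float) -> str:
    if camera == "ultra-wide":
        return "Smart HDR"       # Deep Fusion doesn't run on the ultra-wide camera
    if scene_lux < LOW_LIGHT_LUX:
        return "Night mode"
    if scene_lux > BRIGHT_LIGHT_LUX:
        return "Smart HDR"
    return "Deep Fusion"         # the default in most mid-range lighting
```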

Generally, Deep Fusion is an improvement when it comes to the camera’s overall performance. However, if this is how people are going to be taking pictures going forward, I should probably start moisturizing and conditioning my beard.
