

Photographers who start on smartphones and move to dedicated cameras typically run into a few surprises during the transition. Without built-in HDR, blue skies get blown out. Without some kind of Night Mode, shots in the dark are actually, well, dark. And perhaps the biggest change: a noticeable number of shots come out of the camera visibly out of focus.

A smartphone’s ability to keep pretty much everything in focus all the time is a byproduct of its primary hardware limitation. The image sensor—the little slab of pixels that actually captures light when you take a photo—has to be tiny in order to fit inside the device itself. Practically speaking, that’s a key reason why smartphone images have so much depth of field. It’s also why missing focus isn’t nearly as apparent on a smartphone as it is with a larger-sensor camera. Now, Apple has increased the size of the sensor inside the iPhone 12 Pro Max by 50 percent, which will have some effect on how much of your shot is blurry and how much is sharp.

That 50 percent difference is almost exactly the same as the rift between the smaller APS-C sensors often found in consumer and enthusiast dedicated cameras and the full-frame chips inside more professional models. (Full-frame refers to 36mm x 24mm sensors, the size of a single frame of 35mm film, while APS-C roughly matches the 25mm x 17mm frame of an ill-fated film format.)
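To put rough numbers on that comparison, here's a quick back-of-the-envelope calculation using commonly published sensor dimensions. The APS-C figures vary slightly by manufacturer, so treat these as approximations rather than any one camera's spec sheet:

```python
import math

def diagonal(width_mm, height_mm):
    """Sensor diagonal, the usual basis for 'crop factor' comparisons."""
    return math.hypot(width_mm, height_mm)

full_frame = diagonal(36.0, 24.0)   # ~43.3 mm
aps_c = diagonal(25.1, 16.7)        # ~30.1 mm (exact dimensions vary by brand)

print(full_frame / aps_c)             # ~1.44 -- full frame is roughly 50 percent bigger, measured linearly
print((36.0 * 24.0) / (25.1 * 16.7))  # ~2.1  -- or about double, measured by area
```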

Calculating depth of field is simple math, but it gets muddied in the real world by different types of gear and use cases. You can read a solid explainer on the practical differences sensor size makes here. It essentially boils down to your aperture, the focal length of your lens, and your distance from the subject. A bigger sensor changes how all of those factors play out while you're out shooting: to keep the same framing, you end up using a longer focal length, which shrinks the zone of sharp focus.
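As a sketch of that math, here's the standard thin-lens depth-of-field approximation. The numbers below are illustrative stand-ins (an assumed ~5mm phone lens with a small-sensor circle of confusion versus a 26mm lens on full frame), not Apple's published specs:

```python
def depth_of_field(focal_mm, f_number, subject_m, coc_mm):
    """Near/far limits of acceptable sharpness using the thin-lens approximation."""
    f = focal_mm
    s = subject_m * 1000.0                         # work in millimeters
    hyperfocal = f ** 2 / (f_number * coc_mm) + f  # focus here and everything to infinity looks sharp
    near = s * (hyperfocal - f) / (hyperfocal + s - 2 * f)
    far = s * (hyperfocal - f) / (hyperfocal - s) if s < hyperfocal else float("inf")
    return near / 1000.0, far / 1000.0             # back to meters

# A subject 2 meters away at f/1.6, framed the same way on two very different sensors:
print(depth_of_field(focal_mm=5.1, f_number=1.6, subject_m=2.0, coc_mm=0.006))
# -> roughly (1.2, 7.6): several meters of the scene stay sharp on a tiny sensor
print(depth_of_field(focal_mm=26.0, f_number=1.6, subject_m=2.0, coc_mm=0.029))
# -> roughly (1.8, 2.3): only about half a meter stays sharp on full frame
```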

Also consider that the 12 Pro Max now has an even wider fixed aperture at f/1.6, which will naturally reduce depth of field even more. It's not a huge jump from the previous f/1.8 down to f/1.6, but at numbers this low, even small changes have a noticeable impact on the overall look of the image.
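For a sense of scale, here's the arithmetic behind that claim, assuming the aperture is the only thing changing:

```python
import math

old_f, new_f = 1.8, 1.6                 # previous wide-camera aperture vs. the new one

light_gain = (old_f / new_f) ** 2       # light gathered scales with 1 / f-number squared
stops = 2 * math.log2(old_f / new_f)    # the same change expressed in stops

print(f"{(light_gain - 1) * 100:.0f}% more light, or about {stops:.2f} stop")  # ~27%, ~0.34 stop
# Depth of field at a fixed focal length and distance shrinks roughly in
# proportion to the f-number, so the in-focus zone gets about 11 percent shallower.
```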

In this image, shot with an iPhone 11 Pro Max, the focus is on the dead flower in the bottom right of the frame. Stan Horaczek

In this image, the focus is on the leaves in the middle of the frame; it missed the flower. You might not notice the blurry flower on a small screen, but the effect will be more pronounced on the 12 Pro Max thanks to the bigger sensor and faster lens. Stan Horaczek

All of that means you can actually take out-of-focus photos with the iPhone 12 Pro Max unless you're paying attention to what you're shooting. Faster apertures and bigger sensors are simply less forgiving from a hardware standpoint, especially since you can't stop down the aperture at all to buy yourself more depth of field. It's not an earth-shattering difference from a hardware perspective, but it cuts against Apple's overall mission of preventing people from taking technically "bad" photos by any means necessary.

Apple has also added a LiDAR sensor to the 12 Pro Max camera array. And while its primary function is to help with AR and VR applications, the company also says that it will help with the camera’s autofocusing. That extra step beyond the typical autofocus pixels built into the chip could help shooters in a situation where the depth of field isn’t so forgiving.

Beyond the focusing feature, Apple is also giving the 12 Pro Max access to its new ProRAW file format, which saves all of the image data as well as the processing metadata that comes through the image signal processor. Pro photographers typically shoot in raw because it maintains image data that’s outside the scope of what a lossy format like JPEG can handle. As a result, shooters can intentionally overexpose or underexpose an image knowing that they can pull back details in editing that would otherwise have been lost with a simple JPEG.
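Here's a toy illustration of why that latitude matters. The amount of highlight headroom is an assumption for the sake of the example, not a spec of ProRAW or any particular sensor:

```python
import numpy as np

# Linear scene brightness, where 1.0 is the brightest value a JPEG can store.
scene = np.array([0.2, 0.9, 1.4, 2.0])

jpeg = np.clip(scene, 0.0, 1.0)   # anything brighter than 1.0 is simply thrown away
raw = np.clip(scene, 0.0, 4.0)    # assume the sensor clips two stops later

# "Pull back" the exposure by one stop in an editor:
print(np.clip(jpeg * 0.5, 0.0, 1.0))  # [0.1  0.45 0.5  0.5 ] -- the clipped highlights stay flat
print(np.clip(raw * 0.5, 0.0, 1.0))   # [0.1  0.45 0.7  1.  ] -- the detail comes back
```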

ProRAW will also presumably help in situations like the red skies caused by the U.S. West Coast wildfires. Cities saw their skies turn red, but the iPhone camera's computational photography pipeline tried to correct the tint and made the scenes look underwhelming. That's because the camera typically looks for a neutral grey tone to prevent the color of light in a scene from straying too far from the norm. True raw capture allows photographers to pick the neutral color point during processing without having to wrestle with auto white balance while shooting.
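Here's a toy version of that neutral-grey logic. It's a simple "gray world" balance, not Apple's actual pipeline, but it shows why a genuinely red scene gets pushed back toward neutral unless you can set the white point yourself from the raw data:

```python
import numpy as np

def gray_world(image):
    """Scale each color channel so the whole frame averages out to neutral grey."""
    means = image.reshape(-1, 3).mean(axis=0)  # average R, G, B across the image
    gains = means.mean() / means               # per-channel correction multipliers
    return np.clip(image * gains, 0.0, 1.0)

# A frame dominated by wildfire-red sky (R, G, B values):
red_sky = np.full((4, 4, 3), [0.8, 0.3, 0.2])

print(red_sky[0, 0])              # [0.8 0.3 0.2] -- the red the eye actually saw
print(gray_world(red_sky)[0, 0])  # ~[0.43 0.43 0.43] -- "corrected" to a dull grey
```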

If you never plan to open your photos in any kind of advanced editor, like Adobe Lightroom, then shooting raw is largely a waste anyway. The files are typically considerably larger than JPEGs or HEICs because they’re holding onto more information. For people just looking to snap photos that look like typical iPhone photos, the raw data isn’t going to do them much good.

Apple has already faced some criticism for creating a hardware rift between the iPhone 12 camera and the Pro Max camera. In some ways, however, the move makes sense. The main iPhone camera will always be about giving people the easiest path to a pretty good picture. The Pro Max may be objectively "better" when it comes to maintaining detail and combating noise in low light, but that could come at the cost of some number of missed shots.

Apple doesn’t expect the average person to edit their photos in an app that goes much beyond auto-adjust or adding a filter. And why would the company want shooters to have to start worrying about focusing after all this time with ample depth of field? After all, portrait mode can always fake all the background blur that you want.