Roughly three years ago, when Apple announced the iPhone 8 Plus, it was a big milestone for smartphone cameras. Apple turned on its HDR feature—which merges several photos into a single final image—by default. Since then, Apple and pretty much every other smartphone manufacturer have been refining their software and algorithmic photography techniques to squeeze every last bit out of hardware that—in the grand scheme of things—hasn’t changed all that much.
When Apple introduced the iPhone 12 Pro Max, however, camera nerds like myself took notice. The company promised to grow the main imaging sensor by 47 percent while keeping the same resolution. That means bigger pixels, which typically translates into less ugly digital noise in low-light situations and, generally, less reliance on features like Night Mode. The Pro Max also promises a new lens with a wider f/1.6 aperture to let in more light and an image stabilization system that moves the sensor instead of glass lens elements to combat blurry photos stemming from shaky hands. The telephoto lens now reaches even longer to give a true 2.5x optical zoom. These are considerable hardware shifts from the previous generation. But what do they mean for real-world situations?
In short: The iPhone 12 Pro Max is the best overall smartphone camera I’ve used. But, before you make the jump and expect great images and video, there are a few important things to consider.
The wide-angle camera
At the heart of the iPhone 12 Pro Max’s camera module, the main wide-angle camera handles the bulk of the video and photography duties. This is where you get the larger sensor. If you’re expecting its photos to look profoundly different from those of your current iPhone 11 or an iPhone 12, you’ll be disappointed, but there are some key differences, some of which you’ll likely notice in day-to-day use.
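To put that bigger sensor in perspective, the math is simple: with the resolution unchanged, Apple’s claimed 47 percent jump in sensor area goes entirely into pixel size. Here’s a quick back-of-the-envelope sketch (the 1.4-micron starting pitch is my assumption for illustration, not an official spec):

```python
# Back-of-the-envelope: how a 47% larger sensor area at the same
# resolution translates into pixel size.
area_growth = 1.47            # Apple's claim: sensor area up 47 percent

# With the pixel count unchanged, each pixel's area grows by the
# same factor, and its linear pitch by the square root of that.
pitch_growth = area_growth ** 0.5
print(f"pixel pitch grows by {pitch_growth:.2f}x")    # ~1.21x

# Assuming a 1.4-micron pitch on the older sensor (illustrative):
old_pitch_um = 1.4
new_pitch_um = old_pitch_um * pitch_growth
print(f"estimated new pitch: {new_pitch_um:.2f} um")  # ~1.70 um
```

A roughly 21 percent wider pixel doesn’t sound like much, but each one collects about 47 percent more light, which is exactly where the low-light gains come from.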
First, the larger sensor and wider f/1.6 aperture (smaller f-numbers indicate bigger physical lens openings for letting in light) do have an effect on the depth of field you can expect to see in your photos. You’re not going to suddenly have DSLR-grade bokeh (the background blur that’s typically associated with fast aperture lenses), but you will probably notice more blur in the background of your photos. It’s especially pronounced when your subject is very close to the camera.
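If you want to put numbers on that wider aperture: a lens’s light-gathering area scales with the inverse square of its f-number, so the jump from an f/1.8 wide-angle lens like the iPhone 11’s (my reference point here) to f/1.6 works out to roughly 27 percent more light:

```python
# Light throughput scales with the aperture area, which is
# inversely proportional to the square of the f-number.
def relative_light(f_new: float, f_old: float) -> float:
    return (f_old / f_new) ** 2

gain = relative_light(1.6, 1.8)
print(f"f/1.6 gathers {gain:.2f}x the light of f/1.8")  # ~1.27x
```

More light per exposure means the camera can use shorter shutter times or lower ISO before it has to lean on computational tricks.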
Blurry backgrounds are appealing because they allow you to set your subject apart from the background, but very small sensors (like those found in smartphones) have a hard time creating this effect. Compare the bokeh in these two shots above, one from the 12 Pro Max and one from the 12 Pro. Clearly, the Pro Max has more blur, which is what you’d expect. For an up-close shot like this, it’s appealing.
In other circumstances, it gets a little more complicated. Just because a camera creates more bokeh, doesn’t mean it’s beautiful blur. In these shots below, you’ll notice that the trees in the background are, in fact, blurrier with the 12 Pro Max, but the leaf pattern looks a bit harsh and crunchy. I suspect this is happening for one of a few reasons. For one, Apple developed a new seven-element lens to help maintain edge-to-edge image sharpness on the larger sensor, so it could be a product of the lens design itself. Apple is also applying its Deep Fusion technology to each image, which emphasizes contrast in areas of fine detail. That’s great for things like sweaters or other situations where you’d want to pull out lots of details, but it can look harsh in some situations.
I suspect Deep Fusion is also responsible for what I’d consider a bit of oversharpening on the hair in this photo. Despite how it sounds, “oversharpening” isn’t necessarily bad. It can translate into an image looking “crisp” when viewed small. But, zoom in and you’ll start noticing artifacts that aren’t very appealing.
All of that said, I think the slightly shallower depth of field is a win for the new larger sensor and I do think there’s a noticeable upgrade from the 12 Pro to the 12 Pro Max, let alone over the previous generations.
In typical shooting conditions, I noticed a difference in overall image quality between shooting in JPEG and HEIC formats. I preferred the look of the HEICs because they had more contrast and the shadows were slightly darker.
When shooting outdoors—or even indoors with a view of the outdoors—the iPhone still goes to really great lengths to prevent skies from blowing out and getting too bright. In this shot of a tree, you can see that the bright blue seems to have the luminance dropped and the saturation cranked. It’s a bit much for me, but then again, I’ve never minded a slightly blown sky. I know I’m in the minority there.
When it comes to low-light photography, the comparisons are similarly complicated. Apple’s Night Mode tech showed up in the iPhone 11; it allows the camera to take much longer exposures and crunch them together with shorter captures to make a final image that’s properly exposed but not blurry from a shaking camera. It works for making dramatic pictures, but I also think the effect is too much in some circumstances. The colors get unrealistic and you lose a true sense of what a dark scene feels like.
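The principle behind that multi-frame crunching is straightforward: averaging several noisy captures of the same scene beats a single noisy capture, because random sensor noise shrinks roughly with the square root of the frame count. Here’s a toy sketch of the idea (real Night Mode also aligns frames and rejects motion, which this ignores; all the numbers are illustrative):

```python
import random
import statistics

# Toy illustration of exposure stacking: averaging N noisy readings
# of the same scene value cuts random noise by roughly sqrt(N).
random.seed(42)
true_value = 100.0   # the "real" brightness of one pixel
noise_sigma = 10.0   # read noise of a single short exposure

def capture() -> float:
    return true_value + random.gauss(0, noise_sigma)

single = capture()
stacked = statistics.mean(capture() for _ in range(16))

print(f"single-frame error : {abs(single - true_value):.1f}")
print(f"16-frame stack error: {abs(stacked - true_value):.1f}")
```

That noise reduction is why Night Mode shots look so clean; the trade-off is that anything moving during those stacked frames gets smeared or has to be thrown out.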
Because the 12 Pro Max captures more light and handles it more effectively, it’s not as quick to jump to Night Mode and instead more often gives you a more traditional picture. I prefer this because a single exposure is more conducive to capturing moving subjects (even if you get a little natural motion blur) and you don’t have to worry about holding the camera as still during the capture process. I also think the non-Night-Mode shots just look more natural in many cases. I will give Apple credit for making the Night Mode look—with saturated colors and amplified illumination—less pronounced.
Ultimately, Apple promised that the 12 Pro Max is better in low-light and it delivered on the promise. If you’re the type to edit your photos before posting or sharing them, this will be a bigger deal since I found the Night Mode shots particularly difficult to edit to my taste.
Like the iPad Pro, Apple equipped the 12 Pro Max with a Lidar sensor that helps the camera focus in low-light situations. It’s hard to tell when it actually kicks in because the camera uses scene recognition tech to switch between contrast AF, phase detection AF, and Lidar depending on what makes the most sense for the shooting situation.
In short: it will still hunt a bit in low light, but it will fail to focus outright less often than a phone without Lidar would.
The portrait camera
Apple’s built-in telephoto camera didn’t get a bigger sensor like the wide-angle, but it did get a new lens. It offers a field of view that’s roughly equivalent to a 65mm lens on a full-frame sensor like you’d find in many pro-grade cameras. That’s up from the 56mm equivalent found in the 12 Pro and previous iPhones. You may read other reviews in which authors are keen to remind you that true “portrait lenses” tend to start around 85mm in focal length, but I think 65mm makes more sense in this context. It’s only 5mm away from 70mm, which is the longest setting on the zoom lens used by many professional photographers (including myself) to shoot portraits all the time.
The extra zoom is nice, but it also complicates things when it comes to image quality. Whether you know it or not, clicking the 2.5x or 2x “zoom” button in your camera app doesn’t always switch over to the telephoto lens. Sometimes, it simply uses “digital zoom” to crop into the image and enhance it via software to make up for the lost resolution. So, in this shot of some sculptures at night, for instance, I clicked 2.5x zoom to get closer to the faceless heads. It was too dark for the f/2.2 telephoto lens and its smaller sensor to make a good-looking image, so it defaulted back to a cropped-in image from the wide-angle camera. Because it zooms to 2.5x instead of just 2x like the Pro, the image quality can actually look slightly muddier with the Pro Max.
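To see why those cropped “zoom” shots get muddy, it helps to sketch what digital zoom actually does: it throws away everything outside the center of the frame and stretches what’s left back to full size. This toy version uses nearest-neighbor enlargement (real phones use much smarter upscaling, but the information loss is the same):

```python
# Toy sketch of digital zoom: crop the central 1/zoom of the frame,
# then enlarge it back to full size by repeating pixels.
def digital_zoom(image, zoom):
    h, w = len(image), len(image[0])
    ch, cw = int(h / zoom), int(w / zoom)          # crop dimensions
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = [row[left:left + cw] for row in image[top:top + ch]]
    # Nearest-neighbor enlarge back to the original dimensions.
    return [[crop[int(y * ch / h)][int(x * cw / w)] for x in range(w)]
            for y in range(h)]

# A 10x10 frame of unique (y, x) "pixels" zoomed to 2.5x...
frame = [[(y, x) for x in range(10)] for y in range(10)]
zoomed = digital_zoom(frame, 2.5)
# ...draws on only a 4x4 region of the original sensor data.
print(len({px for row in zoomed for px in row}))   # 16 distinct source pixels
```

At 2.5x, only 1/6.25 of the sensor’s pixels contribute to the final image, which is why fine detail turns to mush when the phone falls back to cropping the wide camera.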
This low-light picture of a shopping cart is another good example of an image that was shot at 2.5x, but came from the main camera’s sensor. The fine details of the cart itself look pretty muddy due to the enlargement.
Ultimately, however, these are nitpicks. In normal circumstances, the 2.5x telephoto is great, and using it in medium to bright light produces great results. Just be aware of which camera you’re actually using.
Ultra-wide and selfie cameras
Personally, I don’t get a ton of use out of either of these cameras, and neither has changed drastically from before. The selfie camera now has Night Mode and Deep Fusion, which makes selfies look more detailed than before, but it’s not a world of difference.
The ultra-wide camera now has Night Mode, which does make a big difference when compared to the iPhone 11. You can still expect a lot of distortion around the edges of your photos (very typical of lenses this wide) and skewed perspectives as an expected cost of cramming more information into the frame.
Stabilization and video
In addition to the new sensor, Apple also put it on a moving mount that makes up to 5,000 micro-adjustments per second in order to counteract your shaking hands. From a photography perspective, I didn’t notice a ton of difference with the new system. I was already pretty impressed with how image stabilization worked in previous iPhones, so you can expect that it will still be good.
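For the curious, here’s a toy model of why an update loop that fast works (everything besides the 5,000-per-second figure is an illustrative assumption, and real systems use gyro data and far more sophisticated control):

```python
import math

# Toy model of sensor-shift stabilization: at each update, the mount
# shifts the sensor to follow the measured hand motion, one step
# behind. The residual blur is what slips through between updates.
rate_hz = 5000                 # adjustments per second (Apple's figure)
duration_s = 0.05              # a 1/20 s exposure
steps = int(rate_hz * duration_s)

def hand_motion(t: float) -> float:
    # Illustrative hand shake: an 8 Hz wobble, 30 microns at the sensor.
    return 30.0 * math.sin(2 * math.pi * 8 * t)

sensor_offset = 0.0            # where the mount has moved the sensor
uncorrected_blur = 0.0
corrected_blur = 0.0
for i in range(steps):
    motion = hand_motion(i / rate_hz)
    uncorrected_blur = max(uncorrected_blur, abs(motion))
    corrected_blur = max(corrected_blur, abs(motion - sensor_offset))
    sensor_offset = motion     # mount catches up for the next update

print(f"blur without stabilization: {uncorrected_blur:.1f} um")
print(f"blur with stabilization:    {corrected_blur:.2f} um")
```

The residual blur is roughly whatever the hand moves between two consecutive updates, which at 5,000 Hz is a tiny fraction of the total shake.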
When it comes to video, the stabilization stands out. It reminds me a lot of GoPro’s Hypersmooth tech, which makes even vigorous motion seem tame. Walking around, shake is kept to a minimum, but some objects in the scene seem to wobble slightly as you move.
This is the first iPhone generation that shoots natively in HDR, which is a big step in terms of image quality, but also creates some compatibility quirks that will work themselves out over the next couple months. I’ve been shooting at maximum quality during my testing and I’ve generally been impressed with the overall look of the HDR footage. I have noticed some slightly odd flickering in some of the videos, however. I noticed it primarily in this indoor video posted below, but some other reviewers have seen it outside in nice light as well. Here are some samples to check out. I’ll be shooting more with it in the coming weeks, but I’ll also be following the dedicated cinema bloggers who have a better handle on the motion aspects of the camera than I do.
A phone’s size typically doesn’t play too much into a camera-oriented review, but the 12 Pro Max really is massive. Holding it in one hand isn’t the simplest task, though the squared sides are much easier to grip than the rounded edges on the 11 Pro Max. Honestly, I’m at the point where I’d like a case with an honest-to-goodness grip on it for holding the phone steady and correctly aligned.
Who should buy it?
If you want the best smartphone camera available at the moment, I think this is it. The bigger sensor makes a difference—albeit a subtle one—and the overall image quality difference will be noticeable if you’re coming from an older generation.
Personally, I’m looking forward to Apple’s upcoming Pro Raw image format, which promises more control over some of the automated imaging systems it applies to photos. I’d love to see what bokeh looks like with Deep Fusion toned down or turned off. I’d love to be able to tweak the extent to which the blue skies are burned-in (a darkroom photography term that basically means “darkened”) when shooting outdoors.
The 65mm telephoto lens is a genuine upgrade for shooting portraits and zooming in general, even if it can’t quite match the optical zoom skills of the Samsung Galaxy Note 20. I’ve still been impressed with what Google has done in its Pixel phones with almost no change in hardware, but Apple’s jump to a bigger sensor is a differentiator.
Whether it’s worth spending the extra money will have to be a personal decision. If you’ll notice the subtle-but-real improvements in image quality, then it will be worth it. Once the new raw image format enters iOS, I expect that the 12 Pro Max will be even more flexible and appealing to people who like to edit and finish their images before sharing.