At its recent product announcement event in New York City, Google showed off a handful of new gear. But the company dedicated considerable time—and presumably considerable money, hiring iconic portrait photographer Annie Leibovitz—to showing off the Pixel 4 smartphone's new camera. That emphasis makes sense: Consumers still say photo quality is one of the most important factors in picking a new device. And Google is coming off a very strong showing with the Pixel 3, which was (at least as far as I was concerned at the time) the absolute best smartphone camera.
The Pixel 4 adds some more AI-powered smarts, relying increasingly on its software to determine the overall look of the final image. And while the camera has some moments where it’s truly excellent, I ran into a few growing pains as Google tries to calculate its way to perfect photos.
What’s new?
On paper, the Pixel 4's camera doesn't seem all that different from the hardware that came before it. The notable exception is the addition of a telephoto lens, which Google says will improve performance specifically when it comes to zooming and portrait mode. The competition, however, is stiffer this year: Apple seems to have corrected some of the over-zealous HDR processing that made iPhone XS images look unrealistic and unnatural at times, and the Cupertino company promises to further improve the iPhone 11 Pro's already-very-good camera when its detail-enhancing Deep Fusion tech arrives in the next iOS update.
Image quality
Google doesn’t pull punches when it comes to computational photography, which relies more on processing power and algorithms than pure hardware performance. The company makes it abundantly clear that the software magic that happens during and after you press the shutter has become extremely important in determining the look of the final image.
Like almost every smartphone camera at this point, the Pixel doesn't simply take one photo when you press the shutter. Instead, it captures a burst and combines information from those images into one finished file. This "smart HDR" tech does a lot of good: It can prevent highlights from getting blown out, or flatten out a super-contrasty scene that would otherwise lose crucial details. But, as with the iPhone 11 Pro, it can be unpredictable.
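Conceptually, the merge looks something like the minimal sketch below. This is a toy, average-and-tone-map version in Python for illustration only, not Google's actual HDR+ pipeline, which also aligns frames and merges per tile.

```python
import numpy as np

def merge_burst(frames):
    """Toy burst merge: average several short exposures to cut noise,
    then lift shadows with a simple tone curve. Real smart HDR
    pipelines also align frames and merge per tile."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    merged = stack.mean(axis=0)            # averaging N frames cuts noise roughly by sqrt(N)
    normalized = (merged / 255.0).clip(0, 1)
    tone_mapped = normalized ** 0.6        # gamma < 1 brightens shadows, preserves highlights
    return (tone_mapped * 255).astype(np.uint8)

# Simulate a burst of 8 noisy, underexposed frames of the same flat scene
scene = np.full((4, 4), 60.0)
burst = [scene + np.random.normal(0, 10, scene.shape) for _ in range(8)]
print(merge_burst(burst))   # brighter and far less noisy than any single frame
```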
In good conditions shooting with the main wide-angle camera, I prefer the images that come out of the Pixel 4 to those from the iPhone 11 Pro. It’s close, but the Pixel’s camera still feels more neutral and natural to me. I don’t notice the HDR effect that can make subjects look unrealistic—and sometimes even cartoonish—as much as I do with the iPhone. This is especially useful for users who edit their photos after taking them (something very few typical users do).
Google made a few welcome improvements to its overall HDR experience as well. When you tap the screen to focus on an object in the image, two sliders now pop up for adjusting the brightness of the scene. One slider affects the overall exposure (how bright or dark everything looks) in the scene, while the other simply affects the shadows. That second slider is extremely useful. It allows you to do things like taking silhouette photos in which the subject is virtually blacked out while the background (usually the bright sky) stays properly exposed.
You can also achieve the opposite effect in which you can brighten up a dark foreground subject without blowing out a bright sky in the background. In a situation like the one pictured below, you’d typically lose some of those nice yellow leaf details to shadow unless you brightened the whole image and blew out the sky. Adjusting the shadow slider allows you to bring up the exposure on the leaves while leaving the sky alone.
That slider is one of my favorite additions to the Pixel 4 camera, and it's a trend I'd love to see continue as we head into a future of HDR all the time, on everything.
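To make the distinction between the two sliders concrete, here's a rough approximation of what each one does, written as a Python sketch. The curve shapes are my own guesses, not Google's actual tone-mapping code.

```python
import numpy as np

def adjust(image, exposure_ev=0.0, shadow_lift=0.0):
    """Approximate the Pixel 4's dual sliders: exposure_ev scales every
    pixel (in photographic stops); shadow_lift brightens only dark tones."""
    img = image.astype(np.float32) / 255.0
    img = img * (2.0 ** exposure_ev)        # global exposure: the whole scene brightens or darkens
    weight = (1.0 - img).clip(0, 1) ** 2    # near 1 in shadows, near 0 in highlights
    img = img + shadow_lift * weight * img  # lift the leaves, leave the sky alone
    return (img.clip(0, 1) * 255).astype(np.uint8)

# adjust(photo, exposure_ev=-1.5)   -> silhouette: everything darker
# adjust(photo, shadow_lift=1.0)    -> brighter foreground, untouched sky
# ("photo" is a placeholder for any H x W or H x W x 3 uint8 array)
```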
When the shooting conditions get tricky, however, the Pixel 4 has some real quirks.
The flicker effect
Most artificial lighting flickers, but your brain does a good job of making the glow seem continuous. The pulsing effect, however, is more difficult for digital cameras to negate, and the Pixel 4 seems to have more trouble in this arena than its competition.
In the video above, you'll notice some dark bands moving across the image. This kind of thing isn't out of the ordinary with artificial light sources, which flicker imperceptibly in time with the 60 Hz alternating current that powers them. Dedicated digital cameras, however, typically have "flicker detection" to help combat it, and even the iPhone 11 Pro does a better job of mitigating the effect.
With the Pixel 4, I noticed it in a variety of locations and under a variety of artificial light sources. It's subtle most of the time, but a bright light source in the frame can push the shutter speed faster than 1/60th of a second, which is when the bands start to creep in.
When I switched to a manual camera mode in the Lightroom app and used a slower shutter speed, the banding disappeared. In scenes like this, the iPhone seems to use its smart HDR tech to keep at least one frame in the mix with a shutter speed slow enough to stop this from happening. Once I figured out the circumstances that brought it on, I shot the example below, which shows the effect very clearly.
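The arithmetic behind the fix is simple: On 60 Hz power, a light's brightness actually pulses 120 times per second (twice per cycle), so any exposure lasting a whole number of those 1/120-second pulses gathers even light on every sensor row, while shorter exposures catch the pulse mid-swing and produce bands. A quick back-of-the-envelope check in Python:

```python
FLICKER_HZ = 120             # 60 Hz mains power pulses the light twice per cycle
PULSE = 1 / FLICKER_HZ

for shutter in [1/30, 1/60, 1/120, 1/250, 1/500]:
    pulses = shutter / PULSE
    # An exposure spanning a whole number of pulses integrates even light
    safe = pulses >= 1 and abs(pulses - round(pulses)) < 1e-9
    print(f"1/{round(1 / shutter)} s -> {pulses:.2f} light pulses -> "
          f"{'even exposure' if safe else 'banding risk'}")
```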
The flaw isn’t a deal breaker since it only appears in specific circumstances, but it’s very annoying when it does.
White balancing act
Another area where our brains and eyes routinely outperform cameras: color balance. If you’re in a room with both artificial light and a window, the illumination may look fairly consistent to your eye, but render as orange and blue, respectively, to a camera.
Smartphones often try to split the difference when it comes to white balance unless you adjust it yourself. The Pixel 4, however, analyzes the scene in front of it and uses AI to recognize important objects in the frame. So, if it notices a face, it tries to get the white balance right on the person. That's a good tactic.
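For context, the classic baseline this scene-aware approach improves on is something like "gray-world" white balance, which simply assumes the average color of the whole frame should be neutral. A simplified sketch of that baseline (not the Pixel's actual algorithm) shows why it breaks under strongly colored light:

```python
import numpy as np

def gray_world_wb(image):
    """Classic gray-world white balance: assume the scene averages to
    neutral gray and scale each channel to match. Scene-aware systems
    like the Pixel 4's instead anchor on recognized objects (faces)
    rather than trusting a global average that colored light can skew."""
    img = image.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)  # average R, G, B over the frame
    gains = channel_means.mean() / channel_means     # push each channel toward gray
    return (img * gains).clip(0, 255).astype(np.uint8)

# A warm, orange-tinted patch comes back neutral...
warm = np.tile(np.array([200, 150, 100], np.uint8), (2, 2, 1))
print(gray_world_wb(warm))   # every pixel lands at [150, 150, 150]
# ...but so would a legitimately orange wall, which is the failure mode
# smarter, object-aware white balance is trying to avoid.
```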
Generally, I think the Pixel 4 does an excellent job when it comes to white balance, except when it gets it very wrong. Move the iPhone 11 Pro's camera around, and the scene's overall color cast tends to stay mostly consistent. Do the same with the Pixel 4, and its white balance can shift drastically, even when you move the camera only slightly. Above, the grid-style screenshot shows a series of photos I took in succession under unchanging conditions. I moved the phone subtly as I shot, and you can see the profound color shift. Again, this primarily happens when shooting under artificial light.
As long as you pay attention and notice the change before snapping the shot, it’s totally fine and the Pixel does a great job. It’s also easy to correct later on if you’re willing to open an editing app. But, on a few occasions, I ended up with a weirdly yellow photo I didn’t expect.
Telephoto lens
The new telephoto lens has roughly twice the focal length of the Pixel's standard camera, which effectively gives you a 2x optical zoom. It has an f/2.4 aperture, compared to the improved f/2.0 aperture (lower numbers let in more light) on the iPhone 11 Pro's portrait lens. The difference is only a fraction of a stop, so it's unlikely to make a huge impact, but it's a reminder that Apple has been doing telephoto lenses for some time now and is already refining, while Google is just getting started.
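That "fraction of a stop" is easy to quantify: The light a lens gathers scales with the inverse square of the f-number, so f/2.0 collects about 44 percent more light than f/2.4, or roughly half a stop. A quick check in Python:

```python
import math

def stop_difference(f_narrow, f_wide):
    """Stops of light gained going from the narrower aperture (higher
    f-number) to the wider one. Light scales with 1/f_number^2, so the
    gain in stops is 2 * log2(f_narrow / f_wide)."""
    return 2 * math.log2(f_narrow / f_wide)

print(f"{stop_difference(2.4, 2.0):.2f} stops")       # ~0.53: just over half a stop
print(f"{(2.4 / 2.0) ** 2:.2f}x the light at f/2.0")  # ~1.44x
```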
As we said earlier, the telephoto lens counts zooming as one of its primary functions. The phone gives you the option to zoom up to 8x by combining optical and digital technology. Google claims that pinching to zoom now actually gives you better image quality than taking a wider photo and cropping in (historically, cropping has provided the better results). I found that claim accurate. "Zooming" has come a long way on smartphone cameras, but you shouldn't expect magic: You'll still end up with ugly, choppy artifacts that look like you've saved and re-saved the photo too many times as a JPEG.
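The artifacts make sense when you run the numbers: Past the telephoto lens's 2x optical reach, the rest of the zoom comes from cropping and upscaling, so the real sensor data you have to work with falls off with the square of the digital factor. A rough estimate (assuming a 12-megapixel-class sensor, and ignoring the multi-frame tricks in Google's Super Res Zoom):

```python
SENSOR_MP = 12.0   # assumed 12 MP-class sensor; the exact figure varies by lens

for total_zoom in [2, 4, 8]:
    digital_factor = total_zoom / 2      # the first 2x is optical, via the tele lens
    effective_mp = SENSOR_MP / digital_factor ** 2
    print(f"{total_zoom}x zoom -> ~{effective_mp:.1f} MP of real pixels, upscaled to fill the frame")
```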
When you peep at the images on a smaller screen, like in the Instagram app, however, they look impressive, and that's probably the most important display condition for a smartphone camera in 2019.
If you zoom a lot, the Pixel beats the iPhone on the regular. It's even slightly easier to hold steady when you're zoomed all the way to 8x, thanks to the improved image stabilization system.
Portrait mode
The other big draw of the telephoto lens comes in the form of improved portrait mode. Even with the single lens on the Pixel 3, Google already did a very impressive job faking the background blur that comes from shallow depth of field photography. Predictably, adding a second lens to let it better calculate depth in a scene improves its performance.
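The geometry behind that improvement is basic stereo triangulation: A nearby subject shifts noticeably between the two lenses' views (a large disparity), while a distant background barely moves, and depth falls straight out of that relationship. The sketch below uses made-up numbers for the focal length and lens spacing, and leaves out the dual-pixel data and machine learning Google layers on top:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Stereo triangulation: depth = focal_length * baseline / disparity.
    Close subjects shift a lot between the two lenses; far ones barely move."""
    return focal_px * baseline_mm / disparity_px   # returns depth in mm

# Hypothetical values: ~1000 px focal length, lenses ~10 mm apart
for disparity in [40, 10, 2]:
    depth_m = depth_from_disparity(1000, 10, disparity) / 1000
    print(f"{disparity:2d} px shift -> subject ~{depth_m:.2f} m away")
```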
If you really want to notice the jump, try shooting a larger object, or a person from farther back than the simple head-and-torso framing for which portrait mode was originally developed. Handling larger subjects is a new skill for the Pixel 4, and it does a good job of mitigating the inherent limitations of the tech. Weirdness and artifacts, like oddly sharp areas or rogue blobs of blur, typically show up around the edges of objects or in fine details like hair and fur. The closer you get to your subject, the harder you make the camera work, and the more likely you are to notice something weird or out of place.
Overall, the Pixel 4's portrait mode looks more natural than the iPhone 11 Pro's, but it struggles more with edges and stray hairs. In headshots, the areas around the hair typically give away the Pixel 4's tricks right away. (The iPhone 11 Pro sidesteps those edge issues by adding a "dreamy" blur across most of the image.) The Pixel's overall colors and contrast are generally better because Google doesn't try to emulate different types of lighting the way Apple does. But when you get a truly ugly edge around a subject's face or hair with the Pixel 4, it can quickly ruin the effect.
If you're only posting your portrait-mode shots to Instagram, those rough edges may not even register with your followers. Viewing them on a laptop screen or larger, however, makes them obvious.
The Pixel 4 saves both the fake-blur image and the regular photo to your library, but portrait mode takes a few seconds to process, so you can't review the blurred version right away. Considering the amount of processing it's doing, that's understandable—and it's also the case with the iPhone—but if you're trying to nail exactly the right expression, you can't really check your results in real time.
Night Sight
When Google debuted its Night Sight low-light shooting mode on the Pixel 3, it was incredibly impressive. Google has clearly continued to refine its performance, and even with the iPhone 11 Pro adding its own version of the tech, the Pixel 4 still maintains a considerable advantage.
You still have to swipe over to Night Sight mode to enable it, as opposed to the iPhone, which springs its night mode on you automatically when it thinks the conditions are right. I like having more control over what I'm doing, so I prefer the Pixel's approach, especially since these night modes require long exposures that can result in blurry photos if you—or the objects in the scene—can't hold still.
Compared to the iPhone's Night mode, Night Sight's colors are more accurate and the scenes just look more natural. This one will ultimately come down to personal preference, but I prefer the Pixel 4's results over the iPhone 11 Pro's.
During the camera presentation, Google flat-out said that it hopes you'll only use the camera's "flash" as a flashlight. I abided by this rule. The flash is not good, just like every smartphone camera flash that came before it. It's useful if you really need it—especially if you don't mind converting images to black and white after the fact—but you can ultimately just leave it turned off forever.
As an addition to Night Sight, Google also added functionality that makes it easier to shoot night-sky photos that show off stars and the Milky Way—if you know what you're doing. I didn't test this feature because I didn't have access to a truly dark sky, and the weather didn't cooperate. If you're planning to use it, plan on a tripod—or at least balancing the phone on a stable object—since it still requires long exposures. Ultimately, I love that the company added this feature and I look forward to seeing what people create with it, but it's a specialized tool that I imagine most users won't try more than a few times.
The case of the missing super-wide-angle lens
When the Pixel 3 shipped without a telephoto lens, I didn’t really miss it. I do, however, have to wonder why Google would ship the Pixel 4 without the super-wide lens found on the iPhone 11 Pro and other high-end smartphones.
The super-wide is easy to abuse if its unique look blinds you to the inherent distortion and generally wacky perspective it produces. But there are times when it comes in really handy. If you're trying to shoot a massive landscape without creating a panorama, or you're just taking a photo in really tight quarters, the extra width makes a tangible difference.
Ultimately, I advocate that people do the vast majority of their shooting with the standard wide-angle camera no matter which phone they choose, because its overall performance and image quality are typically far better than what the other lenses deliver. But I like options, and a super-wide lens lets you achieve a perspective you physically can't get by simply backing up.
So, what’s the best smartphone camera?
The Pixel 4 leaves us in a tough spot. The image quality, color reproduction, and detail are really excellent—most of the time. The quirks that pop up, however, have a tangible effect on the camera's overall usability. If you're the type of shooter who pays careful attention to the scene and edits photos after shooting, then the Pixel is, for the most part, the best option. Its more neutral colors and contrast take edits better than iPhone files, which come straight out of the camera looking more processed.
Ultimately, though, we're at a point where smartphone camera quality has largely leveled off. I haven't said much about Samsung's cameras in this review because I find their files overly processed, with too much sharpening and aggressive contrast and saturation. But a large contingent of people like that look. At this point, the difference in overall performance and image quality isn't enough to justify jumping ship from your preferred platform just to eke out a slight edge on images straight out of the camera.