Shot through a shop window, the Pixel 4 handles the shiny, golden surfaces and contrasty pools of light very nicely. Stan Horaczek

At its recent product announcement event in New York City, Google showed off a handful of new gear. But the company dedicated considerable time (and presumably the money it spent to hire iconic portrait photographer Annie Leibovitz) to showing off the Pixel 4 smartphone's new camera. Consumers still indicate that photo quality is one of the most important factors they use to pick a new device. And Google is coming off a very strong showing with the Pixel 3, which was (at least as far as I was concerned at the time) the absolute best smartphone camera.

The Pixel 4 adds some more AI-powered smarts, relying increasingly on its software to determine the overall look of the final image. And while the camera has some moments where it's truly excellent, I ran into a few growing pains as Google tries to calculate its way to perfect photos.

What's new?

The Pixel is about average when it comes to quickness in capturing photos. It took me roughly five tries before I got the timing right and caught the person up front with a punch extended.

Image quality

This scene is a great space to test HDR. There's natural light coming in through the archway and artificial light overhead. The Pixel 4 does a really excellent job of bringing up the shadows near the door while keeping the colors accurate to the scene as it looked in real life. If you wanted to edit the photo, it's a great, neutral starting point.

Google doesn't pull punches when it comes to computational photography, which relies more on processing power and algorithms than pure hardware performance. The company makes it abundantly clear that the software magic that happens during and after you press the shutter has become extremely important in determining the look of the final image.

Like almost every smartphone camera at this point, pressing the shutter doesn't simply take one photo. Instead, the camera captures a burst and combines information from those images into one finished file. This "smart HDR" tech does a lot of good: it can prevent highlights from getting blown out, or flatten out a super-contrasty scene that could otherwise lose crucial details. But, as with the iPhone 11 Pro, it can be unpredictable.

These bananas at Whole Foods illustrate the difference between the Pixel 4 (left) and the iPhone 11 Pro (right). The Pixel's image doesn't crank the contrast as much, and the tones look smoother overall. If you weren't comparing them side by side, however, they're both totally acceptable.

The focusing on the Pixel 4 is impressive. It grabbed onto the pizza cutter through this window reflection.

Google made a few welcome improvements to its overall HDR experience as well. When you tap the screen to focus on an object in the image, two sliders now pop up for adjusting the brightness of the scene. One slider affects the overall exposure (how bright or dark everything looks), while the other affects only the shadows. That lets you do things like take silhouette photos, in which the subject is virtually blacked out while the background (usually the bright sky) stays properly exposed.

The first shot in this series was the default with no adjustments. In the second shot, I raised the overall brightness, which drew out detail from the leaves but blew out the sky. In the third shot, I used the shadows slider to bring up the leaves a bit while the sky remained mostly unchanged.

You can also achieve the opposite effect, brightening up a dark foreground subject without blowing out a bright sky in the background.
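To make the two-slider idea concrete, here is a minimal numpy sketch of how an exposure control differs from a shadows-only control. This is purely illustrative, not Google's actual (unpublished) tone-mapping pipeline: the `adjust` function and its quadratic shadow weight are assumptions chosen for clarity.

```python
import numpy as np

def adjust(image, exposure=0.0, shadows=0.0):
    """Illustrative sketch of a two-slider brightness adjustment.

    image:    float array with values in [0, 1]
    exposure: stops of overall brightness (+1.0 doubles every value)
    shadows:  0..1 amount of lift applied mostly to dark pixels
    """
    # Overall exposure: multiply every pixel equally, dark and bright alike.
    out = image * (2.0 ** exposure)
    # Shadows: add a lift weighted toward dark pixels, so bright areas
    # (like a sky) stay mostly unchanged.
    weight = (1.0 - np.clip(out, 0.0, 1.0)) ** 2   # ~1 in shadows, ~0 in highlights
    out = out + shadows * weight * (1.0 - out)
    return np.clip(out, 0.0, 1.0)

# A dark leaf (0.1) next to a bright sky (0.9):
scene = np.array([0.1, 0.9])
brighter = adjust(scene, exposure=1.0)   # both pixels double; the sky clips
lifted   = adjust(scene, shadows=0.5)    # the leaf rises; the sky barely moves
```

Raising exposure by a stop pushes the sky pixel past 1.0 (blown out), while the shadows lift moves the dark pixel substantially but leaves the bright one almost untouched, mirroring the leaf-and-sky behavior described above.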
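The burst-and-merge behavior described earlier can be illustrated in the same spirit. The sketch below is a hypothetical toy version: it simply averages several pre-aligned noisy frames of one scene, showing why a merged burst beats a single capture. Real pipelines such as Google's HDR+ also align frames and merge them in far more sophisticated ways.

```python
import numpy as np

rng = np.random.default_rng(0)
true_scene = np.full((4, 4), 0.5)   # the scene's "real" brightness

# Eight noisy captures of the same (already-aligned) scene.
burst = [true_scene + rng.normal(0.0, 0.1, (4, 4)) for _ in range(8)]

single = burst[0]                   # one noisy frame
merged = np.mean(burst, axis=0)     # naive burst merge: per-pixel average

# Averaging cancels random noise, so the merged frame sits closer
# to the true scene than any single frame does.
err_single = np.abs(single - true_scene).mean()
err_merged = np.abs(merged - true_scene).mean()
```

The same averaging idea is also why merged bursts preserve more usable shadow and highlight detail: per-pixel noise shrinks roughly with the square root of the number of frames combined.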