Nine things you should know about the Google Pixel 2
With all the hype surrounding the release of the Google Pixel 2 and Pixel 2 XL and their "world's highest rated smartphone camera," it's easy to miss the forest for the trees.
What's important about this new phone? Where did Google leave us wanting more? How is this phone's camera better than its predecessor? And why should photographers care about the technology baked into Google's new flagship?
After covering the launch in detail and spending some time with the Pixel 2 in San Francisco, we're setting out to answer those questions (and a few others) for you.
Dual Pixel AF
The new Pixel phones sport a very clever feature found on higher-end Canon cameras: split left- and right-looking pixels behind each microlens on the camera sensor. This allows the camera to sample left and right perspectives through the lens, which it can compare to focus on the subject faster (it's essentially a form of phase-detect AF).
It's officially called dual pixel autofocus, and it has the potential to offer a number of advantages over the 'focus pixels' Apple phones use: every pixel can be dedicated to focus without any impact on image quality (see this illustration). We've been impressed with its implementation on the Samsung Galaxy S7 and on Canon cameras, so we're expecting fast autofocus for stills, even in low light, as well as very smooth autofocus in video with little to no hunting. Given how good the Pixel 2's stabilized 4K video is, you might even make some professional-looking clips with these new phones.
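For a rough sense of how phase detection on split pixels works, here's a minimal sketch of the idea (ours, not Google's implementation): treat a small patch's left- and right-looking samples as two 1D luminance profiles and find the shift that best aligns them using normalized cross-correlation. The `pdaf_disparity` and `soft_edge` helpers are hypothetical names invented for this illustration.

```python
import numpy as np

def pdaf_disparity(left, right, max_shift=8):
    """Estimate the horizontal offset between the left- and right-looking
    sub-images of a dual-pixel patch (1D luminance profiles here).
    A non-zero offset means the patch is out of focus; its sign tells the
    lens which way to move, its magnitude roughly how far."""
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):len(left) + min(0, s)]
        b = right[max(0, -s):len(right) + min(0, -s)]
        score = np.corrcoef(a, b)[0, 1]          # normalized correlation
        if not np.isnan(score) and score > best_score:
            best_shift, best_score = s, score
    return best_shift

# Toy example: a defocused edge appears shifted in opposite directions in
# the two sub-images; when in focus, the shift would be zero.
def soft_edge(center, n=64, width=6.0):
    return np.clip((np.arange(n) - center) / width, 0.0, 1.0)

left_view, right_view = soft_edge(30), soft_edge(34)
print(pdaf_disparity(left_view, right_view))     # -4: drive focus until this reaches 0
```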
Dual pixel + machine learning driven portraits
The split pixels have another function: the left-looking and right-looking pixels underneath each microlens essentially sample two different perspectives that are slightly shifted from one another. Google then builds a rudimentary depth map using this set of separated images and some help from its machine learning algorithms.
Clever. However, the stereo disparity between the two images is likely to be very small compared to a dual-camera setup, which will make it difficult for the Pixel 2's cameras to distinguish background from subject when the subject is farther away. This might explain the poor results in DxO's comparison, but the better results in the image above, where Allison is much closer to the camera.
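Some back-of-the-envelope arithmetic shows why the disparity is so small. Every number below is a rough assumption for illustration (the focal length, pixel pitch and effective baselines are not official Pixel 2 figures), plugged into the standard stereo relation d = f·B/Z:

```python
# Rough numbers, all assumed for illustration (not official Pixel 2 specs):
# ~4.4mm focal length, ~1.4um pixel pitch, and an effective dual-pixel
# baseline of about 1mm (a fraction of the aperture diameter), versus a
# ~10mm baseline for a typical dual-camera phone.
FOCAL_MM, PIXEL_PITCH_MM = 4.4, 0.0014
FOCAL_PX = FOCAL_MM / PIXEL_PITCH_MM              # ~3140 px

def disparity_px(baseline_mm, distance_m):
    """Stereo disparity in pixels: d = f * B / Z."""
    return FOCAL_PX * (baseline_mm / 1000.0) / distance_m

for z in (0.5, 1.0, 2.0, 5.0):
    print(f"{z:>4} m  dual-pixel: {disparity_px(1.0, z):5.2f} px   "
          f"dual-camera: {disparity_px(10.0, z):5.2f} px")
# At 5m the dual-pixel disparity is well under a pixel, which is why
# separating a distant subject from its background is so much harder.
```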
On the plus side, Portrait mode now renders full-resolution 12MP files (you only got 5MP files on the original Pixels), and the 'lens blur' Google uses is generally more pleasing than Apple's more Gaussian-looking blur: out-of-focus highlights are rendered as well-defined circles rather than soft blobs. This comes at a cost, though: the blurring algorithm is computationally intensive, so you'll generally wait a few seconds before seeing the result (and you can't preview it in real time as you can with Apple).
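The difference between the two blurs is easy to visualize: convolving with a disc-shaped kernel (an idealized lens aperture) turns a bright point into a crisp circle, while a Gaussian kernel smears it into a soft, fading blob. A quick SciPy sketch, purely illustrative and unrelated to either company's actual pipeline:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

def disc_kernel(radius):
    """A normalized disc: the idealized out-of-focus PSF of a lens aperture."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    k = (x**2 + y**2 <= radius**2).astype(float)
    return k / k.sum()

# A dark scene with a single bright specular highlight
img = np.zeros((101, 101))
img[50, 50] = 1.0

disc_blur = fftconvolve(img, disc_kernel(12), mode="same")   # crisp bright circle
gauss_blur = gaussian_filter(img, sigma=6)                   # soft fading blob

# The disc result is flat inside a hard-edged circle (bokeh-like); the
# Gaussian result peaks in the middle and fades smoothly, with no edge.
print(np.count_nonzero(disc_blur > 0.5 * disc_blur.max()))   # ~440 px flat plateau
print(np.count_nonzero(gauss_blur > 0.5 * gauss_blur.max())) # much smaller soft core
```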
Hardier hardware
Unsurprisingly, if you've been following the rumor mill, the hardware specs of the new Pixel 2 phones don't impress any more than what we've seen from other phones. They're nice devices, and both are far more durable, with IP67 ratings (a huge step up from the weak IP53 ratings of the previous Pixel phones, which left them prone to wear and tear), but hardware-wise there's not much to be excited about.
We've lost the headphone jack but gained front-facing stereo speakers. The XL has less of a bezel, but it's still not as bezel-less as Samsung's phones. No dual camera. RAM and processor are what you get in other Android phones. You can invoke the Assistant with a squeeze, but... well...
Nothing really stands out. But wait, there's more to the story.
AI First
If there's one point Google CEO Sundar Pichai continuously makes in his presentations, it's that we're moving from a 'Mobile First' to an 'AI First' world. He's referring to the shift away from thinking of mobile devices simply as pocketable computers and toward thinking of them as intelligent devices that adapt to our needs and make our lives easier. And Google is a leader here, thanks to the intelligence it acquires from its search services and apps like Maps and Photos.
AI is increasingly being used to make many services better, often transparently. Pichai recently cited the Fitness app as an example: every time he opens it, he navigates to a different page. But rather than have the app team change the default page, or add an option to do so, he figures AI should simply learn your preference transparently.
What does that mean for photography and videography? We're purely speculating here, but imagine a camera learning your taste in photography from the way you edit photos. Or the photos you take. Or the filters you apply. Or the photos you 'like'. How about learning your taste in music, so that when Google Assistant auto-builds videos from your library of photos and videos, they're cut to music you like?
The possibilities are endless, and we're likely to see lots of cool things make their way into the new Pixel phones, like...
Google Lens
Sundar Pichai first talked about Google Lens at the I/O Developer Conference earlier this year. It marries machine vision and AI, and is now available for the first time in the Photos app and within Google Assistant on the new Pixel phones. Google's machine vision algorithms can analyze what the camera sees, and use AI to do cool things like identify what type of flower you're pointing your camera at.
This sort of intelligence is applicable to photography as well: Pichai talked about how AutoML has improved Google's ability to automatically identify objects in a scene. Anything from a fence to a motorbike to types of food to your face: Google is getting better and better at identifying these objects and understanding what they are, automatically, using reinforcement learning.
And once you understand what an object is, you can do all sorts of cool things. Remove it. Re-light it. Identify it so you can easily search for it without ever keywording your photos. The Photos app can already pull up pictures of planes, birthdays, food, wine, you name it. We look forward to seeing how the inclusion of Google Lens in the new phones makes Photos and Assistant better.
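To get a feel for what automatic object identification looks like in code, here's a hedged sketch using an off-the-shelf ImageNet classifier. This is not Google Lens's actual model, and 'flower.jpg' is just a placeholder path for any photo you want to label.

```python
import numpy as np
import tensorflow as tf

# Off-the-shelf ImageNet classifier as a stand-in for on-device recognition
model = tf.keras.applications.MobileNetV2(weights="imagenet")

# 'flower.jpg' is a placeholder path for whatever photo you want to label
img = tf.keras.preprocessing.image.load_img("flower.jpg", target_size=(224, 224))
x = tf.keras.applications.mobilenet_v2.preprocess_input(
    np.expand_dims(tf.keras.preprocessing.image.img_to_array(img), axis=0))

preds = model.predict(x)
for _, label, score in tf.keras.applications.mobilenet_v2.decode_predictions(preds, top=3)[0]:
    print(f"{label}: {score:.2f}")   # e.g. 'daisy: 0.87' for a suitable photo
```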
Maybe intelligent object recognition could even fix flare issues by understanding what flare is... though this may not be necessary for the new phone...
Goodbye ugly lens flare
Thankfully, the nasty flare issues that plagued the first-generation Pixel phones appear to have been remedied by lifting the camera module above the glass backing, which has itself been reduced and streamlined to sit flush with the rest of the phone.
The camera unit is raised ever so slightly from the back, but that's a compromise we're willing to accept if it means the camera isn't behind a piece of uncoated glass, a recipe for flare disaster. The only flare we've seen so far in our limited hands-on time is what DxO witnessed in their report: the lens-element reflections in the corners that you sometimes see even in professional lenses. That's something we'll gladly put up with (and that some of us even like).
If flare bugged you on the previous Pixel phones (it certainly bugged me), consider it a non-issue on the new phones.
Incredibly smooth video
When the original Pixel launched, Google claimed its camera beat competitors that had optical image stabilization (OIS) despite lacking OIS itself, arguing that its software-based stabilization would keep improving as its algorithms did. Omitting OIS was also crucial to keeping the camera module small enough to fit within the slim body.
Google is singing a different tune this year, including both OIS and electronic image stabilization (EIS) in its larger camera unit that extends ever-so-slightly above the back glass. And the results appear to be quite impressive. The original Pixels already had very good stabilization in video (even 4K), but combining OIS + EIS appears to have made the video results even smoother. Check out the video from Google above.
For low-light photography, OIS should help steady the camera for longer shutter speeds. You should also get better macro results and better document scanning. Hey, that's worth something.
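For the curious, here's a toy sketch of the electronic half of that equation. This is not Google's fused OIS+EIS pipeline: real implementations use gyro data, model rotation and rolling shutter, and smooth the camera path rather than locking to the first frame, but it shows the basic idea of estimating and countering frame-to-frame shake.

```python
import cv2
import numpy as np

def stabilize(frames):
    """Very simple EIS: estimate the global translation of each frame
    relative to the previous one (phase correlation), accumulate it, and
    warp the frame back by the accumulated drift."""
    prev = np.float32(cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY))
    drift = np.array([0.0, 0.0])
    out = [frames[0]]
    for frame in frames[1:]:
        gray = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
        (dx, dy), _ = cv2.phaseCorrelate(prev, gray)   # inter-frame shake
        drift += (dx, dy)
        h, w = gray.shape
        M = np.float32([[1, 0, -drift[0]], [0, 1, -drift[1]]])
        out.append(cv2.warpAffine(frame, M, (w, h)))   # counter the drift
        prev = gray
    return out
```

Feed it a list of frames read with cv2.VideoCapture and you get back frames locked to the first one, at the cost of black borders creeping in at the edges (which is why real EIS crops slightly).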
Just as important as what the new phones offer, though, is what they don't offer...
Color management? HEIF?
Notably absent was any talk of proper color management on the new phones. The previous Pixels had beautiful OLED displays, but colors were wildly inaccurate and often oversaturated due to the lack of any color management or properly calibrated display modes.
iPhones have some of the most color-accurate screens out there. Their wide-gamut displays now cover most of DCI-P3 but, more importantly, iOS can automatically switch the screen between properly calibrated DCI-P3 and standard-gamut (sRGB) modes on the fly, based on content.
This means you view photos and movies as they were intended. It also means when you send an image from your iPhone to be printed (using a service that at least understands color management, like Apple's print services), the print comes back looking similar, though perhaps a bit dimmer.*
The Samsung Galaxy S8 also has calibrated DCI-P3 and sRGB modes, though you have to switch between them manually. Google made no mention of calibrated display modes or proper color management for the new Pixel phones, though Android Oreo does at least support color management (albeit, like Windows, leaving it up to individual apps). Without a proper display profile, we're not sure how one would get accurate colors on the Pixel 2 phones.
*That's only because prints aren't generally illuminated as brightly as backlit LCDs, which these days reach anywhere from 6 to 10 times the brightness at which prints are typically viewed.
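To make the color-management point concrete, here's roughly what faithfully re-expressing an sRGB color for a wide-gamut (Display P3) panel involves: linearize, convert through XYZ using the published primaries, and re-encode. The matrices are quoted from standard references, so treat the exact digits as approximate; this is an illustration of the math, not what iOS or Android actually run internally.

```python
import numpy as np

# Matrices derived from the published sRGB and Display P3 primaries (D65 white)
SRGB_TO_XYZ = np.array([[0.4124564, 0.3575761, 0.1804375],
                        [0.2126729, 0.7151522, 0.0721750],
                        [0.0193339, 0.1191920, 0.9503041]])
P3_TO_XYZ   = np.array([[0.4865709, 0.2656677, 0.1982173],
                        [0.2289746, 0.6917385, 0.0792869],
                        [0.0000000, 0.0451134, 1.0439444]])

def srgb_decode(c):   # both sRGB and Display P3 use the sRGB transfer curve
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def srgb_encode(l):
    l = np.asarray(l, dtype=float)
    return np.where(l <= 0.0031308, l * 12.92, 1.055 * l ** (1 / 2.4) - 0.055)

def srgb_to_display_p3(rgb):
    """Re-express an sRGB color in Display P3 so it looks the same on a
    wide-gamut screen, instead of being stretched and over-saturated."""
    xyz = SRGB_TO_XYZ @ srgb_decode(rgb)
    return srgb_encode(np.linalg.solve(P3_TO_XYZ, xyz))

print(srgb_to_display_p3([1.0, 0.0, 0.0]))  # pure sRGB red becomes a less
                                            # extreme value inside the larger P3 gamut
```

An unmanaged display skips this step and simply drives the wide-gamut panel with sRGB values, which is exactly the oversaturated look the previous Pixels were criticized for.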
HDR display?
Sadly, there was also no mention of 10-bit images or HDR display of photos or videos (using the HDR10 or Dolby Vision standards) at Google's press event. This leaves much to be desired.
The iPhone X will play back HDR video content from multiple streaming services but, more importantly for photographers, it will also display photos in HDR mode, a first for any device. Apple is pushing the industry forward by bringing the conversation about HDR display to photos, not just video. Remember, this has little to do with HDR capture; rather, it's about the proper display of photos on screens, like OLED panels, that can reproduce a wider range of tones.
To put it bluntly: photos taken on an iPhone X and viewed on an iPhone X will look more brilliant and have more pop than anything you're likely to have seen before, thanks to the support for HDR display and accurate color. It's a big deal, and Google seems to have missed the boat entirely here.
HDR displays require less of the tonemapping that traditional HDR capture algorithms employ (though HDR capture is still usually beneficial, since it preserves highlights and decreases noise in shadows). Instead of brightening shadows and darkening bright skies after capture, as HDR algorithms like the Pixel 2's are known to do (above, left), the better approach for high-dynamic-range displays like OLEDs is to leave many of those tones alone.
In other words, the image above and to the right, with its brighter highlights and darker shadows, may in fact be better suited to an HDR display like the Pixel 2's, as long as there's still color information present in the shadows and highlights of the (ideally 10-bit) image. Unfortunately, Google made no mention of a proper camera-to-display workflow for HDR capture and display.
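As a sketch of the difference, using a global Reinhard-style operator as a stand-in (this is not the Pixel's HDR+ tonemapping) and a plain linear scaling in place of a real 10-bit PQ/HLG encode:

```python
import numpy as np

def tonemap_for_sdr(luminance, key=0.18):
    """Global Reinhard-style operator: compresses a wide-range scene,
    rolling off the highlights so it fits an SDR screen.
    (A stand-in for illustration, not Google's HDR+ pipeline.)"""
    scaled = luminance * key / np.mean(luminance)
    return scaled / (1.0 + scaled)

def prepare_for_hdr_display(luminance, max_scene=8.0):
    """For an HDR-capable display there is far less need to compress:
    keep the relative tones, simply scaled into the panel's range (a real
    pipeline would encode this as 10-bit PQ or HLG rather than clipping)."""
    return np.clip(luminance / max_scene, 0.0, 1.0)

scene = np.array([0.02, 0.18, 1.0, 4.0, 8.0])   # deep shadows ... bright sky
print(tonemap_for_sdr(scene))                    # top-end ratios get crushed
print(prepare_for_hdr_display(scene))            # tonal relationships preserved
```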
That's what we can tell you based on what we know so far, but let us know in the comments how you feel about these devices. We've left out talk of smart home integration, but if you want to know more about that, let us know in the comments and we'll try to answer your questions or write a separate article on the topic.