6 things we want to see in the Google Pixel 2

It was true a year ago and it's still true now: the Google Pixel and Pixel XL offer one of the best smartphone cameras on the market. But the competition hasn't been standing still for the last year – Apple has gained ground with its dual focal length, dual-camera devices, and the iPhone 8/8 Plus have overtaken the Pixel in DxOMark's mobile rankings.
With the announcement of the Pixel 2 imminent, here's what we think Google needs to add to keep its flagship phone competitive – with special attention to camera specs, of course.
A dual camera

All signs are pointing to no on this one, but we're stubborn, so we'll ask for it anyway: Google, please put a dual camera on the Pixel.
The first generation offered just one rear-facing imaging module, and if the rumors are true, so will the Pixel 2. And let's reiterate: the Pixel may have only one main camera, but it's a really, really good one. However, it'll be difficult for Google to overcome the two major advantages Apple's dual camera offers: optical zoom and a superior shallow depth-of-field simulation mode.
Maybe Google has found software solutions to mitigate these issues in the Pixel 2. Rumors point to a mode more like Apple's and Samsung's offerings, with a sharp subject and blurred background, all rendered live rather than in post-processing. But given what Google has already done with one camera and sophisticated software, just imagine what it could do with two!
Improved dust- and water-resistance

Our plea for dual cameras is probably in vain, but we feel better about this wish being fulfilled. The iPhone X carries an IP67 rating, meaning it's dust-proof and water-resistant for up to 30 minutes at depths up to 1m. Samsung's Note 8 is rated IP68 – equally dust-proof, and it can swim in up to 1.5m of water for up to 30 minutes. The original Pixel is a weaker IP53 – not quite dust-proof, and splash-resistant only. Upgraded durability would keep the Pixel competitive with current flagships and is a win all around.
A fix for lens flare
It wasn't long after the Pixel made its way into users' hands that some of them reported drastic lens flare creeping into their photos – and it wasn't the good kind; see our example above. Google implemented a software 'fix' in HDR+ mode, but frankly, it barely helped. When you've got uncoated glass sitting far in front of your main lens, there's not much you can do in software. All rumors point to a lens that protrudes from the body above the back glass – much like most other phones. We've got our fingers crossed that this fixes the original phones' issue.
Proper color management and HDR
Android Oreo (finally) supports color management, but like Windows, the OS leaves it up to apps to do the work. iOS color-manages everything – right down to the app icons. With wide-gamut displays like OLED panels, it becomes increasingly important to properly color-manage everything, or else you risk over-saturated, inaccurate colors (read 'saturation accuracy' here to understand why). Just check out the app icons on a Pixel or Samsung phone.
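To see why unmanaged content goes wrong on a wide-gamut screen, here's a minimal numerical sketch of what a color-managed OS does behind the scenes. The 3x3 matrix is the commonly published linear sRGB → linear Display P3 conversion (both spaces share a D65 white point and the sRGB transfer curve); treat the exact coefficients as approximate, and the code as an illustration rather than any platform's actual pipeline.

```python
def srgb_decode(v):
    """sRGB-encoded value -> linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(v):
    """Linear light -> display-encoded value (same curve for Display P3)."""
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

# Approximate linear sRGB -> linear Display P3 matrix (rows sum to ~1,
# so white stays white).
SRGB_TO_P3 = [
    [0.8225, 0.1774, 0.0000],
    [0.0332, 0.9669, 0.0000],
    [0.0171, 0.0724, 0.9105],
]

def srgb_to_display_p3(rgb):
    """Map an sRGB pixel to the values a P3 panel should be sent."""
    lin = [srgb_decode(c) for c in rgb]
    p3_lin = [sum(m * c for m, c in zip(row, lin)) for row in SRGB_TO_P3]
    return tuple(srgb_encode(c) for c in p3_lin)

# Pure sRGB red, correctly mapped for a P3 panel: the green/blue channels
# rise and red drops below 1.0, because P3's red primary is more saturated
# than sRGB's, so less of it is needed to reproduce the same color.
managed = srgb_to_display_p3((1.0, 0.0, 0.0))

# An unmanaged app would send (1.0, 0.0, 0.0) straight to the panel,
# which the P3 display renders as its own, far more saturated red -
# the over-saturated look described above.
```

The same idea applies in reverse: a calibrated 'sRGB mode' is essentially the display applying this kind of conversion for you, full-screen.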
With Android Oreo, it's basically up to device manufacturers to provide proper display profiles, and app developers to take advantage of them. We're hoping that Google takes an extra step and provides a layer on top of its OS to color manage everything – much like Microsoft's Surface Studio and its DCI-P3 and sRGB modes.
Sound like too much to ask for?
Apple profiles all its device displays, offering properly calibrated wide gamut P3 and standard gamut sRGB modes, switching as necessary based on content. The iPhone X is likely to be the world's first properly color-managed DCI-P3 OLED device out of the box. That means the potential for very saturated colors if the photographer or videographer intended it, but not at the cost of inaccurate ones. We'd like to see the Pixel 2 follow suit.
Bonus points for proper HDR display support like the iPhone X's HDR10 and Dolby Vision modes, as well as HDR display of photos using the new High Efficiency Image Format (HEIF).
Optical image stabilization
If Google's launch event teaser is any indication, it looks like we'll get this one. The original Pixel offered some impressive digital video stabilization, and adding optical stabilization into the mix for stills would keep the Pixel on par with the competition. And if you're going to offer just one camera, you might as well put a stabilized lens in front of it.
Google Lens

At the Google I/O developer conference in 2017, CEO Sundar Pichai introduced a new technology that marries machine vision and AI: Google Lens. While not strictly photography-related (as far as we know), it is very much camera-related. Google's machine vision algorithms can analyze what the camera sees and use AI to help you take action. Pichai demoed a number of cool features: point your phone at a flower and the Google Assistant will automatically analyze it and tell you which flower it is. Point it at a restaurant down the street and Assistant will automatically pull up the restaurant's name, ratings and reviews.
This sort of intelligence is applicable to photography as well, though: Google demoed the automatic removal of a fence from a photograph of a child playing baseball, taken through that fence. The camera can do this because it understands what a fence is. Object recognition also drives automatic tagging and searching of images: the Photos app can already pull up pictures of planes, birthdays, food and – most importantly – beer, just by searching for those terms.
We look forward to the official inclusion of Google Lens and the integration of evolved machine learning in the camera and Photos apps on the new Pixel devices.
Do you own a Pixel or Pixel XL? What do you want in the next generation? Let us know in the comments, and tune in here for live updates from Google's launch event tomorrow.

dpreview.com