Google’s Pixel 2 and Pixel 2 XL might have had teething issues when it comes to their displays, but one area in which they didn’t fail to impress was their cameras. Following in the footsteps of the original Pixel duo that launched a year before, the 12.2-megapixel sensors in Google’s latest smartphones are already a treat to work with, but their full potential isn’t quite exploited yet, as there are plenty of promised features still waiting to be enabled via future software updates.
Gadgets 360 had a Hangout session with Brian Rakowski, VP of Product Management at Google, and Timothy Knight, who leads camera development for the Pixel 2, to talk specifically about the camera and what makes it tick. We’re all aware of some of the more publicised issues with the new Pixels, like the audio problems when recording video and over Bluetooth, and odd screen flashes, but we have had some issues with the camera too, which we hoped the Google duo, no pun intended, could shed some light on.
The Pixel 2 does a great job stabilising video, but in low light, especially at 4K, the footage tends to get quite noisy. This is mainly because the Pixel 2 tries to brighten up the scene as much as possible by boosting the ISO, which certainly gives you a brighter image, but at the cost of noise. This is done intentionally, Knight explains.
“That is really a tradeoff we think a lot about. We tried to strike a balance of both,” he says. “If you compare the Pixel 2 camera to other mobile cameras, you’ll see that we are brighter. It’s easy to make the noise go away if you just make the image dark. We decided that we’d rather let the user see the scene more clearly, by making it brighter, even if that means there is a bit more noise.” Knight also says that 1080p video should be a bit less noisy than 4K, since there’s more headroom for heavyweight processing compared to 4K.
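The tradeoff Knight describes can be sketched with a toy sensor model (our own illustration, not Google’s pipeline): applying a higher ISO-style gain multiplies both the signal and the read noise, so the frame gets brighter and noisier in the same proportion.

```python
import random

def capture(scene_brightness, gain, noise_std=2.0, n=10_000, seed=42):
    """Toy sensor model: each pixel reads the scene brightness plus Gaussian
    read noise, then the whole frame is scaled by an ISO-style gain."""
    rng = random.Random(seed)
    return [gain * (scene_brightness + rng.gauss(0, noise_std)) for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

dark = capture(scene_brightness=10, gain=1.0)    # dim but relatively clean
bright = capture(scene_brightness=10, gain=4.0)  # brighter, but noise scales too

print(f"dark:   mean={mean(dark):.1f}  noise={std(dark):.1f}")
print(f"bright: mean={mean(bright):.1f}  noise={std(bright):.1f}")
```

Keeping the gain low (a darker image) is the easy way to hide noise, which is exactly the choice Google decided against.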
Another feature that’s missing on the Pixel 2 is 60fps support at 4K, something the iPhone 8 Plus and iPhone X boast of. “4K in 60[fps], unfortunately, is not something we are going to bring to Pixel 2,” says Knight. “For future products, we will certainly consider it. However, for Pixel 2, 4K 30 and 1080p 60 is the video we plan to support.”
If you’ve looked in the settings of the Pixel 2’s camera app, you’ll notice that enabling manual control for HDR+ gives you a second option in the viewfinder, called HDR+ enhanced. When we tested the Pixel 2 and the Pixel 2 XL, we didn’t actually notice any quality difference between the two modes, other than the fact that the HDR+ enhanced photo takes longer to process. Turns out, we were right.
“In the vast majority of cases, there is no difference. From a user perspective, HDR+ and HDR+ enhanced will take the exact same photograph,” explains Knight. “In a few conditions, HDR+ enhanced can take a photograph which has a little more dynamic range.” The reason the enhanced mode takes longer to process is that in standard HDR+ mode, Zero Shutter Lag (ZSL) is on, whereas in the enhanced mode, it’s off. Shutter lag is the time between the moment you press the shutter button and when the picture is actually recorded and saved. Zero Shutter Lag typically gives you near-instantaneous shots, with virtually zero delay.
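Conceptually, ZSL works by continuously buffering frames before the shutter is ever pressed, so the "capture" can simply grab a frame from the instant of the press rather than starting a new exposure. A minimal sketch of the idea (hypothetical class and frame IDs, not the actual camera stack):

```python
from collections import deque

class ZslBuffer:
    """Keep the last few preview frames in a ring buffer; on shutter press,
    return the most recent one instead of starting a fresh exposure."""

    def __init__(self, capacity=8):
        self.frames = deque(maxlen=capacity)  # old frames evicted automatically

    def on_frame(self, frame):
        self.frames.append(frame)  # called continuously while previewing

    def on_shutter(self):
        if not self.frames:
            raise RuntimeError("no buffered frame yet")
        return self.frames[-1]  # zero perceived lag: frame was already captured

zsl = ZslBuffer(capacity=4)
for frame_id in range(10):  # camera streams frames 0..9 before the press
    zsl.on_frame(frame_id)
print(zsl.on_shutter())     # → 9, the frame from the moment of the press
```

Turning ZSL off, as HDR+ enhanced does, frees the pipeline to start a longer, heavier capture at press time, which is why that mode takes noticeably longer.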
We initially assumed that the Pixel 2’s Visual Core imaging chip would help speed this process up once it’s activated in the Android 8.1 update, but that doesn’t seem to be the case. The Visual Core SoC’s primary purpose is to enable third-party camera apps to use the HDR+ feature.
Finally, the lack of manual controls and RAW file support is another bummer in the new camera app. This is an area that other Android manufacturers like Samsung and HTC have really mastered over the years. Not everyone needs manual controls, but it’s nice to have the option, especially when you want to take some artistic shots, and it’s very helpful in low light. Having this feature would also help control the exposure in video, for those who prefer to capture the scene for what it is instead of brightening things up. But Knight isn’t convinced that simply putting sliders for ISO, aperture, and so on is the best interface for a phone. He further states that in doing so, users wouldn’t be able to benefit from HDR+, so image quality would suffer.
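For context on what those sliders would actually control, photographers usually summarise the aperture/shutter/ISO relationship as an exposure value. A small illustration using the standard formula EV100 = log2(N²/t) − log2(S/100), where N is the f-number, t the shutter time in seconds, and S the ISO (this is textbook photography maths, not anything specific to the Pixel’s pipeline; f/1.8 is used as a plausible phone aperture):

```python
import math

def ev100(f_number, shutter_s, iso):
    """Exposure value normalised to ISO 100: a higher EV means the settings
    admit less light, so they suit brighter scenes."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

# Same aperture and shutter speed; each doubling of ISO lowers the required
# EV by one stop, which is why boosting ISO brightens a low-light scene
# (at the noise cost discussed above).
print(round(ev100(1.8, 1 / 30, 100), 2))
print(round(ev100(1.8, 1 / 30, 800), 2))
```

Raising ISO from 100 to 800 is exactly three stops, which shows how much brightening the camera can do in software before noise becomes the limiting factor.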
Google might add some amount of manual control in the future, “but at the present time, don’t expect to see a manual slider anytime soon,” says Knight. It appears that Google is relying heavily on its machine learning to improve photos and make them look as good as they do, which might explain why it isn’t willing to relinquish control to the user. This applies to RAW file support too.
“We do not have any updates today, but we’re looking into it,” says Knight.