Today at its Pixel Fall Launch event, after many pre-announcements and leaks, Google finally announced all the details of the Pixel 6 and Pixel 6 Pro. Some of the biggest changes to these new flagship phones are their updated camera modules. Pixel phones were long the champions of smartphone photography, but Google has been resting on its laurels for a while, using the same 12.2-megapixel Sony IMX363 sensor from the Pixel 3 all the way through the Pixel 5 and 5A.

Now, the main cameras of the Pixel 6 and 6 Pro house a 50-megapixel sensor that is larger than its predecessors' and bins images down to a 12.5-megapixel output. Google claims it can capture 150 percent more light than the Pixel 5's. Following the iPhone playbook, both Pixels feature ultrawide lenses with 12-megapixel sensors, while the 6 Pro adds a third, telephoto lens with 4x optical zoom and a 48-megapixel sensor. On the front, the Pixel 6's selfie cam is 8 megapixels with an 84-degree field of view, while the 6 Pro's is 11.1 megapixels and 94 degrees for easier group selfies.
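
That 50-to-12.5-megapixel ratio implies 2x2 pixel binning: every four neighboring photosites are combined into one brighter, cleaner output pixel. As a rough illustration (not Google's actual pipeline, which would bin on the Bayer mosaic per color channel), here is a minimal Python sketch of 2x2 binning on a grayscale readout:

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of photosites into one output pixel.

    Binning a 50-megapixel readout this way yields a 12.5-megapixel
    image, with roughly 4x the light gathered per output pixel.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    # Split the frame into 2x2 blocks, then average within each block.
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Example: a simulated 8x8 sensor readout becomes a 4x4 image.
sensor = np.random.randint(0, 1024, size=(8, 8)).astype(float)
print(bin_2x2(sensor).shape)  # (4, 4)
```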

The rear camera setup is quite a departure from prior Pixels, whose commodity camera hardware allowed Google to focus squarely on software optimization. Some Pixel iterations added or removed an extra lens or two, but each relied mostly on computational features.

Those software developments have brought new features to the Pixel 6 and 6 Pro as well. Google claims its reengineered Portrait Mode, a feature it calls Real Tone, can better render the skin tones of a wider, more diverse range of people. Google worked with photographers and cinematographers of color to build more diverse portraits into the image dataset behind its camera models. The company says it even corrected aberrations that hit the rendering of darker skin hardest, such as implementing a new algorithm to reduce stray light, which washes out dark skin. Google says it is dedicated to building a more equitable experience across its camera and imaging products, which would be a welcome departure from past omissions in its algorithms.

What Google calls Face Unblur promises to use the multiple images recorded with each shutter press to keep faces sharp, even when photographing movement. Motion Mode is designed for action shots and long exposures, producing effects like streaking car lights across a nightscape or panning with a moving subject to blur the background and convey a sense of speed. Lastly, Magic Eraser will attempt to remove distracting objects and photobombers in Google Photos.
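
Google hasn't published how Face Unblur works internally, but the underlying ideas are standard burst photography. As a hedged sketch (the function names and the Laplacian-variance focus measure here are illustrative assumptions, not Google's pipeline), sharp-face selection can be approximated by scoring each burst frame for fine detail, and a Motion Mode-style long exposure by averaging the burst:

```python
import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Variance of a discrete Laplacian: higher means more fine detail."""
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]    # pixels above / below
           + gray[1:-1, :-2] + gray[1:-1, 2:])   # pixels left / right
    return float(lap.var())

def pick_sharpest(burst: list[np.ndarray]) -> np.ndarray:
    """Return the burst frame with the most fine detail (in Face
    Unblur's case, this would be scored on the detected face region)."""
    return max(burst, key=sharpness)

def long_exposure(burst: list[np.ndarray]) -> np.ndarray:
    """Average the burst: static scenery stays crisp, motion streaks."""
    return np.mean(np.stack(burst), axis=0)
```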


Magic Eraser can automatically remove unwanted subjects from backgrounds. (GIF: Google)

Magic Eraser sounds a bit like Google's long-abandoned chain-link fence concept, now resurrected in a more achievable form. Google showed off automatic object removal, erasing a chain-link fence from a photo, at Google I/O 2017, six months before the Pixel 2 came out, but the feature never shipped on a device and was quietly left behind. Though it has since served as a case study for Google's software-first mentality, it looks like a version of it is now alive in the Pixel 6 and Google Photos.

Developing… we’re adding more to this post, but you can follow along with our Pixel 6 event live blog to get the news even faster.

