One of the most impressive demos at Google I/O started with a photo of a woman in front of a waterfall. A presenter onstage tapped on the woman, picked her up, and moved her to the other side of the image, with the app automatically filling in the space where she once stood. They then tapped on the overcast sky, and it instantly bloomed into a brighter cloudless blue. In just a matter of seconds, the image had been transformed.
The AI-powered tool, dubbed the Magic Editor, certainly lived up to its name during the demo. It’s the kind of tool that Google has been building toward for years. It already has a couple of AI-powered image editing features in its arsenal, including the Magic Eraser, which lets you quickly remove people or objects from the background of an image. But this type of tool takes things up a notch by letting you alter the contents — and potentially, the meaning — of a photo in much more significant ways.
GIF: Google
While it’s clear that this tool isn’t flawless — and there remains no firm release date for it — Google’s end goal is clear: to make perfecting photos as easy as just tapping or dragging something on your screen. The company markets the tool as a way to “make complex edits without pro-level editing tools,” allowing you to leverage the power of AI to single out and transform a portion of your photo. That includes the ability to enhance the sky, move and scale subjects, as well as remove parts of an image with just a few taps.
Google’s Magic Editor attempts to package all the steps that it would take to make similar edits in a program like Photoshop into a single tap — or, at least, that’s what it looks like from the demo. In Photoshop, for example, you’d typically use the Content-Aware Move tool (or another method of your choice) to pick up and move a subject inside an image. Even then, the photo still might not look quite right, which means you’ll have to pick up other tools, like the Clone Stamp tool or maybe even the Spot Healing Brush, to fix any leftover artifacts or a mismatched background. It’s not the most complicated process ever, but as with most professional creative tools, there’s a definite learning curve for people who are new to the program.
I’m all for Google making photo editing tools free and more accessible, given that Photoshop and some of the other image editing apps out there are expensive and pretty unintuitive. But putting powerful and incredibly easy-to-use image editing tools into the hands of, well, just about everyone who downloads Google Photos could transform the way we edit — and look at — photos. There have long been discussions about how far a photo can be edited before it’s no longer a photo, and Google’s tools push us closer to a world where we tap on every image to perfect it, reality or not.
Samsung recently brought attention to the power of AI-“enhanced” photos with “Space Zoom,” a feature that’s supposed to let you capture incredible pictures of the Moon on newer Galaxy devices. In March, a Reddit user tried using Space Zoom on an almost unsalvageable image of the Moon and found that Samsung appeared to add craters and other details that weren’t actually there. Not only does this run the risk of creating a “fake” image of the Moon, but it also leaves actual space photographers in a strange place: they spend years mastering the art of capturing the night sky, only for the public to often be presented with fakes instead.
Image: Google
To be fair, there are a ton of similar photography-enhancing features built into smartphone cameras. As my colleague Allison Johnson points out, mobile photography already fakes a lot of things, whether it’s by applying filters or unblurring a photo, and doctored images are nothing new. But Google’s Magic Editor could make a more substantial form of fakery easier and more attractive. In its blog post explaining the tool, Google makes it seem like we’re all in search of perfection, noting that the Magic Editor will provide “more control over the final look and feel of your photo” while giving you the chance to fix a missed moment so a photo looks its best.
Call me some type of weird photo purist, but I’m not a fan of editing a photo in a way that would alter my memory of an event. If I were taking a picture of a wedding and the sky was cloudy, I wouldn’t think about swapping it for something better. Maybe — just maybe — I might consider moving things around or amping up the sky on a picture I’m posting to social media, but even that seems a little disingenuous. But, again, that’s just me. I could still see plenty of people using the Magic Editor to perfect their photos for social media, which adds to the larger conversation about what exactly we should consider a photo and whether that’s something people should be obligated to disclose.
Google calls its Magic Editor “experimental technology” that will become available to “select” Pixel phones later this year before rolling out to everyone else. If Google is already adding AI-powered image editing tools to Photos, it seems like only a matter of time before smartphone makers integrate these one-tap tools, like sky replacement or the ability to move a subject, directly into a phone’s camera software. Sometimes, the beauty of a photo is its imperfection. It just seems like smartphone makers are trying to push us further and further away from that idea.