After a week of old controversy rearing its head once again, Samsung has officially detailed how its smartphones take such high-fidelity photos of the moon. Most of what's in the post isn't new, but Samsung's response to this week's confusion over its moon-shooting capabilities is a hearty reminder that much of the magic of smartphone photography comes from software enhancements on the back end.
It all started about a week ago on Reddit, where most good controversy gets stoked. A user called ibreakphotos went viral with a post claiming that Samsung's space zoom moon shots are faked, backing it up with screenshots and concluding:
The moon pictures from Samsung are fake. Samsung’s marketing is deceptive. It is adding detail where there is none (in this experiment, it was intentionally removed). In this article, they mention multi-frames, multi-exposures, but the reality is, it’s AI doing most of the work, not the optics, the optics aren’t capable of resolving the detail that you see. Since the moon is tidally locked to the Earth, it’s very easy to train your model on other moon images and just slap that texture when a moon-like thing is detected.
To anyone who knows how smartphone photos work, hearing that much of the heavy lifting is done by AI shouldn't be too surprising. Indeed, Samsung's response to the findings doubles down on the fact that AI is working behind the scenes to improve the quality of certain shots. The company explains that its Scene Optimizer feature has supported moon photography since the Galaxy S21 series—before it was confusingly marketed as "space zoom"—and that the company has since improved the algorithms associated with this kind of shot, so the feature knows there is a moon in the frame that needs so-called optimization.
Samsung writes:
The engine for recognizing the moon was built based on a variety of moon shapes and details, from full through to crescent moons, and is based on images taken from our view from the Earth.
It uses an AI deep learning model to detect the presence of the moon and identify the area it occupies – as denoted by the square box – in the relevant image. Once the AI model has completed its learning, it can detect the area occupied by the moon even in images that were not used in training.
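To make that detect-then-box step a little more concrete, here is a minimal, hypothetical sketch of what recognizing a moon-like disc in a frame can look like. It is emphatically not Samsung's model: where Samsung describes a deep learning detector, this sketch substitutes a classical OpenCV Hough circle search plus a brightness check, and the find_moon_box function and its thresholds are invented purely for illustration.

```python
# Illustrative only: a stand-in "moon detector" using OpenCV's Hough circle
# transform. Samsung describes a deep learning model; this classical approach
# merely sketches the same detect-a-bright-disc-and-box-it step.
import cv2
import numpy as np

def find_moon_box(frame_bgr):
    """Return an (x, y, w, h) box around a bright, moon-like disc, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)  # suppress noise so the disc edge dominates

    circles = cv2.HoughCircles(
        gray,
        cv2.HOUGH_GRADIENT,
        dp=1.5,
        minDist=gray.shape[0],       # expect at most one moon per frame
        param1=120,                  # Canny edge threshold
        param2=40,                   # accumulator threshold (lower = more permissive)
        minRadius=20,
        maxRadius=gray.shape[0] // 2,
    )
    if circles is None:
        return None

    x, y, r = np.round(circles[0, 0]).astype(int)
    # Only accept discs that are actually bright, like a moon against a dark sky.
    disc = gray[max(y - r, 0):y + r, max(x - r, 0):x + r]
    if disc.size == 0 or disc.mean() < 100:
        return None
    return (max(x - r, 0), max(y - r, 0), 2 * r, 2 * r)
```

The real detector, per Samsung, is trained on many moon images so it can generalize to frames it has never seen; the point of the placeholder above is simply the shape of the step: find the disc, box it, then hand that region off for enhancement.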
I have tested Samsung’s “space zoom” mode on two Samsung smartphones. You can see a photo of the moon I took with the Galaxy S22 Ultra in our original review for that phone. I remember remarking on the coolness factor of being able to produce such an image and share it on social media.
I did not get the same results when I took pictures of a full moon with this year's Galaxy S23 Ultra. Instead of the detailed craters exhibited in that S22 Ultra photo, I got an overexposed beam of light shining from up top. Following the logic of Samsung's explanation, that's precisely how the algorithm works: the Galaxy S23 Ultra didn't produce a moon shot like its predecessor's because it never identified the bright blob in my frame as the moon, and without that recognition, the enhancement the company trained its algorithms to apply never kicks in.
It's worth reading through Samsung's entire post, especially if you are curious about how smartphones manage to produce the images they do. However, it doesn't absolve Samsung of the fact that its marketing made it sound like its cameras were zooming 100x natively, like a DSLR or mirrorless camera with a capable telescopic lens. A smartphone and its glass are no match for a full-blown camera setup; the sensor and the lens in front of it are simply too small to natively capture the kind of detail you see in these moon shots.
This is also not your typical post-processing; it's more akin to a live Photoshop job. As the original Reddit post discovered, Samsung's moon photos don't just improve what you've already shot. Instead, the phone combines your photo with multiple other frames (captured at the same time without your noticing) and uses deep learning of existing moon detail to partially synthesize what it thinks the moon's features should look like in your shot. That means things like color are preserved (which wouldn't be the case if the phone simply copy-and-pasted another moon photo over yours), but it also means the phone can lean on the fact that the moon is tidally locked: it knows what the surface should look like at any given time and adjusts your photo to match.
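For a rough mental model of how that differs from ordinary sharpening, the sketch below stacks a burst of frames to cut noise and then boosts detail only inside the detected moon region. The stack_frames and enhance_region functions, and the unsharp mask standing in for a trained detail-synthesis model, are assumptions made for illustration, not Samsung's actual pipeline.

```python
# Illustrative sketch, not Samsung's pipeline: average a burst of the same scene
# to reduce noise, then exaggerate fine detail inside the detected moon box.
# In the real system the "enhance" step is a trained model with prior knowledge
# of the moon's surface; an unsharp mask is used here only as a placeholder.
import cv2
import numpy as np

def stack_frames(frames_bgr):
    """Mean-stack a burst of aligned frames to suppress sensor noise."""
    stack = np.mean(np.stack([f.astype(np.float32) for f in frames_bgr]), axis=0)
    return np.clip(stack, 0, 255).astype(np.uint8)

def enhance_region(image_bgr, box, amount=1.5, sigma=3.0):
    """Placeholder for the learned detail-synthesis step: unsharp-mask one region."""
    x, y, w, h = box
    roi = image_bgr[y:y + h, x:x + w].astype(np.float32)
    blurred = cv2.GaussianBlur(roi, (0, 0), sigma)
    sharpened = cv2.addWeighted(roi, 1 + amount, blurred, -amount, 0)
    out = image_bgr.copy()
    out[y:y + h, x:x + w] = np.clip(sharpened, 0, 255).astype(np.uint8)
    return out
```

The structure is the point: noise reduction across frames first, then a region-specific enhancement step that injects detail the optics never resolved on their own.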
At the same time, if it quacks like a moon photo, it might just be a moon photo. I'm reluctant to take the side of a giant conglomerate, but I also don't blame Samsung for pushing the narrative about its camera capabilities. Most of us just want our smartphones to capture the world so we can look back on our gigabytes of photos with fondness rather than wonder why a shot looks the way it does.
Apple and Google employ similar marketing strategies for their smartphone cameras, with Google leaning into the idea that its machine learning puts its photos ahead of the competition. If you want documented pictures of the moon's phases as they actually appear outside your window, consider skipping the $1,200 Samsung Galaxy S23 Ultra and putting some of that cash toward a telescopic camera setup instead.