Samsung’s fake moon photos aren’t a giant leap for mobile photography

It’s a debate as old as photography itself. On Friday, Reddit user u/ibreakphotos posted some photos of the Moon that had the Internet grappling with a familiar question: what is the “truth” in photography?

The images in question show a blurry Moon alongside a much sharper and cleaner version. The latter is a better picture, but there is a major problem with it. It’s not real — at least in the sense that most of us think a photo is real. Instead, it’s an image created by a Samsung phone from a crappy photo of the Moon, which was then put through some sophisticated processing to add details that weren’t there to begin with. It might seem like a stretch to call it photography, but considering everything smartphone cameras already do, it’s not really the giant leap it seems to be; it’s more like a small step.

Samsung is no stranger to machine learning – it’s spent the last several years playing around with AI-powered high zoom through the aptly named Space Zoom. In most cases, Space Zoom combines data from an optical telephoto lens with multiple frames captured in rapid succession, relying on machine learning to create a much sharper image of distant subjects than you could normally get with a smartphone camera. It is very good.
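For a rough sense of how this kind of multi-frame trick works in principle, here’s an illustrative sketch in Python — not Samsung’s actual pipeline, and the function name and parameters are invented for the demo. It averages a burst of already-aligned frames to reduce noise, then applies an unsharp mask to pull out detail.

```python
# Toy multi-frame merge: average aligned frames, then unsharp-mask the result.
# This is a sketch of the general idea behind burst zoom processing, not any
# vendor's real algorithm. Assumes pre-aligned grayscale frames as NumPy arrays.
import numpy as np
from scipy.ndimage import gaussian_filter

def merge_and_sharpen(frames, amount=1.5, sigma=2.0):
    """Average a burst of aligned frames to cut noise, then sharpen."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    merged = stack.mean(axis=0)                        # averaging reduces random sensor noise
    blurred = gaussian_filter(merged, sigma)
    sharpened = merged + amount * (merged - blurred)   # unsharp mask boosts fine detail
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# Usage (hypothetical): result = merge_and_sharpen([frame1, frame2, frame3])
```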

That’s not exactly what Samsung seems to be doing here. Outside of Moon shots, Samsung’s processing pipeline only works with the data in front of it. It will sharpen the edges of a building photographed from several blocks away with an unsteady hand, but it won’t add windows to the side of the building that weren’t there to begin with.

The Moon seems to be a different case, and ibreakphotos’ clever test reveals the ways in which Samsung puts in a little extra processing. They took an intentionally blurred image of the Moon, displayed it on a computer screen, and photographed it with the phone. The resulting picture shows details that couldn’t have been gleaned from the blurry source; Samsung’s processing adds them, sharpening crater outlines and, in a follow-up test, even adding Moon-like texture to areas that were clipped to white in the original image. It’s not a wholesale copy-and-paste, but it isn’t just amplifying what the camera sees, either.

But… is it that bad? The thing is, smartphone cameras already use a lot of behind-the-scenes techniques in an effort to produce photos you like. Even if you turn off every beauty mode and scene optimization, your images are still processed to brighten faces and make fine details appear in all the right places. Take Face Unblur on recent Google Pixel phones: if your subject’s face is slightly motion-blurred, it’ll use machine learning to combine an image from the ultra-wide camera with an image from the main camera to give you a sharp final image.
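As a loose illustration of that idea — Google hasn’t published Face Unblur’s implementation, so the sharpness metric, the crop-for-crop swap, and the naive paste below are all assumptions — a sketch like this compares the same face region from two frames and keeps the sharper one.

```python
# Rough sketch of combining two frames to fix a blurry face. Not Google's
# actual Face Unblur; the sharpness score and blunt paste are illustrative only.
import cv2
import numpy as np

def sharpness(gray):
    """Variance of the Laplacian: higher means more fine detail (less blur)."""
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def fuse_face(main_img, alt_img, face_box):
    """Replace a blurry face crop in main_img with the sharper crop from alt_img."""
    x, y, w, h = face_box
    crop_main = main_img[y:y+h, x:x+w]
    crop_alt = alt_img[y:y+h, x:x+w]
    if sharpness(cv2.cvtColor(crop_alt, cv2.COLOR_BGR2GRAY)) > \
       sharpness(cv2.cvtColor(crop_main, cv2.COLOR_BGR2GRAY)):
        out = main_img.copy()
        out[y:y+h, x:x+w] = crop_alt   # naive paste; a real pipeline would blend the seams
        return out
    return main_img
```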

Have you ever tried taking a photo of two toddlers both looking at the camera at the same time? It’s arguably harder than taking a picture of the Moon. Face Unblur makes it much easier. And it’s not a feature you turn on in settings or select in the camera app. It’s baked right in and just runs in the background.

To be clear, this isn’t the same thing Samsung is doing with the Moon — Face Unblur combines data from photos you actually took — but the rationale is the same: to give you the image you wanted to take in the first place. Samsung just goes one step further than Face Unblur or any sunset photo you’ve ever taken with a smartphone.

Every photo taken with a digital camera relies on a small computer that makes some guesses

The thing is, every photo taken with a digital camera relies on a little computer that makes some guesses. This is true down to the individual pixels on the sensor. Each has a green, red or blue color filter. A green-filtered pixel can only tell you how green something is, so an algorithm uses neighboring pixel data to make a good guess at how red and blue it is, too. Once you have all that color data sorted out, then you have to make a lot more judgments about how to process the photo.
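To make that guessing concrete, here’s a toy demosaicing sketch. It assumes a simple RGGB Bayer mosaic and fills in the two missing color channels at every pixel with a basic neighborhood average — nothing like a production pipeline, just the shape of the problem.

```python
# Toy demosaic: each sensor pixel measures only one of R, G, or B behind a
# Bayer filter; the missing channels are guessed from neighboring pixels.
# Assumes an RGGB mosaic stored as a 2-D uint8 array. Illustrative only.
import numpy as np
from scipy.ndimage import convolve

def demosaic_average(mosaic):
    h, w = mosaic.shape
    rows, cols = np.mgrid[0:h, 0:w]
    # Masks marking which color each pixel actually measured (RGGB pattern).
    r_mask = ((rows % 2 == 0) & (cols % 2 == 0)).astype(float)
    b_mask = ((rows % 2 == 1) & (cols % 2 == 1)).astype(float)
    g_mask = 1.0 - r_mask - b_mask
    out = np.zeros((h, w, 3))
    kernel = np.ones((3, 3))
    for ch, mask in enumerate([r_mask, g_mask, b_mask]):
        known = mosaic * mask
        # Average the known samples in each 3x3 neighborhood to fill the gaps.
        num = convolve(known, kernel, mode="mirror")
        den = convolve(mask, kernel, mode="mirror")
        out[..., ch] = num / np.maximum(den, 1e-6)
    return np.clip(out, 0, 255).astype(np.uint8)
```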

Year after year, smartphone cameras go a step further, trying to make smarter guesses about the scene you’re shooting and how you want it to look. Every iPhone from the past half decade will recognize faces in a photo and brighten them for a more flattering look. If Apple suddenly stopped doing this, people would be up in arms.

It’s not just Apple — every modern smartphone camera does this. Is this a picture of your best friend? Brighten it up and smooth out the shadows under their eyes! Is it a plate of food? Boost that color saturation so it doesn’t look like a plate of Fancy Feast! All of this happens in the background, and generally, we like the results.

Would it be surprising if, instead of just upping the saturation to make your dinner look appealing, Samsung added a few sprigs of parsley to the picture? Absolutely. But I don’t think it’s a fair comparison to Moon-gate.

Samsung doesn’t put the Eiffel Tower or the little green men in the picture

The Moon isn’t a category of photography the way “food” is. It’s one specific subject, isolated in a dark sky, that every person on Earth is looking at. Samsung doesn’t put the Eiffel Tower or little green men in the picture — it makes an educated guess about what should be there to begin with. Smartphone photos of the Moon also tend to look downright bad, and even Samsung’s enhanced versions still look pretty terrible. There’s no danger that someone with an S23 Ultra is going to win Astrophotographer of the Year.

Samsung takes an extra step forward with its Moon photo processing, but I don’t think it’s the big departure from the basic “truth” of modern smartphone photography that it seems to be. And I don’t think it means we’re headed for a future where our cameras are just Midjourney prompt machines. It’s another step in the journey that smartphone cameras have been on for many years, and if we’re taking the company to court for image-editing crimes, then there’s a long line of co-defendants to bring into the courtroom with it.
