Are Galaxy S23 Ultra Moon Photos Fake?

Smartphone cameras are getting better every year, but compact sensors still have a lot of limitations, especially when it comes to low-light shots and zoom. Samsung was one of the first to add a periscope lens to its smartphones for better zoom. With this lens, users can even take pictures of the moon with their phone. But this has become a controversy, as some users have questioned whether these photos are actually real. I’d say they are.

Modern smartphone cameras and AI

It is now impossible to talk about smartphone cameras without mentioning improvements made by artificial intelligence. Almost every phone maker from Apple to Google to Samsung uses artificial intelligence to improve the photos users take. Such technology can mitigate the lack of a large camera sensor in many cases.

For example, Apple introduced Deep Fusion and Night Mode with the iPhone 11. Both technologies combine the best parts of multiple images with artificial intelligence to result in a better photo when there is little or no light. The iPhone and other smartphones also use artificial intelligence to make the sky bluer, the grass greener, and the food more appealing.
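
To give a flavor of the multi-frame part of that idea, here is a deliberately simplified stacking sketch. It is not Apple’s actual pipeline: frame alignment, tone mapping, and the machine-learned merge step are all left out, and the function is my own illustration.

```python
# Deliberately simplified multi-frame stacking, the general idea behind
# features like Night Mode. This is NOT Apple's pipeline: alignment, tone
# mapping, and the ML-based merge step are all omitted.
import numpy as np

def stack_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Average several already-aligned exposures to reduce noise.

    frames: HxWx3 uint8 arrays captured in quick succession.
    Averaging N frames cuts random sensor noise by roughly sqrt(N).
    """
    stacked = np.mean([f.astype(np.float32) for f in frames], axis=0)
    return np.clip(stacked, 0, 255).astype(np.uint8)

# Hypothetical usage: result = stack_frames(list_of_burst_frames)
```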

Some phones do it more aggressively, some less so. But the fact is, almost all of your smartphone photos in recent years have had some sort of AI modification. You’re not actually capturing what you see with your eyes, but what your smartphone thinks will look best in a digital photo.


Photos of the moon taken with the Galaxy S23 Ultra

Taking photos of the moon is quite difficult: it is an extremely distant, bright subject against a dark sky. Focusing on it is not easy, and you need to set the right exposure to keep the details from blowing out. It’s even harder for a smartphone camera.

You can take photos of the moon with your iPhone if you use a camera app with manual controls, but they still won’t look good due to the distance and the lack of longer optical zoom.
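
If you want to try it anyway, a common starting point for manual settings is the “Looney 11” rule of thumb: at f/11, set the shutter speed to roughly 1/ISO. The sketch below only illustrates that rule; the function name and values are mine, and real results depend on the lens, the moon’s phase, and the atmosphere.

```python
# Rough starting exposure for a full moon using the "Looney 11" rule of
# thumb: at f/11, shutter speed is roughly 1/ISO. This is only a starting
# point; conditions and moon phase will shift it.
def moon_shutter_speed(iso: int, aperture: float) -> float:
    """Return an approximate shutter speed (in seconds) for the moon."""
    base = 1.0 / iso                        # shutter at f/11 for the given ISO
    return base * (aperture / 11.0) ** 2    # scale for a different f-stop

print(moon_shutter_speed(iso=100, aperture=11))   # ~1/100 s
print(moon_shutter_speed(iso=100, aperture=5.6))  # ~1/385 s (brighter lens, faster shutter)
```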


One of the key features of the Samsung Galaxy S Ultra phones is the periscope camera, which allows up to 10x optical zoom. Combined with software tricks, users can take photos with up to 100x zoom using these phones. The iPhone is rumored to get a periscope lens in the next generation, but for now, it only offers a telephoto lens with 3x optical zoom (if you have a Pro model).

With such powerful zoom, Samsung is promoting the camera of the Galaxy S23 Ultra, its latest flagship device, as capable of taking photos of the moon. And indeed it is.

Last week, a user on Reddit conducted an experiment to see if the photos of the moon taken with the S23 Ultra are real or not. The user basically downloaded an image of the moon from the Internet, reduced its resolution to remove detail, and pointed the phone at the screen showing the blurry image of the moon. Surprisingly, the phone took a high-quality image of the moon.
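
If you want to recreate the first half of that test, the idea is simply to destroy detail before showing the image on a monitor. Here’s a minimal sketch with Pillow; the file names and target size are illustrative placeholders, not the values from the original post.

```python
# Minimal sketch of the "blur the moon" step from the Reddit test.
# File names and the target size are illustrative placeholders.
from PIL import Image

moon = Image.open("moon_full_res.jpg")

# Shrink the image so fine crater detail is destroyed, then scale it back
# up so it can be displayed full-screen while staying blurry.
tiny = moon.resize((170, 170))
blurry = tiny.resize(moon.size)
blurry.save("moon_blurry.png")  # display this on a monitor and photograph it
```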

The internet was immediately flooded with people and news sites claiming that Samsung phones were taking fake photos of the moon. Has Samsung been lying all this time? Not really.

See how Samsung phones take pictures of the moon

Contrary to popular belief, Samsung is not replacing user-taken photos with random, high-quality moon shots. Huawei phones, for example, actually do this: instead of enhancing the capture with AI, their system blends pre-existing images of the moon into the final photo. Samsung, on the other hand, uses a lot of artificial intelligence to deliver good moon shots.

Samsung has an article on its Korean website detailing how the camera algorithms on its smartphones work. According to the company, all phones since the Galaxy S10 use AI to improve photos.

For phones with a periscope lens, Samsung uses a “Super Resolution” feature that reconstructs details the sensor can’t capture on its own. When these phones detect the moon, they immediately use AI to increase the contrast and sharpness of the photo, and to artificially boost the resolution of the details in it.

This feature works in much the same way as Pixelmator’s ML Super Resolution or Halide’s Neural Telephoto: the photo isn’t replaced with a different one; instead, a set of algorithms reconstructs it at a higher quality.
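
As a toy illustration of that “enhance, don’t replace” idea, here is what a purely classical version of such a pipeline could look like with Pillow: boost contrast, sharpen, and upscale. This is my own sketch and nothing like Samsung’s actual ML pipeline, which hallucinates plausible detail rather than just interpolating.

```python
# Toy sketch of the kind of post-processing described above: boost contrast
# and sharpness, then upscale. Samsung's real Super Resolution pipeline is
# ML-based and far more involved; this only illustrates the principle.
from PIL import Image, ImageEnhance, ImageFilter

def enhance_moon_shot(path: str, scale: int = 2) -> Image.Image:
    img = Image.open(path)
    img = ImageEnhance.Contrast(img).enhance(1.4)                       # stronger contrast
    img = img.filter(ImageFilter.UnsharpMask(radius=3, percent=150))    # sharpen edges
    # Naive upscale; a neural super-resolution model would fill in
    # plausible detail here instead of just interpolating.
    return img.resize((img.width * scale, img.height * scale))

# enhance_moon_shot("moon_raw.png").save("moon_enhanced.png")
```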

I did the test myself with a Galaxy S22 Ultra, which also has a periscope lens. I used a third-party camera app to take a RAW photo of the moon at 10x zoom. I then took a photo from the same position, but using Samsung’s camera app. When I overlay the two images, I can clearly see that they show the same frame. The processed version, however, has far more detail added by AI to make the photo look better.
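
For reference, here is roughly how such an overlay comparison can be done with NumPy and Pillow. It assumes the two frames are already the same size and roughly aligned, and the file names are placeholders rather than my actual test files.

```python
# Quick sketch of comparing a RAW frame with the processed output.
# Assumes both images are the same size and roughly aligned;
# file names are placeholders.
import numpy as np
from PIL import Image

raw = np.asarray(Image.open("moon_raw.png").convert("L"), dtype=np.float32)
processed = np.asarray(Image.open("moon_processed.png").convert("L"), dtype=np.float32)

# A 50/50 blend shows whether the crater positions line up...
overlay = Image.fromarray(((raw + processed) / 2).astype(np.uint8))
overlay.save("overlay.png")

# ...while the absolute difference highlights where detail was added.
diff = Image.fromarray(np.abs(processed - raw).astype(np.uint8))
diff.save("difference.png")
```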


What makes a photo real or fake?

I wrote this article after watching MKBHD’s latest video in which he asks “what is a photo?” As mentioned earlier, all smartphones today – even the iPhone – use algorithms to enhance images. Sometimes this works very well, sometimes not.

I recently wrote about how Smart HDR makes my photos look over-sharpened and over-saturated, which I don’t like. Since the iPhone XS, many users have also complained about how Apple softens skin in photos. Are the photos taken with my iPhone fake? I don’t think so. But they certainly don’t look 100% natural either.

Samsung has done something really impressive by combining its software with the periscope lens. And as long as the company isn’t replacing users’ images with stock alternatives, I can’t see why that’s a bad thing. Most users just want to take good photos, no matter what.

I’m sure that if Apple eventually comes out with a feature that uses artificial intelligence to take better shots of the moon, a lot of people will no doubt love it. And as MKBHD said, if we start questioning the artificial intelligence used to enhance a photo of the moon, then we should question any photo enhancement done by AI.
