As for the question in the title, the answer is one. Samsung has already explained how the moon shots captured by the Galaxy S23 Ultra's telephoto camera work, but one engineer took it upon himself to explain things in even greater detail. The short story is that the company doesn't simply overlay existing moon photos on top of yours or generate data out of thin air. What it does is far more complex and clever, and if you're still wondering how the company's camera works, keep reading and check out the full video below.
Without getting too technical, the simplest explanation (courtesy of Techisode TV) of how Samsung's moon photos work is that Super Resolution technology captures more than ten frames of the moon and combines the image data from all of them into the best possible result, reducing noise and improving sharpness and detail. That combined result is then enhanced even further by an AI model that Samsung trained to recognize the moon in every phase.
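To make the frame-stacking idea a little more concrete, here is a minimal Python sketch of our own (not Samsung's code, and every number in it is a made-up stand-in) showing why averaging a burst of noisy captures of the same scene reduces noise while keeping the real detail intact:

```python
# Illustrative sketch only: a toy multi-frame stacking step that averages
# several noisy captures of the same scene. A real pipeline also aligns and
# enhances the frames, but the core noise-reduction benefit works like this.
import numpy as np

def stack_frames(frames):
    """Average a burst of aligned frames; random sensor noise cancels out,
    while the real signal (the moon's craters) is the same in every frame."""
    return np.mean(np.stack(frames, axis=0), axis=0)

# Simulate a burst: one "true" scene plus independent noise in each capture.
rng = np.random.default_rng(0)
true_scene = rng.uniform(0.0, 1.0, size=(64, 64))   # stand-in for the moon
burst = [true_scene + rng.normal(0.0, 0.1, true_scene.shape) for _ in range(10)]

stacked = stack_frames(burst)

# Noise in the stacked result drops roughly by sqrt(number of frames).
print("single-frame error:", np.abs(burst[0] - true_scene).mean())
print("stacked error:     ", np.abs(stacked - true_scene).mean())
```

Averaging ten frames cuts the random noise by roughly the square root of ten, and that cleaner combined image is what the AI enhancement step then works from.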
But this explanation doesn't seem to account for the now-famous (or infamous) blurred photo of the moon someone on Reddit used to “prove” that the Galaxy S23 Ultra moon photos are fake. Or does it?
What about those moon photos blurred and tampered with deliberately?
In the video we recommend watching below, Techisode TV explains that because the Reddit user blurred the moon with a Gaussian blur, Samsung's AI was able to run the numbers backward and recover a much clearer image from a photo that seemingly contained no usable detail. In essence, Samsung's convolutional neural network improves an image's sharpness and detail by doing the exact opposite of the math a Gaussian blur performs.
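To see why "running the numbers backward" is even possible, here is another rough Python sketch of our own, again with made-up stand-ins rather than anything from Samsung's pipeline. A Gaussian blur is a precisely defined mathematical operation, so even a simple regularized inverse filter can undo most of it:

```python
# Illustrative sketch only: Gaussian blur is a known convolution, so its
# effect can largely be reversed. Here the blur is applied and then divided
# back out in the Fourier domain with a simple regularized inverse filter.
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_psf(shape, sigma):
    """Point-spread function of a Gaussian blur, the same size as the image."""
    impulse = np.zeros(shape)
    impulse[shape[0] // 2, shape[1] // 2] = 1.0
    return gaussian_filter(impulse, sigma)

def blur(image, H):
    """Apply the blur as a multiplication by H in the frequency domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

def deblur(blurred, H, eps=1e-4):
    """Regularized inverse filter: divide the blur back out of the image."""
    B = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(B * np.conj(H) / (np.abs(H) ** 2 + eps)))

rng = np.random.default_rng(1)
# A textured stand-in for a sharp moon shot (smoothed random detail).
sharp = gaussian_filter(rng.uniform(0.0, 1.0, size=(128, 128)), sigma=1.5)

H = np.fft.fft2(np.fft.ifftshift(gaussian_psf(sharp.shape, sigma=2.0)))
blurred = blur(sharp, H)        # the Reddit-style Gaussian blur
restored = deblur(blurred, H)   # most of the lost detail comes back

# The restored image lands far closer to the original than the blurred one.
print("blurred error: ", np.abs(blurred - sharp).mean())
print("restored error:", np.abs(restored - sharp).mean())
```

A neural network trained on pairs of sharp and degraded images can learn a far more general and robust version of this inversion, which is why a deliberately blurred moon can still come out looking detailed.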
As for the moon photos that some users deliberately modified to include foreign objects and throw off Samsung's moon photography, those results also have an explanation, and again it comes down to the way Samsung's convolutional neural network processes and approximates the image data. Check out the video below for more details.
Lastly, the best proof that Samsung isn't simply faking moon photos is that the same technology the Galaxy S23 Ultra uses to enhance moon shots is applied to every other photo captured at a high enough zoom level, moon or not. In other words, this is much more than an AI trained to enhance moon photos using existing textures and data pulled from memory. It's closer to complex math that tries to "guess" reality from the little information you give it.
Rest assured: Samsung's camera AI doesn't plaster pre-made images onto your telephoto pictures to make them look more realistic. Instead, it uses complex, AI-driven math to calculate, as best it can, how reality should look given the information it receives through the camera's sensor and lenses. It does this with every photo captured at high zoom levels, not just the moon, and it does it extremely well.