One of the most striking features of the new Galaxy S23 is its powerful zoom. On the Ultra model, it is possible to reach 100x magnification by combining the built-in lenses with a crop of the sensor. One of the ways Samsung has shown off this level of magnification is with a photo of the moon: although it is a phone shot, with a steady hand it is possible to capture an image of the moon with an S23 in which many details of the surface can be appreciated.
Many users have verified this in the first weeks the phone has been available, and even on the S22 Ultra launched last year, which, despite having a lower-resolution sensor, also offers a very powerful digital zoom.
But one Reddit user is convinced these moon photos are not real, and his experiment seems to indicate that Samsung phones do indeed add details that are not present when the photo is taken. It is an old controversy that began circulating on social networks in 2021 and has now been revived with new evidence.
To demonstrate this, the user “ibreakphotos” started with a high-resolution photo of the moon downloaded from the web, reduced it to 170 x 170 pixels, and applied a blur filter. He then displayed it on his monitor and, with the room lights off, photographed the screen with the phone, simulating a capture of the moon in the night sky. The resulting photo had far more detail than the 170 x 170 pixel original.
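The image-degradation half of the experiment is easy to reproduce. The sketch below, using only NumPy, downsamples a (here, synthetic stand-in) image to 170 x 170 pixels and applies a simple box blur so that no fine surface detail survives; the actual Reddit post used an image editor, so the exact resampling and blur methods are assumptions.

```python
import numpy as np

def downscale(img, size):
    """Average-pool a square grayscale image down to size x size."""
    h, w = img.shape
    fh, fw = h // size, w // size
    return img[:size * fh, :size * fw].reshape(size, fh, size, fw).mean(axis=(1, 3))

def box_blur(img, k=5):
    """Blur by averaging each pixel with its k x k neighborhood (k odd)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

# Stand-in for the downloaded moon photo: random 1020 x 1020 grayscale values.
moon = np.random.default_rng(0).random((1020, 1020))
small = downscale(moon, 170)   # 170 x 170, as in the experiment
blurred = box_blur(small)      # blurred so fine detail is destroyed
print(small.shape, blurred.shape)
```

The point of the test is that any detail visible in the final phone photo cannot have come from this degraded source, since the blur has averaged it away.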
Although this is not a texture applied directly to the image (something other manufacturers have tried in the past), the experiment indicates that the phone interprets the point of light as the moon and adds the details that should be present, even though the sensor does not actually capture them.
Samsung has acknowledged that when scene recognition mode kicks in, the camera can detect that it is looking at the moon because it has been trained on hundreds of photos of it. Upon detecting it, the phone applies a series of filters and optimizations to achieve the final result. If scene detection mode is turned off, the moon appears as a white blur, which is what you would expect from a phone camera.
But according to Samsung, at no time is a texture or information added that has not been captured by the sensor. The Reddit user’s experiment seems to indicate otherwise.
Regardless of whether extra information is added or not, the episode broadens a debate that has existed for years around what has come to be known as “computational photography.”
No phone camera shows exactly what the sensor captures unless a RAW capture mode is used. Every photo is automatically corrected to improve contrast, brightness, texture, and skin tones.
In recent years, cameras have also begun to perform more direct manipulations. In portrait mode, for example, they blur the background, and in other scenes they can combine several exposures to achieve a better result or expand the dynamic range.
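The exposure-combining idea can be illustrated with a toy sketch. This is not any vendor's actual pipeline; it is a crude stand-in for Mertens-style exposure fusion, where each bracketed frame contributes most where its pixels are well exposed (near mid-gray), so dark frames supply the highlights and bright frames supply the shadows.

```python
import numpy as np

def fuse_exposures(frames):
    """Blend bracketed exposures of the same scene, weighting each
    pixel by how close it is to mid-gray (well-exposedness)."""
    frames = np.stack(frames)                        # shape (n, h, w)
    weights = np.exp(-((frames - 0.5) ** 2) / 0.08)  # favor well-exposed pixels
    weights /= weights.sum(axis=0, keepdims=True)    # normalize per pixel
    return (weights * frames).sum(axis=0)

# Hypothetical bracketed frames: the same gradient scene shot dark,
# normal, and bright, with highlights clipped at 1.0.
scene = np.linspace(0.0, 1.0, 64).reshape(8, 8)
frames = [np.clip(scene * gain, 0.0, 1.0) for gain in (0.5, 1.0, 2.0)]
fused = fuse_exposures(frames)
print(fused.shape)
```

Real implementations add multi-scale blending to hide seams between frames, but the per-pixel weighted average above captures the core trick for expanding dynamic range.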
These are changes that need to be made through software given the size limitations of smartphones. No matter how large the sensors and optics are, they will always be more limited than those of an SLR or, in this case, a telescope.