We recently reviewed the Samsung Galaxy S23 Ultra 5G and found it to be quite the flagship, with plenty to offer users, especially its robust camera system. This time around, Samsung is pushing on both the hardware and AI fronts, and the results are pretty impressive.
Today, we’re taking a deep dive into the camera system, from A to Z, in as much detail as we can.
Here’s a quick spec reminder so you know what you’re dealing with.
| Spec | Details |
| --- | --- |
| Processor | Qualcomm Snapdragon 8 Gen 2 |
| RAM & Storage | 12GB+256GB / 12GB+512GB / 12GB+1TB |
| Display | 6.8’’ 1440p+ Dynamic AMOLED 2X, 120Hz |
| Cameras | 200 MP, f/1.7, 23mm (wide)<br>10 MP, f/4.9, 230mm (periscope telephoto), 10x optical zoom<br>10 MP, f/2.4, 70mm (telephoto), 3x optical zoom<br>12 MP, f/2.2, 13mm, 120˚ (ultrawide)<br>12 MP, f/2.2 (selfie) |
| Battery | 5000 mAh, 45W Fast Charging |
| Price | 12GB+256GB – RM 5699<br>12GB+512GB – RM 6199<br>12GB+1TB – RM 7199 |
| Colors | Phantom Black, Cream, Green, Lavender |
Rear Camera Systems
The Galaxy S23 Ultra 5G deploys a quad-camera setup: a 200MP primary camera, a 10MP telephoto, a 10MP periscope telephoto, and a 12MP ultrawide sensor.
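The zoom factors in the spec sheet follow directly from the 35mm-equivalent focal lengths, measured against the 23mm main camera. A quick sketch using the numbers from the table above:

```python
# Zoom factor = lens focal length / main (wide) camera focal length,
# using the 35mm-equivalent values from the spec sheet.
MAIN_FOCAL_MM = 23  # 200MP wide

lenses = {
    "telephoto": 70,             # 10MP, f/2.4
    "periscope telephoto": 230,  # 10MP, f/4.9
    "ultrawide": 13,             # 12MP, f/2.2
}

for name, focal in lenses.items():
    print(f"{name}: {focal / MAIN_FOCAL_MM:.1f}x")
# telephoto: 3.0x, periscope telephoto: 10.0x, ultrawide: 0.6x
```

That 0.6x for the ultrawide is why it shows up as the “wider than 1x” option in the camera app.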
Standard Wide Angle (200MP / 12MP)
The 200MP main sensor is Samsung’s own HP2, which outputs 12MP images by default. The high pixel count is handled by the new Tetra2pixel RGB Bayer pattern color filter, which effectively groups 16 pixels into one large pixel. There’s also a new autofocus system, dubbed ‘Super QPD’, which can detect changes in phase both vertically and horizontally. In practice, it aims to lock focus on your subject (assuming it’s moving) as quickly as possible, with minimal blur caused by movement.
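The 16-to-1 grouping is also why the numbers line up: 200MP divided by 16 is roughly the 12MP output. Here’s a toy sketch of that kind of binning, where each 4x4 block of small pixels is averaged into one large pixel (an illustration of the idea only, not Samsung’s actual pipeline):

```python
# Toy 16-to-1 pixel binning: average every 4x4 block into one value.
def bin_pixels(image, factor=4):
    """Collapse each factor x factor block into a single averaged pixel."""
    h, w = len(image), len(image[0])
    binned = []
    for by in range(0, h, factor):
        row = []
        for bx in range(0, w, factor):
            block = [image[y][x]
                     for y in range(by, by + factor)
                     for x in range(bx, bx + factor)]
            row.append(sum(block) / len(block))
        binned.append(row)
    return binned

# An 8x8 readout becomes a 2x2 binned image: 64 pixels -> 4 pixels.
sensor = [[y * 8 + x for x in range(8)] for y in range(8)]
small = bin_pixels(sensor)
print(len(small), len(small[0]))  # 2 2
```

Each big pixel gathers the light of sixteen small ones, which is where the low-light benefit comes from.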
I gave equal attention to both Auto and Pro mode, since this time around the Galaxy S23 Ultra highlights new hardware that emphasizes improved AI-assisted photo-taking.
Where the Snapdragon 8 Gen 2 for Galaxy stands out is the debut of Qualcomm’s Cognitive ISP. This new Image Signal Processor enables Semantic Segmentation: a neural-network-based filter trained to recognize details across faces, accessories, and various types of landscapes. Put simply, it’s an AI-heavy photo-processing technique that segments every recognizable object in your image and treats each one separately.
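To make the idea concrete, here’s a toy sketch of segmentation-driven processing: each pixel carries a label (sky, face, foliage, and so on), and each segment gets its own adjustment. The labels and gain values here are invented for the example; the real Cognitive ISP uses a trained neural network and far richer per-segment tuning.

```python
# Toy segmentation-driven processing: every pixel has a segment label,
# and each segment gets its own brightness gain. Labels and gains are
# made up for illustration.
GAINS = {"sky": 0.9, "face": 1.1, "foliage": 1.2, "other": 1.0}

def apply_segment_gains(pixels, labels):
    """Scale each pixel's brightness by its segment's gain, clamped to 255."""
    return [min(255, round(p * GAINS[label]))
            for p, label in zip(pixels, labels)]

pixels = [200, 120, 80, 100]
labels = ["sky", "face", "foliage", "other"]
print(apply_segment_gains(pixels, labels))  # [180, 132, 96, 100]
```

The point is that a segmented image can be tone-mapped region by region instead of with one global curve, which is what lets the phone brighten a face without blowing out the sky behind it.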
The camera UI is very minimal in terms of on-screen options, which is good since you’ll be treating it more as a viewfinder with a shutter button. Tap anywhere to focus on that area, and press and hold your desired spot to lock that focus. Scene Optimizer takes care of the rest. It’s a toggle under “Intelligent features” that simply adds a bit more exposure and saturation to your shots, especially when the camera recognizes food. A food blogger might appreciate not having to add that themselves, but as a photographer who wants flatter, more natural shots, I just turn it off.
If you really wanna nail shots that are well focused and exposed, ALWAYS go Pro mode. Not only does it give you full control, it also captures photos a lot faster: you select the ideal shutter speed yourself and keep the AI out of the way, so there’s nothing to optimize and less waiting when shooting.
This is where the fun begins. You’ll see all the additional dials and buttons you’d find on an actual camera, and for beginners that can be a little overwhelming at first. Don’t worry, it’s easy to understand, and once you get familiar with it over your first few tries you’ll have nothing to worry about!
In Pro Mode you have full control over shutter speed, exposure, and focus points. Choosing manual focus and rolling the dial lets you pick exactly what part of your subject to focus on; if you just want everything in front of you to be sharp, simply leave focus on either multi or centered. The main sensor has a very large f/1.7 aperture, so it lets in more light in darker situations such as night street photography or dark portraits.
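If you want intuition for how those Pro-mode dials trade off against each other, the standard exposure-value relation is a handy reference: EV = log2(N² / t), where N is the f-number and t the shutter time. A small sketch (the scene labels in the comments are just illustrative):

```python
import math

# Exposure value (referenced to ISO 100): EV = log2(N^2 / t),
# N = f-number, t = shutter time in seconds. Higher EV = less exposure,
# suited to brighter scenes.
def exposure_value(f_number, shutter_s):
    return math.log2(f_number ** 2 / shutter_s)

# The S23 Ultra's main camera has a fixed f/1.7 aperture, so in Pro mode
# exposure is really a trade between shutter speed and ISO.
print(round(exposure_value(1.7, 1 / 100), 1))  # fast shutter, brighter scene
print(round(exposure_value(1.7, 1 / 2), 1))    # slow shutter for night work
```

Since the aperture never changes, dropping the shutter from 1/100s to 1/2s is the whole low-light trick here, which is also why a steady hand (or a tripod) matters so much at night.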
Out of all the sensors, we expect this one to be used the most by the average user. Practice different perspectives, explore the full exposure range, and focus carefully. Here’s a couple more shots playing with perspective, as we encourage people to build the skills that complement the hardware.
Ultrawide Angle (12MP)
Samsung’s pretty confident with their ultrawides, boasting accurate distortion correction and image quality. Like the S22 Ultra’s, it still covers a 120-degree field of view, and this time around it’s the ISP on the processor that brings out more detailed photos thanks to better AI and focus. It outputs photos in 12MP: standard affair if you’ve been on Samsung long enough.
With autofocus on board, you simply tap ‘n lock onto whatever you’re focusing on, just as you would on the main sensor. Lighting matters even more on the ultrawide, so right after you tap to focus, remember to dial your exposure up or down to get the best result. And composition, as always, can make or break your shot no matter how scenic things are.
I would encourage you to go beyond the standard horizontal layout and try locking onto the edges instead. This works great for buildings, towers, and exhibitions: the angle is more unique, and the ultrawide gives it a gentle stretch that isn’t too distracting.
Zooms can be tricky on any smartphone, since zooming is either hardware- or software-assisted. Hardware is best, since a dedicated lens handles the job. Now that we’re on a customized Snapdragon 8 Gen 2 that’s “Made for Galaxy”, the AI and postprocessing take a huge leap forward, improving everything from quality to capture times.
So what’s the difference between the S23 Ultra’s Telephoto and Periscope Telephoto?
The difference is in how the optics are arranged. A standard telephoto points straight out of the phone, which limits how long its focal length can be. A periscope telephoto folds the light path sideways with a prism, so a much longer lens assembly fits inside the thin body: that’s how the S23 Ultra reaches 10x optical zoom versus the regular telephoto’s 3x. In both cases the zoom is done by the hardware, and it’s definitely not digital, which is why shots stay sharp and detailed when zooming in, with less noise and grain compared to digital zoom.
On the S23 Ultra, it’s not pure hardware zoom, mind you. It’s more of a hybrid: when you go really far, let’s say 30x and beyond, you’re stretching the limits of the periscope, and the rest of the phone has to zoom digitally to reach your desired level, like 50x or 100x. With this combination, shots get noisier and the device has to compensate in post, so you might notice a watercolor effect taking place, and it’s up to you whether you find the image usable.
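The arithmetic behind that hybrid behavior is simple. Assuming the 10x periscope is the last optical step, everything past 10x is a digital crop plus upscaling, and the crop gets aggressive quickly:

```python
# Hybrid zoom sketch: optics carry you to 10x, everything beyond is a
# digital crop (plus upscaling/AI). The "keep" figure is how much of the
# 10x frame survives the crop per axis. Illustrative numbers only.
OPTICAL_MAX = 10  # periscope telephoto's optical zoom

def digital_factor(target_zoom):
    """Digital crop factor needed on top of the optical zoom."""
    return max(1.0, target_zoom / OPTICAL_MAX)

for zoom in (10, 30, 100):
    d = digital_factor(zoom)
    print(f"{zoom}x -> digital {d:g}x, keeps 1/{d:g} of the frame per axis")
```

At 100x, the sensor is only contributing a hundredth of its area (a tenth per axis), which is exactly why the upscaler has so much work to do and why the watercolor look creeps in.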
The application here is not always about taking sick, far-out shots, but more about simply being able to see a great distance and find out what’s out there. That makes it more of a tool, and you need to understand at this point that the final image might not look as pretty as you’d think, since it’s really a combination of digital cropping and magnifying, with a good helping of AI on top.
Just How Far Is Far?
Start From Any Distance, Watch For Lighting
Compared to the S22 Ultra, which packs the same zoom hardware, the S23 Ultra delivers better-resolved detail and a slight bump in sharpness, though oddly with a thin blanket of grain. It’s an overall improvement, but because the gains are mostly on the software side, S22 Ultra owners can stay put while everyone else can jump on board and enjoy some zooming fun. I would say 30x is still the sweet spot for capturing maximum detail, color, and sharpness, so keep that in mind when you go on a zooming spree. Also, watch your lighting: zoom lenses are notorious for not capturing much light, as you can see from the transition photo of the TRX in the gallery, the one that’s darker than the rest.
The Moon – Ignore The Controversy
Of course, moon shots have been synonymous with Samsung since Space Zoom debuted on the S20 series. Three generations later, it only makes sense that the tech has advanced and improved. Again, we’re playing with 30x and 100x: no tripod, just absolute, trained stillness. Here’s how it went.
We know there’s a trained algorithm instructed to enhance moon shots when we zoom all the way in, and the internet is divided over whether that’s acceptable. As for me, it really doesn’t matter. Getting a good moon shot from an actual camera will cost you more than this phone is worth in body, lenses, and tripods, and likely some training in timing too, since you’ll need to wait for the moon to line up just right to take the shot.
AI optimization is fine by me; it’s a cool trick people can pull off whenever they’re out at night and the moon is full for the taking.
No Pro mode is needed to get these. Just stay on default mode and start pinchin’ to zoom. Obviously there’s a gradual drop in detail by 100x, but for now, no other phone comes close to this level of detail at this distance, no matter how hard we’ve tried. Even though we’d clearly recommend only 30x for the moon and suggest just cropping afterwards, you never know when you’ll get lucky and land a perfectly sharp shot of the moon. We didn’t edit the image in any way.