Smartphone photography has improved considerably over the past decade. Even the most basic smartphones can produce decent images now because the camera software does all of the heavy lifting. That's also why smartphone photos look ultra-processed. The results are a step beyond realistic: in the apparent drive to capture as much detail as possible, there's a noticeable degradation in contrast, and images mostly appear flat overall.
The consequence of this extreme processing is that photos, particularly those taken with high-end devices, seem unnatural: overly sharp, with outrageous highlights. This happens because modern smartphone photography is mostly software magic. When you take a picture, the photography engine captures multiple frames and, based on how it's been tuned, adjusts the exposure on objects and faces, sharpens the image significantly, and reduces noise as much as possible before presenting you with a very obviously processed image that seems tailor-made for Instagram.
This often requires brightening the dark areas of an image considerably while reducing brightness elsewhere to showcase as much of that detail as possible. Displeasure with these over-processed photos has been building, and it's not uncommon to see those who take their photography seriously, including a certain group with a proclivity for flannels and beanies, out and about with digital cameras so that their photos reflect the essence they want to capture instead of looking borderline AI-generated.
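To make that pipeline concrete, here's a minimal sketch in Python of the kind of processing described above. It's a toy model under simplified assumptions: every function name, threshold, and parameter here is hypothetical and purely illustrative, not Apple's actual implementation.

```python
import numpy as np

def merge_exposures(frames):
    """HDR-style merge: average bracketed frames to extend dynamic range."""
    return np.mean(np.stack(frames), axis=0)

def tone_map(img, shadow_lift=0.6, highlight_knee=0.8):
    """Lift shadows and compress highlights: the step that flattens contrast."""
    lifted = np.where(img < 0.3, img * (1.0 + shadow_lift), img)
    return np.where(lifted > highlight_knee,
                    highlight_knee + (lifted - highlight_knee) * 0.5,
                    lifted)

def sharpen(img, amount=1.5):
    """Unsharp mask: exaggerate edges by adding back (image - blurred copy)."""
    blurred = (np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) +
               np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 4.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

def process(frames):
    """Aggressive defaults baked in by the vendor, not chosen by the user."""
    return sharpen(tone_map(merge_exposures(frames)))

# Example: three simulated exposures of a 4x4 grayscale scene, values in [0, 1].
frames = [np.clip(np.random.rand(4, 4) * s, 0.0, 1.0) for s in (0.5, 1.0, 1.5)]
print(process(frames))
```

The point of the sketch is that the shadow lift and sharpening amount are fixed by the vendor's tuning; the user never gets a say before the result is rendered.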
Apple has led the way with exaggerated over-sharpening of images, and last year's iPhone 15 lineup got flayed considerably for it. Things aren't much better with the new iPhone 16 series (they've actually gotten worse), but Apple has come up with some new settings that return a lot of control to the user. That raises the question, though: is this a tacit admission of failure?
Perhaps Apple feels that it's not possible, or simply not worth it, to retrain its camera software to be less overzealous and produce images that look much more natural. Instead, it's offering a new set of features and letting users figure things out for themselves. So set aside the failure of camera software that has ventured into unrealistic territory, and appreciate that Photographic Styles, the feature in question, provides an element of manual control, even though most average users may not be comfortable enough to tinker with it in the first place.
The Photographic Styles feature isn't new; it's been around for several generations of the iPhone, but Apple has now upgraded it in a major way. Users can play around with various settings to make their images look cooler or warmer, dial in skin tones more accurately, and adjust colors. Both presets and manual customization are offered, and all of these changes happen within the camera processing pipeline, so they aren't simply filters like the ones you'd apply in apps like Instagram.
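That distinction between in-pipeline styles and after-the-fact filters is worth sketching out. Continuing the hypothetical toy pipeline above (again, all names and parameters are illustrative assumptions, not Apple's API), the difference might look like this:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Style:
    tone: float = 0.0    # hypothetical knob: -1.0 = punchier, +1.0 = flatter
    warmth: float = 0.0  # hypothetical knob: -1.0 = cooler, +1.0 = warmer

def render_with_style(frames, style):
    """A style changes how the image is built: here it scales how far shadows
    are lifted during tone mapping, before detail is irreversibly baked in.
    merge_exposures, tone_map, and sharpen come from the earlier sketch."""
    merged = merge_exposures(frames)
    flat = tone_map(merged, shadow_lift=0.6 * (1.0 + style.tone))
    return sharpen(flat)

def instagram_style_filter(finished_image, style):
    """A filter can only remap pixels that already exist: highlights that were
    clipped and shadows that were crushed by the pipeline stay lost for good."""
    return np.clip(finished_image + 0.1 * style.warmth, 0.0, 1.0)
```

The ordering is the whole point: a style's parameters feed into the rendering itself, which is why its adjustments can't be reproduced by slapping a filter on the finished photo.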
The iPhone's camera needs to be all things to all people, and this is Apple's way of making that happen. It has to cater to the average user who doesn't understand these nuances, or even know that Photographic Styles exists, as much as it has to obey the whims of the serious mobile photographer. So for the former, there's the familiar exaggerated processing, which produces images they like simply because their benchmark for judging image quality is vastly different from that of the latter, who understands contrast, sharpness, noise reduction, skin undertones, etc., all of which they now have more control over.
Apple has taken the easy way out here. Instead of improving its camera software to address what even the most loyal of iPhone fans now consider a failing, it's shifting the burden onto users through new features that aren't very intuitive to begin with and will likely cause much aggravation to those who aren't used to, or simply don't want to do, all of this manually. Add to that the fact that Photographic Styles can only be used when shooting in Apple's preferred HEIF image file format, which isn't widely supported, and that no option is provided to automatically convert those files to JPEG, the gold standard for image files.
To be fair, Samsung's camera software has similarly aggressive tendencies, as we've pointed out in many of our high-end Galaxy device reviews. From the Galaxy Z Fold 6's camera going “a little overboard sometimes” to the Galaxy S24 Ultra's “struggles with accurate color production,” even though Samsung toned down the exaggerated sharpening on it, the company's phones are in the same boat as the iPhone.
Samsung does provide several tools for manual control, such as the Expert RAW app for pro photographers and the Camera Assistant app that lets users decide whether the camera should prioritize focus speed, capture speed, or picture quality, but nothing quite like what Apple is now offering with Photographic Styles. How long before it does, though?
I'd argue that it shouldn't. Instead of following Apple into an admission of failure, Samsung needs to do the opposite and work diligently on addressing these shortcomings. Truly intelligent camera processing shouldn't apply over-processing with a one-track mind; rather, it should navigate the nuances people actually want to see in the images they capture, better than what users can achieve manually with features like Photographic Styles.
With Photographic Styles, Apple has handed Samsung a great opportunity to achieve what Apple itself hasn't managed with its camera processing software. It's been a long-running joke that image quality on Android phones is far inferior to the iPhone's; it's the easiest retort for non-technical fans, and it feeds the misguided belief among Apple fans that no other phone can provide comparable, if not better, image quality than the iPhone.
If Samsung can get this done, rather than choosing a cop-out like Apple did, it would forever put this pointless debate to rest and cement its status as the company that fixed everything that's wrong with smartphone photography today.