The iPhone 14 is almost right around the corner, and the rumor mill is churning furiously as we head toward its launch. There's plenty I could wish for, but I'd especially like to see the camera take some real steps forward.
The cameras on Apple's phones have always been great. The latest Pro models are capable of capturing the shots you'd expect from professional cameras, and even the cheapest iPhone SE can capture beautiful shots on your summer vacation. However, rivals like Samsung pack amazing camera systems of their own, and Apple hasn't led the way for a while.
So I dreamed up how to redesign Apple’s camera system for the iPhone 14 to secure its position as the best photography phone. Apple, take note.
A larger image sensor on the iPhone 14
The image sensors inside phones are tiny compared to those in professional cameras like the Canon EOS R5. The smaller the image sensor, the less light can hit it, and light is everything in photography. More captured light means better-looking pictures, especially at night, which is why pro cameras have sensors many times larger than a phone's.
Why are phone cameras lacking in this regard? Because image sensors have to fit into pocket-sized phone bodies where space is at a premium. But there is definitely room to play: phones like Sony's Xperia Pro-I and even 2015's Panasonic CM1 pack 1-inch camera sensors that offer vastly improved dynamic range and low-light ability, so it's not too wild to hope for a larger image sensor in the iPhone 14.
Of course, Apple does amazing things with computational photography to squeeze every ounce of quality out of its tiny sensors, but if it combined those same software smarts with a large image sensor, the difference could be huge. A 1-inch image sensor certainly isn't out of the question, but I'd love to see Apple push things even further with an APS-C-sized sensor like those found in many mirrorless cameras.
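To put rough numbers on that claim, here's a minimal sketch comparing sensor areas, which are a first-order proxy for how much light each can gather. The 1-inch-type and APS-C dimensions are standard published figures; the iPhone's main-sensor dimensions are my own rough estimate (figures around 7.7 x 5.7 mm are commonly cited for the iPhone 13 Pro):

```swift
import Foundation

// Rough light-gathering comparison by sensor area (mm^2).
// iPhone dimensions are an estimate; 1-inch type and APS-C are standard figures.
struct Sensor {
    let name: String
    let width: Double   // mm
    let height: Double  // mm
    var area: Double { width * height }
}

let sensors = [
    Sensor(name: "iPhone main (est.)", width: 7.7, height: 5.7),
    Sensor(name: "1-inch type", width: 13.2, height: 8.8),
    Sensor(name: "APS-C", width: 23.5, height: 15.6),
]

let baseline = sensors[0].area
for s in sensors {
    // Area ratio approximates how much more light the sensor can capture.
    let ratio = s.area / baseline
    print("\(s.name): \(String(format: "%.0f", s.area)) mm^2, \(String(format: "%.1f", ratio))x")
}
// Prints roughly: iPhone 44 mm^2 (1.0x), 1-inch 116 mm^2 (2.6x), APS-C 367 mm^2 (8.4x)
```

By that back-of-the-envelope math, a 1-inch sensor would gather around two and a half times the light of today's main camera, and an APS-C sensor over eight times.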
Of course, all three cameras can't get massive sensors; they simply wouldn't fit in the phone. But perhaps the main camera alone could grow. Either that, or Apple could use a single large image sensor and put the lenses on a rotating dial on the back, letting you physically change the angle of view to suit your scene. Admittedly, that doesn't seem like a very Apple thing to do.
A zoom that will finally rival Samsung
While I generally found images taken with the iPhone 13 Pro's primary camera to look better than those from the Galaxy S22 Ultra, there is one area where Samsung wins: telephoto zoom. The iPhone tops out at 3x optical zoom, while the S22 Ultra offers up to 10x.
And the difference in shots you can get is staggering. I love zoom lenses because they allow you to find all kinds of hidden compositions in the scene instead of using a wide lens and shooting everything in front of you. I think they allow for more artistic, more considered images, and while the iPhone’s zoom helps you get those compositions, it’s no competition for the S22 Ultra.
So what the phone needs is a proper zoom lens built on good optics, not just digital cropping and sharpening, which always results in pretty muddy-looking shots. There should be at least two optical zoom levels: 5x for portraits and 10x for more detailed landscapes. Better yet, let it zoom continuously between those levels so you can find the perfect composition instead of choosing between two fixed options.
Personally, I think 10x is the maximum Apple should go for. Sure, Samsung boasts that its phone can zoom up to 100x, but those shots rely heavily on digital cropping, and the results are terrible. 10x is huge, the equivalent of carrying a 24-240mm lens for your DSLR: wide enough for sweeping landscapes, with enough reach for wildlife photography. Ideal.
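If you're wondering where that comparison comes from, a zoom factor is just a multiplier on the main camera's focal length. A quick sketch, assuming the commonly quoted 26mm equivalent for the iPhone's main camera:

```swift
// Convert a zoom factor to a 35mm-equivalent focal length, assuming the
// main camera sits at a 26mm equivalent (the commonly quoted iPhone figure).
let mainCameraEquivalent = 26.0  // mm

func equivalentFocalLength(zoom: Double) -> Double {
    mainCameraEquivalent * zoom
}

for zoom in [1.0, 3.0, 5.0, 10.0] {
    print("\(zoom)x -> \(Int(equivalentFocalLength(zoom: zoom)))mm equivalent")
}
// 1x -> 26mm, 3x -> 78mm, 5x -> 130mm, 10x -> 260mm: squarely telephoto territory.
```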
Pro video controls built into the default camera app
With the introduction of ProRes video on the iPhone 13 Pro, Apple has sent a strong signal that it sees its phones as truly useful video tools for professional creatives. ProRes is a video codec that captures large amounts of data, allowing for greater editing control in postproduction software such as Adobe Premiere Pro.
But the camera app itself is still pretty basic, with video settings mostly limited to turning ProRes on or off, switching lenses, and changing resolution. That's understandable; the point is to make it as simple as possible to capture beautiful footage without fuss. But professionals looking to use ProRes will want more manual control over things like white balance, focus, and shutter speed.
Yes, that's why apps like Filmic Pro exist, giving you incredibly fine control over all these parameters to get exactly the look you want. But it would be nice to see Apple make these settings more accessible in the default camera app, so you could fire up the camera from the lock screen, tweak a few settings, and start shooting right away, confident you're getting exactly what you want from your video.
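The hooks for this already exist; manual-control apps are built on AVFoundation's AVCaptureDevice APIs. A minimal sketch of locking shutter speed, focus, and white balance might look like this (the specific values are arbitrary examples, and real code should clamp them to the ranges the active format supports):

```swift
import AVFoundation

// Minimal sketch: manually lock shutter speed, focus, and white balance.
// These are the same AVCaptureDevice controls third-party pro apps build on.
func applyManualVideoSettings() throws {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else { return }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Shutter speed 1/48s at ISO 400 (arbitrary examples; clamp to the
    // active format's supported duration and ISO ranges in real code).
    device.setExposureModeCustom(duration: CMTime(value: 1, timescale: 48),
                                 iso: 400,
                                 completionHandler: nil)

    // Lock focus at a fixed lens position (0.0 = nearest, 1.0 = farthest).
    device.setFocusModeLocked(lensPosition: 0.7, completionHandler: nil)

    // Lock white balance to a 5600K daylight look (clamp gains to
    // device.maxWhiteBalanceGain in production code).
    let daylight = AVCaptureDevice.WhiteBalanceTemperatureAndTintValues(temperature: 5600,
                                                                        tint: 0)
    let gains = device.deviceWhiteBalanceGains(for: daylight)
    device.setWhiteBalanceModeLocked(with: gains, completionHandler: nil)
}
```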
Focus stacking in the camera on the iPhone
Imagine you've found a beautiful mountain wildflower with a towering alpine peak behind it. You move closer to the flower and tap it to focus, and it snaps into sharp focus. But now the mountain is blurry, and if you tap the mountain instead, the flower blurs. This is a common problem when trying to focus on two elements in a scene that are far apart, and experienced landscape and macro photographers solve it with a technique called focus stacking.
Focus stacking means taking a series of pictures with the camera held still while focusing on different elements within a scene. These images are then combined, usually in a desktop program such as Adobe Photoshop or a dedicated focus-stacking program such as Helicon Focus, to create an image that's sharp from extreme foreground to background. It's the opposite of the camera's Portrait mode, which purposefully blurs the background around the subject for a shallow depth of field, or bokeh.
This may be a niche wish, but I'd love to see focus stacking on the iPhone, and it wouldn't be too hard to do. After all, the phone already uses image-blending technology to combine different exposures into a single HDR image; it would just do the same with focus points rather than exposures.
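To show how simple the core idea is, here's a naive focus-stacking sketch over grayscale frames stored as plain 2D arrays: for every pixel, keep the value from whichever frame is locally sharpest, with sharpness measured by a Laplacian filter. A production pipeline would align the frames first and blend smoothly rather than picking hard winners, but the principle is the same:

```swift
// Naive focus stack over grayscale frames (rows x cols of Double brightness).
// For each pixel, keep the value from the frame that is locally sharpest,
// where sharpness is the absolute response of a 4-neighbour Laplacian.
typealias GrayImage = [[Double]]

// High Laplacian response means strong local detail, i.e. in focus.
func sharpness(_ img: GrayImage, _ r: Int, _ c: Int) -> Double {
    abs(4 * img[r][c] - img[r - 1][c] - img[r + 1][c] - img[r][c - 1] - img[r][c + 1])
}

func focusStack(_ frames: [GrayImage]) -> GrayImage {
    let rows = frames[0].count
    let cols = frames[0][0].count
    var result = frames[0]  // border pixels fall back to the first frame
    for r in 1..<(rows - 1) {
        for c in 1..<(cols - 1) {
            // Pick the frame with the strongest local detail at this pixel.
            var bestIndex = 0
            var bestScore = sharpness(frames[0], r, c)
            for (i, frame) in frames.enumerated().dropFirst() {
                let score = sharpness(frame, r, c)
                if score > bestScore {
                    bestIndex = i
                    bestScore = score
                }
            }
            result[r][c] = frames[bestIndex][r][c]
        }
    }
    return result
}
```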
Better long exposure photography
The iPhone has been able to take long-exposure photos for years now. You'll have seen those shots: images of waterfalls or rivers where the water is expertly blurred but the rocks and landscape around it remain sharp. It's a great technique to really emphasize the motion in a scene, and it's something I like to do on both my proper camera and my iPhone.
And while it's easy to do on an iPhone, the results are only okay. The problem is that the iPhone uses a moving image (a Live Photo) to detect motion in the scene and then digitally blur it, which usually means anything in motion blurs, even bits that shouldn't. The result is shots that look decent but never tack-sharp, even when you put the phone on a mobile tripod for stability. They're fine to send to your family or post on Instagram, but they wouldn't look good printed and framed on your wall, and I think that's a shame.
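For the curious, a simulated long exposure is essentially an average of many short frames: anything that moves smears into a streak, while static areas stay put, but only if the frames line up perfectly, which is exactly where stabilization falls short. A minimal sketch of that averaging step over grayscale frames:

```swift
// Simulated long exposure: average a burst of aligned grayscale frames
// (each frame is rows x cols of Double brightness values).
// Moving subjects smear into streaks; static areas stay sharp, but only
// if the frames line up, which is why stabilization matters so much.
func simulatedLongExposure(_ frames: [[[Double]]]) -> [[Double]] {
    let rows = frames[0].count
    let cols = frames[0][0].count
    var result = [[Double]](repeating: [Double](repeating: 0, count: cols), count: rows)
    for frame in frames {
        for r in 0..<rows {
            for c in 0..<cols {
                // Accumulate each frame's contribution to the average.
                result[r][c] += frame[r][c] / Double(frames.count)
            }
        }
    }
    return result
}
```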
I'd like to see Apple make better use of optical image stabilization to allow for really sharp long-exposure shots of night scenes, perhaps car headlights streaking down the street, not just water. It would be another great way to get creative with your phone photography and take advantage of the great quality of these cameras.