October 10, 2017

iPhone 8, 8+ and X cameras: Not a big leap, just a big difference in your photos.

Nope. Still 12 megapixels in the new iPhones. There are noticeable improvements to the sensors, lenses and flash hardware, which have some saying this is the best phone camera ever made. But the real difference is the Image Signal Processor powered by the new A11 Bionic chip.

The big improvement: ISP.

The ISP is the unsung hero of phone photography: it takes raw sensor data and decides how to process it into a pleasing image. While the Sony-made camera modules in the new iPhones send more accurate image sensor data to the A11, what happens after that is where the magic begins.

There’s somethin’ happening here.

I really don’t like to delve into speculation, but I’ll have to in order to guess at some of the ISP magic. While the sensors on the iPhone’s back cameras are about 48 times smaller than the one in a DSLR, the iPhone has many more sensors available than a DSLR does. The Plus and X models even more so, since their two rear lenses can supply separate focus-distance data based on the subject and background to create a depth map and collect three-dimensional data. So your iPhone can compare all this data to better understand what you’re shooting and even know how you want the final image to look.

More sensors = better photos.

For instance, data from the image sensors, accelerometer and clock could tell your iPhone that you’re shooting a subject in front of the low sun at dusk and likely want a silhouette of the darkened subject against a colorful sunset. It’s also likely that the enhanced optical image stabilization is aided in low light by accelerometer data and image-sensor sharpness data, working with the continuous image buffer that’s capturing images at 12 frames per second. If you shook the camera and blurred the image as you pressed the shutter, or the subject blinked, the iPhone could decide that a sharper or eyes-open frame captured just before or after you pressed the shutter is a better choice. While some of this is speculation, the results of these scenarios are already visible in images from the new iPhones.

And no speculation on this: the depth map created from the dual cameras’ data, which spawned Portrait mode (sharp subject against a blurred background), is now joined by Portrait Lighting filters. These don’t just monkey with overall color or contrast like Instagram filters; they can lighten, darken and recolor individual pixels based on professional lighting algorithms to completely redefine subject and scene lighting.

Okay, one big leap: HEIF.

All this data works better with an image format that goes beyond the confines of the 25-year-old JPG. Apple added the option to shoot in HEIF (an ISO standard established in 2015), which appears to be the future of photography. It currently produces images that look a little nicer than JPGs at about half the file size. HEIF is essentially a package that can store different types of data (like image formats, Live Photo sequences and other shooting data) and will support new image formats as they emerge. If called for, your iPhone will automatically convert these HEIF photos to JPGs on export, so there’s no reason not to use the new format and save the space.

So the spec sheets of the new iPhones may not look that impressive at first, but the photos from the iPhone 8, 8+ and X certainly will.

Get the definitive book for mastering your iPhone camera: The Crap-Free Guide to iPhone Photography
