Alongside the iPhone 12 mini and Pro Max orders opening this week and the devices reaching the first customers on November 13, Apple’s VP of camera software engineering Jon McCormack and product line manager Francesca Sweet have shared a behind-the-scenes look at the company’s philosophy for iPhone camera design, its goals for everyone from everyday users to pros, and more.

McCormack and Sweet talked with PetaPixel for a new interview diving into Apple’s thinking behind designing its iPhone cameras. Not surprisingly, they revealed that the big picture is a holistic approach spanning both software and hardware:

Both made clear that the company thinks of camera development holistically: it’s not just the sensor and lenses, but also everything from Apple’s A14 Bionic chip, to the image signal processing, to the software behind its computational photography.

As for the main goal, Apple wants to make capturing shots with an iPhone camera so seamless that users aren’t distracted from what’s happening in the moment.

McCormack also highlights that this even applies to “serious photographers”:

“As photographers, we tend to have to think a lot about things like ISO, subject motion, et cetera,” McCormack said. “And Apple wants to take that away to allow people to stay in the moment, take a great photo, and get back to what they’re doing.”

He explained that while more serious photographers want to take a photo and then go through an editing process to make it their own, Apple is doing what it can to compress that process down into the single action of capturing a frame, all with the goal of removing distractions that could take a person out of the moment.

“We replicate as much as we can to what the photographer will do in post,” McCormack continued. “There are two sides to taking a photo: the exposure, and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post and doing that automatically for you. The goal of this is to make photographs that look more true to life, to replicate what it was like to actually be there.”

Apple uses machine learning to process different aspects of a photo individually:

“The background, foreground, eyes, lips, hair, skin, clothing, skies. We process all these independently like you would in Lightroom with a bunch of local adjustments,” he explained. “We adjust everything from exposure, contrast, and saturation, and combine them all together.”
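
To make the “local adjustments” idea concrete, here is a minimal Core Image sketch of the recombine step, not Apple’s actual pipeline: the segmentation mask is assumed to come from some ML model (the hypothetical skyMask below), and the filter settings are placeholder values.

```swift
import CoreImage

// Minimal sketch: grade one segmented region, then recombine it with
// the untouched frame. The mask (white = region, black = elsewhere)
// is assumed to come from an ML segmentation model, not shown here.
func adjust(region frame: CIImage,
            mask: CIImage,
            ev: Double,
            contrast: Double,
            saturation: Double) -> CIImage {
    // Develop the whole frame with this region's settings...
    let graded = frame
        .applyingFilter("CIExposureAdjust", parameters: ["inputEV": ev])
        .applyingFilter("CIColorControls", parameters: [
            "inputContrast": contrast,
            "inputSaturation": saturation
        ])
    // ...then keep the graded pixels only inside the mask, falling
    // back to the original frame everywhere else.
    return graded.applyingFilter("CIBlendWithMask", parameters: [
        "inputBackgroundImage": frame,
        "inputMaskImage": mask
    ])
}

// Hypothetical usage, one pass per region (sky, skin, hair, and so on):
// let result = adjust(region: frame, mask: skyMask,
//                     ev: -0.3, contrast: 1.05, saturation: 1.1)
```

Chaining one such pass per region and compositing the results is the Lightroom-style workflow the quote describes.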

Francesca Sweet commented on the improvements that the iPhone 12 lineup’s cameras bring to Night mode:

“The new wide camera, improved image fusion algorithms, make for lower noise and better detail,” she said. “With the Pro Max we can extend that even further because the bigger sensor allows us to capture more light in less time, which makes for better motion freezing at night.”
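
The motion-freezing point is straightforward exposure arithmetic: a sensor that gathers more light per unit time needs a proportionally shorter shutter for the same exposure. A back-of-the-envelope sketch, assuming light gathering scales with sensor area and using the 47% larger sensor Apple quoted for the 12 Pro Max wide camera:

```swift
// Back-of-the-envelope exposure math (illustrative numbers, not Apple’s).
// Assumption: light gathered per unit time scales with sensor area.
let areaGain = 1.47                  // Apple’s stated 47% larger sensor
let baseShutter = 1.0 / 15.0         // hypothetical Night mode frame, in seconds
let sameExposure = baseShutter / areaGain
print(sameExposure)                  // ≈ 0.045 s, i.e. about 1/22 s
// Shutter time, and with it motion blur, drops by 1 - 1/1.47 ≈ 32%.
```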

And when it comes to the new ProRAW option, McCormack shared that the idea came from Apple asking whether it could offer the benefits of shooting in RAW while keeping the advantages of computational photography.
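
For developers who want to try it, ProRAW capture is exposed through AVFoundation starting with iOS 14.3. A rough sketch of the opt-in, assuming an AVCapturePhotoOutput already attached to a configured capture session:

```swift
import AVFoundation

// Sketch: opting into ProRAW capture. Assumes `photoOutput` is an
// AVCapturePhotoOutput on a session configured for photo capture.
func proRAWSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // ProRAW is only offered on supported hardware (iPhone 12 Pro models).
    guard photoOutput.isAppleProRAWSupported else { return nil }
    photoOutput.isAppleProRAWEnabled = true
    // Pick the first ProRAW-capable pixel format the output advertises.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isAppleProRAWPixelFormat($0)
    }) else { return nil }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
}
```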

The full interview is an interesting read; check it out at PetaPixel.
