Camera Tricks: How the Dazzling iPhone XS and XR Photo Features Work

With today's unveiling of the iPhone XS, XS Max, and XR, Apple took aim at having the best camera on a smartphone. And all this new technology is exemplary of a greater trend: better photos through software.


Depth Control


At the event, Apple’s Phil Schiller showed off two key new photo features. One is called Depth Control, which can blur or sharpen the background of an existing photo taken in Portrait Mode. An audience member let out an audible “No!” during the demo, as in, No way, is that what I think it is? It was, and the demo rightfully earned applause. Never before had an iPhone been able to change a photo’s depth of field after the fact.

Now, Schiller wasn’t quite right when he said that this is the first time something like this has been possible. A company named Lytro, which went out of business last March, made headlines earlier this decade with its light-field camera. Lytro’s breakthrough was capturing a deluge of data, including the direction light rays are traveling, enough that the user could change the focal distance after the photo was taken, blurring or un-blurring the background.

The iPhone achieves the same end in a more user-friendly way. None of these ideas matter if they’re too difficult for people to figure out, and Apple gets that.
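
Apple hasn’t published how Depth Control works internally, but the core idea, using a per-pixel depth map to decide how much to blur, can be sketched with Core Image’s built-in CIMaskedVariableBlur filter. This Swift snippet is a minimal illustration under that assumption, not Apple’s pipeline; the photo and depth map are assumed to come from a Portrait Mode capture, with the map brighter the farther a pixel is from the camera.

```swift
import CoreImage

/// A minimal sketch of after-the-fact background blur, assuming `photo` and
/// `depthMap` are CIImages from a Portrait Mode capture (depth map assumed
/// brighter with distance). Illustrative only; not Apple's Depth Control.
func blurBackground(photo: CIImage, depthMap: CIImage, radius: CGFloat) -> CIImage? {
    // CIMaskedVariableBlur blurs the input more where the mask is brighter,
    // so a far-is-bright depth map keeps the near subject sharp while the
    // background smears. A real implementation would first remap the depth
    // map around the user's chosen focus plane.
    let filter = CIFilter(name: "CIMaskedVariableBlur", parameters: [
        kCIInputImageKey: photo,
        "inputMask": depthMap.clampedToExtent(),
        kCIInputRadiusKey: radius
    ])
    return filter?.outputImage?.cropped(to: photo.extent)
}
```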

Smart HDR

[Image: Smart HDR on the iPhone XS captures frames at many exposures, then merges the best elements of each into one photo. Credit: Apple]

The second new function is Smart HDR.

iPhone users will have noticed a tag on photos that says “HDR.” It stands for High Dynamic Range, a process where, when you press the shutter button, the phone captures several frames in rapid succession, then merges elements from each one into a single image (a good shadow from one, skin texture from another). The new Smart HDR captures four frames, plus four more in between those frames at a different exposure. The iPhone also takes a separate long exposure to grab bright color elements.
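
Apple hasn’t said exactly how Smart HDR weighs those frames, but the family of techniques it resembles, exposure fusion, blends each pixel from the frames where it is best exposed. Here is a toy Swift sketch on normalized grayscale frames; the Gaussian weighting and the sigma value are illustrative assumptions, not Apple’s algorithm.

```swift
import Foundation

/// A toy exposure-fusion merge in the spirit of Smart HDR (not Apple's
/// algorithm). Each input is a grayscale frame with pixel values in 0...1,
/// all depicting the same scene at different exposures. Pixels near mid-gray
/// (well exposed) get the most weight in the blend.
func fuseExposures(_ frames: [[Double]]) -> [Double] {
    guard let first = frames.first else { return [] }
    let sigma = 0.2  // how sharply we favor well-exposed pixels (assumed value)
    var result = [Double](repeating: 0, count: first.count)

    for i in 0..<first.count {
        var weightedSum = 0.0
        var totalWeight = 0.0
        for frame in frames {
            let v = frame[i]
            // Gaussian "well-exposedness" weight, peaking at mid-gray 0.5.
            let w = exp(-pow(v - 0.5, 2) / (2 * sigma * sigma))
            weightedSum += w * v
            totalWeight += w
        }
        result[i] = weightedSum / max(totalWeight, 1e-9)
    }
    return result
}

// Example: a dark, a mid, and a bright exposure of the same three pixels.
print(fuseExposures([[0.05, 0.10, 0.02],
                     [0.40, 0.55, 0.20],
                     [0.90, 0.98, 0.60]]))
```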

The hard work of merging all these frames into a single image happens on the A12 Bionic chip that powers the new iPhones. Schiller said the chip’s “Neural Engine” will handle machine learning functions like recognizing faces. It all adds up, he said, to a phone that can perform up to 1 trillion operations on a single photo.
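
Schiller didn’t name the APIs involved, but the kind of on-device face detection he attributed to the Neural Engine is what Apple exposes to developers through the Vision framework. A minimal sketch, assuming you already have a decoded CGImage:

```swift
import CoreGraphics
import Vision

/// A minimal sketch of on-device face detection via Apple's Vision framework,
/// the developer-facing API for the sort of work Schiller described.
func detectFaces(in cgImage: CGImage) throws -> [VNFaceObservation] {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    // Each observation carries a normalized bounding box for one face.
    return request.results as? [VNFaceObservation] ?? []
}
```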

Simply the Best?

[Image: The iPhone XS has two 12MP cameras; the top is wide-angle and the bottom is telephoto. Credit: Apple]

So far, we haven’t mentioned the hardware. The back of the new iPhone XS models has a 12-megapixel wide-angle camera (f/1.8) and a 12-megapixel telephoto camera (f/2.4) that work together. Up front, as on the iPhone X, the same hardware that runs Face ID (a 7-megapixel f/2.2 camera, an infrared camera, and a dot projector) works with software to tune the camera for flattering selfies.
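
For developers, that paired-camera hardware surfaces through AVFoundation. As a hedged sketch, this is roughly how an app requests the rear dual camera and enables depth data delivery for Portrait-style captures; session wiring and error handling are pared down for illustration.

```swift
import AVFoundation

/// A minimal sketch of configuring capture on a dual-camera iPhone.
/// Error handling and preview setup are omitted for brevity.
func configureDepthCapture(session: AVCaptureSession,
                           photoOutput: AVCapturePhotoOutput) throws {
    // .builtInDualCamera is the wide-angle + telephoto pair on the back.
    guard let device = AVCaptureDevice.default(.builtInDualCamera,
                                               for: .video,
                                               position: .back) else { return }
    let input = try AVCaptureDeviceInput(device: device)

    session.beginConfiguration()
    if session.canAddInput(input) { session.addInput(input) }
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    // Depth delivery must be enabled up front for depth-bearing photos.
    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
    }
    session.commitConfiguration()
}
```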

The real question is: If you want a new phone, and an awesome camera is a priority, what do you buy?

That’s hard to answer right now, mostly because we don’t have details about Google’s Pixel 3, which will almost certainly have a camera and software as good as, if not better than, the iPhone XS’s. Before today, the title of best smartphone camera was generally considered a contest between the Pixel 2 and the iPhone X.

For its part, Google beat Apple to two big innovations in smartphone photography. First, the Google Photos app made computer vision useful: searching “beach” or “dog” actually found photos of those things, and it let you remove already-backed-up photos from the phone to free up storage. It was also free, as long as you didn’t mind the compression. Second, Google’s Pixel phones showed us the potential of software in phone photography by analyzing what’s being photographed and applying subtle post-processing. The Pixel could also identify a human form and artificially blur the background, like on a big SLR camera, achieving in software what Apple’s Portrait Mode does with two cameras. It’s how the Pixel 2 (arguably) took better photos with one camera than the same-era iPhone could with two.

However, Apple’s cameras have been, for the last few generations, capable beyond the needs of most people. Modern iPhones shoot incredible photographs. In fact, I don’t know anyone who spent four figures on an iPhone X and is less than thrilled with it, and I expect the same of the new XS models. As someone who believes iOS beats Android in just about every important way, and because Google Photos works really well on iOS, I still recommend iPhones. Even a certified Apple-refurbished iPhone 7 Plus takes great shots.

Lastly, know this: While they have gotten closer, phones haven’t yet caught up to dedicated cameras in image quality. The compactness of a phone means there will never be room for a big sensor.

Last year, I asked Pulitzer Prize-winning photographer and prolific iPhone user Lynsey Addario about this. She said this, which says it all:

“If you take a great photographer and give them an iPhone, they’re going to take great pictures. But if you give them a professional camera, the pictures are going to have great resolution, great color contrast….The sharpness, the color saturation, the color tone, depth of field. You can do that on the iPhone, but it can’t compare to a camera with a beautiful lens.”