Though it is a budget device with a single-lens camera, the iPhone SE supports Portrait Mode, enabled by the powerful A13 chip inside the smartphone.
It is the first of Apple’s smartphones to offer Portrait Mode photos created entirely with software techniques rather than hardware, which prompted the developers behind popular iOS camera app Halide to take a deep dive into how it works.
The iPhone SE is equipped with the same camera sensor as the iPhone 8, according to a recent teardown by iFixit, but its camera can do more because it uses "Single Image Monocular Depth Estimation," that is, generating Portrait Mode's depth effect from a single 2D image.
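Apple hasn't published the model behind this, but the general technique is straightforward to sketch: a neural network takes one RGB image and predicts a per-pixel depth map. Below is a minimal Swift sketch using Apple's public Vision and Core ML APIs, assuming a hypothetical bundled Core ML model named DepthEstimator; Apple's actual Portrait Mode network is not available to developers.

```swift
import UIKit
import Vision
import CoreML

// Minimal sketch of single-image monocular depth estimation on iOS.
// "DepthEstimator" stands in for a hypothetical Core ML model that maps
// one RGB image to a per-pixel depth map; Apple's actual Portrait Mode
// network is not exposed to developers.
func estimateDepth(from image: UIImage, completion: @escaping (CVPixelBuffer?) -> Void) {
    guard let cgImage = image.cgImage,
          let coreMLModel = try? DepthEstimator(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // A depth model's output arrives as a single-channel pixel buffer:
        // each value is the estimated (relative) depth of that pixel.
        let depthMap = (request.results?.first as? VNPixelBufferObservation)?.pixelBuffer
        completion(depthMap)
    }
    request.imageCropAndScaleOption = .scaleFill

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

Because the depth map is estimated from a single frame rather than measured, it is relative rather than metric, which is all the blur effect needs.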
As Halide developer Ben Sandofsky points out, the iPhone XR also has a single-lens camera with Portrait Mode support, but the XR gets its depth information from hardware, using the sensor's focus pixels. That isn't possible on the iPhone SE because its older camera sensor doesn't support the feature.
Halide discovered that unlike other iPhones, the iPhone SE can take a picture of another picture and attempt to derive a depth map from it. The app was even able to photograph an old slide, adding depth effects to a 50-year-old photo.
The iPhone SE's Portrait Mode is somewhat limited in that it only works with people, a constraint of the neural network that powers the feature. When a Portrait Mode image without a person is captured, the effect fails in various ways because the network can't estimate an accurate depth map.
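To see why a people-only network implies this limit, consider the gating step such a pipeline would need: if nothing in the frame looks like a person, there is no reliable depth to estimate, so the sensible move is to refuse the effect. The check below is purely illustrative, built on Vision's public human-detection request rather than Apple's internal Portrait pipeline.

```swift
import Vision

// Illustrative gate, not Apple's actual pipeline: a people-only depth
// network has nothing reliable to say about a frame with no person in it,
// so a reasonable pipeline detects a human before applying the effect.
func canApplyPortraitEffect(to cgImage: CGImage) -> Bool {
    let request = VNDetectHumanRectanglesRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
    // No detected humans means no trustworthy depth estimate.
    return !(request.results ?? []).isEmpty
}
```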
The iPhone XR also limited Portrait Mode to people alone, and using Portrait Mode with other subjects requires upgrading to one of Apple's more expensive multi-camera phones.
According to Halide, the depth map on the iPhone SE (or any phone with Portrait Mode) can be viewed by shooting in the Halide app's Depth mode. Halide's full breakdown of the iPhone SE's Portrait Mode can be read over on the Halide website.
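For readers who want to inspect a depth map without Halide, Portrait photos embed their depth (disparity) data as auxiliary image data that any app can read through Apple's public ImageIO and AVFoundation APIs. A minimal sketch, where the URL points at any Portrait Mode photo file:

```swift
import AVFoundation
import ImageIO

// Reads the disparity map a Portrait Mode photo embeds as auxiliary data.
// Returns nil if the file carries no embedded depth information.
func loadDepthData(from url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    return try? AVDepthData(fromDictionaryRepresentation: auxInfo)
}
```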