by marklit on 6/4/25, 5:57 PM with 95 comments
by Uncorrelated on 6/4/25, 8:13 PM
1. The original method uses two cameras on the back, taking a picture from both simultaneously and using parallax to construct a depth map, similar to human vision. This was introduced on the iPhone 7 Plus, the first iPhone with two rear cameras (a 1x main camera and a 2x telephoto camera). Since the depth map depends on comparing the two images, it is naturally limited to the field of view of the narrower lens.
2. A second method was later used on iPhone XR, which has only a single rear camera, using focus pixels on the sensor to roughly gauge depth. The raw result is low-res and imprecise, so it's refined using machine learning. See: https://www.lux.camera/iphone-xr-a-deep-dive-into-depth/
3. An extension of this method was used on an iPhone SE that didn't even have focus pixels, producing depth maps purely based on machine learning. As you would expect, such depth maps have the least correlation to reality, and the system could be fooled by taking a picture of a picture. See: https://www.lux.camera/iphone-se-the-one-eyed-king/
4. The fourth method is used for selfies on iPhones with Face ID; it uses the TrueDepth camera's 3D scanning to produce a depth map. You can see this with the selfie in the article; it has a noticeably fuzzier, lower-res look.
You can also see some other auxiliary images in the article, which use white to indicate the human subject, glasses, hair, and skin. Apple calls these Portrait Effects Mattes, and they are produced using machine learning.
I made an app that used the depth maps and portrait effects mattes from Portraits for some creative filters. It was pretty fun, but it's no longer available. There are a lot of novel artistic possibilities for depth maps.
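If you want to poke at these yourself, here's a minimal sketch of pulling the depth map and Portrait Effects Matte out of a Portrait-mode HEIC with Apple's ImageIO and AVFoundation APIs; the file path is a placeholder and error handling is omitted, so treat it as a starting point rather than the article's method:

    import AVFoundation
    import ImageIO

    // Placeholder path; any Portrait-mode HEIC should work.
    let url = URL(fileURLWithPath: "portrait.heic") as CFURL
    guard let source = CGImageSourceCreateWithURL(url, nil) else { fatalError("can't open file") }

    // Depth is usually stored as disparity; fall back to plain depth if it's absent.
    let depthInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeDisparity)
        ?? CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeDepth)
    if let dict = depthInfo as? [AnyHashable: Any],
       let depth = try? AVDepthData(fromDictionaryRepresentation: dict) {
        let map = depth.depthDataMap  // CVPixelBuffer, much lower resolution than the photo itself
        print("depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }

    // The person segmentation layer is the Portrait Effects Matte.
    if let dict = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any],
       let matte = try? AVPortraitEffectsMatte(fromDictionaryRepresentation: dict) {
        let buf = matte.mattingImage
        print("matte: \(CVPixelBufferGetWidth(buf)) x \(CVPixelBufferGetHeight(buf))")
    }

AVDepthData's depthDataType tells you whether you actually got disparity or depth, and converting(toDepthDataType:) will convert between the two if a filter expects one or the other.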
by caseyohara on 6/4/25, 6:28 PM
I think there might be a few typos of the file format?
- 14 instances of "HEIC"
- 3 instances of "HIEC"
by andrewmcwatters on 6/4/25, 6:32 PM
Just in case you were doing 3D modeling work or photogrammetry and wanted to know, like I was.
by heliographe on 6/4/25, 10:24 PM
As another commenter pointed out, they used to be captured only in Portrait mode, but on recent iPhones they get captured automatically pretty much whenever a subject (human or pet) is detected in the scene.
I make photography apps & tools (https://heliographe.net), and one of the tools I built, Matte Viewer, is specifically for viewing & exporting them: https://apps.apple.com/us/app/matte-viewer/id6476831058
by onlygoose on 6/4/25, 7:10 PM
by kccqzy on 6/4/25, 9:40 PM
FWIW I personally hate how iPhones display HDR (they push the screen brightness above the maximum user-specified brightness), and in my own pictures I try to strip the HDR gain maps. I still remember when HDR meant taking three photos and stitching them together while discarding the under- and overexposed parts; the resulting image doesn't carry any information about its HDR-ness.
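One way to do that stripping on a Mac is simply to re-encode just the primary image with ImageIO and not copy any auxiliary data across; the gain map only survives if you explicitly add it back. A rough sketch, with placeholder paths, and with the caveat that it re-compresses the photo and drops every other auxiliary image too:

    import Foundation
    import ImageIO

    let inURL  = URL(fileURLWithPath: "IMG_0001.heic") as CFURL
    let outURL = URL(fileURLWithPath: "IMG_0001_sdr.heic") as CFURL

    guard let source = CGImageSourceCreateWithURL(inURL, nil),
          let image  = CGImageSourceCreateImageAtIndex(source, 0, nil),
          let dest   = CGImageDestinationCreateWithURL(outURL, "public.heic" as CFString, 1, nil)
    else { fatalError("couldn't open or create file") }

    // Keep the primary image's EXIF/TIFF metadata, but no auxiliary images (depth, matte, gain map).
    let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
    CGImageDestinationAddImage(dest, image, props)
    CGImageDestinationFinalize(dest)

This sidesteps the question of exactly where the gain map lives in the container, at the cost of a generational re-encode.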
by arialdomartini on 6/4/25, 8:16 PM
by kawsper on 6/4/25, 10:51 PM
by praveen9920 on 6/5/25, 6:08 PM
by itsgrimetime on 6/4/25, 6:34 PM
Anyways, never heard of oiiotool before! Super cool
by yieldcrv on 6/4/25, 9:44 PM
Chimera
The old GPU is an aberration and an odd place to skimp. If he upgraded to a newer Nvidia GPU it would have Linux driver support and he could ditch Windows entirely.
And if he weren't married to ArcGIS he could just get a Mac Studio.
by ziofill on 6/5/25, 4:01 AM
by layer8 on 6/4/25, 7:05 PM
by cloud_herder on 6/5/25, 1:38 AM
by just-working on 6/4/25, 7:47 PM
by pzo on 6/5/25, 5:05 AM
LiDAR is a letdown. First, I would have expected LiDAR to trickle down to non-Pro devices. Come on, Apple: Face ID was introduced in the iPhone X and the next year it was in every iPhone model. LiDAR was introduced in the iPhone 12 Pro and still only Pro devices have it. As a 3rd-party dev, that makes me reluctant to build any app around it if it cuts my potential user base by 50%.
I'm also disappointed they haven't improved Face ID or LiDAR in the last ~5 years (TrueDepth is still only 30 fps, there's no camera format that mixes 30 fps depth with 120 fps RGB, latency is still high, LiDAR resolution is still low, and the field of view hasn't improved).
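If you want to see those limits for yourself, a quick sketch using standard AVFoundation lists every depth format the TrueDepth camera advertises, with its resolution and maximum frame rate (run it on a Face ID iPhone):

    import AVFoundation

    // Enumerate the depth formats the front TrueDepth camera offers.
    if let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front) {
        for format in device.formats {
            for depthFormat in format.supportedDepthDataFormats {
                let dims = CMVideoFormatDescriptionGetDimensions(depthFormat.formatDescription)
                let maxFPS = depthFormat.videoSupportedFrameRateRanges.map(\.maxFrameRate).max() ?? 0
                print("depth \(dims.width)x\(dims.height) @ up to \(maxFPS) fps")
            }
        }
    }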
by wahnfrieden on 6/4/25, 7:13 PM
by 1oooqooq on 6/4/25, 7:37 PM
> I'm running Ubuntu 24 LTS via Microsoft's Ubuntu for Windows on Windows 11 Pro
this is like hearing about someone buying yet another automatic supercar.