The iPhone X face mapping: Privacy risks (none) and how it works (very cool)

Lots of interesting tidbits in the linked TechCrunch article. Here are a few highlights:

As we’ve covered off in detail before, Apple does not have access to the depth-mapped facial blueprints that users enroll when they register for Face ID. A mathematical model of the iPhone X user’s face is encrypted and stored locally on the device in a Secure Enclave.

Face ID also learns over time, and some additional mathematical representations of the user’s face may also be created and stored in the Secure Enclave during day-to-day use.

And:

The key point here is that Face ID data never leaves the user’s phone (or indeed the Secure Enclave). And any iOS app developers wanting to incorporate Face ID authentication into their apps do not gain access to it either. Rather, authentication happens via a dedicated authentication API that only returns a positive or negative response after comparing the input signal with the Face ID data stored in the Secure Enclave.
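
To make that concrete, here’s a rough sketch of what a third-party app sees when it gates a feature behind Face ID using Apple’s LocalAuthentication framework. The function name and reason string are made up for illustration, but the point stands: the API hands back only a success or failure result, never any face data.

```swift
import LocalAuthentication

// Hypothetical example: unlocking an in-app feature behind Face ID
// (or Touch ID on older devices). The app never touches biometric data;
// it only receives a yes/no answer from the system.
func unlockSensitiveFeature() {
    let context = LAContext()
    var error: NSError?

    // Check whether biometric authentication is available on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown error")")
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, evaluationError in
        if success {
            // Face ID matched. No face data is ever returned to the app.
            print("Authenticated")
        } else {
            print("Authentication failed: \(evaluationError?.localizedDescription ?? "unknown error")")
        }
    }
}
```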

Some people have pointed to the detailed face mapping accessible via ARKit and raised privacy concerns about third-party developers’ access to that data:

With the iPhone X, developers can access ARKit for face-tracking to power their own face-augmenting experiences — such as the already showcased face-masks in the Snap app.

“This new ability enables robust face detection and positional tracking in six degrees of freedom. Facial expressions are also tracked in real-time, and your apps are provided with a fitted triangle mesh and weighted parameters representing over 50 specific muscle movements of the detected face,” writes Apple.

And:

Now it’s worth emphasizing that developers using this API are not getting access to every datapoint the TrueDepth camera system can capture. This is also not literally recreating the Face ID model that’s locked up in the Secure Enclave — and which Apple touts as being accurate enough to have a failure rate as small as one in one million times.

The data being shared via ARKit is a small sample of what’s used for Face ID, and it is missing key details, like attention detection data.
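
For a sense of what that looks like in practice, here’s a minimal sketch of the ARKit face-tracking API Apple describes (the class name and outlet are hypothetical). What the developer receives is the fitted triangle mesh and per-expression blend shape weights, not the raw TrueDepth data and not the Face ID model.

```swift
import UIKit
import ARKit
import SceneKit

// Illustrative view controller running an ARKit face-tracking session.
class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever the tracked face anchor updates.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Fitted triangle mesh of the detected face.
        let vertexCount = faceAnchor.geometry.vertices.count

        // Weighted parameters for specific muscle movements,
        // e.g. how open the jaw is, as a value between 0.0 and 1.0.
        let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0

        print("mesh vertices: \(vertexCount), jawOpen: \(jawOpen)")
    }
}
```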

The linked article goes into a lot more detail and is an interesting read.

As to how Face ID works, take a look at the video embedded in this tweet from The Verge’s Nilay Patel. Before you click, note that there are flashing lights that might trigger a reaction in some people. The embedded video really gives a sense of the dot projector at work. Fascinating stuff.