Tracking is not as easy as it sounds. As you move around, you may wander meters away from the object, and it probably won't be in your camera view anymore. But if you then turn around and come back, you'd expect it to stay where you last saw it.

These are computationally expensive calculations, so shortcuts are needed to balance real-time frame rates against accuracy. Because of this, the position of the object may shift over time, and you may find an anchor in a slightly different place than where you last saw it. This difference is called drift.
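In practice, drift is just the displacement between where you left an anchor and where tracking places it when you return. A minimal sketch of measuring it, with positions simplified to three floats (ARKit actually stores a full 4x4 transform per anchor):

```swift
import Foundation

// Hypothetical helper: Euclidean distance between the anchor position you
// saved earlier and the position tracking reports now.
func drift(saved: (Float, Float, Float), current: (Float, Float, Float)) -> Float {
    let dx = current.0 - saved.0
    let dy = current.1 - saved.1
    let dz = current.2 - saved.2
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

// The anchor was left at the origin; after walking away and back,
// it has shifted roughly 2 cm along the x axis.
print(drift(saved: (0, 0, 0), current: (0.02, 0, 0)))  // roughly 0.02 meters
```

If the measured drift exceeds what your app can tolerate, a common response is to re-anchor the content rather than trust the stale position.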
The face tracking system is driven by a set of numbers that track the movements of the eyes, eyebrows, jaw, mouth, and so on. These numbers are called coefficients. Together, the coefficients describe the face's topology, which is then used to create a geometry called the ARFaceGeometry object.
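To make this concrete, here is a sketch of how such coefficients might be interpreted. In ARKit these arrive each frame as `ARFaceAnchor.blendShapes`, each valued from 0.0 (neutral) to 1.0 (fully expressed); the sketch models them as a plain dictionary so the logic runs anywhere, and the `expression(from:)` helper is an assumption for illustration, not an ARKit API:

```swift
import Foundation

// Coefficients range from 0.0 (neutral) to 1.0 (fully expressed).
// Keys here mirror ARKit's blend shape names (mouthSmileLeft, jawOpen, ...),
// but are modeled as plain strings so this runs off-device.
typealias BlendShapes = [String: Float]

// Hypothetical helper: classify a coarse expression from a few coefficients.
func expression(from shapes: BlendShapes) -> String {
    let smile = ((shapes["mouthSmileLeft"] ?? 0) + (shapes["mouthSmileRight"] ?? 0)) / 2
    let jawOpen = shapes["jawOpen"] ?? 0
    if jawOpen > 0.5 { return "surprised" }
    if smile > 0.5 { return "smiling" }
    return "neutral"
}

print(expression(from: ["mouthSmileLeft": 0.8, "mouthSmileRight": 0.7]))  // smiling
```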

The face mesh geometry is an important piece of the face tracking system. You can place a face mask at the anchor location, the kind of mask that actors use in theater for acting training. This mask is readily available as a 3D mesh in Reality Composer. ARKit keeps the mask registered to the face by tracking the underlying anchor as the user moves around.
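The mechanism behind this can be sketched in a few lines: the mask stays glued to the face because, every frame, it re-reads the transform of the anchor it is attached to. This is a simplified model (positions as three floats instead of ARKit's 4x4 transforms, and `MaskEntity` is an illustrative type, not a RealityKit class):

```swift
import Foundation

// Tracking updates the anchor; attached content just follows it.
struct Anchor {
    var position: (x: Float, y: Float, z: Float)
}

struct MaskEntity {
    var position: (x: Float, y: Float, z: Float) = (0, 0, 0)

    // Called once per frame: follow the anchor wherever tracking puts it.
    mutating func update(following anchor: Anchor) {
        position = anchor.position
    }
}

var face = Anchor(position: (0, 0, -0.3))  // ~30 cm in front of the camera
var mask = MaskEntity()
mask.update(following: face)

// The user moves; tracking updates the anchor, and the mask follows
// on the next frame without any extra work from the app.
face.position = (0.1, 0.02, -0.25)
mask.update(following: face)
```

This is why, as a developer, you attach content to anchors rather than to fixed world coordinates: the system does the per-frame repositioning for you.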
The developer documentation is also a great resource for coding examples that you can try out on your own.
AR Face Blend
Published: