Unity has been at the forefront of support for all things augmented reality, and that’s no different when it comes to Apple’s ARKit platform. ARKit support launched the day the SDK was announced, and face tracking followed shortly after with the release of the iPhone X.
The iPhone X’s front-facing camera supports a variety of face tracking features, all of which are clearly laid out with documentation and sample scenes in the ARKit plugin. This is crucial for getting a head start and experimenting with face tracking apps.
For this write-up I’ll walk through the steps I used to create two small applications: one that overlays 3D models on a face, and another that activates and streams particles out of a user’s mouth when it is open.
For each of these apps, I started by duplicating an example scene and building it to my device to see what it did.
Overlaying 3D models on a face
Starting scene: FaceAnchorScene
This scene is fairly simple, but it does have a few complexities and requires a bit of trial and error to get the 3D elements lined up well. The basic idea is to parent your 3D elements to the AxesPrefab in the scene.
The AxesPrefab is assigned to the face anchor from the ARKit SDK through UnityARFaceAnchorManager.cs which is a component of the ARFaceAnchorManager object in the scene.
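To make the anchoring concrete, here is a condensed sketch of the pattern UnityARFaceAnchorManager.cs follows. The event and helper names (ARFaceAnchorAddedEvent, UnityARMatrixOps, and friends) come from the ARKit plugin, but check them against your plugin version, since the API has shifted between releases:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Hedged sketch of the face anchor pattern from the ARKit plugin —
// not the exact shipped script; verify names against your plugin version.
public class FaceAnchorFollower : MonoBehaviour
{
    // Assign the AxesPrefab instance (with your hat/mustache models as children).
    public GameObject anchorPrefab;

    void Start()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;
    }

    void FaceAdded(ARFaceAnchor anchorData)
    {
        // Convert the anchor's ARKit transform into Unity world space.
        anchorPrefab.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        anchorPrefab.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
        anchorPrefab.SetActive(true);
    }

    void FaceUpdated(ARFaceAnchor anchorData)
    {
        // Track the face every frame the anchor updates.
        anchorPrefab.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        anchorPrefab.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }

    void FaceRemoved(ARFaceAnchor anchorData)
    {
        anchorPrefab.SetActive(false);
    }
}
```

Because your models are children of the prefab, they inherit this position and rotation automatically.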
Here is a screenshot that shows the scale and position I found works well with my face when adding 3D geometry. I added and scaled down a hat and mustache model, then made them children of the AxesPrefab object.
You can then delete or turn off the prefab’s colored axis rectangles once you’ve positioned your geometry.
If you built the scene right now, you would get results similar to this: all of the 3D geometry is rendered on top of the 2D camera feed.
To fix this, we will use another feature of the ARKit SDK along with a special shader that’s already included in the ARKit plugin.
The FaceMeshScene has an object named ARFaceMeshManager that creates a mesh of the user’s face at runtime. You will need to bring it into this scene or recreate it as shown in the screenshot. Note that the material has been changed from the default Unity shader to the occlusionPlaneMaterial material.
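For reference, this is roughly how the runtime face mesh gets built. It mirrors the plugin’s face mesh manager; the faceGeometry property names are from the ARKit plugin but may differ slightly by version. The MeshRenderer on this object is what you point at occlusionPlaneMaterial:

```csharp
// Hedged sketch of the runtime face mesh update (based on the ARKit
// plugin's face mesh manager — verify names against your version).
// Assumes a MeshFilter and a MeshRenderer using occlusionPlaneMaterial.
Mesh faceMesh;

void FaceUpdated(ARFaceAnchor anchorData)
{
    // Copy the ARKit face geometry into the Unity mesh each update.
    faceMesh.vertices = anchorData.faceGeometry.vertices;
    faceMesh.uv = anchorData.faceGeometry.textureCoordinates;
    faceMesh.triangles = anchorData.faceGeometry.triangleIndices;
    faceMesh.RecalculateNormals();
}
```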
The object should look like this.
Here is an image of the facial geometry that is created at runtime with the default Unity material applied. With the occlusion material applied instead, all objects behind the face are occluded.
You may have noticed the eyeballs do not have any mesh covering them. Depending on what objects you add to your anchor, this can produce some odd results. To fix this, I added two spheres with the occlusionPlaneMaterial at the approximate positions of the eyes.
The final result looks like this.
Activating particles out of your mouth
Starting scene: FaceBlendshapeScene
The main concept for this scene is to drive a GameObject’s active state with the value of a blend shape from the ARKit SDK.
To do this I wrote a small script that hooks into the ARKit face anchor callbacks and looks for a specific blend shape by name. The blend shape gives me a float value, which I check against a threshold to toggle the active state of an object. For my demo I am using the jawOpen coefficient.
The full script can be found here.
For the object setup, I entered the name of the blend shape the script should observe, then assigned a reference to the object it will activate and deactivate (in this case, a particle effect).
The final step is to properly line up your toggle object in the scene. For this example I want the particles coming out of the user’s mouth, so I positioned the object (RainBow) as a child of the ARFaceMeshManager, as shown in the image below. This allows the particle system to track the user’s head position and rotation.
There are a couple of additional steps you can take to enhance the effect and scene. The first is to add the occlusion mask I referenced in the 3D models example; whether it helps depends on the type of effect you are creating. In this case I am emitting particles from my mouth, so the mask hides the particles inside my mouth and around my lips. Another improvement for this specific example is to replace the active-state toggle with calls to Play() and Stop() on the particle system, which gives a smoother transition between states.
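The Play()/Stop() variation might look like the following. This is a hedged sketch, not the demo code: ParticleSystem.Play() and Stop() are standard Unity APIs, and Stop() lets already-emitted particles finish their lifetime instead of vanishing the way SetActive(false) makes them:

```csharp
using UnityEngine;

// Hypothetical helper showing the Play()/Stop() approach instead of
// toggling the GameObject's active state.
public class MouthParticleToggle : MonoBehaviour
{
    public ParticleSystem mouthParticles; // assign the RainBow effect

    // Call this from the blend shape check instead of SetActive().
    public void SetMouthOpen(bool open)
    {
        if (open && !mouthParticles.isPlaying)
            mouthParticles.Play();  // start emitting
        else if (!open && mouthParticles.isPlaying)
            mouthParticles.Stop();  // stop emitting; live particles fade out
    }
}
```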
The final results look like this.
For more updates on AR and VR development with Unity, follow me on Twitter @DanMillerDev