AR Face Tracking Tutorial for iOS: Getting Started

In this tutorial, you’ll learn how to use AR Face Tracking to track your face using a TrueDepth camera, overlay emoji on your tracked face, and manipulate the emoji based on facial expressions you make.

By Yono Mittlefehldt


Adding a Mesh Mask

On the surface, it looks like you’ve only turned on the front-facing camera. However, what you don’t see is that your iPhone is already tracking your face. Creepy, little iPhone.

Wouldn’t it be nice to see what the iPhone is tracking? What a coincidence, because that’s exactly what you’re going to do next!

Add the following code after the closing brace for the EmojiBlingViewController class definition:

// 1
extension EmojiBlingViewController: ARSCNViewDelegate {
  // 2
  func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    
    // 3
    guard let device = sceneView.device else {
      return nil
    }
    
    // 4
    let faceGeometry = ARSCNFaceGeometry(device: device)
    
    // 5
    let node = SCNNode(geometry: faceGeometry)
    
    // 6
    node.geometry?.firstMaterial?.fillMode = .lines
    
    // 7
    return node
  }
}

In this code you:

  1. Declare that EmojiBlingViewController implements the ARSCNViewDelegate protocol.
  2. Define the renderer(_:nodeFor:) method from the protocol.
  3. Ensure the Metal device used for rendering is not nil.
  4. Create a face geometry to be rendered by the Metal device.
  5. Create a SceneKit node based on the face geometry.
  6. Set the fill mode for the node’s material to be just lines.
  7. Return the node.
Note: ARSCNFaceGeometry is only available in SceneKit views rendered using Metal, which is why you needed to pass in the Metal device during its initialization. Also, this code will only compile if you’re targeting real hardware; it will not compile if you target a simulator.
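
If you want the project to keep compiling when a simulator destination is selected anyway (say, on a CI machine), one option is Swift’s conditional compilation. Here’s a minimal, hypothetical sketch of the idea applied to the body of renderer(_:nodeFor:) after the guard:

#if targetEnvironment(simulator)
// ARSCNFaceGeometry can't be used here, so return no node at all.
return nil
#else
let faceGeometry = ARSCNFaceGeometry(device: device)
let node = SCNNode(geometry: faceGeometry)
node.geometry?.firstMaterial?.fillMode = .lines
return node
#endif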

Before you can run this, you need to set this class to be the ARSCNView's delegate.

At the end of the viewDidLoad() function, add:

sceneView.delegate = self
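
As a reminder from earlier in the tutorial, the face-tracking session itself is started when the view appears; the setup looks roughly like this (your starter project may differ slightly):

override func viewWillAppear(_ animated: Bool) {
  super.viewWillAppear(animated)

  // Face tracking requires a device with a TrueDepth camera.
  let configuration = ARFaceTrackingConfiguration()
  sceneView.session.run(configuration)
}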

OK, time for everyone’s favorite step. Build and run that app!

Mesh Mask

Updating the Mesh Mask

Did you notice how the mesh mask is a bit… static? Sure, when you move your head around, it tracks your facial position and moves along with it, but what happens when you blink or open your mouth? Nothing.

How disappointing.

Luckily, this is easy to fix. You just need to add another ARSCNViewDelegate method!

At the end of your ARSCNViewDelegate extension, add the following method:

// 1
func renderer(
  _ renderer: SCNSceneRenderer, 
  didUpdate node: SCNNode, 
  for anchor: ARAnchor) {
   
  // 2
  guard let faceAnchor = anchor as? ARFaceAnchor,
    let faceGeometry = node.geometry as? ARSCNFaceGeometry else {
      return
  }
    
  // 3
  faceGeometry.update(from: faceAnchor.geometry)
}

Here, you:

  1. Define the didUpdate version of the renderer(_:didUpdate:for:) protocol method, which is called whenever a node has been updated to match its anchor.
  2. Ensure the anchor being updated is an ARFaceAnchor and that the node’s geometry is an ARSCNFaceGeometry.
  3. Update the ARSCNFaceGeometry using the ARFaceAnchor’s ARFaceGeometry.

Now, when you build and run, you should see the mesh mask form and change to match your facial expressions.

Updating Mesh Mask

Emoji Bling

If you haven’t already done so, go ahead and download the materials for this tutorial via the button at the top or bottom of the tutorial.

Inside, you’ll find a folder called SuperUsefulCode with some Swift files. Drag them into your project just below EmojiBlingViewController.swift. Select Copy items if needed and Create groups, and make sure the Emoji Bling target is selected.

StringExtension.swift includes an extension to String that can convert a String to a UIImage.
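
The exact API in the materials may differ, but a minimal sketch of the idea, with a hypothetical image() method, could look like this:

import UIKit

extension String {
  // Hypothetical sketch: draw the string (typically a single emoji)
  // into an image using the system font.
  func image(fontSize: CGFloat = 100) -> UIImage? {
    let attributes: [NSAttributedString.Key: Any] = [
      .font: UIFont.systemFont(ofSize: fontSize)
    ]
    let size = self.size(withAttributes: attributes)
    return UIGraphicsImageRenderer(size: size).image { _ in
      self.draw(at: .zero, withAttributes: attributes)
    }
  }
}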

EmojiNode.swift contains a subclass of SCNNode called EmojiNode, which can render a String. It takes an array of Strings and can cycle through them as desired.

Feel free to explore the two files, but a deep dive into how this code works is beyond the scope of this tutorial.
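
Still, it helps to know what surface area the rest of this tutorial relies on. Here’s a simplified, hypothetical sketch of EmojiNode, not the actual file; the plane size and property names are assumptions:

import SceneKit
import UIKit

class EmojiNode: SCNNode {
  var options: [String]
  var index = 0

  init(with options: [String], width: CGFloat = 0.06, height: CGFloat = 0.06) {
    self.options = options
    super.init()
    // Draw the first emoji onto a small plane attached to this node.
    // image() here is the String extension sketched above.
    let plane = SCNPlane(width: width, height: height)
    plane.firstMaterial?.diffuse.contents = options.first?.image()
    plane.firstMaterial?.isDoubleSided = true
    geometry = plane
  }

  required init?(coder: NSCoder) {
    fatalError("init(coder:) has not been implemented")
  }

  // Set the node's position to the center (average) of the given vertices.
  func updatePosition(for vectors: [vector_float3]) {
    let center = vectors.reduce(vector_float3(0, 0, 0), +) / Float(vectors.count)
    simdPosition = center
  }
}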

With that out of the way, it’s time to augment your nose. Not that there’s anything wrong with it. You’re already such a beautiful person. :]

At the top of your EmojiBlingViewController class, define the following constant:

let noseOptions = ["👃", "🐽", "💧", " "]

The blank space at the end of the array is so that you have the option to clear out the nose job. Feel free to choose other nose options, if you want.

Next, add the following helper function to your EmojiBlingViewController class:

func updateFeatures(for node: SCNNode, using anchor: ARFaceAnchor) {
  // 1
  let child = node.childNode(withName: "nose", recursively: false) as? EmojiNode

  // 2
  let vertices = [anchor.geometry.vertices[9]]
  
  // 3
  child?.updatePosition(for: vertices)
}

Here, you:

  1. Search node for a child node named “nose” that is of type EmojiNode.
  2. Get the vertex at index 9 from the ARFaceGeometry property of the ARFaceAnchor and put it into an array.
  3. Use a member method of EmojiNode to update its position based on the vertex. This updatePosition(for:) method takes an array of vertices and sets the node’s position to their center.
Note: So where did index 9 come from? It’s a magic number. The ARFaceGeometry has 1220 vertices in it, and index 9 happens to land on the nose. This works for now, but you’ll briefly read later about the dangers of using these index constants and what you can do about them.
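
If you’re curious where another index lands, a quick throwaway experiment (hypothetical, not part of the finished app) is to log a vertex from inside updateFeatures(for:using:):

// Hypothetical debug line: prints the vertex position in face-local coordinates.
print("Vertex 9: \(anchor.geometry.vertices[9])")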

It might seem silly to have a helper function to update a single node, but you will beef up this function later and rely heavily on it.

Now you just need to add an EmojiNode to your face node. Add the following code just before the return statement in your renderer(_:nodeFor:) method:

// 1
node.geometry?.firstMaterial?.transparency = 0.0

// 2
let noseNode = EmojiNode(with: noseOptions)

// 3
noseNode.name = "nose"

// 4
node.addChildNode(noseNode)

// 5
updateFeatures(for: node, using: faceAnchor)

In this code, you:

  1. Hide the mesh mask by making it transparent.
  2. Create an EmojiNode using your defined nose options.
  3. Name the nose node, so it can be found later.
  4. Add the nose node to the face node.
  5. Call your helper function that repositions facial features.

You’ll notice a compiler error because faceAnchor is not defined. To fix this, change the guard statement at the top of the same method to the following:

guard let faceAnchor = anchor as? ARFaceAnchor,
  let device = sceneView.device else {
  return nil
}
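
With those edits in place, the complete renderer(_:nodeFor:) method should now read:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
  guard let faceAnchor = anchor as? ARFaceAnchor,
    let device = sceneView.device else {
    return nil
  }

  let faceGeometry = ARSCNFaceGeometry(device: device)
  let node = SCNNode(geometry: faceGeometry)
  node.geometry?.firstMaterial?.fillMode = .lines
  node.geometry?.firstMaterial?.transparency = 0.0

  let noseNode = EmojiNode(with: noseOptions)
  noseNode.name = "nose"
  node.addChildNode(noseNode)

  updateFeatures(for: node, using: faceAnchor)

  return node
}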

There is one more thing you should do before running your app. In renderer(_:didUpdate:for:), add a call to updateFeatures(for:using:) just before the closing brace:

updateFeatures(for: node, using: faceAnchor)        

This will ensure that, when you scrunch your face up or wiggle your nose, the emoji’s position will update along with your motions.
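
For reference, the full renderer(_:didUpdate:for:) now looks like this:

func renderer(
  _ renderer: SCNSceneRenderer,
  didUpdate node: SCNNode,
  for anchor: ARAnchor) {

  guard let faceAnchor = anchor as? ARFaceAnchor,
    let faceGeometry = node.geometry as? ARSCNFaceGeometry else {
      return
  }

  faceGeometry.update(from: faceAnchor.geometry)
  updateFeatures(for: node, using: faceAnchor)
}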

Now it’s time to build and run!

Nose job