AVFoundation Tutorial: Adding Overlays and Animations to Videos

In this AVFoundation tutorial, you’ll learn how to add overlays and animations to videos by using AVVideoCompositionCoreAnimationTool, which allows you to combine CALayers with videos to add backgrounds and overlays. By Marin Bencevic.

Core Animation – The Star of the Show

Your backgrounds and overlays will all be CALayers. CALayer is the primary class of a framework called Core Animation.

Core Animation is behind every view in your app, responsible for drawing and animating its contents. Since a CALayer backs every view, everything that’s drawn on your phone’s screen is a layer.
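
If you haven’t worked with CALayer directly before, it helps to know that every UIView exposes its backing layer through its layer property. Here’s a quick throwaway sketch, not part of the birthday card project, using a hypothetical someView:

import UIKit

// Every UIView is backed by a CALayer; you reach it through `layer`.
// `someView` is only a placeholder view for illustration.
let someView = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
someView.layer.cornerRadius = 8    // rounds the view's corners
someView.layer.borderWidth = 2     // draws a border around the view
someView.layer.borderColor = UIColor.systemPink.cgColor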

As their name suggests, you can draw layers below or on top of other layers, making them perfect for adding backgrounds or overlays to videos. You’ll leverage this with AVFoundation’s handy AVVideoCompositionCoreAnimationTool class. It acts as a bridge between a composition and Core Animation, letting you create a new video that applies CALayers to a video composition.

To get started, you’ll need three different layers. One layer is the background, drawn behind the video. The second layer draws the frames of the video. The third layer is an overlay layer drawn on top of the video layer.

[Diagram: AVVideoCompositionCoreAnimationTool with background, video and overlay CALayers]

Layering the Cake

Create the three layers by adding the following code to the end of makeBirthdayCard(fromVideoAt:forName:onComplete:):

let backgroundLayer = CALayer()
backgroundLayer.frame = CGRect(origin: .zero, size: videoSize)
let videoLayer = CALayer()
videoLayer.frame = CGRect(origin: .zero, size: videoSize)
let overlayLayer = CALayer()
overlayLayer.frame = CGRect(origin: .zero, size: videoSize)

Each layer will have the same frame, spanning the whole video.

Next, assemble all these layers into a single parent layer by adding this to the method:

let outputLayer = CALayer()
outputLayer.frame = CGRect(origin: .zero, size: videoSize)
outputLayer.addSublayer(backgroundLayer)
outputLayer.addSublayer(videoLayer)
outputLayer.addSublayer(overlayLayer)

Here, you create a new layer that will be the layer of your final composition. You first add the background layer, then the video layer and, finally, the overlay layer. The order matters: Sublayers you add later draw on top of sublayers you added earlier, so the video sits in front of the background and the overlay sits in front of the video.
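
If you’d rather make the stacking order explicit, CALayer also lets you insert sublayers at a specific index, where index 0 is the back of the stack. The following sketch is equivalent to the addSublayer(_:) calls above; it’s an alternative, not something you need to add:

// Index 0 draws first (at the back); higher indices draw on top.
outputLayer.insertSublayer(backgroundLayer, at: 0)
outputLayer.insertSublayer(videoLayer, at: 1)
outputLayer.insertSublayer(overlayLayer, at: 2)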

By now, you’re probably getting anxious to build the project and see the results. Unfortunately, you don’t have a video to show yet!

You have your layers set up, but you still have to use AVFoundation to export a video from these layers.

Exporting the Video

Now that you have all the layers, it’s time to use AVVideoCompositionCoreAnimationTool to combine them into a delicious video composition cake!

Creating a Video Composition

Add the following to the method:

let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = videoSize
videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
  postProcessingAsVideoLayer: videoLayer, 
  in: outputLayer)

First, you create a new AVMutableVideoComposition. The composition you created earlier was an AVComposition, which can hold video, audio and other types of tracks. An AVVideoComposition, on the other hand, only describes how video tracks are composited and rendered. In this case, you only need to deal with a single video track.

Next, you set the rendered video’s size to equal the original video’s size. frameDuration determines how long each frame lasts. By passing a CMTime with a value of 1 and a timescale of 30, you set the frame duration to 1/30 of a second, resulting in a video with a frame rate of 30 frames per second.
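
If the CMTime math feels opaque, you can sanity-check it yourself: a CMTime’s duration in seconds is simply its value divided by its timescale. Here’s a small standalone sketch, assuming you import CoreMedia:

import CoreMedia

// value / timescale = seconds, so 1 / 30 is one thirtieth of a second.
let frameDuration = CMTime(value: 1, timescale: 30)
print(frameDuration.seconds)        // 0.0333…
print(1 / frameDuration.seconds)    // roughly 30 frames per second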

The final touch for the video composition is adding an animation tool. This tool takes your assembled output layer and the video layer and renders the video track into the video layer.

Next, add the following code to add a video track to the video composition:

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(
  start: .zero, 
  duration: composition.duration)
videoComposition.instructions = [instruction]
let layerInstruction = compositionLayerInstruction(
  for: compositionTrack, 
  assetTrack: assetTrack)
instruction.layerInstructions = [layerInstruction]

Video compositions use a collection of instructions to determine what to show on the video at any time. In this case, you only need one instruction to show the assembled video over the whole duration of the composition.

Each instruction can have its own layer instructions that determine how to layer different video tracks. The starter project includes a helpful method called compositionLayerInstruction that returns the correct instructions for your video. These instructions tell the video to scale and rotate itself to match the original video’s size and orientation. Without this instruction, portrait videos would show up as landscape and result in a weird, stretched-out video.
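
The helper itself lives in the starter project, so it isn’t shown on this page. If you’re curious what it boils down to, here’s a minimal sketch of what such a helper could look like, assuming all it does is apply the asset track’s preferredTransform; the version in the starter project may differ in the details:

import AVFoundation

// A sketch of a layer-instruction helper. Applying the original track's
// preferredTransform preserves the video's orientation and rotation.
private func compositionLayerInstruction(
  for track: AVCompositionTrack,
  assetTrack: AVAssetTrack
) -> AVMutableVideoCompositionLayerInstruction {
  let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
  instruction.setTransform(assetTrack.preferredTransform, at: .zero)
  return instruction
}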

You now have all the required pieces: a new composition containing the original video and audio and a video composition that renders the video inside your cake of layers. You’ll combine these pieces in an export session.

Using an Export Session

Create an export session to render the video to a file:

guard let export = AVAssetExportSession(
  asset: composition, 
  presetName: AVAssetExportPresetHighestQuality) 
  else {
    print("Cannot create export session.")
    onComplete(nil)
    return
}

You create a new AVAssetExportSession by passing it your composition and using a preset for the highest quality video. Highest quality birthday cards for the highest quality friends. :]
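
Not every preset works with every asset on every device. If you want to be defensive, AVAssetExportSession can tell you which presets are compatible with a given asset. This check isn’t required for the tutorial; it’s just a small sketch:

// Ask AVFoundation which export presets can handle this composition.
let compatiblePresets = AVAssetExportSession.exportPresets(
  compatibleWith: composition)
if compatiblePresets.contains(AVAssetExportPresetHighestQuality) {
  print("Highest-quality export is available for this asset.")
}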

Now, create a file path and set up the export session:

let videoName = UUID().uuidString
let exportURL = URL(fileURLWithPath: NSTemporaryDirectory())
  .appendingPathComponent(videoName)
  .appendingPathExtension("mov")

export.videoComposition = videoComposition
export.outputFileType = .mov
export.outputURL = exportURL

Here, you use UUID to get a unique random string that you’ll use as the file name. You’ll save the file into a temporary directory for now; the user can decide to store it in the photo library later.

Then, you make sure the export session knows the file URL and extension and give it the video composition you created earlier to render the video.

Still in makeBirthdayCard(fromVideoAt:forName:onComplete:), initiate the export by adding the following:

export.exportAsynchronously {
  DispatchQueue.main.async {
    switch export.status {
    case .completed:
      onComplete(exportURL)
    default:
      print("Something went wrong during export.")
      print(export.error ?? "unknown error")
      onComplete(nil)
    }
  }
}

When you begin the export, it will eventually complete by calling the completion handler on a background thread. Make sure to dispatch to the main thread and, then, check the result of the export. If it completed successfully, call the completion handler with the exported file’s URL. If something goes wrong, you know the drill: Print out an error and complete with nil.

Phew, that was an intense coding session! Now, it’s finally time to build and run the project. Choose a video, enter a name and, after your device does some churning… you’ll see the same result as you did before.

[Screenshot: Exporting a video using AVAssetExportSession]

Yes, you did all this work for the exact same result. Except it’s not the same: The background and overlays are rendering; they’re just completely transparent right now because you haven’t added anything to them yet. So start adding things!