AVFoundation Tutorial: Adding Overlays and Animations to Videos

In this AVFoundation tutorial, you’ll learn how to add overlays and animations to videos using AVVideoCompositionCoreAnimationTool, which lets you combine CALayers with videos to add backgrounds and overlays. By Marin Bencevic.

Update note: Marin Bencevic updated this tutorial for Xcode 11, Swift 5 and iOS 13. Abdul Azeem Khan wrote the original.

If you’re making a camera app, you’ll benefit from adding overlays and animations to your videos. Whether you add the current date, location name, weather or just fun GIFs, your users will appreciate being able to customize their videos.

You can do all that with AVFoundation, Apple’s framework for manipulating audio and video content. You can think of AVFoundation as a programmatic video and audio editor, which lets you compose video and audio tracks, then add cool overlays to them.

In this AVFoundation tutorial, you’ll learn how to:

  • Add a custom border to your videos.
  • Add text and images to your videos.
  • Animate your video overlays.
  • Export your overlaid video into a file.

To get the most out of this tutorial, you’ll need to be familiar with iOS development. It would also be useful to have some familiarity with Core Animation. Don’t worry though, this tutorial will explain everything as you go along.

Ready? Lights, camera… action! :]

Getting Started

Start by downloading the starter project by clicking the Download Materials button at the top or bottom of the tutorial.

The project’s name is Cubica, which stands for Custom Birthday Cards. You’ll make an app that will let you record a video and add overlays and borders to turn it into a customized birthday card for your friends.

[Image: Using CAEmitterLayer to add confetti to a video]

Open the starter project in Xcode. You can use either the simulator or a device, but keep in mind that this tutorial requires a video file to work with. If you’re using the simulator, drag and drop a video file from your Mac onto the simulator window to add it to the photo library.

The starter project already has a screen where you can enter your friend’s name and pick a video to add overlays to. This all happens in PickerViewController.swift.

[Image: Custom birthday cards video picker screen]

Once the user picks a video, the app sends it to VideoEditor.swift. Currently, this file only has a couple of helper methods and a method called makeBirthdayCard(fromVideoAt:forName:onComplete:). You’ll edit this method to add overlays to the video.

Once the app adds overlays to the video, the method calls the completion handler and sends the video URL to PlayerViewController.swift. This view controller plays the video file and lets you export the file to your photo library.
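
For reference, here’s the rough shape of that method, inferred from the way this tutorial uses it. This is only a sketch, and the starter project’s actual declaration may differ slightly.

// Inferred sketch of the method you'll fill in; the starter's real
// declaration may differ.
func makeBirthdayCard(
  fromVideoAt videoURL: URL,
  forName name: String,
  onComplete: @escaping (URL?) -> Void
) {
  // For now, the method simply passes the original video through.
  onComplete(videoURL)
}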

Note: To learn how to pick and play video files, check out How to Play, Record, and Merge Videos in iOS and Swift.

I bet you can’t wait to send this to your friends! :]

Composing a Video

Before you can add any overlays to the video, you need to set some things up. You’ll create a new video file from the existing video, with backgrounds and overlays added on top.

First, you’ll create a new AVFoundation composition. You can think of a composition as a programmatic video editor. The composition holds different types of tracks, like audio and video tracks, and manages when they start or end in the timeline of the video.

Once you create an empty composition, you’ll add two tracks to the composition, one for the video and one for the audio. For the audio track, you’ll simply copy the existing video’s audio. To create the video, you’ll use AVVideoCompositionCoreAnimationTool, a class that lets you combine an existing video with Core Animation layers.
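
To give you a feel for where this is headed, here’s the general shape of that API. This is only a sketch: videoLayer and outputLayer are placeholder layers standing in for the ones you’ll set up properly later in the tutorial.

// Sketch only: videoLayer and outputLayer are placeholders, and a real
// video composition needs more configuration than shown here.
let videoLayer = CALayer()
let outputLayer = CALayer()

let videoComposition = AVMutableVideoComposition()
videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
  postProcessingAsVideoLayer: videoLayer,
  in: outputLayer)

The tool renders the video’s frames into videoLayer, then renders outputLayer, and everything inside it, to produce the final frames. That’s what lets your CALayer overlays appear on top of the video.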

Once you have both the combined video and the audio inside the composition, you’ll export the composition into a video file using AVAssetExportSession.
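
You’ll write that export code later in the tutorial. As a rough preview, it will look something like the sketch below; the output location and preset are illustrative choices, not necessarily what the finished project uses.

// Sketch of the export step. The file name and preset are assumptions.
guard let export = AVAssetExportSession(
  asset: composition,
  presetName: AVAssetExportPresetHighestQuality)
else {
  onComplete(nil)
  return
}

let exportURL = FileManager.default.temporaryDirectory
  .appendingPathComponent("birthdayCard.mov")

export.outputFileType = .mov
export.outputURL = exportURL
export.exportAsynchronously {
  DispatchQueue.main.async {
    if export.status == .completed {
      onComplete(exportURL)
    } else {
      print(export.error ?? "Export failed")
      onComplete(nil)
    }
  }
}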

Don’t worry, it’s not as daunting as it sounds! Your first step is to create the composition.

Creating a Composition

Open VideoEditor.swift. The meat of the sample project is inside makeBirthdayCard(fromVideoAt:forName:onComplete:). Currently, this method simply calls the completion handler with the existing video. Replace the line onComplete(videoURL) with the following:

let asset = AVURLAsset(url: videoURL)
let composition = AVMutableComposition()

Here, you create an AVAsset, which holds all the required information and data about the provided video. You also create an empty composition, which you’ll fill with the overlaid video.

Next, add a track to the composition and grab the video track from the asset by adding the following code to the method:

guard
  let compositionTrack = composition.addMutableTrack(
    withMediaType: .video,
    preferredTrackID: kCMPersistentTrackID_Invalid),
  let assetTrack = asset.tracks(withMediaType: .video).first
else {
  print("Something is wrong with the asset.")
  onComplete(nil)
  return
}

You add a new video track by calling addMutableTrack(withMediaType:preferredTrackID:) with a .video media type. Passing kCMPersistentTrackID_Invalid as the preferred track ID tells AVFoundation to generate an ID automatically, which is fine because you won’t need to refer to the track by ID later. You also grab the video track from the asset by taking its first, and only, video track. If either of these steps fails, print an error and call the completion handler with nil.

Now, add the following code at the end of makeBirthdayCard(fromVideoAt:forName:onComplete:) to insert the video track from the asset inside the composition’s video track:

do {
  // 1
  let timeRange = CMTimeRange(start: .zero, duration: asset.duration)
  // 2
  try compositionTrack.insertTimeRange(timeRange, of: assetTrack, at: .zero)
  
  // 3
  if let audioAssetTrack = asset.tracks(withMediaType: .audio).first,
    let compositionAudioTrack = composition.addMutableTrack(
      withMediaType: .audio, 
      preferredTrackID: kCMPersistentTrackID_Invalid) {
    try compositionAudioTrack.insertTimeRange(
      timeRange, 
      of: audioAssetTrack, 
      at: .zero)
  }
} catch {
  // 4
  print(error)
  onComplete(nil)
  return
}

Here’s what’s going on in the code above:

  1. CMTimeRange specifies time ranges inside videos. In this case, you want to add the video from the beginning to the end, so you make a range from zero to the duration of the video. You’ll see a variation on this range in the sketch after this list.
  2. Once you have the time range, you insert the whole video from the asset into your composition’s video track.
  3. If the asset also contains an audio track, do the same thing you just did for the audio track. First, add a new audio track to your composition and then insert the asset’s audio into the track.
  4. If you get an error, print it and call the completion handler with nil.
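
CMTimeRange gives you finer control than all or nothing, by the way. As a purely hypothetical variation on step two, this sketch would insert only the first three seconds of the asset, starting one second into the composition’s timeline. It belongs inside the same do-catch block, since insertTimeRange(_:of:at:) can throw.

// Hypothetical variation: insert just the first three seconds of the
// asset, one second into the composition's timeline.
let firstThreeSeconds = CMTimeRange(
  start: .zero,
  duration: CMTime(seconds: 3, preferredTimescale: 600))
try compositionTrack.insertTimeRange(
  firstThreeSeconds,
  of: assetTrack,
  at: CMTime(seconds: 1, preferredTimescale: 600))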

Setting Up the Composition

Next, take care of the sizing and orientation of your composition by adding this code at the end of the method:

compositionTrack.preferredTransform = assetTrack.preferredTransform
let videoInfo = orientation(from: assetTrack.preferredTransform)

let videoSize: CGSize
if videoInfo.isPortrait {
  videoSize = CGSize(
    width: assetTrack.naturalSize.height,
    height: assetTrack.naturalSize.width)
} else {
  videoSize = assetTrack.naturalSize
}

You first make sure the composition’s and the asset’s preferred transforms match. The starter project includes orientation(from:), which returns the video’s orientation, portrait or landscape, based on its preferred transform. If the orientation is portrait, you swap the width and height when calculating the video’s size. Otherwise, you use the asset track’s natural size as-is.
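
You don’t need to write orientation(from:) yourself, since it ships with the starter project. But to sketch the idea, a helper like this inspects the transform’s components to work out how the video was filmed. The starter’s exact implementation may differ; this version assumes UIKit is imported for UIImage.Orientation.

// Sketch of an orientation helper. The starter project's actual
// implementation may differ in its details.
private func orientation(from transform: CGAffineTransform)
  -> (orientation: UIImage.Orientation, isPortrait: Bool) {
  var assetOrientation = UIImage.Orientation.up
  var isPortrait = false
  if transform.a == 0 && transform.b == 1.0
    && transform.c == -1.0 && transform.d == 0 {
    // Rotated 90 degrees clockwise: portrait.
    assetOrientation = .right
    isPortrait = true
  } else if transform.a == 0 && transform.b == -1.0
    && transform.c == 1.0 && transform.d == 0 {
    // Rotated 90 degrees counterclockwise: portrait, upside down.
    assetOrientation = .left
    isPortrait = true
  } else if transform.a == -1.0 && transform.b == 0
    && transform.c == 0 && transform.d == -1.0 {
    // Rotated 180 degrees: landscape, upside down.
    assetOrientation = .down
  }
  // The identity transform falls through to the default: landscape, .up.
  return (assetOrientation, isPortrait)
}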

Note: To learn more about composing videos with AVFoundation, check out How to Play, Record, and Merge Videos in iOS and Swift.

Now, you’ve created a new composition that includes the video and audio from the original file. Next, you’ll set up layers to make sure you can add backgrounds and overlays to the video in the composition.