AVFoundation Tutorial: Adding Overlays and Animations to Videos

In this AVFoundation tutorial, you’ll learn how to add overlays and animations to videos using AVVideoCompositionCoreAnimationTool, which lets you combine CALayers with videos to add backgrounds and overlays.

Version

  • Swift 5, iOS 13, Xcode 11
Update note: Marin Bencevic updated this tutorial for Xcode 11, Swift 5 and iOS 13. Abdul Azeem Khan wrote the original.

If you’re making a camera app, you’ll benefit from adding overlays and animations to your videos. Whether you add the current date, location name, weather or just fun GIFs, your users will appreciate being able to customize their videos.

You can do all that with AVFoundation, Apple’s framework for manipulating audio and video content. You can think of AVFoundation as a programmatic video and audio editor, which lets you compose video and audio tracks then add cool overlays to them.

In this AVFoundation tutorial, you’ll learn how to:

  • Add a custom border to your videos.
  • Add text and images to your videos.
  • Animate your video overlays.
  • Export your overlaid video into a file.

To get the most out of this tutorial, you’ll need to be familiar with iOS development. It would also be useful to have some familiarity with Core Animation. Don’t worry though, this tutorial will explain everything as you go along.

Ready? Lights, camera… action! :]

Getting Started

Start by downloading the starter project by clicking the Download Materials button at the top or bottom of the tutorial.

The project’s name is Cubica, which stands for Custom Birthday Cards. You’ll make an app that will let you record a video and add overlays and borders to turn it into a customized birthday card for your friends.

Using CAEmitterLayer to add confetti to a video

Open the starter project in Xcode. You can use either the simulator or a device, but keep in mind that this tutorial requires a video to work with. If you’re using the simulator, drag and drop a video file from your Mac onto the simulator first.

The starter project already has a screen where you can enter your friend’s name and pick a video to add overlays to. This all happens in PickerViewController.swift.

Custom birthday cards video picker screen

Once the user picks a video, the app sends it to VideoEditor.swift. Currently, this file only has a couple of helper methods and a method called makeBirthdayCard(fromVideoAt:forName:onComplete:). You’ll edit this method to add overlays to the video.

Once the app adds overlays to the video, the method calls the completion handler and sends the video URL to PlayerViewController.swift. This view controller plays the video file and lets you export the file to your photo library.
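
For reference, here’s a rough sketch of what that method’s shape looks like before you start editing it. This is only an illustration pieced together from the description above; the exact declaration in the starter project may differ slightly:

func makeBirthdayCard(
  fromVideoAt videoURL: URL,
  forName name: String,
  onComplete: @escaping (URL?) -> Void
) {
  // For now, the starter simply hands the original video back, untouched.
  onComplete(videoURL)
}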

Note: To learn how to pick and play video files, check out How to Play, Record, and Merge Videos in iOS and Swift.

I bet you can’t wait to send this to your friends! :]

Composing a Video

Before you can add any overlays to the video, you need to set some things up. You’ll create a new video file from the existing video, with backgrounds and overlays added.

First, you’ll create a new AVFoundation composition. You can think of a composition as a programmatic video editor: It holds different types of tracks, like audio and video tracks, and manages when they start and end in the video’s timeline.

Once you create an empty composition, you’ll add two tracks to it: one for video and one for audio. For the audio track, you’ll simply copy the existing video’s audio. To create the video, you’ll use AVVideoCompositionCoreAnimationTool, a class that lets you combine an existing video with Core Animation layers.

Once you have both the combined video and the audio inside the composition, you’ll export the composition into a video file using AVAssetExportSession.

Don’t worry, it’s not as daunting as it sounds! Your first step is to create the composition.

Creating a Composition

Open VideoEditor.swift. The meat of the sample project is inside makeBirthdayCard(fromVideoAt:forName:onComplete:). Currently, this method simply calls the completion handler with the existing video. Replace the line onComplete(videoURL) with the following:

let asset = AVURLAsset(url: videoURL)
let composition = AVMutableComposition()

Here, you create an AVURLAsset, which holds all the information and data about the provided video, as well as an empty composition. You’ll fill this composition with an overlaid video.

Next, add a track to the composition and grab the video track from the asset by adding the following code to the method:

guard
  let compositionTrack = composition.addMutableTrack(
    withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid),
  let assetTrack = asset.tracks(withMediaType: .video).first
  else {
    print("Something is wrong with the asset.")
    onComplete(nil)
    return
}

You add a new video track by calling addMutableTrack with a .video media type. Because you don’t need to reference the track by ID later, you pass kCMPersistentTrackID_Invalid, which tells AVFoundation to generate an ID for you. You also grab the asset’s video track, which is the first – and only – track of type .video. If either of these two steps fails, you print an error and call the completion handler with nil.

Now, add the following code at the end of makeBirthdayCard(fromVideoAt:forName:onComplete:) to insert the video track from the asset inside the composition’s video track:

do {
  // 1
  let timeRange = CMTimeRange(start: .zero, duration: asset.duration)
  // 2
  try compositionTrack.insertTimeRange(timeRange, of: assetTrack, at: .zero)
  
  // 3
  if let audioAssetTrack = asset.tracks(withMediaType: .audio).first,
    let compositionAudioTrack = composition.addMutableTrack(
      withMediaType: .audio, 
      preferredTrackID: kCMPersistentTrackID_Invalid) {
    try compositionAudioTrack.insertTimeRange(
      timeRange, 
      of: audioAssetTrack, 
      at: .zero)
  }
} catch {
  // 4
  print(error)
  onComplete(nil)
  return
}

Here’s what’s going on in the code above:

  1. CMTimeRange specifies time ranges inside videos. In this case, you want to add the video from the beginning to the end, so you make a range from zero to the duration of the video.
  2. Once you have the time range, you insert the whole video from the asset into your composition’s video track.
  3. If the asset also contains an audio track, you do the same thing for the audio that you just did for the video: First, add a new audio track to your composition, then insert the asset’s audio into it.
  4. If you get an error, print it and call the completion handler with nil.

Setting Up the Composition

Next, take care of the sizing and orientation of your composition by adding this code at the end of the method:

compositionTrack.preferredTransform = assetTrack.preferredTransform
let videoInfo = orientation(from: assetTrack.preferredTransform)

let videoSize: CGSize
if videoInfo.isPortrait {
  videoSize = CGSize(
    width: assetTrack.naturalSize.height,
    height: assetTrack.naturalSize.width)
} else {
  videoSize = assetTrack.naturalSize
}

You first set the composition track’s preferred transform to match the asset track’s. The starter project includes orientation(from:), which returns the orientation – portrait or landscape – of the video. If the video is portrait, you need to swap the width and height when working out the video’s size. Otherwise, you can use the natural size as-is.
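
You won’t write orientation(from:) in this tutorial, but if you’re curious, a helper like it usually inspects the rotation components of the transform. Here’s a sketch of what such a helper might look like; the starter’s actual implementation may differ:

private func orientation(from transform: CGAffineTransform)
  -> (orientation: UIImage.Orientation, isPortrait: Bool) {
  var assetOrientation = UIImage.Orientation.up
  var isPortrait = false
  if transform.a == 0 && transform.b == 1.0 && transform.c == -1.0 && transform.d == 0 {
    // Rotated 90 degrees: the video was shot in portrait.
    assetOrientation = .right
    isPortrait = true
  } else if transform.a == 0 && transform.b == -1.0 && transform.c == 1.0 && transform.d == 0 {
    // Rotated -90 degrees: portrait, upside down.
    assetOrientation = .left
    isPortrait = true
  } else if transform.a == -1.0 && transform.b == 0 && transform.c == 0 && transform.d == -1.0 {
    // Rotated 180 degrees: landscape, upside down.
    assetOrientation = .down
  }
  return (assetOrientation, isPortrait)
}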

Note: To learn more about composing videos with AVFoundation, check out How to Play, Record, and Merge Videos in iOS and Swift.

Now, you’ve created a new composition that includes the video and audio from the original file. Next, you’ll set up layers to make sure you can add backgrounds and overlays to the video in the composition.

Core Animation – The Star of the Show

Layer cake

Your backgrounds and overlays will all be CALayers. CALayer is the primary class of a framework called Core Animation.

Core Animation is behind every view in your app, responsible for drawing and animating its contents. Since a CALayer backs every view, everything that’s drawn on your phone’s screen is a layer.

As their name suggests, you can draw layers below or on top of other layers, making them perfect for adding backgrounds or overlays to videos. You leverage this in videos with AVFoundation’s handy class called AVVideoCompositionCoreAnimationTool. This is a bridge between a composition and Core Animation, letting you create a new video that applies CALayers to a video composition.

To get started, you’ll need three different layers. One layer is the background, drawn behind the video. The second layer draws the frames of the video. The third layer is an overlay layer drawn on top of the video layer.

AVVideoCompositionCoreAnimationTool background, video and overlay CALayer

Layering the Cake

Create the three layers by adding the following code to the end of makeBirthdayCard(fromVideoAt:forName:onComplete:):

let backgroundLayer = CALayer()
backgroundLayer.frame = CGRect(origin: .zero, size: videoSize)
let videoLayer = CALayer()
videoLayer.frame = CGRect(origin: .zero, size: videoSize)
let overlayLayer = CALayer()
overlayLayer.frame = CGRect(origin: .zero, size: videoSize)

Each layer will have the same frame, spanning the whole video.

Next, assemble all these layers into a single parent layer by adding this to the method:

let outputLayer = CALayer()
outputLayer.frame = CGRect(origin: .zero, size: videoSize)
outputLayer.addSublayer(backgroundLayer)
outputLayer.addSublayer(videoLayer)
outputLayer.addSublayer(overlayLayer)

Here, you create a new layer that acts as the parent layer of your final composition. You first add the background layer, then the video layer and, finally, the overlay layer. The order in which you add these layers matters: The video sits in front of the background, and the overlay sits in front of the video.

By now, you’re probably getting anxious to build the project and see the results. Unfortunately, you don’t have a video to show yet!

Grumpy face

You have your layers set up, but you still have to use AVFoundation to export a video from these layers.

Exporting the Video

Now that you have all the layers, it’s time to use AVVideoCompositionCoreAnimationTool to combine them into a delicious video composition cake!

Creating a Video Composition

Add the following to the method:

let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = videoSize
videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
  postProcessingAsVideoLayer: videoLayer, 
  in: outputLayer)

First, you create a new AVMutableVideoComposition. The composition you created earlier was an AVComposition, which can hold video, audio and other types of tracks. An AVVideoComposition, on the other hand, only describes how video tracks are composed and rendered. In this case, you only need a single video track.

Next, you set the rendered video’s size to match the original video’s size. frameDuration determines how long each frame lasts. By passing a CMTime with a value of 1 and a timescale of 30, you set the frame duration to 1/30 of a second, resulting in a video with a frame rate of 30 frames per second.

The final touch for the video composition is adding an animation tool. This tool takes your assembled output layer and the video layer and renders the video track into the video layer.

Next, add the following code to add a video track to the video composition:

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRange(
  start: .zero, 
  duration: composition.duration)
videoComposition.instructions = [instruction]
let layerInstruction = compositionLayerInstruction(
  for: compositionTrack, 
  assetTrack: assetTrack)
instruction.layerInstructions = [layerInstruction]

Video compositions use a collection of instructions to determine what to show on the video at any time. In this case, you only need one instruction to show the assembled video over the whole duration of the composition.

Each instruction can have its own layer instructions that determine how to layer different video tracks. The starter project includes a helpful method called compositionLayerInstruction that returns the correct layer instruction for your video. This instruction applies a transform that scales and rotates the video to match the original video’s size and orientation. Without it, portrait videos would show up as landscape, resulting in a weird, stretched-out video.
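
You don’t need to write compositionLayerInstruction(for:assetTrack:) yourself, but as a rough idea, a helper like this typically wraps the composition track in a layer instruction and applies the asset’s preferred transform. A sketch, which may differ from the starter’s exact code:

private func compositionLayerInstruction(
  for track: AVCompositionTrack,
  assetTrack: AVAssetTrack
) -> AVMutableVideoCompositionLayerInstruction {
  // The layer instruction controls how this track is laid out in the rendered video.
  let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
  // Apply the original video's transform so rotation and scale are preserved.
  instruction.setTransform(assetTrack.preferredTransform, at: .zero)
  return instruction
}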

You now have all the required pieces: a new composition containing the original video and audio and a video composition that renders the video inside your cake of layers. You’ll combine these pieces in an export session.

Using an Export Session

Create an export session to render the video to a file:

guard let export = AVAssetExportSession(
  asset: composition, 
  presetName: AVAssetExportPresetHighestQuality) 
  else {
    print("Cannot create export session.")
    onComplete(nil)
    return
}

You create a new AVAssetExportSession, passing it your composition and a preset for the highest-quality video. Highest quality birthday cards for the highest quality friends. :]

Now, create a file path and set up the export session:

let videoName = UUID().uuidString
let exportURL = URL(fileURLWithPath: NSTemporaryDirectory())
  .appendingPathComponent(videoName)
  .appendingPathExtension("mov")

export.videoComposition = videoComposition
export.outputFileType = .mov
export.outputURL = exportURL

Here, you use UUID to get a unique random string that you’ll use as the file name. You’ll save the file into a temporary directory for now; the user can decide to store it in the photo library later.

Then, you give the export session the output URL and file type, as well as the video composition you created earlier, so it knows how to render the video.

Still in makeBirthdayCard(fromVideoAt:forName:onComplete:), initiate the export by adding the following:

export.exportAsynchronously {
  DispatchQueue.main.async {
    switch export.status {
    case .completed:
      onComplete(exportURL)
    default:
      print("Something went wrong during export.")
      print(export.error ?? "unknown error")
      onComplete(nil)
    }
  }
}

When you begin the export, it will eventually finish by calling its completion handler on a background thread. Make sure to dispatch to the main thread and then check the result of the export. If it completed successfully, call the completion handler with the exported file’s URL. If something goes wrong, you know the drill: Print an error and complete with nil.

Phew, that was an intense coding session! Now, it’s finally time to build and run the project. Choose a video, enter a name and, after your device does some churning… you’ll see the same result as you did before.

Exporting a video using AVExportSession

Yes, you did all this work for the exact same result. Except it’s not quite the same: The background and overlay layers are rendering; they’re just completely transparent because you haven’t added anything to them yet. So start adding things!

Adding a Background

Now that you have the app set up to compose a video, you can start working on adding flourishes to the video to make it look like a birthday card.

You’ll draw the background layer underneath the video. Currently, the video is as large as the background, so you first need to shrink the video layer to let the background show from underneath. The background layer will act as a border for the video. You’ll then add an image to the background layer so that the border fills up with confetti.

In makeBirthdayCard(fromVideoAt:forName:onComplete:), add the following code underneath where you create the three layers, right under the line where you set overlayLayer’s frame:

backgroundLayer.backgroundColor = UIColor(named: "rw-green")?.cgColor
videoLayer.frame = CGRect(
  x: 20,
  y: 20, 
  width: videoSize.width - 40, 
  height: videoSize.height - 40)

Here, you add a background color to the layer so that you can clearly see what’s going on when you run the app. Next, you set the video layer’s frame to be 20 points smaller on each edge, giving the video a 20-point-wide green border.

Build and run the project, choose a video, and after it’s exported you’ll see a border around your video.

Cropping a video with AVFoundation

Now that you can see the background around the video, add an image to the background layer by adding the following code underneath what you just wrote:

backgroundLayer.contents = UIImage(named: "background")?.cgImage
backgroundLayer.contentsGravity = .resizeAspectFill

To show an image with Core Animation, you set a CALayer’s contents to the image’s CGImage. You also set contentsGravity to .resizeAspectFill to make sure the image always fills the layer while maintaining its aspect ratio.

Build and run once again, and you’ll see your border, now full of cute little confetti!

Adding a background image to a video with AVFoundation

Now that your background is set and done, you can work on what’s on top of the video.

Note: There’s a bug in the Simulator in iOS versions up to and including 13.2.2 where the background and the image you are about to add will show as black. If you encounter this bug, you’ll need to run on a device to see your handiwork.

Adding Images

You’ll begin your video overlay journey by adding an image to the video. The starter project already includes an image of Swift, Android and a dinosaur (for some reason) all having a great time at a birthday party.

Image you'll add to your video

Add a new method in VideoEditor.swift that will add an image to a layer:

private func addImage(to layer: CALayer, videoSize: CGSize) {
  let image = UIImage(named: "overlay")!
  let imageLayer = CALayer()
}

Just like you did for the background, use a plain CALayer to hold the image. Next, add the following to the method to set the frame of the image so that it sits along the bottom of the video:

let aspect: CGFloat = image.size.width / image.size.height
let width = videoSize.width
let height = width / aspect
imageLayer.frame = CGRect(
  x: 0, 
  y: -height * 0.15, 
  width: width, 
  height: height)

Here, you first calculate the image’s aspect ratio. The image will be as wide as the video, so you use the aspect ratio to determine the image’s height. This keeps the image’s proportions correct, no matter the size of the video.

You don’t want the bottom of the image to show, so you set the image layer’s y-coordinate to a negative value, which pushes the bottom portion of the image below the video’s visible frame.

Next, add the image to the image layer and the image layer to the layer passed into the method:

imageLayer.contents = image.cgImage
layer.addSublayer(imageLayer)

Back in makeBirthdayCard(fromVideoAt:forName:onComplete:), call your method after the background code, right after you set contentsGravity:

addImage(to: overlayLayer, videoSize: videoSize)

You’re no longer working on the background; instead, you want to show images on top of the video. That’s why you pass overlayLayer to the method.

Build and run the project, select a video and you should see the image show up on the video.

Adding an overlay image to a video with AVFoundation

Now, you know how adding overlays to videos works. It’s starting to look like a real party! In the next section, you’ll improve the birthday card by adding some text.

Adding Text

This wouldn’t be much of a birthday card if it didn’t say “Happy Birthday,” so you’ll need to add text to the video. In VideoEditor.swift, add a new method:

private func add(text: String, to layer: CALayer, videoSize: CGSize) {
}

To display the text, you’ll use a CALayer subclass called CATextLayer. Since the subclass has a bit of a cumbersome API, you’ll use NSAttributedString to create and customize the text. Add the following code to the method you just created:

let attributedText = NSAttributedString(
  string: text,
  attributes: [
    .font: UIFont(name: "ArialRoundedMTBold", size: 60) as Any,
    .foregroundColor: UIColor(named: "rw-green")!,
    .strokeColor: UIColor.white,
    .strokeWidth: -3])

With this code, you create the attributed string and make sure it uses a big, round font. You make the text green so it’s easy to see, and you add a white stroke so that the text works on all backgrounds. If you want your text to have both a stroke and a fill, you need to make the stroke width negative; otherwise, only the stroke will show. Don’t wonder why it’s like that; it just is. :]

Next, create a text layer with the attributed string:

let textLayer = CATextLayer()
textLayer.string = attributedText
textLayer.shouldRasterize = true
textLayer.rasterizationScale = UIScreen.main.scale
textLayer.backgroundColor = UIColor.clear.cgColor
textLayer.alignmentMode = .center

This creates a text layer and sets the text. You make sure the text is rasterized with a scale that matches the current screen’s scale so that it doesn’t look blurry. Finally, you give the layer a transparent background and center the text.

Now, set the text’s frame and add it to the layer:

textLayer.frame = CGRect(
  x: 0, 
  y: videoSize.height * 0.66, 
  width: videoSize.width, 
  height: 150)
textLayer.displayIfNeeded()
layer.addSublayer(textLayer)

Before adding it to the layer, call displayIfNeeded() on the text layer. CALayers, just like UIViews, update asynchronously and sometimes Core Animation doesn’t show the text for the first second or two of the video. By calling this method, you ensure that Core Animation renders the text as soon as possible.

Finally, back in makeBirthdayCard(fromVideoAt:forName:onComplete:), call your new method right under the call to addImage(to:videoSize:):

add(
  text: "Happy Birthday,\n\(name)", 
  to: overlayLayer, 
  videoSize: videoSize)

Just like you did with the image, you add the text to the overlay layer.

Build and run the project, select a video, and enter your friend’s name. This tutorial will address the card to Ray. Once it exports, you’ll see a line of text wishing your friend a happy birthday!

Adding a text overlay to a video with AVFoundation

Whether or not today is actually your friend’s birthday shouldn’t stop you from celebrating, because your birthday card is looking pretty good now! But one thing that’s still missing is movement. Keep reading to find out how to add animations to your overlays.

Adding Animations

One good thing about CALayers is that, as the name Core Animation implies, they’re animatable! To make your birthday card more dynamic, you’ll add a scaling animation to your text to make it scale up and down.

In add(text:to:videoSize:), add the following line right before the last line, where you add the text layer as a sublayer of layer:

let scaleAnimation = CABasicAnimation(keyPath: "transform.scale")

CABasicAnimation lets you animate a CALayer property between two specific values. You tell it which property to animate by passing a key path – in this case, the transform’s scale.

Next, set up the animation by adding the following code right under the line you just added:

scaleAnimation.fromValue = 0.8
scaleAnimation.toValue = 1.2
scaleAnimation.duration = 0.5
scaleAnimation.repeatCount = .greatestFiniteMagnitude
scaleAnimation.autoreverses = true
scaleAnimation.timingFunction = CAMediaTimingFunction(name: .easeInEaseOut)

This code scales the text from a value of 0.8 to 1.2 over 0.5 seconds. You want this animation to repeat indefinitely, so you set the repeat count to .greatestFiniteMagnitude. You also set autoreverses to true so that the scale bounces back and forth between 0.8 and 1.2.

Finally, set a few more settings and add the animation to the layer:

scaleAnimation.beginTime = AVCoreAnimationBeginTimeAtZero
scaleAnimation.isRemovedOnCompletion = false
textLayer.add(scaleAnimation, forKey: "scale")

When adding animations to videos, it’s important that you set the beginTime to AVCoreAnimationBeginTimeAtZero; otherwise, the animation will never start. You also need to make sure the animation is not removed on completion.

Build and run the project and select a video.

Adding animated text to a video with AVFoundation

Away with boring, static birthday cards; now your text scales up and down! You made the card more dynamic, but why stop there? In the next section, you’ll add confetti to your birthday card, too!

Adding the Final Touches

There are lots of other powerful CALayer subclasses besides CATextLayer. For instance, CAGradientLayer shows color gradients. CAReplicatorLayer lets you create patterns by repeating a layer according to a set of rules. With CAShapeLayer, you can draw circles, ellipses, arcs, polygons and even arbitrary shapes. By combining it with CAAnimation, you can dynamically change and animate the shape, leading to some cool effects.
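
As a quick taste of CAShapeLayer – this snippet isn’t part of the birthday card, just a hypothetical aside – you could draw a circle in the overlay layer and animate its stroke so it draws itself in over the first second of the video:

let circleLayer = CAShapeLayer()
circleLayer.path = UIBezierPath(
  ovalIn: CGRect(x: 0, y: 0, width: 120, height: 120)).cgPath
circleLayer.fillColor = UIColor.clear.cgColor
circleLayer.strokeColor = UIColor.white.cgColor
circleLayer.lineWidth = 6

let drawAnimation = CABasicAnimation(keyPath: "strokeEnd")
drawAnimation.fromValue = 0
drawAnimation.toValue = 1
drawAnimation.duration = 1
// As with the text animation, overlays rendered into a video need these two settings.
drawAnimation.beginTime = AVCoreAnimationBeginTimeAtZero
drawAnimation.isRemovedOnCompletion = false
circleLayer.add(drawAnimation, forKey: "draw")
overlayLayer.addSublayer(circleLayer)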

No birthday is complete without confetti, which is why, as a final touch, you’ll add a confetti effect. The confetti will fall from the top of the video. Thankfully, this is digital confetti, so you won’t need to clean up the mess afterward. :]

To create the confetti, you’ll use another CALayer subclass called CAEmitterLayer, which lets you create particle effects. You determine the layer’s shape and set your desired particle birth rate, velocity and other settings. The layer will then emit particles from its shape.

The starter project already includes a method called addConfetti(to:) that creates 16 different particles by combining a random confetti image with a random color. It then sets up a CAEmitterLayer as a line just above the video layer with those 16 particles. The result is confetti falling from the top of the video!
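
To give you a rough idea of how that works – the image names and tuning values below are placeholders, not the starter’s exact code – a stripped-down version of addConfetti(to:) might look something like this:

private func addConfetti(to layer: CALayer) {
  // Hypothetical asset names; the starter project ships with its own confetti images.
  let images: [UIImage] = (0...3).compactMap { UIImage(named: "confetti\($0)") }
  let colors: [UIColor] = [.systemGreen, .systemRed, .systemBlue, .systemYellow]

  // Build cells that each pair a random image with a random color.
  let cells: [CAEmitterCell] = (0..<16).map { _ in
    let cell = CAEmitterCell()
    cell.contents = images.randomElement()?.cgImage
    cell.color = colors.randomElement()?.cgColor
    cell.birthRate = 3
    cell.lifetime = 12
    cell.velocity = 150
    // In the rendered composition, y increases upward (that's why the overlay
    // image used a negative y), so aim the particles toward y = 0 to make them fall.
    cell.emissionLongitude = -CGFloat.pi / 2
    cell.emissionRange = .pi / 4
    cell.scale = 0.2
    cell.spin = 2
    return cell
  }

  // Emit from a thin horizontal line just above the top edge of the video.
  let emitter = CAEmitterLayer()
  emitter.emitterCells = cells
  emitter.emitterShape = .line
  emitter.emitterSize = CGSize(width: layer.bounds.width, height: 2)
  emitter.emitterPosition = CGPoint(x: layer.bounds.midX, y: layer.bounds.maxY + 10)
  emitter.beginTime = AVCoreAnimationBeginTimeAtZero
  layer.addSublayer(emitter)
}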

To add the confetti, scroll to makeBirthdayCard(fromVideoAt:forName:onComplete:) for the final time and call the method right above where you call addImage(to:videoSize:):

addConfetti(to: overlayLayer)

Build and run the project and bask in the glory of CALayers!

Using CAEmitterLayer to add confetti to a video

By adding confetti, you’ve made the birthday card at least ten times as triumphant. Trust me, I calculated that myself. :]

Where to Go From Here?

You can download the completed project files by clicking the Download Materials button at the top or bottom of the tutorial.

By now, you deserve a card to congratulate you for everything you’ve learned in this AVFoundation tutorial! You learned how to use AVVideoCompositionCoreAnimationTool to combine CALayers with video to add backgrounds and overlays, how to make a video composition and how to export a video to a file.

If you want to go deeper into everything you can do with AVFoundation, check out our video course Beginning Video with AVFoundation.

To learn more about the different things you can do with Core Animation, check out the iOS Views and Animations video course, as well as the CALayer Tutorial.

Make sure to remember this tutorial for your friends’ upcoming birthdays! If you have any comments or questions, reach out in the comment section below.
