
Core Image Tutorial: Getting Started

Learn the basics of cool image filtering effects with Core Image and Swift.

Version

  • Swift 5.5, iOS 15, Xcode 13
Update note: Ron Kliffer updated this tutorial for Xcode 13, Swift 5.5 and iOS 15. Jake Gunderson wrote the original.

Core Image is a robust framework that lets you apply filters to images. It provides all kinds of effects, such as modifying the vibrancy, hue or exposure. It can use either the CPU or GPU to process the image data quickly – fast enough to do real-time processing of video frames!

You can chain Core Image filters together to apply many effects to an image or video frame at once. Behind the scenes, the chained filters combine into a single operation that's applied to the image in one pass. This makes chaining quite efficient compared to processing the image through each filter, one at a time.

In this tutorial, you’ll get hands-on experience playing around with Core Image. You’ll apply a few different filters and see how easy it is to apply cool effects to images in real-time.

Getting Started

Before you begin, look at some of the most important classes in the Core Image framework:

  • CIContext. All of Core Image's processing happens in a CIContext. This is somewhat like a Core Graphics or OpenGL context.
  • CIImage. This class holds the image data. You can create one from a UIImage, an image file or raw pixel data.
  • CIFilter. The CIFilter class holds a dictionary that defines the attributes of the particular filter it represents. Examples of filters include vibrancy, color inversion, cropping and many more.

You’ll use each of these classes in this project.

CoreImageFun

Click the Download Materials button at the top or bottom of this tutorial to download the starter project. Open CoreImageFun.xcodeproj and run it. This is a simple app: a single screen with an image and a slider. The slider doesn't do anything yet, but you'll use it to show off CIFilter's powers. You'll also notice a camera button at the top right of the screen. You'll use this later in the tutorial to bring up the image picker.

Starter app state

Image-Filtering Basics

You’re going to start by running your image through a CIFilter and displaying it on the screen. Every time you want to apply a CIFilter to an image, you need to do four things:

  1. Create a CIImage object. A CIImage has several initialization methods. In this tutorial, you’ll use CIImage(image:) to create a CIImage from a UIImage. Explore the documentation to learn more ways you can create a CIImage.
  2. Create a CIContext. A CIContext can be CPU- or GPU-based. A CIContext is expensive to initialize, so you reuse it rather than create it over and over. You’ll always need one when outputting the CIImage object.
  3. Create a CIFilter. When you create the filter, you configure some properties on it that depend on the filter you’re using.
  4. Get the filter output. The filter gives you an output image as a CIImage. You can convert this to a UIImage using the CIContext, as you’ll see below.

Applying a Filter

Now that you've covered the theory, it's time to see how this works in practice. Add the following code to ViewController.swift:

func applySepiaFilter(intensity: Float) {
  // 1
  guard let uiImage = UIImage(named: "image") else { return }
  let ciImage = CIImage(image: uiImage)

  // 2
  guard let filter = CIFilter(name: "CISepiaTone") else { return }

  // 3
  filter.setValue(ciImage, forKey: kCIInputImageKey)
  filter.setValue(intensity, forKey: kCIInputIntensityKey)

  // 4
  guard let outputImage = filter.outputImage else { return }

  // 5
  let newImage = UIImage(ciImage: outputImage)
  imageView.image = newImage
}

Here’s what the code does. It:

  1. Creates a UIImage and uses it to create a CIImage.
  2. Creates a CIFilter of type CISepiaTone.
  3. Sets two values on the CISepiaTone filter. First, an input image via kCIInputImageKey, which is a CIImage instance. Second, an intensity via kCIInputIntensityKey, a float value between 0 and 1. Most filters fall back to their default values when you don't supply one. The input image is the exception: It has no default, so you must always provide it.
  4. Gets a CIImage back out of the filter, using the outputImage property.
  5. Turns the CIImage back to a UIImage and displays it in the image view.

Next, call your new method by adding the following to viewDidLoad():

applySepiaFilter(intensity: 0.5)

This triggers the image filtering with an intensity value of 0.5. Later in the tutorial, you’ll use a slider to try various intensity values.

Build and run the project. You’ll see your image filtered by the sepia tone filter:

Image with sepia filter

Congratulations, you've just used CIImage and CIFilter to filter an image! :]

Putting it Into Context

Before you proceed, there’s an optimization you should know about.

As noted above, you need a CIContext to apply a CIFilter. But there's no mention of this object in the example above. It turns out UIImage(ciImage:) does all the work for you. It creates a CIContext and uses it to filter the image. This makes the Core Image API quite easy to use.

There's one major drawback: It creates a new CIContext every time it's used. CIContext instances are meant to be reused for better performance. If you hooked up a slider to update the filter value this way, the app would create a new CIContext on every change, which would be quite slow. Instead, you'll create the context once and reuse it.

First, add the following property to ViewController:

let context = CIContext(options: nil)

CIContext accepts an options dictionary specifying settings such as the color format or whether the context should run on the CPU or GPU. For this app, the default values are fine, so you pass nil for that argument.
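You won't need any options in this tutorial, but as a reference, here's a short sketch of two real CIContextOption values you might reach for later:

```swift
import CoreImage

// Force a CPU-only context. Useful when you need reproducible output
// or are rendering in a background process without GPU access.
let cpuContext = CIContext(options: [.useSoftwareRenderer: true])

// Skip caching of intermediate results to save memory when each
// image is only rendered once.
let lowMemoryContext = CIContext(options: [.cacheIntermediates: false])
```

Both options trade performance for something else, so only set them when you've measured a need.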

Next, delete Step 5 from applySepiaFilter(intensity:) and replace it with the following:

guard let cgImage = context.createCGImage(outputImage, from: outputImage.extent) else { return }
imageView.image = UIImage(cgImage: cgImage)

Here, you use CIContext to draw a CGImage and use that to create a UIImage to display in the image view.

Build and run. Make sure it works as before.

Image with sepia filter

In this example, handling the CIContext creation yourself doesn’t make much difference. You’ll see why doing so is important for performance as you put in place the ability to change the filter in the next section.

Changing Filter Values

This is great, but it’s just the beginning of what you can do with Core Image filters. It’s time to use that nice slider below the image to alter the filter effect.

You already added a property for the CIContext instance. Now, you’ll add a property to hold the filter.

There's already an IBAction connected to the slider's Value Changed action, called sliderValueChanged(_:). In this method, you'll redo the image filter whenever the slider value changes. But you don't want to redo the whole process each time — that would be quite inefficient and would take too long. Instead, you'll change a few things in your class so it holds onto some of the objects you create in applySepiaFilter(intensity:).

Add the following property right below the context declaration:

let filter = CIFilter(name: "CISepiaTone")!

Next, add the following to viewDidLoad() before calling applySepiaFilter(intensity:):

guard let uiImage = UIImage(named: "image") else { return }
let ciImage = CIImage(image: uiImage)
filter.setValue(ciImage, forKey: kCIInputImageKey)

Here, you set the image to filter. You did this before in applySepiaFilter(intensity:), but it's better to move it to viewDidLoad() so it doesn't run on every slider value change.

You moved some code to viewDidLoad(), so replace applySepiaFilter(intensity:) with the following:

func applySepiaFilter(intensity: Float) {
  filter.setValue(intensity, forKey: kCIInputIntensityKey)

  guard let outputImage = filter.outputImage else { return }

  guard let cgImage = context.createCGImage(outputImage, from: outputImage.extent) else { return }
  imageView.image = UIImage(cgImage: cgImage)
}

Finally, add the following to sliderValueChanged(_:):

applySepiaFilter(intensity: slider.value)

When the slider value changes, applySepiaFilter(intensity:) runs with the new intensity value.

Your slider is set to the default values: min 0, max 1, default 0.5. How convenient! These happen to be the right values for this CIFilter.
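Those values aren't a coincidence: every Core Image filter publishes the expected range and default of each input in its attributes dictionary. Here's a sketch of how you could query them for CISepiaTone:

```swift
import CoreImage

let sepia = CIFilter(name: "CISepiaTone")!

// Each input key maps to a dictionary describing that parameter.
if let info = sepia.attributes[kCIInputIntensityKey] as? [String: Any] {
  let minValue = info[kCIAttributeSliderMin] as? Float  // 0 for CISepiaTone
  let maxValue = info[kCIAttributeSliderMax] as? Float  // 1 for CISepiaTone
  let defaultValue = info[kCIAttributeDefault] as? Float
  print(minValue ?? 0, maxValue ?? 0, defaultValue ?? 0)
}
```

Reading these at runtime is handy if you ever configure a slider for a filter whose range you don't know in advance.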

Build and run. You should have a functioning live slider that will alter the sepia value for your image in real-time.

Intensity slider in action

Getting Photos From the Photo Album

Now that you can change the values of the filter right away, things are getting interesting! But what if you don’t care for this image of flowers? Next, you’ll set up a UIImagePickerController to get pictures out of the photo album and into your app so you can play with them.

There's already an IBAction connected to the camera button's Touch Up Inside action. It's called loadPhoto(). Add the following code to loadPhoto():

let picker = UIImagePickerController()
picker.delegate = self
present(picker, animated: true)

This code instantiates a new UIImagePickerController, sets its delegate to self (the ViewController) and presents the picker.

There’s a compiler error here. You need to declare that ViewController conforms to the UIImagePickerControllerDelegate and UINavigationControllerDelegate protocols.

Add the following extension to the bottom of ViewController.swift:

extension ViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
  func imagePickerController(
    _ picker: UIImagePickerController,
    didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
  }
}

This delegate method provides the selected image along with some related information in the info dictionary. For more details on the various data in this dictionary, check out the documentation.

Build and run the app, then tap the camera button to bring up the image picker showing the photos in your photo album.

Showing an image picker

Selecting an image does nothing. You’re about to change that. Add the following code to imagePickerController(_:didFinishPickingMediaWithInfo:):

//1
guard let selectedImage = info[.originalImage] as? UIImage else { return }

//2
let ciImage = CIImage(image: selectedImage)
filter.setValue(ciImage, forKey: kCIInputImageKey)

//3
applySepiaFilter(intensity: slider.value)

//4
dismiss(animated: true)

Here’s a code breakdown. It:

  1. Retrieves the selected image using the originalImage UIImagePickerController.InfoKey key.
  2. Applies the selected image to the filter.
  3. Calls applySepiaFilter(intensity:) using the current slider value for the intensity. This will update the image view.
  4. Dismisses the image picker once you’re done filtering the selected image.

Build and run. Now, you’ll be able to update any image from your photo album.

Selecting an image

What About Image Metadata?

Before moving on, take a moment to consider image metadata. Image files taken on mobile phones have various data associated with them, such as GPS coordinates, image format and orientation.

Orientation in particular is something you’ll need to preserve. Loading a UIImage into a CIImage, rendering to a CGImage and converting back to a UIImage strips the metadata from the image. To preserve orientation, you’ll need to record it and then pass it back into the UIImage.

Start by adding a new property to ViewController.swift:

var orientation = UIImage.Orientation.up

Next, add the following line to imagePickerController(_:didFinishPickingMediaWithInfo:) before calling applySepiaFilter(intensity:):

orientation = selectedImage.imageOrientation

This saves the selected image orientation to the property.

Finally, alter the line in applySepiaFilter(intensity:) that sets imageView.image to the following:

imageView.image = UIImage(cgImage: cgImage, scale: 1, orientation: orientation)

Now, if you pick a picture taken in an orientation other than the default, the app preserves it.

What Other Filters Are Available?

One of CIFilter's biggest strengths is the ability to chain filters. To see this in action, you'll create a dedicated method that chains several filters to make the image look like an old photo.

The CIFilter API has more than 160 filters on macOS, most of which are available on iOS as well. It's also possible to create custom filters.

To see all available filters and their attributes, check out the documentation.
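If you'd rather explore from code, the filter list and each filter's attributes are also available at runtime. A quick sketch:

```swift
import CoreImage

// All built-in filter names available on the current platform.
let names = CIFilter.filterNames(inCategory: kCICategoryBuiltIn)
print(names.count)

// Each filter documents its inputs through its attributes dictionary.
if let vignette = CIFilter(name: "CIVignette") {
  print(vignette.attributes.keys.sorted())
}
```

Printing names in a playground is a quick way to discover filters the written documentation is light on.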

Creating an Old Photo Filter

Add the following method to ViewController:

func applyOldPhotoFilter(intensity: Float) {
  // 1
  filter.setValue(intensity, forKey: kCIInputIntensityKey)

  // 2
  let random = CIFilter(name: "CIRandomGenerator")

  // 3
  let lighten = CIFilter(name: "CIColorControls")
  lighten?.setValue(random?.outputImage, forKey: kCIInputImageKey)
  lighten?.setValue(1 - intensity, forKey: kCIInputBrightnessKey)
  lighten?.setValue(0, forKey: kCIInputSaturationKey)

  // 4
  guard let ciImage = filter.value(forKey: kCIInputImageKey) as? CIImage else { return }
  let croppedImage = lighten?.outputImage?.cropped(to: ciImage.extent)

  // 5
  let composite = CIFilter(name: "CIHardLightBlendMode")
  composite?.setValue(filter.outputImage, forKey: kCIInputImageKey)
  composite?.setValue(croppedImage, forKey: kCIInputBackgroundImageKey)

  // 6
  let vignette = CIFilter(name: "CIVignette")
  vignette?.setValue(composite?.outputImage, forKey: kCIInputImageKey)
  vignette?.setValue(intensity * 2, forKey: kCIInputIntensityKey)
  vignette?.setValue(intensity * 30, forKey: kCIInputRadiusKey)

  // 7
  guard let outputImage = vignette?.outputImage else { return }
  guard let cgImage = context.createCGImage(outputImage, from: outputImage.extent) else { return }
  imageView.image = UIImage(cgImage: cgImage, scale: 1, orientation: orientation)
}

Here’s what’s going on, section by section. The code:

  1. Sets the intensity in the sepia-tone filter you used earlier.
  2. Establishes a filter that creates a random noise pattern. It doesn't take any parameters. You'll use this noise pattern to add texture to your final "old photo" look:A random noise pattern
  3. Alters the output of the random noise generator. You want to change it to grayscale and brighten it a bit, so the effect is less dramatic. The input image key is set to the outputImage property of the random filter. This is a convenient way to pass the output of one filter as the input of the next.
  4. Has cropped(to:) take an output CIImage and crops it to the provided rect. In this case, you need to crop the output of the CIRandomGenerator filter because it goes on forever. If you don’t crop it at some point, you get an error saying the filters have “an infinite extent”. CIImages don’t actually contain image data; they describe a “recipe” for creating it. It’s not until you call a method on the CIContext that the data is processed.
  5. Combines the output of the sepia and the CIRandomGenerator filters. The CIHardLightBlendMode filter performs the same operation as the "Hard Light" setting does in an Adobe Photoshop layer. Most, if not all, of the filter options in Photoshop are achievable using Core Image.
  6. Runs a vignette filter on this composited output that darkens the photo’s edges. You use the intensity value to set the radius and intensity of this effect.
  7. Gets the output image and sets it to the image view.
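To make step 4 concrete, here's a small sketch showing the infinite extent of CIRandomGenerator's output and how cropped(to:) bounds it:

```swift
import CoreImage

// CIRandomGenerator produces an unbounded image: its extent is
// infinite until you crop it to a concrete rect.
if let random = CIFilter(name: "CIRandomGenerator"),
   let noise = random.outputImage {
  print(noise.extent.isInfinite) // true

  let tile = noise.cropped(to: CGRect(x: 0, y: 0, width: 100, height: 100))
  print(tile.extent.isInfinite) // false
}
```

Because CIImage is only a recipe, neither line above actually generates any noise pixels — that happens when a CIContext renders the image.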

Applying Old Photo Filter

That’s all for this filter chain. You now have an idea of how complex these filter chains may become. You can combine Core Image filters into these kinds of chains, so you can achieve an endless variety of effects.
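As an aside, if you target iOS 13 or later, Core Image also offers a type-safe API in CIFilterBuiltins that makes chains like this easier to read. Here's a simplified sketch of a sepia-plus-vignette chain — it omits the noise compositing from applyOldPhotoFilter(intensity:):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// The same kind of chain, written with typed properties instead of
// string-based filter names and value keys.
func sepiaVignette(from input: CIImage, intensity: Float) -> CIImage? {
  let sepia = CIFilter.sepiaTone()
  sepia.inputImage = input
  sepia.intensity = intensity

  let vignette = CIFilter.vignette()
  vignette.inputImage = sepia.outputImage
  vignette.intensity = intensity * 2
  vignette.radius = intensity * 30

  return vignette.outputImage
}
```

The typed properties catch misspelled keys at compile time, which the setValue(_:forKey:) approach can't.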

If you want to view all this in action, then replace all the calls to applySepiaFilter(intensity:) with applyOldPhotoFilter(intensity:).

Build and run. You should get a more refined old-photo effect, complete with sepia, a little noise and some vignetting.

Old photo filter

This noise could be more subtle, but refining that is up to you, dear reader. Now, you have the full power of Core Image at your disposal. Go crazy!

Where to Go From Here?

Use the Download Materials button at the top or bottom of this tutorial to download the final project for this tutorial.

That about covers the basics of using Core Image filters. It’s a pretty handy technique, and you should be able to use it to apply some neat filters to images quite fast.

Moreover, there are many more Core Image tutorials on this site.

You can check out the Core Image filter reference documentation for notes on all the filters.

Do you have any questions or comments? If so, please join the forum discussion below.
