Core Image Tutorial: Getting Started

Learn the basics of cool image filtering effects with Core Image and Swift. By Nick Lockwood.


Update note: This tutorial was updated for iOS 8 and Swift by Nick Lockwood, and checked against Xcode 6 beta 7! Original post by Tutorial Team member Jake Gunderson.

Core Image is a powerful framework that lets you easily apply filters to images. You can get all kinds of effects, such as modifying the vibrance, hue, or exposure. It can use either the CPU or GPU to process the image data and is very fast — fast enough to do real-time processing of video frames!

Core Image filters can also be chained together to apply multiple effects to an image or video frame at once. The multiple filters are combined into a single filter that is applied to the image. This makes it very efficient compared to processing the image through each filter, one at a time.
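To give you a feel for what chaining looks like in code, here's a minimal sketch using the same iOS 8-era Swift API you'll meet later in this tutorial – the filter names and parameter values here are purely illustrative:

import CoreImage

// A rough sketch of chaining: feed one filter's output into the next.
// Nothing is rendered until the final output image is actually drawn.
func sepiaWithVignette(input: CIImage) -> CIImage {
    let sepia = CIFilter(name: "CISepiaTone")
    sepia.setValue(input, forKey: kCIInputImageKey)
    sepia.setValue(0.8, forKey: kCIInputIntensityKey)

    let vignette = CIFilter(name: "CIVignette")
    vignette.setValue(sepia.outputImage, forKey: kCIInputImageKey)
    vignette.setValue(1.0, forKey: kCIInputIntensityKey)
    return vignette.outputImage
}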

In this tutorial, you will get hands-on experience playing around with Core Image. You’ll apply a few different filters, and you’ll see how easy it is to apply cool effects to images in real time!

Note: At the time of writing this tutorial, our understanding is we cannot post screenshots of Xcode 6 and iOS 8 since they are still in beta. Therefore, we are suppressing screenshots in this Swift tutorial until we are sure it is OK.

Getting Started

Before you get started, let’s discuss some of the most important classes in the Core Image framework:

  • CIContext. All of the image processing in Core Image is done in a CIContext. This is somewhat similar to a Core Graphics or OpenGL context.
  • CIImage. This class holds the image data. It can be created from a UIImage, from an image file, or from pixel data.
  • CIFilter. The CIFilter class has a dictionary that defines the attributes of the particular filter that it represents. Examples of filters are vibrance, color inversion, cropping, and many more.

You’ll be using each of these classes in this project.
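If you're curious just how many filters Core Image ships with, you can ask it at runtime. This isn't required for the tutorial, but here's a quick sketch (using the iOS 8-era Swift API) you could drop into a playground or viewDidLoad():

import CoreImage

// Prints the names of every built-in Core Image filter available on this device.
let filterNames = CIFilter.filterNamesInCategory(kCICategoryBuiltIn)
println("Core Image ships with \(filterNames.count) built-in filters:")
println(filterNames)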

CoreImageFun

Open up Xcode and create a new project with the iOS \ Application \ Single View Application template. Enter CoreImageFun for the Product Name, select iPhone for the Devices option, and make sure that Language is set to Swift.

Download the resources for this tutorial, and add the included image.png to your project.

Next, open Main.storyboard, and drag an image view in as a subview of the existing view. In the Attributes Inspector, set the image view’s content mode to Aspect Fit, so it won’t distort images.

Next, ensure the Document Outline (the hierarchy to the left of the canvas in Interface Builder) is visible – you can enable it from the menu at Editor \ Show Document Outline.

Control-drag from the image view to its superview three times to add three constraints:

  1. Add a Top Space to Layout Guide constraint, using the Size Inspector to set the constraint’s constant to zero if necessary.
  2. Add a Center Horizontally in Container constraint (also setting its constant to zero).
  3. Add an Equal Width constraint.

Finally, to constrain the image view’s height, Control-drag from the image view to itself and add an Aspect Ratio constraint, using the Size Inspector to set its multiplier to 8:5 (the ratio of width to height) and its constant to zero. Then navigate to Editor \ Resolve Auto Layout Issues \ All Views in View Controller \ Update Frames, so Interface Builder updates the layout based on these new constraints.
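If you'd rather see the same layout expressed in code, here's an optional, rough sketch of equivalent constraints built programmatically. The tutorial itself sticks with Interface Builder; this assumes the imageView outlet you'll connect in the next step and would live in viewDidLoad():

// Optional: the same layout built in code instead of Interface Builder.
imageView.setTranslatesAutoresizingMaskIntoConstraints(false)
imageView.contentMode = .ScaleAspectFit

// Pin the top edge to the top layout guide, center horizontally, and match the superview's width.
view.addConstraint(NSLayoutConstraint(item: imageView, attribute: .Top, relatedBy: .Equal,
    toItem: topLayoutGuide, attribute: .Bottom, multiplier: 1, constant: 0))
view.addConstraint(NSLayoutConstraint(item: imageView, attribute: .CenterX, relatedBy: .Equal,
    toItem: view, attribute: .CenterX, multiplier: 1, constant: 0))
view.addConstraint(NSLayoutConstraint(item: imageView, attribute: .Width, relatedBy: .Equal,
    toItem: view, attribute: .Width, multiplier: 1, constant: 0))

// An 8:5 width-to-height aspect ratio constrains the height, just like in Interface Builder.
imageView.addConstraint(NSLayoutConstraint(item: imageView, attribute: .Width, relatedBy: .Equal,
    toItem: imageView, attribute: .Height, multiplier: 8.0/5.0, constant: 0))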

Next, open the Assistant Editor and make sure it’s displaying ViewController.swift. Control-drag from the UIImageView to just after the opening brace of the ViewController class. Name the outlet imageView, and click Connect.
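The connection Xcode creates should add an outlet property near the top of ViewController.swift that looks like this:

@IBOutlet weak var imageView: UIImageView!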

Build and run the project just to make sure everything is good so far – you should just see an empty screen. The initial setup is complete – now onto Core Image!

Basic Image Filtering

You’re going to get started by simply running your image through a CIFilter and displaying it on the screen. Every time you want to apply a CIFilter to an image you need to do four things:

  1. Create a CIImage object. CIImage has several initializers, including CIImage(contentsOfURL:), CIImage(data:), CIImage(CGImage:), and CIImage(bitmapData:bytesPerRow:size:format:colorSpace:). You’ll most likely be working with CIImage(contentsOfURL:).
  2. Create a CIContext. A CIContext can be CPU or GPU based. A CIContext is relatively expensive to initialize so you reuse it rather than create it over and over. You will always need one when outputting the CIImage object.
  3. Create a CIFilter. When you create the filter, you configure a number of properties on it that depend on the filter you’re using.
  4. Get the filter output. The filter gives you an output image as a CIImage – you can convert this to a UIImage using the CIContext, as you’ll see below.

Let’s see how this works. Add the following code to ViewController.swift inside viewDidLoad():

// 1
let fileURL = NSBundle.mainBundle().URLForResource("image", withExtension: "png")

// 2
let beginImage = CIImage(contentsOfURL: fileURL)

// 3
let filter = CIFilter(name: "CISepiaTone")
filter.setValue(beginImage, forKey: kCIInputImageKey)
filter.setValue(0.5, forKey: kCIInputIntensityKey)

// 4
let newImage = UIImage(CIImage: filter.outputImage)
self.imageView.image = newImage

Let’s go over this section by section:

  1. This line creates an NSURL object that holds the path to your image file.
  2. Next you create your CIImage with the CIImage(contentsOfURL:) constructor.
  3. Next you create your CIFilter object. The CIFilter constructor takes the name of the filter; you then configure it by setting keys and values specific to that filter. Each filter has its own unique keys and set of valid values. The CISepiaTone filter takes only two parameters: kCIInputImageKey (a CIImage) and kCIInputIntensityKey (a float between 0 and 1). Here you give the intensity a value of 0.5. Most filters have default values that will be used if no value is supplied. One exception is the input CIImage; this must be provided, as there is no default.
  4. Getting a CIImage back out of a filter is as easy as using the outputImage property. Once you have an output CIImage, you will need to convert it into a UIImage. The UIImage(CIImage:) constructor creates a UIImage from a CIImage. Once you’ve converted it to a UIImage, you just display it in the image view you added earlier.
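Note that this code never creates a CIContext explicitly – the UIImage(CIImage:) constructor handles the rendering for you behind the scenes. If you want the extra control mentioned in step 2 above (for example, so you can create the context once and reuse it), a sketch of the explicit route looks like this and produces the same result:

// Create the context once; it's expensive, so in a real app you'd keep it around and reuse it.
let context = CIContext(options: nil)

// Ask the context to render the filter's output into a CGImage, then wrap that in a UIImage.
let cgimg = context.createCGImage(filter.outputImage, fromRect: filter.outputImage.extent())
self.imageView.image = UIImage(CGImage: cgimg)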

Build and run the project, and you’ll see your image filtered by the sepia tone filter.


Congratulations, you have successfully used CIImage and CIFilters!
