UIGestureRecognizer Tutorial in iOS 5: Pinches, Pans, and More!

Update 7/17/14: We now have a new version of this tutorial fully updated to Swift – check it out!

By Ray Wenderlich


Gratuitous Deceleration

In a lot of Apple apps and controls, when you stop moving something there’s a bit of deceleration as it finishes moving. Think about scrolling a web view, for example. It’s common to want to have this type of behavior in your apps.

There are many ways of doing this, but we’re going to do one very simple implementation for a rough but nice effect. The idea is we need to detect when the gesture ends, figure out how fast the touch was moving, and animate the object moving to a final destination based on the touch speed.

  • To detect when the gesture ends: The callback we hooked up to the gesture recognizer is potentially called multiple times – when the gesture recognizer changes its state to UIGestureRecognizerStateBegan, UIGestureRecognizerStateChanged, or UIGestureRecognizerStateEnded, for example. We can find out which state the gesture recognizer is in simply by looking at its state property.
  • To detect the touch velocity: Some gesture recognizers return additional information – you can look at the API reference to see what's available. There's a handy method called velocityInView: on UIPanGestureRecognizer that we can use here!

So add the following to the bottom of the handlePan method:

if (recognizer.state == UIGestureRecognizerStateEnded) {
    
    CGPoint velocity = [recognizer velocityInView:self.view];
    CGFloat magnitude = sqrtf((velocity.x * velocity.x) + (velocity.y * velocity.y));
    CGFloat slideMult = magnitude / 200;
    NSLog(@"magnitude: %f, slideMult: %f", magnitude, slideMult);
    
    float slideFactor = 0.1 * slideMult; // Increase for more of a slide
    CGPoint finalPoint = CGPointMake(recognizer.view.center.x + (velocity.x * slideFactor), 
                                     recognizer.view.center.y + (velocity.y * slideFactor));
    finalPoint.x = MIN(MAX(finalPoint.x, 0), self.view.bounds.size.width);
    finalPoint.y = MIN(MAX(finalPoint.y, 0), self.view.bounds.size.height);
    
    [UIView animateWithDuration:slideFactor*2 delay:0 options:UIViewAnimationOptionCurveEaseOut animations:^{
        recognizer.view.center = finalPoint;
    } completion:nil];

}

This is just a very simple method I wrote up for this tutorial to simulate deceleration. It takes the following strategy:

  • Figure out the length of the velocity vector (i.e. the magnitude)
  • Divide the magnitude by 200 to get slideMult: if the touch was moving slower than 200 points per second this shrinks the slide, while faster flicks lengthen it.
  • Calculate a final point based on the velocity and the slideFactor.
  • Make sure the final point is within the view’s bounds
  • Animate the view to the final resting place, using the “ease out” animation option to slow the movement over time.

Compile and run to try it out; you should now have some basic but nice deceleration! Feel free to play around with it and improve it – if you come up with a better implementation, please share in the forum discussion at the end of this article.
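For example, one easy refinement (just a sketch, not part of the tutorial's code) is to clamp the final point so the whole image stays on screen, rather than just its center. You could replace the two MIN/MAX lines above with something like this:

// Keep the view's edges on screen, not just its center point
// (ignores any scaling/rotation applied to the view, for simplicity)
CGFloat halfWidth = recognizer.view.bounds.size.width / 2;
CGFloat halfHeight = recognizer.view.bounds.size.height / 2;
finalPoint.x = MIN(MAX(finalPoint.x, halfWidth), self.view.bounds.size.width - halfWidth);
finalPoint.y = MIN(MAX(finalPoint.y, halfHeight), self.view.bounds.size.height - halfHeight);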

UIPinchGestureRecognizer and UIRotationGestureRecognizer

Our app is coming along great so far, but it would be even cooler if you could scale and rotate the image views by using pinch and rotation gestures as well!

Let’s add the code for the callbacks first. Add the following declarations to ViewController.h:

- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer;
- (IBAction)handleRotate:(UIRotationGestureRecognizer *)recognizer;

And add the following implementations in ViewController.m:

- (IBAction)handlePinch:(UIPinchGestureRecognizer *)recognizer {    
    recognizer.view.transform = CGAffineTransformScale(recognizer.view.transform, recognizer.scale, recognizer.scale);
    recognizer.scale = 1;    
}

- (IBAction)handleRotate:(UIRotationGestureRecognizer *)recognizer {    
    recognizer.view.transform = CGAffineTransformRotate(recognizer.view.transform, recognizer.rotation);
    recognizer.rotation = 0;    
}

Just like we could get the translation from the pan gesture recognizer, we can get the scale from the UIPinchGestureRecognizer and the rotation from the UIRotationGestureRecognizer.

Every view has a transform applied to it, which you can think of as information on the rotation, scale, and translation that should be applied to the view. Core Graphics has built-in functions that make working with transforms easy, such as CGAffineTransformScale (to scale a given transform) and CGAffineTransformRotate (to rotate a given transform). Here we just use them to update the view's transform based on the gesture.

Again, since we’re updating the view each time the gesture updates, it’s very important to reset the scale and rotation back to the default state so we don’t have craziness going on.

Now let’s hook these up in the Storyboard editor. Open up MainStoryboard.storyboard and perform the following steps:

  1. Drag a Pinch Gesture Recognizer and a Rotation Gesture Recognizer on top of the monkey. Then repeat this for the banana.
  2. Connect the selector for the Pinch Gesture Recognizers to the View Controller’s handlePinch method. Connect the selector for the Rotation Gesture Recognizers to the View Controller’s handleRotate method.

Compile and run (I recommend running on a device if possible because pinches and rotations are kinda hard to do on the simulator), and now you should be able to scale and rotate the monkey and banana!
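(Tip: in the iOS Simulator, you can hold down the Option key to get a second virtual finger, which makes pinches and rotations possible with the mouse.)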

Scaling and rotating the monkey and banana

Simultaneous Gesture Recognizers

You may notice that if you put one finger on the monkey, and one on the banana, you can drag them around at the same time. Kinda cool, eh?

However, you’ll notice that if you try to drag the monkey around, and in the middle of dragging bring down a second finger to attempt to pinch to zoom, it doesn’t work. By default, once one gesture recognizer on a view “claims” the gesture, no others can recognize a gesture from that point on.

However, you can change this by implementing a method in the UIGestureRecognizerDelegate protocol. Let’s see how it works!

Open up ViewController.h and mark the class as implementing UIGestureRecognizerDelegate as shown below:

@interface ViewController : UIViewController <UIGestureRecognizerDelegate>

Then switch to ViewController.m and implement one of the protocol’s optional methods:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {        
    return YES;
}

This method tells the gesture recognizer whether it’s OK to recognize a gesture when another (given) recognizer has already detected one. The default behavior is as if this returned NO – we switch it to always return YES here.
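If you ever need to be pickier, one common variation (again just a sketch – the always-YES version above is all this tutorial needs) is to allow simultaneous recognition only when both recognizers are attached to the same view, so a gesture on the monkey doesn’t pair up with one on the banana:

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // Only recognize simultaneously when both gestures target the same view
    return gestureRecognizer.view == otherGestureRecognizer.view;
}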

Next, open MainStoryboard.storyboard, and for each gesture recognizer connect its delegate outlet to the view controller.

Compile and run the app again, and now you should be able to drag the monkey, pinch to scale it, and continue dragging afterwards! You can even scale and rotate at the same time in a natural way. This makes for a much nicer experience for the user.

Programmatic UIGestureRecognizers

So far we’ve created gesture recognizers with the Storyboard editor, but what if you wanted to do things programmatically?

It’s just as easy, so let’s try it out by adding a tap gesture recognizer to play a sound effect when either of these image views are tapped.

Since we’re going to play a sound effect, we need to add the AVFoundation.framework to our project. To do this, select your project in the Project navigator, select the MonkeyPinch target, select the Build Phases tab, expand the Link Binary with Libraries section, click the Plus button, and select AVFoundation.framework. At this point your list of frameworks should look like this:

Adding the AVFoundation framework into the project

Open up ViewController.h and make the following changes:

// Add to top of file
#import <AVFoundation/AVFoundation.h>

// Add after @interface
@property (strong) AVAudioPlayer * chompPlayer;
- (void)handleTap:(UITapGestureRecognizer *)recognizer;

And then make the following changes to ViewController.m:

// After @implementation
@synthesize chompPlayer;

// Before viewDidLoad
- (AVAudioPlayer *)loadWav:(NSString *)filename {
    NSURL * url = [[NSBundle mainBundle] URLForResource:filename withExtension:@"wav"];
    NSError * error = nil;
    AVAudioPlayer * player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    if (!player) {
        NSLog(@"Error loading %@: %@", url, error.localizedDescription);
    } else {
        [player prepareToPlay];
    }
    return player;
}

// Replace viewDidLoad with the following
- (void)viewDidLoad
{
    [super viewDidLoad];
    for (UIView * view in self.view.subviews) {
        
        UITapGestureRecognizer * recognizer = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleTap:)];
        recognizer.delegate = self;
        [view addGestureRecognizer:recognizer];
        
        // TODO: Add a custom gesture recognizer too
        
    }

    self.chompPlayer = [self loadWav:@"chomp"];
}

// Add to bottom of file
- (void)handleTap:(UITapGestureRecognizer *)recognizer {    
    [self.chompPlayer play];
}

The audio playing code is outside the scope of this tutorial, so we won’t discuss it (although it is incredibly simple).

The important part is in viewDidLoad. We cycle through all of the subviews (just the monkey and banana image views), create a UITapGestureRecognizer for each, and specify the callback. We set the delegate of the recognizer programmatically, and add the recognizer to the view.

That’s it! Compile and run, and now you should be able to tap the image views for a sound effect!
