Easily Overlooked New Features in iOS 7

Check out this list of easily overlooked features in iOS 7 – there will probably be a few you’ve never heard of!

16. Detect screenshots with UIApplicationUserDidTakeScreenshotNotification

Prior to iOS 7, apps like Snapchat or Facebook Poke used some pretty creative methods to detect when a user took a screenshot. However, iOS 7 provides a brand-new notification for this event: UIApplicationUserDidTakeScreenshotNotification. Just subscribe to it like any other notification to know when a screenshot was taken.
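
Here’s a minimal sketch of how that might look; the userDidTakeScreenshot: selector is just an illustrative name:

// e.g. in viewDidLoad
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(userDidTakeScreenshot:)
                                             name:UIApplicationUserDidTakeScreenshotNotification
                                           object:nil];

// called whenever the user takes a screenshot while your app is in the foreground
- (void)userDidTakeScreenshot:(NSNotification *)notification {
    NSLog(@"Screenshot taken!");
}

// remember to unregister, for example in dealloc
[[NSNotificationCenter defaultCenter] removeObserver:self];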

Note: UIApplicationUserDidTakeScreenshotNotification is posted after the screenshot is taken. Currently there is no way to be notified before a screenshot is taken, which could be useful for hiding an embarrassing photo. Hopefully Apple adds a UIApplicationUserWillTakeScreenshotNotification in iOS 8! :]

17. Implement multi-language speech synthesis

Wouldn’t it be nice if you could make your app speak? iOS 7 introduces two new classes: AVSpeechSynthesizer and AVSpeechUtterance. Together, they can give your app a voice. The really interesting news? There’s a huge selection of available languages, including ones that Siri doesn’t speak, like Brazilian Portuguese!

Using these two classes to provide speech synthesis in your apps is very easy. An AVSpeechUtterance represents what you want to say and how you want to say it; an AVSpeechSynthesizer then speaks it, as shown in the code snippet below:

#import <AVFoundation/AVFoundation.h> // AVSpeechSynthesizer and AVSpeechUtterance live in AVFoundation

AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
AVSpeechUtterance *utterance =
  [AVSpeechUtterance speechUtteranceWithString:@"Wow, I have such a nice voice!"];
utterance.rate = AVSpeechUtteranceMaximumSpeechRate / 4.0f; // slow it down so it's easier to understand
utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-US"]; // omit this line to use the system language
[synthesizer speakUtterance:utterance];

That’s impressive — it only takes five lines of code to add speech to your app!

18. Use the new UIScreenEdgePanGestureRecognizer

UIScreenEdgePanGestureRecognizer inherits from UIPanGestureRecognizer and lets you detect gestures starting near the edge of the screen.

Using this new gesture recognizer is quite simple, as shown below:

UIScreenEdgePanGestureRecognizer *recognizer = [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self action:@selector(handleScreenEdgeRecognizer:)];
recognizer.edges = UIRectEdgeLeft; // accept gestures that start from the left; we're probably building another hamburger menu!
[self.view addGestureRecognizer:recognizer];
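
The handleScreenEdgeRecognizer: method referenced in the selector above isn’t part of UIKit; here’s a rough sketch of what such a handler might look like:

- (void)handleScreenEdgeRecognizer:(UIScreenEdgePanGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // the pan started at the left edge of the screen; start revealing your menu here
        NSLog(@"Screen edge pan from the left detected!");
    }
}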

19. Recreate Messages.app behavior with UIScrollViewKeyboardDismissMode

Dismissing the keyboard while you scroll is such a nice experience in Messages.app. However, building this behavior into your own apps can be tough. Luckily, Apple added the handy property keyboardDismissMode on UIScrollView to make your life a little easier.

Now your app can behave like Messages.app just by changing a single property on your Storyboard, or alternatively by adding one line of code!

This property uses the new UIScrollViewKeyboardDismissMode enum. The possible values of this enum are as follows:

UIScrollViewKeyboardDismissModeNone        // the keyboard is not dismissed automatically when scrolling
UIScrollViewKeyboardDismissModeOnDrag      // dismisses the keyboard when a drag begins
UIScrollViewKeyboardDismissModeInteractive // the keyboard follows the dragging touch off screen, and may be pulled upward again to cancel the dismiss
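
In code, that single line might look like the following; self.tableView here is just a stand-in for whatever scroll view, table view or collection view you’re using:

// the keyboard now follows your finger off screen as you drag, just like in Messages.app
self.tableView.keyboardDismissMode = UIScrollViewKeyboardDismissModeInteractive;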

Here’s the Storyboard property to change to dismiss the keyboard on scroll:

You don’t even have to code to use UIScrollViewKeyboardDismissMode!

20. Detect blinks and smiles with CoreImage

iOS 7 adds two new face detection options to Core Image: CIDetectorEyeBlink and CIDetectorSmile. In plain English, that means you can now detect smiles and blinks in a photo! Unfortunately, it also means that iOS 7 can now get its feelings hurt.

Here’s an example of how you could use it in your app:

#import <CoreImage/CoreImage.h> // CIDetector, CIImage and CIFaceFeature live in Core Image

UIImage *image = [UIImage imageNamed:@"myImage"];
// UIImage's CIImage property is nil for CGImage-backed images (such as those loaded
// with imageNamed:), so create a CIImage explicitly
CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];

CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:@{CIDetectorAccuracy: CIDetectorAccuracyHigh}];

NSDictionary *options = @{ CIDetectorSmile: @YES, CIDetectorEyeBlink: @YES };

NSArray *features = [detector featuresInImage:ciImage options:options];

for (CIFaceFeature *feature in features) {
    NSLog(@"Bounds: %@", NSStringFromCGRect(feature.bounds));

    if (feature.hasSmile) {
        NSLog(@"Nice smile!");
    } else {
        NSLog(@"Why so serious?");
    }
    if (feature.leftEyeClosed || feature.rightEyeClosed) {
        NSLog(@"Open your eyes!");
    }
}

21. Add links to UITextViews

Creating your own Twitter client just got easier on iOS 7 — now you’re able to add a link to an NSAttributedString and invoke a custom action when it’s tapped.

First, create an NSAttributedString and add an NSLinkAttributeName attribute to it, as shown below:

NSMutableAttributedString *attributedString = [[NSMutableAttributedString alloc] initWithString:@"This is an example by @marcelofabri_"];
[attributedString addAttribute:NSLinkAttributeName
                         value:@"username://marcelofabri_"
                         range:[[attributedString string] rangeOfString:@"@marcelofabri_"]];


NSDictionary *linkAttributes = @{NSForegroundColorAttributeName: [UIColor greenColor],
                                 NSUnderlineColorAttributeName: [UIColor lightGrayColor],
                                 NSUnderlineStyleAttributeName: @(NSUnderlineStyleSingle | NSUnderlinePatternSolid)}; // combine a style with the pattern, or no underline is drawn

// assume that textView is a UITextView previously created (either by code or Interface Builder)
textView.editable = NO; // links are only tappable when the text view is selectable and not editable
textView.linkTextAttributes = linkAttributes; // customizes the appearance of links
textView.attributedText = attributedString;
textView.delegate = self;

That makes a link appear in the body of your text. However, you can also control what happens when the link is tapped by implementing the new textView:shouldInteractWithURL:inRange: method of the UITextViewDelegate protocol, like so:

- (BOOL)textView:(UITextView *)textView shouldInteractWithURL:(NSURL *)URL inRange:(NSRange)characterRange {
    if ([[URL scheme] isEqualToString:@"username"]) {
        NSString *username = [URL host]; 
        // do something with this username
        // ...
        return NO;
    }
    return YES; // let the system open this URL
}

Where to Go From Here?

Wow! That’s a ton of new features; some you may already be familiar with, but some of them are probably news to you, as they were to me.

If you want to learn even more about the changes under the hood in iOS 7, I recommend taking a look at the following resources:

Have you found any other hidden gems in iOS 7? If so, come join the forum discussion and share your discoveries with everyone!