Top 10 WWDC 2017 Videos

Wondering which WWDC 2017 videos are the best for developers to watch? Check out our recommended Top 10! By Tim Mitra.

3) What’s New in Cocoa Touch – Session 201

[Video Link]

Presented by Eliza Block and Josh Shaffer, What’s New in Cocoa Touch is a rapid-fire overview of new productivity features, UI refinements and new APIs. Like the Platforms State of the Union, this session leads the way into other, more in-depth sessions. Eliza gives a brief overview of adding drag functionality, moving items around the screen and finally dropping, at which point your app receives the dropped data. Drag and Drop is relatively easy to adopt, as many existing frameworks already have the hooks in place to handle it, as the sketch below suggests.
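
To give a flavor of the API, here’s a minimal sketch of making an image view draggable with iOS 11’s UIDragInteraction. The view controller and outlet names are our own assumptions, not code from the session:

```swift
import UIKit

class PhotoViewController: UIViewController, UIDragInteractionDelegate {
    @IBOutlet var imageView: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach a drag interaction to an existing view.
        imageView.addInteraction(UIDragInteraction(delegate: self))
        imageView.isUserInteractionEnabled = true
    }

    // Provide the items to lift when a drag begins.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let image = imageView.image else { return [] }
        return [UIDragItem(itemProvider: NSItemProvider(object: image))]
    }
}
```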

“What’s new in Cocoa Touch is a great overview of a lot of the changes on iOS” – Ellen Shapiro

Document management is also further refined with file management across Cocoa Touch. With the new UIDocumentBrowserViewController, files can now be accessed independently and organized into folders. In fact, files from one app may even be accessed from other apps. There is an understated push to make the iPad and larger iPhones more flexible through these and other refinements.
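
Presenting the system browser takes only a few lines. This sketch assumes a plain-text app; the content types you pass in would depend on your own documents:

```swift
import UIKit

class DocumentsViewController: UIViewController {
    func showDocumentBrowser() {
        // Browse this app's documents and, via the Files infrastructure,
        // documents exposed by other apps and providers.
        let browser = UIDocumentBrowserViewController(
            forOpeningFilesWithContentTypes: ["public.plain-text"])
        browser.allowsDocumentCreation = true
        present(browser, animated: true)
    }
}
```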

Josh Shaffer covers the new dynamic large titles as part of the UI refinements. The large, prominent title, reminiscent of the News app, lives inside a taller header bar. As you scroll down the page, the header shrinks to the familiar style and size. The Safe Area creates a buffer space around the edges of the device. This clears the edges for gestures, creates a cleaner look and, crucially for tvOS devices, keeps content out of the overscan area. And even better, UIScrollView no longer fights with your content insets! The Design Shorts 2 session and Updating Your App For iOS 11 have more info on these.
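
Opting in takes very little code. A small sketch, assuming a view controller embedded in a UINavigationController with a scroll view and a label already wired up:

```swift
import UIKit

class FeedViewController: UIViewController {
    @IBOutlet var scrollView: UIScrollView!
    @IBOutlet var headlineLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Large title that collapses to the familiar size on scroll.
        navigationController?.navigationBar.prefersLargeTitles = true
        navigationItem.largeTitleDisplayMode = .automatic

        // iOS 11 moves inset handling onto the scroll view itself,
        // replacing automaticallyAdjustsScrollViewInsets.
        scrollView.contentInsetAdjustmentBehavior = .automatic

        // Pin content to the safe area rather than hard-coded edges.
        headlineLabel.topAnchor.constraint(
            equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 16).isActive = true
    }
}
```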

Eliza returns to cover a few new things in Swift 4, such as the new KeyPath type with its new “\” literal, which brings ease and clarity to the new block-based KVO. She also covers the Codable protocol, which enables objects to be archived and unarchived, and brings native JSON encoding to Cocoa Touch apps.
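
Both features compress a lot of boilerplate. A brief sketch, with the types and property names invented for illustration:

```swift
import Foundation

// Codable: the compiler synthesizes encoding and decoding for you.
struct Article: Codable {
    let title: String
    let views: Int
}

// Native JSON encoding, no manual mapping required.
let data = try? JSONEncoder().encode(Article(title: "WWDC 2017", views: 42))

// Block-based KVO with the \ key-path literal. The observed property
// must be @objc dynamic on an NSObject subclass.
class Downloader: NSObject {
    @objc dynamic var progress: Double = 0
}

let downloader = Downloader()
let token = downloader.observe(\.progress, options: [.new]) { _, change in
    print("Progress: \(change.newValue ?? 0)")
}
downloader.progress = 0.5  // prints "Progress: 0.5"
```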

Working with Dynamic Type, which is essential for accessibility, is now easier with the new UIFontMetrics class. Auto Layout works with Dynamic Type to help the system size your fonts. Password autofill is also covered in brief.
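
UIFontMetrics makes custom fonts play nicely with Dynamic Type. A minimal sketch, where the font name is just an example:

```swift
import UIKit

let label = UILabel()
if let baseFont = UIFont(name: "AvenirNext-Regular", size: 17) {
    // Scale the custom font to match the user's Dynamic Type setting.
    label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: baseFont)
}
// Re-scale automatically when the user changes the text size.
label.adjustsFontForContentSizeCategory = true
```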

This session gives you enough information to speak clearly about new features in Asset Catalogs, PDF-backed images, and ProMotion’s support for higher refresh rates on the latest devices.

4) Core ML In Depth – Session 710

[Video Link]

Machine Learning is clearly a hot topic these days and Apple has made it easy to add this technology to your apps.

With Core ML, you can treat machine learning as simply calling a library from code. You only need to drop a Core ML model into your project and let Xcode sort everything else out. In this session, Krishna Sridhar and Zach Nation give overviews of the types of use cases for machine learning in your apps.
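
In practice, it really is a few lines. This sketch assumes a hypothetical image classifier named FlowerClassifier.mlmodel has been dragged into the project; Xcode generates a Swift class of the same name, and the input and output names below are assumptions:

```swift
import CoreML

func classify(pixelBuffer: CVPixelBuffer) {
    // FlowerClassifier is the class Xcode generates from the .mlmodel file.
    let model = FlowerClassifier()
    if let output = try? model.prediction(image: pixelBuffer) {
        print("Prediction: \(output.classLabel)")
    }
}
```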

“It was good, lots of attention of augmented reality and machine learning. Especially when Apple made machine learning plug and play. All you need is to find a model you can use or worry about training your own. Everything else just works!” – Vincent Ngo

You can put Core ML to work with handwriting recognition, credit card analysis, sentiment analysis of text input and gesture recognizers. Krishna demonstrates how Core ML has a natural way of dealing with numeric and categorical input. You can also combine Core ML with Natural Language Processing (NLP) to determine the mood of the user by processing text.
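
As a taste of the NLP side, iOS 11’s NSLinguisticTagger can tokenize and tag text before you hand it to a model. A small sketch (the sample sentence is ours, and feeding the tags into a Core ML sentiment model is left out):

```swift
import Foundation

let text = "I really love this conference!"
let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0)
tagger.string = text

let range = NSRange(location: 0, length: text.utf16.count)
let options: NSLinguisticTagger.Options = [.omitWhitespace, .omitPunctuation]

// Tag each word with its part of speech (noun, verb, adjective, ...).
tagger.enumerateTags(in: range, unit: .word,
                     scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    if let tag = tag {
        let word = (text as NSString).substring(with: tokenRange)
        print("\(word): \(tag.rawValue)")
    }
}
```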

The speakers cover the hardware optimization of Core ML along with Core ML Tools that let you convert and work with popular machine learning formats. You also don’t need to sort out whether your project will use the CPU or GPU. Core ML takes care of that for you.

Zach Nation demonstrates how to use Apple’s open-sourced Core ML Tools, a set of Python scripts that import common machine learning formats and convert them to the Core ML model format.

“CoreML and related are fantastic – great technology and easy to use.” – Mark Rubin

I’m also awarding an “honorable mention” to Introducing Core ML [https://developer.apple.com/videos/play/wwdc2017/703/], which also ranked well. It’s further proof that Core ML seems to be the runaway topic of WWDC 2017!

Note: For even more on Core ML, check out the brand new tutorial, Core ML and Vision: Machine Learning in iOS 11 by Audrey Tam.

5) Advanced Animations with UIKit – Session 230

[Video Link]

Joe Cerra takes you through some basics of UIKit animations, with the aim of helping you make your animations interactive and interruptible. In 2016, Apple introduced UIViewPropertyAnimator, which enables you to do just that. With this class, you can give your animations customized timing as well as update them on the fly. Joe walks through how to adjust timings to create more interesting effects.
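
For context, a minimal UIViewPropertyAnimator with a custom cubic timing curve might look like this sketch (the function and its control points are our own illustration):

```swift
import UIKit

func slideDown(_ view: UIView) {
    // A custom cubic bezier timing curve instead of the stock ease-in/ease-out.
    let animator = UIViewPropertyAnimator(
        duration: 0.8,
        controlPoint1: CGPoint(x: 0.1, y: 0.9),
        controlPoint2: CGPoint(x: 0.9, y: 0.1)) {
            view.center.y += 200
    }
    animator.startAnimation()
}
```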

Joe demonstrates several enhancements to a simple demo animation driven by a pan gesture recognizer, including pauseAnimation() to pause and continueAnimation(withTimingParameters:durationFactor:) to keep an object moving in the view. Midway through the talk, he demonstrates how to combine a number of tap and pan gestures with animation properties to create an interactive and interruptible experience. Building on the effect, he adds new behaviors, linear and nonlinear scrubbing, and pausing, and uses springs to add realism and damping.
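
The pause-scrub-continue pattern looks roughly like this sketch, assuming an animator like the one above stored on the view controller:

```swift
import UIKit

class InteractiveViewController: UIViewController {
    var animator: UIViewPropertyAnimator!  // created elsewhere, as above

    @objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
        switch recognizer.state {
        case .began:
            animator.pauseAnimation()  // freeze the animation mid-flight
        case .changed:
            // Linear scrub: map the pan distance onto animation progress.
            let translation = recognizer.translation(in: view)
            animator.fractionComplete = translation.y / 200
        case .ended:
            // Resume with spring timing for a natural, damped finish.
            let spring = UISpringTimingParameters(dampingRatio: 0.8)
            animator.continueAnimation(withTimingParameters: spring, durationFactor: 0)
        default:
            break
        }
    }
}
```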

Using UIVisualEffectView, Joe combines blur and zoom to create compelling effects that he terms “view morphing”. The final reveal involves new properties for corner radii and masked corners. There are plenty of great tips and tricks covered in the session – way more than I can fit here in a few paragraphs.
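
Per-corner rounding is new in iOS 11 and takes just a couple of lines; a sketch with an assumed card view:

```swift
import UIKit

func roundTopCorners(of cardView: UIView) {
    // iOS 11 lets you round only selected corners on the layer.
    cardView.layer.cornerRadius = 12
    cardView.layer.maskedCorners = [.layerMinXMinYCorner, .layerMaxXMinYCorner]
    cardView.clipsToBounds = true
}
```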

6) What’s New in CareKit and ResearchKit – Session 232

[Video Link]

Samantha Mravca refreshes viewers on ResearchKit and CareKit, as well as how they combine to sit on top of Apple’s HealthKit framework. ResearchKit allows institutions to build tools for gathering medical information and sharing that data with other HealthKit apps. CareKit, introduced in 2016, enables users to play an active role in their own health.
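
ResearchKit structures data collection as tasks presented by a stock view controller. A minimal, hypothetical sketch, with identifiers and copy invented for illustration:

```swift
import ResearchKit

// One instruction step wrapped in an ordered task.
let step = ORKInstructionStep(identifier: "welcome")
step.title = "Daily Check-In"
step.text = "This short survey helps your care team track your progress."

let task = ORKOrderedTask(identifier: "checkIn", steps: [step])

// Present this from a view controller; set its delegate to collect results.
let taskViewController = ORKTaskViewController(task: task, taskRun: nil)
```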

The session covers some new features and the CareKit prototyping tool. There are some really interesting widgets and controls in CareKit to display progress, collect stats and capture optional and read-only data. I wonder if these widgets could find a place in other types of apps.

The speakers cover some existing data collection, or “active task”, examples such as hearing tests, Stroop focus tests and trail making tests for visual attention. New modules include range-of-motion tests that use the accelerometer and gyroscope to measure the motion of shoulders and knees.

CareKit now combines the user’s health data and symptoms into Care Contents. CareKit also includes some ready-to-use glyphs for iOS and watchOS. New this year are threshold measurements including numeric and adherence thresholds.

The session also covers the CareKit prototyping tool, which is targeted at non-technical builders. Ultimately, these tools are designed for health professionals and involve minimal coding, and in some cases none at all. Health care is a fascinating subject that we all have a vested interest in.

Note: For more on CareKit, take a look at Jeff Rames’ two-part CareKit Tutorial for iOS on our site.