The third session of Google I/O ’19, What’s New in Android, did not disappoint: Google presented a long list of new Android features, along with reveals and plans for the Android framework.
With Android Q in beta 3, Chet Haase, Dan Sandler and Romain Guy had plenty to show the crowd. While there was no big announcement comparable to the Kotlin announcement two years ago, there was still plenty of excitement.
It’s clear Google is listening to end users and developers when making decisions for the future. It’s looking at patterns developers are using and improving them for everyone.
In their own words, they are:
- Noticing a UI paradigm in the wild.
- Making it safer and more reusable.
- Adding it to the framework.
That’s the core of what the Android team has been doing for a while now, and it has made some cool changes in the process. Things are about to become much easier for us all because of the attention the team pays to developers.
Let’s dig into some of the things the speakers discussed.
Android Q is around the corner and you’ve got to see the improvements that are coming with it.
One improvement makes notifications better for end users: you can now mark notifications as either gentle or interruptive with a swipe gesture. This makes multi-tasking easier and shows the right notifications at the right time.
Gentle notifications appear at the bottom of the screen and don’t pop up to get your attention. Interruptive notifications display on top of other notifications to grab your attention. This is a great way to distinguish your most important alerts from those that are less important.
Priority and Gentle Notifications
To promote digital wellbeing, Google is also introducing the idea of gentle versus priority notifications. After all, if everything is a priority, nothing is.
Priority notifications cause a pop-up and are for your most important notifications. Good examples of priority notifications include communications from a person, events or alarms.
You can request priority notifications for your app. However, end users ultimately choose how they want to categorize your notification.
Users can get things done more efficiently by using notification actions. Knowing this, Google is introducing automatically generated actions for notifications. This includes replying using text.
Google has automatically enabled this for MessagingStyle notifications. You can also opt in for other types of notifications.
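As a rough sketch of that opt-in (the channel ID, title and text here are hypothetical), a non-messaging notification can ask the system to suggest contextual actions via AndroidX:

```kotlin
import android.app.Notification
import android.content.Context
import androidx.core.app.NotificationCompat

// Hypothetical channel ID and strings, for illustration only.
fun buildTripNotification(context: Context): Notification =
    NotificationCompat.Builder(context, "trips_channel")
        .setSmallIcon(android.R.drawable.ic_dialog_info)
        .setContentTitle("Trip booked")
        .setContentText("Meet at the station at 6 pm")
        // Ask the system to generate contextual actions (smart replies, etc.).
        .setAllowSystemGeneratedContextualActions(true)
        .build()
```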
Google saw people building Bubbles and decided to create a safer API for everyone. Bubbles tie into the notification system and float on top of other app content for maximum multitasking. They’re an alternative to using SYSTEM_ALERT_WINDOW, which will eventually be fully deprecated. You can start using them in Android Q!
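A minimal sketch of the Q-era bubble setup, assuming a hypothetical BubbleActivity that renders the expanded bubble and a hypothetical channel ID:

```kotlin
import android.app.Notification
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import android.graphics.drawable.Icon

// BubbleActivity and "chat_channel" are hypothetical names for this sketch.
fun bubbleNotification(context: Context): Notification {
    val target = PendingIntent.getActivity(
        context, 0,
        Intent(context, BubbleActivity::class.java),
        PendingIntent.FLAG_UPDATE_CURRENT
    )
    val metadata = Notification.BubbleMetadata.Builder()
        .setIntent(target)        // content shown when the bubble is expanded
        .setIcon(Icon.createWithResource(context, android.R.drawable.ic_dialog_email))
        .setDesiredHeight(600)
        .build()
    return Notification.Builder(context, "chat_channel")
        .setSmallIcon(android.R.drawable.ic_dialog_email)
        .setBubbleMetadata(metadata)
        .build()
}
```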
There are two new options for viewing and interacting with your devices. These are both optional to the user.
Dark Theming is available for end users on Android Q. It’s no longer time based, so the user can turn it on at will.
Google recommends you prepare your app for dark UI by using themes, but they provide other ways as well. You don’t want to be the only app without a dark theme!
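If your app uses AppCompat, the simplest preparation is a DayNight theme plus one line of Kotlin to follow the system setting:

```kotlin
import androidx.appcompat.app.AppCompatDelegate

// Follow the system-wide dark theme toggle.
// Requires your app theme to extend Theme.AppCompat.DayNight.
AppCompatDelegate.setDefaultNightMode(AppCompatDelegate.MODE_NIGHT_FOLLOW_SYSTEM)
```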
Gesture Navigation lets you replace the three navigation buttons at the bottom of your screen with gestures. Fewer buttons mean more screen space for your content. Start paying attention to the gesture areas at the sides and bottom of the screen when considering draggable content, as Google advises.
Interacting With the Framework
The APIs you work with every day got some changes to make things easier and safer for everyone.
There’s a new and improved share sheet and sharing API! The share sheet has a content preview that supports images and text. The copy to clipboard option is now at the top so you don’t have to look for it.
Another great addition is that the new sharing API doesn’t need to start your app.
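From an Activity, feeding the new content preview is as simple as adding a title to a standard share intent (the strings here are placeholders):

```kotlin
import android.content.Intent

// EXTRA_TITLE populates the share sheet's new content preview.
val sendIntent = Intent(Intent.ACTION_SEND).apply {
    type = "text/plain"
    putExtra(Intent.EXTRA_TEXT, "Check out What's New in Android!")
    putExtra(Intent.EXTRA_TITLE, "What's New in Android")
}
startActivity(Intent.createChooser(sendIntent, null))
```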
Audio Playback Capture
Audio Playback Capture lets you capture audio from other apps by using AudioPlaybackCaptureConfiguration. Don’t worry about other apps capturing your audio: you can opt in or out per audio stream or application. Check it out!
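A sketch of wiring the capture configuration into an AudioRecord, assuming you’ve already obtained a MediaProjection grant from the user (not shown):

```kotlin
import android.media.AudioAttributes
import android.media.AudioFormat
import android.media.AudioPlaybackCaptureConfiguration
import android.media.AudioRecord
import android.media.projection.MediaProjection

// `projection` must come from a MediaProjection permission flow.
fun buildCaptureRecorder(projection: MediaProjection): AudioRecord {
    val config = AudioPlaybackCaptureConfiguration.Builder(projection)
        .addMatchingUsage(AudioAttributes.USAGE_MEDIA) // only capture media streams
        .build()
    val format = AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
        .setSampleRate(44100)
        .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
        .build()
    return AudioRecord.Builder()
        .setAudioFormat(format)
        .setAudioPlaybackCaptureConfig(config)
        .build()
}
```

To opt your own app’s audio out of capture, you can set `android:allowAudioPlaybackCapture="false"` in the manifest.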
If you use external storage in your app, you’ll need to be aware of the new access restrictions. Starting with apps targeting Q, you’ll be sandboxed by default. There are also changes in how you access media files and photo metadata. You can still make your content readable by other apps, to stay compatible with apps that already rely on shared storage, such as file managers.
Users feel safer if they know what data an app is collecting and when – especially location data. Now, the user can permit or revoke permission to access location when the app is in the foreground or always. You’ll need to add a new permission to get the location while the app is in the background. It’s all a part of the new security changes for the location API.
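Requesting the new background permission might look like the following sketch, where the request code is an arbitrary placeholder:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Arbitrary request code for this sketch.
private const val REQUEST_LOCATION = 42

fun AppCompatActivity.ensureBackgroundLocation() {
    val granted = ContextCompat.checkSelfPermission(
        this, Manifest.permission.ACCESS_BACKGROUND_LOCATION
    ) == PackageManager.PERMISSION_GRANTED
    if (!granted) {
        // Request foreground and background location together;
        // the user decides in the new system dialog.
        ActivityCompat.requestPermissions(
            this,
            arrayOf(
                Manifest.permission.ACCESS_FINE_LOCATION,
                Manifest.permission.ACCESS_BACKGROUND_LOCATION
            ),
            REQUEST_LOCATION
        )
    }
}
```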
Background Activity Starts
Speaking of restrictions, there are new restrictions on when you can start an Activity from the background. In short, you need to involve the user.
There are many ways you can do this such as pending intents or broadcasts. Google recommends a few approaches, one being using a notification when it makes sense.
You can use a Settings Panel within your app to let users change settings important to the app itself, depending on what the user is currently doing. This includes settings for the Internet, NFC, volume and Wi-Fi. They’re as simple to use as launching an Intent!
Here’s one last restriction for you: apps can no longer enable or disable Wi-Fi. Fortunately, you have Settings Panels, so the user can take care of this for you within your app! It’s a good tradeoff for more security and privacy.
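For example, from an Activity you can float the Wi-Fi panel over your app and let the user toggle it themselves:

```kotlin
import android.content.Intent
import android.provider.Settings

// Opens the Wi-Fi settings panel on top of the current Activity.
// Other panels: ACTION_INTERNET_CONNECTIVITY, ACTION_NFC, ACTION_VOLUME.
startActivityForResult(Intent(Settings.Panel.ACTION_WIFI), 0)
```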
In addition to all that, Q is coming with some more improvements for the platform.
WebView, Accessibility and Text
WebView got some improvements as well. One is a new listener that tells your app when the WebView has become unresponsive, so it can react.
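Hooking up that listener could look like this sketch, where `webView` is an existing WebView instance:

```kotlin
import android.webkit.WebView
import android.webkit.WebViewRenderProcess
import android.webkit.WebViewRenderProcessClient

webView.setWebViewRenderProcessClient(object : WebViewRenderProcessClient() {
    override fun onRenderProcessUnresponsive(
        view: WebView, renderer: WebViewRenderProcess?
    ) {
        // Renderer is stuck: e.g. show a progress indicator,
        // or terminate the renderer if this keeps firing.
    }

    override fun onRenderProcessResponsive(
        view: WebView, renderer: WebViewRenderProcess?
    ) {
        // Back to normal; hide any indicator you showed.
    }
})
```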
Accessibility got a one-liner for accessibility actions and an option to vary duration for transient UI.
Text also got a few improvements including an easier API for finding system fonts. Hyphenation is now turned off by default for better performance. Additionally, you can now customize the text magnifier!
Private API Restrictions
To keep our users safer and free from unpredictable crashes, Google is adding more restrictions on calling private APIs. These can change and cause problems for users.
In most cases, a public alternative already exists, whether that’s a formerly private API made public or a newly added one. Google is also adding private methods to a greylist for future restriction.
Google has added several solutions. If you don’t see what you need, contact them to figure out a solution.
Google added more improvements to the Android Runtime (ART)! Along with startup improvements, there’s a generational garbage collector which collects new objects first. This makes using temporary objects cheaper and faster since most of the data used in Android is short-term.
PowerManager can now notify you when a device goes into thermal throttling. If the device is getting hot, you can adapt and do less work, which creates a better user experience.
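A minimal sketch of listening for thermal status changes:

```kotlin
import android.content.Context
import android.os.PowerManager

fun watchThermalStatus(context: Context) {
    val pm = context.getSystemService(PowerManager::class.java)
    pm.addThermalStatusListener { status ->
        if (status >= PowerManager.THERMAL_STATUS_SEVERE) {
            // Device is throttling hard: pause background work,
            // lower rendering quality, etc.
        }
    }
}
```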
Neural Networks API
There are improvements to the Neural Networks API, with 60 new ops. There’s also reduced latency: up to nine times faster face detection!
Users are more secure with TLS 1.3, now enabled by default, which also makes connections much faster. The biometrics dialog now allows you to request implicit confirmation.
With this dialog, you can also fall back to a passcode if there’s an issue with the biometric recognition. All this is in addition to a new Jetpack Security library, one of many new changes to the Jetpack libraries in general.
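Using the AndroidX biometric library, configuring both behaviors is a couple of builder calls (the title string is a placeholder):

```kotlin
import androidx.biometric.BiometricPrompt

// Prompt configuration only; authentication callbacks not shown.
val promptInfo = BiometricPrompt.PromptInfo.Builder()
    .setTitle("Unlock your account")
    .setConfirmationRequired(false)   // implicit confirmation for passive biometrics
    .setDeviceCredentialAllowed(true) // fall back to PIN/pattern/password
    .build()
```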
Keep in mind that going forward, android.preference is deprecated. You should use androidx.preference instead, so be sure to migrate!
Developing for Android is increasingly Kotlin first. This is another example of the team listening to the community. For example, new APIs in Q all have nullability annotations. Nullability is enforced as errors instead of warnings with Android Q.
On top of this, the team added support for coroutines in Jetpack libraries and Kotlin first libraries. You can now use WorkManager, LiveData, Room and other parts of the Jetpack libraries with support for coroutines. This brings easier cancellation and threading to those libraries.
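In Room, for instance, marking a DAO method as `suspend` is all it takes to get a main-safe, cancellable query (the entity here is a hypothetical example):

```kotlin
import androidx.room.Dao
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.Query

// Hypothetical table for illustration.
@Entity(tableName = "user")
data class User(@PrimaryKey val id: Long, val name: String)

@Dao
interface UserDao {
    // Room generates a main-safe suspending implementation.
    @Query("SELECT * FROM user WHERE id = :id")
    suspend fun findById(id: Long): User?
}
```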
As mentioned, Google added some upgrades and new additions to Jetpack, the collection of libraries meant to replace the old support libraries.
There are new versions of both WorkManager and Navigation. You’ll also see coroutine support coming to Lifecycle, LiveData and Room.
There’s a new, easier-to-use API for the camera called CameraX. It’s more concise for common features and behaves more consistently across Android devices. It’s also backwards compatible down to Android L.
Additionally, Google is working with manufacturers to include device specific API extensions.
This is one of the shiny, new things you get this year! Jetpack Compose is a library which simplifies UI development. It’s a reactive, Kotlin tool used for creating and managing the UI. Like Flutter, with Jetpack Compose you can build the UI fully declaratively, without XML!
The much-needed ViewPager2, now in alpha, is like ViewPager, but better. It’s based on RecyclerView and migrating from ViewPager is easy. It includes features such as right-to-left (RTL) support, vertical paging and improved dataset change notifications. You can check it out already!
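Because it’s RecyclerView-based, setup is just handing it an adapter; a sketch with a hypothetical adapter:

```kotlin
import androidx.recyclerview.widget.RecyclerView
import androidx.viewpager2.widget.ViewPager2

// Any RecyclerView.Adapter works; `pagesAdapter` is hypothetical.
fun setUpPager(pager: ViewPager2, pagesAdapter: RecyclerView.Adapter<*>) {
    pager.adapter = pagesAdapter
    pager.orientation = ViewPager2.ORIENTATION_VERTICAL // new: vertical paging
}
```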
View bindings will help you manage your view references. These bindings, generated from your XML layout files, have null-safe and type-safe fields without the need for an annotation processor. They also support a Kotlin-synthetic-like syntax, so they’re very easy to use.
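A sketch of the pattern in an Activity, where ActivityMainBinding is the class generated from a hypothetical activity_main.xml containing a `titleText` view:

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

class MainActivity : AppCompatActivity() {
    // ActivityMainBinding is generated from activity_main.xml (hypothetical layout).
    private lateinit var binding: ActivityMainBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        binding = ActivityMainBinding.inflate(layoutInflater)
        setContentView(binding.root)
        binding.titleText.text = "Hello, view binding!" // null- and type-safe
    }
}
```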
Graphics and Media
Because of limited time, the presenters sped through this part of the presentation. It’s worth spending some time here, though: there are plenty of tools to make your graphics more efficient.
The new android.graphics.BlendMode API replaces the confusing PorterDuff.Mode. PorterDuff still works, but you might want to look into BlendMode, which adds new modes such as SOFT_LIGHT.
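Switching a Paint over is a one-liner:

```kotlin
import android.graphics.BlendMode
import android.graphics.Paint

// BlendMode replaces PorterDuff xfermodes on Android Q and up.
val paint = Paint().apply {
    blendMode = BlendMode.SOFT_LIGHT
}
```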
Android’s UI toolkit team uses RenderNode internally in its views for hardware acceleration. Now Google is exposing it for your use as well.
It provides efficient rendering which allows you to set position, alpha, offset and more. You can even use it to create shadows without using views!
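A rough sketch of the record-once, draw-many pattern (sizes and the node name are placeholders):

```kotlin
import android.graphics.RenderNode

// Record drawing commands once; replay them cheaply on every frame.
val node = RenderNode("card").apply {
    setPosition(0, 0, 400, 300)
    alpha = 0.9f
}
val recordingCanvas = node.beginRecording()
// ...issue normal Canvas draw calls on recordingCanvas here...
node.endRecording()
// Later, inside a hardware-accelerated View's onDraw(canvas):
//   if (canvas.isHardwareAccelerated) canvas.drawRenderNode(node)
```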
The UI toolkit team also uses HardwareRenderer internally. It renders a scene of RenderNodes to a Surface and lets you control the light source for Material shadows. You can check it out now!
In another cool update, Bitmap can now wrap a HardwareBuffer. This means you can have frequent bitmap updates without texture uploads, and you can use a Surface as a Bitmap in your UI.
Vulkan and ANGLE
All new 64-bit devices now require Vulkan 1.1. Android Q also ships with the experimental ANGLE, which is OpenGL ES running on top of Vulkan. This will let Google deliver OpenGL ES driver updates via an APK. If you’re wondering how it works, head over to the official docs!
Wide Color Gamut
Finally, there are changes to the wide color gamut support introduced in Android O. To improve performance and battery life, Google changed it from sixteen-bit to eight-bit color depth. The company also expanded the related APIs.
Where to Go From Here?
We’re so excited about all the features that are coming to both Android and Google products! Keep an eye out for tutorials and articles on these topics. While you wait, you can watch the full Google I/O ’19 What’s New in Android recording.
What are you excited about from Google I/O ’19? Join us in the forum with any comments or questions.