Google I/O Reactions: What is New with the Google Assistant

By Jenn Bailey

I was very excited and honored this year to be chosen to attend the 2019 Google I/O conference at the Shoreline Amphitheater in Mountain View, California. One of the technologies I was most excited to hear about was the Google Assistant. The Google Assistant is a virtual assistant created by Google that has grown […]


Utilizing the Assistant in Your Android App

Assistant integration for an Android app includes App Actions, Slices and Conversational Actions. Utilizing App Actions can really help expand the reach of your app. The most important reason to use them is to increase user re-engagement: apps often get buried in a long list of user-installed apps, and an Action that deep links from the Assistant into a specific feature increases the chance of the user discovering that feature. It makes engaging with your app easier and more convenient.

Adding Actions to an App

It doesn’t take a lot of developer effort to add Actions to an app. All a developer has to do is add an actions.xml file to the res/xml directory of the app. This file contains action blocks that represent the App Actions. Within each action block there can be one or more fulfillment mechanisms that map the action to a fulfillment intent, and the fulfillment can contain parameters to be extracted from the user’s request. Google uses its own natural language processing to match requests to the appropriate actions, so the developer does not have to worry about that, and the intents can be built using the Dialogflow console.
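To make that concrete, here is a minimal sketch of what an actions.xml file might look like. The START_EXERCISE built-in intent is real; the URL template and parameter names are placeholders for illustration:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/actions.xml: registers App Actions for this app.
     The intent name is a built-in intent; the urlTemplate and
     parameter names below are hypothetical placeholders. -->
<actions>
    <action intentName="actions.intent.START_EXERCISE">
        <!-- Deep link fulfillment: the Assistant fills in the
             exercise name it extracts from the user's request. -->
        <fulfillment urlTemplate="https://example.com/start{?exerciseType}">
            <parameter-mapping
                intentParameter="exercise.name"
                urlParameter="exerciseType" />
        </fulfillment>
    </action>
</actions>
```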

Slicing the slices

Slices are the successor to app widgets in Android. By simply making a small change in the actions.xml file to designate that fulfillment be carried out via a Slice, the developer can have the result shown directly in the Assistant. Slices are essentially the visual representation and enhancement of an app feature: they display rich, dynamic and interactive content.
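The Slice itself is served by a SliceProvider in the app. Below is a minimal Kotlin sketch using the androidx slice-builders-ktx DSL; the provider name, activity, icon and run-stats content are all hypothetical:

```kotlin
import android.app.PendingIntent
import android.content.Intent
import android.net.Uri
import androidx.core.graphics.drawable.IconCompat
import androidx.slice.Slice
import androidx.slice.SliceProvider
import androidx.slice.builders.ListBuilder
import androidx.slice.builders.SliceAction
import androidx.slice.builders.list
import androidx.slice.builders.row

// Hypothetical provider that surfaces a run-stats feature as a Slice.
// The Assistant binds to it via the content URI referenced in actions.xml.
class RunStatsSliceProvider : SliceProvider() {

    override fun onCreateSliceProvider(): Boolean = true

    override fun onBindSlice(sliceUri: Uri): Slice? {
        val context = context ?: return null
        // Tapping the Slice deep links into the app.
        // MainActivity and ic_run are hypothetical app resources.
        val openAppAction = SliceAction.create(
            PendingIntent.getActivity(
                context, 0, Intent(context, MainActivity::class.java), 0
            ),
            IconCompat.createWithResource(context, R.drawable.ic_run),
            ListBuilder.ICON_IMAGE,
            "Open run stats"
        )
        // A single-row Slice; a real app would pull live data here.
        return list(context, sliceUri, ListBuilder.INFINITY) {
            row {
                title = "Calories burned today"
                subtitle = "320 kcal" // placeholder value
                primaryAction = openAppAction
            }
        }
    }
}
```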

Finding ‘toothbrush’ moments

There were many talks about conversational design for good Actions, and one of the terms I heard often was ‘toothbrush’ moments. The point of utilizing Actions in an app is not to create voice actions for every feature of the app. Rather, it is best to find those features the user will want to reach for while near a Google Assistant enabled device. ‘Start a run’ is a good example of a toothbrush moment, as is ‘How many calories have I burned?’ This was demonstrated during the talk with the Nike Run Club app.

Conversing through Actions

One limitation of App Actions and Slices is that they are only available on devices where the app is installed. Because many devices have the Assistant, some not even Android based, Conversational Actions exist to be universal across all of them. Many talks focused on building Conversational Actions, including one called Designing Quality Actions for the Google Assistant.
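For a taste of what building one looks like, here is a minimal fulfillment webhook sketch using the actions-on-google Node.js client library with Dialogflow; the intent name and response text are made up for illustration:

```javascript
// Minimal Dialogflow fulfillment webhook for a Conversational Action.
// The intent name and response text are hypothetical.
const { dialogflow } = require('actions-on-google');
const functions = require('firebase-functions');

const app = dialogflow();

// Handles a custom Dialogflow intent named 'start_run'.
app.intent('start_run', (conv) => {
  conv.ask('Okay, starting your run. Have fun out there!');
});

// Expose the webhook as a Cloud Function for Firebase.
exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
```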

With the Android developer’s options covered, I was curious about building games with Interactive Canvas.

Building Interactive Experiences with Interactive Canvas

Interactive Canvas is a way to use HTML, CSS and JavaScript to build rich, interactive experiences for the Google Assistant. It can be used to build full-screen visuals and custom animations, and there was a talk specifically about creating games with it. The Home Hub is a great target for these interactive Actions. Rich responses are used in conjunction with the URL of a web application to let the developer create an immersive experience. The developer uses Dialogflow to create the custom conversations the user will have with the Action, retaining complete control over the conversational flow. Then, using the Interactive Canvas API, the developer gets pixel-level control over the display for games, where any HTML, any CSS and any JavaScript can be run.
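Here is a rough sketch of the two halves, assuming the actions-on-google library on the fulfillment side and the Interactive Canvas library on the web page; the URL and intent name are placeholders:

```javascript
// Fulfillment side (Node.js): attach a web app to the conversation
// with an HtmlResponse. The URL and intent name are placeholders.
const { dialogflow, HtmlResponse } = require('actions-on-google');

const app = dialogflow();

app.intent('welcome', (conv) => {
  conv.ask("Welcome! Let's play.");
  conv.ask(new HtmlResponse({ url: 'https://example.com/canvas/index.html' }));
});

// Web app side (the page loaded on the device's display): the
// interactiveCanvas global comes from the Canvas JS library included
// on the page. Register callbacks to react as the conversation runs.
const callbacks = {
  onUpdate(data) {
    // 'data' is state sent from fulfillment; a game would update
    // its scene here.
    console.log('canvas state update', data);
  },
};
interactiveCanvas.ready(callbacks);
```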

Providing an SDK for the smart home

One of the biggest announcements at the Google I/O keynote was the Nest Hub Max. However, one of the lesser-known announcements that could have a huge impact on developers is the Local Home SDK, sketched at the end of this section. It was demonstrated in the sandbox demos with a visual representation of toy trains. Basically, it allows home devices like smart speakers and smart displays to send requests to third-party gadgets such as lights, thermostats and cameras over the local network instead of via the cloud. This would be great for those days when the internet is acting up!

Google has also made it easier to set up equipment such as GE smart lights using the Google Home app, which will streamline device setup for the consumer. Google is releasing 16 new device types and three new device traits for developers of smart home Actions. It was also announced that there would be more details about the Google Assistant Connect platform later this year. This is the program that allows smart home appliance developers to add the Assistant to their devices easily and at a low cost. Google says it has been working to develop products through this program with Anker, Leviton and Tile.
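As a rough sketch of the Local Home SDK’s shape, based on the developer preview shown at I/O (the handler helpers are hypothetical stubs):

```javascript
// Local Home SDK sketch: this code runs inside the Assistant's local
// execution environment on a smart speaker or display. The handler
// bodies are hypothetical stubs.
const app = new smarthome.App('1.0.0');

app
  .onIdentify((request) => {
    // Called when a device is discovered on the local network;
    // returns a response describing which device this is.
    return identifyHandler(request); // hypothetical helper
  })
  .onExecute((request) => {
    // Called to carry out a command (turn on a light, say) directly
    // over the LAN instead of through the cloud.
    return executeHandler(request); // hypothetical helper
  })
  .listen()
  .then(() => console.log('Local Home app ready'));
```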

Where to go from here?

This was an exciting Google I/O for those interested in the Assistant. There were talks for everyone concerned with the Assistant, including web developers, Android developers, hardware developers and, most important of all, us, the consuming public. For the developer, there were many talks on how to build great Actions in a variety of contexts. If you are interested in any of these talks or in seeing a tour of the sandbox demos, check out the links below!

Assistant Related Talks from Google I/O 2019 can be found here.
A Tour of the Google Assistant Sandbox Demo Tent can be found here.

Contributors

Vijay Sharma (Final Pass Editor)
