Hey, people!
A few weeks ago, I created an Alexa Skill that provides information about a VR MMORPG called Zenith.
Markkop / essence-helper-alexa-skill
An Alexa Skill that provides information about ZenithVR MMORPG
The problem I was trying to solve with this skill was finding out more about the game without having to remove the headset while playing.
People on Reddit liked it and suggested creating the same skill but for Google Assistant.
I had no idea how to make one, so I took this opportunity to challenge myself and share my learnings in a blog post.
So shall we?
Setting Up
This Quick Start guide is pretty much what you need to get everything ready for development.
However, here are some of the troubles I ran into and the solutions I found:
- If you try to install the `gactions` CLI using the zipped file, you might get the "This app is blocked" screen. Instead, you can install it using the npm package, which is even easier:
```
npm i -g @assistant/gactions
```
- You'll need to enable both the Cloud Functions and the Cloud Build APIs (see the `gcloud` sketch after this list). It took me a while to notice that the error returned by `gactions deploy preview` was different after enabling the Cloud Functions API.
```json
[ERROR] Server did not return HTTP 200.
{
  "error": {
    "code": 403,
    "message": "Asset 'webhooks/ActionsOnGoogleFulfillment' cannot be deployed. [Cloud Functions API has not been used in project 402567933069 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/cloudfunctions.googleapis.com/overview?project=402567933069 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.]"
  }
}

[ERROR] Server did not return HTTP 200.
{
  "error": {
    "code": 403,
    "message": "Asset 'webhooks/ActionsOnGoogleFulfillment' cannot be deployed. [Build failed: Cloud Build API has not been used in project 402567933069 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/cloudbuild.googleapis.com/overview?project=402567933069 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.]"
  }
}
```
- Also, make sure to enable the Cloud APIs in the correct project. A new cloud project with the same `projectId` you've set in `settings.yaml` should be created in the Cloud Dashboard.
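If you prefer the command line, both APIs can also be enabled with a single `gcloud` command. This is just a sketch, assuming you have the Cloud SDK installed and `<PROJECT_ID>` is the same ID you've set in `settings.yaml`:
```
gcloud services enable cloudfunctions.googleapis.com cloudbuild.googleapis.com --project <PROJECT_ID>
```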
Using a local editor
If you followed the quick start guide mentioned above, you now have an Interactive Canvas sample project on your computer.
actions-on-google / actions-builder-canvas-nodejs
Interactive Canvas sample (using Actions Builder) in Node.js
Whenever you run `gactions push` inside the `sdk` folder, the project in the Actions Console will be updated as well. You can also pull changes made in the console to the project by running `gactions pull`.
To read more about the `gactions` CLI, take a look here.
Remember that you have to run `gactions deploy preview` to be able to test your changes on the testing page.
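In practice, the loop I ended up with looks roughly like this, using the same `gactions` commands mentioned above (run from the project root):
```
cd sdk
gactions push            # sync local changes to the Actions Console
gactions deploy preview  # create a preview version for the test page
gactions pull            # bring changes made in the console back to the local files
```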
To develop your action, you'll probably be going back and forth between the Actions Builder in the web console and your local code. You can certainly stick with only one, but sometimes you'll want to visualize and edit the app's flow in the Actions Console, and other times you'd rather just write code in your favorite editor.
Creating the action
The way Google Actions work is by having scenes that trigger intents, which in turn call webhook handlers.
The documentation can give you the details, but the most important point here is that you'll need to add some handlers via `app.handle()` in the webhook code and reference them in your scenes.
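To make this concrete, here's a minimal fulfillment sketch using `@assistant/conversation` and Firebase Functions. The handler name `greeting` is just an example; it has to match whatever name you reference from the scene:
```javascript
// index.js — minimal webhook sketch
const { conversation } = require('@assistant/conversation');
const functions = require('firebase-functions');

const app = conversation();

// Handler referenced by a scene in the Actions Builder
app.handle('greeting', (conv) => {
  conv.add('Welcome to Essence Helper!');
});

// Exported as the Cloud Function the webhook configuration points to
exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);
```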
I won't go further into explaining how to use `@assistant/conversation` or how to connect scenes, intents, and slots, but I'm sure you can figure them out by trial and error and by taking a look at the samples.
Testing with preview deployment
At this point, you're surely familiar with the testing console. However, you might want to add automated tests to your project so you don't need to do the manual work every time.
The process is pretty simple: just follow this project's readme instructions and check its sample code.
actions-on-google / actions-builder-conversation-components-nodejs
Conversation Components sample (using Actions Builder) in Node.js
Actions on Google: Conversation Components Sample

Prerequisites
- Node.js and NPM
  - We recommend installing using nvm for Linux/Mac and nvm-windows for Windows
- Install the Firebase CLI
  - We recommend using MAJOR version 8: `npm install -g firebase-tools@^8.0.0`
  - Run `firebase login` with your Google account

Setup
Actions Console
- From the Actions on Google Console, New project > Create project > under "What kind of Action do you want to build?" > Custom > Blank project

Actions CLI
- Install the Actions CLI
- Navigate to `sdk/settings/settings.yaml` and replace `<PROJECT_ID>` with your project ID
- Run `gactions login` to log in to your account
- Run `gactions push` to push your project
- Run `gactions deploy preview` to deploy your project

Running this Sample
- You can test your Action on any Google Assistant-enabled device on which the Assistant…
The only problem I had was a weird error with the preview deployment. I can't tell for sure, but after deploying a new preview version, the automated tests were not picking up the new code. I believe it was some kind of Google Cloud Function reference error.
The solution was going to the Web Testing page and triggering a new function deployment.
Having to do this every time is bothersome, but luckily the next step solved this problem as well.
Testing locally
In order to write some code and test it without needing to deploy it again, I followed this guide and adapted it to Google Conversation.
The only difference is that we now have to use the HTTPS endpoint fulfillment method instead of Inline Cloud Functions.
The endpoint we're going to use is actually the same Firebase Function we've been using. You can grab it from the Firebase Console; it's the one with ActionsOnGoogleFulfillment in its URL.
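For reference, it follows the usual Cloud Functions URL pattern (assuming the default `us-central1` region):
```
https://us-central1-<PROJECT_ID>.cloudfunctions.net/ActionsOnGoogleFulfillment
```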
However, we'll only use this endpoint once the skill is ready for production. While testing and developing, we'll point the fulfillment endpoint to the ngrok public URL, as described in the guide linked above.
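As a rough sketch of the local setup, you can wrap the same conversation app in an Express server and expose it with ngrok. The file name and port here are just examples, and I'm assuming your fulfillment module exports the `conversation()` app, which can be mounted as an Express-style request handler:
```javascript
// local-server.js — hypothetical entry point for local webhook testing
const express = require('express');
const { app } = require('./fulfillment'); // assumes fulfillment.js exports the conversation() app

const server = express();
server.use(express.json());
// Mount the conversation app as the webhook route
server.post('/fulfillment', app);
server.listen(3000, () => console.log('Webhook listening on http://localhost:3000/fulfillment'));
```
With that running, `ngrok http 3000` gives you a public HTTPS URL to paste into the fulfillment settings while developing.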
In fact, the tips and tricks on how I've set this up are content for another post. For now, you can check how I'm using it in my GitHub repository:
Markkop / essence-helper-google-action
A Google Action that provides information about the Zenith VR MMORPG
Releasing
There is no mystery here: just go to the "Deploy" tab, submit your information, and create a new release.
After Google's verification, your action might be officially live!
Conclusion
I've spent a good amount of time figuring out how to develop a Google Action, especially regarding some of the resources I'd gotten used to with the Alexa Skills Kit, such as testing with dialog replay files and local development.
Certainly, there are pros and cons to each home assistant service, but my next goal is to use a framework to develop for both at the same time.
Stay tuned!
By the way, if you want to check out my skill live, just say to your Google Home Assistant: "Ok Google, Talk to Essence Helper" 😉