Ionic Speech Recognition [v3]


A few days back I found the Ionic Native Speech Recognition plugin and thought I'd give it a try, as previously we had to use third-party services to recognize speech.

With this new Cordova plugin we can directly make use of the internal iOS and Android features for speech recognition, so let's implement the feature in a little app!

Setting up our App

First of all we start with a blank new app. If you are adding this to an existing app, make sure to install the Cordova plugin (already using the new syntax for CLI 3!) and also the NPM package for speech recognition like this:
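The exact commands are not shown here, but for an Ionic 3 app they would look roughly like this (the plugin and package names below match the Ionic Native speech recognition wrapper at the time of writing; double-check them against the current docs):

```shell
# Create a blank starter app (skip this if you are adding to an existing app)
ionic start speechApp blank

# Install the Cordova plugin (new CLI 3 syntax) and the Ionic Native wrapper
ionic cordova plugin add cordova-plugin-speechrecognition
npm install --save @ionic-native/speech-recognition
```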

Make sure to add it to your module by adding the installed package to src/app/app.module.ts:
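A minimal sketch of the module wiring, assuming the blank starter layout (only the speech recognition parts are new; other starter providers like StatusBar are omitted for brevity):

```typescript
// src/app/app.module.ts (relevant parts only)
import { NgModule, ErrorHandler } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { IonicApp, IonicModule, IonicErrorHandler } from 'ionic-angular';
import { SpeechRecognition } from '@ionic-native/speech-recognition';

import { MyApp } from './app.component';
import { HomePage } from '../pages/home/home';

@NgModule({
  declarations: [MyApp, HomePage],
  imports: [BrowserModule, IonicModule.forRoot(MyApp)],
  bootstrap: [IonicApp],
  entryComponents: [MyApp, HomePage],
  providers: [
    // Register the Ionic Native wrapper so it can be injected into our pages
    SpeechRecognition,
    { provide: ErrorHandler, useClass: IonicErrorHandler }
  ]
})
export class AppModule {}
```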

Now we are ready to use the speech recognition feature, but we have to run the app on a real device to test it!

The Cordova plugin won’t work inside the browser, so deploy it to your device afterwards.

Implementing Speech Recognition

In general we just want to be understood by our device and print that out as text. The plugin works internally a bit differently on iOS and Android, so we have to make a special switch:

On iOS the speech recognition doesn't stop by itself, so we have to call a function to stop listening. Besides that, we generally need a button to request microphone permissions from the device and a button to start recording.

Once the speech recognition is done we will get a list of possible matches for what we might have said, and we simply display that list inside a card below the buttons in our view.

Go ahead and change your src/pages/home/home.html to:
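A sketch of what that view could look like; the method names (`getPermission`, `startListening`, `stopListening`, `isIos`) and the `matches` array are assumptions that the page class has to provide:

```html
<!-- src/pages/home/home.html -->
<ion-header>
  <!-- Navbar color switches while we are recording -->
  <ion-navbar [color]="isRecording ? 'danger' : 'primary'">
    <ion-title>Speech Recognition</ion-title>
  </ion-navbar>
</ion-header>

<ion-content padding>
  <button ion-button full (click)="getPermission()" [disabled]="isRecording">
    Get Permission
  </button>
  <button ion-button full (click)="startListening()" [disabled]="isRecording">
    Start Listening
  </button>
  <!-- Only iOS needs an explicit stop button -->
  <button ion-button full color="danger" (click)="stopListening()" *ngIf="isIos()">
    Stop Listening
  </button>

  <ion-card *ngIf="matches">
    <ion-card-header>Possible matches</ion-card-header>
    <ion-card-content>
      <p *ngFor="let match of matches">{{ match }}</p>
    </ion-card-content>
  </ion-card>
</ion-content>
```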

As you might see, we are also conditionally changing the color of our navigation bar depending on the variable isRecording, which we set and unset whenever we start or stop. You could create a little "on air" mode for your app by doing this!

Inside our class we have a few functions, most of them not super interesting. We need to check the platform for our third button to stop recording, and we also add a little hack using ChangeDetectorRef, because for me the view was not updated when the results from the plugin came in.

I’m not sure what exactly was the reason, but manually triggering the Angular change detector fixed the problem for now (kinda ugly, but this tutorial should not be about change detection).

To get permissions and to stop listening we use the corresponding functions of the Ionic Native plugin. The startListening function does not check for permission itself, so we have to take care of that part and make sure we already have our permissions by the time we want to start (or call the check then).

Finally, when we start to listen to the user we can pass an object to the function which takes, for example, the language it will listen for, or more specific Android options which you can find in the Cordova plugin's GitHub repository.

When we now get matches back inside the subscribe block we simply set them on our local array and trigger change detection. Go ahead and change your src/pages/home/home.ts to:
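A minimal sketch of the page class, assuming the button and property names used in the prose above (the Ionic Native `SpeechRecognition` wrapper exposes `hasPermission`, `requestPermission`, `startListening`, and `stopListening`):

```typescript
// src/pages/home/home.ts
import { Component, ChangeDetectorRef } from '@angular/core';
import { Platform } from 'ionic-angular';
import { SpeechRecognition } from '@ionic-native/speech-recognition';

@Component({
  selector: 'page-home',
  templateUrl: 'home.html'
})
export class HomePage {
  matches: string[];
  isRecording = false;

  constructor(private speechRecognition: SpeechRecognition,
              private plt: Platform,
              private cd: ChangeDetectorRef) { }

  // The stop button is only needed on iOS, where listening never ends by itself
  isIos() {
    return this.plt.is('ios');
  }

  getPermission() {
    this.speechRecognition.hasPermission()
      .then((hasPermission: boolean) => {
        if (!hasPermission) {
          this.speechRecognition.requestPermission();
        }
      });
  }

  startListening() {
    // Pass the language to listen for; Android-specific options also go here
    let options = {
      language: 'en-US'
    };
    this.speechRecognition.startListening(options).subscribe(matches => {
      this.matches = matches;
      // Hack: manually trigger change detection so the view shows the results
      this.cd.detectChanges();
    });
    this.isRecording = true;
  }

  stopListening() {
    this.speechRecognition.stopListening().then(() => {
      this.isRecording = false;
    });
  }
}
```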

That’s it! If you now run your app (on a device) you should be able to transform your own commands into text. You could now improve the "on air" mode, perhaps making it active only while a button is held down, or you could trigger different functions depending on your own voice commands!

If you have built something with speech recognition, let us know in the comments below and show what you developed.

You can find a video version of this quick win below!