Adding advanced multi-turn Voice capabilities into your app
By now, you should have configured and published your Assistant via the Slang Console, and perhaps customized it as required. Congratulations! :) If you have not already done that, you can do so by following the instructions here.
Let's start coding!
For testing, we recommend using a physical Android device instead of an emulator because most emulators don't work well with microphones.
1. Configure the build system
The first step is to update the app's build system to include Slang's Retail Assistant SDK.
Add the Slang dependency to your Gradle files
Add the path to the Slang maven repository to your top-level Gradle file
// Add this to your top-level Gradle file
allprojects {
repositories {
…
maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
}
}
Add the Slang Retail Assistant dependency to your app's Gradle file
// Add this to your app's Gradle file
dependencies {
…
implementation 'in.slanglabs.assistants:slang-retail-assistant:5.0.5'
}
Install the Slang Retail Assistant package
The next step is to install the required packages inside your code repo.
If you use yarn for installing packages, run the below command:
yarn add @slanglabs/slang-conva-react-native-retail-assistant
Because Slang uses native libraries, you need to link the package to your codebase by running the automatic linking step:
react-native link @slanglabs/slang-conva-react-native-retail-assistant
Finally, add the path to the Slang maven repository (to download native library dependencies) to your top-level gradle file
// Add this to your top-level Gradle file
allprojects {
repositories {
…
maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
}
}
Install the Slang Retail Assistant package
Add the Slang Retail Assistant dependency to your app's pubspec.yaml. Once done, run 'dart pub get' and ensure the Slang Retail Assistant package is listed in the dependencies.
Next, import Slang Retail Assistant in your Dart code.
Using Slang Retail Assistant Package
You can add our NPM package to your project using your preferred package manager -
yarn add @slanglabs/slang-retail-assistant
npm install @slanglabs/slang-retail-assistant
If you are building a simple HTML project, you can still use our SDK via the script tag below -
2. Code integration
2.1 Initialization
The next step is to initialize the SDK with the keys you obtained after creating the Assistant in the Slang console.
The recommendation is to perform the initialization in the onCreate method of the Application class. If the app does not use an Application class, the next best place would be the onCreate method of the primary Activity class.
This should ideally be done in the componentDidMount of your main app component
This should ideally be done inside the main method.
If you're using async/await, then -
If you use Promise callbacks, then -
If you are using a framework like React, we recommend putting this code inside a useEffect block.
If you are building a simple HTML project, then put this code inside a document.addEventListener('DOMContentLoaded', () => {}) event listener.
2.2 Show the Assistant Trigger (microphone icon)
Once the Assistant is initialized, the next step is to show the Assistant Trigger (i.e., the microphone button) that the app's users can click to invoke the Assistant and speak to it.
Add the below line to the onResume method of the Activities where you want the Assistant to be enabled.
One can call the "show" and "hide" methods as required to control the visibility of the Assistant.
Use the "showTrigger" and "hideTrigger" APIs to control the visibility of the Assistant, as shown below.
By default, the trigger is sticky, which means that it will show up on all Activities after it is made visible. To prevent the trigger from showing up on specific activities, you will need to call:
Note that there are two types of Assistant Icons: Global and Inline. Refer to this page for more details on how to make use of them in your app. The default is the "Global" Assistant Icon.
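The visibility control described above can be sketched as follows. Note that the `ui` object here is a mock stand-in, and the screen names and the `onScreenChange` helper are illustrative assumptions, not part of the SDK:

```javascript
// Mock stand-in for the Assistant's UI object; in your app you would call
// the SDK's own showTrigger/hideTrigger (see the platform snippets in this guide).
const ui = {
  visible: false,
  showTrigger() { this.visible = true; },
  hideTrigger() { this.visible = false; },
};

// Hypothetical helper: show the trigger only on voice-enabled screens.
const voiceEnabledScreens = new Set(['home', 'search', 'cart']);
function onScreenChange(screen) {
  if (voiceEnabledScreens.has(screen)) {
    ui.showTrigger();
  } else {
    ui.hideTrigger();
  }
}

onScreenChange('search'); // trigger shown
onScreenChange('login');  // trigger hidden on a screen without voice
```

This is one common pattern for apps that want the Assistant only on some screens; since the trigger is sticky by default, you would otherwise hide it explicitly where it is not wanted.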
2.3 Implement Actions
Refresher: A UserJourney represents a path that a user may take to reach their goal when using a web or mobile app. See Voice Assistant Concepts for details.
Refresher: The Actions for the various User Journeys can also be specified directly in the console. Refer to the "Define Actions for various User Journeys" section for details
Now, if the Actions (essentially the visual changes the app should make) corresponding to the various User Journeys have not already been defined in the console, the app needs to implement them in code, as shown below.
The following user journeys are currently supported by the Slang Retail Assistant:
Voice Search
Voice Order Management
Voice Offers
Voice Checkout
Voice Navigation
Backward compatibility note: Earlier, Offers and Checkout were targets inside the Navigation journey. They are now full-fledged user journeys of their own.
The Action Handler interface has an explicit callback for each of the supported user journeys. Whenever the Assistant detects the user's journey (based on what they spoke), it invokes the callback associated with that user journey.
When these callbacks are invoked, the Assistant also passes the parametric data corresponding to the user journey that the Assistant was able to gather. The app is then expected to:
Consume the parametric data as needed
Optionally launch appropriate UI actions
Set appropriate conditions in the Assistant based on the app's internal state
Return the AppState that the app transitioned to
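The four steps above can be sketched end to end. Everything here is mocked and illustrative: the handler shape, the `onSearch` callback name, and the AppState/Condition strings are assumptions for this sketch, not the SDK's actual interface:

```javascript
// Illustrative sketch only: the real SDK defines the handler interface,
// journey names, AppStates, and Conditions; everything here is mocked so
// the four-step callback flow can be shown end to end.
const app = {
  catalog: { rice: 42, sugar: 10 }, // toy inventory
  lastSearch: null,
  showSearchResults(item) { this.lastSearch = item; }, // stand-in UI action
};

// A hypothetical action handler for the Voice Search journey.
const actionHandler = {
  onSearch(searchInfo, searchUserJourney) {
    const item = searchInfo.item;          // 1. consume the parametric data
    app.showSearchResults(item);           // 2. launch the appropriate UI action
    if (item in app.catalog) {             // 3. set a condition from app state
      searchUserJourney.setCondition('SEARCH_SUCCESS');
    } else {
      searchUserJourney.setCondition('ITEM_NOT_FOUND');
    }
    return 'SEARCH_RESULTS';               // 4. return the AppState transitioned to
  },
};

// Simulate the Assistant invoking the callback after detecting the journey.
const journey = { condition: null, setCondition(c) { this.condition = c; } };
const state = actionHandler.onSearch({ item: 'rice' }, journey);
```

The key point is the contract: the Assistant hands the app parametric data, and the app hands back an AppState (plus a Condition) that the Assistant uses to decide what to speak next.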
2.4 Return the AppState and Condition
Refresher: An AppState typically corresponds to a screen that the app transitioned to based on user input. See Voice Assistant Concepts for details.
An AppState indicates which state the app transitioned to, based on the user journey and parametric data that was passed to the app. The list of AppStates that are supported depends on the user journey.
Conditions represent more detailed states of the app within a particular AppState. For example, the search itself might have failed, or the items might be out of stock. The app can use Conditions to report the exact situation to the Assistant. The condition controls the message that the Assistant speaks after the callback returns.
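For instance, the app might pick the condition from its search results like this (the condition names here are illustrative, not SDK constants):

```javascript
// Map the outcome of a search to the Condition reported to the Assistant.
// The names are illustrative; the actual set depends on the user journey.
function searchCondition(results) {
  if (results === null) return 'SEARCH_FAILURE';        // the search itself failed
  if (results.length === 0) return 'ITEM_OUT_OF_STOCK'; // search worked, no stock
  return 'SEARCH_SUCCESS';                              // normal case
}
```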
2.4.1 Assistant Prompts
Based on the AppState returned and the conditions that were set, the Assistant will speak out an appropriate message to the user.
The prompts spoken by the Assistant are customizable. Refer to the Customizing the Assistant section, if you're interested in customization.
That's it! These are the basic set of steps required to add Slang's In-App Voice Assistant into your app.
Beyond this integration, Slang Voice Assistants provide a lot more power and flexibility to cater to the more advanced needs of the apps. Please refer to the Advanced Concepts section for more details.
// Your Application class
@Override
public void onCreate() {
    super.onCreate();
    AssistantConfiguration configuration = new AssistantConfiguration.Builder()
        .setAPIKey("<API Key>")
        .setAssistantId("<AssistantId>")
        .setEnvironment(STAGING) // Change this to PRODUCTION once you've published the Assistant to the production environment
        .build();
    SlangRetailAssistant.initialize(this, configuration);
}
import SlangRetailAssistant from '@slanglabs/slang-conva-react-native-retail-assistant';
SlangRetailAssistant.initialize({
requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
assistantId: '<assistant id>', // The Assistant ID from the console
apiKey: '<API Key>', // The API key from the console
})
import 'package:slang_retail_assistant/slang_retail_assistant.dart';
var assistantConfig = new AssistantConfiguration()
..assistantId = "<AssistantId>"
..apiKey = "<APIKey>"
..requestedLocales = ['en-IN', 'hi-IN'];
SlangRetailAssistant.initialize(assistantConfig);
import SlangRetailAssistant from '@slanglabs/slang-retail-assistant';
await SlangRetailAssistant.init({
assistantID: '<assistant id>', // The Assistant ID from the console
apiKey: '<API Key>', // The API key from the console
})
import SlangRetailAssistant from '@slanglabs/slang-retail-assistant';
SlangRetailAssistant.init({
assistantID: '<assistant id>', // The Assistant ID from the console
apiKey: '<API Key>', // The API key from the console
}).then(() => {
// other integration code...
})
@Override
protected void onResume() {
    super.onResume();
    SlangRetailAssistant.getUI().showTrigger(this); // There is a corresponding hideTrigger too, if needed
}
SlangRetailAssistant.ui.showTrigger(); // There is a corresponding hideTrigger too if needed