Code Integration - Basic Steps

Integrating the Slang Voice Assistant with your app


By now you must have configured and published your Assistant via the Slang Console. Congratulations! :) If you have not already done that, you can do so by following the instructions here.

While the overall idea is similar across platforms, the specific steps involved vary slightly based on the platform on which your app is built. Supported platforms are:

  • Android Native

  • React Native for Android

  • Web (JS)

Let's start coding!

For testing, we recommend using a physical Android device instead of an emulator because most emulators don't work well with microphones.

1. Configure the build system

The first step is to update the app's build system to include Slang's Travel Assistant SDK.

Add the Slang dependency to your gradle files

Add the path to the Slang maven repository to your top-level gradle file

// Add this to your top-level gradle file

allprojects {
    repositories {
        …
        maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
    }
}

Add the Slang Travel Assistant dependency to your app's gradle file

// Add this to your app's gradle file

dependencies {
    …
    implementation 'in.slanglabs.assistants:slang-travel-assistant:4.0.27'
}


Install the Slang Travel Assistant package

The next step is to install the latest version of the required packages inside your code repo.

yarn setup

If you use yarn for installing packages, run the command below

$ yarn add @slanglabs/slang-travel-assistant

npm setup

If you use npm for managing your packages, run the command below

$ npm install @slanglabs/slang-travel-assistant --save

2. Code integration

2.1 Initialization

The next step is to initialize the SDK with the keys (API key and Assistant ID) you obtained after creating the Assistant in the Slang Console. The recommendation is to perform the initialization in the onCreate method of the Application class. If the app does not use an Application class, the next best place is the onCreate method of the primary Activity class.

// Your Application class

@Override
public void onCreate() {
    super.onCreate();
    ...
    AssistantConfiguration configuration = new AssistantConfiguration.Builder()
        .setAPIKey(<API Key>)
        .setAssistantId(<AssistantId>)
        .setEnvironment(STAGING)  // Change this to PRODUCTION once you've published the Assistant to the production environment
        .build();
    SlangTravelAssistant.initialize(this, configuration);
}

For React Native, this should ideally be done in the componentDidMount method of your main app component.

import SlangRetailAssistant from '@slanglabs/react-native-slang-retail-assistant';

SlangRetailAssistant.initialize({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantId: '<assistant id>',        // The Assistant ID from the console
    apiKey: '<API Key>',                  // The API key from the console
})

For Web (JS), the initialization looks like this:

import SlangTravelAssistant from '@slanglabs/slang-travel-assistant';

SlangTravelAssistant.init({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantID: '<assistant id>',        // The Assistant ID from the console
    apiKey: '<API Key>',                  // The API key from the console
})

2.2 Show the Trigger (microphone icon)

Once the Assistant is initialized, the next step is to show the microphone UI element (what we call the Trigger) that the app's users can click on to invoke the Assistant and speak to it.

Add the line below to the onResume method of the Activities where you want the Assistant to be enabled.

@Override
protected void onResume() {
    super.onResume();
    ...
    SlangTravelAssistant.getUI().showTrigger(this); // There is a corresponding hideTrigger too, if needed
}

One can call "show" and "hide" methods as required to control the visibility of the Assistant

SlangRetailAssistant.ui.showTrigger(); // There is a corresponding hideTrigger too if needed

One can call "show" and "hide" methods as required to control the visibility of the Assistant

SlangTravelAssistant.ui.show(); // There is a corresponding hide too if needed

The trigger is sticky, which means that it will show up on all Activities after it is made visible. To prevent the trigger from showing up on specific activities, you will need to call: SlangTravelAssistant.getUI().hideTrigger(this)
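
For example, here is a minimal sketch of hiding the trigger on a specific screen; CheckoutActivity is a hypothetical Activity name used only for illustration:

// Hypothetical screen on which the Assistant should not be available
public class CheckoutActivity extends AppCompatActivity {

    @Override
    protected void onResume() {
        super.onResume();
        // The trigger is sticky, so hide it explicitly on this screen
        SlangTravelAssistant.getUI().hideTrigger(this);
    }
}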

2.3 Implement Actions

Last but not least, the app needs to implement the Actions associated with the various User Journeys supported by the Assistant. This can be done as shown below.

SlangTravelAssistant.setAction(new SlangTravelAssistant.Action() {
    @Override
    public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
        // Handle search requests
        // …
        searchJourney.setSuccess(); // Setting the app state condition
        return SearchUserJourney.AppState.SEARCH_RESULTS; // Returning the app state
    }

    @Override
    public NavigationUserJourney.AppState onNavigation(
            NavigationInfo navigationInfo,
            NavigationUserJourney navigationUserJourney
    ) {
        // Handle navigation requests
        // …
        navigationUserJourney.setSuccess();
        return NavigationUserJourney.AppState.NAVIGATION;
    }

    @Override
    public void onAssistantError(final AssistantError error) {
        // Handle errors that might have occurred during the Assistant lifecycle

        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    }
});
The equivalent in JavaScript:

SlangTravelAssistant.setAction({
      onSearch: (searchInfo, searchUserJourney) => {
      
        // use searchInfo for performing the search in app
        
        searchUserJourney.setSuccess();
        return searchUserJourney.AppStates.SEARCH_RESULTS;
      },
      onNavigation: (navigationInfo, navigationUserJourney) => {
      
        // use navigationInfo to know the navigation target
        
        navigationUserJourney.setNavigationSuccess();
        return navigationUserJourney.AppState.NAVIGATION;
      },
    });

The following user journeys are currently supported by the Slang Travel Assistant:

  • Voice Search

  • Voice Navigation

The Action Handler interface has an explicit callback for each of the supported user journeys. Whenever the Assistant detects the user journey the user is interested in (based on what they spoke), it invokes the callback associated with that user journey.

When these callbacks are invoked, the Assistant also passes the parametric data it was able to gather for that user journey. The app is then expected to (as shown in the sketch after this list):

  1. Consume the parametric data as needed

  2. Optionally launch appropriate UI actions

  3. Set appropriate conditions in the Assistant based on the app's internal state

  4. Return the AppState that the app transitioned to
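
For example, here is a minimal sketch of an onSearch implementation that follows these four steps. Trip, tripRepository, and showSearchResults are hypothetical app-side names used only for illustration; only setSuccess and the returned AppState come from the Slang SDK:

@Override
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // 1. Consume the parametric data gathered by the Assistant
    List<Trip> results = tripRepository.search(searchInfo); // hypothetical app-side search

    // 2. Launch the appropriate UI action
    showSearchResults(results); // hypothetical app-side screen transition

    // 3. Set the appropriate condition based on the app's internal state
    searchJourney.setSuccess();

    // 4. Return the AppState that the app transitioned to
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}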

2.4 Return the AppState and Condition

An AppState indicates which state the app transitioned to, based on the user journey and parametric data that was passed to the app. The list of AppStates that are supported depends on the user journey.

Conditions represent more detailed states of the app within a particular AppState. For example, when performing a search, the search might have failed or the items might be out of stock. The app can use AppState conditions to indicate the correct condition to the Assistant. The condition controls the message that the Assistant speaks after the callback returns.

public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
   // Handle the search request
   // ...
   searchJourney.setSuccess();
   return SearchUserJourney.AppState.SEARCH_RESULTS;
}
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}

onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request using searchInfo
    // ...
    return searchUserJourney.AppStates.SEARCH_RESULTS;
}

2.4.2 Assistant Prompts

Based on the AppState returned and the conditions that were set, the Assistant will speak out an appropriate message to the user.

The prompts spoken by the Assistant are customizable. Refer to the Customizing the Assistant section if you're interested in customizing them.

That's it! These are the basic steps required to add Slang's In-App Voice Assistant to your app.

Beyond this integration, Slang Voice Assistants provide a lot more power and flexibility to cater to the more advanced needs of apps. Please refer to the Advanced Concepts section for more details.


Refresher: A UserJourney represents a path that a user may take to reach their goal when using a web or mobile app. See Voice Assistant Concepts for details.

Refresher: An AppState typically corresponds to a screen that the app transitioned to based on user input. See Voice Assistant Concepts for details.
