
Integrating the Omni Copilot with your app


The Omni Copilot SDK will be available for the following platforms:

  • Android Native

  • React Native (Android/iOS)

  • Flutter (Android/iOS)

Note: These APIs are under active development and may change.

1. Configure the build system

The first step is to update the app's build system to include the Conva Omni Copilot SDK.

For Android Native:-

Add the path to the Slang maven repository to your top-level Gradle file

// Add this to your project build.gradle file

allprojects {  
    repositories {        
        maven { url "https://gitlab.com/api/v4/projects/25706948/packages/maven" }  
    }
}

Add the Omni Copilot dependency to your app's Gradle file

// Add this to your app's build.gradle file

dependencies {  
    …   
    implementation 'in.slanglabs.conva:conva-omni-copilot:1.+'
}

For Flutter:-

Install the Conva Omni Copilot package

Run the below command to install the required packages inside your code repo.

 $ flutter pub add conva_omni_copilot

Once done, run 'flutter pub get' and ensure Conva Omni Copilot is added to the dependencies in your pubspec.yaml:


dependencies:
  conva_omni_copilot: ^1.1.5

For iOS:-

Add the path to the Slang Cocoapod repository to your Podfile

# Add this to your podfile

source 'https://github.com/SlangLabs/cocoapod-specs'

Add support for granting microphone permission

In iOS, the user must explicitly grant permission for an app to access the user's data and resources. An app with the ConvaOmniCopilot requires access to the user's device microphone for voice interactions.

To comply with this requirement, you must add the NSMicrophoneUsageDescription key to the Info.plist file of your app and provide a message about why your app requires access to the microphone. The message will be displayed only when the ConvaOmniCopilot needs to activate the microphone.

To add the key:

  1. In the Xcode project, go to the Info tab.

  2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.

  3. From the list, select Privacy - Microphone Usage Description.

  4. In the Value field to the right, provide a description for the added key. This description will be displayed to the user when the app is launched. For example: "We require microphone permission to enable the voice assistant platform"

Next, import Conva Omni Copilot in your Dart code.

import 'package:conva_omni_copilot/conva_omni_copilot.dart';

For React Native:-

Install the Conva Omni Copilot package

The next step is to install the required packages inside your code repo.

Yarn setup

If you use yarn for installing packages, run the below command

yarn add @slanglabs/conva-omni-copilot

npm setup

If you use npm for managing your packages, run the below command

npm install @slanglabs/conva-omni-copilot --save

Because Slang uses native libraries, you need to link the package to your codebase to run the automatic linking steps

react-native link @slanglabs/conva-omni-copilot

For iOS:-

Add the path to the Slang Cocoapod repository to your Podfile

# Add this to your podfile

source 'https://github.com/SlangLabs/cocoapod-specs'

Add support for granting microphone permission

In iOS, the user must explicitly grant permission for an app to access the user's data and resources. An app with the ConvaOmniCopilot requires access to the user's device microphone for voice interactions.

To comply with this requirement, you must add the NSMicrophoneUsageDescription key to the Info.plist file of your app and provide a message about why your app requires access to the microphone. The message will be displayed only when the ConvaOmniCopilot needs to activate the microphone.

To add the key:

  1. In the Xcode project, go to the Info tab.

  2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.

  3. From the list, select Privacy - Microphone Usage Description.

  4. In the Value field to the right, provide a description for the added key. This description will be displayed to the user when the app is launched. For example: "We require microphone permission to enable the voice assistant platform"

2. Code Integration

a. Initialization

The next step is to initialize the SDK with the keys and other required information.

For Android Native:-

Our recommendation is to perform this initialization in the onCreate method of the Application class or the main Activity of your application.

@Override
protected void onCreate(Bundle savedInstance) {
  super.onCreate(savedInstance);
  // ...
  // This list, `waveGradientColor`, provides a collection
  // of multi-wave gradient color combinations.
  List<List<String>> waveGradientColor = Arrays.asList(
            Arrays.asList("#0197FF", "#FFFFFF"),
            Arrays.asList("#9701FF", "#FF0197"),
            Arrays.asList("#FF0197", "#FF9701")
  );

  OmniCopilotConfiguration configuration = new OmniCopilotConfiguration.Builder()
    .setAPIKey("your-api-key")
    .setCopilotID("your-copilot-id")
    .setApplication(getApplication())
    .setBrandColor("<Color Hex Code>")
    // Set this to true if you want to use a custom trigger
    // instead of the Global Trigger
    .enableApptrigger(false)
    .setEnvironment(Environment.PRODUCTION)
    .setWaveGradientColor(waveGradientColor)
    .build();
  ConvaOmniCopilot.initialize(configuration);
}

For Flutter:-

This should ideally be done inside the main method.

import 'package:conva_omni_copilot/conva_omni_copilot.dart';


// This list, `waveGradientColor`, is designed to provide a 
// collection of multi-wave gradient color combinations. 
List<List<String>> waveGradientColor = [
      // Color pair 1
      ["#0197FF", "#FFFFFF"],
      // Color pair 2
      ["#9701FF", "#FF0197"],
      // Color pair 3
      ["#FF0197", "#FF9701"],
    ];
    
var assistantConfig = OmniCopilotConfiguration()
 ..copilotId = "<CopilotId>"
 ..apiKey = "<APIKey>"
 ..environment = CopilotEnvironment.PRODUCTION
 ..uiMode = CopilotUIMode.LIGHT
 ..waveGradientColor = waveGradientColor
 // Set this to true if you want to use a custom trigger
 // instead of the Global Trigger
 ..enableCustomTrigger = false
 ..fontPaths = ["<Array of required font paths>"]
 // Example:- ["fonts/TerminalDosis-Regular.ttf", "fonts/TerminalDosis-SemiBold.ttf"]
 ..brandColor = "<Color Hex Code>";

ConvaOmniCopilot.initialize(assistantConfig);

For React Native:-

This should ideally be done in the componentDidMount of your main app component.

import { ConvaOmniCopilot } from '@slanglabs/conva-omni-copilot';

const waveGradientColor: string[][] = [
    // Color pair 1
    ["#0197FF", "#FFFFFF"],
    // Color pair 2
    ["#9701FF", "#FF0197"],
    // Color pair 3
    ["#FF0197", "#FF9701"],
];

const config: OmniCopilotConfiguration = {
    copilotId: '<CopilotId>',
    apiKey: '<ApiKey>',
    environment: CopilotEnvironment.Production,
    uiMode: CopilotUIMode.Light,
    waveGradientColor: waveGradientColor,
    brandColor: '<Color Hex Code>',
    // Set this to true if you want to use a custom trigger
    // instead of the Global Trigger
    enableAppTrigger: false,
};

ConvaOmniCopilot.initialize(config);

b. Start Interaction with the Omni Copilot

Initiates a user interaction session in the Conva OmniCopilot platform.

When the custom trigger is tapped, it should call the startUserInteraction() method to initiate the interaction.

For Android Native:-

ConvaOmniCopilot.startUserInteraction();

Parameters

  • text (optional): A string representing the text to be used for interaction.

ConvaOmniCopilot.startUserInteraction("Custom text to start the interaction");
For Flutter:-

ConvaOmniCopilot.startUserInteraction();

Parameters

  • text (optional): A string representing the text to be used for interaction.

ConvaOmniCopilot.startUserInteraction(text: "Custom text to start the interaction");

For React Native:-

ConvaOmniCopilot.startUserInteraction();

Parameters

  • text (optional): A string representing the text to be used for interaction.

ConvaOmniCopilot.startUserInteraction("Custom text to start the interaction");

c. Displaying the Omni Copilot trigger

Trigger refers to the UI element that appears on the screen, which the user will click on to bring up the Copilot.

For Android Native (Global Trigger):-

@Override
protected void onResume() {
    super.onResume();
    // ...
    CopilotUI.setGlobalTrigger();
    CopilotUI.showTrigger(this);
}

For Flutter (Global Trigger):-

ConvaOmniCopilot.getUI().setGlobalTrigger();
ConvaOmniCopilot.getUI().showTrigger();

For Flutter (Inline Trigger):-

Incorporate the following component into your UI definition at the desired location to display the trigger (typically represented by a microphone icon), often positioned alongside the search bar.

Container(
  height: 60,
  width: 60,
  child: ConvaOmniTrigger(enableCircularBackground: true),
)

For React Native (Global Trigger):-

ConvaOmniCopilot.getUI().showTrigger();

The trigger is sticky, which means that it will show up on all Activities after it is made visible. To prevent the trigger from showing up on specific activities, you will need to call:

Note:- Follow this step only if you are using Global Trigger

For Android Native:-

@Override
protected void onResume() {
    super.onResume();
    // ...
    CopilotUI.hideTrigger(this);
}

For Flutter:-

ConvaOmniCopilot.getUI().hideTrigger();

For React Native:-

ConvaOmniCopilot.getUI().hideTrigger();

d. Customize Surface UI

To pause the Copilot Surface

// Android Native
CopilotUI.pauseSurface();

// Flutter
ConvaOmniCopilot.getUI().pauseSurface();

// React Native: TBD

To resume the Copilot Surface

// Android Native
CopilotUI.resumeSurface();

// Flutter
ConvaOmniCopilot.getUI().resumeSurface();

// React Native: TBD

To set a bottom margin to the Copilot Surface

// Android Native
CopilotUI.setSurfaceBottomMargin(<bottomMargin>);

// Flutter
ConvaOmniCopilot.getUI().setSurfaceBottomMargin(<bottom_margin>);

// React Native: TBD
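
For example, if the app shows a bottom navigation bar, the surface can be lifted above it. A minimal Android sketch, assuming the margin is given in pixels and a hypothetical 56dp bar height:

// Hypothetical: keep the Copilot surface above a 56dp-tall bottom bar
// (assumes setSurfaceBottomMargin expects a pixel value)
int bottomMarginPx = (int) (56 * getResources().getDisplayMetrics().density);
CopilotUI.setSurfaceBottomMargin(bottomMarginPx);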

To reset the Surface Context:-

API to reset conversation history: the SDK by default retains conversation history on the UI, so re-invoking the surface continues from the user's last session. If the app wants to reset the conversation history at any point, it can call the following API. After the API is called, the SDK will refresh the UI context (prompts, hints) for the next user interaction.

Param: (resetConversationHistory) => Boolean value

// Android Native
CopilotUI.resetSurfaceContext(<resetConversationHistory>);

// Flutter
ConvaOmniCopilot.getUI().resetCopilotSurfaceContext();

// React Native: TBD
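
A typical place to reset is when the user signs out or switches accounts, so the next interaction starts fresh. A minimal Android sketch; onUserSignOut is a hypothetical hook in the app's auth flow:

void onUserSignOut() {
    // true => also discard the retained conversation history (hypothetical usage)
    CopilotUI.resetSurfaceContext(true);
}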

To set custom UI hints

// Android Native
HashMap<String, String> params = new HashMap<>();
params.put("param1", "value1");
params.put("param2", "value2");

List<HintInfo> hints = new ArrayList<>();
hints.add(new HintInfo("Example 1", "https://example1.com/username-help", params));
hints.add(new HintInfo("Example 2", "https://example2.com/username-help", params));

CopilotUI.setUIHints(hints);

// Flutter
Map<String, String> params = {
  "param1": "value1",
  "param2": "value2",
};

List<HintInfo> hints = [
  HintInfo(hint: 'Example 1', url: 'https://example1.com/username-help', params: params),
  HintInfo(hint: 'Example 2', url: 'https://example2.com/username-help', params: params),
];
ConvaOmniCopilot.getUI().setUIHints(hints);

// React Native: TBD

HintInfo Properties:-

- `hint`: String
  - Description: The hint text.
- `url`: String (optional)
  - Description: The URL associated with the hint.
- `params`: `MutableMap<String, String>`
  - Description: Additional parameters associated with the hint.

To minimize the Copilot Surface

// Android Native
CopilotUI.setCopilotSurfaceMode(CopilotUI.CopilotSurfaceMode.MINIMIZE);

// Flutter
ConvaOmniCopilot.getUI().minimizeSurface();

// React Native: TBD

To maximize the Copilot Surface

// Android Native
CopilotUI.setCopilotSurfaceMode(CopilotUI.CopilotSurfaceMode.MAXIMIZE);

// Flutter
ConvaOmniCopilot.getUI().maximizeSurface();

// React Native: TBD

e. Implementing actions

The app will need to register actions that can be invoked by the Copilot whenever it receives the necessary information from the user. These actions are invoked automatically by the Copilot whenever the user provides input via text or voice and the backend processes it into an app action.

For Android Native:-

ConvaOmniCopilot.setAction(new CopilotAction() {
    @Override
    public void onSearch(@NonNull SearchInfo searchInfo) {
        // Fire the search action
        // You can access searchInfo properties to perform the search
        // For example:
        // System.out.println("Search term: " + searchInfo.searchTerm);
    }

    @Override
    public void onNavigation(@NonNull NavigationInfo navigationInfo) {
        // Fire the navigation action
        // You can access navigationInfo properties to perform navigation
        // For example:
        // System.out.println("Navigation target: " + navigationInfo.target);
    }
});
For Flutter:-

// Set the action handler via the setAction method
class AppAction implements CopilotAction {
  @override
  void onNavigation(NavigationInfo navigationInfo) {
    // Handle navigation requests
  }

  @override
  void onSearch(SearchInfo searchInfo) {
    // Handle search requests
  }
}

var action = AppAction();
ConvaOmniCopilot.setAction(action);

For React Native:-

ConvaOmniCopilot.setAction({
    onSearch: (searchInfo: any) => {
        // Fire the search action
        // You can access searchInfo properties to perform the search
        // For example:
        // var searchTerm: string = searchInfo["searchTerm"]
    },
    onNavigation: (navigationInfo: any) => {
        // Fire the navigation action
        // You can access navigationInfo properties to perform the navigation
        // For example:
        // var navigationTarget: string = navigationInfo["target"]
    }
});

The parameter SearchInfo contains the breakdown of the original search request. Its structure is described below:

// Android Native type definitions
public class SearchInfo {
    public String searchTerm; // returns the full search term
    public @Nullable String category; // returns the category for the user query
    public @Nullable String app; // returns the app for the user query
    public @Nullable List<FilterInfo> filters; // returns the list of filters applied
    public @Nullable SortInfo sortInfo; // returns the sorting info applied
}

public class FilterInfo {
    public String key; // returns the filter key
    public Map<String, Object> params; // Additional parameters associated with the filter.
}
public class SortInfo {
    public String sortKey; // returns the sort key
}

// Flutter type definitions
class SearchInfo {
  String get searchTerm // (String): Returns the entire search term.
  String? get category // (String, Nullable): Returns the category for the user query, if available.
  String? get app // (String, Nullable): Returns the app for the user query, if available.
  List<FilterInfo>? get filters // (List of FilterInfo, Nullable): Returns the list of filters applied, if any.
  SortInfo? get sortInfo // (SortInfo, Nullable): Returns the sorting info applied, if available.
}

class FilterInfo {
  String get filterCategory // (String): Returns the filter category.
  String get filterLabel // (String): Returns the filter label.
  String? get filterId // (String, Nullable): Returns the filter ID, if available.
}

class SortInfo {
  String? get sortKey // (String, Nullable): Returns the sort key.
}
// When the user searches for something 
// This is how the SearchInfo parameter would be populated
// Example

{
	"filters": [],
	"app": "",
	"category": "",
	"sortInfo": {
		"sortKey": "popularity"
	},
	"searchTerm": "sunscreen"
}
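
Putting this together, an onSearch handler typically maps these fields onto the app's own search flow. A minimal Android sketch; openSearchResults is a hypothetical method in your app:

@Override
public void onSearch(@NonNull SearchInfo searchInfo) {
    String query = searchInfo.searchTerm; // e.g. "sunscreen"
    String sortKey = (searchInfo.sortInfo != null) ? searchInfo.sortInfo.sortKey : null;
    // Hypothetical app method that runs the search and renders the results
    openSearchResults(query, searchInfo.filters, sortKey);
}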

The parameter NavigationInfo contains the breakdown of the original navigation request. Its structure is described below:

// Android Native
public class NavigationInfo {
    public String target; // Represents the target of the navigation action.
    public @Nullable String url; // Represents the URL associated with the navigation action, if any.
    public @Nullable Map<String,String> parameters; // Represents additional parameters associated with the navigation action.
}

// Flutter
class NavigationInfo {
     String get target; // Represents the target of the navigation action.
     String? get url; // Represents the URL associated with the navigation action, if any.
     Map<String,String>? get parameters; // Represents additional parameters associated with the navigation action.
}

// When the user asks to navigate somewhere
// This is how the NavigationInfo parameter would be populated
// Example

{
	"target": "home",
	"url": "<Url to your target>"
}
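
An onNavigation handler is usually a simple dispatch on the target. A minimal Android sketch; the target names and the openScreen/openUrl helpers are hypothetical:

@Override
public void onNavigation(@NonNull NavigationInfo navigationInfo) {
    switch (navigationInfo.target) {
        case "home":
            openScreen("home"); // hypothetical helper that starts the matching screen
            break;
        case "cart":
            openScreen("cart");
            break;
        default:
            if (navigationInfo.url != null) {
                openUrl(navigationInfo.url); // hypothetical fallback using the provided URL
            }
    }
}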

f. Notifying the current context to Omni Copilot

The following APIs can be used to provide context or filter content based on the selected app.

For Android Native:-

For notifying the current app name

ConvaOmniCopilot.setCurrentApp("<current_app_name>");

For notifying the current app category

ConvaOmniCopilot.setCurrentCategory("<current_app_category>");

For Flutter:-

For notifying the current app name

ConvaOmniCopilot.setCurrentApp("<current_app_name>");

For React Native:-

// TBD
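
These notifications are typically sent whenever the user moves into a different section of the app, for example from onResume of the corresponding screen. A minimal Android sketch; the app and category names are hypothetical:

@Override
protected void onResume() {
    super.onResume();
    // Hypothetical values: the user is browsing the groceries section
    ConvaOmniCopilot.setCurrentApp("groceries");
    ConvaOmniCopilot.setCurrentCategory("fresh-produce");
}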

g. Notifying the Omni Copilot

The following APIs can be used to provide contextual information to the Omni Copilot

For Android Native:-

// Update the user profile
ConvaOmniCopilot.updateUserProfileInfo(
    "<user_id>",
    new HashMap<String, String>() {{
        // Any additional user info
        put("user_info_1", "info_1");
        put("user_info_2", "info_2");
    }}
);

// Notify text search
ConvaOmniCopilot.notifyTextSearch("<SearchString>");

// Notify CTR event
ConvaOmniCopilot.notifyCTREvent("<eventInfo>",
    new HashMap<String, String>() {{
        // The section that was clicked. Here are some
        // examples of events
        // "NavigatedToProductPage"
        // "AddedToCart"
        // "AddedToWishlist"
        put("eventName", "<event>");
        // The product item that was clicked
        // Eg: "Organic Onions"
        put("itemName", "<item>");
    }}
);
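
For example, notifyTextSearch would typically be called when the user submits a query through the app's own search box. A minimal Android sketch using a standard SearchView; searchView is assumed to exist in your layout:

searchView.setOnQueryTextListener(new SearchView.OnQueryTextListener() {
    @Override
    public boolean onQueryTextSubmit(String query) {
        // Report the app-side search to the Copilot
        ConvaOmniCopilot.notifyTextSearch(query);
        return false; // let the app's normal search handling continue
    }

    @Override
    public boolean onQueryTextChange(String newText) {
        return false;
    }
});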

For Flutter:-

// Update the user profile
Map<String, String> userMetaData = {
      'user_info_1': '<info_1>',
      'user_info_2': '<info_2>',
};
ConvaOmniCopilot.setUserInfo("<user_id>", userMetaData);


// Notify text search
Map<String, String> searchMetaData = {
      'searchName1': '<searchData1>',
      'searchName2': '<searchData2>',
};
ConvaOmniCopilot.notifyTextSearch("<searchItem>", searchMetaData);


// Notify CTR event
Map<String, String> eventMetaData = {
      // The section that was clicked. Here are the
      // supported strings 
      // "NavigatedToProductPage"
      // "AddedToCart"
      // "AddedToWishlist"
      'eventName': '<event>',
      // The product item that was clicked
      // Eg: "Organic Onions"
      'itemName': '<item>',
};
ConvaOmniCopilot.notifyCTREvent("<eventInfo>", eventMetaData);

For React Native:-

// TBD

h. Copilot Lifecycle Events

The Omni Copilot handles most of the heavy lifting of interacting with end-users and notifies the app when there is some action to be performed by the app. But in some cases, apps may want to be notified of low-level events that the Copilot is aware of, for example when a user clicks on the trigger (the microphone button) or when the Copilot finishes initializing.

The Copilot's Lifecycle Events API provides access to these low-level Copilot events.

Registering for events

The app can register with the Copilot to be notified of all interesting lifecycle events via the setLifeCycleObserver method.

For Android Native:-

ConvaOmniCopilot.setLifecycleObserver(new OmniLifecycleObserver() {
    @Override
    public void onCopilotInitSuccess() {
        // Called when the Conva OmniCopilot initialization is successful.
    }

    @Override
    public void onCopilotInitFailure(String description) {
        // Called when the Conva OmniCopilot initialization fails, providing a description of the failure.
    }

    @Override
    public void onCopilotInteractionBegin(boolean isVoice) {
        // Called when a user interaction with Conva OmniCopilot begins, indicating whether the interaction is voice-initiated (`true`) or not (`false`).
    }

    @Override
    public void onCopilotInteractionEnd(boolean isCanceled) {
        // Called when a user interaction with Conva OmniCopilot ends, indicating whether the interaction was canceled (`true`) or not (`false`).
    }

    @Override
    public void onCopilotSurfaceDismissed() {
        // Called when the Conva OmniCopilot surface (e.g., UI overlay) is dismissed.
    }
    
    @Override
    public void onCopilotError(CopilotError errorType) {
        // Callback method invoked when an error occurs during a Copilot interaction.
        //
        // [errorType] The error that occurred during the Copilot interaction.
        //             Possible values are defined in the [CopilotError] enum.
    }
});

For Flutter:-

class _MyAppState extends State<MyApp>
    implements CopilotLifeCycleObserver {
        
  @override
  void initState() {
    super.initState();
    ConvaOmniCopilot.setLifeCycleObserver(this);
  }
  
  @override
  void onCopilotInitFailure(String description) {
    // Called when the Conva OmniCopilot initialization fails, providing a description of the failure.
  }

  @override
  void onCopilotInitSuccess() {
    // Called when the Conva OmniCopilot initialization is successful.
  }

  @override
  void onCopilotInteractionBegin(bool isVoice) {
    // Called when a user interaction with Conva OmniCopilot begins, indicating whether the interaction is voice-initiated (`true`) or not (`false`).
  }
  
  @override
  void onCopilotInteractionEnd(bool isCanceled) {
    // Called when a user interaction with Conva OmniCopilot ends, indicating whether the interaction was canceled (`true`) or not (`false`).
  }

  @override
  void onCopilotSurfaceDismissed() {
    // Called when the Conva OmniCopilot surface (e.g., UI overlay) is dismissed.
  }
  
  @override
  void onCopilotError(CopilotError error) {
    // Callback method invoked when an error occurs during a Copilot interaction.
    //
    // [error] The error that occurred during the Copilot interaction.
    //         Possible values are defined in the [CopilotError] enum.
  }
}

For React Native:-

ConvaOmniCopilot.setLifeCycleObserver({
    onCopilotInitFailure: (description: string) => {
        // Called when the Conva OmniCopilot initialization fails, 
        // providing a description of the failure.
    },
    onCopilotInitSuccess: () => {
        // Called when the Conva OmniCopilot initialization is successful.
    },
    onCopilotInteractionBegin: (isVoice: boolean) => {
        // Called when a user interaction with Conva OmniCopilot begins, 
        // indicating whether the interaction is voice-initiated (`true`) or not (`false`).
    },
    onCopilotInteractionEnd: (isCanceled: boolean) => {
        // Called when a user interaction with Conva OmniCopilot ends,
        // indicating whether the interaction was canceled (`true`) or not (`false`).
    },
    onCopilotSurfaceDismissed: () => {
        // Called when the Conva OmniCopilot surface (e.g., UI overlay) is dismissed.
    }
});
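
One common use of these events is to defer showing the global trigger until initialization has succeeded, and to log failures. A minimal Android sketch; the logging tag and fallback behavior are illustrative:

@Override
public void onCopilotInitSuccess() {
    // Only surface the trigger once the Copilot is ready
    CopilotUI.setGlobalTrigger();
    CopilotUI.showTrigger(activity); // `activity` is a hypothetical reference to the visible Activity
}

@Override
public void onCopilotInitFailure(String description) {
    Log.w("ConvaOmniCopilot", "Init failed: " + description);
    // Fall back to the app's default, non-Copilot experience
}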

Try out the playground app for developers to understand the assistant.
