The Omni Copilot SDK will be available for the following platforms:
Android Native
React Native (Android/iOS)
Flutter (Android/iOS)
Note: These APIs are under active development and may change.
1. Configure the build system
The first step is to update the app's build system to include the Conva Omni Copilot SDK.
Add the path to the Slang maven repository to your top-level Gradle file
// Add this to your project build.gradle file
allprojects {
repositories {
maven { url "https://gitlab.com/api/v4/projects/25706948/packages/maven" }
}
}
Add the Omni Copilot dependency to your app's Gradle file
// Add this to your app's build.gradle file
dependencies {
…
implementation 'in.slanglabs.conva:conva-omni-copilot:1.+'
}
Install the Conva Omni Copilot package
Run the command below to install the required package inside your code repo.
$ flutter pub add conva_omni_copilot
Once done, run 'dart pub get' and ensure conva_omni_copilot is added to the dependencies in your pubspec.yaml:
dependencies:
conva_omni_copilot: ^1.1.5
For iOS:
Add the path to the Slang Cocoapod repository to your Podfile
# Add this to your podfile
source 'https://github.com/SlangLabs/cocoapod-specs'
Add support for granting microphone permission
On iOS, the user must explicitly grant permission for an app to access their data and resources. An app with the ConvaOmniCopilot requires access to the user's device microphone for voice interactions.
To comply with this requirement, you must add the NSMicrophoneUsageDescription key to the Info.plist file of your app and provide a message explaining why your app requires access to the microphone. The message is displayed only when the ConvaOmniCopilot needs to activate the microphone.
To add the key:
In the Xcode project, go to the Info tab.
In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.
From the list, select Privacy - Microphone Usage Description.
In the Value field to the right, provide a description for the added key. This description is shown to the user when the app first requests microphone access.
For example: "We require microphone permission to enable the voice assistant platform"
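If you prefer to edit Info.plist directly, the resulting entry looks like this; substitute your own description string:
<key>NSMicrophoneUsageDescription</key>
<string>We require microphone permission to enable the voice assistant platform</string>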
Next, import Conva Omni Copilot in your Dart code.
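For example, at the top of your Dart file:
import 'package:conva_omni_copilot/conva_omni_copilot.dart';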
Try out the developer playground app to understand how the assistant works.
Install the Conva Omni Copilot package
The next step is to install the required package inside your code repo.
yarn setup
If you use yarn for installing packages, run the command below
yarn add @slanglabs/conva-omni-copilot
npm setup
If you use npm for managing your packages, run the command below
npm install @slanglabs/conva-omni-copilot --save
Because Slang uses native libraries, you need to link the package to your codebase by running the automatic linking step:
react-native link @slanglabs/conva-omni-copilot
For iOS:
Add the path to the Slang Cocoapod repository to your Podfile
# Add this to your podfile
source 'https://github.com/SlangLabs/cocoapod-specs'
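After updating the Podfile, install the pods from your app's ios directory (the standard CocoaPods step):
$ cd ios && pod install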
Add support for granting microphone permission
On iOS, the user must explicitly grant permission for an app to access their data and resources. An app with the ConvaOmniCopilot requires access to the user's device microphone for voice interactions.
To comply with this requirement, you must add the NSMicrophoneUsageDescription key to the Info.plist file of your app and provide a message explaining why your app requires access to the microphone. The message is displayed only when the ConvaOmniCopilot needs to activate the microphone.
To add the key:
In the Xcode project, go to the Info tab.
In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.
From the list, select Privacy - Microphone Usage Description.
In the Value field to the right, provide a description for the added key. This description is shown to the user when the app first requests microphone access.
For example: "We require microphone permission to enable the voice assistant platform"
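If you prefer to edit Info.plist directly, the resulting entry looks like this; substitute your own description string:
<key>NSMicrophoneUsageDescription</key>
<string>We require microphone permission to enable the voice assistant platform</string>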
2. Code Integration
a. Initialization
The next step is to initialize the SDK with your keys and the required configuration.
On Android, we recommend performing this initialization in the onCreate method of the Application class or the main Activity of your application.
protected void onCreate(Bundle savedInstance) {
    // ...
    // This list, `waveGradientColor`, is designed to provide a
    // collection of multi-wave gradient color combinations.
    List<List<String>> waveGradientColor = Arrays.asList(
        Arrays.asList("#0197FF", "#FFFFFF"),
        Arrays.asList("#9701FF", "#FF0197"),
        Arrays.asList("#FF0197", "#FF9701")
    );

    OmniCopilotConfiguration configuration = new OmniCopilotConfiguration.Builder()
        .setAPIKey("your-api-key")
        .setCopilotID("your-copilot-id")
        .setApplication(getApplication())
        .setBrandColor("<Color Hex Code>")
        // Set this to true if you want to use a custom trigger
        // instead of the global trigger
        .enableApptrigger(false)
        .setEnvironment(Environment.PRODUCTION)
        .setWaveGradientColor(waveGradientColor)
        .build();

    ConvaOmniCopilot.initialize(configuration);
}
In Flutter, this should ideally be done inside the main method.
import 'package:conva_omni_copilot/conva_omni_copilot.dart';

// This list, `waveGradientColor`, is designed to provide a
// collection of multi-wave gradient color combinations.
List<List<String>> waveGradientColor = [
  // Color pair 1
  ["#0197FF", "#FFFFFF"],
  // Color pair 2
  ["#9701FF", "#FF0197"],
  // Color pair 3
  ["#FF0197", "#FF9701"],
];

var assistantConfig = OmniCopilotConfiguration()
  ..copilotId = "<CopilotId>"
  ..apiKey = "<APIKey>"
  ..environment = CopilotEnvironment.PRODUCTION
  ..uiMode = CopilotUIMode.LIGHT
  ..waveGradientColor = waveGradientColor
  // Set this to true if you want to use a custom trigger
  // instead of the global trigger
  ..enableCustomTrigger = false
  // Example: ["fonts/TerminalDosis-Regular.ttf", "fonts/TerminalDosis-SemiBold.ttf"]
  ..fontPaths = ["<Array of required font paths>"]
  ..brandColor = "<Color Hex Code>";

ConvaOmniCopilot.initialize(assistantConfig);
In React Native, this should ideally be done in the componentDidMount of your main app component.
import { ConvaOmniCopilot } from '@slanglabs/conva-omni-copilot';

const waveGradientColor: string[][] = [
  // Color pair 1
  ["#0197FF", "#FFFFFF"],
  // Color pair 2
  ["#9701FF", "#FF0197"],
  // Color pair 3
  ["#FF0197", "#FF9701"],
];

const config: OmniCopilotConfiguration = {
  copilotId: '<CopilotId>',
  apiKey: '<ApiKey>',
  environment: CopilotEnvironment.Production,
  uiMode: CopilotUIMode.Light,
  waveGradientColor: waveGradientColor,
  brandColor: '<Color Hex Code>',
  // Set this to true if you want to use a custom trigger
  // instead of the global trigger
  enableAppTrigger: false,
};

ConvaOmniCopilot.initialize(config);
b. Start Interaction with the Omni Copilot
Initiates a user interaction session in the Conva OmniCopilot platform.
When the custom trigger is tapped, it should call the startUserInteraction() method to initiate the interaction.
ConvaOmniCopilot.startUserInteraction();
Parameters
text (optional): A string representing the text to be used for interaction.
ConvaOmniCopilot.startUserInteraction("Custom text to start the interaction");
ConvaOmniCopilot.startUserInteraction();
Parameters
text (optional): A string representing the text to be used for interaction.
ConvaOmniCopilot.startUserInteraction(text: "Custom text to start the interaction");
ConvaOmniCopilot.startUserInteraction();
Parameters
text (optional): A string representing the text to be used for interaction.
ConvaOmniCopilot.startUserInteraction("Custom text to start the interaction");
c. Displaying the Omni Copilot trigger
Trigger refers to the UI element that appears on the screen, which the user will click on to bring up the Copilot.
Incorporate the trigger component into your UI definition at the desired location to display the trigger (typically represented by a microphone icon), often positioned alongside the search bar.
The trigger is sticky, which means it will show up on all Activities once it is made visible. To prevent the trigger from showing up on specific activities, you will need to call the SDK's trigger-suppression API, as sketched below.
Note: Follow this step only if you are using the global trigger.
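A minimal sketch, assuming a hideTrigger/showTrigger pair on the SDK's UI handle; these names are placeholders rather than confirmed API, so check the SDK reference for the actual calls:
// Hypothetical API: suppress the global trigger for this Activity
ConvaOmniCopilot.getUI().hideTrigger(this);
// Hypothetical API: restore the trigger on screens that allow it
ConvaOmniCopilot.getUI().showTrigger(this);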
API to reset conversation history: the SDK retains conversation history on the UI by default, so re-invoking the surface continues from the user's last session. If the app wants to reset the conversation history at any point, it can call the following API:
Param: resetConversationHistory (Boolean)
After this API is called, the SDK will refresh the UI context (prompts, hints) for the next user interaction.
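A minimal sketch of the call, assuming the method is named after the parameter above; the name is a placeholder rather than confirmed API, so check the SDK reference:
// Hypothetical API: pass true to clear the retained conversation history
ConvaOmniCopilot.resetConversationHistory(true);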
Each hint surfaced in the UI context has the following structure:
- `hint`: String
- Description: The hint text.
- `url`: String (optional)
- Description: The URL associated with the hint.
- `params`: `MutableMap<String, String>`
- Description: Additional parameters associated with the hint.
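For illustration only, a hint with these fields might be assembled as follows; the Hint type and its constructor are assumptions, not the SDK's confirmed API:
Map<String, String> params = new HashMap<>();
params.put("category", "skincare"); // hypothetical extra parameter
// Hypothetical type: hint text, an optional deep-link URL, and parameters
Hint hint = new Hint("Buy sunscreen", "https://example.com/sunscreen", params);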
The app will need to register actions that can be invoked by the Copilot whenever it receives the necessary information from the user. These actions are invoked automatically whenever the user provides input, via text or voice, and the backend processes it into an app action.
ConvaOmniCopilot.setAction(new CopilotAction() {
    @Override
    public void onSearch(@NonNull SearchInfo searchInfo) {
        // Fire the search action
        // You can access searchInfo properties to perform the search
        // For example:
        // System.out.println("Search term: " + searchInfo.searchTerm);
    }

    @Override
    public void onNavigation(@NonNull NavigationInfo navigationInfo) {
        // Fire the navigation action
        // You can access navigationInfo properties to perform navigation
        // For example:
        // System.out.println("Navigation target: " + navigationInfo.target);
    }
});
// Set the action handler via the setAction method
class AppAction implements CopilotAction {
  @override
  void onNavigation(NavigationInfo navigationInfo) {
    // Handle navigation requests
  }

  @override
  void onSearch(SearchInfo searchInfo) {
    // Handle search requests
  }
}

var action = AppAction();
ConvaOmniCopilot.setAction(action);
ConvaOmniCopilot.setAction({
onSearch: (searchInfo: any) => {
// Fire the search action
// You can access searchInfo properties to perform the search
// For example:
// const searchTerm: string = searchInfo["searchTerm"];
},
onNavigation: (navigationInfo: any) => {
// Fire the navigation action
// You can access navigationInfo properties to perform the navigation
// For example:
// var navigationTarget : string = navigationInfo["target"]
}
});
The parameter SearchInfo contains the breakdown of the original search request. Its structure is as described below:
// Type definitions
public class SearchInfo {
public String searchTerm; // returns the full search term
public @Nullable String category; // returns the category for the user query
public @Nullable String app; // returns the app for the user query
public @Nullable List<FilterInfo> filters; // returns the list of filters applied
public @Nullable SortInfo sortInfo; // returns the sorting info applied
}
public class FilterInfo {
public String key; // returns the filter key
public Map<String, Object> params; // Additional parameters associated with the filter.
}
public class SortInfo {
public String sortKey; // returns the sort key
}
class SearchInfo {
String get searchTerm // (String): Returns the entire search term.
String? get category // (String, Nullable): Returns the category for the user query, if available.
String? get app // (String, Nullable): Returns the app for the user query, if available.
List<FilterInfo>? get filters // (List of FilterInfo, Nullable): Returns the list of filters applied, if any.
SortInfo? get sortInfo // (SortInfo, Nullable): Returns the sorting info applied, if available.
}
class FilterInfo {
String get filterCategory // (String): Returns the filter category.
String get filterLabel // (String): Returns the filter label.
String? get filterId // (String, Nullable): Returns the filter ID, if available.
}
class SortInfo {
String? get sortKey // (String, Nullable): Returns the sort key.
}
// When the user searches for something
// This is how the SearchInfo parameter would be populated
// Example
{
"filters": [],
"app": "",
"category": "",
"sortInfo": {
"sortKey": "popularity"
},
"searchTerm": "sunscreen"
}
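For instance, an Android onSearch handler might unpack these fields as follows. This is a sketch based on the type definitions above; applySort, applyFilter, and showSearchResults stand in for your app's own logic:
@Override
public void onSearch(@NonNull SearchInfo searchInfo) {
    String query = searchInfo.searchTerm; // e.g. "sunscreen"
    // The @Nullable fields may be absent, so guard before use
    if (searchInfo.sortInfo != null) {
        applySort(searchInfo.sortInfo.sortKey); // e.g. "popularity"
    }
    if (searchInfo.filters != null) {
        for (FilterInfo filter : searchInfo.filters) {
            applyFilter(filter.key, filter.params);
        }
    }
    showSearchResults(query);
}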
The parameter NavigationInfo contains the breakdown of the original navigation request. Its structure is as described below:
public class NavigationInfo {
public String target; // Represents the target of the navigation action.
public @Nullable String url; // Represents the URL associated with the navigation action, if any.
public @Nullable Map<String,String> parameters; // Represents additional parameters associated with the navigation action.
}
class NavigationInfo {
String get target; // Represents the target of the navigation action.
String? get url; // Represents the URL associated with the navigation action, if any.
Map<String,String>? get parameters; // Represents additional parameters associated with the navigation action.
}
// When the user asks to navigate somewhere
// This is how the NavigationInfo parameter would be populated
// Example
{
"target": "home",
"url": "<Url to your target>"
}
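Similarly, an onNavigation handler might route on these fields; in this sketch, openUrl and navigateTo stand in for your app's own navigation logic:
@Override
public void onNavigation(@NonNull NavigationInfo navigationInfo) {
    if (navigationInfo.url != null) {
        openUrl(navigationInfo.url); // a deep link takes precedence when provided
    } else {
        navigateTo(navigationInfo.target); // e.g. "home"
    }
}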
f. Notifying the current context to Omni Copilot
The following APIs can be used to provide contextual information to the Omni Copilot, and to filter content based on the selected app.
// Update the user profile
ConvaOmniCopilot.updateUserProfileInfo(
    "<user_id>",
    new HashMap<String, String>() {
        {
            // Any additional user info
            put("user_info_1", "info_1");
            put("user_info_2", "info_2");
        }
    }
);
// Notify text search
ConvaOmniCopilot.notifyTextSearch("<SearchString>");
// Notify CTR event
ConvaOmniCopilot.notifyCTREvent("eventInfo",
new HashMap<String, String>() {
{
// The section that was clicked. Here are some
// examples of events
// "NavigatedToProductPage"
// "AddedToCart"
// "AddedToWishlist"
put("eventName","<event>");
// The product item that was clicked
// Eg: "Organic Onions"
put("itemName","<item>");
}
});
// Update the user profile
Map<String, String> userMetaData = {
'user_info_1': '<info_1>',
'user_info_2': '<info_2>',
};
ConvaOmniCopilot.setUserInfo("<user_id>", userMetaData);
// Notify text search
Map<String, String> searchMetaData = {
'searchName1': '<searchData1>',
'searchName2': '<searchData2>',
};
ConvaOmniCopilot.notifyTextSearch("<searchItem>", searchMetaData);
// Notify CTR event
Map<String, String> eventMetaData = {
// The section that was clicked. Here are the
// supported strings
// "NavigatedToProductPage"
// "AddedToCart"
// "AddedToWishlist"
'eventName': '<event>',
// The product item that was clicked
// Eg: "Organic Onions"
'itemName': '<item>',
};
ConvaOmniCopilot.notifyCTREvent("<eventInfo>", eventMetaData);
// TBD
h. Copilot Lifecycle Events
The Omni Copilot handles most of the heavy lifting of interacting with end-users and notifies the app when there is some action to be performed by the app. But in some cases, apps may want to be notified of low-level events that the Copilot is aware of, for example, whether the user clicked the trigger (the microphone button) or whether the Copilot finished initializing.
The Copilot's lifecycle events API provides access to these low-level events.
Registering for events
The app can register with the Copilot to be notified of all interesting lifecycle events via the setLifeCycleObserver method.
ConvaOmniCopilot.setLifecycleObserver(new OmniLifecycleObserver() {
@Override
public void onCopilotInitSuccess() {
// Called when the Conva OmniCopilot initialization is successful.
}
@Override
public void onCopilotInitFailure(String description) {
// Called when the Conva OmniCopilot initialization fails, providing a description of the failure.
}
@Override
public void onCopilotInteractionBegin(boolean isVoice) {
// Called when a user interaction with Conva OmniCopilot begins, indicating whether the interaction is voice-initiated (`true`) or not (`false`).
}
@Override
public void onCopilotInteractionEnd(boolean isCanceled) {
// Called when a user interaction with Conva OmniCopilot ends, indicating whether the interaction was canceled (`true`) or not (`false`).
}
@Override
public void onCopilotSurfaceDismissed() {
// Called when the Conva OmniCopilot surface (e.g., UI overlay) is dismissed.
}
@Override
public void onCopilotError(CopilotError errorType) {
// Callback method invoked when an error occurs during a Copilot interaction.
//
// [errorType] The error that occurred during the Copilot interaction.
// Possible values are defined in the [CopilotError] enum.
}
});
class _MyAppState extends State<MyApp> implements
CopilotLifeCycleObserver {
@override
void initState() {
super.initState();
ConvaOmniCopilot.setLifeCycleObserver(this);
}
@override
void onCopilotInitFailure(String description) {
// Called when the Conva OmniCopilot initialization fails, providing a description of the failure.
}
@override
void onCopilotInitSuccess() {
// Called when the Conva OmniCopilot initialization is successful.
}
@override
void onCopilotInteractionBegin(bool isVoice) {
// Called when a user interaction with Conva OmniCopilot begins, indicating whether the interaction is voice-initiated (`true`) or not (`false`).
}
@override
void onCopilotInteractionEnd(bool isCanceled) {
// Called when a user interaction with Conva OmniCopilot ends, indicating whether the interaction was canceled (`true`) or not (`false`).
}
@override
void onCopilotSurfaceDismissed() {
// Called when the Conva OmniCopilot surface (e.g., UI overlay) is dismissed.
}
@override
void onCopilotError(CopilotError error) {
// Callback method invoked when an error occurs during a Copilot interaction.
//
// [error] The error that occurred during the Copilot interaction.
// Possible values are defined in the [CopilotError] enum.
}
}
ConvaOmniCopilot.setLifeCycleObserver({
onCopilotInitFailure: (description: string) => {
// Called when the Conva OmniCopilot initialization fails,
// providing a description of the failure.
},
onCopilotInitSuccess: () => {
// Called when the Conva OmniCopilot initialization is successful.
},
onCopilotInteractionBegin: (isVoice: boolean) => {
// Called when a user interaction with Conva OmniCopilot begins,
// indicating whether the interaction is voice-initiated (`true`) or not (`false`).
},
onCopilotInteractionEnd: (isCanceled: boolean) => {
// Called when a user interaction with Conva OmniCopilot ends,
// indicating whether the interaction was canceled (`true`) or not (`false`).
},
onCopilotSurfaceDismissed: () => {
// Called when the Conva OmniCopilot surface (e.g., UI overlay) is dismissed.
}
});