Integrating the Omni Copilot with your app
The Omni Copilot SDK is available for the following platforms:
Android Native
React Native (Android/iOS)
Flutter (Android/iOS)
1. Configure the build system
The first step is to update the app's build system to include the Conva Omni Copilot SDK.
Add the path to the Slang Maven repository to your top-level Gradle file:
// Add this to your project build.gradle file
allprojects {
    repositories {
        maven { url "https://gitlab.com/api/v4/projects/25706948/packages/maven" }
    }
}

Add the Omni Copilot dependency to your app's Gradle file:
// Add this to your app's Gradle file
dependencies {
    …
    implementation 'in.slanglabs.conva:conva-omni-copilot:1.+'
}

Install the Conva Omni Copilot package
Run the command below to install the required package inside your code repo.
$ flutter pub add conva_omni_copilot

Once done, run the command 'dart pub get' and ensure Conva Omni Copilot is added to the dependencies:
dependencies:
  conva_omni_copilot: ^1.1.5

For iOS:-
Add the path to the Slang CocoaPods repository to your Podfile:
# Add this to your podfile
source 'https://github.com/SlangLabs/cocoapod-specs'

Add support for granting microphone permission
In iOS, the user must explicitly grant permission for an app to access the user's data and resources. An app with the ConvaOmniCopilot requires access to the user's device microphone for voice interactions.
To comply with this requirement, you must add the NSMicrophoneUsageDescription key to the Info.plist file of your app and provide a message explaining why your app requires access to the microphone. The message is displayed only when the ConvaOmniCopilot needs to activate the microphone.
To add the key:
1. In the Xcode project, go to the Info tab.
2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.
3. From the list, select Privacy - Microphone Usage Description.
4. In the Value field to the right, provide a description for the added key. This description is displayed to the user when the app requests microphone access. For example: "We require microphone permission to enable the voice assistant platform"
The next step is to import Conva Omni Copilot in your Dart code.
Install the Conva Omni Copilot package
The next step is to install the required packages inside your code repo.
If you use yarn for managing your packages, run the command below:
yarn setup
If you use npm for managing your packages, run the command below:
npm setup
Because Slang uses native libraries, you need to link the package to your codebase by running the automatic linking steps.
For iOS:-
Add the path to the Slang CocoaPods repository to your Podfile, as shown in the Flutter iOS section above.
Add support for granting microphone permission by adding the NSMicrophoneUsageDescription key to your app's Info.plist with a message explaining why the app needs microphone access, following the same steps described in the Flutter iOS section above.
2. Code Integration
a. Initialization
The next step is to initialize the SDK with the keys and other required information.
For Android, our recommendation is to perform this initialization in the onCreate method of the Application class or the main Activity of your application.
For Flutter, this should ideally be done inside the main method.
For React Native, this should ideally be done in the componentDidMount of your main app component.
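For example, on Android this could look like the following minimal sketch. The initialize entry point and its parameters are placeholders, not the SDK's confirmed API; consult the SDK reference for the exact signature.

import android.app.Application

class MyApplication : Application() {
    override fun onCreate() {
        super.onCreate()
        // Sketch only: replace with the actual initialization call
        // from the Conva Omni Copilot SDK reference.
        ConvaOmniCopilot.initialize(
            this,                 // the Application instance
            "your-assistant-id",  // placeholder: credentials issued by Slang Labs
            "your-api-key"        // placeholder
        )
    }
}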
b. Start Interaction with the Omni Copilot
This initiates a user interaction session on the Conva Omni Copilot platform. When the custom trigger is tapped, the app should call the startUserInteraction() method to initiate the interaction.
Parameters
text (optional): A string representing the text to be used for the interaction.
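For instance, on Android a custom trigger's click handler could call it as in this sketch; startUserInteraction() is the method named above, but the object it is invoked on and the string overload are assumptions:

// Sketch: start a Copilot interaction from a custom trigger tap.
// micButton is the app's own trigger view (e.g. an ImageButton).
micButton.setOnClickListener {
    // Voice-first interaction; a seed string could be passed instead,
    // e.g. ConvaOmniCopilot.startUserInteraction("reorder my last purchase")
    ConvaOmniCopilot.startUserInteraction()
}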
c. Displaying the Omni Copilot trigger
The trigger is the UI element that appears on the screen, which the user clicks to bring up the Copilot.
For Global Trigger:-
For Inline Trigger:-
Incorporate the following component into your UI definition at the desired location to display the trigger (typically represented by a microphone icon), often positioned alongside the search bar.
For Global Trigger:-
The trigger is sticky, which means that it will show up on all Activities after it is made visible. To prevent the trigger from showing up on specific activities, you will need to call:
Note:- Follow this step only if you are using the Global Trigger.
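The exact call is not shown on this page; as a hedged sketch, a hypothetical per-activity opt-out could look like:

// Sketch only: getUI() and hideTrigger() are placeholder names, not the
// SDK's confirmed API; check the SDK reference for the actual call.
ConvaOmniCopilot.getUI().hideTrigger(activity)  // 'activity' = the Activity to exclude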
d. Customize Surface UI
To pause the Copilot Surface
To resume the Copilot Surface
To set a bottom margin to the Copilot Surface
To reset the Surface Context:-
API to reset conversation history: The SDK retains conversation history on the UI by default, so re-invoking the surface continues from the user's last session. If the app wants to reset the conversation history at any point, it can call the following API:
Param: (resetConversationHistory) => Boolean value
After this API is called, the SDK will refresh the UI context (prompts, hints) for the next user interaction.
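As a sketch, assuming a hypothetical resetSurfaceContext method (the method name is a placeholder; the boolean resetConversationHistory parameter is documented above):

// Sketch: clear retained conversation history so the next interaction
// starts fresh. The method name is illustrative.
ConvaOmniCopilot.resetSurfaceContext(resetConversationHistory = true)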
To set custom UI hints
HintInfo Properties:-
To minimize the Copilot Surface
To maximize the Copilot Surface
e. Implementing actions
The app will need to register actions that can be invoked by the Copilot whenever it receives the necessary information from the user. These actions are invoked automatically whenever the user provides input via text or voice and the backend processes it into an app action.
The parameter SearchInfo contains the breakdown of the original search request. Its structure is described below:
The parameter NavigationInfo contains the breakdown of the original navigation request. Its structure is described below:
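As an illustrative sketch only (the registration method, the action interface, and the callback names are assumptions; SearchInfo and NavigationInfo are the types named above):

// Sketch: register app actions the Copilot can invoke. The action
// interface and callback names are illustrative, not the SDK's
// confirmed API.
ConvaOmniCopilot.setAction(object : CopilotAction {
    override fun onSearch(info: SearchInfo) {
        performInAppSearch(info)   // the app's own search helper (hypothetical)
    }

    override fun onNavigation(info: NavigationInfo) {
        navigateToScreen(info)     // the app's own navigation helper (hypothetical)
    }
})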
f. Notifying the current context to Omni Copilot
The following APIs can be used to provide context or filter content based on the selected app.
For notifying the current app name
For notifying the current app category
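The concrete setter names are not shown on this page; a sketch with placeholder method names:

// Sketch: notify the Copilot of the current app context. Both method
// names are placeholders for the SDK's actual setters.
ConvaOmniCopilot.setAppName("My Shopping App")
ConvaOmniCopilot.setAppCategory("retail")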
g. Notifying the Omni Copilot
The following APIs can be used to provide contextual information to the Omni Copilot.
h. Copilot Lifecycle Events
The Omni Copilot handles most of the heavy lifting of interacting with end-users and notifies the app when there is some action to be performed by the app. But in some cases, apps may want to be notified of low-level events that the Copilot is aware of, for example, when a user clicks on the trigger (the microphone button) or when the Copilot is initialized.
The Copilot's Lifecycle Events API provides access to these low-level Copilot events.
Registering for events
The app can register with the Copilot to be notified of all interesting life-cycle events via the setLifeCycleObserver method.
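setLifeCycleObserver is the method named above; the observer type and its callbacks in this sketch are placeholders:

import android.util.Log

// Sketch: observe low-level Copilot lifecycle events. The observer
// interface and callback names are illustrative, not the SDK's
// confirmed API.
ConvaOmniCopilot.setLifeCycleObserver(object : CopilotLifeCycleObserver {
    override fun onCopilotInitialized() {
        Log.d("Copilot", "Copilot ready")
    }

    override fun onTriggerClicked() {
        Log.d("Copilot", "User tapped the trigger")
    }
})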