Copilot Mode
This page describes how to integrate the Conva.AI Assistant in copilot mode into an app.
Conva.AI Copilot is available for the following platforms:
Android
iOS - Coming soon
Flutter (Android only for now)
Add Dependencies
To use the Copilot, first add the required dependencies to your project.
For Android, add the maven repository URL to your project-level build.gradle file. Open the file and add the following code inside the allprojects > repositories block:
```groovy
allprojects {
    repositories {
        maven {
            url "https://pkgs.dev.azure.com/slanglabs-convaai/prod/_packaging/public_prod/maven/v1"
        }
    }
}
```

Next, open your app-level build.gradle file and add the conva-ai-copilot dependency:

```groovy
dependencies {
    implementation 'in.slanglabs.conva:conva-ai-copilot:1.0.2-beta'
}
```

For Flutter, add the package with:

```shell
$ flutter pub add conva_ai_copilot
```

Once done, run the command dart pub get and ensure Conva.AI Copilot is added to the dependencies in pubspec.yaml:

```yaml
dependencies:
  conva_ai_copilot: ^0.1.3
```

Setup the Copilot
To set up the Copilot, follow these steps (a minimal Kotlin sketch follows this list):

1. Initialize and Setup: Initialize the ConvaAI SDK, then set up the Copilot using the setup() API, passing in a ConvaAIOptions object to configure the assistant.
2. Register Handlers for Agents: Handle user interactions when they speak to the Copilot.
3. Register Handlers for Suggestions. There are two kinds:
   - Assistant-Handled Suggestions: Suggestions that generate follow-on queries. Clicking these does not invoke the suggestion handler.
   - App-Handled Suggestions: Suggestions meant for app-specific actions, such as initiating a search within the app. Clicking these triggers the suggestion handler.
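The sketch below walks through these steps in Kotlin. The exact API names (ConvaAI.init(), the handler interfaces, and their callback signatures) are assumptions inferred from the steps above; verify them against the conva-ai-copilot package for your SDK version.

```kotlin
// Sketch only: API names below are assumptions; check the conva-ai-copilot
// package for the exact signatures. Imports come from that package.

// 1. Initialize the ConvaAI SDK (the ID/key parameters are placeholders).
ConvaAI.init(assistantId = "YOUR_ASSISTANT_ID", apiKey = "YOUR_API_KEY")

// 2. Configure and set up the Copilot with a ConvaAIOptions object.
val options = ConvaAIOptions.Builder().build() // add configuration as needed
ConvaAICopilot.setup(options)

// 3. Register a handler for Agent responses.
ConvaAICopilot.setResponseHandler(object : ConvaAIResponseHandler {
    override fun onCapability(response: ConvaAIResponse, interaction: ConvaAIInteraction) {
        // Inspect response.message / response.params here (see below).
    }
})

// 4. Register a handler for app-handled suggestions.
ConvaAICopilot.setSuggestionHandler(object : ConvaAISuggestionHandler {
    override fun onSuggestion(suggestion: Suggestion) {
        // Run an app-specific action, e.g. start an in-app search.
    }
})
```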
Attach the Copilot to UI
To prepare the Copilot UI, call the attach API of ConvaAICopilot. This step readies the Copilot to appear on top of the current screen, but it remains invisible until you call startConversation().
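A one-line sketch, assuming attach() takes the current Activity (the exact parameter list may differ in your SDK version):

```kotlin
// Readies the Copilot over the current screen; it stays invisible until
// startConversation() is called. The Activity argument is an assumption.
ConvaAICopilot.attach(this)
```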
Start the Copilot
To bring up the Copilot experience, call the startConversation() method of the ConvaAICopilot. This displays the Copilot on top of the current screen, allowing the user to interact with it.
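For example, assuming the no-argument form:

```kotlin
// Displays the Copilot on top of the current screen for user interaction.
ConvaAICopilot.startConversation()
```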
Handle Agent Responses
When the Copilot is visible and the user either types or speaks to it, the Copilot sends the request to the Conva.AI backend orchestration service. This service determines the appropriate Agent to invoke and returns the response to the SDK.
To handle the Agent's response, register a handler, as shown in the setup step above.
ConvaAIResponse object
The ConvaAIResponse object contains three main fields:
message: Contains either the status message or the answer to the question. This string is displayed in the message area and spoken aloud (unless muted by the user).
related_queries: Controls the list of suggestions that appear after each user query.
params: A Map containing the custom parameters configured in the Agent blueprint.
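A hedged sketch of reading these fields; the Kotlin property names (message, relatedQueries, params) are assumed to mirror the field names above, and "search_term" is a hypothetical custom parameter:

```kotlin
// Property names are assumptions based on the fields listed above.
fun handleResponse(response: ConvaAIResponse) {
    val message: String = response.message                  // status or answer text
    val suggestions: List<String> = response.relatedQueries // follow-on suggestions
    val params: Map<String, Any> = response.params          // blueprint parameters

    // "search_term" is a hypothetical parameter from an Agent blueprint.
    (params["search_term"] as? String)?.let { term ->
        // e.g. trigger an in-app search with `term`
    }
}
```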
ConvaAIInteraction object
The ConvaAIInteraction object contains data relevant to the current interaction. By default, it contains an array of Suggestion objects populated from the related_queries field of the ConvaAIResponse object.
Inside the onCapability callback, the app can use the ConvaAIInteraction instance for more advanced functionality, such as customizing the message displayed to the user and setting the suggestions shown on the surface.
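For example, the app might override the displayed message and replace the default suggestions; the setter names and the Suggestion constructor below are assumptions:

```kotlin
// Setter names and the Suggestion constructor are assumptions.
override fun onCapability(response: ConvaAIResponse, interaction: ConvaAIInteraction) {
    // Replace the default message shown (and spoken) to the user.
    interaction.setMessage("Found 3 matching items")

    // Replace the default related_queries-based suggestions with app-defined ones.
    interaction.setSuggestions(
        listOf(Suggestion("Open the first result"), Suggestion("Refine search"))
    )
}
```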