Copilot Mode

This page describes how to integrate the Conva.AI Assistant in copilot mode into an app.

Conva.AI Copilot is available for the following platforms:

  • Android

  • iOS - Coming soon

  • Flutter (Android only for now)

Add Dependencies

To use the Copilot, first add the required dependencies to your project.

First, add the Maven repository URL to your project-level build.gradle file. Open the build.gradle file and add the following code inside the allprojects > repositories block.

allprojects {
    repositories {
        maven {
            url "https://pkgs.dev.azure.com/slanglabs-convaai/prod/_packaging/public_prod/maven/v1"
        }
    }
}

Next, open your app-level build.gradle file and add the conva-ai-copilot dependency.

dependencies {
    implementation 'in.slanglabs.conva:conva-ai-copilot:1.0.2-beta'
}

Setup the Copilot

To set up the Copilot, follow these steps:

  1. Initialize and Setup:

    • Initialize the ConvaAI SDK.

    • Set up the Copilot using the setup() API, passing in a ConvaAIOptions object to configure the assistant.

  2. Register Handlers for Agents:

    • Handle user interactions when the user speaks or types to the Copilot.

  3. Register Handlers for Suggestions:

    • Assistant-Handled Suggestions: Suggestions that generate follow-on queries. Clicking these does not invoke the suggestion handler.

    • App-Handled Suggestions: Suggestions meant for app-specific actions, such as initiating a search within the app. Clicking these triggers the suggestion handler.

// Initialize the ConvaAI SDK
ConvaAI.init(
    id = "assistant_id",
    key = "api_key",
    version = "LATEST",
    application = applicationContext
)
val options = ConvaAIOptions.Builder()
    .setCapabilityHandler(object : ConvaAIHandler {
        override fun onCapability(
            response: ConvaAIResponse, 
            interactionData: ConvaAIInteraction, 
            isFinal: Boolean) {
            // Handle the response from the assistant
        }
    })
    .setSuggestionHandler(object : ConvaAISuggestionHandler {
        override fun onSuggestion(suggestion: ConvaAISuggestion) {
            // Handle the selected suggestion
        }
    })
    .build()
    
ConvaAICopilot.setup(options)

Attach the Copilot to UI

To prepare the Copilot UI, you need to call the attach API of the ConvaAICopilot. This step readies the Copilot to appear on top of the current screen, but it remains invisible until you call startConversation().

You only need to call attach() once during initial setup. The Copilot UI tracks screen transitions and persists unless dismissed by the user or the app.

// Attach the Copilot to the current activity
ConvaAICopilot.attach(getActivity())

Start the Copilot

To bring up the Copilot experience, call the startConversation() method of the ConvaAICopilot. This displays the Copilot on top of the current screen, allowing the user to interact with it.

ConvaAICopilot.startConversation()
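
A typical pattern is to trigger the conversation from a UI element in your activity. The sketch below assumes a hypothetical "Help" button (copilotButton and R.id.copilot_button are not part of the SDK); only attach() and startConversation() are the documented Copilot APIs:

```kotlin
// Attach once, e.g. in onCreate(), so the Copilot is ready to show.
ConvaAICopilot.attach(this)

// Hypothetical trigger: a "Help" button in the app's own layout.
val copilotButton: Button = findViewById(R.id.copilot_button)
copilotButton.setOnClickListener {
    // Brings the Copilot UI up over the current screen.
    ConvaAICopilot.startConversation()
}
```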

Handle Agent Responses

When the Copilot is visible and the user either types or speaks to it, the Copilot sends the request to the Conva.AI backend orchestration service. This service determines the appropriate Agent to invoke and returns the response to the SDK.

To handle the Agent's response, you need to register a handler, as shown in the setup step above.

val options = ConvaAIOptions.Builder()
    .setCapabilityHandler(object : ConvaAIHandler {
        override fun onCapability(
            response: ConvaAIResponse, 
            interactionData: ConvaAIInteraction, 
            isFinal: Boolean) {
            // Handle the response from the assistant
        }
    })
    .build()

When using the Copilot, the handler receives streaming responses: it is called multiple times until the isFinal parameter is true.
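
Because intermediate callbacks carry partial results, one common pattern is to defer any one-time work until the final callback. This is a sketch of the handler body registered during setup(); handleFinalResponse is a hypothetical app function:

```kotlin
override fun onCapability(
    response: ConvaAIResponse,
    interactionData: ConvaAIInteraction,
    isFinal: Boolean
) {
    if (!isFinal) {
        // Intermediate callback: the message may still be streaming in,
        // so limit work to lightweight UI updates.
        return
    }
    // Final callback: the response is complete and safe to act on.
    handleFinalResponse(response)
}
```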

ConvaAIResponse object

The ConvaAIResponse object contains three main fields:

  • message: Contains either the status message or the answer to the question. This string is displayed in the message area and spoken aloud (unless muted by the user).

  • related_queries: Controls the list of suggestions that appear after each user query.

  • params: A Map containing the custom parameters configured in the Agent blueprint.
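
Inside the handler, these fields can be read directly. The sketch below assumes a custom parameter named "action" configured in a hypothetical Agent blueprint; the Kotlin accessor names are assumptions based on the field list above and may differ in the actual SDK:

```kotlin
override fun onCapability(
    response: ConvaAIResponse,
    interactionData: ConvaAIInteraction,
    isFinal: Boolean
) {
    // "message" holds the status text or answer described above.
    val statusText = response.message

    // "action" is a hypothetical custom parameter from an Agent blueprint.
    val action = response.params?.get("action")
    if (action == "open_cart") {
        // App-specific handling, e.g. navigate to the cart screen.
    }
}
```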

ConvaAIInteraction object

The ConvaAIInteraction object contains data relevant to the current interaction. By default, it contains an array of Suggestion objects populated from the related_queries field of the ConvaAIResponse object.

Inside the onCapability callback, the app can use the ConvaAIInteraction instance for more advanced functionality. This allows the app to customize the message displayed to the user, as well as set the suggestions that should be shown on the surface.

val options = ConvaAIOptions.Builder()
    .setCapabilityHandler(object : ConvaAIHandler {
        override fun onCapability(response: ConvaAIResponse, interactionData: ConvaAIInteraction, isFinal: Boolean) {
            // Update the message that is to be displayed to the user
            interactionData.setMessage("Display this message to the user")
        
            // Update the list of suggestions that are shown to the user
            // NOTE: These suggestions are always handled by the app
            // via the ConvaAISuggestionHandler registered during setup()
            interactionData.setSuggestionStrings(mutableListOf("Potato", "Tomato", "Onion"))
        
            // Set the suggestion index that should be highlighted on the copilot surface UI, here we set it to 0
            interactionData.setSelectedSuggestionIndex(0)
        }
    })
    // other builder method calls
    .build()
