Headless Mode
This page details how to integrate the Conva.AI Assistant into an app in headless mode.
Conva.AI provides SDKs for the following platforms:
Android
Python
iOS
Flutter
Typescript
Rust - Coming soon
Let's walk through the integration process.
Setting up the build environment
Android
First, add the Maven repository URL to your project-level build.gradle file. Open the build.gradle file and add the following code inside the allprojects > repositories block:

allprojects {
    repositories {
        maven {
            url "https://pkgs.dev.azure.com/slanglabs-convaai/prod/_packaging/public_prod/maven/v1"
        }
    }
}

Next, open your app-level build.gradle file and add the conva-ai-core dependency inside the dependencies block:

dependencies {
    implementation 'in.slanglabs.conva:conva-ai-core:1.0.2-beta'
}

Python
Install the conva-ai package:

$ pip install conva-ai

Flutter
Add the conva_ai_core package to your project:

$ flutter pub add conva_ai_core

Once done, run the command 'dart pub get' and ensure Conva.AI is added to the dependencies:

dependencies:
  conva_ai_core: ^0.1.3

iOS
Add the CocoaPods repository path to your Podfile:

# Add this to your Podfile
source 'https://github.com/CocoaPods/Specs.git'

Next, open your podspec file and add the ConvaAICore dependency:

s.dependency 'ConvaAICore', '1.0.3'

Once done, run the command 'pod install'.
Initializing the Assistant
First, initialize Conva.AI using the credentials of the Assistant created via Magic Studio.
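For example, with the Python SDK the initialization might look like the sketch below. The class name AsyncConvaAI and its constructor arguments (assistant_id, assistant_version, api_key) are assumptions based on the SDK's published examples; check the SDK reference for your platform for the exact names.

# Minimal Python sketch; names are assumptions, see the note above.
from conva_ai import AsyncConvaAI

client = AsyncConvaAI(
    assistant_id="<YOUR_ASSISTANT_ID>",            # from Magic Studio
    assistant_version="<YOUR_ASSISTANT_VERSION>",  # e.g. "1.0.0"
    api_key="<YOUR_API_KEY>",                      # from Magic Studio
)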
Invoking an Agent
Next, pass unstructured text input to Conva.AI and let it invoke the appropriate Agent and return the response.
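In Python, for instance, this could look like the following sketch; the invoke_capability method and its stream flag are assumed names based on the SDK's examples.

# Hypothetical sketch: let the Assistant pick the right Agent for free-form input.
import asyncio
from conva_ai import AsyncConvaAI

client = AsyncConvaAI(
    assistant_id="<YOUR_ASSISTANT_ID>",
    assistant_version="<YOUR_ASSISTANT_VERSION>",
    api_key="<YOUR_API_KEY>",
)

async def main():
    try:
        response = await client.invoke_capability(
            "remind me to pay rent tomorrow at 9 am", stream=False
        )
        print(response.message)
    except Exception as e:
        # See the notes below: handle invocation failures explicitly.
        print(f"Invocation failed: {e}")

asyncio.run(main())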
Notes
Ensure that this method is called from an async context.
Proper error handling should be implemented to manage any potential issues during the invocation.
Response object
The response object contains the following key fields:
capability_name: The name of the Agent that was triggered.
message: The string containing either the answer to the original query or a status message. Agent creators can update this via Magic Studio.
related_queries: An array of strings with queries related to the current query and app context. Agent creators can update this field's description if needed.
params: This object contains custom parameters specific to the invoked Agent. For example, if an Agent is created with task_name, date, and time parameters (as configured in Magic Studio), these fields will appear in the params object.
history: The current conversation history. This can be used by the app to continue the current conversation further (more on this later).
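As an illustration, a Python app might read these fields as sketched below; the exact access style (attributes vs. dictionary keys) and field names such as history are assumptions that may vary by SDK version.

# Hypothetical sketch: inspect the key fields of a Response object.
def show_response(response):
    print(response.capability_name)   # name of the Agent that was triggered
    print(response.message)           # answer or status message
    print(response.related_queries)   # suggested follow-up queries
    print(response.params)            # Agent-specific parameters, e.g. task_name, date, time
    return response.history           # pass into the next call to keep context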

Invoking a specific Agent
To invoke a specific Agent, you can refer to it by name and pass the input string. You can use this in streaming (async) or non-streaming (sync) mode.
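A Python sketch might look as follows; the method name invoke_capability_name and its capability_name parameter are assumptions based on the SDK's examples, and reminder_creation is a hypothetical Agent name.

# Hypothetical sketch: invoke a named Agent directly.
import asyncio
from conva_ai import AsyncConvaAI

client = AsyncConvaAI(
    assistant_id="<YOUR_ASSISTANT_ID>",
    assistant_version="<YOUR_ASSISTANT_VERSION>",
    api_key="<YOUR_API_KEY>",
)

async def main():
    response = await client.invoke_capability_name(
        "remind me to pay rent tomorrow",
        capability_name="reminder_creation",  # assumed parameter name
        stream=False,                         # set True for the streaming variant
    )
    print(response.message)

asyncio.run(main())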
Notes
Ensure that this method is called from an async context.
Proper error handling should be implemented to manage any potential issues during the invocation.
Handling Streaming Response
Conva.AI supports a streaming version of these APIs. The response is sent back incrementally, and the listener is invoked multiple times. Use the is_final flag in the Response object to check whether the complete response has been received.
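In Python this might look like the sketch below, assuming that stream=True makes the call return an async iterator of partial Response objects (an assumption based on the SDK's examples).

# Hypothetical sketch: consume a streaming response until is_final is set.
import asyncio
from conva_ai import AsyncConvaAI

client = AsyncConvaAI(
    assistant_id="<YOUR_ASSISTANT_ID>",
    assistant_version="<YOUR_ASSISTANT_VERSION>",
    api_key="<YOUR_API_KEY>",
)

async def main():
    # Assumed behavior: stream=True yields partial responses incrementally.
    stream = await client.invoke_capability("plan a weekend trip", stream=True)
    async for partial in stream:
        print(partial.message)       # incrementally updated message
        if partial.is_final:
            print(partial.params)    # complete response received
            break

asyncio.run(main())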
Notes
Ensure that this method is called from an async context.
Proper error handling should be implemented to manage any potential issues during the invocation.
Invoking an Agent Group
When Agents are created via Magic Studio, they are all placed in the "default" group, but users can create their own groups and assign specific Agents to them. If you want to limit Agent invocation to specific groups, pass the group name to the APIs.
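For example, in Python (the capability_group parameter name is an assumption based on the SDK's examples, and "shopping" is a hypothetical group):

# Hypothetical sketch: restrict Agent selection to one group.
import asyncio
from conva_ai import AsyncConvaAI

client = AsyncConvaAI(
    assistant_id="<YOUR_ASSISTANT_ID>",
    assistant_version="<YOUR_ASSISTANT_VERSION>",
    api_key="<YOUR_API_KEY>",
)

async def main():
    response = await client.invoke_capability(
        "find me running shoes under 3000 rupees",
        capability_group="shopping",  # only Agents in this group are considered
        stream=False,
    )
    print(response.capability_name, response.message)

asyncio.run(main())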
Notes
Ensure that this method is called from an async context.
Proper error handling should be implemented to manage any potential issues during the invocation.
Handling Conversational Context
Conva.AI supports maintaining context across different conversations, allowing the AI to build upon prior interactions for more relevant responses.
Parameters
history (Optional): Available after the first response. Pass this from the previous response if you want to maintain conversational context in subsequent calls.
capabilityContext (Optional): Context related to a specific Agent, refining AI responses for that particular Agent.
Usage
On the first invocation, call the API without history.
On subsequent invocations, pass the history from the previous response if you want to maintain the conversation's context.
You can also pass the capabilityGroup or capabilityName along with the context to refine the AI's responses further, as in the sketch below.
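A Python sketch of a two-turn conversation might look like this; the history parameter and field name follow the description above but are assumptions for the Python SDK (other platforms use camelCase names such as capabilityContext).

# Hypothetical sketch: carry conversation history across two invocations.
import asyncio
from conva_ai import AsyncConvaAI

client = AsyncConvaAI(
    assistant_id="<YOUR_ASSISTANT_ID>",
    assistant_version="<YOUR_ASSISTANT_VERSION>",
    api_key="<YOUR_API_KEY>",
)

async def main():
    # First turn: no history yet.
    first = await client.invoke_capability("show flights from Mumbai to Delhi", stream=False)
    # Second turn: pass the previous history so "morning ones" resolves in context.
    followup = await client.invoke_capability(
        "only the morning ones",
        history=first.history,  # assumed parameter/field names
        stream=False,
    )
    print(followup.message)

asyncio.run(main())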
Notes
Ensure that this method is called from an async context.
Proper error handling should be implemented to manage any potential issues during the invocation.
Using the Copilot UI
The Copilot experience includes a well-designed bottom sheet that appears on top of the app and provides the following elements:
Text box: For users to type their queries.
Mic icon: To trigger Conva.AI's highly accurate speech recognition feature.
Message Area: Displays the message (the message field in the Response object).
Voice Feedback: Speaks the message back; can be optionally muted by the user.
Mute icon: To allow end-users to enable or disable the voice feedback.
Feedback Icons: Automatically appear after each response.
Dynamic Suggestions: Buttons showing related queries (the related_queries field in the Response object) or custom suggestions provided by the app via an API.
Here are the high-level steps to use the Copilot:
Set up the Copilot.
Theme the Copilot.
Register handlers for Agent handling and Suggestion handling.
Agent Handling: Handle responses generated by the Copilot.
Suggestion Handling: Handle user interactions with suggestion buttons directed to the app.
Attach the Copilot to an Activity.
Start the Copilot.
Handle the Agent responses and Suggestion clicks.