Additional APIs and customisations to measure and improve the experience further
2.4 The onSearch callback
Whenever the Assistant detects that the user is searching for items in the app, it will try to break the search request down into its basic components and invoke the onSearch callback associated with the search user journey. When this callback is invoked, the app is expected to:
Consume the details of the search request via the SearchInfo parameter.
Fire the app's search request.
The callback looks like this:
public void onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // The searchItem will have the relevant part of the end-user's search
    // request and will automatically be in English, even if the user spoke
    // in a different language.
    String searchItem = searchInfo.getSearchString();
    // Launch SearchResultsActivity using "searchItem"
    // ...
}
The following are some examples of commands that could trigger this journey:
"Onions"
"Show me onions"
"3 kgs of organic onions"
"Looking for fresho organic onions"
"Searching for Maggi Instant noodles in grocery"
"2 rs head and shoulders shampoo"
SearchInfo Parameter
The SearchInfo parameter contains the breakdown of the original search request. Its structure is described below:
class SearchInfo {
    public Item getItem();
    public String getSearchString();
    public List<FilterInfo> getFilters();
    public SortingInfo getSorting();
    public boolean isAddToCart();
}

class Item {
    public String getCategory();            // The category that the user specified
    public String getBrand();               // The brand name identified by Slang from what the user spoke
    public String[] getProductNames();      // The product names (like "organic", "pulpy orange"), if any
    public Quantity getQuantity();          // The quantity, if any is spoken by the user
    public Size getSize();                  // The size, if any is spoken by the user
    public Price getPrice();                // The price value, if any is spoken by the user
    public String getDescription();         // The fully constructed search string without size and quantity
    public String getCompleteDescription(); // The fully constructed search string with size and quantity
}
// When the user searches for something like
// "4 fresho organic onions 3kg"
// this is how the SearchInfo parameter would be populated:
{
  "item": {
    "brand": "Fresho Organic",
    "description": "Fresho Organic",
    "completeDescription": "Fresho Organic 3 kg onion",
    "quantity": { "amount": 4, "unit": "UNKNOWN" },
    "size": { "amount": 3, "unit": "KILOGRAM" },
    "productNames": [ "onions" ]
  },
  "isAddToCart": true
}
class SearchInfo {
  Item item;
  bool isAddToCart;
  SortingInfo sortingInfo;
  List<FilterInfo> filterInfoList;
}

class Item {
  String category;            // The category that the user specified
  String brand;               // The brand name identified by Slang from what the user spoke
  List<String> productNames;  // The product names (like "organic", "pulpy orange"), if any
  Quantity quantity;          // The quantity, if any is spoken by the user
  Size size;                  // The size, if any is spoken by the user
  Price price;                // The price value, if any is spoken by the user
  String description;         // The fully constructed search string without size and quantity
  String completeDescription; // The fully constructed search string with size and quantity
}
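To make the description fields concrete, here is a small self-contained sketch (stub code, not the SDK's actual implementation) of how completeDescription could be assembled from the parsed brand, size, and product names:

```java
import java.util.List;

public class SearchDescription {
    // Illustrative only: mirrors how a string like "Fresho Organic 3 kg onion"
    // could be composed from the parsed parts. The real SDK builds this internally.
    static String completeDescription(String brand, int sizeAmount,
                                      String sizeUnit, List<String> productNames) {
        StringBuilder sb = new StringBuilder(brand);
        // Append the size, e.g. "3 kg"
        sb.append(' ').append(sizeAmount).append(' ').append(sizeUnit);
        // Append each product name, e.g. "onion"
        for (String name : productNames) {
            sb.append(' ').append(name);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Prints: Fresho Organic 3 kg onion
        System.out.println(completeDescription("Fresho Organic", 3, "kg", List.of("onion")));
    }
}
```

In practice the app would read these values through the getters shown above rather than compose them itself.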
As part of its Analytics offering, CONVA provides a few metrics that can be used to track the effectiveness of adding the Voice Search experience.
VTS - Voice-to-Text - tracks how often users choose voice rather than text search in any search session
CTR - Click-Through Rate - lets the app compare the click-throughs on the search results page depending on whether the user landed on it via text or voice search
In order to compute these metrics, the app is expected to share a couple of events at the appropriate times. Let's look at each of the required events.
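As a rough illustration of what the VTS metric captures (not CONVA's actual server-side computation), it can be thought of as the share of search sessions that used voice:

```java
public class VtsMetric {
    // Illustrative sketch: VTS as the fraction of search sessions that used
    // voice. CONVA computes this from the events the app reports; this helper
    // only exists to make the metric's meaning concrete.
    static double voiceToTextShare(int voiceSearches, int textSearches) {
        int total = voiceSearches + textSearches;
        if (total == 0) {
            return 0.0; // no search sessions yet
        }
        return (double) voiceSearches / total;
    }

    public static void main(String[] args) {
        // 30 voice searches out of 120 total searches -> 0.25
        System.out.println(voiceToTextShare(30, 90));
    }
}
```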
2.5.1 Text Search Events
CONVA provides an API with which the app can specify if a search happened via UI/Text based interactions. This event would be used to determine the adoption and engagement of the voice compared to the traditional UI-based interactions.
2.5.2 Click-Through Events
Use the below API to inform CONVA when a user clicks on the search results. Note that the user could click on different sections of a search result - wishlist, product, add-to-cart. Providing the exact section that was clicked allows CONVA's Analytics to provide more fine-grained comparisons between voice and text searches.
SlangRetailAssistant.notifyCTREvent(new HashMap<String, String>() {
    {
        // The section that was clicked. Supported strings:
        // "NavigatedToProductPage"
        // "AddedToCart"
        // "AddedToWishlist"
        // "ShadeSelected"
        put("eventName", "<SectionName>");

        // The product item that was clicked
        // Eg: "Organic Onions"
        put("itemName", "<ItemName>");
    }
});
SlangRetailAssistant.notifyCTREvent(
    eventMetaData: [
        "eventName": "<SectionName>",
        "itemName": "<ItemName>"
    ]
)
// SectionName can be:
// "NavigatedToProductPage"
// "AddedToCart"
// "AddedToWishlist"
// "ShadeSelected"
//
// ItemName specifies the item that was clicked
// Eg: "Organic Onion"
var eventInfo = {
    // The section that was clicked. Supported strings:
    // "NavigatedToProductPage"
    // "AddedToCart"
    // "AddedToWishlist"
    // "ShadeSelected"
    eventName: "<SectionName>",

    // The product item that was clicked
    // Eg: "Organic Onions"
    itemName: "<ItemName>"
};
SlangRetailAssistant.notifyCTREvent(eventInfo);
Please note that eventMetaData refers to a Map of key-value pairs, where both the key and value are of type String.
The eventMetaData keys can be:
eventName
// The section that was clicked. Supported values:
// "NavigatedToProductPage"
// "AddedToCart"
// "AddedToWishlist"
// "ShadeSelected"
itemName
// The item that was clicked
// Eg: "Organic Onion"
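Since the supported section names form a fixed set, a small guard before calling notifyCTREvent can catch typos early. The helper below is illustrative only (it is not part of the CONVA SDK):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class CtrEventBuilder {
    // The section names the CTR event currently supports, per the docs above.
    static final Set<String> SUPPORTED_SECTIONS = Set.of(
            "NavigatedToProductPage", "AddedToCart",
            "AddedToWishlist", "ShadeSelected");

    // Illustrative helper (not a CONVA API): builds the eventMetaData map
    // and rejects unsupported section names before the event is sent.
    static Map<String, String> ctrEvent(String sectionName, String itemName) {
        if (!SUPPORTED_SECTIONS.contains(sectionName)) {
            throw new IllegalArgumentException("Unsupported section: " + sectionName);
        }
        Map<String, String> metadata = new HashMap<>();
        metadata.put("eventName", sectionName);
        metadata.put("itemName", itemName);
        return metadata;
    }
}
```

The resulting map matches the key-value shape that notifyCTREvent expects, so it can be passed to the API directly.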
// Under Development
2.5.3 Associating App specific UserIds to the CONVA Analytics Events
By default, CONVA-generated analytics events do not capture any PII data about the user. We use a CONVA-generated DeviceId to uniquely identify the device on which the app instance is running. But if an app wants CONVA to associate these events with any user ids that it already knows about the current user (say a logged-in user), it can use the below API to trigger that association.
SlangRetailAssistant.setUserId("<UserId>");
SlangRetailAssistant.setUserId("<UserId>")
SlangRetailAssistant.setUserId("<UserId>");
SlangRetailAssistant.setUserId("<UserId>")
// Under Development
2.6 Assistant Lifecycle Events
The Slang Assistant handles most of the heavy lifting of interacting with the end-users and notifies the app when there is some action to be performed by the app. But in some cases, apps may want to be notified of low-level events that the Assistant is aware of. For example, whenever a user clicks on the trigger (the microphone button) or when the user cancels the Slang surface.
Access to low-level Assistant events is available through the Assistant's Lifecycle Events API.
Registering for events
The app can register with the Assistant to be notified of all interesting lifecycle events via the setLifecycleObserver method.
import SlangRetailAssistant

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        SlangRetailAssistant.setLifecycleObserver(with: self)
    }
}

extension ViewController: LifecycleObserver {
    func onAssistantInitSuccess() { }

    func onAssistantInitFailure(with description: String) {
        // description represents the reason for the failure
    }

    func onAssistantInvoked() { }

    func onAssistantClosed(with cancelled: Bool) {
        // cancelled is a boolean that informs whether the assistant
        // was closed because of a user cancellation
    }

    func onAssistantLocaleChanged(for locale: Locale) {
        // locale is a Locale object that represents the current language
    }

    func onUtteranceDetected(for utterance: String?) {
        // utterance is a string that represents the current utterance
        // spoken by the user
    }

    func onUnrecognisedUtterance(for utterance: String?) {
        // utterance is a string that represents the current utterance
        // spoken by the user that was not recognized
    }

    func onOnboardingSuccess() { }

    func onOnboardingFailure() { }
}
A few lifecycle observer methods are still under development for the web.
As part of the Lifecycle Events API, an observer will be notified of the following events:
onAssistantInitSuccess
Called as soon as the Assistant has initialized successfully and is ready to serve the app
onAssistantInitFailure
Called when the Assistant failed to initialize successfully. The reason is passed as a parameter to this callback
onAssistantInvoked
Called whenever the Assistant has been launched (this can be either as a result of the user clicking on the trigger or the app invoking the Assistant via the startConversation API)
onAssistantClosed
Called whenever the Assistant has been dismissed. A boolean parameter isCancelled is passed to indicate whether this happened because the user cancelled the session or if the Assistant was done with its job
onAssistantLocaleChanged
Called whenever the user changes the locale of the Assistant. A locale dictionary parameter is passed to indicate the current locale.
locale is a dictionary that contains the following fields:
country : the country, as 2 characters. Example: "IN", "US"
language : the language, as 2 characters. Example: "en", "hi", "ta"
onUnrecognisedUtterance
Called whenever the Assistant is not able to understand what the user spoke. The utterance that the user spoke is passed as a parameter.
onUtteranceDetected
Called whenever the Assistant has detected an utterance that was spoken by the user. The utterance that the user spoke is passed as a parameter.
onOnboardingSuccess
Called whenever the Assistant has completed the entire onboarding process for the given user.
onOnboardingFailure
Called whenever the Assistant onboarding process has been cancelled by the user.
onMicPermissionGranted
Called whenever the microphone permission that is required by the assistant has been granted by the user.
onMicPermissionDenied
Called whenever the microphone permission that is required by the assistant has been denied by the user.
onCoachmarkAction
Called whenever the user interacts with the coachmark UI provided by the assistant. It additionally provides a CoachmarkInfo object that describes the action.
CoachmarkInfo is a dictionary that contains the following fields :
Based on how the Assistant was configured in the Console and the conditions that were set, the Assistant will speak an appropriate message to the user.
The prompts spoken by the Assistant are customizable. Refer to the Customizing the Assistant section, if you're interested in customization.
That's it! These are the basic set of steps required to add Slang's In-App Voice Assistant to your app.
Beyond this integration, Slang Voice Assistants provide a lot more power and flexibility to cater to the more advanced needs of the apps. Please refer to the Advanced Concepts section for more details.
2.8 Nudging Users
We also offer a nudge API to trigger the coachmark on the CONVA trigger on demand.
NOTE: This will only work with the Global and Inline Trigger that is managed by the SlangAssistant SDK. It will not work in the case of custom app Triggers/Mic Buttons.
2.8.1 Nudging with messages controlled remotely
The CONVA console allows specifying the message to be shown in the nudges. So to dynamically show the coachmark with the messages configured in the console, use the below APIs
SlangRetailAssistant.getUI().nudgeUser();
// TBD
SlangRetailAssistant.ui.nudgeUser();
SlangRetailAssistant.getUI().nudgeUser();
// Under Development
2.8.2 Nudging with custom runtime messages
Sometimes it is useful to show more contextual messages to educate users about the app's voice capability. For example, after the user does a text search, the app can show a coachmark pointing to the mic trigger, suggesting that the user try voice next time.
To perform such contextual nudges, use the below API
The app can pass language-specific prompts, and CONVA will pick the right language based on the currently selected locale.
There are two strings that can be specified.
The title string - the one that shows up in bold in the first line
The description string - the one that shows up in regular style in the second line
var title = {
    "en-IN": "Title",
    "hi-IN": "शीर्षक"
};
var description = {
    "en-IN": "Description",
    "hi-IN": "विवरण"
};
SlangRetailAssistant.ui.nudgeUserWithParameters(title, description);
Map<String, String> title = {
  'en-IN': 'Title',
  'hi-IN': 'शीर्षक',
};
Map<String, String> description = {
  'en-IN': 'Description',
  'hi-IN': 'विवरण',
};
SlangRetailAssistant.getUI().customNudgeUser(title, description);
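The locale-based pick that CONVA performs can be sketched as a simple map lookup with an English fallback. This is illustrative logic only, not the SDK's actual implementation, and the en-IN fallback is an assumption:

```java
import java.util.Map;

public class NudgePrompts {
    // Illustrative: pick the prompt matching the current locale tag,
    // falling back to en-IN when the locale has no entry.
    static String pickPrompt(Map<String, String> prompts, String localeTag) {
        return prompts.getOrDefault(localeTag, prompts.get("en-IN"));
    }

    public static void main(String[] args) {
        Map<String, String> title = Map.of(
                "en-IN", "Try voice search",
                "hi-IN", "वॉइस सर्च आज़माएं");
        // Prints the Hindi title for the hi-IN locale
        System.out.println(pickPrompt(title, "hi-IN"));
        // Prints the en-IN title, since ta-IN has no entry
        System.out.println(pickPrompt(title, "ta-IN"));
    }
}
```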