Integration - Agent Assist
The Agent Assist integration is designed to work directly and easily within Agent Guidance, providing AI enrichment to audio interactions in real-time. It comprises a few components working together:
- Content (either audio or text) is streamed directly to the Creovai processing servers via a local service or program on the agent's machine communicating with the Conversation Intelligence Service,
- The received content is then transcribed into text (if audio), sensitive details are redacted, and Category matches are parsed by the Conversation Intelligence platform before this processed text and data is returned to the Agent Guidance Workflow via the agent's browser,
- The Workflow then sends this transcribed text on to an AI provider for Intent analysis, summarisation, and to allow the agent to ask free-form questions based on the content,
- The results of this categorisation and AI analysis are then displayed in an on-screen widget, and can also be used to trigger specific Fields (in the same manner as Update Links) based on matched Categories, Intents, or rebuttals to drive in-Workflow processes.
Once integrated, building Workflows enriched by Agent Assist is as simple as adding an Interaction Capture - Command Control to initiate the audio streaming (if required), and adding an Agent Assist - Intent Actions or Agent Assist - Category Actions Control to any Page where you want to define Intents or Categories to be watched for, along with any actions to occur when they are matched. The audio stream will automatically end when the record is completed, so no further management is needed beyond initiating the stream; starting the audio stream has been left as a triggered action simply so that records for non-enriched interactions (such as an admin process) are not streamed for processing.
If you wish to make use of the Agent Assist integration, please review the How To steps below and then contact your Agent Guidance vendor.

Broadly, making use of Creovai Agent Assist within Agent Guidance has 6 steps:
- Requesting the required licence parts and API credentials,
- Deploying an Interaction Capture solution to capture and stream audio (if required),
- Acquiring access to a generative AI service (if required),
- Configuring a Connector with the generative AI details,
- Building a test Workflow,
- Confirming that the Agent Assist setup is operational.
Prior to using the Agent Assist integration, you will need to request that the required licence parts be added to your System licence and API credentials be generated. Please contact your Agent Guidance vendor, and provide the following information:
- Agent Guidance Licence Key: this detail can be found in the Licensing panel of your Agent Guidance About Section, and is needed in order to add the required licence parts to the System licence.
- Agent Guidance System Name: this detail can be found in the System Details panel of your Agent Guidance About Section, and is needed in order to add the required licence parts to the System licence.
- Required licence part assignment type (Per System or Named): the Agent Assist and Integration - Interaction Capture Service or Integration - Interaction Capture Client licence parts can be provided as either Per System or Named type licences, changing how Users need to be managed. If most/all agents are expected to use Agent Assist, it is probably best to have these licence parts as Per System to reduce management overhead.
- Remote/Virtual: whether the agents are using remote computers, VDIs, or other similar solutions where they are not physically present at the computer that they are working on.
- Email address: this will be used to distribute the initial login details for the Cloud API as well as receive any password resets, and needs to be valid and capable of receiving emails. It is recommended to provide a "service" mail account, rather than the email address of an individual, so as to avoid issues with a key person being unavailable.
- Company name: this will be the name of the end client (and partner, if applicable) so that the generated API user can be properly attributed.
- Audio system name (optional): used when you will be streaming audio from the agent machines rather than using exclusively text/pre-transcribed content. This doesn't have to match the System Name found in the System Details panel of your Agent Guidance About Section, but if it doesn't, it is recommended that it be a concise, clear, and recognisable name. For example, "UK Production".
- System region: this allows us to choose the appropriate API region for your usage, for performance and compliance reasons. For example, "US Eastern".
Once your request has been received and approved, you will be provided a link to download a Creovai Interaction Capture solution (if required), and your Agent Guidance licence will be updated to include the relevant licence parts (requiring the locally cached Agent Guidance licence to be updated). If any licence parts are provided as the Named type, then ensure that they have been assigned to the appropriate Users.
You will also receive an automatic email at the specified email address, providing a username, temporary password, and initial setup URL for the Creovai Cloud API:

Follow the URL in the email, providing the username and temporary password, and specify a new password. Be sure to record these details in a secure and resilient password store!
The content of the interaction between the agent and the customer must be captured and provided to the Conversation Intelligence platform so that it can be categorised and redacted, and then made available to the generative AI for intent analysis. This may be handled by a separate integration that you use, or it may be irrelevant if you deal exclusively with text interactions such as chat or email, but for most clients there will be a need for audio to be captured and streamed to the Conversation Intelligence platform.
Creovai provide a pair of simple yet powerful solutions for this capture and streaming, deployable to all agents via standard enterprise tools: either a Windows service that communicates with the agent's browser on a specified port, or a local app that communicates directly with the Agent Guidance site.
If you require audio capture, then please follow the appropriate configuration article depending on the integration type that is being targeted:
In the rest of this document, these solutions may be collectively referred to as Interaction Capture where no distinction is required.
While transcription and categorisation are provided automatically by the Conversation Intelligence platform, if it is desired to use a generative AI to provide further functionality to the agent (such as Intent matching, quick actions, or agent free-input queries), then this will need to be acquired and implemented separately.
These services can be procured from various vendors with varying capabilities and quality, but currently Agent Guidance only has an integration with Microsoft's Azure OpenAI Service. As such, you will need an account and access to an instance of this service, and to create an integration URL and API key to be used in the Agent Guidance Connector.
If you wish to make use of a different generative AI service, or for Creovai to supply this service, please contact your Agent Guidance vendor.
For guidance on creating and configuring an AI deployment, please see the appropriate Microsoft documentation. Agent Assist supports any of the currently-available AI models provided by Microsoft Azure, generally tracking an API version for the model that matches the "Latest GA release" for the "Data plane - inference" in this table. When picking an AI model and service level, please ensure that the request and token limits will be appropriate for your expected usage, with a margin of overhead.
Once the AI deployment has been generated, an endpoint URL and key will be needed for use by the Connector. The endpoint URL found within the deployment's configuration will typically be of the format https://{your-resource-name}.openai.azure.com, but the deployment's ID and API version need to be appended for use by the Connector - please see the example below; the relevant Microsoft documentation can be found here.
The Resource Name will be relative to your tenant and created AI resource, the Deployment ID will be the displayed name of the deployment, and the API Version controls the version used by Agent Guidance to communicate with the deployment.
https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}/completions?api-version={api-version}
For example:
https://awaken-gpt-region1.openai.azure.com/openai/deployments/production-gpt35-turbo/completions?api-version=2023-08-01-preview
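Before entering these details into the Connector, it can be useful to sanity-check that the assembled endpoint URL and API key work. The snippet below is a minimal sketch (not part of Agent Guidance) that sends a tiny test request to the completions endpoint in the format shown above; the resource name, deployment ID, API version, and key are placeholders to be replaced with your own values, and it assumes a runtime with the fetch API available (for example Node 18+).
// Minimal sketch: sanity-check an Azure OpenAI completions endpoint and key.
// All values below are placeholders - substitute your own before running.
const resourceName = "your-resource-name";
const deploymentId = "your-deployment-id";
const apiVersion = "2023-08-01-preview";
const apiKey = "your-api-key";

// Assemble the endpoint URL in the same format expected by the Connector.
const endpoint =
  `https://${resourceName}.openai.azure.com/openai/deployments/` +
  `${deploymentId}/completions?api-version=${apiVersion}`;

async function checkEndpoint(): Promise<void> {
  const response = await fetch(endpoint, {
    method: "POST",
    // Azure OpenAI expects the key in the "api-key" header.
    headers: { "Content-Type": "application/json", "api-key": apiKey },
    body: JSON.stringify({ prompt: "Say hello.", max_tokens: 5 }),
  });

  if (!response.ok) {
    // 401 usually indicates a bad key; 404 a wrong resource name, deployment
    // ID, or API version; 429 that the deployment's rate limit has been hit.
    throw new Error(`Endpoint check failed: ${response.status} ${await response.text()}`);
  }

  const data = await response.json();
  console.log("Endpoint OK, sample completion:", data.choices?.[0]?.text);
}

checkEndpoint().catch((err) => console.error(err));
If this test request succeeds, the same URL and key can be entered into the Connector described in the next step.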
Once the generative AI details have been obtained, a Connector must be configured with the details to power Agent Assist. At this time Agent Guidance only has an integration with Microsoft's Azure OpenAI Service, so you will need to configure an AI - GPT - Azure Connector.
It may also be desirable to set this Connector as the default within the Agent Assist Settings to avoid needing to specify it within every Workflow individually.
In order to confirm that Agent Assist is operational, it is easiest to create a small test Workflow to verify that the various pieces of functionality are working normally.
Create a new basic Workflow, and ensure that the Start Page contains an Interaction Capture - Command Field. Add an Agent Assist - Category Actions Field, and configure it with one or more Categories.
If using a generative AI, then also add an Agent Assist - Intent Actions Field, and configure it with one or more Intents. If you haven't specified a default Connector in the Agent Assist Settings, then you will also have to include an Agent Assist - Command Field that runs the setConnector Helper Function.
Publish this test Workflow, create a test Campaign, and then proceed to confirming that the Agent Assist setup is operational.
Once all of the above items have been configured, then operation of the Agent Assist system can be verified as follows:
- Confirm that the Conversation Intelligence Service shows as running and operating normally,
- Launch a Workflow that has been designed for Agent Assist enrichment while logged in as a User with the Designer licence part assigned,
- Confirm that the Creovai Audio Settings button is visible at the top-right toolbar next to the Activities list, and that your audio devices are either already selected or are available,
- Confirm that the Agent Assist widget is visible at the bottom-right of the Desktop,
- Confirm that, when talking and receiving audio, the expected text shows up in the Conversation tab of the Agent Assist widget and the related Categories and/or Intents are checked off.
Once operation has been confirmed, then it is recommended that the Agent Assist widget and Workflow Design for Creovai Agent Assist articles are reviewed for more general guidance.
Agent Assist only currently supports English audio, but a range of accents have been tested successfully. Other languages are expected to be supported in a future release.
The Agent Assist transcription and Intent history isn't preserved between successive record runs if a record is resumed or rescheduled. The Agent Assist history and actions will only relate to the current record run.
Intents are checked on an utterance-by-utterance basis, so if the content relevant to an utterance is delivered over the course of successive events then it may cause an Intent to not be triggered. Having the agent briefly summarise the relevant details to confirm with the caller may be a simple yet effective way to address this should it occur.
When a User with the Agent Assist licence part closes a Workflow, they will have their clipboard automatically cleared to help avoid inadvertently leaking data between records. Due to browser restrictions, this will only work if the User accepts a browser prompt to allow clipboard access, the User has the Agent Guidance window focused when the record closes, and if the site is being accessed via an HTTPS connection.
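For context on why these conditions apply, the sketch below illustrates the standard browser Clipboard API that this behaviour relies on; it is a simplified illustration rather than the exact Agent Guidance implementation. navigator.clipboard is only available in secure (HTTPS) contexts, and a write will be rejected if the document is not focused or if the clipboard permission has been denied.
// Simplified illustration of clearing the clipboard via the standard browser
// Clipboard API; not the exact Agent Guidance implementation.
async function clearClipboardOnRecordClose(): Promise<void> {
  // navigator.clipboard is only defined in secure contexts (HTTPS or localhost).
  if (!window.isSecureContext || !navigator.clipboard) {
    console.warn("Clipboard API unavailable: the site must be served over HTTPS.");
    return;
  }
  try {
    // Overwrite the clipboard with an empty string. The browser rejects this
    // if the window is not focused or the clipboard permission was denied.
    await navigator.clipboard.writeText("");
  } catch (err) {
    console.warn("Could not clear the clipboard (focus or permission issue):", err);
  }
}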
If an "Error streaming audio" ribbon appears at the bottom of the screen, hovering the mouse over it will provide additional detail as to the issue with the Interaction Capture.
If the agent sees a "You are not licensed to use this feature" message within the Agent Assist widget, then this indicates that they haven't been successfully assigned the Agent Assist licence part.
If the agent doesn't see the Creovai Audio Settings button on the top-right toolbar next to the Activities list, or if they get a message about not being connected to the audio service, then this indicates that they haven't been successfully assigned the Integration - Interaction Capture Service licence part.
If the agent only sees output (e.g., headset) but no input (e.g., microphone) devices in the Creovai Audio Settings, then this may indicate the agent is accessing a remote PC (e.g., using Remote Desktop to access an office PC or virtual machine), or there are permissions issues within Windows that prevent the other audio devices from being detected properly.
If a message is seen to the left of the Agent Assist widget's icon saying "Trying to reconnect the audio socket", then this typically indicates multiple Agent Guidance windows are open on the Desktop or in a Workflow or preview simultaneously. This is a result of a restriction to only one audio connection being allowed between the agent's local audio service and the browser at a time, and can be remedied by simply ensuring the agent doesn't have multiple instances of the Agent Assist widget loaded simultaneously.
If Categories or Intents aren't being completed when expected, then it is most likely to be the case that the transcribed audio doesn't contain the expected phrases. If running the record as a User with the Designer licence part, then it is possible to see the transcribed text within the Conversation tab of the Agent Assist widget. This should help indicate whether the issue is with the audio not being transcribed as expected, or if there are other issues such as pauses in the conversation causing the required Intent to be split across multiple utterances.
If the transcription quality appears to be low, then a common cause for this is either poor audio quality, or too-short utterances. If the audio stream has background noise or the audio is overly compressed, then the speech may be difficult to distinguish and therefore not transcribed accurately. Similarly, if the utterances are very short, then there is less ability for the transcription platform to analyse the sentence and self-correct any words with more-likely alternatives. This is most noticeable when utterances are only a few words long, and some transcribed words may be substituted with similar-sounding words.
If utterances take a long time to appear in the Conversations tab of the Agent Assist widget, or Categories or Intents take a long time to be checked off, this may be a symptom of background noise in the recording causing the utterances to not naturally break at pauses in the conversation where they would normally be submitted. It is advised to check the background noise level on the agent's microphone and the incoming audio stream to see if this may be contributing to delayed transcription and Categories or Intents.
Unusual words (most commonly proper nouns) may be misidentified by the transcription platform, as it makes use of a generic dictionary. This should be taken into account when designing any Categories or Intents. Custom dictionaries are intended to be supported in a future release.
If a Category or Intent is checked off but the linked action isn't performed, then this may be because the agent has navigated away from a Page that contains an instance of the Field to be triggered. Reloading the Page or navigating to a Page where the triggered Field exists as a Used Field are both valid, but care should be taken to ensure that Categories or Intents aren't triggered when the agent isn't on one of those Pages.
If transcriptions are being generated as expected and Categories matched, but Intents aren't being matched and the AI tab isn't responding to queries, then this may indicate that you have used more tokens than your current AI service's deployment allows. Operation should resume when the rate limit next refreshes (typically once per minute), but this likely indicates that you need to increase the cap on your AI service's deployment.
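For background, when an Azure OpenAI deployment's quota is exceeded the service responds with HTTP 429, typically including a Retry-After header indicating when requests can resume. The sketch below is a hypothetical illustration of detecting that condition against the endpoint format shown earlier; the Agent Assist integration handles this internally, so it is shown only to aid diagnosis.
// Hypothetical illustration: detecting an Azure OpenAI rate limit (HTTP 429).
// The endpoint parameter uses the same URL format as the earlier example.
async function completeWithRateLimitCheck(
  endpoint: string,
  apiKey: string,
  prompt: string,
): Promise<string | null> {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json", "api-key": apiKey },
    body: JSON.stringify({ prompt, max_tokens: 64 }),
  });

  if (response.status === 429) {
    // The deployment's request or token quota is exhausted; the Retry-After
    // header (usually a number of seconds) says when the window refreshes.
    const retryAfter = response.headers.get("retry-after") ?? "60";
    console.warn(`Rate limited; retry after roughly ${retryAfter} seconds.`);
    return null;
  }

  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  const data = await response.json();
  return data.choices?.[0]?.text ?? null;
}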
If any of the Agent Assist functionality doesn't appear to have loaded or be operational, then some components may have been blocked by false positives from advert blocking software within the browser. It is recommended to disable any adblock extensions, and check the F12 Developer Tools to see if there are any errors displayed in the Console tab.