The KPI Dashboard
The KPI Dashboard provides a quick overview of how a project is doing compared to earlier weeks, without requiring much interpretation. The Dashboard can be exported and shared internally with little additional explanation.
One very common question about the Dashboards, and especially about the metrics they present, is 'What is a good value for this metric?'. The KPI Dashboard answers this with a selection of metrics from various other Dashboards, giving an overall view of the types of interaction our platform provides: Events, Q&A, Dialogs, FAQ, Feedback and Link Clicks.
In addition to letting you compare your own project from week to week, this Dashboard also presents a benchmark for the generic Key Performance Indicators (KPI) that show a project is doing well. To that end, the KPI presented here have been normalized to percentages, so they can be compared between projects and weeks that differ wildly in the number of interactions or sessions.
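For example (using the Questions Answered KPI described below): a week with 5,000 Q&A questions of which 4,450 received an Answer and a week with 500 questions of which 445 received an Answer both score 89% Questions Answered, so the two weeks can be compared directly even though their absolute volumes differ by a factor of ten.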
The KPI Dashboard’s Metrics
The KPI Dashboard provides a set of general project data over time, so you can see trends. The six general KPI are presented below. Together, these six metrics cover all the types of interaction end-users can have with Conversational AI Cloud.
General Project Statistics
These values have not been normalized, so they are absolute values week on week, except for Interactions per Session. They also show a slightly longer historical overview: five weeks instead of just this week and the previous week.
The values shown here are the following:
- Interactions: The number of Interactions, excluding data retrievals
- Sessions: The number of Sessions
- Interactions per Session: The number of Interactions divided by the number of Sessions (a worked example follows this list)
- Interaction Types: The proportion of Interactions that were Events, Dialogs, Q&A, FAQ Clicks and FAQ Searches
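As a minimal illustration of how these general statistics relate to each other, the sketch below computes Interactions per Session and the Interaction Type proportions from hypothetical weekly counts. All numbers and field names here are invented for the example; they are not actual export fields.

```python
# Hypothetical weekly counts per interaction type (illustrative values only).
interaction_counts = {
    "Events": 5200,
    "Dialogs": 1400,
    "Q&A": 3100,
    "FAQ Clicks": 1600,
    "FAQ Searches": 700,
}

sessions = 4000
interactions = sum(interaction_counts.values())      # 12,000 Interactions in total

# Interactions per Session: Interactions divided by Sessions.
interactions_per_session = interactions / sessions   # 12,000 / 4,000 = 3.0

# Interaction Types: each type as a proportion of all Interactions.
interaction_types = {
    kind: count / interactions for kind, count in interaction_counts.items()
}

print(f"Interactions: {interactions}")
print(f"Sessions: {sessions}")
print(f"Interactions per Session: {interactions_per_session:.1f}")
for kind, share in interaction_types.items():
    print(f"{kind}: {share:.0%}")
```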
The values should be fairly consistent over time, or show explainable differences (such as holiday season influences). Unexpected spikes or drops are an indication that something has changed or is off.
Note: The numbers presented for 'This Week' will almost always be lower than those of previous weeks, because you are always looking at an incomplete week.
Note: The Dashboard is filtered to show Active Sessions by default. For the KPI, that filter does not make a difference, but it does affect what you see here.
Key Performance Indicators (KPI)
The individual KPI are described in the sections below, along with a generic interpretation of what to do when they are not aligned with the benchmark.
KPI Dashboard Benchmarks
Because each metric is broadly distributed across the various projects, we have chosen each benchmark between the Average and the Weighted Average, but always at or above the Median value for that metric. This results in the following values:
| KPI | Percentage |
| --- | --- |
| Interaction % | 20% |
| % Questions Answered | 89% |
| % Dialog Completion | 59% |
| % Session Link Click | 35% |
| % Positive Feedback | 32% |
| % FAQ Clicks | 16% |
| % AutoDialog Completions | 66% |
| % T-Dialog Completions | 91% |
Prerequisites to the KPI Benchmark Metrics
Some of these values can be implementation-specific. The benchmark values assume a fairly standard implementation, described below.
The Conversational AI Cloud chatbot is on a website and available on all, or at least several, pages. It uses at least an onLoad Event or some other interaction when a user arrives on the page, and all subsequent interactions by the end-user on the website fall within the same session, until the session expires due to length or inactivity.
End-users can ask open questions (Q&A), some of which may result in Dialogs. You offer some form of FAQs, but the Conversational AI Cloud experience is not limited to FAQs alone, and any FAQs you offer are made available primarily through Conversational AI Cloud rather than through other channels.
Finally, the user is given the opportunity to provide feedback, and links are offered in various Answers to help the user navigate.
Interaction % Widget
This is a measure of how many people actively engage with Conversational AI Cloud by asking a question, searching for or clicking on FAQs, engaging in Dialogs, clicking links or leaving feedback. A low Interaction % means Conversational AI Cloud is not drawing people's attention on your website; if people aren't interacting, we usually can't be helping them either. If the Interaction % is too low, check with your web developers whether Conversational AI Cloud is visible on all pages where it is loaded, and consider changing the design to increase engagement.
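As an illustration (assuming, for the sake of the example, that this percentage is calculated per session, like the other session-based KPI): if 2,000 of 10,000 weekly sessions contain at least one of the interactions listed above, the Interaction % would be 20%, exactly at the benchmark.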
Positive Feedback Widget
This widget shows the overall % of Feedback that was positive. If you want to increase this percentage, review the various Answers that received negative feedback and improve their effectiveness. You can also look at which questions led the person asking them to give negative feedback, and consider adding or amending Q&A that Answer those questions.
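For example, if a project receives 1,000 pieces of Feedback in a week and 320 of them are positive, the % Positive Feedback is 32%, matching the benchmark.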
FAQ Clicks Widget
This metric measures the 'popularity' of FAQs. On the KPI Dashboard, it is reported on a Session basis.
FAQs that are often shown but never clicked may not offer useful information, or may not actually be that Frequently Asked. If you offer FAQs through FAQ Searches, this is also an indicator of how well your FAQs are found by the user's search terms.
Please do note that if FAQs are shown through means other than interactions with Conversational AI Cloud (for example, Google Search or direct links), FAQ Clicks will still be logged without a corresponding FAQ Shown, which may skew this KPI.
By right-clicking on the column chart you can open the FAQs Dashboard.
Link Clicks Widget
This widget measures how many people clicked a link that was offered. On the KPI Dashboard, this is reported on a Session basis. If this number is low, check which often-activated Answers offer links that are rarely clicked. They may need to be rewritten to emphasize the link, or perhaps excess links should be removed.
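For example, if links were clicked in 350 of 1,000 sessions, the % Session Link Click is 35%, which is right at the benchmark.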
By right-clicking on the column chart you can open the Answers Dashboard.
Questions Answered Widget
Questions Answered measures the % of Q&A questions that get any Answer at all.
If the % Questions Answered is low, you should improve recognition, specifically by focusing on terms that were not recognized at all. Add those terms to your knowledge base by putting them in Entities or by adding extra Questions to Q&A.
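For example, if 1,000 open questions are asked in a week and 890 of them receive an Answer, the % Questions Answered is 89%, which matches the benchmark.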
Dialog Completion Widget
This widget measures how many of the people who started a Dialog also finished that Dialog. This can be calculated either on a Session or an Interaction basis. The Session Dialog Completion % is a better indicator of the user experience, but the Interaction Dialog Completion % is more useful when trying to improve a Dialog, so they are reported in different ways on different Dashboards. The KPI presented here is the Session Dialog Completion %.
If this number is low, check the Dialogs Dashboard to improve the Dialog flow.
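The sketch below is a minimal illustration of how the two calculation bases can differ. The data and the exact formulas are assumptions made for the example; the Dashboard performs these calculations for you.

```python
# Hypothetical session data: for each session, how many Dialogs were started
# and how many of those were completed. These are invented numbers.
sessions = [
    {"dialogs_started": 2, "dialogs_completed": 1},
    {"dialogs_started": 1, "dialogs_completed": 1},
    {"dialogs_started": 3, "dialogs_completed": 1},
    {"dialogs_started": 1, "dialogs_completed": 0},
]

# Interaction basis: completed Dialogs as a share of all started Dialogs.
started = sum(s["dialogs_started"] for s in sessions)      # 7
completed = sum(s["dialogs_completed"] for s in sessions)  # 3
interaction_completion = completed / started               # 3 / 7, roughly 43%

# Session basis: share of Dialog-starting sessions that completed at least one Dialog.
starting_sessions = [s for s in sessions if s["dialogs_started"] > 0]
completing_sessions = [s for s in starting_sessions if s["dialogs_completed"] > 0]
session_completion = len(completing_sessions) / len(starting_sessions)  # 3 of 4 = 75%

print(f"Interaction Dialog Completion %: {interaction_completion:.0%}")
print(f"Session Dialog Completion %: {session_completion:.0%}")
```

In this invented example, a session that starts several Dialogs but finishes only one still counts as a completing session, which is why the Session-based percentage is higher and reflects the end-user's experience more closely.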
AutoDialog Completion Widget
This widget measures how many people started and finished an AutoDialog on a session basis.
T-Dialog Completion Widget
This widget measures how many people started and finished a T-Dialog on a session basis.