The Reviews dashboard in Zendesk QA includes all the data from your manually submitted reviews and allows you to view a breakdown of your quality results.
This article describes how to access, view, and use the information on your Reviews dashboard.
By accessing conversation data directly from the Reviews dashboard, you can gain insights into your support team's performance. Its various quality indicators allow you to monitor quality scores across categories, track metrics like average review times, and analyze the quality of your customer interactions.
This article contains the following topics:
- Accessing the Reviews dashboard
- Understanding overall quality indicator cards
- Understanding additional detailed quality indicators
- Downloading Reviews dashboard data
Alternatively, watch the video below for an overview of Zendesk QA Reviews dashboards:
Accessing the Reviews dashboard
The Reviews dashboard displays your quality indicators in cards. You can also access additional detailed quality indicators.
To access the Reviews dashboard
- In Zendesk QA, click Dashboard in the sidebar, then select Reviews.
- Use the filters to narrow down the data based on specific criteria. See Filtering the Reviews dashboard.
- Click the Update icon to refresh the dashboard data based on your filters.
Understanding overall quality indicator cards
The Reviews dashboard is separated into several cards that represent your main key performance indicator (KPI) metrics.
The following quality indicators are displayed on the dashboard cards:
- Internal quality score (IQS): The average of all review scores received or given over a period of time. See Filtering the Reviews dashboard to select a time period.
- Reviews: The number of manually submitted reviews.
- Pass rate: The outcome of your conversation reviews, determined by a predefined baseline, for example, pass or fail. See Understanding the pass rate.
- Reviewed conversations: The number of conversations that received one or more manual reviews.
- Average review time: The average time spent reviewing a conversation before the review was submitted.
- Reviewed agents: Number of users who received a manual review.
- Disputed reviews: Percentage of manual reviews that have a dispute.
- Reviews with comments: Percentage of manual reviews that have at least one comment.
- Unseen reviews: Percentage of manual reviews that have not yet been seen by reviewees. Bots are not included in the unseen reviews metric.
You may want to use this information to determine your current quality score and how it's changing over time. It can also help you understand if your review process is efficient and reviews are being seen, how active your reviewers are, and more.
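If it helps to see the arithmetic behind these cards, the following minimal Python sketch computes a few of these KPIs from hypothetical review records. The field names and formulas are illustrative assumptions, not the exact calculations Zendesk QA performs.

```python
# Illustrative only: hypothetical review records with assumed field names.
reviews = [
    {"score": 85, "passed": True,  "comments": 1, "seen": True,  "disputed": False},
    {"score": 60, "passed": False, "comments": 0, "seen": False, "disputed": True},
    {"score": 95, "passed": True,  "comments": 2, "seen": True,  "disputed": False},
]

total = len(reviews)
iqs = sum(r["score"] for r in reviews) / total                 # average of all review scores
pass_rate = sum(r["passed"] for r in reviews) / total * 100    # share of reviews meeting the baseline
with_comments = sum(r["comments"] > 0 for r in reviews) / total * 100
unseen = sum(not r["seen"] for r in reviews) / total * 100
disputed = sum(r["disputed"] for r in reviews) / total * 100

print(f"IQS: {iqs:.1f}%  Pass rate: {pass_rate:.0f}%  With comments: {with_comments:.0f}%  "
      f"Unseen: {unseen:.0f}%  Disputed: {disputed:.0f}%")
```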
Expanding quality indicator cards
Each of your quality indicator cards can be expanded to display more detailed information.
To expand quality indicator cards
- From your Reviews dashboard, select a card.
- Click the options menu, then select View > Expanded.
A detailed view of the card opens.
Understanding additional detailed quality indicators
Your Reviews dashboard includes the following additional quality indicators displayed in graphs and charts:
- Quality scores over time
- Category scores over time
- Average category scores
- Category group performance and category insights
- Root causes and workspaces
- Reviewer and reviewee performance
- Scores by category
- Scores by reviews
- Ratings and root causes by reviews
Quality scores over time
Use the Quality scores over time card to understand the evolution of your quality scores on a weekly basis. You can also view the number of reviews created to understand trend reliability.
Click an aggregated score to drill down into the individual score.
You may want to use this information to understand how strict reviewers are with their ratings, how agents are performing, if there are any outliers that require calibration sessions, and if any agents might require specific training.
Category scores over time
The Category scores over time card displays the weekly evolution of quality ratings, broken down by category.
This information can help you understand how your categories are developing over time, how many ratings are being given per category, and on which specific dates a category dipped or improved. Additionally, you can filter for a specific agent to view their individual data.
Average category scores
The category score cards provide an overview of which categories have scores above 80% and which fall below that mark.
Use these cards to understand how your categories are performing. This information shows the distribution of scores across categories and which categories consistently fall short with 0% scores.
Category group performance and category insights
If your categories are grouped, you can quickly compare the score per category group with this card. It can help you understand your highest and lowest performing category group and compare category group scores to the previous period.
The Category insights card displays the combined average of all category scores that are given or received. These insights can help you analyze which category receives the highest and lowest quality scores and determine if training sessions are needed for specific categories.
You can drill into all the numbers in the scorecard for a more granular view to understand, for example, which agents are struggling the most in specific categories, or which tickets contributed the most to those results.
You can also check how frequently a category has been disputed. This can help ensure that agents and reviewers have a consistent understanding of what those categories mean.
Root causes and workspaces
Here, you can view the number of times each root cause has been selected. You can also click a root cause to see the list of conversations and agents that contributed to it.
If you operate multiple workspaces, you can also compare their quality results.
Reviewer and reviewee performance
The Reviewer and Reviewee performance cards show a full overview of all your Zendesk QA users and their individual KPIs.
Use these cards to understand the number of reviews each user is receiving, who is giving and receiving higher or lower scores in comparison to their peers, and how many reviews are given per user.
This helps you measure trends over time.
You can also view the average review time for each reviewer. This can help you ensure reviewers aren't simply rushing to complete reviews.
Scores by category
The Scores by category card lists all your support agents along with their scores for each category in your scorecard.
You can sort the table to quickly understand which agents are struggling the most, or where additional training or improving your knowledge management processes may be required, without the need to manually review individual tickets.
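If you export this card's data (see Downloading Reviews dashboard data below), the same kind of ranking can be reproduced outside the dashboard. As a rough sketch, assuming a CSV with one row per agent and one column per rating category (the file and column names here are hypothetical), you could sort agents by a category score:

```python
import csv

# Hypothetical export: one row per agent, one column per rating category.
# "scores_by_category.csv", "Agent", and "Solution" are assumed names; check your actual export headers.
with open("scores_by_category.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Sort agents by their score in one category, lowest first,
# to surface who may need additional training for that category.
rows.sort(key=lambda r: float(r["Solution"]))
for r in rows[:5]:
    print(r["Agent"], r["Solution"])
```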
Scores by reviews
The Scores by reviews card shows the review ID, a link to the conversation on the help desk, comments given, and the scores per category. You can filter these scores by reviewer or reviewee and order the table by review scores, assignment name, review time, and generally any column on the table.
When filtering for a specific agent, you can determine which conversations they excelled at and where they can improve. This card can help you spot if there are any outliers in conversations and how reviewers are rating specific conversations.
Ratings and root causes by reviews
Similar to the Scores by reviews card, the Ratings and root causes by reviews card provides a breakdown of scores for each conversation. It offers an additional level of detail by breaking down scores for each category and root cause.
Last data update time
You can see when the data was last updated in the last chart of the dashboard (times are shown in UTC).
Dashboard data is updated approximately every 1-1.5 hours for hashtags, helpdesk tags, helpdesk fields, conversations, groups, conversation channels, and assignment name fields. Reviews, ratings, rating categories, comments, disputes, feedback, scorecards, and root causes are updated every 30-35 minutes.
Downloading Reviews dashboard data
You can download the data from the Reviews dashboard as a CSV or PDF file.
To download the Reviews dashboard data
- In the Reviews dashboard, click the Dashboard actions options menu.
- Select Download.
- In the Format field, select either CSV or PDF.
- Click Download.
All dashboard cards are downloaded as individual CSV files in a zip folder.
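If you plan to process the export programmatically, a minimal sketch of unpacking the zip and inspecting each card's CSV might look like this. The file and folder names are assumptions based on the description above, not fixed names used by Zendesk QA.

```python
import csv
import zipfile
from pathlib import Path

# Unpack the downloaded zip of per-card CSV files (archive name is an assumption).
archive = Path("reviews_dashboard.zip")
out_dir = Path("reviews_dashboard")
with zipfile.ZipFile(archive) as zf:
    zf.extractall(out_dir)

# Load each card's CSV and report how many data rows it contains.
for csv_path in sorted(out_dir.glob("*.csv")):
    with csv_path.open(newline="") as f:
        row_count = sum(1 for _ in csv.DictReader(f))
    print(f"{csv_path.name}: {row_count} rows")
```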
You can also download data individually for all dashboard cards, except for Reviewer and Reviewee performance (although this data can still be obtained by downloading the entire Reviews dashboard data).
To download Reviews dashboard data for a specific card
- Click the options menu next to the card.
- In the Format field, select one of the available formats.
By expanding the Advanced data options, you can choose whether to download your results with visualization options applied or as displayed in the data table.
If you apply visualization options, any customizations made in the visualization, such as relabeled columns, hidden totals, or conditional formatting, are reflected in the download where possible but may not match exactly. Choosing 'as displayed in the data table' will include additional fields beyond those currently visible in your dashboard card.
You can also choose whether to download your results with data values formatted or not (for example, rounding, special characters) and specify the number of rows to include. Options include selecting the current table only, all results, or a custom number of rows.
The custom number of rows is limited to 5000; to include more rows, select All Results.
- Click Download.