Summary:
Calibration aligns reviewers by having them rate the same conversations and compare scores, ensuring consistent feedback for agents. You can set up calibration within a workspace, control review visibility, conduct sessions by adding conversations, and mark baseline reviews for comparison. Calibration results are viewed in a dedicated dashboard and do not affect internal quality scores or trigger notifications.
Calibration is the practice of having all your reviewers rate the same batch of conversations and compare their scores and comments. This ensures that your reviewers are aligned in their evaluations, providing consistent feedback to agents regardless of who conducts the review.
Admins, workspace leads, and managers can perform calibrations.
This article contains the following topics:
- Understanding the importance of calibrating reviews
- Setting up calibration for a workspace
- Conducting calibration
- Viewing calibration results
Understanding the importance of calibrating reviews
Calibration ensures that reviewers provide consistent feedback by having them rate the same conversations and compare their evaluations. Regular calibration helps maintain uniformity in reviews, ensuring that agents receive similar feedback regardless of the reviewer.
Agents do not see any conversation-related information in calibration.
No notifications are sent to reviewers, so it's up to the workspace manager or lead to inform reviewers about the calibration session and its due date.
Setting up calibration for a workspace
Calibration can be done within an existing workspace. Calibration scores don't affect the existing Internal Quality Score (IQS).
To set up calibration for a workspace
- In Quality assurance, click your profile icon in the top-right corner, then select Users, bots, and workspaces.
- Choose your workspace.
- Click Calibration.
- Set the Calibration toggle to the on position.

- Select the visibility of previously given reviews:
- Not visible: Previously given reviews are not displayed when a conversation is added to the session.
- Visible: Previously given reviews are regarded as calibration reviews. Only one conversation review per reviewer will be added to the session. Message-specific reviews are not displayed.
- Select the visibility of calibration reviews for:
- Reviewers: Choose whether workspace reviewers see only their own reviews, or also their peers' reviews after submitting their own calibration review.
- Lead: Choose whether workspace leads see evaluations given by other reviewers during a calibration session only after submitting their own review, or always.
- Manager: Choose whether workspace managers see evaluations given by other reviewers during a calibration session only after submitting their own review, or always.
- Click Save changes.
Conducting calibration
Calibration can be conducted within the same workspace where you perform regular reviews.
To conduct calibration
- In Quality assurance, click Tasks in the sidebar.
- Click the plus sign next to Tasks, then select Create calibration session.
You can also view the list of previous Calibration sessions here.
- Enter a name for the calibration session.
- Set a due date for the calibration session. Deadlines can be extended if needed.
- Click Create session.
- When you find a conversation that you want to add to the calibration session, click Add to calibration session.
- (Optional) Click the options menu next to the review you want to be the baseline and select Baseline review. Marking a review as the baseline makes it easier to compare other scores against it. A baseline is marked with the label 'Baseline review.'
Workspace managers can add and remove baseline review designations.

Viewing calibration results
The IQS of the selected workspace remains unaffected by calibration reviews.
You can view the list of conversations and their associated reviews in the Calibration dashboard.
No notifications are triggered, and calibration reviews, comments, and scores do not appear on the main dashboard; they are exclusively available in the Calibration dashboard.
The baseline review is marked with the label 'Baseline'.
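The comparison at the heart of calibration, several reviewers scoring the same conversation and measuring how far each one lands from a designated baseline, can be illustrated with a small, hypothetical calculation. This is only an illustrative sketch of the concept; the function name, score scale, and data shape are assumptions, not the product's actual computation.

```python
# Illustrative sketch only -- not the product's actual computation.
# Given each reviewer's score for the same conversation and the name
# of the reviewer whose review is the baseline, report how far each
# other reviewer deviates from that baseline.

def calibration_deviations(scores, baseline_reviewer):
    """Return each non-baseline reviewer's deviation from the baseline score."""
    baseline = scores[baseline_reviewer]
    return {
        reviewer: score - baseline
        for reviewer, score in scores.items()
        if reviewer != baseline_reviewer
    }

# Example: three reviewers rated the same conversation on a 0-100 scale,
# with the lead's review marked as the baseline.
scores = {"lead": 90, "reviewer_a": 85, "reviewer_b": 70}
print(calibration_deviations(scores, baseline_reviewer="lead"))
# {'reviewer_a': -5, 'reviewer_b': -20}
```

Large negative or positive deviations flag reviewers whose evaluations drift from the baseline, which is exactly the misalignment a calibration session is meant to surface and discuss.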
