You can report on Zendesk Guide data within Guide itself, or by using Google Analytics or Zendesk Explore. This article answers frequently asked questions about discrepancies in reporting data between these three tools.
This article contains the following topics:
- Why doesn’t the Guide data in Explore match the data I see in Google Analytics?
- Can we fix discrepancies between the two tools?
- Why do deleted or unpublished articles still get views?
- Why don’t I see Knowledge Base data in Explore from before January 18, 2021?
- Why is there a discrepancy between “native” reporting and the Explore dashboard?
Why doesn’t the Guide data in Explore match the data I see in Google Analytics?
Google Analytics is a general-purpose tool for tracking user activity on any website. As a result, it offers a much broader set of features than the Guide integration with Explore.
At the same time, Google Analytics provides less contextual information than the integration between Guide and Explore. For instance, Google Analytics does not automatically differentiate between page views from agents and end users.
Because the two tools were designed with different goals in mind, they are implemented differently. Furthermore, Google Analytics is user-configurable, meaning that its behavior varies widely between individual installations.
Can we fix discrepancies between the two tools?
Generally speaking, no. We have no insight into how Google Analytics is implemented. We can only speculate about how its tracking and aggregation features behave relative to analogous mechanisms that power the Guide integration with Explore.
Why do deleted or unpublished articles still get views?
There are several reasons an article may appear to have been viewed after it was deleted or unpublished.
Most commonly, deleted or unpublished articles receive views if they have multiple translations and only one of the translations has been deleted or unpublished. For example, if you publish both French and English translations of an article, but you delete only the English translation, the French translation might still receive views. To exclude deleted or unpublished translations, you can filter by language in Explore.
It’s also possible for Explore to register a view for an archived article, because Explore relies on the user's web browser to register the activity. Under normal circumstances, the sequence of events that leads Explore to track a view is as follows:
1. A user visits an article URL in their browser.
2. The help center server responds with the contents of the article, along with some code asking the user's browser to register the view with our activity tracking system.
3. The user's browser sends a request to our activity tracking system with the information the help center provided (such as which article was viewed, and by which user).
4. Our activity tracking system receives the request from the user's browser and registers the view.
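The sequence above can be sketched in a few lines of Python. All names here (`serve_article`, `register_view`, the payload fields) are illustrative stand-ins, not Zendesk's actual implementation:

```python
# Illustrative sketch of the view-tracking handshake described above.
# Every name and field here is hypothetical.

def serve_article(article_id: int, user_id: int) -> dict:
    """Step 2: the help center returns the article contents plus a
    tracking payload the browser is asked to send onward."""
    return {
        "html": f"<h1>Article {article_id}</h1>",
        "tracking_payload": {"article_id": article_id, "user_id": user_id},
    }

# Stand-in for the activity tracking system's store of registered views.
registered_views: list[dict] = []

def register_view(payload: dict) -> None:
    """Step 4: the tracking system receives the browser's request
    and records the view."""
    registered_views.append(payload)

# Steps 1 and 3, as performed by the user's browser.
response = serve_article(article_id=42, user_id=7)   # steps 1-2
register_view(response["tracking_payload"])          # steps 3-4

print(registered_views)  # [{'article_id': 42, 'user_id': 7}]
```

Note that step 3 happens entirely on the user's machine, which is why it is the step that can be delayed or lost.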
In most cases, step 3 happens almost immediately after step 2; it takes only milliseconds for the user's browser to make the request. However, because the browser's behavior is not something we control, we cannot say for sure how long step 3 will take.
For example, if the user's internet connection cuts out between steps 2 and 3, and they later reconnect, we may register the activity only when they reconnect. The user's browser might also crash, in which case step 3 happens only after they restart the browser.
When we finally receive the tracking event, we can be confident that the user did, in fact, view the article. However, we cannot make guarantees about when the view actually occurred on the basis of when the tracking event was received.
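A small worked example makes the timing point concrete. Assuming, for illustration only, that the tracking system timestamps a view when the event arrives, a view that happened while the user was offline is counted days later:

```python
# Hypothetical illustration of receipt time vs. view time.
# The field names and the two-day delay are made up for this example.
from datetime import date, datetime, timedelta

viewed_at = datetime(2021, 3, 1, 9, 0)       # when the user actually opened the article
received_at = viewed_at + timedelta(days=2)  # browser was offline; event flushed 2 days later

# The tracking system only observes the arrival time, so the view is
# counted on March 3 even though it occurred on March 1.
registered_view = {"article_id": 7, "registered_at": received_at}

print(registered_view["registered_at"].date())  # 2021-03-03
```

This is why a deleted or archived article can still accumulate views: the views happened before the article was removed, but the tracking events arrived afterward.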
Why don’t I see Knowledge Base data in Explore from before January 18, 2021?
The Knowledge Base dataset in Explore launched on January 18, 2021. Data from before that date is not available in Explore.
Why is there a discrepancy between “native” reporting and the Explore dashboard?
We are replacing “native” reporting dashboards in the Zendesk agent interface with Explore datasets. For example, the Knowledge Base dataset replaces the Knowledge Base tab in the reporting section of the agent interface. While we aim to provide feature parity between the two reporting tools, they are implemented differently. As a result, discrepancies might exist. Once a dataset is available in Explore, it should be considered the source of truth.
"We are replacing “native” reporting dashboards in the Zendesk agent interface with Explore datasets. For example, the Knowledge Base dataset replaces the Knowledge Base tab in the reporting section of the agent interface. While we aim to provide feature parity between the two reporting tools, they are implemented differently. As a result, discrepancies might exist. Once a dataset is available in Explore, it should be considered the source of truth."
This doesn't really make any sense from a reporting perspective. I'm seeing massive differences in the numbers and the answer being "well they just aren't quite the same" doesn't really cut it when we're talking about 20% +/- differences in the number of visitors (and we're talking 4 digits and higher visitor counts only).
As a customer, I need to be able to trust the accuracy of the data and being told to just "accept this new version in Explore as the source of truth" with absolutely no explanations as to what exactly you're doing differently, doesn't give me any confidence in either reporting methodology. Can you tell us exactly why these are coming up so wildly differently? Can you provide some transparency here about how you're changing your collection methods and why the results are so drastically different?