Question
I received a bad satisfaction survey response. When I followed up with the customer, they confirmed that they either never opened the survey or never intended to give a bad rating. Why are we receiving false bad ratings?
Answer
Your customers likely have link expanders installed on their computers or running on their mail servers. These programs open links in the email to verify that they aren't malicious. Since customer satisfaction (CSAT) ratings store the last rating link clicked, a script that clicks every link will register a bad satisfaction response.
Setting up DKIM and SPF records helps reduce the chance that links are scanned. These email authentication protocols decrease the likelihood that link expanders will flag emails from your Zendesk account for inspection.
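As a rough illustration, the DNS records typically look like the following. The domain, selector names, and the `mail.zendesk.com` include are examples; check Zendesk's current SPF and DKIM documentation for the exact values for your account:

```dns
; SPF: authorize Zendesk to send mail on behalf of example.com (illustrative)
example.com.                      IN TXT   "v=spf1 include:mail.zendesk.com ~all"

; DKIM: CNAME records pointing at Zendesk's signing keys (selector names are examples)
zendesk1._domainkey.example.com.  IN CNAME zendesk1._domainkey.zendesk.com.
zendesk2._domainkey.example.com.  IN CNAME zendesk2._domainkey.zendesk.com.
```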
If you have set up SPF and DKIM and are still seeing issues, there are two workflow options to resolve the issue:
- Modify your survey automation to not include direct response links
- Create your own satisfaction survey and use the Satisfaction Rating Endpoint
Modify your survey automation to not include direct response links
By default, the Request customer satisfaction rating (system automation) includes a block with both the good and bad response links. Switch this out for a placeholder that contains a link to a separate page with these options instead.
To modify your survey
- In Admin Center, navigate to the Automations page
- Locate the Request customer satisfaction rating automation, and click on that automation to edit
- Scroll down to Perform these actions to locate the Email body section
- Locate the {{satisfaction.rating_section}} placeholder and replace it with {{satisfaction.rating_url}}
For more information on placeholders available in Zendesk Support, see the article: Zendesk Support placeholders reference.
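For illustration, the change in the email body might look like the following. The link text is an assumption, not the default template wording:

```html
<!-- Before: the section placeholder renders one-click Good/Bad links -->
{{satisfaction.rating_section}}

<!-- After: a single link to a page where the customer actively chooses a rating -->
<a href="{{satisfaction.rating_url}}">Please rate your support experience</a>
```

Because the rating page requires the customer to click a choice after the page loads, a scanner that merely opens the link does not register a rating.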
Create your own satisfaction survey and use the Satisfaction Rating Endpoint
If you are comfortable with writing custom code, you can make your own survey and use the Satisfaction Rating endpoint to feed the results into your Zendesk. This will ensure that you can still use Zendesk reporting on the CSAT survey. With a custom survey, you can add a verification step to thwart the link expanders.
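As a minimal sketch of the API side (function names and the bearer-token auth here are assumptions; see the Satisfaction Ratings API reference for the exact authentication requirements, since ratings are normally created on behalf of the ticket requester), a custom survey backend could forward a verified response like this:

```python
import json
import urllib.request


def build_rating_payload(score: str, comment: str = "") -> dict:
    """Build the request body for Zendesk's Satisfaction Rating endpoint.

    Rejecting anything other than "good" or "bad" early means a
    scanner-triggered or malformed request can't record a rating by accident.
    """
    if score not in ("good", "bad"):
        raise ValueError(f"unsupported score: {score!r}")
    payload = {"satisfaction_rating": {"score": score}}
    if comment:
        payload["satisfaction_rating"]["comment"] = comment
    return payload


def post_rating(subdomain: str, ticket_id: int, token: str,
                score: str, comment: str = "") -> None:
    # POST /api/v2/tickets/{ticket_id}/satisfaction_rating.json
    # `token` is illustrative; adapt the Authorization header to your auth setup.
    url = (f"https://{subdomain}.zendesk.com/api/v2/tickets/"
           f"{ticket_id}/satisfaction_rating.json")
    req = urllib.request.Request(
        url,
        data=json.dumps(build_rating_payload(score, comment)).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()
```

The key design point is the verification step: only call `post_rating` after the customer has actively confirmed their choice on your survey page, so an automated link click never reaches the Zendesk endpoint.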
82 comments
Roksan Yazman
Can we please get a comment from the Zendesk team on this? We have had to put the majority of our top customers in an alternative, 2-click CSAT survey group which is a ridiculous workaround for an issue that should be addressed on the platform side.
PT Dadlani
Is there an ETA to fix this? The solutions provided by Zendesk either (1) create a lower response rate due to the need to click the rating elsewhere or (2) possibly create false positives; switching the order of the question doesn't prevent AV software from clicking each link in the appended CSAT. The only reasonable solution is to catch these before they are closed, OR to tag the orgs that have this software so that they get a different CSAT (linked). Neither is optimal, and this is a key metric. Please advise.
Andy Dyer
We tried the alternative and did see a pretty substantial drop in response rate, so we've switched back and just accept it for now, but do hope to see a fix for a known issue.
Candy Chang
The issue persists.
We found that these "false" bad ratings usually occurred on the next full hour after the survey was sent.
Julie Vande Ginste
We still have the same issue. We activated the URL option instead, but get fewer reviews, of course.
We do hope a solution is provided by Zendesk if possible.
Zulq Iqbal
All,
Since my first ever post on this thread 3 years ago, I am shocked that Zendesk haven't found a fix for this problem, given the number of comments left by other Zendesk users.
What we had to do was replace the bad URL link with the default URL. A customer will always leave a bad comment if they have a bad experience so taking them to another page is not an issue, but this stops any "automated" systems leaving you 'false' bad scores.
Replace your existing bad link with the link below but leave the visible name as Bad, I'm Unsatisfied.
<a href="{{satisfaction.positive_rating_url}}">Good, I'm Satisfied</a>
<a href="{{satisfaction.rating_url}}">Bad, I'm Unsatisfied</a>
Hopefully this helps a few of you. I have not seen a drop in CSAT response rate, but I have definitely seen a 100% fix of false responses.
Thanks,
Zulq
Tobias Hermanns
Same for us: we send the survey and get negatives; we asked the customer, and they say they clicked nothing.
The timestamps are usually on the full hour, as a checker runs and clicks the links during send-out.
Very bad...
Tal Admon
The reason the bad reviews land on the full hour (or at 30 minutes past the hour) is Zendesk's mechanism that collects form responses when they are not submitted. See the explanation earlier in this thread (why are comments paged at all and not incrementally loaded... it's 2021, Zendesk...)
https://support.zendesk.com/hc/en-us/articles/115012836948/comments/360001829213
Zulq Iqbal
@... - Check out my post and try amending your trigger for CSAT survey. This should fix your issue like it has for me.
Candy Chang
Hi team,
I reviewed the previous comment but don't quite understand it. Did you mean that a virus-scanning program installed on users' devices triggered those "unexpected" bad ratings?
Thanks.
Shelly Gurrola
Is there any update to this? I see all of this from 2 years ago. I can't imagine how it could take this long to resolve this issue, especially when (in my case) I tie performance reviews, customer health ratings, and other KPI metrics to something that is struggling. I and others have waited patiently. I am praying that this has been taken care of and I simply missed the update... because as of today I AM STILL HAVING THE ISSUE AND AM VERY FRUSTRATED.
Jimmy Rufo
For anyone following in the community
Is anyone aware of a solution where we can keep the satisfaction rating section, but force any negative rating to require a comment on the following screen explaining why they left the negative? Since the main culprit here is false negative ratings, if there were a way to force just the negatives to use the {{satisfaction.rating_url}} option to validate the rating with a comment, we'd all benefit.
Mario Amicarelli
I recommend using what @... outlined above, as this worked well for my organization:
"What we had to do was replace the bad URL link with the default URL. A customer will always leave a bad comment if they have a bad experience so taking them to another page is not an issue, but this stops any "automated" systems leaving you 'false' bad scores.
Replace your existing bad link with the link below but leave the visible name as Bad, I'm Unsatisfied.
<a href="{{satisfaction.positive_rating_url}}">Good, I'm Satisfied</a>
<a href="{{satisfaction.rating_url}}">Bad, I'm Unsatisfied</a>"
This method ensures the CSAT email looks the same to the end user, but doesn't automatically select a rating if the end user clicks the "Bad, I'm Unsatisfied" link, and therefore eliminates false negative ratings.
Thanks for the recommendation, @...!
Tiffany Zwolakowski
@Jimmy
We put in two automations for a Bad or Bad with Comment Satisfaction score. The first emails our Agent alerting us of the bad score, so we can reach out and attempt to remedy it.
Then, a second automation called "Bad Sat Score 2nd Chance" sends the user an email 72 hours after the ticket is solved, saying we're sorry we disappointed them and asking if we've since solved the problem and, if so, whether they would consider updating the score. It then includes the rating button so they can update it.
It's not perfect and doesn't solve the issue, but at least we know when it happens and the requester has the chance to update the score before it's solidified.
John Tieu
+1 to getting an official response from Zendesk on this, the dramatic decrease in response rate from the official workaround is troublesome.
While I appreciate the workaround provided by Zulq, it's not always true in our world that a negative survey would have a comment.
Shelly Gurrola
Is there a way to set the default auto-response, if a virus protector opens the CSAT request, to choose Satisfied instead of assuming the worst?
Michael Mader
We are looking into adding "Satisfaction reasons" to our Zendesk instance. I'm curious if anyone has live experience with this and whether it would also select the top reason, giving us not only a false bad rating but also a false reason. I understand we can customize this and leave a blank field at the top, but I'd rather not use one of those limited fields for that.
Thanks in advance!
Mitchell Cole
@...
Hi all,
My team is now experiencing the false negative satisfaction ratings at the top of the hour discussed in this article. I have not seen a comment on this thread in 2 years; is there any update or resolution besides the same fix (included below) that was posted several times already?
"Locate the {{satisfaction.rating_section}} placeholder and replace it with {{satisfaction.rating_url}}"
Tim Sexton
Sean - As stated in some previous posts above, I use the code below, which seems to work well in preventing false negative reviews. All other workarounds seemed to cut down our overall satisfaction ratings altogether. The code below only uses the {{satisfaction.rating_url}} option for negative reviews, while positive reviews are still the single-click option of {{satisfaction.positive_rating_url}}:
Jack S. Edwards
We still get false negatives even using the rating URL. We have reached out to the customers and in almost all cases they never even left a rating but we are still seeing them.
Nemo
Hi @...
https://support.zendesk.com/hc/en-us/articles/115012836948/comments/1260800739009
Thanks so much for posting this solution! Really appreciated it!
We had been using the [rating_URL] option, trying to get away from false negatives, but the acceptance rate took a big hit. We started using your suggested solution in July, and it's much better now.
Big thanks!
(While we still hope Zendesk could fix the actual defect there)
Kyle Clark
I switched to the 2 link survey suggested by Zulq Iqbal but am now getting false positives (indicated by the tell-tale sign of the survey result being registered about an hour after Offered). I'm all for positive surveys... but not false ones. I switched back to the alternate single link survey (with much, much lower response rates).
Zulq Iqbal
Hi Kyle
That's weird; I have not received a false positive since I've been running this method, over 2 years now. Even from those running "on the hour" or "30 mins."
This must be another issue.....
Bill Habegger
Does anyone have a way to determine if there are also false "Good" ratings occurring?
Blair Freer
Having your customers alter the default setting is not a solution, but a workaround. This issue has been reported for several months now, and it is quite frustrating that it hasn't been fixed so new users of the survey don't have to go through this nonsense. We had only recently turned on this feature, and I've already had multiple false negative responses. If you are confident that there are no false positive results from your suggestion, why not make this the default programmatically so new users don't have to experience it?
Bill Habegger
As I have worked through this issue on our end and considered ways to get around it, I have come to the realization that this is not an issue Zendesk can solve. The problem is two-fold.
First, there is a desire to make the CSAT survey simple to complete so as to increase participation. The simplest way to do that is to have a single click record the "good" or "bad," which is what Zendesk currently implements. There is no need to click anything else on the webpage that opens; the "good" or "bad" is already recorded.
Second, spammers and phishers exist on the internet. To effectively reduce the potential for these types of attacks, antivirus and email systems will look at links and even click them to check for malicious sites. The only way to avoid generating false results from this is to require user interaction on the website that opens. This breaks the first point.
Zendesk's recommendation of changing to the rating link does stop false results, but it also breaks the first point. If you build your own CSAT survey, you will have the exact same issue.
One other observation from my experience is that false negatives come in bursts. We did not experience this issue for multiple months; then, over the course of a week, we received a large number; then it dropped to almost none.
Finally, based on a Zendesk article we read about increasing CSAT results, we have changed the timing of the survey to be sent 10 hours (instead of 1) after the ticket is solved. That puts the CSAT survey in the individual's inbox first thing in the morning. In addition, any false results occur overnight and are easier to spot: if a rating comes in outside our normal business hours, it is likely not a true response.
Andy Dyer
Is anyone else seeing higher rates of bad ratings now? The last couple of weeks are 2x normal, and there is no obvious explanation. It does seem like this problem happens inconsistently.
Bill Habegger
Andy Dyer: We were using CSAT for approximately 6 months without a single "false bad," but then this October and December we experienced a very high rate of "false bads." This week it seems to have stopped.
I have created a trigger that sends me an email anytime a ticket is rated as "bad". I have included the time of the last change and a link to the ticket that shows the events. This gives me a quick way to flag "false bads." Unfortunately, I do not have a way to easily exclude them from the CSAT ratings. I manually calculate our CSAT score after removing the ones that I know are false.
John Tieu
Andy Dyer: We've seen quite an influx of false negatives starting around October, which has persisted through December.
This is causing a rather significant impact to our overall CSAT by several percentage points.
Is anyone else seeing the same? Is there any input from Zendesk here?