Question
I received a bad satisfaction survey response. When I followed up with the customer, they said they never even opened the survey email, or that they did not click a bad response. What is happening?
Answer
What is likely happening in these instances is the end user has a link expander, like an anti-virus checker, installed on their machine or running on their mail server. These programs open links in the email to verify they are not malicious. Since customer satisfaction (CSAT) ratings store the last rating link clicked, a script that clicks every link will register as a bad satisfaction response.
The CSAT survey was designed to garner as many responses as possible. This is not without a tradeoff, because there is no verification in place to prevent automated responses from registering as negative reviews.
There are two potential solutions that decrease the risk of false-negative ratings at the cost of ease of use for the end user.
Solution One: Modify your survey automation to not include direct response links
By default, the Request customer satisfaction rating (system automation) includes a block with both the good and bad response links. You can switch this out for a placeholder that contains a link to a separate page with these options instead.
To accomplish this:
- Click the Admin icon in the sidebar, then select Business Rules > Automations.
- Locate the Request customer satisfaction rating automation, and click on the automation to edit.
- Scroll down to Perform these actions to locate the email body section.
- Locate the {{satisfaction.rating_section}} placeholder and replace it with {{satisfaction.rating_url}}.
For more information on the placeholders available in Zendesk Support, see the article: Zendesk Support placeholders reference.
Solution Two: Create your own satisfaction survey and use the Satisfaction Rating Endpoint
If you are comfortable writing custom code, you can build your own survey and use the Satisfaction Rating endpoint to feed the results into your Zendesk account. This way you can still use Zendesk reporting on the CSAT survey while adding whatever verification you would like.
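As a rough sketch of how such a survey could feed results back in, assuming Python with the requests library: the subdomain, credentials, and verification step below are placeholders, and the exact authentication requirements (Zendesk expects the rating to be created on behalf of the ticket requester) should be confirmed against the Satisfaction Ratings section of the Zendesk API documentation.

# Minimal illustration only: forward an already-verified survey response to
# the Zendesk Satisfaction Rating endpoint. Subdomain, token, and the
# verification step are assumptions, not values provided by Zendesk.
import requests

ZENDESK_SUBDOMAIN = "yoursubdomain"   # hypothetical subdomain
AGENT_EMAIL = "admin@example.com"     # account used for API authentication
API_TOKEN = "your-api-token"          # placeholder credential

def submit_rating(ticket_id, score, comment=""):
    # Call this only after your own verification (for example a simple captcha
    # on the custom survey page) has confirmed the click came from a person.
    if score not in ("good", "bad"):
        raise ValueError("score must be 'good' or 'bad'")
    url = (f"https://{ZENDESK_SUBDOMAIN}.zendesk.com"
           f"/api/v2/tickets/{ticket_id}/satisfaction_rating.json")
    payload = {"satisfaction_rating": {"score": score, "comment": comment}}
    # Zendesk expects the rating to come from the ticket requester; how you
    # authenticate for that depends on your setup and should be confirmed
    # in the API documentation.
    response = requests.post(url, json=payload,
                             auth=(AGENT_EMAIL + "/token", API_TOKEN),
                             timeout=10)
    response.raise_for_status()
    return response.json()

Whatever check you place in front of submit_rating, such as a signed survey link or a simple question, is exactly what a link expander cannot complete, which is how this approach avoids the false negatives described above.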
78 Comments
It is beyond me that Zendesk can't, at a minimum, figure out a way to determine whether a negative CSAT comes from one of these known false positives, so we can avoid an awkward conversation with our clients about this. If you can't prevent these false positives, some intelligence around the possibility that a rating is a false positive would allow us to flag and act on it without unnecessary client intervention.
Hi,
we have the same issue and it's so frustrating.
Zendesk, any update on that?
Hello, like everyone else we encounter this issue and it is getting more and more important.
We used workaround one with the URL, but our satisfaction response rate decreased by 10 points, down to 5%.
I would have liked to send a differentiated survey depending on whether we know the customer has a link expander or not, but that is too complex to handle.
Is there an endpoint to reset the satisfaction survey to null?
If so, we (ideally Zendesk) would be able to add a third, invisible link in the regular satisfaction survey.
That way, an automated system would click this reset link last instead of the Bad evaluation one.
Please Zendesk, do something.
Best regards
We are seeing an increase in false-negative reviews now, and have to spend way too much time on having clients correct them.
But it seems that Zendesk is not gonna do anything about it.
Switching to the suggested solution is not an option, as it would leave us with an even lower response rate, which is already only 5%.
Fully agree with Alexia.
This is becoming a frequent issue that's causing frustration and stress within our team, as we make every effort to follow up on negative feedback to try and draw useful comments from our customers and to have wider discussions around the service they've received. So you can naturally imagine the awkwardness of those conversations when they had no idea that a response was even given!
The only suggested workaround would completely cripple our response rate and is therefore an unacceptable option.
We recently attended an Explore training session to better understand our reporting potential and have been working hard on a number of KPI- and SLA-focused dashboards for review at a senior level. Having to explain each month that our CSAT isn't accurate paints a really unprofessional picture and doesn't put our team in the best light.
Overall it really feels as though we're making excuses. So let's own up to the problem and find a viable solution instead!
As we now expand our Zendesk offering to include additional brands and Agents, we expect the product to be able to satisfy these requirements - this does not reflect well on the normally excellent service Zendesk provides.
It seems the suggestion made a few comments up would be perfect, if someone on the ZD side would just listen and make it a priority to include a link that would snare automated click attempts. Matt Wisse for transparency, as it's a common problem with our account.
"if Zendesk would be able to add a third link (invisible) in the regular satisfaction survey.
Hence, automated system will click last on this reset satisfaction survey link instead of the Bad evaluation one."
The method using {{satisfaction.rating_url}} would be a little better if the rating still saved automatically once it was selected by the customer. As it is, they have to select a rating AND remember to click "Update", which I'm worried will unnecessarily impede the response rate.
Actually this is very serious, and Zendesk should fix it. Please try to introduce a simple captcha in the form (e.g. 1 + 2 = 3).
It is not normal for agents to receive a bad rating like this; these statistics are presented and analyzed by management, and one extra bad rating could cost a person their job.
I cannot understand how this is not considered a major issue by Zendesk; for 2 years nothing has been done.
Can we please get a comment from the Zendesk team on this? We have had to put the majority of our top customers in an alternative, 2-click CSAT survey group which is a ridiculous workaround for an issue that should be addressed on the platform side.
Is there an ETA to fix this? The solutions provided by Zendesk either (1) create a lower response rate due to the need to click the rating elsewhere, or (2) create possible false positives; switching the order of the questions doesn't prevent AV software from clicking each link in the appended CSAT. The only reasonable solutions are to catch these before they are closed, or to tag the orgs that have this software so that they get a different (linked) CSAT. Neither is optimal, and this is a key metric. Please advise.
We tried the alternative and did see a pretty substantial drop in response rate, so we've switched back and just accept it for now, but do hope to see a fix for a known issue.
The issue persists.
We found that these "false" bad ratings usually occurred on the next full hour after the survey was sent.
We still have the same issue; we activated the URL instead, but of course get fewer reviews.
We do hope a solution is provided by Zendesk if possible.
All,
Since my first ever post on this thread 3 years ago, I am shocked that Zendesk hasn't found a fix for this problem, given the number of comments left by other Zendesk users.
What we had to do was replace the URL in the bad rating link with the default rating URL. A customer will always leave a bad comment if they have a bad experience, so taking them to another page is not an issue, but this stops any "automated" systems from leaving you 'false' bad scores.
Replace your existing bad link with the link below but leave the visible name as Bad, I'm Unsatisfied.
<a href="{{satisfaction.positive_rating_url}}">Good, I'm Satisfied</a>
<a href="{{satisfaction.rating_url}}">Bad, I'm Unsatisfied</a>
Hopefully this helps a few of you. I have not seen a drop in CSAT response rate, but I have definitely seen a 100% fix for false responses.
Thanks,
Zulq
Same for us: we send the survey and get a negative rating; when we ask the customer, they say they clicked nothing.
The timestamp is usually on the full hour, as some checker runs and clicks the links during send-out.
Very bad....
The reason the bad reviews land on the full hour (or at 30 minutes past the full hour) is Zendesk's mechanism that collects form responses even when they are not submitted. See the explanation earlier in this thread (why are comments paged at all and not incrementally loaded... it's 2021, Zendesk...):
https://support.zendesk.com/hc/en-us/articles/115012836948/comments/360001829213
Tobias Hermanns - Check out my post and try amending your trigger for the CSAT survey. This should fix your issue like it has for me.
Hi team,
I reviewed the previous comment but don't quite understand it. Did you mean that the virus-scanning program installed on users' devices triggered those "unexpected" bad ratings?
Thanks.