Why am I receiving unexpected bad satisfaction ratings?


65 Comments

  • Chris Swinney

    Any further updates here? It is plainly ridiculous that Zendesk can't handle this properly.

    0
  • Nicole S. - Community Manager
    Zendesk Community Team

    Hi Chris -

    No further updates.

    0
  • Pascal Turmel

    We are also affected by this issue. It's the same dilemma: either halve your CSAT response rate by using a redirect URL for all customers, or accept incorrect CSAT results. This is a recent issue. Please address it.

    0
  • Brett - Community Manager
    Zendesk Community Team

    Thanks for taking the time to share this with us, Pascal.

    I completely understand your frustration here and I've passed your feedback along to the appropriate team to review.

    0
  • Chris Swinney

    We have raised yet another ticket with support over this issue, but we have had the same robotic response. In the six months to the beginning of the year, by my reckoning Zendesk had a 95% failure rate in accurately recording bad satisfaction from our customers. This is alarming. Support's solution is for us to engage professional services, meaning that we would have to pay additional money to resolve Zendesk's own failure. I personally find this unacceptable.

    3
  • David Kirkaldie

    Hello,

     

    In cases where a link follower submits a bad rating, either after a customer has already submitted a rating or when they have not submitted one at all, why not create a third link that sets the survey back to unanswered? It could sit at the bottom with a short text explanation, or even be hidden as white text on a white background. That way, if a link follower checks every link, the third link flags the vote as bogus instead of leaving a false bad review.
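    Illustration only: Zendesk does not currently expose a placeholder that clears a rating, so {{satisfaction.clear_rating_url}} below is hypothetical; the positive and negative placeholders are the existing one-click links. The email markup for such a hidden third link might look roughly like this:

        <a href="{{satisfaction.positive_rating_url}}">Good, I'm satisfied</a>
        <a href="{{satisfaction.negative_rating_url}}">Bad, I'm unsatisfied</a>
        <!-- Hypothetical placeholder, does not exist today. Styled white-on-white so a
             human is unlikely to click it, while an automated link follower that walks
             every URL will hit it last and reset the false vote. -->
        <a href="{{satisfaction.clear_rating_url}}" style="color:#ffffff;">clear rating</a>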

     

    All the best.

    3
  • Steve Morrell

    I've been tinkering with this today. We have seen a drop in the last few months from 95% satisfaction to 70-80%, and have had many awkward conversations with customers.

    I appreciate that the Zendesk position is "as designed", and I see the logic. However, given that CSAT is our main KPI, having such a huge unknown hanging over it is a major issue. I do wonder what in the queue of CSAT requests could be more business critical than "the CSAT values can't be trusted". I can only assume they are some pretty hearty revenue drivers.

    I do like David's suggestion. Even a placeholder like {{satisfaction.neutral_link}} could suffice, perhaps with some sort of "possible automated action" note added to the ticket as well. I wouldn't mind putting that at the end of messages with a warning to customers that it is there merely to counter spam systems.

    Equally, the ability to reverse the "good" and "bad" links in the ratings section would resolve the issue, or at least make the false results good reviews rather than bad ones. Or give us the ability to add these links manually, so we can order them differently.

    I will try the rating URL approach. Given that we are B2B, we can contact customers directly about this. I do appreciate that this would not work for many users, however.

    Regards,

    Steve 

    0
  • Tiffany

    Zendesk,

    Has there been any movement on this? Even the ability to remove or invalidate these ratings once we confirm they were accidental would help.

    Thank you.

    0
  • Alexia Wolfhagen

    We are experiencing the same issue - we had 3 customers report this in the last week.

    0
  • Steve Morrell

    What I have done is put a switch on certain organizations that I have reason to believe are the source of bad reviews. The switch means they get a link to a survey they have to sign in on, rather than an email with one-click rating links. This has taken our CSAT from ~80% to 100% over the last two weeks. We just had our first bad review in ages, and that was from an account I was tempted to switch over anyway.

    It's OK for me, as we are B2B and can therefore do this broadly by organization and reach out to customers directly for feedback. I don't know how this would work for B2C, however, as there would be no way to know which accounts to switch.

    I do think this is an important feature for Zendesk to resolve. CSAT should be king, and having to spend effort second-guessing the metric itself isn't justifiable.

    I would propose more freedom in the way the CSAT emails are structured, so people can break out the elements as they see fit. There should also be an "ignore result" link that can be embedded somewhere in the email, where it makes sense. If it is the last thing clicked, or clicked at all, the result could be marked as suspicious.

    0
  • Rob Baker

    Hi Steve, 

    Just curious: why not have all of your CSAT go through a 3P if it is demonstrated to be working? Given the sign-on requirement, how are you able to make the experience seamless?

    0
  • Steve Morrell

    Hi Rob,

    As you say, the sign-in is not seamless. We'd probably get more responses from customers receiving a direct email. If I have to switch a significant number of organizations to the sign-on method, then I'll consider it for all of them. For now, it is limited to a known few that are obvious from the metrics. I was able to simply query for any orgs with more than one bad review in the past year.
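    For reference, a rough way to run a similar query from the Support search box, assuming the satisfaction keyword described in the Zendesk search reference is available (the date is illustrative, and both the bad and badwithcomment values may be needed):

        type:ticket satisfaction:bad created>2019-01-01
        type:ticket satisfaction:badwithcomment created>2019-01-01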

    Steve

    p.s. Not sure what you mean by 3P?

    0
  • Andy Stanley

    We're also having this issue with bad ratings on CSAT surveys that customers haven't chosen.

     

    We've only had a handful of these, but I can understand the community's frustration where it is a wider problem. The latest affected customer was using Trend Micro OfficeScan Enterprise, and as they have an Exchange server, their mail is initially swept by Trend Micro ScanMail.

    0
  • Tiffany

    This isn't exactly the same use case, but it may help. We set up an automation to send an email to anyone who gives us a bad rating. It fires a week after the rating comes in, by which point we've hopefully given the user some extra love and attention, and it asks if they'd consider changing their rating. It includes the rating link so they can do so. Since this happens before the ticket moves from Solved to Closed, they can still update the rating.
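    For anyone wanting to copy this, a minimal sketch of such an automation, assuming tickets are rated around the time they are solved and that they stay in Solved for at least a week (exact condition labels may differ slightly in your instance):

        Meet ALL of the following conditions:
          Ticket: Status             | Is            | Solved
          Ticket: Satisfaction       | Is            | Bad
          Ticket: Hours since solved | (calendar) Is | 168
        Actions:
          Notifications: Email user  | (requester)
          Email body:  We'd love a second chance. If you'd like to update your rating,
                       you can do so here: {{satisfaction.rating_url}}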

     

    It might not solve the root cause, but it could at least help you get a fair score if they update it... or not :-/

     

    1
  • Chris Hammond

    We initially encountered this issue back in January, at which point we were only receiving these "false" negative CSAT ratings from one particular customer, who we confirmed was using MessageLabs. I was able to resolve this by creating a separate automation for users from that organisation (directing them to the {{satisfaction.rating_url}} link instead of the Positive or Negative variants) while retaining the standard one-click rating links for the rest of our customers.
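    A minimal sketch of such a per-organisation variant, assuming the standard satisfaction automation conditions; the organisation name is a stand-in and exact condition labels may vary:

        Meet ALL of the following conditions:
          Ticket: Status             | Is            | Solved
          Ticket: Hours since solved | (calendar) Is | 24
          Ticket: Satisfaction       | Is            | Unoffered
          Ticket: Organization       | Is            | Example Org (stand-in name)
        Actions:
          Notifications: Email user  | (requester)
          Email body:  Please take a moment to rate the support you received:
                       {{satisfaction.rating_url}}

    The standard one-click automation then needs a matching "Ticket: Organization | Is not | Example Org" condition so the two do not overlap.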

    This has worked perfectly for the past 9 months, but unfortunately something seems to have changed recently. Since Monday (October 14th), we have received 5 negative ratings, each from a different customer and with no obvious correlation between them (mobile vs. desktop, email client, relation to the previously mentioned organisation, time, etc.). After following up with each of these users, they all claimed that they were either not aware that they had received a satisfaction survey or had otherwise not intentionally rated us poorly.

    While it is possible that all of these customers adopted MessageLabs or other similar antivirus/link expanding software at the same time, it is quite peculiar that this would begin to negatively affect our customer satisfaction ratings all within the same week. I'm curious whether this is due to an update on Zendesk's end or simply a massive and frustrating coincidence.

    I have since updated our CSAT automations to fully use the {{satisfaction.rating_url}} link instead, and will likely keep it this way for the foreseeable future. While this will no doubt reduce our overall response rate, it is better than the alternative of receiving unintentional/inaccurate negative ratings. It's a shame that we aren't able to continue to use the one-click links with confidence, as they had served us well for several years prior.

    I agree with earlier comments which suggest including a third option to remove a rating which could be hidden at the end of an email (making it the last link to be checked and cancelling out any automated positive/negative ratings).

    1
  • Tal Admon

    Here's an interesting suggestion from Marketo's community on handling the same issue of false clicks by bots:

    The best way we've found right now is to include a one-pixel picture/link in the email, invisible to just about everyone (as suggested here). Anything that clicks on such a tiny pixel can be considered a bot. True, someone who doesn't load images might see a small box, but most people won't see it at all.

    Taken from here:
    https://nation.marketo.com/thread/3428
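    Illustration only: Zendesk has no built-in bot-trap placeholder, so the URLs below are stand-ins for an endpoint you would host and monitor yourself; {{ticket.id}} is a standard placeholder. In an HTML email, the hidden pixel link might look roughly like this:

        <!-- Stand-in bot trap: a human is very unlikely to click a 1x1 pixel, so any
             request to this URL suggests an automated link follower scanned the email. -->
        <a href="https://example.com/bot-trap?ticket={{ticket.id}}">
          <img src="https://example.com/pixel.png" width="1" height="1" alt="">
        </a>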

    Jeremy R., Zendesk Product Specialist - is this something you can positively consider in order to solve this issue and sanitize the CSAT survey?

    1
  • Steven Godbey

    We've also just recently (within the last week) started seeing an abnormal number of bad satisfaction ratings coming in, even though, looking at the tickets, everything was resolved in a timely manner, so there'd be no reason for them. The few users we reached out to said they never clicked on a rating. It feels odd.

    1
  • Devon Nobrega

    We've also noticed that in the past week or so we've been receiving an abnormal number of bad satisfaction ratings. Most end users state they didn't submit a rating at all, and one other attempted to give a good rating but wasn't able to submit it, so they closed the survey window. We then received a bad rating on that ticket.

    2
  • Adam Finch-Turner

    We have also had a number of unintentional bad feedback ratings in the last couple of days. This was not previously a problem in our environment. Has Zendesk changed something on their end? We have not made any changes to the method by which we deliver the feedback survey, and there is no correlation across the end users who have given us this unintentional feedback. Can Zendesk investigate these problems please? Our CSAT rating for November, which as others have said is a key KPI for us, is now unreliable.

    2
  • Harry Jenkinson

    Hi,

    We have also seen a massive uptick in false negative ratings. We use Zendesk for internal IT management, so the vast majority of our tickets come from internal users.

    We have made no changes to our email monitoring system. Please can we get an indication from Zendesk of when this will be properly looked at?

     

    2
  • Patrick Kennedy

    Add us to the list of customers who have seen an abnormal rise in "bad" ratings within the last month. We will try the URL solution, but reading all the comments here over the last year, it is surprising and disappointing that Zendesk hasn't been able to provide a better answer for such a critical KPI for so many of your customers.

    1
  • Cesar Hernandez

    We are also experiencing these false negatives.

    We've been looking into how to avoid these false negatives without the drop in customer response rates that comes with using {{satisfaction.rating_url}} as suggested, since it requires the customer to perform multiple clicks and, more importantly, users tend to avoid anything that looks like a lengthy satisfaction survey.

    One suggestion: something similar to {{satisfaction.positive_rating_url}} and {{satisfaction.negative_rating_url}} that lands on the satisfaction page with the choice pre-selected, BUT without the automatic submit, and with the "Update" button replaced by "Click here to submit your rating".
    This two-click approach, which still gives the option of adding a comment, would probably be a better compromise than the auto-submitting email links, which generate false ratings, or the rating_url, which requires multiple clicks/selections and looks like a lengthier survey.

    This way, we could still ask the customer that same simple question:
    How do you rate the support you received?
      Good, I'm satisfied
      Bad, I'm unsatisfied.

    But on the landing page they would simply be invited to click the "Submit your rating" button and optionally add a comment.

    Any thoughts on how to perform something like that?

    0
  • Valerie Conlon

    How do we turn the {{satisfaction.rating_url}} placeholder into a user-friendly link? When I add {{satisfaction.rating_url}} to the email body of the automation, I would like it to show as 'Ticket Satisfaction Feedback' with a hyperlink, rather than as https://[domain].zendesk.com/requests/183965/satisfaction/new/qaYlN08LrvcNKqrFu3LutblOn?locale=1

    Email body configured as:

    Hello {{ticket.requester.first_name}},

    We'd love to hear what you think of our customer service. Please take a moment to rate the support you received:
    {{satisfaction.rating_url}}

    Here's a reminder of what your ticket was about:
    {{ticket.title}}
    {{ticket.public_comments_formatted}}

    Email sent to user shows as:

    ##- Please type your reply above this line -##

    Hello Valerie,

    We'd love to hear what you think of our customer service. Please take a moment to rate the support you received:

    https://[domain].zendesk.com/requests/183965/satisfaction/new/qaYlN08LrvcNKqrFu3LutblOn?locale=1

    Here's a reminder of what your ticket was about:
    ...
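    One likely approach, assuming the account renders HTML in automation email bodies (this depends on the email template and settings, so treat it as an untested sketch), is to wrap the placeholder in an anchor tag:

        <a href="{{satisfaction.rating_url}}">Ticket Satisfaction Feedback</a>

    If the HTML arrives escaped or stripped, the plain URL placeholder remains the fallback.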

    1
  • Chad Dodds

    We are also still receiving unexpected bad CSAT ratings from customers. This produces an awkward interaction with our customers when we try to follow up and provide a better response, and they say they never clicked a rating. It is also causing agents to distrust the ratings and Zendesk to a degree.

    We are B2B and have already implemented an org-level and a user-level checkbox to replace our standard survey with an alternate survey that uses the {{satisfaction.rating_url}} placeholder instead of {{satisfaction.rating_section}}. This is a reactive solution and does not prevent the problem. We do not want to make this alternate survey the default, since we anticipate it would decrease our great (30%+) CSAT survey response rates.

    The current workarounds are insufficient. Even just reversing the order of links in the rating section would be preferred so the ticket is left with an accidental good rating, not bad. This has been a problem for a long time and it is now happening more frequently. As I stated in my support ticket, my team and I would like to know what Zendesk is doing to address this problem.

    2
  • Rob Baker

    Chad Dodds, thanks for sharing your perspective and insight on this issue. Your 30% response rate is amazing, and that is exactly the concern I have with the proposed workaround of using the URL instead of the rating_section. We should not be implementing anything CSAT-related that adds a barrier to ticket requesters rating their interactions with agents; the higher the response rate, the more statistically meaningful the CSAT percentage is. The only thing I would object to is the proposed re-ordering of the rating section URLs. That idea was already surfaced, and then retracted, earlier on this thread 9 months ago. False negative CSAT is a pain (we all agree): only the requester can change it, it creates awkward conversations, and it skews metrics in ways that have adverse consequences for KPIs and built-in Insights reporting. However, missing a real bad CSAT that has been falsely reported as good defeats a fundamental purpose of support, of CSAT, and of why we use a ticket management system like Zendesk to facilitate our customer-facing interactions.

    2
  • Patrick Snell

    Like everyone else here, we are also seeing an uptick in these "bad" ratings. It's been over a year and I don't see a reasonable resolution (or even a temporary fix) to this problem. It prevents us from accurately gauging our customers' experience through your system. This needs to be fixed and should be made a top priority. This error is causing discrepancies in the data we use.

    2
  • Andrew Soderberg
    Community Moderator

    Rob Baker FYI, flipping the order of the Bad / Good links does not cause you to miss a real bad CSAT. The vote is not one time only. If the automated spam/malware scanning software hits the first link (Good) and stops (which is done before the user reads the email), it does not prevent that customer from reading the email and voting Bad sometime after the first (false) vote. As long as the customer votes while the ticket is still in the Solved state and not yet Closed, they can revote multiple times on that CSAT survey, and you will see this show up in the ticket's Events detail. We have our tickets remain in the Solved state for 6 days.

    We had flipped our link order to Good / Bad for some months to solve the issue of false negatives (for some time these scanning tools only checked the first link per domain). Later we started getting double Good/Bad votes, so we had to remove the instant voting and go with the link to the page where they can vote.

    It looks like Zendesk may have found the source of this issue that caused the spam/malware scanning app vendors to stop trusting or whitelisting Zendesk emails. Check this fix coming in March: 

    https://support.zendesk.com/hc/en-us/articles/360040599713?page=2#comment_360003963894

    Until Zendesk plugs the spam exploit and can show the vendors of the scanning apps that it is no longer a conduit for spam (so that they add Zendesk back to their whitelists), the root cause of the issue here won't be fixed.

    1
  • Mario Amicarelli

    One of my agents just started noticing this a few days ago. Looking back at older tickets that received bad ratings, this appears to be the root cause. I have seen a few good suggestions for a fix here, and I would definitely appreciate the Zendesk developers addressing this issue.

    1
  • Roksan Yazman

    I'd like to bump this up, as we're still seeing a recent uptick in false negatives (the March update shared by Andrew Soderberg, unfortunately, does not seem to have resolved it). At this point we're exploring the separate URL survey workaround for the biggest 'culprits' (i.e., using {{satisfaction.rating_url}} instead of {{satisfaction.rating_section}}), but it feels counter-intuitive to add friction when we are trying to encourage more responses, and it does not safeguard against false negatives from new accounts.

    We'd really appreciate acknowledgement and bumping up the priority of this issue by the Zendesk team!

    1
  • Jerry E.

    We are also still experiencing false negatives (probably around half of all the bad ratings we get) and are not willing to use {{satisfaction.rating_url}} due to the decrease in response rate. The occasional false negative is the lesser of two evils, but it is definitely still an evil. It would be great if Zendesk could fix this, perhaps by adding a third link that clears the rating and is preferably not visible. In the worst case, it would be visible and show something like 'Maybe later'.

    1
