
Zendesk on Zendesk: How we manage customer satisfaction surveys and ratings



Posted Sep 27, 2016

By Anna Rozentale

Zendesk on Zendesk is a discussion about a specific topic and how Zendesk Support uses Zendesk.

This session is about how we manage customer satisfaction surveys and ratings, and is hosted by Anna Rozentale, a Support Operations Manager in our Dublin office.

See all of our Zendesk on Zendesk series discussions.

Intro

At Zendesk, we believe that our customers' feedback helps us improve and grow, both as a support organization and as a product. We actively solicit feedback after each support interaction with a survey. Replies to this survey are the primary source of our CSAT score, which we track to see how we're doing as a support team.

However, this isn't where it ends: comments left in response to the survey may translate into product improvements or trigger reviews of processes and policies. In other words, we want to hear what customers have to say, be it positive or not so much!

There are cases where we don't send out the survey. There are ways we surface survey comments internally to be visible to the entire company. There are tags that prevent surveys from being sent on a ticket and guidelines around using them. Finally, there’s a newer Zendesk feature that lets you drill down into the reasons for bad satisfaction ratings. This article explains how we navigate the intricate workings of the CSAT survey and ratings.

CSAT survey overview

The CSAT survey is sent directly to the ticket requester about 24 hours after the ticket status has been changed to Solved (unless it's been reopened). We can never be sure of the exact timing because the survey is sent by an automation that runs once every hour and picks up any tickets that have been in a Solved state for over 24 hours. The automation's title is Request customer satisfaction rating (system automation).

The automation then performs two actions: it sends a dynamic content placeholder as an email to the requester, and changes a ticket attribute to "Satisfaction: Offered". This is to ensure we don't send the survey twice per ticket. One of the conditions of the automation is, of course, that this attribute has to be "Unoffered" for the automation to run.
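To make this concrete, here is a rough sketch of how an automation like this could be created through the Zendesk Automations API using Python. The subdomain, credentials, subject line, and the exact condition and action field names are illustrative assumptions rather than our production configuration, so check them against the current API reference before reusing anything like this.

import requests

# Assumed subdomain and API token; replace with your own.
BASE = "https://yoursubdomain.zendesk.com/api/v2"
AUTH = ("admin@example.com/token", "YOUR_API_TOKEN")

# Sketch of a CSAT automation: if a ticket has been Solved for more than
# 24 hours and the survey hasn't been offered yet, email the requester the
# survey placeholder and mark the survey as offered so it never fires twice.
automation = {
    "automation": {
        "title": "Request customer satisfaction rating (sketch)",
        "conditions": {
            "all": [
                # Hours since the ticket was set to Solved.
                {"field": "SOLVED", "operator": "greater_than", "value": "24"},
                # Only run while the survey is still unoffered.
                {"field": "satisfaction_score", "operator": "is", "value": "unoffered"},
            ],
            "any": [],
        },
        "actions": [
            # Email the requester; the body is the survey text shown below,
            # including the {{satisfaction.rating_section}} placeholder.
            {
                "field": "notification_user",
                "value": [
                    "requester_id",
                    "How would you rate the support you received?",
                    "Hi {{ticket.requester.name}},\n\n"
                    "We want to make sure you're happy with our customer service. "
                    "Please let us know by answering one quick question below:\n\n"
                    "{{satisfaction.rating_section}}\n\nThanks!\nYour friends at Zendesk",
                ],
            },
            # Flip the attribute to "Offered" so the "unoffered" condition
            # stops matching this ticket on the next hourly run.
            {"field": "satisfaction_score", "value": "offered"},
        ],
    }
}

response = requests.post(f"{BASE}/automations.json", json=automation, auth=AUTH)
response.raise_for_status()
print("Created automation", response.json()["automation"]["id"])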

Here’s what the text of the survey looks like:

 

Hi {{ticket.requester.name}}, 

We want to make sure you're happy with our customer service. Please let us know by answering one quick question below:

{{satisfaction.rating_section}}

Thanks!

Your friends at Zendesk

Here’s a reminder of what your ticket was about:

{{ticket.comments_formatted}}

The above message is translated into the 10 additional languages that we support: Chinese (Simplified and Traditional), Japanese, Portuguese, Spanish, French, German, Italian, Russian, and Dutch. The text is, of course, fully customizable, so use your company's voice to keep the brand message consistent across the support interaction.

Using tags and conditions in your automation

In our Zendesk instance, we use certain tags to prevent the CSAT survey automation from firing. The list of tags that stop the survey is quite long and subject to change. To make this easier for our Customer Advocates, we have added these tags to a number of macros where applicable.

For example, sometimes a user of one of our customers' products or services contacts us for support. In that case we respond with a polite "Sorry, we're just the software provider!" macro, and we don't want to send a survey to this requester because we didn't actually provide substantial support. The macro used in these cases already applies the necessary tag to stop the survey from firing, as sketched below.
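As a sketch, a macro like that could be created through the Macros API roughly as follows. The macro title, reply text, and tag are placeholders, and the action field names reflect our reading of the API rather than the exact macro we use in production.

import requests

# Assumed subdomain and API token; replace with your own.
BASE = "https://yoursubdomain.zendesk.com/api/v2"
AUTH = ("admin@example.com/token", "YOUR_API_TOKEN")

# Sketch of a macro that posts the polite reply and adds the tag that the
# CSAT automation's conditions treat as an opt-out.
macro = {
    "macro": {
        "title": "Sorry - we're just the software provider!",
        "actions": [
            # Add (rather than replace) the opt-out tag.
            {"field": "current_tags", "value": "no_csat"},
            # Public reply sent to the requester.
            {
                "field": "comment_value",
                "value": "Thanks for reaching out! We build the software your "
                         "provider uses, so their support team is the best place "
                         "to get help with your account.",
            },
        ],
    }
}

resp = requests.post(f"{BASE}/macros.json", json=macro, auth=AUTH)
resp.raise_for_status()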

The survey will not be sent if the requester is also an agent in our Zendesk instance. Moreover, the automation excludes several non-customer-facing groups (for example, our recruitment team sometimes reviews job application tickets that don't come in through regular career pages as tickets out of the same instance), as well as non-email channels, since the survey isn't tailored to those. For example, the no_csat tag prevents the survey from being sent for tickets assigned to Legal or Security, as sketched below.
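Concretely, these exclusions live in the automation's conditions. The snippet below sketches what that condition block could look like if added to the automation sketch earlier; the field names, operators, via ID, and group IDs are assumptions to be checked against your own instance.

# Extra conditions for the CSAT automation (added to its "all" list) so it
# skips opted-out tickets, non-email channels, and non-customer-facing groups.
# Field names, operators, and IDs are illustrative assumptions.
exclusion_conditions = [
    # Skip any ticket carrying the opt-out tag applied by macros or triggers.
    {"field": "current_tags", "operator": "not_includes", "value": "no_csat"},
    # Only survey tickets that came in over email (via ID assumed to be 4).
    {"field": "via_id", "operator": "is", "value": "4"},
    # One condition per excluded group, e.g. Legal or Security.
    {"field": "group_id", "operator": "is_not", "value": "LEGAL_GROUP_ID"},
    {"field": "group_id", "operator": "is_not", "value": "SECURITY_GROUP_ID"},
]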

Communication is really important in these scenarios. Be transparent: let your support team know what the tags are and explain that the best way to ensure everything works as designed is to use macros and rely on triggers implemented by your admins. Discourage them from applying tags manually to stop the survey from firing on tickets where they feel they're less likely to get a positive rating, and remind your advocates that survey feedback is not only a reflection of their individual work. If you're managing a large support team, it might make sense to restrict editing tags on tickets using custom roles.

Surfacing feedback outside of your support organization

As we've already established, the CSAT survey comment is more than a reflection on one particular support interaction, and the feedback is relevant to more than just Customer Advocates. We want the entire company to see what our customers are frustrated about, or, indeed, what they love. Because the comments can span feedback about support, the product, our policies, and so on, we sought ways to push survey feedback beyond the Zendesk instance so it's openly accessible to anyone.

Our first iteration of this was posting to the company-wide social network. This involved simply adding a target and building a set of triggers that push the message to that target whenever a ticket is rated either good or bad and the rating contains a comment. The trigger essentially pushes the comment text to the target, creating a public post with the ticket link so that our developers, product managers, and even our C-staff can view and comment on it. It has happened that a cross-functional team looped colleagues into a conversation on a satisfaction rating, and product changes were enacted as a result!
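As a sketch, that plumbing could look something like the following: an HTTP target pointing at the social network's inbound webhook, plus a trigger that fires on rated tickets with a comment. The webhook URL, message format, and exact condition values are assumptions rather than our production setup, and the placeholders should be checked against the current placeholder reference.

import requests

# Assumed subdomain and API token; replace with your own.
BASE = "https://yoursubdomain.zendesk.com/api/v2"
AUTH = ("admin@example.com/token", "YOUR_API_TOKEN")

# 1. An HTTP target pointing at the internal social network's inbound webhook.
target = {
    "target": {
        "type": "http_target",
        "title": "CSAT comments to company social feed",
        "target_url": "https://social.example.com/hooks/csat",  # assumed endpoint
        "method": "post",
        "content_type": "application/json",
    }
}
target_id = requests.post(f"{BASE}/targets.json", json=target, auth=AUTH).json()["target"]["id"]

# 2. A trigger that pushes the rating comment and a ticket link to the target
#    whenever a bad rating arrives with a comment. A sibling trigger using
#    "good_with_comment" handles the positive side.
trigger = {
    "trigger": {
        "title": "Post bad CSAT comments internally (sketch)",
        "conditions": {
            "all": [
                {"field": "satisfaction_score", "operator": "is", "value": "bad_with_comment"},
            ],
            "any": [],
        },
        "actions": [
            {
                "field": "notification_target",
                "value": [
                    str(target_id),
                    # Payload sent to the webhook; Zendesk renders the
                    # placeholders before posting.
                    '{"text": "Bad CSAT on {{ticket.link}}: {{satisfaction.current_comment}}"}',
                ],
            }
        ],
    }
}
requests.post(f"{BASE}/triggers.json", json=trigger, auth=AUTH).raise_for_status()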

Initially, we'd post all comments, good and bad, every single one of them. But this created a lot of noise: a whopping 94% of our customers are satisfied with our service and leave positive comments. While it's great to see the love, it also made our internal comms channels practically unusable. To limit the number of posts, we began to throttle the Good CSAT comments at 5 per day, while all Bad CSAT comments continue to be posted.

This still allowed us to uncover flaws and feedback and see some praise while keeping the noise to a minimum. However, the mechanism wasn't properly communicated to the support team, and it created a negative feeling that their achievements weren't recognized as much as the bad satisfaction ratings. It's good to remember that CSAT can be a very useful channel for feedback, but it's also a sensitive topic: your team members will often feel personally responsible for the ratings attributed to their tickets, so, again, it's extremely important to communicate how you use the comments at company level. You can also use Satisfaction Reasons to that end and show your support team the ratio of support-related bad satisfaction ratings to other reasons.

Understanding the reason behind bad satisfaction ratings

Zendesk is growing as a company and as a product, and we're moving ahead to meet the demands of the market. For us, as well as many of our customers, the simplicity of the CSAT survey wasn't enough, so we've introduced bad satisfaction reasons.

Instead of one simple question, there can be two: "How would you rate the support you received?" and "Any reason why you were left unsatisfied?"

An admin can then choose which reasons are visible to their customers, and can run reports, build business rules, or create views based on the reasons chosen. This allows us to better see when, for example, a Good or Bad rating was attributed to the support experience, offer more constructive feedback to advocates, and channel improvement ideas through to our Advocacy leadership and operations teams.

You can use the default reasons:

  • The issue took too long to resolve
  • The issue was not resolved
  • The agent's knowledge is unsatisfactory
  • The agent's attitude is unsatisfactory

You can also customize your own based on your business and the product/service you provide. For example, if your company offers delivery and returns of physical goods, you might want to include ‘Logistics’ as a possible reason for dissatisfaction. If you’re a SaaS provider, that won’t be necessary but you may wish to include options related to policies or ToS. Sometimes you’ll want to use different language – for example, at Zendesk we refer to our support team as Customer Advocates, so we customized the options that refer to agents. However, it’s important to check back on your CSAT comments in the “Other” category every now and again to make sure your selection of reasons accurately represents your customers’ concerns.
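For example, that periodic check could be a small script against the Satisfaction Ratings API that tallies the reasons behind recent bad ratings. The field names below, notably reason, reflect our reading of the API and should be verified against the current documentation.

import requests
from collections import Counter

# Assumed subdomain and API token; replace with your own.
BASE = "https://yoursubdomain.zendesk.com/api/v2"
AUTH = ("admin@example.com/token", "YOUR_API_TOKEN")

# Pull bad ratings and tally the reasons customers picked, to spot when the
# "Other" bucket starts to outgrow the predefined reasons.
reasons = Counter()
url = f"{BASE}/satisfaction_ratings.json?score=bad"
while url:
    page = requests.get(url, auth=AUTH).json()
    for rating in page["satisfaction_ratings"]:
        # "reason" holds the option the customer chose; it can be absent if
        # satisfaction reasons weren't enabled when the rating was left.
        reasons[rating.get("reason") or "No reason recorded"] += 1
    url = page.get("next_page")

for reason, count in reasons.most_common():
    print(f"{count:4d}  {reason}")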

Finally, since we support a multilingual customer base, we use dynamic content in our satisfaction reasons to display these options in the end user's default language.
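As a sketch, a satisfaction reason backed by dynamic content could be set up through the Dynamic Content API roughly like this. The item name, wording, and locale IDs are placeholders; only the overall shape of the request is meant to be illustrative.

import requests

# Assumed subdomain and API token; replace with your own.
BASE = "https://yoursubdomain.zendesk.com/api/v2"
AUTH = ("admin@example.com/token", "YOUR_API_TOKEN")

# A dynamic content item with one variant per supported language. Referencing
# {{dc.csat_reason_logistics}} in a satisfaction reason renders the variant
# that matches the end user's locale.
item = {
    "item": {
        "name": "csat_reason_logistics",  # placeholder name
        "default_locale_id": 1,           # assumed to be English (US)
        "variants": [
            {"locale_id": 1, "default": True,
             "content": "A problem with delivery or returns"},
            {"locale_id": 16, "default": False,  # assumed French locale ID
             "content": "Un probleme de livraison ou de retour"},
        ],
    }
}

resp = requests.post(f"{BASE}/dynamic_content/items.json", json=item, auth=AUTH)
resp.raise_for_status()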

Let us know how you use the Customer Satisfaction survey and satisfaction reasons. What are some of the considerations when setting it up? Do you have programs in place to surface survey feedback internally and make sure it reaches the right people?


Comments


Dan Ross

Community Moderator

At your Relate conference this year, Zendesk had a presentation on the value of a good survey. It covered concepts like using a 5-7 point question (strongly disagree, somewhat disagree, neutral, etc.) to capture how your customers truly feel, instead of just a simple binary choice, which can result in a hidden group of really-not-that-happy users. It would be great to be able to implement the best practices that Zendesk talked about at their own conference without having to resort to external services and the API...

Are there any plans to bring this kind of detail to Zendesk surveys? Or the ability to ask more than one question? For example, asking if the customer was happy with the agent and with the case resolution?

Thanks!

 



Hi Dan,

Thanks for reading - while we use our Relate platform to explore best practices and trends in the industry, we don't have immediate plans to enable more questions in the CSAT survey. Our Product Managers are always looking for feedback in our community to improve and evolve Zendesk features and I have passed your comment along, too!

Thanks again,

Anna




Dan Ross

Community Moderator

Hi Anna,

That's too bad there's a gap between the preaching of industry best practices and the implementation of them. 

Thank you all the same for making the time to respond. 

 

-Dan



@Dan

I was in the same session and asked a Zendesk member if they were planning to enable this feature in their own CSat surveys. I was given the impression that they were considering it but perhaps I understood incorrectly based on what Anna told you.
