The Zendesk Community sat down with Assunta Scala, our VoC Data & Customer Insights Manager, to talk about the possibilities of CSAT! We discussed how to better understand user satisfaction, determine where you stand with your customers, and identify critical areas of improvement you may not have on your radar. Here is our recording of that event, along with some questions from the community answered by our presenter.
Video Recording
Questions from the Q&A
Q: What can we do to get more open ended responses from our users?
A: You could make the comment box required or adjust its placement in the survey.
Q: What are the best CSAT questions and workflows?
A: Satisfaction should be the first question asked. This can be followed by CSAT questions evaluating specific elements of the experience, and the survey should end, typically, with a comment box.
Q: How do I increase the click through/response rate and what is the industry standard on wait time before sending the CSAT survey request for emails?
A: It depends on what you are measuring. For specific touchpoints, it is recommended to send the survey out as close to real time as possible so that the experience being evaluated is top of mind.
Q: How do I encourage customers to leave a comment and get more insight than the good vs bad feedback?
A: You could make the comment box required or adjust its placement in the survey.
Q: How would you best balance the number of comms to clients requesting feedback to avoid bothering them?
A: Love this question, and I totally agree that there needs to be a balance. You don't want customers to feel bombarded with feedback requests. Also, think about the bigger picture of your communications, and definitely have some rules in place.
Q: What is the best way to use CSAT with frequent customers?
A: Use CSAT as a way of learning more about your most frequent customers and of checking whether you are consistently delivering a good experience.
Q: When sending CSAT is there a preferred way to convey which option the end user should select?
A: Any method is valid. It is really about striking a balance of what is right for your company.
Q: We currently only serve internal employees. What are best practices for an internal facing system and how might they differ from an externally facing CSAT?
A: The same principles that work for external customers apply. Pulse surveys are becoming very popular in this space. Some things to think about: you should include enough questions to get the data you need, but not so many that employees won’t have the patience to complete the survey. Consider doing a monthly survey on a specific topic like communication, teamwork, or growth opportunities.
Q: What is the best timeframe for requesting CSAT?
A: CSAT measuring a specific touchpoint/interaction should be done as close to the experience as possible so that the experience is top of mind.
Q: Do you use CSAT and NPS? If so, what do you find to be the benefit of using both, and how do you differentiate between the two?
A: Yes. I believe both CSAT and NPS have merits, but they really do different jobs. While CSAT measures a customer’s satisfaction with a product or service, NPS measures customer loyalty. The metrics complement each other. Whether you measure CSAT or NPS or both, it’s what you do with those measures that really matters. Choosing CSAT vs NPS boils down to whether you’re evaluating customer happiness with your products and services in the short term, or trying to measure long-term brand loyalty. Oftentimes, companies will use both: NPS to keep an eye on the overall quality of the experience, and CSAT for more targeted service and product feedback.
Q: As a medium sized startup launching CSAT, we're planning not to send surveys to all our tickets. Is this biased, or best practice? What have you seen in your experience?
A: As long as you are taking a random sample and not self-selecting, this works.
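To illustrate the point about random sampling, here is a minimal Python sketch of selecting a fraction of solved tickets uniformly at random before sending surveys. The ticket IDs, sampling rate, and function name are illustrative assumptions, not part of any Zendesk API:

```python
import random

def select_tickets_for_survey(ticket_ids, sample_rate=0.25, seed=None):
    """Uniformly sample a fraction of solved tickets to receive a CSAT survey.

    Sampling at random (rather than letting agents or customers opt in)
    keeps the surveyed subset representative of all tickets.
    """
    rng = random.Random(seed)
    k = round(len(ticket_ids) * sample_rate)
    return rng.sample(ticket_ids, k)

# Example: survey roughly 25% of 1,000 solved tickets
tickets = list(range(1000))
surveyed = select_tickets_for_survey(tickets, sample_rate=0.25, seed=42)
print(len(surveyed))  # 250
```

The key design choice is that the sample is drawn by the system, not by whoever feels like responding, which is what avoids self-selection bias.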
Q: Any suggestions on capturing feedback that is service vs. product related?
A: Use questions that capture both the product and service experience. You may also want to be specific, measuring CSAT with the product and CSAT with the service provided separately.
Q: How do organizations separate the uncontrollable results from the controllable? In other words, what strategies do others implement to filter out or account for low CSAT results that are not relevant to the quality of support, but are more a reaction to a service or other failure?
A: Mine the data and create themes based on it. I would not filter out the data but rather use it to explain what a subset of customers thinks. If a subset of customers is driving down the CSAT, this should be part of the insights shared, e.g., a subset of customers are not happy with the product because they are tired of bugs and are driving down the CSAT score.
Q: Is there a difference between CSI and CSAT? If yes, what is it?
A: CSI is a weighted satisfaction measure typically used for benchmarking. It is based on multiple questions that together create an index, while CSAT is based on a single survey question. Both are valid ways of measuring customer satisfaction.
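To make the CSI vs CSAT distinction concrete, here is a hypothetical sketch of how a weighted index might be computed from several question scores, next to a single-question CSAT percentage. The question names, weights, and 1–5 scale are illustrative assumptions, not a standard:

```python
def csi(scores, weights):
    """Customer Satisfaction Index: a weighted average of several
    question scores, scaled to 0-100 (assuming a 1-5 answer scale)."""
    total_weight = sum(weights.values())
    weighted = sum(scores[q] * w for q, w in weights.items())
    return (weighted / total_weight) / 5 * 100

def csat(responses):
    """Single-question CSAT: percentage of 'satisfied' responses,
    conventionally the 4s and 5s on a 1-5 scale."""
    return 100 * sum(r >= 4 for r in responses) / len(responses)

# Hypothetical averages per question, weighted by business priority
scores = {"support": 4.2, "product": 3.8, "pricing": 3.5}
weights = {"support": 0.5, "product": 0.3, "pricing": 0.2}
print(round(csi(scores, weights), 1))  # 78.8

# Single-question CSAT from five individual responses
print(csat([5, 4, 3, 5, 2]))  # 60.0
```

The index blends several dimensions into one benchmarkable number, whereas CSAT stays a simple percentage tied to one question.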
Q: Any recommendation for following up with customers on low CSAT?
A: Yes! Definitely follow up. You should ask these customers if they want a follow-up, and if they say yes, contact them.
Q: What are the best practices to increase the number of responses? We have small, specialized teams that individually may only get a dozen responses in a month.
A: Are you sending the survey after all customer interactions? If not, do that. If you are a smaller team, the response volume may simply reflect your size. You can still use the feedback to directionally track and understand CSAT.
Q: How can we encourage more customers to use the survey?
A: Do you have an incentive? Incentives are a great way to increase survey participation.