Zendesk on Zendesk: How we make the most of our NPS data

Zendesk on Zendesk is a day-long discussion about a specific topic and how Zendesk Support uses Zendesk. Each session is hosted by a member of our Support or Marketing team.

This session is about how Zendesk analyzes and follows up on NPS data. It covers:

  • Grouping customers according to NPS responses in Zendesk
  • Analyzing Detractor comments effectively to identify root causes of dissatisfaction 
  • Sharing NPS results with internal teams to encourage cross-departmental efforts 

This session is hosted by Lori Gauthier, PhD, Director of Marketing Research in our San Francisco office.

Overview of NPS

Net Promoter Score® is a tool Zendesk uses to proactively engage with all of our customers, not just those who contact us for support. We use our own native NPS product, which asks customers to rate how likely they are to recommend us to someone they know, and prompts them to share a bit more feedback about why they chose that rating.

You can learn more about NPS in general in our product documentation.

Organizing customers by NPS group

Once we've sent out the NPS survey and collected responses, we group customers' ratings according to NPS guidelines:

  • Ratings of 0 to 6 are known as Detractors. Scores in this range can indicate dissatisfaction with your company.
  • Ratings of 7 or 8 are known as Passives.
  • Ratings of 9 or 10 are known as Promoters.

The score itself is calculated by simply subtracting the percentage of Detractors from the percentage of Promoters. Passives are excluded.
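The grouping and the score calculation above can be sketched in a few lines of Python. This is a minimal illustration of the standard NPS arithmetic, not Zendesk's implementation; the function names are our own.

```python
def nps_segment(rating):
    """Classify a 0-10 rating into an NPS segment."""
    if rating <= 6:
        return "Detractor"
    if rating <= 8:
        return "Passive"
    return "Promoter"

def nps_score(ratings):
    """NPS = % Promoters - % Detractors; Passives are excluded from
    the subtraction but still count toward the total responses."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# 5 Promoters, 3 Passives, 2 Detractors out of 10 responses:
# 50% - 20% gives an NPS of 30
ratings = [10, 9, 9, 10, 9, 7, 8, 7, 3, 5]
print(nps_score(ratings))  # 30
```

Note that the score can range from -100 (all Detractors) to +100 (all Promoters).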


To display these groups in Zendesk, we have a customer list for each segment: Promoters, Passives, and Detractors. Customer lists, which are part of the same add-on that includes NPS surveys, are like views, but for customers instead of tickets. 

From the list, we have the option to engage with those customers through an email campaign using MailChimp or SurveyMonkey. For more details and instructions, see Creating a customer list of Promoters, Passives, or Detractors.

NPS-based customer lists also make it easy to review a list of customers who fall into each group and identify potential trends. For example, are there significant differences between small businesses, mid-market, and enterprise companies? Do certain industries tend to consistently have issues with the product? These insights can help us determine where to focus our energy to best serve our customers' needs.

Sharing data with internal teams

We share our quarterly NPS survey data internally with all employees so they have the opportunity to review and dig into the results. Sharing this across the entire company helps all roles develop more customer empathy and reflect on how they can individually improve the customer experience. Improving the customer experience is a team effort—from product to support to sales to marketing.

To share our NPS results, we export the data into a CSV file. For instructions, see Exporting NPS survey responses to a CSV file. Teams and individuals can review responses by organization, region, or plan type, as well as search the comments by keyword to drill down into specific product areas.
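Once the responses are exported, even a small script can handle the drill-down described above. The sketch below uses Python's standard csv module to pull Detractor comments matching a keyword; the column names ("Rating", "Comment") and the file name are assumptions for illustration, so check them against the header row of your actual export.

```python
import csv

def detractor_comments(path, keyword=None):
    """Yield Detractor rows (rating 0-6) from an exported NPS CSV,
    optionally keeping only comments that mention a keyword.

    NOTE: the column names "Rating" and "Comment" are assumptions;
    adjust them to match the header row of your own export.
    """
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["Rating"]) > 6:
                continue  # skip Passives and Promoters
            if keyword and keyword.lower() not in row["Comment"].lower():
                continue
            yield row

# Example usage (hypothetical file and columns):
# for row in detractor_comments("nps_export.csv", keyword="reporting"):
#     print(row["Comment"])
```

The same pattern extends to slicing by organization, region, or plan type, assuming those appear as columns in the export.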

Analyzing comments

In the past, we’ve relied quite heavily on text analysis software, but we’ve found that its algorithms frequently miss the real reasons why customers are unlikely to recommend Zendesk. These programs are great at synthesizing what survey respondents say, but fail miserably at discerning why they are saying it. Such insights often require a human touch.

As noted above, our NPS surveys include a follow-up free-form question asking customers to explain their rating. We've found that digging deep into these comments is a great way to understand the real reasons behind customers' frustrations.

Customers can sometimes misidentify exactly what's causing them difficulty. 
For example, our customers may comment that they’re not likely to recommend Zendesk because of an issue with the product. Yet when we review those same comments within a broader context, we often find the “product issues” can be remedied with advice, training, or plan updates. Some questions we ask include: 

  • Is the customer’s Zendesk properly configured?
  • Have they received the training they need to use the product effectively?
  • Have they outgrown their plan? Would another plan be more appropriate? 

Reviewing the comments is an important step that allows us not only to see what Detractors are saying, but to get to the why behind their comments.

Following up

To keep the conversation going after a customer has responded to the survey, we follow up on their feedback. First, it's important to let customers who've responded know how much we appreciate their time and honesty by thanking them for their feedback. A simple "Thanks for your valuable feedback" is usually a good start.

Then, using the lessons learned from analyzing Detractor comments as described above, we reach out to individual customers to help with identified pain points. This might involve advising a change in the customer's specific configuration, additional training, or a different Zendesk plan.

In addition to reaching out to individual customers, we take steps internally based on trends in the feedback we've seen. This involves all of our departments, not just customer-facing roles. Our customer support team does not own the customer experience alone; the overall customer experience is influenced by how we build our product, the conversations our sales team has, and how we present Zendesk on our website. Accordingly, our teams set up cross-departmental task forces to create action plans for current customer pain points.


So that’s the Zendesk approach to getting the most out of NPS data. How about you? What approaches have worked for your organization? Let us know in the comment section below!

**Net Promoter and NPS are registered trademarks, and Net Promoter Score and Net Promoter System are service marks, of Bain & Company, Inc., Satmetrix Systems, Inc. and Fred Reichheld.



    Thanks for sharing this, Lori!

    You mentioned that "These text analysis programs are great at synthesizing what survey responses say, but fail miserably at discerning why they are saying it."

    It sounds like you now do individual ticket analysis by hand to gain insight into the "why," but do you still use text analysis software earlier on in the process to sort through or categorize raw survey responses?



    Hi, Andrew! Thanks so much for your comment. :)

    We do still use text analytic tools to (1) understand the customer perspective and (2) identify comments that most typically require some degree of human interpretation. 
    For example, we might find from a text analysis program that 10% of customers' comments fall into a "feature gap" category. This first analytical pass helps us see that a notable proportion of customers think their pain points are driven by our features, or lack thereof.
    Then, when resources are available, we have Zendeskians take a second pass at comments in that "feature gap" category, because we know that that category tends to include comments where perception and reality are out of sync. 
    Now, notice I said "when resources are available." That second level of analysis can be time-consuming and, thus, quite costly. The resources aren't always available. 
    Fortunately, Zendesk has a whole team of engineers focused on developing machine learning tools for the customer support industry. My hope is that those same tools can one day be used to replace (or at least supplement) the human work currently required to get to the why behind the what of customers' NPS comments. 
    What about you? Are you running NPS surveys? Do you use text analytics tools (for NPS or any other surveys)? I'd love to hear about your experience. Please share! 
    And thanks again for the comment! 
    :) Lori  

    Thanks for the additional explanation, Lori. :)

    We do have NPS surveys in place—one that appears while students use our product and another that’s sent out once a student has taken their exam and finished using our services. 

    We’re small enough that we haven’t yet established an ongoing system for analyzing these comments. We’ll go through a batch from time to time, but always by hand. We haven’t used any text analytics tools so far.

    I’m actually working to systematize and hopefully at least partially automate our process for reviewing this kind of feedback. Can you share what specific tools you use at Zendesk? 

    Eep! I meant to respond so much sooner than today. So sorry for the delay, Andrew.
    Okay, what text analysis tools do we use for NPS data at Zendesk? We used Clarabridge for about 18 months and then trialed several other tools, including QDA Miner Lite, tm (for R), SPSS Text Analytics, and DiscoverText. Clarabridge performed slightly better than the other tools, but at a substantially higher cost. We haven't decided which, if any, of these tools we'll use going forward. 
    My hope is that we choose a different path altogether: developing a tool internally that can analyze customer comments within the context of customer account histories. I'm meeting with a wonderful team of engineers at the end of February to discuss this possibility. Fingers crossed! :)

    Hi Lori, you've probably come up with a solution already, but I just wanted to let you know that our company, Etuma, provides an automatic feedback text-categorization service to Zendesk users, among others (http://www.etuma.com/embedded-analytics). It's a simple, practical API service. Our commercial users are very happy with it. I'd love to have a quick talk with you if this issue is still open.


    I just want to commend you, Lori, for a great job on this post: a concise, clear explanation of how to make the most of NPS data. I also agree about doing follow-ups. They can be a huge help, giving you additional insight that the system alone cannot.
