Knowledge Manager Roundtable: How do you measure self-service activity and knowledge base usefulness?




Jennifer Rowe

Zendesk Documentation Team

Posted Feb 12, 2018

The Knowledge Roundtable is a group of Zendesk customers who are responsible for creating and managing content at their companies. Each roundtable, they share their expertise on a specific topic.

The topic this time is: How do you measure self-service activity and knowledge base usefulness?

Meet the panel and read their advice below:

Be sure to add a comment to ask a question or share your ideas and workflow. And check out the complete list of KM Roundtable discussions.

Sherri Anderson

Knowledge Manager, Bold Commerce

What are you measuring?

  • Top search terms
  • Click-through rates on those search terms
  • Percentage of users that use site search
  • How many users refine their search
  • Refined search terms
  • How many users exit after search
  • Time spent after search
  • Average time spent on page
  • Pages per session
  • Average session duration
  • Bounce Rate
  • How many visitors use mobile and which devices
  • Visitors per page
  • Self Service Score
  • Customer Feedback
  • Heatmapping
  • Site recording

Why do you measure it?

The search statistics are all collected to analyze the effectiveness of search. The percentage of users who use search tells me how much time should be spent improving the search experience. These stats help me see what users are searching for and whether they are finding what they need in a timely manner:

  • Top search terms
  • Click-through rates on those search terms
  • How many users refine their search
  • Refined search terms
  • How many users exit after search
  • Time spent after search

With this information I can tweak the keywords on the page, the page titles, and the labels. You can see real-time search improvements when you make these adjustments and then search the keywords again the way a customer would.

These statistics are used to determine whether the content is easy to find and the articles are well cross-linked. Because of the nature of our site, I want these stats to be low, which would indicate that visitors found what they needed quickly without having to visit multiple pages:

  • Average time spent on page
  • Pages per session
  • Average session duration
  • Bounce Rate

Mobile usage and the device breakdown help me determine how much focus should be spent on the mobile experience and which devices I should be testing on.

Visitors per page is used to determine the top Help Center articles. I have created a portal on the landing pages in the Help Center that displays the top articles for each category; this helps our visitors find the most popular articles more quickly and supports cross-linking related articles.

The Self-Service Score is used to measure the overall effectiveness of the Help Center. I have been tracking this stat since I started here, so it's great to see how much the work has benefited the company. When I arrived our Self-Service Score was 2.32:1, and it's now 3.78:1 in less than a year (the industry standard is 4:1). The score compares the number of people who visit the site to the number who submit trouble tickets, so 1 in every 3.78 customers who visit our site submits a trouble ticket. That means the effectiveness of the Help Center has improved by more than 60% in less than a year, which in turn saves the company money: the Help Center does work that would otherwise require hiring additional Support staff.
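As a minimal sketch of the arithmetic (the visitor and ticket counts below are made up for illustration; only the ratios match the figures above), the Self-Service Score is simply visitors divided by tickets:

```python
def self_service_score(visitors: int, tickets: int) -> float:
    """Self-Service Score = Help Center visitors per trouble ticket submitted."""
    return visitors / tickets

# Illustrative counts chosen to reproduce the ratios in the text:
before = self_service_score(23_200, 10_000)   # 2.32 visitors per ticket
after = self_service_score(37_800, 10_000)    # 3.78 visitors per ticket
improvement = (after - before) / before       # ~0.63, i.e. a ~63% improvement
```

A higher score means a larger share of visitors resolve their issue without filing a ticket.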

Customer Feedback is used to get customer opinions on the Help Center.  I use this to tweak the content on the pages to try to make the instructions easier for customers to follow.

Heatmapping and site recording are done to determine how customers are using our site. This helps me gauge how effective the new features I have implemented are, and it sparks ideas for future features.

How do you measure it?

The analytics within Zendesk, Google Analytics, and Hotjar are the three services I use to measure self-service activity and knowledge base usefulness. I used a Zendesk article to learn how to calculate the Self-Service Score and to set up the stats I needed for it.

A Google Analytics and Help Center article series was also extremely helpful to me when coming up with my methods.

I use a service called Hotjar to create a Customer Feedback form, which I added to every page on the site. I quite often receive feedback about our products and customer service reps instead of what I am looking for, but I am working on getting more useful responses, and the feedback I do receive about the Help Center has been very valuable.

I have also set up heatmapping and site recording with Hotjar, which lets me see how visitors are using our site. It does this by recording visitor sessions, so you can view scrolling and mouse clicks in a session.

The heatmapping is a screenshot of the page with a color overlay, so you can see which areas are used most.

Ben Garris and Sarah Boyd

KCS Program Manager - Solve Loop and KCS Program Manager, Republic Wireless

At Republic Wireless, self-service is a major part of our support organization. We define self-service success as the rate that customers indicate they completed their desired transactions and/or found useful information. Self-service success does not imply that the customer’s need for live assistance was fully satisfied. In addition, we define self-solve success as the rate that self-service resources and/or content eliminate a customer’s need for live assistance. These measures are extremely difficult to quantify, but we believe there are certain activities and outcomes that define what success looks like.

Our recent adoption of KCS (Knowledge Centered Service) has helped to reshape our knowledge base. This combined with an active peer-to-peer community has further transformed our approach to self service.

Having the end in mind and setting attainable goals to work toward is critical to ensure the long term success of a system. We’ve built a foundation at Republic through KCS which has allowed us to implement a sustainable, and effective, content engine that focuses not only on content creation, but maintenance and evolution of the content as well. We follow the KCS v6 Practices and Adoption guides to drive our efforts.

One of the key goals we work toward is known as the 85/85 goal. The KCS Adoption Guide defines this as "at least 85% of the time customers are using self-service first and at least 85% of the time they are finding what they need". This can only be achieved with an extensive yet focused knowledge base that is both accurate and easy to use. It is directly related to another goal, known as the 90/0 goal, which states that "90% of what the support organization learns from resolving issues is on the web within 0 minutes of becoming known." In other words, anytime we learn something new, it's immediately made available through self-service without any time lapse. While that may seem impractical, when you build knowledge creation into your solution workflow without unnecessary approval systems and bottlenecks, as described in the KCS methodology, it's very feasible, once your system has matured.

To help us reach these goals, we measure various aspects of our program including:

  • Web to assisted support ratio - Ratio of web-based (knowledge base and community) sessions compared to tickets opened.

  • Participation rate - the percentage of assisted support sessions that result in a knowledge event (creation of new knowledge, or reuse of existing)

  • External publication percentage - percentage of knowledge base articles that are available externally
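As a rough sketch of how these three measures can be computed (the function names and inputs are illustrative, not from any Zendesk API):

```python
def web_to_assisted_ratio(web_sessions: int, tickets_opened: int) -> float:
    """Ratio of web-based (knowledge base + community) sessions to tickets opened."""
    return web_sessions / tickets_opened

def participation_rate(knowledge_events: int, assisted_sessions: int) -> float:
    """Fraction of assisted support sessions that produced a knowledge event
    (creation of new knowledge, or reuse of existing knowledge)."""
    return knowledge_events / assisted_sessions

def external_publication_pct(external_articles: int, total_articles: int) -> float:
    """Percentage of knowledge base articles available externally."""
    return 100 * external_articles / total_articles

# Illustrative numbers only:
ratio = web_to_assisted_ratio(85_000, 10_000)   # 8.5 web sessions per ticket
rate = participation_rate(750, 1_000)           # 0.75
pct = external_publication_pct(850, 1_000)      # 85.0
```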

We’ve come to the conclusion that a single measurement and/or percentage for “Self-service success” is next to impossible to calculate, and ineffective at describing the true effectiveness. Focusing instead on the insights that we’re able to gather about the health and effectiveness of our content, as well as the trends we’re able to monitor, has proven to be more beneficial than a single statistical number.

Rebecca McMurry

Documentation Manager, Zift Solutions

What are you measuring?
We are measuring number of visits and number of unique visits monthly, quarterly, and yearly.

Why do you measure it?
We feel it shows our customer engagement.

How do you measure it? 
We rely on Google Analytics.

 

What are you measuring?
New and returning visitors to the site

Why do you measure it?
I am interested to know if the content we provide appeals to both new and experienced users.

How do you measure it?
Google Analytics

 

What are you measuring?
This one is not really a measurement: search results.

Why do you measure it?
We use the search results as an indicator of topics that may need to be added to the site. It might also be an indication that we need to add a tag to an existing article.

How do you measure it?
Google Analytics

Andrei Kamarouski

Business Analytics Director and Co-founder, smm3

To measure self-service activity and knowledge base usefulness systematically, we collect relevant behavioral (what customers are doing) and motivational (what customers are feeling) data from four data sources:

  • Default reporting in Zendesk (= behavioral data)

  • Web analytics data in Google Analytics (= behavioral data)

  • Self-service online survey in Google Forms (= motivational data)

  • Help Center related tickets in Zendesk (= motivational data)

Default reporting in Zendesk

Like many Zendesk customers, we use data from the Knowledge Base dashboard in Reporting (views, votes, etc.). We have a dedicated Google spreadsheet where we collect and review the monthly dynamics of all metrics from this dashboard. In particular, we pay attention to the following metrics:

  • Top 10 articles by view volume and rank dynamics (this data helps us understand macro trends in support demand in our KB)

  • Top 10 KB search requests (this data helps us find gaps and prioritize KB content creation)

Web analytics data in Google Analytics

Web analytics data is very helpful for getting a more detailed view of KB usage and trends. We explore many reports there; several metrics feed our monthly internal KPI reporting, such as the following:

  • KB 'effective' sessions. A 'corrected' version of the default article-view count, used for calculating ticket deflection. Google Analytics shows that many views are 'empty': the session lasts less than 10 seconds or covers only one page. We therefore created a special segment to track sessions with at least 3 pages visited over at least 3 minutes, and count those as deflected tickets/calls.

  • 'KB search' and 'New ticket' goals. These goals help us understand how visitors use search and navigate through the KB, as well as when they create new tickets during that search activity.
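The 'effective' session segment above amounts to a simple filter. A minimal sketch (the `Session` shape is illustrative, not a Google Analytics export format):

```python
from dataclasses import dataclass

@dataclass
class Session:
    pages_viewed: int
    duration_seconds: float

def is_effective(s: Session, min_pages: int = 3, min_seconds: float = 180) -> bool:
    """An 'effective' KB session: at least 3 pages viewed over at least 3 minutes."""
    return s.pages_viewed >= min_pages and s.duration_seconds >= min_seconds

sessions = [Session(1, 8), Session(4, 240), Session(3, 180), Session(5, 60)]
deflected = sum(is_effective(s) for s in sessions)  # sessions counted as deflected tickets
```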

Self-service online survey in Google Forms

Visitors to our Help Center are asked to fill out a short post-visit online survey about the overall quality of the service they received (we use a top-bar notification for this). The survey includes the following questions:

  • How satisfied are you today with your Help Center experience?

  • What were you looking for?

  • Did you find what you were looking for?

  • Rate your: (a) search experience, (b) article completeness, (c) article readability, (d) ticket submission experience.

  • How likely is it that you would recommend our company to a friend or colleague? (that's the classic NPS question)

  • Your comments and suggestions for improving your Help Center experience (open text question)

Using this data we are able to better understand why people visit our Help Center and how successful they are there.

Interesting fact: very often people are satisfied with their Help Center experience but still wouldn't recommend the brand (they are Detractors in NPS terms). Knowing this global attitude toward the brand (the NPS rating) allows us to carefully evaluate survey results and the hidden motivations behind survey responses.
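For readers unfamiliar with the NPS classification mentioned above, a minimal sketch of the standard formula (not how any particular survey tool computes it): respondents scoring 9-10 are Promoters, 0-6 are Detractors, and the score is the percentage of Promoters minus the percentage of Detractors.

```python
def nps(scores) -> float:
    """Net Promoter Score from 0-10 'would you recommend' ratings:
    % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

nps([10, 9, 8, 6, 3])  # two promoters, two detractors -> 0.0
```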

Help Center related tickets in Zendesk

This is a comparatively rare source of information, but we receive several tickets about the HC experience each month. Very often they give us direct feedback on unexpected issues encountered while using the Help Center.

Collecting and merging information from these data sources gives us a nearly complete picture of our Help Center customer experience. We are very excited about Zendesk's latest improvements in this area, first of all the Knowledge Capture app and its related analytics, as well as Answer Bot and its analytics.

That’s all for today folks. Happy measuring!

 

 

Zac Garcia

Customer Care Operations, Fossil

We use the Web Widget to enable self service online. We measure the effectiveness of this widget in three ways using the built in reporting dashboards.

First, we measure search activity: how many searches are being executed on the widget, and what are the top search terms? Are we getting an acceptable number of clicks as a result? We might find that some top search terms get search results but a low click-through rate. If so, we'll reconfigure the articles to promote clicks where applicable (though there are some cases where our guests need to talk to a live individual, and we want to make sure those who do can skip right past the articles as needed).

Second, we look for searches with no results. We actually noticed users would type greetings into the search box, expecting it to immediately start a live chat conversation. While we are excited to connect with our customers, we also want to make sure the content is readily available to them. We weren’t sure every visitor realized what’s available at their fingertips, so we tailored some articles to those greeting phrases and used them to point site visitors back to the article search as a primary resource.

Finally, we measure inbound tickets via the Web Widget. How many tickets coming through the widget could have been deflected with better article targeting? We use a combination of Views and Insights reporting to sort through these tickets.

Our goal is to make sure the guests who need our help can reach us, but those who have a question we can answer proactively are able to serve themselves to find the fastest resolution, before ever having to reach out. This is one of the many steps we take to make sure we’re providing the best possible experience.

Mary Paez and Sahar Padesh

Knowledge Manager and Support Operations Manager, Veeva

We use Zendesk Knowledge Capture and Answer Bot metrics to capture KCS activities (adding KB articles to tickets, flagging articles for feedback, and creating new KB articles), and we look at Answer Bot deflections. We also send out a post-ticket survey using Survey Monkey to ask: in the last month, have you found a KB article that resolved a problem and prevented the opening of a ticket? We are trying to measure ticket deflections as well as we can.
 
I also measure the # of draft and # of published articles, and the # of articles that passed quality review, per month, per region, and per Support Agent. I create a graph of each month's results and QTR results to identify trends. I also use this information to calculate the average # of articles created per Agent and per Product.
 
We use Google Analytics to report the # of accesses per article, but it is difficult to get a report per Agent. We need a way to track how often an article owned by Agent A gets used in tickets or is accessed on the portal. This helps us understand whether the Agent is creating articles of value that deflect tickets and help customers.
 
We also track the # of tickets closed per Support Agent vs. the # of articles used in those tickets. We measure the % of tickets overall that are closed with KB articles. The idea here is to close the gap as much as possible.
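That percentage can be sketched as follows (the ticket records are an illustrative shape, not a real Zendesk export):

```python
def kb_close_rate(tickets) -> float:
    """% of tickets closed with at least one KB article attached.
    `tickets` is an illustrative list of dicts: {'agent': ..., 'kb_articles_used': ...}."""
    with_kb = sum(1 for t in tickets if t['kb_articles_used'] > 0)
    return 100 * with_kb / len(tickets)

tickets = [
    {'agent': 'A', 'kb_articles_used': 2},
    {'agent': 'A', 'kb_articles_used': 0},
    {'agent': 'B', 'kb_articles_used': 1},
    {'agent': 'B', 'kb_articles_used': 0},
]
kb_close_rate(tickets)  # 50.0
```

The same records could be grouped by agent to approximate the per-Agent view described above.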
 
Every QTR we reward our Agents with a KM/KCS Award based on these statistics. This encourages positive behavior in following Best Practices and processes. The award is calculated QTRLY using the following data:
  • # KB articles published
  • # KB article accesses on portal
  • # KB article usage in Answer Bot
  • # KB article usage in Knowledge Capture to close tickets
  • # support tutorial videos created
  • extra activities in KM & KCS including KCS certification, Coaching, or Knowledge Domain Expert
The issue with the metrics at this time is that it is difficult to get the data for all the KPIs we created, and we have to massage the data in a very manual process. A tool with drag-and-drop measures and dimensions, and the ability to drill up/down from geo region -> product team -> agent, would be great.
 

Melissa Burch

Online Support Manager, Zendesk

What are you measuring?

  • Increase HC interactions
  • Increase repeat users in HC
  • Deliver more resolutions via the web ticket form and Answer Bot
  • Reduce channel switches from HC to ticket
  • Maintain a 24-hour or less response time in Community
  • Increase KCS to 50%
  • Increase proactive interactions with Outbound

Why are you measuring it?

  • Keep content up to date and relevant with KCS
  • Drive community engagement
  • Improve user experience in Help Centers

How are you measuring it?

  • Native reporting
  • Data warehouse and custom reports
  • Installed Pendo on Help Centers
