Community discussion: Quality Assurance - Reviewing Tickets

24 Comments

  • Javier Jauregui

    I would suggest downloading the "Hide Ticket Fields" app. This app allows you to hide certain fields from agents. (Instructions on the app can be found here: https://support.zendesk.com/hc/en-us/community/posts/203459056-Hiding-Ticket-Fields-from-Agent-Groups)

    Once this step is completed, you can add fields for each line item for your scorecard.

    Once the fields are created, they can be linked to the group of your choice. This way, ONLY those in the selected group can access these fields. 
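
    If your scorecard has many line items, one option is to create those fields in bulk through the Ticket Fields API rather than one at a time in the admin interface. Here's a minimal sketch in Python with the requests library (the subdomain, credentials, and item names are placeholders, not a real scorecard):

        import requests

        # Placeholders - replace with your own subdomain, login, and API token.
        SUBDOMAIN = "yourcompany"
        AUTH = ("admin@yourcompany.com/token", "YOUR_API_TOKEN")

        # One drop-down ("tagger") field per scorecard line item.
        scorecard_items = ["Greeting used", "Issue fully resolved", "Correct macro applied"]

        for item in scorecard_items:
            slug = item.lower().replace(" ", "_")
            payload = {
                "ticket_field": {
                    "type": "tagger",
                    "title": f"QA: {item}",
                    "custom_field_options": [
                        {"name": "Pass", "value": f"qa_{slug}_pass"},
                        {"name": "Fail", "value": f"qa_{slug}_fail"},
                    ],
                }
            }
            resp = requests.post(
                f"https://{SUBDOMAIN}.zendesk.com/api/v2/ticket_fields.json",
                json=payload,
                auth=AUTH,
            )
            resp.raise_for_status()
            print("Created field:", resp.json()["ticket_field"]["id"])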

    Using GoodData, you can create metrics to help measure your scores. Once you create the metrics, you can build reporting through Insights to help identify opportunities for coaching, catch trends, and recognize agents who rock!

    This was a challenge to put together, but definitely worth it.

    2
  • Art Rybalko

    An excellent topic!

    We used to review all tickets when we were using a different system (or at least we tried hard to review them all), but increasing volumes no longer permit this. Tickets were reviewed by senior support team members, a.k.a. managers.

    Having switched to Zendesk recently (~6 months) we are still not sure how to formally approach QA.

    I think the most important question to start with is what you want to improve by doing QA on tickets. It can be process improvements, product improvements, more training, etc. To me the most important aspect is agent training: when I used to review a lot of tickets by hand, I tried to make sure each time that the agent fully understood the question and replied with relevant information. If not, that was the signal to do more individual or team training on the subject.

    Right now satisfaction surveys don't provide enough insight; there is simply not enough volume. I also don't like segmenting tickets. Obviously we want VIP or high-priority tickets to get the best service, but they tend to anyway, since they are handled by more senior team members. The same goes for complex technical topics. If a ticket about a proverbial password reset was not handled properly, that is just as big a problem (and possibly a stronger sign that something needs to be done).

    For new agents' training we have what we call an evaluation week right before "graduation", when the managers (all of us) review all of the agent's tickets. I am leaning towards doing the same, in turn, for each agent who is already past training.

    Maybe we should also review how we solicit CSAT ratings. What would be a typical % of tickets surveyed and response rate?

    Peer review sounds promising on paper, but I'm not sure about proper positioning and motivation to make it effective.

    This area definitely needs more thought and brainstorming and I am a bit surprised that there is so little on the subject here in the community.

    1
  • Jessie Schutz

    In my experience across various companies and industries with large numbers of agents, the general process has been for a team lead or coach to pull a random selection of tickets or calls for each agent on a weekly basis. The team lead reviews these tickets or calls on their own, and then goes over them with the agent in a weekly one-on-one.

    I have often seen a checklist of quality requirements with a point value assigned to each item; missed points on the checklist are deducted from the total, and the final score places the interaction on a scale from Does Not Meet Expectations to Exceeds Expectations. CS reps need to maintain an average of X to be Meeting Expectations. In terms of actionable data, that could work for you.
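
    As a rough illustration of that scoring math, here is a minimal sketch in Python (the checklist items, point values, and rating bands are placeholders, not a standard):

        # Hypothetical QA checklist: each missed item deducts its points from 100.
        CHECKLIST = {
            "Used correct greeting": 5,
            "Answered the customer's actual question": 30,
            "Followed escalation procedure": 20,
            "Proper spelling and grammar": 10,
            "Set correct ticket fields": 15,
            "Closed with clear next steps": 20,
        }

        # Placeholder rating bands; tune the thresholds to your own expectations.
        BANDS = [
            (90, "Exceeds expectations"),
            (75, "Meets expectations"),
            (0, "Does not meet expectations"),
        ]

        def score_interaction(missed_items):
            """Return (score, rating) for one reviewed ticket or call."""
            score = 100 - sum(CHECKLIST[item] for item in missed_items)
            rating = next(label for threshold, label in BANDS if score >= threshold)
            return score, rating

        # Example: a ticket where the agent missed two checklist items.
        print(score_interaction(["Used correct greeting", "Closed with clear next steps"]))
        # -> (75, 'Meets expectations')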

    Over and above that, of course, customer complaints are always followed up on, and all CSAT surveys that come back with a Bad rating should be reviewed as well.

    Taking a cross section of tickets or calls from your agents should, over time, give you a good feel for the way each agent handles their business and help you identify potential patterns in their performance.

    Hope that helps!

    1
  • Ryan Bajnath

    This is a great question with many factors to consider before diving into one practice. Jessie has the right idea, but I think what Josh is asking for is a how-to.

     

    To consider before practicing:

    How large is your team or the team handling tickets?

    What is the volume of tickets you receive per day/week/month/quarter?

    What type of tickets do you receive? Sales oriented or Support oriented?

    How do you organize and track tickets?

    Do you have objective quality standards against which to set expectations?

    Note: QA standards of Sales and Support can be different.

     

    I'd be happy to share my process in a longer post. :)

    1
  • Colin Piper

    Josh, I use the customer satisfaction surveys as the starting block for quality assurance. Customers are good at telling you what is good or bad and these allow me to spot trends. For example, we were automatically closing tickets too quickly for our top customers. That was clear from their feedback so we took action.

    At the volume of tickets we handle, anything else would just be a random spot check.

    0
  • Jan Jambor

    That's a really good topic. I assume not many support departments review their tickets, so if you are thinking about doing it, that is already the first step in the right direction.

    There are of course many approaches if you are not able to review all tickets. You can segment customers and only review tickets from premium customers. Or you can use ticket priorities and only review the important ones. Or you can review only tickets which were open for a long time or had many agent touches.

    We currently use a mix of customer segmentation and ticket priority for the review. I do it every Friday.

    JJ

    0
  • Josh Babin

    Thanks for the responses! Looking for actionable data, like Colin found, is exactly what we're after.

    We thought the "bad" tickets from the customer survey were an obvious place to start as well. We are also going to start looking into other tickets based on tag types that mark complicated tickets.

    Who reviews tickets on your team? I think it would be best to have a review team that filters and forwards tickets on to managers.

    0
  • Colin Piper

    Art, as you say, it is hard to do good QA on large volumes of tickets. In addition to the CSAT I mentioned previously, I also try to review a couple of tickets live in my 1:1 sessions with my agents. I just pick randomly from tickets solved that week. I believe in the one-minute management style, so even on a quick look at the ticket there is always something to praise or critique. The agent always goes away with something from the session.

    0
  • Jennifer Rowe

    Ryan, thanks! We'd absolutely love for you to share your ticket QA process in a longer post.

    Here's a post that links to articles from others who have shared their processes/workflows (and you get swag for sharing :)

    https://support.zendesk.com/hc/communities/public/posts/203458416

     

     

    0
  • Ryan Bajnath

    Here's a link to the detailed post about my process, and I've included the same content here.

    Hopefully this information can be used, at least in part, by everyone. It's easy to recognize that each company and its approach to support, tickets, and customers is unique. For fairness, here's the approach I've used for the last 2 years, which has turned out to be quite progressive at a rapidly growing Marketplace/IT company.

     

    Document Everything!

    The easiest way to prepare for scaling, and to have objective standards for your customer support and training methods, is to simply document everything. Depending on your resources, shared documents (Google Docs/Sheets) and Zendesk (Knowledge Base/Insights) can work together like PB&J.

    Create a Customer Support manual. It should document your support/company culture, team member directory, agent workflows, training programs, escalation pathways, and other resources. A manual can be a centralized document on agent conduct. Through the lens of QA, this creates objective standards that are clearly visible to the agent, the QA team member, and the manager.

    I work with a 24/7 global team who mostly collaborate closely on Skype/HipChat. We get a high volume of tickets every day, and often have issues that remain unsolved. As you can imagine, there's plenty to talk about with backlog management, ticket prioritization, and, more to the point, QA.

     

    QA Philosophy

    QA, from a small startup all the way up to the enterprise level, should follow a few core philosophies:

    1. Don't micromanage an agent's handling of the ticket.
    2. Evaluate company/support culture against the ticket handling type.
    3. Demonstrate best practices.
    4. Make learning and training a team-building exercise.

     

    Micromanaging can lead to lower productivity; you're feeding into the pre-existing anxiety an agent has from knowing that their tickets are evaluated for QA. Of course, that depends on how you QA and how you empower your agents. In the best case, close review may be very useful to a developing agent, as long as it is meant only to correct subtle nuances.

    Getting back to the manual, you have to put out there what you expect from an agent: QA standards shouldn't come out of left field! If an agent knows what to expect and what they will be evaluated on, they're more willing to follow that course of action. QA standards are really up to you, but some of the basics are: answering the call of the question (did you answer what this person asked in the ticket?), productive/effective handling (did you resolve the issue with fewer iterations?), and clarity for the customer (about what they need from you, what was done, and how they should do something). How to do that comes later on.

    Demonstrating best practices and making training a team-building exercise is simple: if you have veteran team members who know your standards (the support manual) and know what to expect and the correct path (QA), empower that front-line staff to help train newer or less experienced agents. Because I document everything and make QA part of the support lifestyle, our newest customer support team members are more informed about the customer experience and best practices than most people in the rest of the organization. Remember, they're at the pulse of the customer and build solidarity through customer tickets.

     

    There is one question I ask every customer support agent whenever a "bad" QA ticket is pulled or an escalation is required, and it has truly helped build a better team, better QA results, and a stronger support mentality:

     

    If you were the CEO of this company, how would you handle this ticket?

     

     

    QA Workflow

    Because of our growing ticket volume and team, our department structure has evolved hand in hand with our QA structure. As our veteran agents turned into team leads and our team leads became managers, more opportunities for QA became possible. Our team leads are in charge of QAing a few of their team members' tickets at the start of their shift (we have views, triggers, and spreadsheets designated for each department/team structure).

    We have one QA specialist who reviews each team lead's QA per agent and assesses the agent's overall body of work to help provide a clearer picture to management and to that individual agent. This can include customer satisfaction score, the types of tickets they're handling, response time, agent efficiency, etc. (basically the stuff you see in Insights).

    How-to

    Views:

    Create QA-specific views that pull the tickets handled by (or assigned to) an agent or group over the last week (again, depending on your volume). These can be sorted by satisfaction score, by ticket tendency (we created a ticket field that summarizes the call of the question in a drop-down; we record that information every quarter and produce User Experience reports based on agent handling), or however you organize your tickets. Have team leads pull 5-10 tickets at random (or based on a corrective course of action from the QA specialist) and record them in your QA resources (a spreadsheet or another program).
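
    If you'd rather script that pull than click through a view, the same selection can be done against the Search API. A minimal sketch in Python with the requests library (the subdomain, credentials, and agent email are placeholders, and it assumes fewer than 100 matching tickets, so pagination is ignored):

        import random
        from datetime import date, timedelta

        import requests

        SUBDOMAIN = "yourcompany"                                    # placeholder
        AUTH = ("qa.lead@yourcompany.com/token", "YOUR_API_TOKEN")   # placeholder

        def sample_tickets(agent_email, sample_size=5, days=7):
            """Pull an agent's tickets solved in the last `days` days and pick a few at random."""
            since = (date.today() - timedelta(days=days)).isoformat()
            query = f"type:ticket status:solved assignee:{agent_email} solved>{since}"
            resp = requests.get(
                f"https://{SUBDOMAIN}.zendesk.com/api/v2/search.json",
                params={"query": query},
                auth=AUTH,
            )
            resp.raise_for_status()
            tickets = resp.json()["results"]
            picked = random.sample(tickets, min(sample_size, len(tickets)))
            return [f"https://{SUBDOMAIN}.zendesk.com/agent/tickets/{t['id']}" for t in picked]

        # Paste the resulting URLs straight into the QA spreadsheet.
        for url in sample_tickets("agent@yourcompany.com", sample_size=5):
            print(url)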

    Resources:

    We use Google Apps because it's free and helpful for global teams. We have QA spreadsheets set up quarterly. The team lead/manager has one for each department (not each team member) and adds the 5-10 tickets to this spreadsheet. It should include: the ticket URL, the QA evaluation (the call of the question, based on what the customer asked), whether the COTQ was answered, clear communication, productive handling, customer satisfaction, QA comments, and agent retrospect (to confirm that they reviewed it, plus their comments).
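
    If it helps to picture the layout, here is a minimal sketch of that sheet written out as a CSV (the column names mirror the list above; the example row is made up):

        import csv

        # Column layout mirroring the quarterly QA spreadsheet described above.
        COLUMNS = [
            "Ticket URL", "QA evaluation (COTQ)", "COTQ answered?",
            "Clear communication", "Productive handling",
            "Customer satisfaction", "QA comments", "Agent retrospect",
        ]

        rows = [{
            "Ticket URL": "https://yourcompany.zendesk.com/agent/tickets/12345",  # placeholder
            "QA evaluation (COTQ)": "Refund request for a duplicate charge",
            "COTQ answered?": "Yes",
            "Clear communication": "Good",
            "Productive handling": "OK - took three replies",
            "Customer satisfaction": "Good",
            "QA comments": "Could have linked the refund policy article.",
            "Agent retrospect": "Agreed; will use the macro next time.",
        }]

        with open("qa_review_q1.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=COLUMNS)
            writer.writeheader()
            writer.writerows(rows)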

    The QA specialist and the manager should also keep an individual shared document for each team member. This gives a progress report on that person and addresses how their tickets were handled during the week/month/quarter. The same standards should be applied (and consider a point system on good/OK/bad standards). Any immediate corrective course of action can be addressed at the weekly level, noticeable ticket trends can be seen at the monthly level, and serious regression (or progression!) can be shown at the quarterly level. This document should contain research from Insights; the built-in tools are helpful, but you can also create your own Insights-like dashboards.

    QA lifecycle: the agent handles the ticket, the team lead does the initial QA, the QA specialist evaluates the previous week of QA and creates a corrective course of action, the team lead/manager passes that information down to the agent, and the agent applies the best practices to the next ticket. Rinse and repeat, and over a monthly/quarterly analysis you can see that agent's progression trends from the individual QA document for management.

     

    Hope that helps! 

    0
  • Jennifer Rowe

    Hi Ryan,

    Nice! Thanks so much for sharing all the details about your process!

    0
  • Kristina King

    We're working on instituting a peer review process for tickets, and I'm struggling with how to get truly random tickets. We want to do 3-5 a month (based on an employee's tier) and want them to be random. For my first run-through I just picked one ticket from each page of closed tickets, but that's not truly random.

    Any ideas? Is there something we could create with Insights or GoodData that would produce a list of random tickets for each agent?

    0
  • Jessie Schutz

    Hi Kristina!

    There isn't really a way to randomly generate a list of tickets within Zendesk itself... you could set up a View or a report with various conditions to be met, but it wouldn't really be random, and you'd still have to select the individual tickets to review.
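
    That said, if you (or someone on your team) can use the Zendesk API, the random part can be scripted outside the product. Here is a minimal sketch in Python with the requests library (the view ID, subdomain, and credentials are placeholders, and it ignores pagination for views with more than 100 tickets):

        import random

        import requests

        SUBDOMAIN = "yourcompany"                                # placeholder
        AUTH = ("you@yourcompany.com/token", "YOUR_API_TOKEN")   # placeholder
        VIEW_ID = 123456                                         # placeholder: a per-agent view of recently closed tickets

        def random_tickets_from_view(view_id, how_many):
            """Fetch the tickets in a view and return a random selection for peer review."""
            resp = requests.get(
                f"https://{SUBDOMAIN}.zendesk.com/api/v2/views/{view_id}/tickets.json",
                auth=AUTH,
            )
            resp.raise_for_status()
            tickets = resp.json()["tickets"]
            return random.sample(tickets, min(how_many, len(tickets)))

        # e.g. 3-5 tickets per agent, depending on tier
        for ticket in random_tickets_from_view(VIEW_ID, 5):
            print(ticket["id"], ticket["subject"])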

    Maybe someone in the Community has some ideas!

    0
  • Jroberts

    Hello!

    I pull a sample of tickets sized according to ISO 2859, and each agent's QA score from the previous week helps me decide which inspection level to use for the current week.

    At this time, we are not using the CSAT score to pull tickets for QA, because we've noticed that if someone responds to a ticket after a CSAT score has already been provided, that score gets attached to the last individual who touched the ticket. So this is why we pull random tickets. I have a 'Quality Rubric' that I created in Excel, broken down into categories with points tied to each category. Once I have the score, I put it in my MTD tracker, as well as in the shared performance tracker from the associate's team lead.
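
    For illustration, that kind of week-to-week switching can be expressed as a small rule. Here is a simplified sketch in Python (the thresholds and sample sizes are placeholders, not actual ISO 2859 table values, which you would look up for your own lot size and AQL):

        # Placeholder inspection levels and sample sizes - NOT actual ISO 2859-1 values.
        SAMPLE_SIZES = {"reduced": 5, "normal": 8, "tightened": 13}

        def next_inspection_level(previous_level, previous_week_score):
            """Pick this week's inspection level from last week's QA score (placeholder thresholds)."""
            if previous_week_score >= 95:
                return "reduced" if previous_level in ("reduced", "normal") else "normal"
            if previous_week_score < 80:
                return "tightened"
            return "normal"

        level = next_inspection_level(previous_level="normal", previous_week_score=78)
        print(level, "-> review", SAMPLE_SIZES[level], "tickets this week")
        # tightened -> review 13 tickets this week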

    We know that as our business grows, so will the QA process. I'm currently on the lookout for quality management software that works with Zendesk. Any ideas would be great!

    Thanks!

    0
  • Jennifer Rowe

    Hi Jroberts, thanks for sharing! 

    0
  • Jroberts

    You're welcome! Currently I'm looking for a cloud-based solution that will work with Zendesk, so all of the quality data is housed in one location. I'm working in Excel, and as more QA specialists are hired, this solution isn't going to scale well.

    If anyone has any solutions - that would be amazing!

    0
  • Irina Chepushtanova

    Hello, everyone!

    Thank you for sharing all your thoughts and experience. It is very interesting, and I have actually found some useful things for myself.

    I am looking for a tool that would allow storing all the QA data and results in a convenient way and let me transform them into charts, so that I can view each agent's progress over a certain period.

    I would really appreciate it if you shared the tools you use within your QA workflow, as I am now creating the QA workflow for our team and there is so little information about it.

    Thanks for the attention!

    0
  • Oscar

    Hey Folks, 

    We're building a QA monitoring app for Zendesk. Basically you can use our platform to review your tickets and evaluate how your agents interact with your customers.

    We're looking for people interested in joining our limited beta testing program.

    If you are interested just drop me a line at oscar@playvox.com 

    More info: http://playvox.com/zendesk 

    Thanks

    0
  • Oscar

    Thanks for all your feedback during our BETA.

    I am pleased to announce that PlayVox is now available! 

    You can start a Free Trial here --> http://www.playvox.com/zendesk 

    NO more spreadsheets to manage your QA process :)

     

    0
  • Graeme Carmichael

    Oscar

    The link in your most recent post looks correct, but clicking gives an error. 

    0
  • Oscar

    Thanks :) 

    0
  • Vasu Prathipati

    Hi, I'm the CEO and co-founder of MaestroQA. I'm throwing our hat into the ring as another Zendesk-integrated QA solution to consider.

    1) Here's an article Zendesk wrote about how a Zendesk customer - Harry's - uses MaestroQA to drive success in their QA program.

    https://www.zendesk.com/blog/harrys/ 

    2) Here's a link to our Zendesk app in the marketplace:

    https://www.zendesk.com/apps/maestroqa/

    0
  • Jennifer Rowe

    Thanks for sharing your suggestion, Javier!

    Seeing this post again reminded me that our friends from BetterCloud posted their peer review system for tickets in this tip: How to improve feedback loops with automated peer reviews.

    It's worth checking out!

    0
  • Pedro Reis

    I'm looking for a solution for QA on tickets; Zendesk should include a feature for this. It doesn't make sense to have to pay for an app like MaestroQA or PlayVox to get that in Zendesk, as we already pay a subscription to Zendesk.

    0
