How are you measuring Case Deflection?

Patrick Martin · Senior Director, Technical Support · Founding Analyst | Scholar
edited July 13 in Analytics

Hello TSIA Community!

I am curious to know more about how you are measuring case deflection and/or self-service success. We currently track two different measures:

1- % of customers who abandoned the case creation process after consulting a document

2- % of customers who did not log a case after consulting a Support Knowledge article, whether or not they started logging a case

Both measures sit at different ends of the spectrum. The first is in the 7%–10% range; the other is in the very high 90s. The first measure clearly indicates our case deflection once the customer reaches the case creation page (because we can safely state that the customer intended to log a case), but we know for sure that our real deflection rate is higher, because customers search our community and content before reaching the case creation page. They can find their answers/solutions before ever needing to log a case. That should also count as deflection, but it is not included in this measure.
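The two measures can be made concrete with a small sketch. This assumes session-level event logs; the `Session` fields and function names are illustrative, not an actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Session:
    viewed_article: bool     # consulted a KB article or community doc
    reached_case_page: bool  # landed on the case creation page
    logged_case: bool        # actually submitted a case

def measure_1(sessions):
    """% of sessions that reached the case creation page after viewing
    a document but abandoned without logging a case (clear intent)."""
    candidates = [s for s in sessions if s.viewed_article and s.reached_case_page]
    deflected = [s for s in candidates if not s.logged_case]
    return len(deflected) / len(candidates) if candidates else 0.0

def measure_2(sessions):
    """% of sessions that viewed an article and never logged a case,
    whether or not case creation was started (no intent signal)."""
    candidates = [s for s in sessions if s.viewed_article]
    deflected = [s for s in candidates if not s.logged_case]
    return len(deflected) / len(candidates) if candidates else 0.0
```

Because measure 1 conditions on reaching the case creation page while measure 2 counts every article view, the two denominators differ enormously, which is exactly why the reported rates land at opposite ends of the spectrum.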

The second measure is missing intent. Our objective here was to measure Support's contribution to self-service; however, we can't say with certainty that the customer's intent was to log a case in the first place.

That being said, I would like to know more about how you measure deflection, how you capture intent, and any other information you might want to share around this self-service measure.

Looking forward to the discussion.

Answers

  • David Baca · Director, Support Services Research · Member

    Hi Patrick - great topic and one that I know many of TSIA's members struggle with. To help move the conversation a bit further, I thought it might be helpful to share the definitions for Self-Service Success and Self-Service Deflection:

    Self-Service Success: the rate at which customers indicate they completed their desired self-service transaction and/or found useful information. Self-service success does not imply that the customer's need for live assistance was fully satisfied.

    Self-Service Deflection: the rate at which self-service resources and/or content eliminate a customer's need for live agent assistance.

    As you highlighted, a key to measuring deflection is determining intent. One common approach is click-stream analysis: determining the % of clicks that landed on the case creation page.

    What other deflection methods and ways of measuring Intent are out there?
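A minimal sketch of the click-stream approach David describes, assuming each session is just a list of visited URLs (the URL and function name are illustrative):

```python
def case_page_intent_rate(sessions, case_page="/support/case/new"):
    """Share of sessions whose click-stream includes the case creation
    page -- a rough proxy for sessions that carried case-logging intent."""
    if not sessions:
        return 0.0
    with_intent = sum(1 for clicks in sessions if case_page in clicks)
    return with_intent / len(sessions)
```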

  • Greg Higgins · VP, Professional Services & Support · Member | Enthusiast
    edited July 14

    This is great stuff, Patrick. At Splash, we currently have a blind spot in deflections so we simply look at average tickets per user over a period of time to inform any efficiency/quality gains from our process, people and product changes. We're currently working to measure deflections in Q3, hence my attraction to your post.

    I'm writing because I was recently reviewing Zendesk's content on deflection: https://www.zendesk.com/blog/ticket-deflection-currency-self-service/. I thought their Self-Service Score (total users of your help center(s) ÷ total users in tickets) was a simple and clever metric. You get your current ratio and then make changes to try to increase it; an increase indicates more engagement with (and value from) your help center relative to the users who go on to submit tickets.

    I'm looking forward to hearing more from the community. As I have success implementing, I will share my results. Let me know if I can help in any other way.
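The Zendesk ratio Greg cites is just a division of two unique-user counts over the same period; a trivial sketch, with illustrative numbers:

```python
def self_service_score(help_center_users, ticket_users):
    """Zendesk's Self-Service Score: unique help-center users divided by
    unique users who filed tickets over the same period."""
    if ticket_users == 0:
        raise ValueError("no ticket users in the period")
    return help_center_users / ticket_users

# e.g. 4,000 unique help-center users against 1,000 ticket users -> 4.0
```

A higher score means more users are engaging with self-service per user who still files a ticket; it says nothing about any individual user's intent, which is the limitation Patrick raises below.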

  • Patrick Martin · Senior Director, Technical Support · Founding Analyst | Scholar

    @David Baca Thanks for the reply. I understand that the easy way to capture intent is to look at visits to your case creation page. However, as I mentioned in my initial post, our community lets users search our entire community content from the get-go, which means we are certainly "deflecting" potential cases without capturing intent; that points to the self-service success measure rather than case deflection.

    What I am trying to achieve here is to identify intent (or at least infer it) using AI. If our AI solution can identify the user's intent/context and tailor the experience and content relevancy, we might get closer to a precise measurement of case deflection. This is where my thought process is headed: we now have the tools (with AI) to capture this information far more precisely than simply relying on who visits the case creation page. I am a strong believer that many customers find the information they are looking for before they ever reach the case creation page, which means we are not counting those wins in our ROI.

    In my particular situation, my case deflection is between 7% and 10%, while my self-service success is in the high 90s, so I would assume that if I can accurately detect intent through AI and report on it, I would have my true measure, or get close to it.

    My end goal here is to measure 2 things:

    1- Cost savings driven by self-service

    2- ROI on our Knowledge Management efforts, at least from a self-service perspective

    @Greg Higgins your formula from Zendesk is interesting; however, I still think it is a high-level measure that does not take intent into consideration. By capturing intent, we can better understand customer needs and tailor the experience accordingly. Then we can measure/identify which content is most effective in which scenarios. If you would like to discuss further, you can reach out to me directly.

  • David Perrault · VP Product Support and Customer Care · Founding Analyst | Scholar

    @Patrick Martin it is indeed much more practical to measure intent when customers are in the process of opening a ticket with Support, and the most effective way to do this is to track whether KB articles presented during ticket creation stopped the customer from opening a ticket within a set period of time (you need to leave the customer time to actually follow the steps in the article).

    I also think that when a customer browses your KB outside the process of opening a Support ticket and does not open a ticket within the same set period of time, you have to assume the customer would most likely have opened a Support ticket had they not found the information they were looking for, i.e. your customers are unlikely to browse your content just for fun.
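David's windowed check can be sketched as follows; this assumes you can join article-view timestamps to case-open timestamps per customer, and the window length and function names are illustrative:

```python
from datetime import datetime, timedelta

def was_deflected(article_views, case_times, window_days=7):
    """For each article-view timestamp, count it as a deflection only if
    no case was opened within `window_days` afterwards (leaving the
    customer time to actually follow the steps in the article)."""
    window = timedelta(days=window_days)
    results = []
    for viewed_at in article_views:
        opened_case = any(viewed_at <= t <= viewed_at + window for t in case_times)
        results.append(not opened_case)
    return results
```

The window length is a judgment call: too short and slow-to-apply fixes are miscounted as deflections; too long and unrelated later cases cancel genuine ones.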

  • Patrick Martin · Senior Director, Technical Support · Founding Analyst | Scholar

    @David Perrault yup, that's what we are doing right now, kind of measuring both. With our solution (Coveo), we are able to detect intent with AI, and I will be working with our product management team to see whether the detected intent can be made available in our reporting so we can see the difference between our numbers. Thanks for chiming in.

  • David Harbert · Head of Managed Services Digitisation · Founding Analyst | Scholar

    Hi All,

    Great question. This is an area we are starting to dive deeper into, but it is still early days in our thinking and planning. We have been working hard to improve our digital experience, which has been successful (based on customer feedback) but hard to measure.

    Some challenges we are struggling with are:

    • We have two halves of our business: Consumer and Enterprise. A single end user could be either, depending on the issue they are reaching out about. Our consumer side runs on a different set of systems than our enterprise side, and many enterprise customers buy and use consumer services, so providing a great experience is tricky; it's not cut and dried.
    • I like the idea of the Self-Service Score mentioned in @Greg Higgins' comment. I'm thinking about how I might be able to use it.
    • We are also looking at those same two end goals, @Patrick Martin. It is great when customers can self-serve; I'm also curious whether there is a measure for cases where the product/solution keeps improving so the root query/question never needs to be asked.
    • @David Perrault you raise an interesting point about a customer generally not browsing for fun.

    What technology(ies) are you using to share knowledge with customers? @Patrick Martin said they are using Coveo.

    Thanks for the insights.

  • Patrick Martin · Senior Director, Technical Support · Founding Analyst | Scholar

    @David Harbert We are using a Salesforce community with the Coveo for Salesforce package for our search, personalization, and recommendations, as well as our case creation page. With Coveo, content is tailored to your users and can come from different sources, so this might be an option to standardize your customer experience across consumer and enterprise. Let me know if you would like to know more about how we use it.

  • David Harbert · Head of Managed Services Digitisation · Founding Analyst | Scholar

    Thanks Patrick. I'd love to learn more about how you are using Coveo.

  • Jaime Farinos · Head of Services and Support - Chronicle · Founding Member | Scholar

    In addition to the measures mentioned, it is also worth tracking how case volume evolves over time (increase or decrease) compared to growth in the number of customers. With enough historical data, you can predict the number of cases you'd expect if no self-service measures were implemented. This method, however, would not help establish which case-deflection strategy is most effective, just the overall global impact.
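Jaime's baseline idea can be sketched with a simple cases-per-customer rate; the illustrative numbers and function names below are assumptions, and a real model would account for seasonality and product changes:

```python
def expected_cases(history, customers_now):
    """Project expected case volume from the historical cases-per-customer
    rate. `history` is a list of (customer_count, case_count) periods."""
    rate = sum(cases for _, cases in history) / sum(n for n, _ in history)
    return rate * customers_now

def deflection_impact(history, customers_now, actual_cases):
    """Fraction of expected cases that never arrived, i.e. the overall
    global impact of self-service, without attributing it to any one lever."""
    baseline = expected_cases(history, customers_now)
    return (baseline - actual_cases) / baseline

# e.g. a history of 0.5 cases/customer implies 1,000 expected cases for
# 2,000 customers; if only 800 arrive, the implied impact is 20%.
```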
