Effective Metrics for Digital Self-Service Success: Share Your Insights!

Ryan HL Member Posts: 5 Seeker

Hey everyone! 👋

Like many of you, I’ve invested in software solutions with the promise of reducing support costs and enhancing customer self-service. But I’ve noticed that some of the strategies we use to measure the success of these tools could use a bit of a revamp.

Self-service isn’t a new concept in the customer experience world, but as we continue to evaluate the effectiveness of our tech stack, the need to prove the ROI of our digital strategies becomes increasingly important.

We’ve all seen the usual metrics: percentage of page views, number of accepted answers, in-app survey results, “helpful” article/post markings, and tracking unique user posts in the community versus tickets submitted, among others.
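Several of those usual metrics boil down to simple ratios. As a purely hypothetical illustration (the function name and the numbers are invented, not from any particular tool), a basic "deflection rate" comparing self-service activity against tickets submitted can be computed like this:

```python
# Hypothetical sketch of a "deflection rate": the share of support demand
# resolved via self-service instead of a ticket. Numbers are made up.

def deflection_rate(self_service_sessions: int, tickets_created: int) -> float:
    """Share of total support demand resolved without a ticket."""
    total_demand = self_service_sessions + tickets_created
    if total_demand == 0:
        return 0.0
    return self_service_sessions / total_demand

# e.g. 4,200 unique community/KB sessions vs 1,800 tickets in the same period
rate = deflection_rate(4200, 1800)
print(f"Deflection rate: {rate:.1%}")  # Deflection rate: 70.0%
```

The hard part, of course, is proving that those sessions would otherwise have become tickets, which is exactly the attribution question being asked here.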

I’m curious to know if any of you have found success in measuring and PROVING the effectiveness of your digital self-service recently? Any new metrics or strategies you’ve tried that you’d be willing to share?

Looking forward to hearing your thoughts and experiences. 💡

Comments

  • Brian O'Keeffe Member Posts: 200 Expert

    I have had mixed results. Here is what I started with, following exactly what the rest of the team, mid-market and enterprise, was measuring:

    Renewal rate

    Growth rate

    NPS

    For renewal and growth rate, we were able to show value over time. We met or exceeded all other sectors. It helped that these accounts were getting zip until we developed digital campaigns and other digital touchpoints.

    In addition, I added measuring the number of advocates enrolled. This is where we killed it. We had a much larger pool, and I worked closely with the advocacy team to find, register, and manage advocates. The hurdle was moving past the old practice of enrolling only customers with particular use cases, which is how the team operated before I came on board. I argued that we needed as many advocates as possible regardless of the products purchased or business use case. We built an army ready to go out there and advocate for us. Smaller customers can have an outsized voice; no one knows their size when they are giving quotes or speaking on a panel. We know their size and tend to focus on it way too much.

    Here is what worked less well: 

    NPS: I have yet to move the needle much. I do not have control over how and when we ask, and that, in my opinion, is the big mistake. We ask users the same questions as administrators, yet users consistently give us lower scores than administrators, who understand the business value. Users are focused on user experience and ding us a lot for things we have zero control over, like security settings.
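For anyone comparing notes on the user-versus-administrator gap, the standard NPS arithmetic is easy to reproduce. A minimal sketch (all scores below are invented purely for illustration):

```python
# Standard NPS: % promoters (scores 9-10) minus % detractors (scores 0-6),
# yielding a value from -100 to +100. All scores here are invented examples.

def nps(scores: list[int]) -> float:
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

admin_scores = [9, 10, 8, 9, 7, 10]  # administrators who see business value
user_scores = [6, 7, 8, 5, 9, 6]     # users judging the day-to-day experience

print(nps(admin_scores))  # 66.7
print(nps(user_scores))   # -33.3
```

Because detractors span 0-6 while promoters are only 9-10, a handful of frustrated users drags the blended score down fast, which is one argument for segmenting the two audiences before reporting.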

    Renewals: I partnered with sales and the renewal team to completely redo how and when we approach renewal messaging. The bean counters at the top see only a business relationship and treat the messaging as a contractual requirement. I saw it as a key touchpoint that lets us build on the relationship, an opportunity for our customers to express concerns or blockers and share their intent with us. I broke the renewal messaging down into specific categories, each signed by a real person who owned follow-up action items and direct responses. For some, it was sales who owned the relationship; for most, it was the renewal manager; and for others, it was the CSM. It all depended on a series of factors.

    I added a "How likely are you to renew?" question with a free-form text box to tell us why or why not. About 12% answered, which let us focus more efficiently on those who did not intend to renew or had blockers, while we thanked those who did intend to renew, sped up their renewal process, and offered them extended deals. Fairly quickly, we had millions in ARR identified as likely to renew or not.

    Of course, this was amazing data to have and act on, right? I thought so, and I was wrong. Despite working closely with the renewal team and sales, going over each piece, and getting partner input and approval, they had NO PLAN IN PLACE when we launched to use this data. I had to scramble, come up with a way to ensure every response was actioned, and take on a management role. The big mistake was not asking up front how the responses would be managed and incorporated into the SOP. I was crowing about our success but got blank stares from my partners and, more importantly, the C team, who did not really know what changes had been implemented.
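The missing piece described here, a plan to action every renewal-intent response, could look something like a simple routing table agreed on before launch. Everything in this sketch, the 1-5 scale, the thresholds, and the owner roles, is hypothetical, not how the team actually implemented it:

```python
# Hypothetical routing sketch: every "how likely are you to renew" response
# gets an explicit owner and next action before launch, so none go unhandled.
# The scale, thresholds, and owner roles are invented for illustration.

def route_response(likelihood: int, has_blocker: bool) -> tuple[str, str]:
    """Map a 1-5 renewal-likelihood answer to an (owner, action) pair."""
    if likelihood <= 2:
        return ("sales", "escalate: run save play for at-risk account")
    if has_blocker:
        return ("csm", "resolve blocker before the renewal date")
    if likelihood >= 4:
        return ("renewal manager", "thank, fast-track renewal, offer extended deal")
    return ("renewal manager", "standard follow-up")

print(route_response(5, False))  # renewal manager fast-tracks the renewal
print(route_response(2, True))   # sales escalates the at-risk account
```

Writing the table down forces the "who owns each response type" conversation to happen before the survey ships rather than after.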

    If I do it again, I will focus on redoing NPS and breaking it into two very different messages, one for users and one for administrators. For renewals, I would ask the leadership team to help me build a more effective program with built-in ownership of measuring the data, acting on it, and folding it into the standard renewal process.