It’s important to have good customer success tracking for your small business. Learn why we chose Help Scout and which KPIs to monitor.
—–
Amanda: Hey Nathan, how are ya?
Nathan: Pretty good. Ready for some podcasting.
Amanda: We’re going to talk about Help Scout today. How long have you been using Help Scout?
Nathan: A long time. I think 2010 or 2012.
Amanda: What made you choose Help Scout over some of the other customer management or customer communication tools?
Nathan: Sure. At that time, and probably still today, Zendesk was one of the largest and best-known applications in that space, and I never really liked it as a customer. I think Help Scout came out primarily focused on the customer experience and on making it as transparent as possible. To the customer submitting a support question, it's just an email; Help Scout hides a little metadata in there so it can reconcile things internally. But for the company doing the support, you still have the full experience: assigning tickets, writing internal notes, a pretty standard feature set that almost all of these tools have. I really liked how the customer experience is handled in Help Scout, because it just feels like you're sending an email back and forth in Gmail with a friend, without all that extra noise.
Amanda: The questions that I have for you today are really digging into how Help Scout is useful for you as a founder and leader. So you chose it because of the customer experience, but you obviously stuck with it as, you know, a multi company founder and a CEO. I want to dig a little bit more into that and understand how it’s most helpful.
Nathan: Yeah. In the early days I was responding to every ticket and really using all of Help Scout's features every day, and the product has moved forward a bit since then, obviously. But as I've stepped back from doing every single customer support ticket myself, the challenge has been keeping an eye on how support is going: making sure customers are still having a good experience, and being able to do that consistently with a good sense of what's going on. Particularly in the early days, but even now, the built-in reporting features in Help Scout were a little bit lacking. They're pretty good at the overall aggregate picture, but it's a little bit hard to find particularly problematic tickets or really keep an eye on how specific people are doing. So it's sometimes harder to get that information out of Help Scout.
Amanda: I know that you've solved some of those issues. How have you personally used your own knowledge to enhance Help Scout reporting?
Nathan: Having tickets tagged as tier one, tier two, tier three. That takes some systematic change: you have to figure out how you're going to categorize them, implement it in Help Scout as a custom field or a tag, and then apply it across all of your tickets.
Amanda: Are those being assigned by one person on the team? Or, as tickets come in and a support person is available to look at them, are they all collaborating, or individually deciding what tier to assign each ticket?
Nathan: Yeah, there’s different strategies, I think, for different companies. Like sometimes you might have so many tickets that you have like only tier three people who only respond to tier three tickets. And so in that case, you kind of have lower level people that are then escalating tickets to another level.
Amanda: You want to see that people are developing skills over time and are able to handle more complex tickets. So tell me a little bit about what you've noticed in the past, or ways that you use Help Scout reporting to encourage employee growth as well.
Nathan: Yeah, once we introduced those tiers, it really helped us make an onboarding plan for a new employee and kind of a longer-term career growth plan, right? In your first week on the job, you're only going to be answering tier one tickets. By 60 days in, you should be answering some of the tier two tickets, a little bit. And then further on, you should be starting to answer tier three and answering lots more of tier two. So it's a good way of identifying that, yes, they're making progress. They're learning about the product. They're able to answer some of these harder tickets. And then you're also balancing their volume against time, right?
On your first ticket, you're going to have to go read our own documentation for 45 minutes and ask people on Slack internally how to answer the question, so it's going to take you 45 minutes to send a simple reply to that message. Three months from now, you already know that information, so you can send it in five minutes.
So we also just look at the average number of replies over an amount of time. If they're full time at a standard 40 hours, you can just count the replies. If somebody clocks in and clocks out at various times, then you want to account for the time component as well.
You can't look at it too granularly, because there can be wild swings in tickets and how complex they are, and you don't want to incentivize somebody to just pick the easy ones. But over a broad average, say if you're doing a performance review a year into somebody working, averaged out across a quarter you should see it was taking them 40 minutes per reply, then 30, then 25. And maybe it's still 25, but now they're also answering much more complex tickets; you see their tiers going up as well. So those numbers can get skewed, but over longer timeframes they're really helpful for confirming that at least we're moving in the right direction, in volume, complexity, and speed.
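The 40 → 30 → 25 minutes-per-reply progression Nathan describes can be sketched as a simple calculation. This is an illustrative sketch, not anything Help Scout computes for you; the quarterly numbers are made up, and in practice you would pull reply counts from Help Scout's reports and hours from your time tracker.

```python
# Minutes per reply, averaged per quarter, normalized by tracked support hours.
# Field names and numbers here are illustrative, not Help Scout's API.

def minutes_per_reply(replies: int, support_hours: float) -> float:
    """Average minutes spent per reply across the tracked support time."""
    return (support_hours * 60) / replies

quarters = [
    {"q": "Q1", "replies": 120, "support_hours": 80},
    {"q": "Q2", "replies": 200, "support_hours": 100},
    {"q": "Q3", "replies": 288, "support_hours": 120},
]

# A falling trend suggests growing efficiency; check the tier mix too,
# so the drop isn't just from cherry-picking easy tickets.
trend = [round(minutes_per_reply(q["replies"], q["support_hours"]), 1) for q in quarters]
# trend is [40.0, 30.0, 25.0]
```

Watching this alongside the tier distribution is what keeps the metric honest: 25 minutes per reply on mostly tier three tickets means something very different from 25 minutes on tier one.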
Amanda: Yeah, I'm thinking about it. This is a very imperfect comparison, but it's almost like running, where you can train just to run a mile faster and faster, obviously training for speed, or you can keep a slower, more consistent pace but train for distance. And it sounds like what you're saying is you don't want to over-weight the importance of speed or over-weight the importance of distance; it has to be a healthy combination of both.
Nathan: And I guess, to torture the analogy further, you'd also train for incline or something, so you're going up a steeper and steeper hill.
Amanda: Yes.
Nathan: Distance, speed, and difficulty.
Amanda: Yes. Yeah, exactly. Thank you for joining me on that. You mentioned time tracking and kind of marrying all of those things. Do you have any examples of how that's actually helped you make a business decision?
Nathan: Sure. Yeah, it's definitely helped with onboarding and trialing new people working in support. And then, unfortunately, when you suspect that something's not working and somebody is underperforming, it can make the conversation a lot less awkward and vague. When the issue is clear and measurable, it helps you correct somebody and give them something to work towards. It's a less painful conversation.
And then you may not get there, but you'll find out faster, or your employee will have the information they need to make an adjustment and understand what to change to make it work better. So it's definitely helpful for that.
Amanda: When people are thinking about business dashboards and business reporting, so often the first things that come to mind are money and financials, or marketing and spend efficiency. And I think this is a really interesting way to broaden the scope and talk about business efficiency from the perspective of customer success. Especially if you're using a system like Help Scout, where the internal reporting maybe gives you some data, but not enough to really be useful. And that's why I think the way that you've set things up within NSquared is interesting. I wanted to ask a little bit more about what customer success metrics you're actually tracking. What numbers are you actually pulling out of Help Scout and looking at from a CEO, executive level on a weekly basis?
Nathan: Yeah. It's really evolved over time, but there are some high-level things like the happiness score that Help Scout generates, which is like a CSAT score for customer satisfaction. Then there's average response time and average time to resolution, so you can see how quickly we're responding in general and how long it's taking us to solve things. Because it's not good if we have a 10-minute response time but a 20-day resolution time; that means we're just sending bad responses back.
And if people are really unhappy, they can leave a bad rating, or if they're extremely happy, that comes through in those CSAT scores, though it's generally a pretty low percentage of people who rate at all. But those are the first three that we started with.
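Those three starter KPIs are straightforward to compute from an export of ticket data. A minimal sketch, assuming per-ticket response and resolution times plus an optional rating (the `Ticket` shape and rating labels are illustrative, not Help Scout's data model):

```python
from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class Ticket:
    first_response_hours: float
    resolution_hours: float
    rating: Optional[str]  # e.g. "great", "okay", "not good"; None if unrated

def support_kpis(tickets: list[Ticket]) -> dict:
    """Compute the three starter KPIs: happiness (CSAT), avg response, avg resolution."""
    rated = [t for t in tickets if t.rating is not None]
    return {
        # Share of submitted ratings that were positive. As noted above, most
        # customers never rate, so this is a small, self-selected sample.
        "csat_pct": 100 * sum(t.rating == "great" for t in rated) / len(rated) if rated else None,
        "avg_response_hours": mean(t.first_response_hours for t in tickets),
        "avg_resolution_hours": mean(t.resolution_hours for t in tickets),
    }
```

Comparing the response and resolution averages side by side is the point: a fast response average paired with a slow resolution average is the "bad responses" pattern Nathan warns about.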
So, Help Scout's really good at reporting the average, and a lot of the reports surface it in a lot of places. But there are other things that are important too, right? If the average resolution time is 12 hours, that's great.
But if that means some tickets are getting resolved in five minutes while others are actually taking a month, those can get hidden in that big average. Help Scout does have some pie-chart-style reporting that shows, over a timeframe, what percent was resolved within four hours, within 12, within a day. Really looking at that long tail of long-running tickets has been important. So Help Scout does surface it, but only in one of those drilled-in views, and that's really one of the key numbers that I like to watch.
It's not front and center in all of their reports; there's one report that offers it, and you have to be drilled in. And if you want to look at it across, say, five different people, you'd actually have to change your search five different times.
But I like looking at that as kind of a balancing metric to average response time. Overall, how are we doing? And then, what percentage of tickets are taking, say, three days to resolve? Because I want to keep an eye on that one as well, not just wash it out with low averages.
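That balancing metric is easy to sketch once you can export per-ticket resolution times. A rough illustration, with made-up numbers and the three-day threshold from the conversation as a default:

```python
def long_tail_pct(resolution_hours: list[float], threshold_hours: float = 72) -> float:
    """Percent of tickets taking longer than the threshold to resolve."""
    over = sum(h > threshold_hours for h in resolution_hours)
    return 100 * over / len(resolution_hours)

# Made-up resolution times, in hours: mostly fast, with a long tail.
times = [0.5, 1, 2, 4, 8, 12, 24, 36, 100, 200]

average = sum(times) / len(times)  # the headline number the reports emphasize
tail = long_tail_pct(times)        # the balancing number: 20.0 (% over 3 days)
```

Watching the tail percentage alongside the average is exactly the pairing described above: the average tells you how support feels for most customers, while the tail tells you how many are stuck long enough to be genuinely unhappy.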
Amanda: I wanted to ask a little bit, too, about the employee perspective and, as you mentioned earlier, a kind of career progression perspective.
Nathan: As employees have grown, our support team has also taken on a lot of things besides support. They write documentation, they might record videos or write blog posts, and they do a lot of QA and testing on our own products.
And so we've had to try to balance their support responsibilities, and that need for speed and quality around support, with fitting in those other tasks. So we do ask them to track their time on support versus non-support tasks like QA.
So we use the time tracking, and we actually pull in the Help Scout metrics alongside it, so we can see time allocated towards support. That way we have the right context for how things are going, right? If support is slowing down, we can look and say, oh, it's because they've been doing all this QA work and helping us track down bugs in the product, but their support quality and speed are actually right on par.
So in what could be a very messy situation, it's really helped keep things very clean.
Amanda: Awesome. Well, thank you. We're gonna have many more conversations about Help Scout, because I think it is such a powerful tool, and we've done some really cool consulting work on it for clients of ours. So I will bother you about Help Scout definitely more than once. I'm excited for this knowledge to be dropped on everybody who's ever had a question about Help Scout and the metrics and KPIs coming out of that platform.
Nathan: Yeah, we’ll be back. We can talk about Help Scout the rest of the day.