
Live Chat with Jen Weaver
Customer Support QA and Training: I deliver bite-sized episodes every two weeks that each dive into a single solution, showcasing the tech stacks and workflows top support pros use to tackle challenges and optimize their customer and team experience.
Prevention-First QA to Rocket Your Team to 97-Plus-Percent CSAT | Episode 19
If your QA process is just a checklist, you are wasting time, money, and customer trust.
Today, I’m talking with Chloé Koers-Bourrat, Process and Productivity Manager, Support Operations at Smartly, who built a prevention-first QA program that turned a fast-growing ad tech team into a 97 percent-plus CSAT powerhouse.
Chloé’s approach is simple but powerful. She focuses on catching problems before they ever reach the customer, creating a strong feedback loop between support and product, and keeping every interaction human, even when AI is involved.
In this episode, you will learn:
- Why starting every review with negative CSAT is the fastest way to improve
- How 10 to 20 thoughtful reviews a week can outperform 100 rushed ones
- The peer review method that makes feedback easier to give and receive
- The visual aid change that cut handling time and boosted clarity
- How to route escalations so engineers only see what matters most
Whether you are building QA from scratch or improving an existing program, this episode gives you a proven checklist you can start using this week.
For more resources related to today’s episode:
📩 Get weekly tactical CX and Support Ops tips → https://live-chat-with-jen.beehiiv.com/
▶ Keep listening → https://www.buzzsprout.com/2433498
💼 Connect with your host, Jen Weaver, on LinkedIn
🤝 Connect with Chloé Koers-Bourrat on LinkedIn
🔧 Learn more about our sponsor Supportman → https://supportman.io
Episode Time Stamps:
0:00 – Intro: Why prevention beats apologies in support
1:15 – Meet Chloé Koers-Bourrat: From CSM to Support Ops
3:40 – A week in the life of a QA-focused support ops lead
6:10 – Fixing broken escalation processes with JIRA
9:20 – Launching QA: From spreadsheets to Klaus
12:05 – Raising CSAT with visual aids & tone improvements
15:00 – Klaus acquisition by Zendesk & shifting strategy
17:30 – Rethinking QA: Fewer reviews, more depth
20:40 – Building feedback loops with product teams
24:10 – Peer-to-peer QA and mentorship in action
28:00 – Key lessons: Prevent issues, keep the human touch, and “QA to save the day”
It's cheaper to prevent issues than to apologize for them. So if we're able to actually review ahead of time everything that is currently wrong with our support process, we can work on that, we can improve it. Maybe it will cost some time and some money to actually do that, but it will save us a lot in the future.
Speaker 2:Welcome back to Live Chat with Jen Weaver. I'm so glad you're here. Today I'm sitting down with Chloé Koers-Bourrat, the mastermind who turned a fast-growing ad tech support team into a 97-plus CSAT powerhouse. We'll unpack how her tiny QA squad reviews just a handful of chats each week yet drives product change, why QA to save the day is more than just her motto, and how she keeps a very human touch, even while letting AI handle the heavy lifting. We also dig into what it was like for her team when the support QA tool, Klaus, eventually stopped working for them after it was acquired by Zendesk.
Speaker 2:That's something I've wondered about for a really long time. So whether you're launching quality from scratch or you're leveling up an existing program, I hope this podcast is full of practical ideas that you can actually steal and use. Before we get started, though: our QA tool, Supportman, is what makes this podcast possible. So if you're listening to this podcast, head over to the YouTube link in the show notes to get a glimpse. Supportman sends real-time QA from Intercom to Slack, with daily threads, weekly charts, and done-for-you AI-powered conversation evaluations. It makes it so much easier to QA Intercom conversations right where your team is already spending their day: in Slack. All right, on to today's episode. We're going to talk about QA, but first, I heard that you have a really unconventional path to support. Can you tell us a little bit more about that?
Speaker 1:Yeah, so thank you for having me. I started as a customer success manager, and after almost five years overall in customer success roles, we had this amazing value that everybody does support. So as a customer success manager, I would have to be in support, handling customer chats, and I kind of fell in love with it. I really enjoyed solving customers' issues and understanding the root cause, so that it wouldn't happen again. At the time, the support operations team lead was a French colleague. I talked to him almost every week about different support topics, and he said, I really want to have you on my team, and I said, well, then I'm going to join. So here I am: joined right after COVID, and four years later I'm here and enjoying it very, very much.
Speaker 2:And so your role is support operations, is that right?
Speaker 1:So support operations is a team that I'm part of, and I'm mainly focusing on QA tasks and then internal processes between the support operations and all the other teams within our company.
Speaker 2:That's great. As you know, we're huge proponents of QA because that's what SupportMan, our tool, does, and so I love that we get to dig into that further. I just recently did a talk at the Support Driven Expo about QA, so this is following on that really really nicely. But before we get into your work with quality, I would love to hear about what does a week in your life look like? What's your typical week.
Speaker 1:I would say the number one focus, other than actually QA, is being in support. So we still have support shifts. I am still in support handling chats, facing customers, and working on our internal processes through the support chat itself. I have three shifts of four hours every week that I need to be in, handling our customers' cases. During those support shifts I usually also train new joiners and help them get onboarded within the support role overall and handle the chats. So that takes, let's say, three mornings out of my week, and outside of that I'm focusing on QA, reviewing specific chats, and I'm actually right now working on an AI project for QA.
Speaker 1:So that's a big project that is taking a lot of my time. In terms of meetings, I have a lot of meetings with product managers, but also engineering leads, to make sure that the support process overall is good for them: how we are escalating cases, how we're sending feedback, what can be improved, and specific topics that we need to focus on, or roadmaps, to understand exactly what is about to come for our customers. So we can also be prepared within our team to train our support agents or to get all the documentation ready. And then internal meetings as well with the rest of the team, because we are a global team. We have people based in, well, I can give you all the locations, but we have Manila, Singapore, Dubai, Helsinki, Berlin, I'm based in Madrid, and then we have New York, Chicago, Guatemala, and we also have somebody in San Francisco.
Speaker 1:So we're like everywhere.
Speaker 2:For sure.
Speaker 1:Which helps us cover 24-7 support. But yeah, I'm juggling different working hours so I can talk to everybody and make the most out of it.
Speaker 2:You mentioned to me that QA helps you identify broken processes. That might be a good place to start.
Speaker 1:I would say one of the main processes I actually started working on when I joined the support operations team was the technical escalation. Because, as I said before, we had this value in the company that everybody does support, we would have engineers sitting in support with us and just handling some chats. The problem is that within our platform we have so many features and so many parts of the tool that not all engineers are familiar with everything. They each have their special feature that they're responsible for. So we'd have a chat coming in about, let's say, reporting, and the engineer sitting in support would be on the creative side, so they wouldn't even know how to answer those requests, and the escalation wasn't very quick and efficient. So the first thing that we did was actually take engineers out of support and see how we could reach out to each individual team better, so that the escalation process would be a bit more seamless. We started with what we call the support tickets, and we integrated the JIRA process within our escalation process. We had it already for our bug reports, but now it was the support tickets, coming directly from support escalations. We revamped all of this so that whenever engineers get a ping that a support ticket has been created for their specific feature, they know that it's for them. That was time-saving for engineers, not having to be in support, so it was a huge saving of money, but also we were able to better track exactly the issue that was actually happening.
Speaker 1:So our engineers are only located in the EMEA working time zone, mainly in Helsinki and Berlin. But because our support is 24-7, our teams in the US or in APAC have technical issues that they need to escalate, and whenever we did that before, we didn't always have engineers to actually help us, because they were not always in support. So now we are able to just create a support ticket, and whenever they get online the next working day, they just go into their support ticket board and check exactly what was created during the night, and they're able to work on the investigation and the issue that was reported. And whenever they leave comments, it just gets back to the Americas, so we can also do a follow-up from the EMEA time zone. That way we really have real-time tracking, and task tracking as well, for them and for our teams, to make sure we're giving the customers all the information based on the escalation that has been done.
Speaker 2:That sounds like a really great system, and I wonder if the engineers are almost relieved that they don't have to be in support anymore.
Speaker 1:Oh yeah. Every time we have new engineers joining, we tell them a little bit about support, how it works, and also how the engineering organization is involved. And one question that comes back, I would say, in every session is: do I need to be talking to customers and be customer-facing? Because I don't want to. I'm like, no worries, it's not going to happen, we're going to manage that for you. So I don't mind handling the communication towards our customers and letting the engineers just solve the issues, so they don't happen again.
Speaker 2:Yeah, as a support person, I find that really validating that customer support is a skill that not everyone has or wants to develop, and as support people, we've maybe been developing it for years and sometimes it feels like it's not even a thing, but it really is a thing that sometimes other people are even afraid to deal with. Oh yeah, yeah. So I know you have a lot of other cross-functional wins from your QA system, but I wonder if first we can go back and talk a little bit about the history of QA on your team.
Speaker 1:The support organization overall has existed since the company was founded 12 years ago. When we started QA, we had 20, 22 agents, something like that. Yeah, now we're up to 37, 48. We started experimenting with QA in 2021, but based on the scale that we were at, the size of our team, and the number of customers that we had, from a leadership perspective it was still not the right time to really go into QA. We were still reviewing from time to time, but not very thoroughly and not with a specific, detailed process and categories. And then in 2022, my current team lead told me, we need to scale this, because we have been static for a long time in terms of CSAT and we have never gone above 97. We're always around 96.5, 97, but never going above. And we were starting to see as well, from leadership, that it was better if we could actually increase that a little bit more towards 98 than remain at 97. So we started prospecting QA platforms to see how we could do it, because in general we were dealing with spreadsheets. We started using Klaus at the time, before it was acquired by Zendesk, and we were reviewing between 100 and 120 chats per week.
Speaker 1:It was very interesting at first, because when we were able to pick chats based on specific characteristics that we were filtering on in the Klaus platform, we really started to see some patterns that we could improve very easily. Like, for example, what we call visual aids. Whenever we're passing information to our customers on a step-by-step basis, like what you need to do, we were sometimes forgetting to include a screenshot of where they could find a specific button, or a screen recording, small GIFs, something like a one-, two-, three-second snippet. It doesn't have to be too long, just something to send them so that we could actually illustrate what we're telling them. And by just doing that, we started seeing customers were a little bit more responsive. Especially because we had a few people in the team that got the bad habit of just sending a link to the knowledge base: you want information, yeah, you can find it in here. And then we actually had the realization: if customers are reaching out to us through the support chat, it's because they want to talk to somebody. They don't want to go to a knowledge base article and just read something that they could find there, obviously, but nobody's telling them exactly what to do or where to do it.
Speaker 1:So we started changing things like, for example, the visual aids and the step-by-step guides that we were sending, also reviewing the welcoming messages to have something a bit standardized, and not have something super friendly or overly strict, like very cold. Because we have a live chat with our customers, we still wanted to be friendly, but not too friendly, not going like, yo, you know. We wanted to be just like: we're here to help, we're like an extension of your team, and we're here to guide you to how you can solve the issue and help you out through the process. So the tone and welcoming messages were also something that we worked out very easily. But I would say the biggest change was the visual aid part.
Speaker 1:Definitely.
Speaker 2:How did you identify that problem?
Speaker 1:At first, I started obviously picking the chats that got negative ratings, to understand where the frustration from the customers was coming from: whether it was about the support that we provided our customers, or if it was more something about the answers that we gave them, or a solution that was not correct, something like that. And we started seeing that actually the solutions that we were providing were correct, quite on point, but the way we were communicating was very cold, just like, do this, do that, and not always showing them where they have to go to actually solve it.
Speaker 2:Yeah, so you started out without a particular tool, and you were using kind of your common sense to get some big wins. Did you see that CSAT change? You said it was like 96, 97.
Speaker 1:In six months' time after implementing Klaus, we already saw an increase, even if it was just a few tenths of a percentage point. On a monthly basis we were never below 97.2, 97.3, which was already good. You see, just like 0.2, 0.3 points, not that much, but at least we were making a change. And now, for the past two years, we've never gone below 97.5. Or, if we have, it was specific months where we had a lot of new integrations and releases coming from external APIs, so we had a lot of frustration coming from the customers. But we have also had peaks at 98.9, which is just crazy good.
Speaker 1:We were never expecting that, but year to date we have been at 97.6 and we haven't gone down. It's only increasing.
Speaker 2:You mentioned Klaus and how wonderful that was for your team. I also really loved using Klaus. Can you tell us a little bit more about how long you used it and what that was like?
Speaker 1:Yeah, so we had, and I have to be very honest, I don't want to call them out publicly, but we had an amazing onboarding team. When we joined Klaus, I had two people, one based in Malaga, in Spain, and the other one in Amsterdam, and we had weekly calls. It was amazing. They really guided me through the platform, they helped me create all the different categories and scorecards that we had, and gave me so many ideas on what we could review and how we could really use the tool and get the most out of it. So our onboarding with Klaus was just amazing, and we had, I would say, one really good year.
Speaker 1:And then, unfortunately, and maybe it's only my point of view, but Klaus was acquired by Zendesk, and the follow-up and the way that we were treated, if I can say, not that it was negative, but the way we, our company, were handled from then on was not the same, and the experience was a bit different. But also the tool started to lack a few things that we were very interested in, a few features starting to go away, which was not optimal for us. So this year we decided, we're still in contract, but we decided that we're going to shut down the QA program with them, because we're internalizing all of it, and because internally we're pushing for a lot of AI, we're going to see if there's a way that we can use AI to automate our QA process.
Speaker 2:So how does your QA process work?
Speaker 1:When we were using Klaus, we were reviewing between 100 and 120 chats per week, so on a monthly basis, roughly 400 chats, more or less. And obviously that rate went down when we stopped using the platform. What we prioritize mainly is obviously the negative reviews that we're getting from customers. So I'm very lucky, to be honest, in that whenever I get a negative rating and it's during the EMEA working time zone, I can just jump in the support chat and talk to the customer: hey, I'm part of the QA team, I reviewed the chat, and I just tell them, okay, I see my colleague helped you with this. I actually had a case like this this morning, so it's very fresh. I just went over the chat and said, okay, I see that Michael actually offered you a solution and was very thorough in explaining it, so I'm trying to understand why it was a neutral rating, three out of five. I'm trying to understand what we could do better. And the customer just told me: it's not about the person who helped me in support, but more about the platform, because it's a limitation on your end. Like, good job for the support agent, but it's definitely something we need to improve on our end. So that was also an opportunity to pass on feedback to our product team.
Speaker 1:But going back to the original topic of how we choose chats: negative ratings are the first chats that we're going to review. And then what I'm trying to do personally right now is pick some of our recent joiners in the team and just go over their chats, to see if there's a way we could standardize a few things. So reviewing, for example, the visual aids, the tone, the communication whenever they're escalating things to engineers, what they're doing, whether they're missing information, and things like that. I'm picking those chats first, and I'm not reviewing more than 10, maybe 20 chats per week. So the drop is huge, and we're aware of this.
Speaker 1:But we're really focusing on quality of chats instead of quantity. And whenever we review a chat, we try to be as fair as possible, and the feedback that we pass is constructive and actionable. It's just like, I'm going to grab you by the hand, you're going to sit with me, and we're going to go over the chat together and see if there's something we can improve, so that next time we have a similar case it doesn't happen again. But really coming from a constructive place, and not blaming, like, hey, you did this badly. Not intentional at all.
Speaker 2:So you mentioned your product feedback loop and actually that it's helping you to uncover product issues during QA, like that negative CSAT. Do you have tips for other teams on how to create that feedback loop, how support can work with product teams?
Speaker 1:Something that has been a bit difficult for us to do was actually to really create a good relationship with the product team, because the product managers are actually still doing support nowadays. They're still in support, so they can also see the platform and what customers are telling them about it whenever they're in support. But because the product team has always worked very closely with engineers, which is expected, from the support side we've never reached out and said, hey, maybe we can also help. Whenever we needed to file feedback, we just filed it and forgot about it: hey, we tell the customer we filed the feedback to our product team, they'll be in touch, and that was it. It was actually raised in a conversation a few months ago by some of our customers: I gave feedback about this a few weeks ago. Do you know if it was taken into account? Do you know if the product manager read it?
Speaker 2:Customers want to know.
Speaker 1:Oh yeah, they want to know, and I completely understand. I would want to know as well. Like, this feature could be super important, I'm not the only one asking for it, so why isn't it available yet? Which is totally fair. So we actually started reaching out to our product managers and building a strong relationship with them. Like, we know that you're in support, you're always seeing everything that support is doing and all the feedback that we're filing. And sometimes we're filing feedback out of a gap in knowledge on our end, because there's something that was done in the tool that actually answers that specific feedback, but maybe we're not informed about it, or maybe the knowledge base doesn't have enough information about it. So we're always asking for visibility on roadmaps and new features. Even if it's just one small button at the top right corner that is going to help clone something super easily.
Speaker 2:I think that sounds like a really great relationship between product and support. You mentioned revamping how knowledge is shared internally, and that's really related to QA. I mean, that goes back to training and your knowledge base. Do you have a sense of what doesn't work as far as knowledge related to QA?
Speaker 1:Not sharing it. And it can be funny, but it's actually something that we're going through, and we've been going through that for the past year.
Speaker 1:But because our platform is so wide and we have so many features, we're starting to get some support agents specialized in only specific features and specific tools. I was in support last week, and it was a very funny case: a customer came in asking about something, and I was reading it like, I've never heard about this before. I was actually writing in the notes of the chat, which some of my colleagues were checking, I've never heard about this before, can we actually do that? And then somebody jumped in and said, yeah, we can do this for this specific platform. There are so many places where the knowledge has been shared that we don't even know about, and we're not gathering it. So something that we did internally was creating a specific channel for all the knowledge shares that we're gathering from support sessions. Every time we have a support session, we just share something new that we've learned, because maybe it's not going to be something new for others, but it will be something new for a colleague in Guatemala that I'm not going to talk to in the next few weeks because of time zones.
Speaker 2:Yeah, so back to QA. Do you do peer-to-peer reviews?
Speaker 1:Oh, yeah, yeah.
Speaker 2:Do you find that absolutely essential?
Speaker 1:Yeah, that was actually one of the features that was removed by Zendesk, and we were kind of sad to see it go, because it was something we were really using a lot. It really helped us improve, especially from the support agent perspective. We know, and it's going to sound rough maybe from how I'm going to say it, but we know that when feedback comes from a support lead or a team lead, in general it can be perceived as really rough. Like, yeah, it's my manager telling me I'm doing a bad job, and then you feel bad about it, and then you really pay attention to it next time, and you just seek approval and make sure that your manager sees that you're making an effort and taking into account what they're telling you. Which obviously is completely understandable and completely logical, and I abide by that. But when it comes from a peer, somebody that you've been working with, either in the same office or in another office, but that you talk to on a weekly or monthly basis, it's more of a friendly chat. You know, like, hey, I saw this chat that you had, and I was reviewing it, and personally, maybe I would have done this a bit differently, because if you had sent a screenshot to the customer, you would have avoided four or five messages and the answer would have been there in the first place.
Speaker 1:I would say in eight out of 10 cases, whenever we have peer-to-peer review, the agents are always telling me like I work better when I see the review from my peers, because I know that they're in the same situation as me and if the case was reversed, it would have happened the same way.
Speaker 1:I would have given exactly the same feedback. So we can really see that it has improved things on our end, and support agents are happier to have a session and say, okay, I'm actually going to do support with another person sitting next to me, and we're going to review the support chats that we're working on together. So I'm just going to open up my laptop and start working on my chats, they're going to work on their chats, and then I have a question or a doubt at some point, not about the issue itself, but more about how would you say that to a customer. We have difficult customers, or there's a difficult answer you need to give them, and you don't always know how to do it. So yeah, peer-to-peer for us has been huge, and it has also helped build closer relationships between our support agents, especially when we had new joiners in the team.
Speaker 2:It's kind of like mentorship. Do all your specialists offer QA, or is that something that they train into?
Speaker 1:They all train into it, and it's not mandatory for them to actually do it. Some of them really want to give reviews to their peers, some of them just want to receive them, and some of them just don't want to do it. They want to receive the review directly from me or from their support leads. So I would say it's mostly on a case-by-case basis. Obviously, when we have a negative rating from our customers, we're always going to give the review, and it's always going to come from the support lead level. But yeah, for the peer-to-peer it's more on a voluntary basis, and if they're willing to get the review and to also give it, then let's just do it. So right now, even when we have a negative rating, we have people jumping in just to see how the conversation was handled and participating in the review, on top of what the support leads are saying.
Speaker 2:It sounds like you're really willing to expand or adapt the QA program to what specialists need and who wants to offer QA. So I have to wrap up. I have a number of kind of quick questions for you. What's something you wish more executives knew about QA, but most of them just don't know or don't care about?
Speaker 1:So one thing that actually comes to mind: one of our co-founders always, always, always told us it's better to ask for forgiveness than for permission, especially towards the customer, whenever we want to do the right thing by them. And I think we can apply exactly the same to QA, which is: it's cheaper to prevent issues than to apologize for them. So if we're able to review ahead of time everything that is currently wrong with our support process, we can work on that, we can improve it. Maybe it will cost some time and some money to actually do that, but it will save us a lot in the future.
Speaker 2:Good point. I like that. And you know, automation and AI are definitely growing. Do you have thoughts about how QA can make sure, as we automate more things, we keep the human touch?
Speaker 1:Yeah, we're using AI in our support process currently, and some of our customers are not super happy to be in touch with a bot and would rather be in touch with a human.
Speaker 1:So they're always asking, like, human, please, human, please, because they want to talk to somebody, which I completely understand. And it's true that the tone and the empathy that a human will put into it will not be matched by AI. But because it's a tech-heavy environment on our end, and our platform is, well, very, very much tech-heavy, it's always good to have that human touch. So yes, we can definitely use AI, and we're actually doing QA on all the chats that are being handled by AI, to make sure that the answers being provided are correct and accurate and that we're guiding the users towards the right resources. But yeah, we always have that human touch: even if it's AI handling it, there's a human behind it reviewing it and saying, yes, this is correct; no, this is incorrect, we need to improve this. So we shouldn't lose our human touch even if we have AI in the support process.
Speaker 2:That's a really good sentiment. What's one thing you would never do again when it comes to QA?
Speaker 1:Not having it. Not having it.
Speaker 2:Which you did for a long time.
Speaker 1:Yeah, we did for a long time, and I felt really bad at first about why we didn't put the process in place before. Because even if we've been a very fast-growing company and we've gone from a small-ish number of customers to a really big number of customers, well, I don't even know how many customers we have right now, but even our team size scaled so much, and we could have made a lot of changes internally to our processes and to the way we handle support way before, even before COVID, maybe, if we had the right tools and were talking to the right people about it. So yeah: implement QA as soon as you can in a support process, even if it's just through spreadsheets. Keep that process in mind and just implement it, because it's going to save a lot of time and a lot of money.
Speaker 2:Good point. So, last question: if your QA process had a tagline or a motto, what would it be?
Speaker 1:For me, it's QA to save the day.
Speaker 2:I love that. Thank you so much for being here and sharing your insights. I'm excited for other teams to get QA going the same way that you have. Thanks again.
Speaker 1:Thank you for having me. That was amazing, I had a really fun time.
Speaker 2:I'm glad. That's a wrap on our deep dive into QA with Chloé. If you're ready to level up your own quality program, here's our quick-start checklist from Chloé. One: lead with the tough stuff. Review negative CSAT first and get it out of the way. Two: trade quantity for quality. Like our other guests have mentioned, 10 to 20 thoughtful reviews beat 100 drive-bys. Three: make feedback peer-powered to soften the sting and speed adoption, because peer-to-peer QA is a huge win. Four: add screenshots to every answer and watch your handle time drop. Five: escalate through a dedicated JIRA board so engineers see only what matters to them. And finally, six: let Slack nudges keep tickets moving after day three. If a ticket gets stale, nudge your team in Slack, and definitely automate that. So I hope you try this. If you do, even just for a week, let me know how it goes, because I'd like to see if launching a program like this has an effect on your team. See you next time.