Quality Monitoring Misses the Mark

October 27, 2016 | Industry Expertise

It’s common knowledge that human interaction isn’t a rigid binary; conversations are nuanced. For a conversation to be successful, meaning both parties walk away satisfied with what just transpired, both participants need to show a degree of empathy. There needs to be positive feedback letting the other person know you understand what’s being said. That’s why scripted conversations sound so forced: they don’t leave room for organic responses. Below, we’ll look at how this affects quality monitoring.

When callers feel they aren’t being listened to or understood, they end the call unhappy, even if their technical problem has been solved. After all, studies have shown that the average person would rather work with someone likeable but inept than with a competent agent who shows little to no compassion. Ideally, we should all strive to be both competent and likeable, but this phenomenon shows why focusing on what’s technically correct while ignoring customer satisfaction will hurt any company that wants to offer a great customer service experience.

It’s this disconnect between what consumers want and how tech support agents are reviewed that creates our #2 Universal Truth jeopardizing your technical support success: Most quality monitoring forms are designed to measure the technical correctness of the agent, not the customer experience.

Understanding What Current QA Forms Get Wrong

We’ve found that the quality assurance processes in most companies rely on a strict pass/fail form, and evaluators score calls accordingly. For example, a single “fatal error” can turn an otherwise smooth, well-handled call into a mark against the agent. Yet these designated fatal errors almost always relate to business process rather than customer outcome.

When quality monitoring is effective, QA scores will correlate with customer satisfaction (CSAT) scores. When business processes are out of touch with what customers want, you’ll see a noticeable gap between the two. Agents who do everything “by the book” will receive high QA scores paired with low CSAT scores; conversely, agents who know how to make callers happy by going off-script will be dinged for it, earning low QA scores despite high CSAT scores.

For a better customer experience, your business process should take its cues from that second group instead of forcing them to comply with the current system. Consider changing the way you do things to match their style: if agents with high CSAT scores consistently miss certain items on your QA forms, those are the items that deserve a second look.
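
To make that comparison concrete, it helps to look at agent-level numbers. Below is a minimal, hypothetical sketch in Python; the agent data, field names and cutoff are invented for illustration and are not part of any SYKES tooling. It checks how strongly QA scores track CSAT and lists the QA items that high-CSAT agents most often miss.

```python
# Hypothetical sketch: compare per-agent QA and CSAT scores, then surface the
# QA checklist items that high-CSAT agents most often fail.
# All data, field names and thresholds below are invented for illustration.
from statistics import correlation, mean  # correlation() needs Python 3.10+

# Per-agent monthly averages on a 0-100 scale (example data).
agents = [
    {"name": "A", "qa": 95, "csat": 62, "missed_items": ["hold_script", "legal_disclosure"]},
    {"name": "B", "qa": 71, "csat": 91, "missed_items": ["hold_script"]},
    {"name": "C", "qa": 68, "csat": 88, "missed_items": ["hold_script", "call_opening"]},
    {"name": "D", "qa": 90, "csat": 65, "missed_items": []},
]

qa_scores = [a["qa"] for a in agents]
csat_scores = [a["csat"] for a in agents]

# If the QA form reflects customer experience, this correlation should be
# strongly positive; a weak or negative value signals a misaligned form.
print(f"QA vs. CSAT correlation: {correlation(qa_scores, csat_scores):.2f}")

# QA items that above-average-CSAT agents keep getting dinged for are the
# candidates for redesign.
csat_cutoff = mean(csat_scores)
missed_counts: dict[str, int] = {}
for agent in agents:
    if agent["csat"] >= csat_cutoff:
        for item in agent["missed_items"]:
            missed_counts[item] = missed_counts.get(item, 0) + 1

for item, count in sorted(missed_counts.items(), key=lambda kv: -kv[1]):
    print(f"Reconsider QA item '{item}': missed by {count} high-CSAT agent(s)")
```

In practice, the same analysis would run over your actual QA and CSAT exports rather than a hard-coded list, but the idea is identical: a weak correlation plus a short list of frequently missed items tells you where the form and the customer are out of sync.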

Redesigning Your QA Forms for Better Quality Monitoring

Once you know it’s time for a change, begin by creating a QA scorecard that better aligns business goals with customer experience goals. We’ve found that the ideal QA form assigns agents points based on the following (a rough sketch of such a scorecard appears after the list):

  • Product knowledge — This is usually where existing QA forms are in sync with caller demands. All agents should thoroughly know the products your company provides support for.
  • Probing skills — Knowing the right questions to ask is a crucial part of friendly tech support. What to ask should be partially gleaned by what the caller says. Likewise, it’s important to recognize that two callers may explain the same problem in vastly different ways.
  • Empathy — Going back to our discussion above, agents should make it clear that they genuinely understand and care about a caller’s problem. With voice-only communication, a good way of doing this is to repeat back key points of what the caller has said and confirm you’ve understood them correctly.
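
If it helps to picture how those three areas might translate into a points-based form, here is a minimal, hypothetical sketch in Python. The criterion weights and the 0-to-1 rating scale are assumptions made up for illustration, not a prescribed SYKES scorecard.

```python
# Hypothetical sketch of a points-based QA scorecard built around the three
# areas above. Weights and the rating scale are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    weight: int    # points available for this criterion
    rating: float  # evaluator's rating for the call, on a 0.0-1.0 scale

def score_call(criteria: list[Criterion]) -> float:
    """Return the call's QA score as a percentage of available points."""
    earned = sum(c.weight * c.rating for c in criteria)
    available = sum(c.weight for c in criteria)
    return 100 * earned / available

# Example evaluation of a single call (ratings are invented).
call = [
    Criterion("Product knowledge", weight=40, rating=0.9),
    Criterion("Probing skills", weight=30, rating=0.8),
    Criterion("Empathy", weight=30, rating=1.0),
]
print(f"QA score: {score_call(call):.1f}%")  # QA score: 90.0%
```

Giving empathy and probing skills weights comparable to product knowledge is one way to keep the form from rewarding technical correctness alone.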

Once you’ve developed a QA scorecard that focuses on a great customer experience, take the following steps:

  • Identify behaviors that drive great customer experiences.
  • Determine how the quality monitoring process can be redesigned to encourage these behaviors.
  • Modify coaching to correct fatal process errors in a way that also boosts CSAT.

As an example, we’ve found that modifying QA forms to focus on three overlooked qualities — empathy, call control and understanding the customer’s needs — boosted CSAT scores by 12 percent.

It’s no secret that even the best companies can struggle to provide a stellar customer experience. We at SYKES are here to help make sure your QA process is keeping your customers happy. To learn more about the other Universal Truths, click here to download the full white paper.

SYKES

We provide customer contact management solutions to global leaders. Our end-to-end service platform engages your customers at every touch point in the customer lifecycle, from digital marketing and acquisition to customer support, technical support, up-sell, cross-sell and retention.
