
Solving the CX Calculation Discrepancy - Part 1


How to improve your reporting and results.

“Our abandons have shot up!” declared a supervisor. “The new system must be dropping calls!”

It was a Monday morning in the summer of 1997, and I was in Phoenix, Arizona, where it was 120 degrees in the shade.

Having recently joined an upstart tech company called Interactive Intelligence (now Genesys), I was there because we had managed to convince SkyMall (yes, the catalog in your airline seat pocket) to put 100 of their contact center agents on our all-in-one communications platform.

As the sales engineer, who could barely spell ACD or IVR at the time, I quickly shifted into a one-man implementation team and got to work with their telecom manager. The charter from my sales rep, written literally into the contract, said: “Make it do what they did on Avaya.”

It was my first experience with what I later termed the “CX (customer experience) calculation discrepancy” that exists between some contact center platforms. A complaint like that supervisor’s is also the last thing you want to hear when a contact center goes live on a new platform.

This article will focus on some of the most common metric discrepancies between platforms, in the hopes you might avoid their pitfalls the next time you perform a major upgrade or migrate to a new contact center platform.

Given the complexity of this issue, I’ve split the article into three parts. Part 1 (this part) explains what the discrepancy is, while Part 2 will discuss the most common offenders. Finally, Part 3 will cover the differences between voice-first and digital-first CX platforms.

Detecting the Discrepancy

Back to our mystery abandons. After working with our development team, we confirmed that we could account for every queue call that resulted in an abandon. Yet the customer still pushed back, producing historical reports from their legacy system, for similar times of day and staffing levels, that showed a lower number of abandons. Hence the discrepancy.


We didn’t solve the mystery until we turned our attention to the Avaya (then Lucent) G3 PBX/ACD that SkyMall was cutting over from and discovered it had something called a Short Abandon Filter.

It turned out that this setting could be configured to exclude any call abandoned within a threshold of up to 30 seconds from the abandon counts. After we explained this to the customer and their operations team, they were much more accepting of the elevated abandon metric.
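To make the effect concrete, here is a minimal Python sketch. It is purely illustrative, not taken from any vendor’s implementation, and the call data and function name are assumptions of mine. It shows how the same set of calls produces two different abandon rates depending on whether a short abandon threshold is applied:

```python
# Hypothetical illustration: how a short abandon filter changes the abandon rate.
# Each tuple is (call_outcome, seconds_in_queue_before_outcome).
calls = [
    ("answered", 12), ("abandoned", 4),  ("answered", 45),
    ("abandoned", 22), ("abandoned", 95), ("answered", 8),
    ("abandoned", 3),  ("answered", 30),  ("answered", 61),
]

def abandon_rate(calls, short_abandon_threshold=0):
    """Abandon rate = abandoned calls / offered calls.

    Calls abandoned within `short_abandon_threshold` seconds are excluded
    from the abandoned count. (Some platforms also exclude them from the
    offered count, which shifts the number further; this sketch does not.)
    """
    offered = len(calls)
    abandoned = sum(
        1 for outcome, secs in calls
        if outcome == "abandoned" and secs > short_abandon_threshold
    )
    return abandoned / offered

print(f"No filter:        {abandon_rate(calls):.1%}")   # all 4 abandons counted -> 44.4%
print(f"30-second filter: {abandon_rate(calls, short_abandon_threshold=30):.1%}")  # only 1 counted -> 11.1%
```

In the SkyMall story, the legacy ACD was effectively reporting the filtered number while the new platform reported the unfiltered one, so the abandons appeared to “shoot up” even though the calls themselves had not changed.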

When these sorts of reporting differences between platforms come to light during the excitement of a cutover, the experience can range from acceptance, as it did here, to a dumpster fire in which you are instantly in a do-or-die battle to keep your platform from getting kicked to the curb.

The metric discrepancy causes you to compare apples to oranges and can drive you bananas if you have no previous experience. Here are some of the most common offenders:

  1. Abandons
  2. Service level agreements (SLAs)
  3. Occupancy and utilization
  4. Bonus: first call/contact resolution (FCR)

I have played a part in multiple versions of this movie in different roles over the years and enjoyed none of them. The good news is that, with a little forethought, some good questions, and some planning, these discrepancies can all be avoided or at least better managed.

Discrepancy Causes

How did we end up with the CX calculation discrepancy, and what can we do about it?

Here are some factors that have contributed to it:

  1. Contact center metrics have continued to evolve along with the various vendors and platforms over the past 40 or so years.
  2. Vendors tend to use similar terms for the metrics.
  3. No standards emerged for the names of the metrics or the calculations themselves.

You can search to your heart’s content and you will find plenty of articles from contact center industry resources offering metric guidelines. For example, you should generally target an abandon rate of less than 5% for voice calls. But there is little to no discussion of how abandon should be calculated, or whether the calculation should filter out false abandon/short abandon calls.

On the vendor side, you can determine how a specific version of a specific platform calculates “abandon” if you search the documentation, ask the right question, or, in rare cases, work with someone familiar enough to perform a deeper level of reporting discovery than most. The same goes for SLA and the other metrics we will discuss in further detail.

The problem is that all the platforms track a metric called “abandon,” and we assume it is calculated the same way everywhere, but as we have seen, that is not always the case.

Experience tells me that contact center and CX reporting is complex and nuanced. As much as I would love to blame the CX platform vendors, few of us in the industry, or those of us running contact centers, ever get down to this level of detail until we experience the problem firsthand, as I did.

You, as the person who owns your contact center or CX reporting and analytics, also need to own the responsibility for measuring comparable outcomes as you migrate to your new contact center platform.

The burden is on you because it is rare to work with someone at a vendor, reseller, master agent, or consultancy who will know how (much less take the time) to dig into the appropriate level of detail on your metric calculations as you consider your next platform or major upgrade.

Nobody selects their new CX platform based on reports and analytics, but everyone will complain about how difficult reporting is in their legacy platform.

I suspect that is because no business-to-business (B2B) platform (CX or otherwise) solves reporting for everyone. They cannot, because every deployment is a unique combination of customer requirements, customer analytics capability, and vendor platform data and reporting capabilities.

So, your best hope is 80/20, meaning your platform solves 80% of your reporting requirements. Make sure you also figure out how to solve the remaining 20%, meeting your unique data needs in a way that matches your analytics capabilities and expertise.

How do I know it’s rare to understand CX calculations at this level? Because I work daily with prospects, resellers working on requests for proposals (RFPs), and vendors bringing us in to assist with CX reporting and analytics.


When I ask how they calculate their SLA, they typically respond with the target, such as: “We aim to answer 85% of our calls in 25 seconds or less.”

And when I clarify with, “Yes, I understand that is your service level target, but what calculation do you use? There are at least a handful of options,” more often than not I get a blank stare. That does not indicate incompetence; it underscores how obscure the problem is.
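To make “a handful of options” concrete, here is a minimal Python sketch of four commonly cited service level formulas. The call data, variant names, and threshold are illustrative assumptions of mine, not taken from any particular vendor or from the SkyMall story:

```python
# Hypothetical illustration: same calls, same 25-second threshold,
# different service level formulas, different results.
# Each tuple: (outcome, seconds waited before answer or abandon).
calls = [
    ("answered", 10), ("answered", 20), ("answered", 40),
    ("abandoned", 5), ("abandoned", 30), ("answered", 15),
    ("answered", 70), ("abandoned", 12), ("answered", 22),
    ("answered", 18),
]

THRESHOLD = 25  # seconds

answered = [s for o, s in calls if o == "answered"]
abandoned = [s for o, s in calls if o == "abandoned"]
answered_in = sum(1 for s in answered if s <= THRESHOLD)     # answered within threshold
abandoned_in = sum(1 for s in abandoned if s <= THRESHOLD)   # "short" abandons

variants = {
    # 1. Answered within threshold / all offered calls
    "answered_in / offered": answered_in / len(calls),
    # 2. Ignore abandons entirely
    "answered_in / answered": answered_in / len(answered),
    # 3. Exclude calls abandoned before the threshold from the denominator
    "answered_in / (offered - short abandons)":
        answered_in / (len(calls) - abandoned_in),
    # 4. Count calls abandoned before the threshold as having met the SLA
    "(answered_in + short abandons) / offered":
        (answered_in + abandoned_in) / len(calls),
}

for name, value in variants.items():
    print(f"{name:45s} {value:.1%}")   # prints 50.0%, 71.4%, 62.5%, 70.0%
```

The point is not which formula is “right.” It is that two platforms can honestly report noticeably different service levels from identical traffic, which is exactly the kind of discrepancy this series is about.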

Now that I have provided some context as to why the CX calculation discrepancy is a thing, in the next installment of this article I will explore the common metrics impacted by it.

Rick McGlinchey


Rick McGlinchey is co-founder and CEO of PureInsights LLC, an award-winning CX analytics and reporting platform. He has worked in enterprise B2B software for over 25 years, focused exclusively on contact centers. When it comes to contact center reporting, he's a CX data geek and has scars to prove it!
