How to recognize a poor KPI?

I love techniques like “appreciative inquiry” and other such approaches that help us focus on the positive descriptors of what’s working and when we are at our best. However, for cases when specific skills are needed to achieve something important and when risks are high, it’s just as necessary to be able to spot when something is going wrong.

When it comes to measuring strategy, more and more leaders want to be able to trust the evidence they are getting from their performance dashboards. But sadly, many reports are never used, and one of the main culprits is that leaders can’t recognize when a KPI sucks… until today.

Here are five foolproof ways to recognize the signs that a measure needs to be either tossed off the dashboard and into the bin or tweaked to make it more useful.

To make it a bit more real, each sign of a poor KPI comes with an example set in this context: a company has a goal to “Improve our customers’ shopping experience”, and this is how they stated their performance result or outcome: “Customers love the way we serve them”.

Sign #1 of a poor KPI
The measure’s data is not evidence of what you want to improve or achieve.

Example: The dashboard measure is “Staff Productivity”

  • Consider this: is the data from staff productivity telling us anything about how much our customers love our service? No! Even worse, paying too much attention to staff productivity could actually get in the way of improving customer service.

Why does this happen?

  • One common reason: the data was probably quick and easy to get because your organization was already collecting it. So, instead of following a method to design the strongest, most feasible measure that gives the team evidence of how customers feel about the service, they went for easy (but not relevant) data.

Sign #2 of a poor KPI
The measure is just a couple of vague words, and there is no quantification at all

Example: The dashboard measure is “Customer Loyalty”

  • Consider this: is customer loyalty actually a measure? No! It is too ambiguous to be a good measure, because there is no single universal standard for calculating a measure called “Customer Loyalty”. Think of all the ways we could observe (and therefore measure) customer loyalty. Here are some: how much a customer spends with us over their lifetime; whether they would recommend us to friends or family; how much they spend with us vs. our competitors; or how often they use our loyalty program. Each of these observable results would be measured differently, with different quantifications and different data, as the sketch below illustrates.
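
To make that concrete, here is a minimal Python sketch using made-up customer records and hypothetical field names (nothing here comes from a real system). It shows three different ways the same vague phrase “Customer Loyalty” could be quantified, each needing its own data and its own calculation:

```python
# Hypothetical customer records (illustrative values and field names only)
customers = [
    # lifetime_spend ($), would_recommend (yes/no), loyalty_visits (per year)
    {"lifetime_spend": 1200.0, "would_recommend": True,  "loyalty_visits": 14},
    {"lifetime_spend":  340.0, "would_recommend": False, "loyalty_visits":  2},
    {"lifetime_spend":  875.0, "would_recommend": True,  "loyalty_visits":  9},
]

n = len(customers)

# Quantification A: average customer lifetime spend (a dollar average)
avg_lifetime_spend = sum(c["lifetime_spend"] for c in customers) / n

# Quantification B: proportion of customers who would recommend us (a percentage)
pct_would_recommend = 100 * sum(c["would_recommend"] for c in customers) / n

# Quantification C: average loyalty-program visits per customer per year
avg_loyalty_visits = sum(c["loyalty_visits"] for c in customers) / n

print(f"Avg lifetime spend: ${avg_lifetime_spend:,.2f}")     # $805.00
print(f"% would recommend:  {pct_would_recommend:.0f}%")     # 67%
print(f"Avg loyalty visits: {avg_loyalty_visits:.1f}/year")  # 8.3/year
```

None of these is “the” customer loyalty measure; until the team decides which observable result they mean, the two words on the dashboard are not a measure at all.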

Why does this happen?

  • One common reason: teams don’t really know what a true performance measure is, or that measures need to be statistically quantified. So instead of following a recipe to describe a true measure, they pick something that sounds good, often from a KPI brainstorming session or a KPI library, and leaders never know whether all the money they are spending on actions is improving customer service.

Sign #3 of a poor KPI
The measure isn’t a measure at all, but a milestone for a project

Example: The dashboard measure is “Implement Customer Relationship Management (CRM) by November 2020”

  • Consider this: is this actually a performance measure? No! This is a statement about a milestone that a project has to meet. It is simply evidence of whether an action was completed or not. It may be helpful in project management, but it is not evidence of the impact you want from implementing the expensive CRM system.

Why does this happen?

  • One common reason: no one has asked, “So what?” Instead of investing time to answer the question “What do we want to achieve by installing this CRM system?” and then designing a statistically sound measure that will tell you whether you are achieving it, they pick something that is easy to check off and report as done.

Sign #4 of a poor KPI
The measure is actually a data collection tool

Example: The dashboard measure states “Customer Survey”

  • Consider this: what data is this customer survey collecting? A survey is a data collection method, not a measure. It can collect all kinds of data, which can be used for all kinds of potential measures, so what is a tool doing on your dashboard?

Why does this happen?

  • One common reason is that people don’t know how to define the measure first, before working out what data they need or the best way to collect it. Maybe this is why so many surveys are too long and produce poor data that no one uses.

Sign #5 of a poor KPI
Every measure on your dashboard uses the same statistic (often a %)

Example: More than half of the measures on the dashboard use the percentage statistic, including “% customers satisfied”.

  • Consider this: when is a percentage the best statistic to use, and when is an average better? When leaders see this overuse of the percentage statistic, they should be wondering whether their data is detecting enough of the change that is actually occurring.

Why does this happen?

  • One common reason is that dashboard teams may not know enough about the different quantification options and when to use each one. A percentage assumes your result is binary (e.g. customers are either satisfied or they aren’t; employees either had an accident at work or they didn’t). In cases like these, a percentage doesn’t tell you the degree or extent (how satisfied, or how serious the injury was). An average may help you better understand the overall degree to which a particular result is happening, not just whether it is happening or not, as the sketch below shows.
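
Here is a minimal sketch, with made-up 1-to-5 satisfaction ratings and an assumed “satisfied means 4 or above” cut-off, showing how a percentage and an average summarize the very same responses differently:

```python
# Hypothetical 1-5 satisfaction ratings from ten customers
ratings = [5, 4, 4, 2, 5, 3, 4, 1, 5, 4]

# Percentage view: collapses each rating to a binary satisfied / not satisfied
# (the 4-or-above threshold is an assumption for illustration)
pct_satisfied = 100 * sum(1 for r in ratings if r >= 4) / len(ratings)

# Average view: keeps the degree of satisfaction in every response
avg_satisfaction = sum(ratings) / len(ratings)

print(f"% customers satisfied: {pct_satisfied:.0f}%")         # 70%
print(f"Average rating:        {avg_satisfaction:.1f} of 5")  # 3.7 of 5
```

Notice that the percentage would not move if every 5 slipped to a 4, while the average would fall; that is exactly the kind of change a dashboard full of percentages can miss.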

I hope leaders and teams find these five signs helpful for spotting poor KPIs on their dashboards, though there are certainly many others. And to leave on a positive note, here is one more handy tool so you can spot when you have indeed developed a good KPI.

It’s a definition of what a performance measure really is.

“A performance measure is a quantification that provides objective evidence of the degree to which a performance result is occurring over time.”

I have never found a more useful or clearer definition of what a KPI or performance measure (my preferred term) really is than this one, written by Stacey Barr, Australia’s performance measurement specialist and the creator of the PuMP Blueprint and the Evidence-Based Leadership Program.