We sure do put up some poor excuses for performance measures - here's what to do with the ugliest of them.
INTRODUCTION
Winning awards, completing projects and initiatives on time, meeting budget, counting widgets, annual surveys, and whatever we can find at the back of the 'performance measure pantry' that was left over from last year - they are all ugly measures! If you're stuck with this sort of thing, here are some ideas for what to do about it.
WHAT MAKES A MEASURE UGLY?
In a nutshell, measures are ugly when they fail to inform your decisions about whether or not you're getting the results you wanted, and how well your actions are working to make those results happen. Measures are ugly when they fail to give you the feedback you need to have more control or influence over the results you most passionately want, or need, to create.
Too often, people treat measurement as a bureaucratic jumping-through-the-hoops-of-the-planning-process activity. They come up with anything that can be written down in the KPI column of the business or project plan and escape challenge from superiors or peers. The end product is a pool of measures that are usually the easiest, cheapest, most rudimentary information to produce. But there's a price: such measures don't give the right kind of feedback to properly manage whatever they monitor.
DO YOU HAVE ANY UGLY MEASURES?
Quite specifically, there are a few criteria that any measure must meet if it's going to have any chance at becoming valuable feedback for decision-making. And of course, ugly measures violate these criteria. The rest of this article discusses six of the most common conditions of measure ugliness, and offers ideas for how to overcome them:
- measures that are events or milestones or very infrequently calculated
- measures that monitor the 'means', not the 'end'
- measures that are actually data, not information
- measures that are complex indices
- measures that encourage the wrong behaviour
- measures that are overly aggregated
MEASURES THAT ARE EVENTS OR MILESTONES OR VERY INFREQUENTLY CALCULATED
Measures like the following are ugly, principally because they offer little, if any, regular feedback through time:
"Annual customer satisfaction rating."
"Win Banksia Award for environmental achievement in 2006."
"On-time project completion."
Unless you design measures that give you regular feedback through time, you'll be faced with 'too little, too late'. You won't get the information that helps you fine-tune your strategies (activities, initiatives, projects, etc.) to ensure they actually produce the results they were supposed to.
The trap you can fall into here is assuming that your strategies will unquestionably work. Instead, work out what results you'd expect to see from these strategies, and explore how you could collect some evidence of this on a weekly or monthly basis. No, it won't always be feasible or possible for everything. But it will be for a lot of things.
MEASURES THAT MONITOR THE 'MEANS', NOT THE 'END'
Another common type of ugly measure is information about the means rather than the end - the performance result or outcome the strategy was chosen to achieve in the first place:
"Implement organisational restructure by June 2008." as a measure for a goal to improve customer loyalty
"Staff Productivity" as a measure for a goal to reduce organisational costs
"Billing Accuracy" as a measure for a goal to increase payment of invoices before the due date
Can you have an organisational restructure and not improve customer loyalty? Of course! Can you improve customer loyalty without an organisational restructure? Of course! So while implementing an organisational restructure might be one of the strategies you choose toward improving customer loyalty, it is not evidence of customer loyalty. The same logic also applies to the measure of staff productivity posing as evidence of organisational cost reduction.
Direct evidence of the result is essential to properly test our hypotheses about how to achieve that result. If you have measures that track the means and not the end, then you probably need a dialogue to fully describe what the end looks like, and design measures that are evidence of this.
MEASURES THAT ARE ACTUALLY DATA, NOT INFORMATION
Some measures are ugly because they are really just data collection processes pretending to be measures:
"Customer Survey"
"Occupational Health and Safety incident reports"
"Budget"
These so-called measures are data collection processes, not the information that answers our questions. Measures can certainly come from the data these processes collect, but usually the measures need to be very clearly designed and defined before the right data can be collected.
Do you have any data collection processes that collect lots of the wrong kinds of data? Try writing down the business questions you really need the data collection process to answer, then work backwards to identify the form the answers should take, the analysis that can produce those answers, and the data that analysis would require. That way, you'll know exactly what your measures are (and what data should be collected to produce them).
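To make that concrete, here is a minimal sketch in Python of the working-backwards idea. The business question, the 1-to-7 rating scale and every figure in it are invented for illustration; the point is that stating the question first tells you the analysis, which in turn tells you what data to collect.

```python
from collections import defaultdict

# Hypothetical example. The business question: "Is customer
# satisfaction improving month to month?" Working backwards:
# the answer is a trend, the analysis is a monthly average,
# so the data to collect is each response's rating AND its date.

survey_responses = [  # (month, rating on an invented 1-7 scale)
    ("2008-01", 5), ("2008-01", 4), ("2008-01", 6),
    ("2008-02", 5), ("2008-02", 6), ("2008-02", 6),
]

ratings_by_month = defaultdict(list)
for month, rating in survey_responses:
    ratings_by_month[month].append(rating)

# The measure: average satisfaction rating, tracked through time.
for month in sorted(ratings_by_month):
    ratings = ratings_by_month[month]
    print(f"{month}: average rating {sum(ratings) / len(ratings):.2f}")
```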
MEASURES THAT ARE COMPLEX INDICES
Many people still stand firm on the notion that indices are a great way to simplify performance measurement. An index gets its values by mathematically combining the values of a collection of other measures. It turns many measures into one:
"Road Safety Index" comprising dozens of individual measures to do with road condition, road usage and accident rates
"% of Business Targets Met" comprising all the strategic measures a business has determined are essential to its success, such as revenue, costs, customer satisfaction, employee turnover, process efficiency and so on
"Number of SLA Standards Achieved" comprising the range of agreed performance indicators in a service level agreement
Trouble is, these indices are often so vaguely defined that we have virtually no idea how to interpret the numbers. We don't know what size of shift is worth responding to, we have no intuitive connection with the numbers themselves, and the index only adds an unnecessary step to the decision process.
After the index shows you a change, what do you need to do next? That's right - go and look at the measures it is composed of to find out what's really happening! Why not just use the original measures, with a form of 'traffic lighting' (visual formatting that highlights which measures are on track, and which need attention)?
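To illustrate the difference, here is a minimal sketch in Python. The measures, targets, tolerance band and the simple ratio-to-target scoring are all invented for illustration - the point is only that the single index number hides exactly the detail you need, while traffic lighting keeps it visible.

```python
# Invented measures, targets and scoring, purely for illustration.
measures = {
    # name: (actual, target, higher_is_better)
    "Revenue ($m)":          (9.9, 10.0, True),
    "Customer satisfaction": (5.9, 5.5, True),
    "Employee turnover (%)": (18.0, 12.0, False),  # badly off target
}

def percent_of_target(actual, target, higher_is_better):
    # How close a measure is to its target (1.0 = exactly on target).
    return actual / target if higher_is_better else target / actual

# The index: one number that hides which measure is in trouble.
index = sum(percent_of_target(*m) for m in measures.values()) / len(measures)
print(f"Business performance index: {index:.0%}")  # one bland number: 91%

# Traffic lighting: every measure keeps its own signal.
for name, m in measures.items():
    ratio = percent_of_target(*m)
    light = "GREEN" if ratio >= 1.0 else "AMBER" if ratio >= 0.95 else "RED"
    print(f"{light:>5}  {name}: {ratio:.0%} of target")
```

In this toy example the index reports a bland 91%, while the traffic lights immediately show that turnover is the measure in trouble.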
MEASURES THAT ENCOURAGE THE WRONG BEHAVIOUR
Depending on the maturity of your organisation's performance culture, another type of ugly measure is that which suggests to people to behave in a way that actually undermines performance:
"Number of widgets produced per person per day"
"Time lost due to workplace accidents"
"Sales Representatives' Revenue Ranking"
When measures like these have targets, particularly in a culture where it is typical to pass the buck, point the finger and make excuses, you'll see people fudging the figures, shifting the goal posts, competing with colleagues they should be collaborating with, cutting corners and sweeping mistakes under the rug. This behaviour not only misinforms those who use the measures; it also causes performance to get worse, problems to pop up in other parts of the business, and risks to sky-rocket.
Avoid ugly measures like these by collaborating with the people whose behaviour will be influenced, engaging them in a conversation to decide what kinds of behaviour should be encouraged. Involve them very directly in the design of measures to support those behaviours - measures that give them the feedback to help them improve performance instead of masking it.
MEASURES THAT ARE OVERLY AGGREGATED
Even though they're not as ugly as the types of measures described so far, the following measures hide a lot of valuable information:
"% Deliveries made on time"
"% Customers Satisfied"
"Workplace Safety Level (H, M or L)"
This kind of measure is based on what is called 'attribute data', the simplest of which is raw data that takes only the values "yes" or "no" and is then rolled up into a percentage: "we delivered it on time, or we didn't." In the case of the customer satisfaction measure, the attribute data comes from a rating scale of 1 to 7 (say), and customers count as satisfied if they rated 4 or higher. Such measures are incapable of showing you how far from a standard or target you are - they only tell you whether or not you met it. They're insensitive to small changes and early trends.
Why not measure the delivery cycle time itself and get far more information? In most cases like these, the data you used to form the percentage is the same data that can give you the whole picture.
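Here is a minimal sketch in Python, with invented delivery data and an assumed five-day standard, of how the percentage can look perfect while the underlying cycle times tell a very different story.

```python
import statistics

# Invented delivery data; assumed standard: deliver within 5 days.
due_days = 5
delivery_days = [4.5, 4.6, 4.65, 4.7, 4.8, 4.85, 4.9, 4.98]  # chronological

# The attribute measure: on time or not. Every delivery here passes...
on_time = [d <= due_days for d in delivery_days]
print(f"% deliveries on time: {sum(on_time) / len(on_time):.0%}")  # 100%

# ...but the same raw data shows cycle time creeping toward the limit -
# an early trend the percentage is completely blind to.
print(f"Average cycle time: {statistics.mean(delivery_days):.2f} days")
print(f"First vs latest:    {delivery_days[0]:.2f} -> {delivery_days[-1]:.2f} days")
```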
BEAUTIFUL MEASURES
It takes a while to learn how to design really beautiful measures, measures that give you valuable feedback at the right time, about the right results. But practice makes perfect, and the ability to recognise what makes a measure ugly is the first step!
Stacey Barr is a specialist in organisational performance measurement, helping people get the kind of data and information that tells them how well their business is performing, and how to make it perform better. Sign up for Stacey's free 'mezhermnt Handy Hints' ezine at
http://www.staceybarr.com to receive your complimentary copy of her e-book "202 Tips for Performance Measurement".