Stephen Taylor
« Posted: August 18, 2007, 10:06:59 AM »


You Didn't Use Brainstorming to Select Your Measures, Did You?


Introduction

When Alex Osborn invented the creativity technique called brainstorming, I wonder if he had any idea just how extensively business would apply it. Almost every meeting employs some kind of brainstorming event, but there's one meeting that really should leave it off the agenda: the performance measure selection meeting.

There are 5 common ways people select performance measures

The selection of performance measures has never really been treated as anything more than a trivial, and often pesky, decision brought on by the annual business planning workshop. Usually people take the fastest route to finalising a list of performance indicators in the KPI column of their business plan, and depending on your organisation, the fastest routes are some combination of the following:

- brainstorming, where participants just list as many potential measures as they can think of and then do some kind of short-listing

- benchmarking, or some other version of adoption (copying) measures from other organisations

- using existing data or measures, to save the costs of measuring something new, and having to collect the data

- measuring what stakeholders tell us to measure

- listening to what the experts in our industry have to say - what they "know" we should measure

Each of these methods certainly has some great strengths, but we often forget to examine the drawbacks. This article was written to open up the discussion of those drawbacks and to offer a different way of thinking about measure design, because these common approaches are limited.

brainstorming seems quick, but is really very hit-and-miss

Probably the most common approach taken to decide what to measure, brainstorming is the easy way out of an activity many people dread. Quality in equals quality out. A process that was designed for creativity and not measure design will not produce useful and usable measures.

pros:

- Seems quick.

- Lots of ideas for measures can be generated rapidly.

- Collaborative ideas - two heads are better than one.

- Easy to do, no special knowledge or skill is required.

- Engages people to be part of the measure selection process.

- A known/accepted approach, so the process doesn't get in the way of doing the activity.

- All ideas are considered/accepted, which helps people willingly participate.

cons:

- Not really finished after the brainstorming is over - how to get a final selection of measures is vague.

- There is more to measurement than just selecting measures - thought about how to bring the measures to life is also needed.

- Too much information is produced, so too many measures often result.

- Ideas are not vetted or tested, our thinking is not challenged.

- We often are brainstorming against different understandings of the same objective/goal we want to measure.

- The bigger picture is not taken into account e.g. unintended consequences, relationships to other objectives/goals, silo thinking.

- Often what is brainstormed is not really a measure at all - instead it is an action, a milestone, a piece of data, a vague fluffy concept.

- What is brainstormed is often expressed so vaguely no-one can remember what it meant later on.

Measure design needs to produce a few measures that have been thoroughly tested for their relevance or strength in tracking the goal or result they are selected for, and are supported by the people that they will affect.

benchmarking is convenient, but ignores strategic uniqueness

Benchmarking is about finding out what another organisation is doing, and this almost always involves or is based on some comparison of performance measures. If organisations share the same measures, then benchmarking is certainly easier to do, but there are consequences of adopting a "bolt on" set of performance measures.

pros:

- We feel safe & secure because others have gone before us.

- Others have (we assume) already put a lot of thought into those measures - why reinvent the wheel?

- We can compare our performance with the performance of other similar organisations.

- We get a feeling of how good (or better) we are compared to others.

- It's easy - just have to look and ask.

- Easier to justify to others why we are measuring what we are measuring.

- Widely accepted approach.

cons:

- There is more to measurement than just selecting measures.

- Not always collaborative - so little buy-in by people who will produce and use the measures.

- Not always like for like (apples with apples) - in fact, probably never is to the extent we assume.

- Isn't driven by the decisions we need to make and the information we need for those decisions.

- Doesn't challenge our thinking.

- It makes us bring some other organisation's strategy to life, not what is right for us (aren't we unique?).

- The goal posts change more frequently than the benchmarking process occurs.

- Our bigger picture is not taken into account, such as how this area of performance affects others in our organisation.

- Selecting measures against different understandings of the 'outcome' to measure.

Measure design needs to produce measures that encourage learning and sharing of knowledge, but not at the expense of discovering and focusing on the unique business strategy that best suits the organisation.

data availability makes it cheap, but focuses on yester-year's strategy

What data do we have? What have we measured in the past? What are we already measuring? These are all questions symptomatic of an organisation that is not open to challenging whether the data it is collecting really is capable of telling it what it needs to know about where it is going. Measure what you have always measured, get what you have always gotten. What's strategic about that?

pros:

- Very easy, very quick.

- Known data sources mean low cost in data collection/capture - already have systems to support it.

- People are more likely to already share a common understanding of the measures.

- Consistency in information over time, to have valid comparisons over time.

- Have historic data available for trend analysis.

cons:

- We only bring yesterday's (or yester-year's) strategy to life.

- Rarely challenge the measure itself, so no better measures are explored (and therefore no better data will ever be collected to manage emerging strategic risks and opportunities).

- Not collaborative, because it is from previous thinkers, not today's doers.

- Bigger picture is not taken into account.

- Parts of our strategy that are new will go un-measured.

Measure design needs to produce measures that are cost-effective and have some historic data before too long, but must produce measures that will take the organisation toward its vision, not drag it back into its past.

stakeholders need information, but that's not the same as performance measurement

What's imposed on an organisation by regulators, shareholders, government, industry bodies and other stakeholders is often considered a constraint on the measures it can use to manage its performance. Organisations struggle to retrofit the stakeholder-chosen measures to their strategy, or to renegotiate them. But these aren't the only two options. This method of measure selection is not really measure selection at all.

pros:

- We get told what to measure, and don't have to do the hard work ourselves.

- We give them what they want and thus we won't get into trouble.

- Resourcing can often be negotiated with the government group or stakeholder imposing the measures.

- Get higher management commitment to the need (and therefore to data collection and reporting) - it will get done.

- Our governance requirements are more likely to be met (assuming we report these measures properly).

cons:

- There is more to measurement than just selecting measures.

- Encourages an autocratic/patriarchal management style.

- The imposers don't understand our strategic direction/don't trust that we do.

- Isn't driven by the decisions we need to make and the information we need for those decisions.

- Lack of ownership by us of those measures (and the results they track).

- Bigger picture is not taken into account.

- The focus may not be the right focus or the only focus that matters.

- Parent-child (instead of partner) relationship with stakeholders could become the norm.

- Assumes that the stakeholders have robust methods of designing meaningful measures.

Measure design needs to produce measures that are relevant to the organisation's strategic direction, a direction that is understood and supported by its key stakeholders. But the measure design process can also be used to design reports to stakeholders that are largely separate from its organisational performance measures.

experts have experience, but can be locked into one-size-fits-all

Industry experts, consultants, people with years of experience or self-nominated experts all carry a mystique of knowledge and wisdom that can make their ideas about what to measure sound more like truth than suggestion.

pros:

- We get told, and don't have to do the hard work.

- A focus is quickly clear.

- Experts can bring new ideas and experience we may not have.

- Have approaches that have worked for other organisations.

cons:

- The focus may not be the right focus or the only focus that matters for us.

- We usually don't challenge experts even though we have intimate knowledge of our unique business.

- Experts can assume we don't need to know their thinking behind the measures, so we don't learn how to think more wisely about the measures for ourselves.

- We may not really understand the measures that are recommended to us.

- Experts often cost a lot of money.

- Experts may not understand our organisation enough to know our uniqueness (the one size fits all problem).

- Experts may not take account of how feasible it might be (or not) to bring those measures to life in our organisation.

Measure design needs to produce measures that we understand and have ownership of, and it needs to be a process that allows us to continue refining and refreshing our selection of measures as our performance and strategic direction change.

Measure design needs a better methodology

The reason that the quick and easy methods above are used to select measures is the same reason that performance measurement is a dreaded event: people have no idea that measure design is a process of dialogue around the goals or objectives they want to measure. It is a way for them to engage in a deeper understanding of the results they are really trying to achieve, and how they would be convinced as to the degree to which they have achieved those results.

And that's why we've created an alternative method to measure design. A method that produces measures which:

- are few, not prolific

- have been thoroughly tested for their relevance or strength in tracking the goal or result they are selected for

- are supported by the people that they will affect

- encourage learning and sharing of knowledge

- assist in discovering and focusing on the unique business strategy that best suits the organisation

- are cost-effective

- have as much historic data as feasible

- will take the organisation toward its vision, not drag it back into its past

- have resulted in people (including stakeholders) having a deeper and richer understanding of what results the organisation is really trying to create

- people understand and have ownership of

This method is based around five simple but deliberate steps:

1. Write down the goal or objective or result you want to achieve (and thus measure). Stay focused on this particular result while you are designing measures for it (don't let your attention wander to other results).

2. Describe this result in a lot more detail, explaining what you and others would see, hear, feel, be doing (or even taste and smell!) if that result were happening now, and explain the differences you would notice. This is listing the 'sensory descriptions' of your result. Do this until you have agreed what the result really means and what differences you will most likely notice between now and when the result is achieved.

3. Check if there are any unintended consequences of achieving this result, either positive or negative. Make sure it is a result you really do want to create before you bother measuring it.

4. Go back to the list you created at step 2 and list the potential things you could count or measure that would give you evidence of how much each of those sensory descriptions was actually occurring. For each potential measure you identify, give it a high, medium or low rating for how relevant it is to your result and a high, medium or low rating for how feasible it would be to measure it.

5. Use the high, medium and low ratings for your potential measures to shortlist to between 1 and 3 measures for this result. (A minimal sketch of how steps 4 and 5 might be applied follows the pros and cons below.)

This method, just like all the traditional methods, has its own pros and cons as well:

pros:

- It produces very few and very meaningful measures.

- Thinking about strategy and measures is challenged and tested.

- The feasibility of measures is assessed.

- It is a conscious and deliberate approach that people have control over.

- Measures strongly link to the specific strategic outcomes they are meant to track.

- The bigger picture is taken into account - and links and relationships with other measures are automatically identified (e.g. cause-effect, companion, etc.).

- Very collaborative because it is dialogue based, therefore high ownership results.

- Measures are not vague ideas but very evidence based.

- The measure is designed from the context of how it will be used.

- Can deliberately test the authenticity of each measure relative to its goal/objective.

- The type of language used to design the measures promotes a common and shared understanding of the result to be created (and this makes it easier to communicate strategy to all staff).

cons:

- It's not easy the first few times through - it's new and can potentially distract people if they forget what step they are doing and why (until they have been through it once or twice).

- It takes time - it needs good quality dialogue to build a deeply shared understanding of what is being measured.

- It will need training or resources to teach people how to do it and why to do it.

- Sometimes strategy needs to be altered (as this approach often deepens understanding of the implications or misunderstandings of the chosen strategy or its wording).
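To make steps 4 and 5 a little more concrete, here is a minimal sketch in Python. Everything in it is invented for illustration: the example result, the candidate measures and their high/medium/low ratings are assumptions, and the shortlisting rule (rank by relevance, then feasibility, drop low-relevance candidates, keep at most three) is just one reasonable way to apply the ratings, not a scoring scheme prescribed by the article.

RATING = {"high": 3, "medium": 2, "low": 1}

# Hypothetical candidate measures for an example result such as
# "customers get answers to their enquiries quickly".
# Each entry is (candidate measure, relevance rating, feasibility rating).
candidates = [
    ("average days to close a customer enquiry", "high", "high"),
    ("percentage of enquiries resolved at first contact", "high", "medium"),
    ("customer-reported satisfaction with response speed", "medium", "medium"),
    ("number of enquiries received per month", "low", "high"),
    ("count of process improvement actions completed", "low", "low"),
]

def shortlist(candidates, max_measures=3):
    """Rank candidates by relevance, then feasibility, and keep at most
    max_measures of them (the article suggests between 1 and 3)."""
    ranked = sorted(
        candidates,
        key=lambda c: (RATING[c[1]], RATING[c[2]]),
        reverse=True,
    )
    # Drop anything rated low on relevance - it is weak evidence of the result.
    ranked = [c for c in ranked if RATING[c[1]] > RATING["low"]]
    return ranked[:max_measures]

if __name__ == "__main__":
    for measure, relevance, feasibility in shortlist(candidates):
        print(f"{measure} (relevance: {relevance}, feasibility: {feasibility})")

The point of the sketch is only that the ratings from step 4 give an explicit, shared basis for the shortlisting in step 5, rather than a vote or a gut feel.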

So, you no longer need to feel compelled to take the easy way out of selecting performance measures, and in five simple steps, you can not only design useful and usable measures, but also deepen your understanding of the results you are trying to achieve!

About the Author
Stacey Barr is a specialist in organisational performance measurement, helping people get the kind of data and information that tells them how well their business is performing, and how to make it perform better. Visit Stacey's website to sign up for her free newsletter: http://www.staceybarr.com
