Performance Indicators

You are charged with the responsibility of creating a cool e-commerce company's Key Performance Indicators for its developer community.

After recovering from your stint in hospital that resulted from throwing yourself out of the fourth-floor window, you realize that perhaps it's not impossible to come up with some soft/hard indicators, given a little thought.

What do you come up with?


I owe the Wiki a longer discussion of performance measurement and evaluation, but don't have the time right now. So, this will have to do initially (from my course notes). Note this is about any measure of performance, not necessarily just appraisal of staff work:

Benefits of performance indicators (Smith)

Good sets of performance indicators should be (Carter)

A performance measurement system (IPM)

Roles of performance indicators (Mayston)

Unintended consequences of performance indicator schemes (Smith, same one as above):
  1. tunnel vision- ignore aspects of performance not covered by the performance indicators
  2. sub-optimisation- pursuit of narrow, local objectives rather than global ones
  3. myopia- neglect of long-term objectives
  4. measure fixation- concentration on the measure, not the underlying objective (e.g. lines of code)
  5. misrepresentation- creative reporting or fraud
  6. misinterpretation- bias, complex measurements, complex processes make interpretation difficult
  7. gaming- use of deliberate underperformance or setting of too easy/too difficult targets for other purposes
  8. ossification- continual review needed to keep them relevant, but then there's the effort of updating them and the loss of comparability over time

Mitigation of unintended consequences (Smith again):
  1. involve staff at all levels in the definition of the performance indicators relevant to them
  2. retain flexibility of use (don't make the job just meeting targets)
  3. review performance indicator system constantly
  4. try to quantify all objectives
  5. measure satisfaction of all involved with performance indicators
  6. seek expert interpretation of performance indicator scheme
  7. maintain careful audit of performance indicator system
  8. nurture long-term career perspectives
  9. use only a few indicators
  10. develop benchmarks independent of past activity

-- PaulHudson


Your first instincts were correct. Trying to develop these mystical numbers is pointless. Read some WilliamEdwardsDeming for a more detailed analysis. If you are being pushed into this by upper management, I'd suggest you copy a kindergarten report card and use things like: plays well with others, follows directions, etc.

Ironic, since Deming was a proponent of really understanding what you measure (about your process, anyway). Kind of difficult to use StatisticalProcessControl without any statistics.
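To make the Deming point concrete: StatisticalProcessControl means putting control limits around a process measure and only reacting to points that fall outside them. Here's a minimal sketch in Python of an individuals/moving-range style chart; the metric (weekly escaped-defect counts) and the data are purely illustrative assumptions, not anything proposed above:

  # Minimal sketch of a Shewhart-style individuals control chart.
  # The metric (weekly escaped-defect counts) and the numbers are illustrative only.
  def control_limits(samples):
      """Return (mean, lower, upper) limits estimated from moving ranges."""
      mean = sum(samples) / len(samples)
      moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
      sigma_est = (sum(moving_ranges) / len(moving_ranges)) / 1.128  # d2 constant for n=2
      return mean, mean - 3 * sigma_est, mean + 3 * sigma_est

  weekly_defects = [4, 7, 5, 6, 12, 5, 4, 8, 6, 5]
  mean, lcl, ucl = control_limits(weekly_defects)
  for week, count in enumerate(weekly_defects, start=1):
      status = "special cause?" if not (lcl <= count <= ucl) else "common-cause variation"
      print(f"week {week}: {count} ({status})")

The point isn't this particular metric; it's that without some such statistical treatment you can't tell signal from noise, which is what Deming was on about.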

"Runs with Scissors"!


The first thing you do (well, maybe the second if your first response was to update your resume) is to understand why you're being asked to do this. Possibilities range from "we want to set some ideals for our developers to strive towards" to "we've got a cash flow crunch coming, and need to be ready to lay off our under-performers."

Next up is to help your management clarify what they mean by "performance." Too often, management fails to distinguish "effort" from "effectiveness", leading to such silliness as using "lines of code produced" as a performance metric, or looking for whose car is in the parking lot at 8:30am.
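To see why "lines of code produced" confuses effort with effectiveness, consider two functionally equivalent bits of Python (a hypothetical illustration, not from anyone's actual codebase); a raw line-count metric rewards the worse one:

  # Two functionally equivalent implementations. A naive "lines of code produced"
  # metric scores the second roughly four times higher, which is exactly the
  # effort/effectiveness confusion described above. Purely illustrative.
  def total_concise(prices):
      return sum(prices)

  def total_verbose(prices):
      total = 0
      index = 0
      while index < len(prices):
          total = total + prices[index]
          index = index + 1
      return total

  assert total_concise([1, 2, 3]) == total_verbose([1, 2, 3]) == 6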

Then help your management clarify whether they're looking for good overall group performance metrics, or truly looking for individual metrics. Do they want a team that collaborates excellently, or a group of people competing among themselves for a slice of the goodies?

Then, depending on management's clarified goals, update your resume. :) -- DaveSmith


Apart from some very simple jobs, people are far more complex and multidimensional than machines, and I believe the performance-measurement ideas now gaining popularity are flawed and lead to the opposite of their objectives.

Except for some very simple jobs, it is very hard to come up with any measures of performance. Historically, the first attempt at performance assessment imposed a supervisor's view of what a subordinate should be doing. This then improved to an attempt to get the supervisor and subordinate to agree on a set of goals or objectives. This also fails, since the world changes so rapidly that what might have been appropriate at the start of a period can be, and often is, detrimental at the end. In the multidimensional organizations we have today, it is far from clear that the supervisor's assessment of what is important for the (large) organization is any better or worse than the subordinate's, even when the supervisor is on the Board. Any attempt to assess or measure employee performance using the old technique of cliques of supervisors meeting in secret in dark rooms is no longer regarded as valid by most employees, and is a source of much resentment, cynicism and dissatisfaction.

Coupled with this is the problem that apparent performance often depends more on things outside the employee's control than on whether they did good work. A chemist can only develop new compounds (costing the company money), while a salesman can win orders worth millions; yet who can say whose contribution is greater, when the compound the chemist develops could be worth billions in years to come? The same question can be asked of two people at the same level working on two different projects: one project may, by its nature, be easier to achieve than the other, so using success or achievement of goals as a measure is fundamentally unfair.

But is there a better way of measuring performance?

I have the idea of democratic employee performance measurement. Ask each person to list a number (say 9) of people who they think have made a significant contribution to the company. Do this 2-4 times a year. Those who get the most votes will be judged by their peers and colleagues to be making a greater contribution. This method is democratic, open, and relatively free from bias.
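The mechanics of the tally are trivial; the hard part is everything around it. Something like the following Python sketch (the ballot structure, names and the max_names limit are made-up assumptions for illustration, not a worked-out HR process) is all the counting amounts to:

  from collections import Counter

  # Minimal sketch of the peer-vote tally described above.
  def tally(ballots, max_names=9):
      counts = Counter()
      for voter, names in ballots.items():
          valid = [n for n in names if n != voter][:max_names]  # no self-votes, at most max_names
          counts.update(set(valid))  # at most one vote per colleague per ballot
      return counts.most_common()

  ballots = {
      "alice": ["bob", "carol", "dave"],
      "bob":   ["carol", "alice"],
      "carol": ["alice", "bob", "erin"],
      "dave":  ["carol"],
  }
  for name, votes in tally(ballots):
      print(f"{name}: {votes} vote(s)")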

This sounds intrinsically biased, because the voting will reflect how well known the people are. Someone in the back room who may be doing a lot will not get votes, while someone in contact with a lot of people doing similar work would get significantly more.


I think the discussion and proposal above are very confused. What reason for measuring performance are you assuming in the above discussion? Why is identifying the top N contributors to Pfizer a good performance measurement? If I get 123 votes, have I really made a bigger contribution than the guy with 112, or am I just more visible or a better politician? It's not going to help Pfizer's other tens of thousands of employees work out how to do a better job; most of them will have no chance of being among the top contributors overall, but they may well be performing outstandingly in their current role (e.g. a purchasing administrator. Important that they do a good job? Yes. Top contributor to Pfizer's success? No).

Good performance indicators will tell the developer something about how well they are performing in ways that are relevant to company objectives.


I know that I can't be the only person to have had this idea. This idea must surely have already been examined and the results must be available in the HR literature. Anyone know where?


IMHO: I can't see how, in a team PairProgramming XP environment, you could measure individual performance. -- PaulGrew

On page (... mental blank... where on Wiki was that? Something about passwords ...), someone suggested dealing with pairs as if they were individuals. Perhaps you could measure pair performance using the same sorts of metrics that, in other environments, would be used to measure individual performance.

Also what's the point of setting SmartObjectives for XP teams when the work is scoped through signing up for stories?

-- PaulGrew


See also ManagersLikeMeasurements

CategoryMetrics

