from email correspondence ...
I saw BarryFazackerly? (a DSDM leader) give a brief talk about EnterpriseXp? at an Agile event in London. When the subject moved to certification, he was of the opinion that if XP did not certify its practitioners, it would die under the weight of "I did XP (without the customer or acceptance tests or pairing) and my project failed". Despite some pushback, he invoked GilbMeasurabilityPrinciple. -- AlanFrancis
I wonder then if we shouldn't certify projects instead of people? One way to do this is through self-assessment questionnaires. I'm thinking of the sort of thing one might find in a magazine: ten ways to know if your husband is cheating, but focused on the ten or twenty most common symptoms of XP's misapplication. Let's collect some sure signs of XP trouble. Once we agree that we've got good indicators, we will get some consultants to apply these tests and track actual outcomes. -- WardCunningham
Questions follow separated by horizontal rules. Please keep the discussion of each question with the question so that we can easily rearrange this page.
Monitor placement. Have people positioned their computer monitors such that people can work together? Examine every work position. Are two chairs available? Do keyboard and mouse cables reach both people? Do they sit side by side?
Score 1 if 90% of all positions are so configured, 0 otherwise.
Velocity awareness. Do people know how fast they are going? Ask everyone what their velocity was last iteration. Do they know? Are they sure? Can they explain how they know and why they are sure?
Score 2 if 100% of developers, customers, and managers know this number and agree, score 1 if the customer and 90% of developers agree, score 0 otherwise.
Scope management. What happens when there is too much to do by a fixed date?
Score 1 if you regularly turn to a business-oriented person in this situation and they choose what is to be done first.
On-site customer. When there's a question about what the requirements are, how long does it take to get an answer?
Score 2 if the answer is "always right away," 1 if the answer is "usually right away," and 0 otherwise.
Continuous design. When do you refactor?
Score 2 if the answer is "huh? constantly," 0 if the answer is "at the end of the iteration," and -50 if the answer is "never."
More continuous design. Ask the developers to sketch out a diagram showing the important classes and relationships in the design.
Score 2 if the diagrams all show the same design, 1 if 75% show the same design, and 0 otherwise.
Even more continuous design. Ask the developers where the biggest problems in the design are.
Score 2 if all developers list the same problems, 1 if they all list problems that touch on multiple classes, -2 if nobody thinks there are any problems, and 0 otherwise.
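As a rough sketch of how a consultant might tally the scores above (the question keys and example values here are illustrative, not part of the questionnaire itself):

```python
# Hypothetical tally of the self-assessment questions above.
# Keys and sample scores are illustrative; only the scoring
# ranges come from the questionnaire on this page.

def total_score(scores):
    """Sum per-question scores. Individual questions above range
    roughly from -50 (never refactoring) to 2."""
    return sum(scores.values())

assessment = {
    "monitor_placement": 1,       # 90% of positions allow pairing
    "velocity_awareness": 2,      # everyone knows and agrees
    "scope_management": 1,        # a business person chooses first
    "on_site_customer": 1,        # answers usually right away
    "continuous_design": 2,       # "huh? constantly"
    "shared_design_picture": 1,   # 75% draw the same design
    "known_design_problems": 0,
}

print(total_score(assessment))  # prints 8
```

Tracking such totals against actual project outcomes, as suggested above, would let us calibrate which questions are the good indicators.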
Areas for which objective measures are still needed.
Have you tried this test so far? How did you score?