I would also challenge XP productivity by examining the code churn rate. Every time the code is made cleaner and easier to read, no function is added; for a large fraction of the time, the programmer is changing code without producing new function. That is the equivalent of documentation time for other programmers. --AlistairCockburn
See RefactoringIsntOverhead. C3 productivity (perhaps not the same as XP productivity) measures function produced over time, compared to estimates. There's not much to challenge, unless one wanted to suggest that we'd go even faster if we didn't refactor. For that, see GoFasterWithRefactoring.
What you call time to code new function involves rewriting old function. Not that you have to; you just choose to. That is "wasted time" according to my penny-pinching management. But it comes with XP, just like documenting comes with other work styles. So refactoring time reads to me like a substitute for documenting time. In fact I wrote this on another page somewhere and you agreed. Where I am going with this is that a model of productivity for XP should take into account the time spent refactoring. --AlistairCockburn
See RefactoringIsaRequirement.
I think I see Alistair's point (without necessarily agreeing with it). If you come across some unclear code, you have a choice: refactor it or annotate it. Both will bring the code closer to the quality requirements. Annotation isn't overhead, because it has a genuine purpose and role in maintainable software.
Of course, we've argued elsewhere that when both options are possible, refactoring is better (because it reduces redundancy while documentation increases it). OK - but that doesn't mean it's fair to call documentation "overhead". It's just a less good way of achieving the same aims as refactoring. -- DaveHarris
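A minimal, hypothetical sketch of that choice (the Employee and bonus-rule names are invented for this page, not taken from C3 or anywhere else): given an unclear condition, you can restate its intent in a comment, or extract it into a method whose name carries the intent.

 class Employee {
     int daysEmployed;
     int rating;
 }

 class BonusPolicy {
     // Option 1: annotate -- the comment restates what the condition means.
     void payWithComment(Employee e) {
         // Eligible for a bonus: employed more than two years with a good rating.
         if (e.daysEmployed > 365 * 2 && e.rating > 3) {
             grantBonus(e);
         }
     }

     // Option 2: refactor -- extract a method whose name says the same thing,
     // so the explanation lives in one place and cannot drift out of date.
     void payWithExtractedMethod(Employee e) {
         if (isEligibleForBonus(e)) {
             grantBonus(e);
         }
     }

     private boolean isEligibleForBonus(Employee e) {
         return e.daysEmployed > 365 * 2 && e.rating > 3;
     }

     private void grantBonus(Employee e) {
         // payment details omitted
     }
 }

Both versions meet the readability goal; they differ in where the explanation lives, which is where the redundancy argument above comes in.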
(note from AlistairCockburn: Dave's comment presupposes that reducing redundancy is "better". The DesignByContract page argues that for finding errors, increasing redundancy is better.) I think it depends on the type of redundancy. In DBC there is redundant execution, but it maximizes coverage. DBC by itself was designed to decrease the redundancy of error checking in code. If you have a contract, one party checks in code and the other party has to live up to it. -- MichaelFeathers
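A rough sketch of Michael's distinction, in made-up code (Java asserts stand in for Eiffel-style require/ensure here; this is not from any real DBC library): the contract redundantly restates what the code is supposed to guarantee and is executed on every call, but it removes the need for each client to scatter its own defensive checks around the supplier.

 class Account {
     private int balance;

     // Contract: callers must not withdraw a negative amount or more than the
     // balance. The supplier states and checks this in one place, instead of
     // every caller re-checking defensively before the call.
     void withdraw(int amount) {
         assert amount >= 0 && amount <= balance : "precondition violated by caller";
         balance -= amount;
         // Redundant restatement of what the method promises, executed on every call.
         assert balance >= 0 : "postcondition violated by supplier";
     }

     void deposit(int amount) {
         assert amount >= 0 : "precondition violated by caller";
         balance += amount;
     }
 }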
Coding to a quality level is itself a kind of overhead. You listen to the CodeSmells now because you hope it will make things easier later. At first sight this goes against YouArentGonnaNeedIt and the EconomicsOfXp. I think it's an example of the right defaults, as I mentioned on MethodsShouldBePublic. -- DaveHarris
One is clearly trying to find an optimum point in the middle of a curve. Coding for the moment with no cleanup is obviously bad, since it leads to an unmaintainable program and/or to long periods where you try, often fruitlessly, to improve the system so you can go forward. Designing and planning forever, followed by building for the ages is also bad, because time is spent now that may never be needed, even in the future. XP is saying, build only what you really need, but keep the system clean so that it can readily move in whatever direction the future tells you it must. --RonJeffries
Where I am going with this is that a model of productivity for XP should take into account the time spent refactoring.
It should, because the refactoring is part of the reason why the features are in the product at day D rather than day D+20 (or D-20, for that matter).
It shouldn't, because it could be taken by well-intentioned people on the project (and not only the customer or manager) as a variable you can play with, as in "we'll clean it later". Trying to do less refactoring for the same amount of product features could have a negative effect on the cost of coding the features, and on the defect rate, because refactoring is key to testability.
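To make the testability point concrete, a small invented example (not from C3): extracting a calculation out of a method that also does output is exactly the kind of refactoring that lets a test reach the rule directly, and exactly the kind of work that gets skipped when refactoring is treated as slack to cut.

 class Billing {
     // Before refactoring: the discount rule is buried in a method that also
     // prints, so a test can only reach it by capturing console output.
     void printInvoice(int total) {
         int discounted = total > 1000 ? total - total / 10 : total;
         System.out.println("Amount due: " + discounted);
     }

     // After refactoring: the rule is extracted and can be asserted on directly.
     int applyDiscount(int total) {
         return total > 1000 ? total - total / 10 : total;
     }

     void printInvoiceAfterRefactoring(int total) {
         System.out.println("Amount due: " + applyDiscount(total));
     }
 }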
Granted, we can be more or less efficient at refactoring. If refactoring and testing practices help us write valid code faster, what practices help us refactor faster? --ChristopheThibaut