AlistairCockburn inquires about load factor:
The model isn't right for two reasons I see so far. Your use of the 2.5 load factor isn't right. You don't just multiply a programmer's active programming hours by 2.5 to get XP clock hours. I don't know what is right, but the number 2.5 doesn't have that particular significance.
As you carefully explained to me, 2.5 is JustaNumber? (what happened to that page on wiki?). It has no significance, you keep telling me. Its only use is that when one of your people writes on the board that a task will take 1.5 PerfectEngineeringDays?, you expect it will take 1.5 * 2.5 real days.
I believe the math does work backwards, because we measure three things, not just two. Recall that we collect the following numbers for each task: the developer's estimate in ideal days before the task starts, the developer's report of ideal days actually spent once it is done, and the elapsed calendar days we observe.
Therefore, developers are estimating something, and reporting consistently on the same thing. I don't know what it is. I call it IdealProgrammingTime. Whatever it is, the ratio between it and elapsed time is 2.5.
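To make the three measurements concrete, here's a minimal sketch in Python (not C3's actual tooling; the record fields and function names here are my own):

 from dataclasses import dataclass

 @dataclass
 class TaskRecord:
     estimated_ideal_days: float  # estimate written on the board before starting
     reported_ideal_days: float   # developer's report after finishing
     elapsed_days: float          # calendar time the tracker observes

 def load_factor(task):
     # Ratio of elapsed time to reported ideal time: the 2.5 above.
     return task.elapsed_days / task.reported_ideal_days

 # The scenario below: estimate 2, report 2, observe 5 elapsed days.
 task = TaskRecord(estimated_ideal_days=2, reported_ideal_days=2, elapsed_days=5)
 print(load_factor(task))  # 2.5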
Here's a scenario. I ask Programmer, "How long will this take?" Programmer says, "Two ideal days."
Task is done. I say to Programmer, "How many ideal days did it take?" Programmer thinks back, says "Two."
TimMackinnon asks: When would the programmer say anything other than 2? Is the second figure the actual time it took me, disregarding other interruptions (making it less than clock time)? I find this example confusing because the programmer answers two each time. If instead I answered 3 afterwards, how would you use this number? This seems to imply the load factor would be 5/3, but 5/3 * 2 (the initial estimate) doesn't give you how long the task took, so I don't see how it's useful. In my mind, just measuring ideal and clocked time gives you a useful number.
I also observe Programmer. Programmer takes 5 real days. Happens all the time. I divide 5 by 2, get 2.5, call it LoadFactor. Can't explain it, I just observe it. It's JustaNumber?.
Thought experiment: I go on vacation. Whiteboard is erased. Another observer tells me it took 2.5 real days to do X. What was Programmer's estimate on the whiteboard? 2.5 / 2.5 = 1.0. Unless Programmer's stats have changed, she estimated 1.0 ideal days.
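In code, the thought experiment is just the same division run in reverse (a sketch; assuming the measured 2.5 holds for this programmer):

 LOAD_FACTOR = 2.5  # measured ratio, assumed stable for this programmer

 def infer_estimate(elapsed_days):
     # Run the division backwards to recover the whiteboard estimate.
     return elapsed_days / LOAD_FACTOR

 print(infer_estimate(2.5))  # 1.0 ideal day, as in the thought experiment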
What am I missing?
I can't help thinking that there's some sort of implicit averaging going on here. Either that or there is something special about the C3 environment. In one of the examples above, the programmer estimates 2 ideal days, which multiplies up to a week, exactly a factor of 2.5. Now, where I work there are plenty of cases of "interruptions" which take half a day, and sometimes a whole day or more. Other times, I might really get into "deep hack mode" and work pretty much solidly for several days. That same 2 days of work might elapse to 4 days, or 8.
The directly measured load factor for these days or weeks is wildly different. A week where I'm on a course might have an infinite load factor (no real work done), whereas a week where all the pointy-haired managers are on a course might be much lower :-) A day's leave during a short task can seriously perturb the load factor.
So what averaging techniques and periods does C3 use, or does it really claim to use small ideal-time estimates directly multiplied by load factor as real elapsed time?
-- FrankCarver
Folks are reading too much into the load factor. It doesn't need explaining, it needs measuring. "When I say (after due consideration) it will take a day, it ends up taking 3 days." I don't care where the time goes. All I care about is that the ratio is basically stable when measured over a couple of weeks' span, and that I can keep tracking it to see if it slips over time. -- KentBeck
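One way to read "measured over a couple of weeks' span" is as a windowed aggregate rather than a per-task ratio. A sketch of that reading follows (the window size, data shape, and function name are my assumptions, not a description of C3's practice):

 def windowed_load_factor(tasks, window=10):
     # Total elapsed time over the last `window` tasks, divided by
     # total reported ideal time. Single-day perturbations (leave,
     # courses) wash out in the sums.
     recent = tasks[-window:]
     total_ideal = sum(ideal for ideal, _ in recent)
     total_elapsed = sum(elapsed for _, elapsed in recent)
     return total_elapsed / total_ideal

 # (reported_ideal_days, elapsed_days) pairs over roughly two weeks
 history = [(2, 5), (1, 3), (0.5, 1), (1.5, 4), (2, 4.5)]
 print(windowed_load_factor(history))  # 2.5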
See also EnvironmentAndLoadFactor, LoadFactorInEstimatingOtherProjects
... it needs measuring. Kent, please see and comment on: MeasuringProjectVelocity.