Thanks for visiting - feel free to edit this home page.
Wiki pages I opened:
Lots of people I know love estimation and scheduling patterns and search for the One True Estimation Technique. I'm kinda going off mega-accurate ones and thinking of writing TaskSchedulingUsingZipfsLaw.
(Old rant but I'll leave it for a while) Real/IdealProgrammingTime flaws:
LoadFactor boils three concepts down into one number: programmer availability (the ratio of time actually spent on the job, as distinct from being on holiday or at the dentist), programmer efficiency (the ratio of job time actually spent programming, as distinct from other development tasks or PMing the computer), and estimation error.
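To see the conflation, here's a sketch in Python. The split into availability x efficiency x estimation error is my reading of the number, not an official LoadFactor definition, and all the figures are made up:

  # Sketch: one LoadFactor-style multiplier standing in for three
  # distinct measurements. All numbers are hypothetical.
  availability = 0.8       # fraction of paid hours actually on the job
  efficiency = 0.5         # fraction of job hours actually spent programming
  estimation_error = 1.25  # average of actuals / estimates over past tasks

  # One opaque number hiding three separate questions:
  load_factor = estimation_error / (availability * efficiency)

  ideal_days = 4
  real_days = ideal_days * load_factor
  print(f"load factor {load_factor:.2f}: {ideal_days} ideal days -> {real_days:.1f} real days")

A change in any one of the three moves the multiplier, and you can't tell from the multiplier alone which one moved.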
The only "real" measure of programming time is actual programming time. The usefulness of RealProgrammingTime? is limited to determining deployment schedules. Highly regarded by business, but that doesn't make it a true measure of work/effort.
Estimates should be compared to actuals. AIM Software are now logging programmer time so we can determine actual programming time. I can't yet say I recommend it, but it's the only method I can think of.
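The logging needn't be heavyweight. A minimal sketch (the file format and task name are hypothetical, not what AIM Software actually does):

  # Sketch: append-only programmer time log, one line per programming
  # session. Analysis happens elsewhere. Format is made up.
  import time

  def log_session(task, start, end, path="timelog.txt"):
      with open(path, "a") as f:
          f.write(f"{task}\t{start:.0f}\t{end:.0f}\t{end - start:.0f}\n")

  start = time.time()
  # ... the actual programming happens here ...
  log_session("WIDGET-42", start, time.time())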
The "gut-feel" or "one-minute" estimation technique used in IdealProgrammingTime may be good enough for prioritising, but the lack of an algorithm/audit trail leaves the result a little vaporous / unsubstantiated.
FunctionPointAnalysis is a little whiffy too. Where did all those factors come from? Show me their derivations.
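To make the complaint concrete, here's a sketch of a function point count using the published IFPUG complexity weights. The counts below are hypothetical; the weights are the standard table values, whose derivation is exactly what's never shown:

  # Sketch of an unadjusted function point count. Why is an average
  # external input worth 4 and not 5? The tables don't say.
  WEIGHTS = {  # function type -> (simple, average, complex)
      "external_input":   (3, 4, 6),
      "external_output":  (4, 5, 7),
      "external_inquiry": (3, 4, 6),
      "internal_file":    (7, 10, 15),
      "external_file":    (5, 7, 10),
  }

  counts = {"external_input": (2, 5, 1), "internal_file": (0, 3, 0)}  # hypothetical system

  ufp = sum(c * w
            for name, per_level in counts.items()
            for c, w in zip(per_level, WEIGHTS[name]))

  # The value adjustment factor piles on 14 more unexplained ratings (0-5 each):
  gsc_total = 30                 # hypothetical sum of the 14 ratings
  vaf = 0.65 + 0.01 * gsc_total  # another pair of magic constants
  print(f"{ufp} unadjusted FP x {vaf} = {ufp * vaf:.1f} adjusted FP")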
It must be simpler. Take an engineering task. Write down the estimate up front and, when the task is done, the actual programming time.
Collect this data for a number of engineering tasks. Hopefully some coherence will emerge.
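A sketch of what that could look like, assuming each task boils down to an (estimate, actual) pair in the same units (the data below is made up):

  # Sketch: given (estimate, actual) pairs for past tasks, look for
  # coherence in the estimate-to-actual ratio.
  from statistics import mean, stdev

  tasks = [  # (estimated days, actual programming days), hypothetical
      (2, 5), (1, 2), (3, 8), (5, 11), (1, 3),
  ]

  ratios = [actual / est for est, actual in tasks]
  print(f"mean actual/estimate: {mean(ratios):.2f}")
  print(f"spread (stdev):       {stdev(ratios):.2f}")
  # A small spread suggests a stable personal multiplier; a large one
  # suggests the estimates carry little information.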
Come to think of it, RCS has tons of this data. It lacks the actual programming time, but it does have the (shock horror) Real time.
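Revision dates fall out of rlog; a sketch (assumes a checked-in file with an RCS ,v archive, and note the date format varies between RCS versions):

  # Sketch: wall-clock ("real") time span of a file's history, scraped
  # from rlog output. Both common rlog date formats sort correctly as
  # plain strings, so min/max work without parsing.
  import re, subprocess

  out = subprocess.run(["rlog", "foo.c"], capture_output=True, text=True).stdout
  dates = re.findall(r"^date: ([^;]+);", out, re.MULTILINE)
  if dates:
      print(f"{len(dates)} revisions, first {min(dates)}, last {max(dates)}")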
mailto:matt@omnidigm.com http://www.omnidigm.com/about/matt-rickard.shtml