Distributing expertise among project members is a difficult problem.
The TruckNumber is the size of the smallest set of people in a project such that, if one of them got hit by a truck, the project would be in trouble.
[Many people think this definition is broken; they're right. See TruckNumberFixed for alternatives. -- GeorgePaci] [EditHint: the history of the broken definition is worth preserving, but probably not on this page.]
Some projects depend on a sole IdeaChampion to provide the vision, coordination, and architectural knowledge that makes the project fly. This makes it possible to build the project with lots of PlugCompatibleInterchangeableEngineers. But the quality of the project depends solely on the skills of the IdeaChampion, and the project is in jeopardy if it loses the IdeaChampion. The TruckNumber is unity.
Other projects spread the knowledge out through quick training or mentoring programs, or somewhat arbitrarily assign accountability without allocating the corresponding responsibility. This creates a team where accountable individuals are unable to carry out their responsibilities, because the problem is not within their control. This doesn't really change the TruckNumber, but it is nonetheless a dysfunctional state of affairs.
Yet other projects depend on a large number of experts, to the degree that everyone on the project is an expert. Each project member is narrow, knowing only their area of expertise. The TruckNumber is large, almost as large as the number of staff on the project. If any of the experts are lost, the project is in jeopardy, because a critical area of expertise is lost.
Therefore: Manage the TruckNumber carefully, depending on at most a few (about 5) experts in the project critical path. Reward these experts well to minimize the risk of having them leave. Use long-term mentoring under an apprenticeship program to develop backup experts (see PairProgramming).
SmallTeams? is a derivative pattern (though it may not seem so at first).
-- JimCoplien
(There seem to be many closely related pages, but they each have a different color. My feeling is that I would have found this mostly definitive page more easily if it had many pointers, but it's hardly tidy.)
Another sort-of alias: BusSensitive, as in: what happens if one of the developers is hit by a bus?
I don't understand how the TruckNumber is large in the "large number of experts" example above. Isn't that another example where the TruckNumber is one?
Forbidding more than a certain number of key people from travelling together isn't necessarily a poor response. A local company got clobbered several years ago when most of their executive staff were on the same, ill-fated flight.
Failing to compensate key people for such restrictions may be a poor response. I've worked for a company that shut off vacation during a critical period, then picked up some of the folks' vacation travel expenses when the project was complete.
-- DaveSmith
I find this definition slightly counter-intuitive - the smallest group of which any one member getting mown down would jeopardize the project. The third example, for instance - a large number of people, all vital - felt to me like it should have a low TruckNumber. Perhaps I should be proposing a slightly different metric - the smallest number of people who can be run over before the project is threatened (the Smallest Bus Queue Accident, perhaps).
In this case the IdeaChampion setup has a Smallest Bus Queue Accident of 1, as does the large number of narrow-field experts. Hopefully, a reasonably competent small team would have a Smallest Bus Queue Accident of 4 or 5. With this, the larger the value the better - as opposed to the TruckNumber's more Gaussian (or Poisson) distribution.
Does this help?
-- JezHiggins
So the BiggestBusQueueAccident? would leave you with the smallest number (or set) of people who could run the project by themselves?
-- OliverKamps
I think there is a definition problem with TruckNumber. The page at http://c2.com/cgi/wiki?DoInspections states that one of the forces in play is the desire of a group to INCREASE the TruckNumber. Why would I want to increase the size of the set of people where, if any ONE member of the set is killed, the project is jeopardized? What I want to increase is the size of the smallest set of people such that the project is jeopardized if, and only if, ALL of the members of the set are killed. The page at http://c2.com/cgi/wiki?DoInspections refers to TruckNumber as if it were the Smallest Bus Queue Accident.
-- John Washburn
I'm having trouble deciphering some of the comments above but I think there are two issues mixed together here.
The first issue is the smallest group of people on a project who know a critical piece (or set) of information. The number of people in this group might be the Smallest Bus Queue Accident number.
The second issue is the total number of groups which have sole access to some critical project information. This number is, I believe, the TruckNumber JimCoplien was describing.
A 'perfect' project would have a TruckNumber of 1 and a Smallest Bus Queue Accident number equal to the team size, i.e. everyone on the project would know all of the critical information.
People with real lives can stop reading here...
For the rest of you:
At first I thought the TruckNumber and Smallest Bus Queue Accident numbers were inverses of each other, but I later realized that the TruckNumber can actually exceed the total team size, while the Smallest Bus Queue Accident number is limited to the team size ('proof' in the form of vigorous handwaving is available on demand). This counterintuitive (to me) fact makes me doubt how useful the TruckNumber is by itself.
I think the proper determination of project risk would be the sum over all PossibleBusQueueAccident? groups of the likelihood of such an accident multiplied by the TruckNumber for that group size. As an engineer, I would rather state that as a formula, but I don't know enough HTML to take a decent stab at writing it (with Sigmas, etc.).
-- JeffShelby
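In the Sigma notation JeffShelby was after, his risk measure reads roughly as follows. This is my rendering, not his; in particular, treating each possible accident as a subset G of the team is my reading of his description:

    \mathrm{Risk} \;=\; \sum_{G \subseteq \mathrm{Team}} P(\text{accident takes out } G) \cdot \mathrm{TN}(|G|)

where TN(n) is his TruckNumber for accident groups of size n.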
Given the definition above, assume that a project has a truck number greater than 1. Then there is a set {x,y,...} such that when any one of these people is run over by a truck, the project is in trouble. So the project is in trouble when x alone is run over. Hence {x} is a smaller set with the above-mentioned property, and the truck number is 1. Contradiction. The truck number is at most 1.
If you substitute 'largest set', then at least a low truck number would indicate a good project; ideally it would be 0. But I like the Smallest Bus Queue Accident metric.
-- OlafKummer
Disagree with Olaf. If I have a project team of 10 and I have 4 key people, my TN=4. The probability of the project becoming "in trouble" is greater than if TN<4. Given the scenario above, it is true that the project is in trouble when x is mauled, but also if y OR z OR any other of the TN is flattened. The higher the TN, the more the project is at risk. The variable not being discussed is the "InTroubleCoefficient?" (ITC). That is, if I have 6 key people (TN=6) and one of them is toasted, then the project is probably "less" in trouble than if I have a single key resource (ITC@TN=6 < ITC@TN=1). I suppose the ITC would be inversely proportional to the ratio of key people (TN) to total people (TP), viz. ITC = 1/(TN/TP) = TP/TN - so with TP=10, ITC is 10 at TN=1 but only about 1.7 at TN=6.
-- Mike Ammerman
JeffShelby and OlafKummer, above, are correct: the definition at the start of this page is broken.
You can lose in two different ways: by having "very critical" people (i.e., people who would destroy the project if they were hit by a truck), or by having many critical people (i.e., many people who would harm the project badly if they were hit by a truck).
Statements like "I have 4 key people" are ambiguous. Meaning 1: I have four people each of whom is key - i.e., if I lose any of them then the project is history. Meaning 2: I have four people who, collectively, are key - i.e., if I lose all of them then the project is history. Obviously the first situation is much worse than the second.
I think JimCoplien is trying to suggest that both narrowly concentrated expertise and widely distributed expertise are dangerous. That's sort of correct, but they're dangerous in different ways and I don't think it's a matter of a single parameter needing to be kept in the middle of its range.
The ideal is to have all expertise distributed so that the project would not be wrecked by the departure or demise of any small group of people. This is usually hard to achieve: one person can't be expert on everything unless "everything" is very limited.
Since some folks above are getting mathematical about this, I propose the following rough-and-ready measure of how much trouble you're in. Suppose some disease strikes your team, with a probability p of killing each member (independently). Obviously, as p increases so does the probability D(p) that you're doomed. The value of p at which D(p)=0.5 measures how robust your project is. Call it R. (An alternative would be to look at D(p) for some fixed, rather small, p.)
You can calculate D(p), provided your team is small, as follows. For each n, count the number of size-n sets of people whose disappearance would mean doom. (Include sets obtained by taking smaller "critical" sets and adding new people to them.) Multiply by p^n(1-p)^(N-n), where N is the total size of your team, and sum over n. You can stop early once (say) C(N,n)*p^n is negligible, since there are only C(N,n) sets of size n and each contributes at most p^n.
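Here is a minimal brute-force sketch of that calculation in Python (my code, not part of the original discussion; the team and the dooms predicate are placeholders you'd supply). For a monotone predicate - one where supersets of critical sets also spell doom, as the parenthetical above requires - D(p) is nondecreasing in p, so R can be located by bisection:

    from itertools import combinations

    def doom_probability(team, dooms, p):
        # D(p): chance the project is doomed when each member
        # independently dies with probability p.
        n = len(team)
        return sum(p**k * (1 - p)**(n - k)
                   for k in range(n + 1)
                   for dead in combinations(team, k)
                   if dooms(frozenset(dead)))

    def robustness(team, dooms, tol=1e-6):
        # R: the p at which D(p) = 0.5, found by bisection.
        lo, hi = 0.0, 1.0
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if doom_probability(team, dooms, mid) < 0.5:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

This enumerates all 2^N subsets rather than stopping early, which is fine for the small teams the paragraph assumes.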
Easy theorem: Suppose you've got two projects. One of them depends on n people "jointly" - i.e., if they all leave, you're doomed. The other depends on n people "severally" - i.e., if any of them leaves, you're doomed. Then the R-value for the first project is the n'th root of 0.5, and the R-value for the second project is 1 minus that. For instance, if n=4 then the values are about 0.84 and 0.16.
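Checking the easy theorem against the sketch above with n = 4 (team names are placeholders):

    team = ['w', 'x', 'y', 'z']
    jointly   = lambda dead: dead >= set(team)  # doomed only if all four go
    severally = lambda dead: len(dead) >= 1     # doomed if any one goes
    print(robustness(team, jointly))    # ~0.841, i.e. 0.5 ** (1/4)
    print(robustness(team, severally))  # ~0.159, i.e. 1 - 0.5 ** (1/4)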
This notion also enables you (sort of) to answer questions like: Suppose we have the choice between having a single critical expert, or 3 "sub-critical" experts of whom we need to keep 2. Which is better? Answer: the first one loses with probability p; the second loses when at least two of the three are struck, with probability 3p^2 - 2p^3. In the first case R is 0.5; in the second, 3p^2 - 2p^3 = 0.5 exactly at p = 0.5, so R is 0.5 there too. By the R measure it's a tie, but for any small p the three sub-critical experts are safer, since 3p^2 - 2p^3 < p whenever p < 1/2.
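That arithmetic is easy to confirm with the same sketch:

    experts = ['a', 'b', 'c']
    two_of_three = lambda dead: len(dead) >= 2       # can't keep 2 alive
    print(robustness(experts, two_of_three))             # 0.5, tying the single expert
    print(doom_probability(experts, two_of_three, 0.1))  # 0.028, vs. 0.1 for one expert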
But this is probably all much too complicated.
That is very interesting, actually. You wind up with a curve of the probability of the disease killing each person vs. the probability that your project is doomed. I'd like to see how the graphs of different team compositions look.
Once you have that graph, compare it to average staff turnover and the expected duration of the project. Given that the project will run X months, what's the probability that a team member will have to leave the project for reasons beyond anyone's control? What, then, is the expected disaster-proneness of the project?
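Reusing the sketch above, here is a back-of-envelope version of that calculation (the turnover model and the numbers are assumptions for illustration, not anything measured): if each member leaves in a given month with independent probability r, then over an X-month project p = 1 - (1-r)^X, and D(p) is the disaster-proneness:

    r, X = 0.02, 12        # hypothetical: 2% monthly turnover, 12-month project
    p = 1 - (1 - r) ** X   # ~0.215 chance any given member departs
    print(doom_probability(team, severally, p))  # ~0.62 -- any loss dooms us
    print(doom_probability(team, jointly, p))    # ~0.002 -- need all four gone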
See: TruckNumberFixed, TruckNumberComposite, MoreFunWithTruckNumbers, BusNumber, BusTest, SuccessionPlanning, StefMurkyNumber
Compare: LotteryNumber, VacationNumber, MappingStaffToRoles