Parkinson's Law

"Work expands to fill the time available for its completion."

Attributed to Prof. Cyril Northcote Parkinson, who coined it in a 1955 essay in The Economist.

http://www.heretical.com/miscella/parkinsl.html

http://en.wikipedia.org/wiki/Parkinson%27s_Law


This has a real application to software development.

Assume the best-managed project, with realistic task estimates worked out in collaboration with the developers. Every project planning method builds an assumption into its predictive calculations: that the task estimates are only estimates, and that we can expect some tasks to deliver sooner and others later. That's what we mean by "estimate."

The problem is that projects are neither structured nor managed to take advantage of early delivery. We still run software projects like construction projects, where early delivery, of concrete for instance, can be a real problem. On a contract job, there's the more tangible issue of unrealized billable hours.

And then there's human nature: if a particular task turns out to be easier than the developers predicted, the delivery estimate is easily met. If things go wrong, it's missed. There's no "sooner."

With only "on time" and "later" to choose from, we run headlong into not just Parkinson's Law, but the flaw of averages and Jensen's Inequality: on average, tasks are late and with a probability approaching certainty, the project is late.

This effect applies recursively, which is why, no matter how much contingency you put in the plan, it never seems to be enough.

This is a non-trivial challenge.

-- MarcThibault


One possible solution is to have a "strategic" plan that's not shared with the developers. The developers continue to work to the estimates they developed, eating up the backlog task by task. The PM's strategic plan, on the other hand, uses longer estimates that take the "no sooner than on time" cutoff into account.

The developers would usually be doing better than they think. I don't know what impact that would have. -- mt


Parkinson's Law Derivatives and Abuses:

"Data expands to fill the space available for storage."

A law that is so primal that we keep reinventing it (See MyersLaw)

I'm more familiar with "Work expands to fill the time available for its completion." -- RichardJensen

(Nice coincidence there.)

Alternatively, and more generally, "demand expands to exhaust the available supply".

The version I came up with, back when I expected to spend my life digging up mud for Winogradsky columns and preparing water-droplet slides, reads:

"For a given ecosystem, use of non-limiting resources increases until they become limiting resources, unless such increase is prevented by other limiting factors."

Feel free to substitute 'project' for 'ecosystem' as appropriate. - JayOsako

This "law" like pretty much every pithy "law" about computer systems that's ever come down the pike, is pretty much false.

When hard drives expanded beyond the megabyte range, did you see the total usage of text documents expand beyond, say, a hundred megabytes or so? No, you did not. What you saw was a new demand for images. And when storage went into the tens of gigabytes range, then you saw new demand for audio. At the hundreds of gigabytes range, we have new demand for video and computer games. Now, an idiot can think that this is some kind of magical process where new demand is created out of nowhere, but it isn't.

New demand represents pent up demand which has migrated to computer storage from other forms of storage. Text existed in books. Pictures and photos existed in magazines. It's cheaper to store them on a computer now than in a magazine. Music existed on vinyl records and then audio CDs, same story. Software existed on data CDs, same story. Movies exist on DVDs, same story again; as the new media advances, the old media is being displaced. Now, if you aren't an idiot, you'll ask yourself "what's the next general information source which can be transferred to electronic storage?" And the answer is this: there isn't one. That's it, this is the end of the road!

Now, that doesn't mean that storage can't evolve; there's plenty of room for it to evolve. But raw size of storage is not one of the dimensions along which it can evolve much longer. Hard disks can evolve to be more redundant: RAID and LoggingFileSystem use up space for increased reliability and speed. They can also evolve to have smaller form factors: tiny hard drives in iPods. Or we can even abandon hard drive technologies for much faster flash RAM: USB keys. What isn't going to happen is a vast expansion of storage capacity into the 1000 terabyte range. The economics for it are quite clear; there is no pent up consumer demand for this.

The notion that "demand fills up supply" also vastly oversimplifies a complex situation. In most cases, a drop in prices leads to vastly greater total expenditures. So when the price of coal was cut in half due to new technology way back when, the total usage of coal quadrupled, and the total expenditure on coal actually doubled. This is very different from a simple one-to-one relation of data expanding to fill whatever supply becomes available. Rather, it's more like expanding supply (at constant prices, hence decreasing price per kilobyte) actually forces the creation of new demand.
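A quick arithmetic check of the coal figures quoted above (halved price, quadrupled usage), just to make the point explicit:

 old_price, old_usage = 1.0, 1.0             # normalized to 1 before the new technology
 new_price, new_usage = old_price / 2, old_usage * 4
 print(old_price * old_usage)                # 1.0 -> expenditure before
 print(new_price * new_usage)                # 2.0 -> expenditure doubles, as claimed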

Finally, another massive oversimplification that Parkinson's law casually dismisses is the vastly different usage and ownership patterns depending on price. As pointed out in SocialImplicationsOfTechnology?, when the price of a technology drops below a certain threshold where it is considered cheap, you get a completely different pattern of usage and ownership. Instead of the technology being centralized and controlled by elites for their particular purposes (e.g., charging $12 an hour to play an online text game, as some companies did in the early days), you get highly decentralized usage done for the benefit of the masses, as in PeerToPeer.


I think this will offer great potential for PeerToPeer technology ;-) --RainerWasserfuhr


Puts a damper on MooresLaw

See also http://www.vicomsoft.com/glossary/parkinsonslaw.html


This reminds me of something. My mom has a PhD in math. When she was writing her PhD thesis (15 years ago) we had a PC AT. Part of her PhD was to run very long simulations. The simulations took about 15 hours to complete. Now she has a Pentium III. She still uses her computer to do simulations. The simulations still take 15 hours to complete. Why? I think that's because 15 hours is the time my mom is willing to wait for the simulation results. (AurelianoCalvo, RefactorAtWill)

Some problems are inherently slow. Consider an exponential problem O(2^n). If you double the speed of computation, you only get a solution for n+1 instead of n (within a fixed time; e.g. 15 hours). Assuming that your simulation allows n to grow as time permits (e.g. if n is number of simulated time increments), it seems quite plausible that 15 hours 15 years ago would get you essentially the same results (a few extra cycles, but not many) as 15 hours would get you today, even if computers are thousands of times faster.
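A minimal sketch of that arithmetic (Python; the unit work rate is an assumption, nothing here comes from the original anecdote): if run time grows like 2^n, a machine k times faster only buys about log2(k) extra increments of n in the same 15 hours.

 import math

 HOURS = 15
 STEPS_PER_HOUR = 1.0                        # assumed rate on the original PC AT (arbitrary unit)

 def max_n(speedup):
     # Largest n whose 2**n steps still fit in HOURS at the given speedup.
     return math.floor(math.log2(HOURS * STEPS_PER_HOUR * speedup))

 for speedup in (1, 2, 1000):
     print(speedup, max_n(speedup))
 # doubling the speed adds one to n; a machine 1000 times faster adds only about 10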


What isn't going to happen is a vast expansion of storage capacity into the 1000 terabyte range. The economics for it are quite clear; there is no pent up consumer demand for this.

It's a shame that edits aren't tagged with a date. In 2011, estimates of Google's capacity include more than 1000 terabytes of RAM. --MarcThibault

But there will be when that is the minimum drive space required to install Windows. Given the history, that's likely only 4 or 5 versions away. That's only 10 times the total information content of the US Library of Congress - and will someday soon be about the same amount of space needed to install a word processor not much different than the one you ran from a floppy disk 20 years ago.

[Lol. I wouldn't be too surprised. But I think the major demand for 1000 terabyte storage will be multi-media, especially of the 3D or holographic formats.]

