Why Discard Ones Legacy

I had always thought that "legacy" meant something like, "the sum total of all that previous generations have provided to us, which we can use for our benefit." Do I misunderstand the term? Or is the computer industry misusing it to mean "all that junk we're stuck with because the last group of nitwits didn't do their job very well," as in the efforts by Microsoft and Compaq to have "legacy-free" computers?


Put the punched cards down, Chris ;).

Hehe. I went to fourth grade across the street from a public library. At the library, there was always someone using a needle the size of a large knitting needle to thread a packet of punched cards, always shaking some cards loose until there were only a few cards left. I never did find out what was going on there.

They were executing a search. Each hole is a binary column - you cut a slot to the top of the card for "true", leave it as a punched hole for "false". Insert the needle through the cards, shake, and the cards for which that column is "true" drop out. We built one of these gadgets at primary school out of old cereal packets and sheets of card ("primary" is 5-11 years old, don't know what that is in grades) -- PaulHudson
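
For anyone who hasn't seen the trick, it amounts to selecting records one bit at a time. A rough sketch of the mechanism in Python (the hole count, card layout, and function names are invented here, purely for illustration):

 # Edge-notched card search, as described above.
 # Each card has a row of hole positions along one edge. Cutting a hole
 # open to the edge ("notching") means "true" for that attribute; an
 # intact punched hole means "false".

 def make_card(record_id, notched_positions):
     """A card is just a record id plus the set of notched ("true") positions."""
     return {"id": record_id, "notched": set(notched_positions)}

 def needle_shake(cards, position):
     """Run the needle through one hole position and shake the deck.

     Notched cards ("true" at that position) fall off the needle and are
     selected; cards with an intact hole stay hanging on the needle.
     """
     fell_off = [c for c in cards if position in c["notched"]]
     stayed_on = [c for c in cards if position not in c["notched"]]
     return fell_off, stayed_on

 # Example: pull every card notched at position 3.
 deck = [make_card(1, {0, 3}), make_card(2, {1}), make_card(3, {3, 7})]
 selected, rest = needle_shake(deck, 3)
 print([c["id"] for c in selected])   # -> [1, 3]

Shaking the selected pile again with the needle in a different position gives an AND of two criteria, which is probably what the person at the library was doing with repeated passes.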


Brodie and Stonebraker wrote a book about legacy systems that started people thinking of legacy not so much in terms of age or technology but in terms of resistance to ongoing requirements change. So, we can think of a computer system as legacy if its requirements change at a rate that exceeds the rate at which we can modify the computer system to support those changing requirements. Some of the problems are to do with implementation technology, but many are not - many are to do with practices and beliefs that we subscribe to with respect to ongoing change. For example, one particular problem seems to be that if we strive for a stable architecture, the architecture may fall out of sync with the commonality and variability needs of changing requirements, forcing us to "hack" unanticipated changes into an assumed stable base. Over time, layers and layers of implicit architecture start to petrify the architecture until the code is in control. When the code is in control, the odds are you aren't going to be able to track requirements change, and things go downhill pretty quickly. -- AnthonyLauder


I don't mind the technological idea of coming up with something better than what has been done to date. What bugs me is the misuse (it seems to me) of a perfectly good English word that refers to something else. Imagine a conference of literature professors, or of architects (the brick-and-mortar kind), or of acupuncturists, or of police officers, where the topic of the conference is "getting rid of our legacy".


Well, if we think of legacy in the general sense of stuff we inherited, then it seems pretty reasonable that some legacy stuff is going to be good, and some of it bad. Thus, lots of people will say "this legacy system is holding us back", due to requirements change resistance, whereas others will view legacy systems as assets (due to hard-won reflection of tacit business knowledge, and robustness due to years of in-the-trenches debugging). Most legacy systems, then, will have a mixture of good and bad; perhaps in acknowledgement of this, one big-wig I spoke to stated "Our legacy system assets are a real liability!" (and he wasn't trying to be ironic). This mix of good and bad seems pretty consistent, I think, with the term legacy in other areas. -- AnthonyLauder


