No Silver Bullet

No Silver Bullet -- Essence and Accident in Software Engineering

http://www-inst.eecs.berkeley.edu/~maratb/readings/NoSilverBullet.html -- BrokenLink? The WaybackMachine found it: http://web.archive.org/web/20070329052658/http://www-inst.eecs.berkeley.edu/~maratb/readings/NoSilverBullet.html

The link seems to be dead; alternative: http://www.lips.utexas.edu/ee382c-15005/Readings/Readings1/05-Broo87.pdf -- Astrobe

Someone has more than once posted a link to an adulterated version of this paper. The adulteration tends to support a particular commercial software company; the modified paper reads as an infomercial for MicroPayments. The ideas in the original paper are both more logically sound and more complete, and MicroPayments won't solve the real conundrum that Brooks intended to explain.


A groundbreaking paper by FredBrooks, published three times now:

  1. Proceedings of the IFIP '86 conference
  2. IEEE Computer magazine, April 1987
  3. The MythicalManMonth, 20th anniversary edition

As far as I'm aware, this paper introduced the term "SilverBullet" to our field. At any rate, this is what most software people - even those who haven't read the paper - seem to be referring to when they use the term.

Although the term is often used rather loosely, Brooks defines "SilverBullet" precisely:

"... [a] single development, in either technology or management technique, which by itself promises even one order of magnitude improvement in productivity, in reliability, in simplicity [of software projects]"

The thesis of the paper is that as of 1986, there was no SilverBullet on the horizon, and none would appear for at least ten years.

The crux of Brooks' argument is the distinction between "essence" and "accident" in software. By "essence", he means the crucial issues and complexities that are inherent in the problem being solved, and in the abstract conceptual structure of the solution. By "accident", he means the incidental complexities that are imposed by the ways that software is built in the world: physical limitations of size and speed, low levels of abstraction (and assorted warts and misfeatures) in our programming languages, imperfect specification or interpretation of requirements, imperfect and inefficient communication among developers, and so forth. Brooks argues that the essential issues are considerably greater than 1/10 of the problem, and since the exciting improvements of the time were attacking the accidental issues, none of them had even the theoretical promise of an order of magnitude improvement.

The 20th anniversary edition of The MythicalManMonth contains a new essay, "No Silver Bullet" Refired, that reexamines the original paper after nine years. The new essay contains some effective debunking of some attacks on NSB that have come over the years. Brooks also addresses some of the confusion around his choice of the term "accidental", and admits that perhaps "incidental" may have been a better choice.

-- GlennVanderburg
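
A minimal sketch, in Java, of the arithmetic behind the argument above (the fractions are illustrative, not Brooks' own figures): if the accidental share of total effort is f, then even a tool that eliminates all of it shrinks effort only by a factor of 1/(1-f), so a tenfold gain requires f >= 0.9.

 // Illustrative numbers only, not Brooks' data.
 public class SilverBulletBound {
     public static void main(String[] args) {
         // f = fraction of total effort that is accidental complexity
         double[] accidentalShares = {0.5, 0.75, 0.9};
         for (double f : accidentalShares) {
             // Best case: a tool removes ALL accidental effort, leaving (1 - f)
             double bestSpeedup = 1.0 / (1.0 - f);
             System.out.printf("accidental = %.0f%% -> best possible gain = %.1fx%n",
                     f * 100, bestSpeedup);
         }
         // Only if 90% of effort were accidental could any single tool
         // reach 10x; Brooks argues the essence is well over 1/10.
     }
 }

Running it prints 2.0x, 4.0x, and 10.0x, which is why Brooks' claim that essence exceeds 1/10 of the effort rules out any single order-of-magnitude tool.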


I dunno... I suspect database-driven DeclarativeMetaProgramming? could easily become a silver bullet in the 'limited' domain of systems software, device drivers, and OperatingSystem implementation (which are currently slow-as-hell due to accidental complexities). And I wouldn't disbelieve that even configuration files for GUIs qualify as a domain-limited silver bullet, improving by an order of magnitude the time required to effect many desirable configuration changes. And I know for certain that software could improve the construction rate of mid-quality artwork for video games by at least an order of magnitude over the by-hand mechanisms still common today, using generative programming based on sketches, selection, and related techniques. How many 'domain limited' silver bullets remain?

I expect FredBrooks is underestimating just how much of our efficiency is still damaged by AccidentalComplexity. I doubt there has ever been a single 'silver bullet' that crossed all domains; if one takes the view that silver bullets - order of magnitude improvements - have always been domain-limited, then the right place to look for them is in any 'common' niche.


As the source of the silver bullet comment that inspired this page (since moved, along with related discussion, to XpIsNotaSilverBullet), I had forgotten (or perhaps never knew) that Brooks had quantified his term to require an order of magnitude improvement. Most of us would be happy with a doubling of system development productivity (or reliability, simplicity, elegance, utility, or whatever measure rings your bell). Many would be happy with less than that; a 10% improvement would make my boss dance in the aisles.

I find it really easy to be seduced by attractive promises of a better way, and have only learned through hard experience that some better ways simply trade known problems for new problems as yet undiscovered. I also fear that some success stories (the NY Times project was an example) are very dependent on the competence of the participants, and cannot always be transferred to other environments where apprentice and journeyman (not master) developers are the rule.

The funny part is that even when a ten-fold improvement in development time can be achieved (I'm thinking of the APL language), there are lots of other factors that determine how widespread the use of the tools becomes. -- JimRussell

At the time the article was written, much was being made of the discrepancy between the hardware world (where Moore's Law ruled and huge improvements were the order of the day) and the software world (where improvements were slow at best). I suspect that's the reason for the focus on order-of-magnitude improvements.


I may be risking touching off a holy war here. It occurred to me that all this really raises the question: have there actually been silver bullets? For example, when there have been major gains (losses?) in development time, the complexity of what is to be developed (and learned) leaps as well, such that the hurry-er we go, the behinder we get.

Well, there have been several events that have increased productivity/quality. [See below.] Sometimes, I think, people lose sight of the forest for the trees. I don't think OO failed at all. It's simpler, easier to use, more powerful, etc. I think people get hung up on expectations and hype. They want a magic silver bullet that will solve everything for them, but there is no such thing except the human mind. Maybe they should try to improve that instead. -- SunirShah

Sunir, could you be saying that ProgrammingIsInTheMind?

Past silver bullets: (EditHint: Who invented it ? What year ? Sort by year ?)

        Storing programs and data in the same location was invented by JohnVonNeumann - VonNeumannArchitecture 
Brooks argues that there will be no more silver bullets, because these past silver bullets all attacked the "incidental" annoyances of programming, and there are not enough "incidental" annoyances left (less than 9/10 of our effort today is wasted on them) to give us even a single ten-fold improvement.

(DavidCary: I made this list from items mentioned by SunirShah)

(EditHint: add all the software items on the list from MaryBellis? http://inventors.about.com/library/blcoindex.htm)

(TobyThain: This list conflates genuine silver bullets (compiler) with hyped silver bullets (DotNet?!?!). Also see “No Silver Bullet” and Functional Programming, http://cogito.blogthing.com/2006/12/06/no-silver-bullet-and-functional-programming/, and follow-up, Silver Bullets Incoming!, http://cogito.blogthing.com/2006/12/08/silverbulletsincoming/ - in which it is argued that FP is a genuine silver bullet in Brooks' terms.)


It is unlikely there has been any silver bullet other than the constantly increasing amount of computer power available to the developers at any instant in time. The tools have improved. We've developed quite a lot of reusable code, in spite of the hand-wringing. But no obvious silver bullet. -- MartyHeyman


I suspect the silver bullet will not come in the form of "how we develop" but "what we develop". My proposal is shown in DoItFramework. -- JimRussell


MrAristotle was the first to formalize the concepts of essence and accident. It is surprising how relevant his writings, especially on classification, are to OO theory! See: http://www.wikipedia.org/wiki/Aristotle -- Andre Mesarovic


"... [a] single development, in either technology or management technique, which by itself promises even one order of magnitude improvement in productivity, in reliability, in simplicity [of software projects]"

[Over what time frame? A fiscal quarter? A year? A decade? Instantly?]

Interesting question. There are lots of tools that "significantly" improve productivity, reliability, etc., but it takes months to become really proficient with them. I want to keep this open to all order-of-magnitude improvements, even if it takes half a lifetime to master them.

Alas, some people seem to be adding things to the above list that are clearly *not* order-of-magnitude improvements.


There are no silver bullets because:


What about automatic refactoring tools? I've found massive improvements in my productivity when using JavaLanguage and EclipseIde because of the automatic refactoring. I believe that refactoring tools attack an essential problem of programming because they let you easily change the way code embodies understanding as that understanding itself evolves. I've got to the point where I find it hard to program in supposedly more productive languages (RubyLanguage, PythonLanguage, etc.) because they slow down refactoring so much. -- NatPryce
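
As a hedged illustration of Nat's point (the Invoice class and method names below are invented for the example, not taken from any real project), this is the kind of mechanical transformation Eclipse's "Extract Method" refactoring performs, rewriting the method and keeping every caller consistent without hand-editing:

 import java.util.ArrayList;
 import java.util.List;

 class Invoice {
     private final List<Double> lines = new ArrayList<>();

     void addLine(double amount) { lines.add(amount); }

     // Before the refactoring, the summing loop sat inline here:
     //     double total = 0.0;
     //     for (double line : lines) { total += line; }
     //     return total * 1.2;
     // Selecting the loop and choosing Refactor > Extract Method
     // produces subtotal() and this one-line body automatically.
     double amountDue() {
         return subtotal() * 1.2; // 1.2 = illustrative tax multiplier
     }

     private double subtotal() {
         double total = 0.0;
         for (double line : lines) {
             total += line;
         }
         return total;
     }
 }

The same mechanics apply to Rename, which is usually the first change wanted as understanding evolves.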


Many people speak as if they have read NSB and still claim that something or other is a Silver Bullet. Everyone I have spoken with about this says something to the effect that no, they haven't read it (they don't have to), but they know what it's about - and then they disprove that assertion in the next sentence by claiming that some new product is a major boon in the war on the accidental problems.

The point in NSB is that the cost of a software project is dominated (roughly ten to one) by tasks other than typing the programmers' ideas into the machine. If you accept that premise, then even the complete elimination of programming (i.e., communicating your requirements to the machine) will only result in a 10% reduction in cost. He also points out that no snake oil vendor makes such small claims - it requires at least "double your productivity" claims to get any customer interest.

The last project that I worked on had two developers (who also did requirements gathering, analysis, and testing), two project managers, one full-time tester, one full-time business analyst, and at least three full-time-equivalent employees in steering committees, user groups, and other management tasks. To reduce costs it was decided to replace the two developers with three offshore typists but to relocate them five thousand miles to our old desks. There has been (to date) a 300% slip in the completion date from the time they were taken on. Applying this silver bullet to the 22% (max) cost of typing has not resulted in the tenfold decrease in cost that outsourcing is typically quoted as providing.

Programming is the tip of the 'cost iceberg' in system delivery. The above project also incurred costs relating to PC maintenance (and so on) for the seven non-programmers, programmer time spent attending meetings and reporting progress to various of the seven, and time spent explaining implementation decisions to people not educated in such matters. There are lots of places to make savings in system delivery - programming is not a good place to start.

-- HenryAndrew


As one of the people who actually did read Fred's paper (he published it while I was in grad school there) I think I ought to point out that there is one particular "Silver Bullet" he mentions: the introduction of high level languages. He also mentions OO programming as a potential silver bullet, and sure enough I've read a number of things that, anecdotally or otherwise, describe another order of magnitude improvement in productivity measured as function delivered per staff hour. -- Charlie Martin


I read it too. This is what I thought about it:

Brooks identifies four essential difficulties to software development: complexity, conformity, changeability, and invisibility. How do we address these difficulties?

1. Complexity. Complexity is alleviated by focusing on a small part of the whole. The task in hand is always to improve the current situation "just a little bit". It is better to focus on those things that most urgently require our attention and improve them enough that they have a lower priority, but it is not essential. We need to embrace the idea that we are not developing complex software; we are trying to manage relatively small changes to complex systems in which complex software is embedded, and sometimes managing those systems may involve changes to software.

2. Conformity. The ambition here is too great: we need not seek to master complexity to ensure conformity. Individual employees conform, to some extent, to the needs of their employer without fully understanding them. A plethora of techniques is required to encourage this conformity. Likewise for individuals and organisations in today's complex societies. In many ways, the need for conformity eases the burden of complexity: we can simply obey the law, and we can simply accept that a given interface is what it is, for the moment, and can be assumed to change eventually. We do not need to plan for a specific change, but we do need to have confidence in the general ability of the entire system (including people and their processes) to respond to a range of more or less likely scenarios.

3. Changeability. Well, it's very much a fact of life for people and organisations too. And software "merely" needs to respond to some subset of the changes that the wider systems in which it is embedded must respond to. Effective management of the wider system in a changing context may lead to changing priorities for software changes, at which point we are back to improving the current situation "just a little bit".

4. Invisibility. Key to being able to focus on parts of the whole system is the ability to deal with different parts at different levels of detail. System representations must allow abstraction and expansion of detail. Software differs from many other system components in that there is some coherent level of detail that is unambiguous enough to be the "source" of automatically generated lower levels of detail. Automatically generated higher levels of abstraction from the same "source" are currently less commonplace and perhaps less successful; I can't see any fundamental reason why this should be so. In any event, a top-down evolution (or outside-inwards continual improvement) can leave traces, like the maps of early explorers, that can be filled in, at a variety of scales, as information is required. By linking these maps to the fundamental "source", automated regeneration of perfectly up-to-date versions is enabled.

Conclusion. To conclude this brief overview, Brooks's "essential difficulties" are all alleviated, but not eliminated, by an evolutionary perspective. Fundamentally it's just about managing Human Activity Systems. If software helps, its integration into, and co-evolution with, those systems may usefully be supported from time to time by periods of focused attention, such as a "project". This may be analogous to an "episodic" theory of evolution, like punctuated equilibrium. The maintain/enhance (or replace) dichotomy is a false one, however. Evolution is a constant, even if some organisms appear unchanging over extended periods. Brooks suggests we "grow" rather than "build" software. I suggest we co-evolve it within the wider systems in which it operates and which change it.


"In my years of experience, I have seen many language and programming fads come and go. But there's only ONE, that's right, ONE language feature I've ever seen that actually improves your productivity significantly. No, it's not object oriented programming; no, it's not intentional programming or assertions or programming by example or CASE or UML or XML or Java. The only thing that improves your programming productivity is using managed code - that is, using a language in which memory management is automatic. Java and .NET languages do this with garbage collection; VB does this with reference counting; I don't care how you do it, just let me concatenate strings without thinking about where the new bigger string will go and I'll be happy." -- Joel Spolsky http://www.joelonsoftware.com/articles/fog0000000006.html


A story of failed reuse over generations from CASE, OOP, and now SOA:

http://www.regdeveloper.co.uk/2008/03/08/soa_reuse_repetition/

Actually, the idea of corporate reuse pushes started in the mid-'70s.


 There was no silver bullet until there was a silver bullet.


OOP is falling down as a failed silver bullet, becoming yet another brass bullet in a larger tool-kit. Our pressure for evidence has paid off, guys! Pat yourselves on the back. Drinks on me! (See SplashOneOne for more). --top

*sigh* Only you claimed OOP was touted as a silver bullet. The vast, vast majority of us programmers -- even those of us who have successfully used "pure" OOP in a variety of domains -- side squarely with Brooks.

Hogwash! It was The Thing from about 1992 thru about 2008.

No, you made it out to be "The Thing". Whilst OO was and still is popular, it was you who alleged it was "over sold" and spent an inordinate amount of energy naively attacking the aspects that were useful (e.g., polymorphism, encapsulation) whilst ignoring genuine limitations of many implementations (e.g., single dispatch.) The rest of us -- typically outside of the marketing department and not interested in your railing against its emissions -- regarded OO as no more than what it is: an iterative evolution of procedural programming.

Well, good for you. You get a gold star on your forehead. In general the industry over-hyped it. I'd been to multiple job and contracting interviews where claiming that OOP "has limits" got me booted faster than a soccer ball at a caffeine convention. -t

I bet it wasn't your claim that OOP "has limits" that got you booted, I'll bet it was the sarcastic tone so evident in your paragraph above.

Pot, Kettle, Black.

I wasn't being sarcastic. By the way, criticising a prospective employer's toolset -- no matter how bad you might think it or how well-justified your opinion might be -- is generally a don't-hire-me move.

I know it's not a good move. But I'm not a very convincing fibber. They should teach "proper fibbing" in school. I like to be generally honest with people and present the good and the bad rather than paint everything as happy rainbows. Plus, some places had bad experiences with OOP projects and welcomed an alternative view. By the way, your tone from the beginning was derogatory, whether you realize it or not.

My tone was mainly disappointed at seeing you present inaccuracy as truth.

I believe YOU have made the inaccuracy. Anyhow, there are more diplomatic ways to communicate a belief that inaccurate info has been presented (AssumeGoodFaith). True, I often skip them on this wiki also, but I didn't originate the issue of rudeness here.

I became a "frequent" anti-OOP proponent roughly around 1997, soon after InfoWorld magazine began hosting on-line discussions. The magazine hosted the discussion topic of OOP one day, and I added my 2 cents with a message titled something like, "OOP is academic fad crap" (I admit it didn't have to be titled so sharply. Bad form on my part.) All hell broke loose. Dozens of OOP proponents jumped on my ass as the discussion turned into a giant flame-war, and I was accused of almost all the same "sins" as I am now on C2. "You just don't get OOP and GangOfFour; OO is better abstraction and you just can't see that because you are stubborn; blah blah."

One manager running a C shop in Texas practically offered me a job on the spot because he had recently been burned by a C++ project gone very sour, not unlike FirstOoProjectDisasters. (I didn't want to move, having a young family, and I'm not a C fan either.) The C manager's backing of my side sent the discussion from burning into nuclear. OOP proponents outnumbered us detractors by about 10-to-1, and InfoWorld was not an OO-centric magazine.

This long discussion eventually spilled onto UseNet and other web forums and blogs, including C2. Some of my earlier arguments were indeed stupid and ill-informed. However, the lack of objective evidence for the practical superiority of OO was, and still is, the key theme. At first I argued that TableOrientedProgramming was better, or at least as good. But many others didn't seem warm to TOP despite all the nifty things I demonstrated it doing to compete with their "magic polymorphism" scenarios and examples. And after many years of trying to describe why I liked TOP so much without being able to convince most others (I had a few converts), I came to the programmer-life-changing conclusion that ProgrammingIsInTheMind and that it's mostly about WetWare, not objective universal truths. --TopMind


Literally, silver bullets that you can load into handguns (or some other gun). I'm 99 percent sure someone has one somewhere.


FebruaryTwelve

See also NoSilverBulletRevisited, NoGoldStars, XpIsNotaSilverBullet, NoNoSilverBulletBullet, GoldenHammer


CategoryIdealism, CategoryOopDiscomfort

