The Case Against Extreme Programming

The Case Against Extreme Programming, an article by MattStephens (http://www.softwarereality.com/lifecycle/xp/case_against_xp.jsp).


Is ExtremeProgrammingRefactored the book of the article? Or is there more to it?


I was just about to add a page on it till I found this reference. Here's my favorite quote from Mr. Stephens: The best alternative to XP is simply to develop software properly. (http://www.softwarereality.com/lifecycle/xp/conclusion.jsp) -- TomRossen

The 'case' presented by Stephens is very similar to, but not quite as abrasive as, the IconixOpinionOfExtremeProgramming. Basically it's argument by disbelief - I can't see how XP could work, so it can't. I was disappointed by the article because it started with a promising angle about how XP requires eXPert leadership to work: "Not everyone out there is a process eXPert." However, it quickly devolved into a standard missed-the-point CritiqueOfXp. Stephens would have done a lot better to stick to that one point and explore it.


According to issue 7 of ObjectiveView (http://www.ratio.co.uk/), MattStephens has teamed up with DougRosenberg (surprise, surprise) to write a whole anti-XP book titled Extreme Programming Refactored: The Case Against XP (ISBN 1590590961), due out this summer (2003).


One thing I kept seeing is the belief that a "proper requirements document" would solve all the (supposed) problems with XP. I can tell you from my experience, I'd love a proper requirements document - but I've yet to see one. Every requirements document I see is loaded with assumptions on both sides, and rarely do both sides agree on the assumptions. I had one experience where the customer agreed that one "feature" we were developing for them was totally not what they wanted, but they were reluctant to alter the requirements document for fear we'd negotiate away some features they did want. So we went merrily along developing something they would not use.


It would be wonderful for me if you folks could be a bit more concrete in your criticism. Maybe Stephens is arguing by disbelief, but at least he brings up points to support that disbelief. What about the fragility of XP he exposes as a 'ring of snakes'? He explains this contention over a number of paragraphs. Can that be ignored as pure disbelief? Has he misunderstood XP? Then say so and explain how.

Take this quote for example: "Having an on-site customer representative is risky, because he/she is bound to be pretty junior. Is the customer really going to spare a senior decision-maker for an entire year? But regardless, the customer representative "becomes" the formalized requirements spec...". I think that such remarks need to be argued out and cannot simply be dismissed.

Stephens has taken the time and effort to make his views clear. Saying that this is all to do with belief is nonsense. There is plenty of reasoning on those pages. It is here that XP advocates should attack.

I'll be more concrete: Stephens isn't saying anything new. And it's not supported by reality, only guesses (he does not reference any actual XP projects). That's the problem in concrete terms. There have been dozens of critiques just like this one. Stephens could have provided a hyperlink to one of them and been done with it. I would have been much more interested in hearing about actual (not perceived) weaknesses of XP. XP certainly does have weaknesses, but Stephens resorts to the old stand-by knee-jerk reactions, rather than actually exploring the real problems.


In the article, MattStephens states that Unit Testing does not catch design flaws. In my case, it has actually caught design flaws. When I find that it is difficult to write Unit Tests, or my Unit Tests are getting more complex, I know that there is a design flaw waiting to be corrected. Unit Tests act as an indicator that the code is modular (it has to be if the Unit Tests are to be simple). -- VhIndukumar
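
To make that indicator concrete, here is a minimal sketch (Python, with invented names - not from the article or from VhIndukumar's project) of the usual pattern: a class that hides its dependencies needs elaborate setup before even a trivial test will run, while the refactored version is testable in a few lines. The difference in test effort is the design signal.

 # Awkward to unit test: the hidden dependency means a "unit" test would need
 # a live database. (Database is not defined here; this class is shown only for contrast.)
 class ReportHardToTest:
     def total_sales(self):
         rows = Database.connect("prod").query("SELECT amount FROM sales")
         return sum(r["amount"] for r in rows)

 # Easy to unit test: the dependency is passed in, so the logic is isolated.
 class Report:
     def __init__(self, sales_source):
         self.sales_source = sales_source      # dependency made explicit

     def total_sales(self):
         return sum(self.sales_source.amounts())

 class FakeSales:
     def amounts(self):
         return [10, 20, 5]

 def test_total_sales():
     assert Report(FakeSales()).total_sales() == 35

 if __name__ == "__main__":
     test_total_sales()
     print("ok")

The point is the one made above: when the simplest possible test starts needing connection strings and fixtures, that friction is usually telling you about coupling in the design, not about the test.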


Re: junior customers. DontDoThat. I have teams with very senior customers. The results of development are certainly sensitive to the quality of BusinessInvolvement?, but that's NotAnXpProblem. Does that respond to the point mentioned above?

Not an XP problem? It is an XP limitation. If a business won't put its best person on the job (and if they are smart they won't), then you get their second best. And if that person is also too valuable to the business, then their third best...

Please cite a methodology that allows one to deliver the right software in spite of the fact that the customer wants to limit their involvement.

A customer, of course, has to make these trade-offs however you do development. The problem is that they will probably provide you with a monkey until you have proven your team of XPers can deliver business value.

NotAnXpProblem means that the issue in question is shared by all software development. If you have junior business input to your RUP/CleanRoom/Cowboy/2167A project, you'll have a crummy project. Period. A criticism of XP that says, "...but you'll get junior customers..." is flat wrong because I consistently get senior customers.

I could phrase it another way: "XP suggests no solution to this problem". SoftSystemsMethodology and other methodologies that use domain analysis and interview techniques do not rely on the competence of a single person (or single group of people).

One way in which XP addresses this problem is by frequent releases of working software. It sidesteps the problem of poor requirements altogether and says "The declared requirements are going to be wrong/obsolete no matter what we do, so how can we deal with the consequences?"


VhIndukumar - That statement says only that unit tests caught at least SOME of your design flaws in THAT case. While that's not bad, it's not a methodology unless it makes some promises about its performance.

"junior customers. DontDoThat" Hahahaaha. And returning to the real world for a moment - this one that XP is supposed to address; what makes you think that a customer that can't make a decision about project scope and stick to it, is one that can supply a senior to be onsite? They can't spare senior people to write sensible specs for those people of us who want them...

Personally I find all this dismissal of any criticism of XP as "oh they just don't get it" to be a bit boring and religious. I can see how some of the techniques could be useful, but I can't get any sensible answers from XPers to any criticisms. -- KatieLucas

Let me put it another way. Stephens' response to XP is like one of those people who hears about Wiki, comes here, sees there's no security, and replaces the text of NewUserPages with "Gee, you guys are dumb. This is why this will never work." We see that happen a lot, but are we going to bother to try to argue about it? Umm, no. Waste of time. It works. The fact that we use it everyday is the proof. If someone is too stubborn in their world-view to try out counter-intuitive ideas, too bad for them.

If you want to know what's wrong with XP, you've got to do your part and spend some time researching it. WikiWiki is a great place to start, the books are helpful as well. I suggest reading AgileSoftwareDevelopmentEcosystems by JimHighsmith. AlistairCockburn also has strong informed opinions on the subject. GradyBooch has been open-minded enough to explore the limitations of XP and RUP on various mailing lists. For a methodology that helps to highlight XP's weaknesses, see ScrumProcess.

But most of all, you need to try it for yourself.


Stephens' response to XP is like one of those people who hears about Wiki ...

I'm sorry, but that is just not a good rebuttal to the critique mentioned in the argument. The critique was well placed and fundamental; trying to dismiss it by comparing it to some newbie on a Wiki is just childish. That's the problem with XPers. They won't take critique, and they'll say "Huh, he doesn't even refer to projects in his critique." Well, your dismissal of the critique doesn't either; all you say is "It works for me." How is that any different from what Stephens wrote? In fact it's much worse, as you don't even propose arguments.

It's a simple counter-example. He says, XP won't work. I say, it has worked for me and many others. Obviously, Stephens needs to restate his case.

The main difference, though, is that Stephens presents clear-cut and debatable arguments. The idea would be to argue against those arguments and not just wipe them away with "Oh, it works for me."

Disagree about "clear-cut and debatable arguments". Aside from the strawman arguments mentioned elsewhere, another type of argument Stephens uses is simply to claim that XP is not good enough compared to ideal cases (while neglecting to mention the rarity of such ideal situations). For example, he claims XP's way of doing estimation is not good enough because Function Point Analysis is better, as "FPA is based on REAL metrics gathered from projects that have really happened, and have been measured diligently along the way." How often are those metrics both available and applicable to your next project? Another example: he claims unit tests do not catch design errors - "To catch design errors, you really need a human to be involved - ideally, a full-time person to sit beside you and behave like a 'design unit tester'." How many such 'design unit testers' will you get in real projects? Zero is a good guess. XP is not perfect, but imperfection is not an argument.

Hmm, I would think that design errors can be caught if one has a DesignUpFront. That way one has an overview of the system, and such errors can be singled out more clearly. My preferred way is PenAndPaper. :)


As regards the OnSiteCustomer: If I understand it correctly, although the customer should be on-site, they're not required to work full-time on managing the project. They just need to be available. They can bring lots of other work and do it on-site, as long as it's understood that one of their duties is to be available to answer questions.

And by the way, many of these issues have already been hashed out at length, years ago, here on this very Wiki. If you haven't found them yet, you're not looking deep enough. -- francis

Does KarlPopper's criticism of Marxism apply here? XP can't be falsified. (See XpIsMarxism.)

I don't know what relevance that has here. Certainly there are plenty of people who have learned from XP without becoming a 100% XP convert - I'm one of them. And perhaps the reason that people aren't so excited to be discussing possible failings of XP on a brand-new page is that there are already, like, dozens of pages that are years old that already go over these issues in depth. People tire of repeating themselves. That goes for Marxists, and capitalists, and everybody else. -- francis

I understand. Is there some sort of official list of problems with XP that folks are generally prepared to admit to? Just put the link here and we won't have to go over these old problems.

Which folks are you talking about? You make it sound as if we're some sort of HiveMind which is only allowed to think thoughts that have been approved by KentBeck and RonJeffries.

If you have concerns about a specific aspect of XP, I'd suggest using the FindPage link to search for those pages. If you're having trouble finding those pages, I'd ask at the WikiReferenceDesk. -- francis

I don't have a specific problem with XP. But I would like to avoid the pitfalls and problems of using it. In all the years of discussion here, hasn't it occurred to anyone to gather this information in one place?

Here are a few good pages to start with: XpCritique, ExtremeProgrammingBoundaryConditions, ThoughtfulReactionsToXp


Compare and contrast vs Criticize and complain

Perhaps what we need is fewer "critiques" and more "comparison and contrast." I will agree that Extreme Programming is not perfect; I do find, though, it tends to have fewer flaws than any other approach I am familiar with. I will also agree that the best approach "is simply to develop software properly." Now, will someone tell me how the hell to do that?

Exactly.


Refusing to try is pre-scientific

From the article:

we don't have to try something out just to know whether it's a good idea or not. We have the power of thought and extrapolation based on prior experience.

This attitude is rampant in the software industry. It pisses me off. It's the old pre-scientific-method way of thinking. Christopher Columbus dared think the world might be round? Let's think real hard and drag out the writings of our oldest, most venerated philosophers. The most venerated philosopher wins! But whatever you do, don't try it. We are Man. God gave us Brains. And those Brains, Blessed by God, hold all the Answers in the Universe.

Phooey.

-- JimLittle

Sentiment is fine, but Columbus is a poor choice, neh? How about one of the astronomers? Change it then!

Copernicus vs Ptolemy, anyone? -- StevenNewton

If pure thought is all it takes, why did we build the Hubble telescope? Why do we build particle accelerators? Why do auto makers still crash-test cars? Why do drug companies conduct trials of new compounds? Somebody should tell them what dopes they're being. -- Richard Rapp

Pure thought isn't "all it takes". On the other hand, we don't send telescopes off into space randomly every other hour. If we can do a hundred thousand thought experiments for every "real" experiment, then we probably should. ErnstMach? thought so. Einstein thought so too. I don't know where people get the idea that this is pre-scientific.

Thought experiments are fine. They're what we call "theories." But until they're backed up by actual experiments, they're not very meaningful, particularly when there's no math to back them up. Using a thought experiment to say something can't possibly work when other people report that it does is willful stupidity. -- JimLittle

I believe the correct term is "hypothesis," if the idea has some logical derivation. If the idea doesn't even have a logical derivation, it is merely an opinion. An actual "theory" requires some physical justification.

Reasoned arguments and thought experiments are fine, but EvidenceBeatsArguments? any day. If reality contradicts hypothesis, it's time to check your HiddenAssumptions.

Consider this excerpt: (referring to the role of on-site customer)

Would the staff member accept such a lame job (having to spend all his/her time answering a never-ending barrage of inane questions from the random mouths of the programmers)?

This is part of a "thought" experiment? Puh-lease. I don't know what he's critiquing, but it is not XP. And I think that the tone of this passage is representative of the article as a whole. The article is nothing more than an exercise in vanquishing strawmen. -- Richard Rapp


All criticism should be based on prior experience...even if there is none

Maybe I didn't search enough, but although I can easily think of some parts of XP that are not always quite OK - because I've been through them in real projects - every critique I seem to find on the net is just that: philosophy about why you shouldn't try it.

Writing tests is just writing more code. More code implies more bugs. What if the tests are buggy too? Who tests the tests?

Such an argument is clearly a pure construction of the mind. What if my manual, test-last-if-ever technique is buggy too? (A small sketch after this note illustrates why the tests and the code tend to catch each other out.)

I could write pages about my ideas of why you shouldn't try rock climbing. It's dangerous. If you fall, you can hurt yourself very much. You know, I'm fat. My weight is nearly 100kg.

-- ChristopheThibaut
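
As a minimal illustration of why "who tests the tests" is less damning than it sounds (a sketch in Python with invented names, not a claim about any particular project): the test and the production code pin each other down, so a defect in either one tends to show up as a failure. The residual risk is only the case where both are wrong in the same way.

 # Production code under test (a made-up business rule for illustration).
 def add_vat(price):
     return round(price * 1.20, 2)   # 20% VAT

 # The test pins the behaviour down with concrete values.
 def test_add_vat():
     assert add_vat(100) == 120.0

 if __name__ == "__main__":
     test_add_vat()
     print("ok")

 # If add_vat is wrong, the correct test fails.
 # If the test encodes the wrong expectation, it fails against correct code.
 # Either way the error surfaces; only a matching mistake in both slips through.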

I think the best way would be to provide the non-believers with a list of projects that have been successful using XP (other than C3 ;-) ). -- VhIndukumar

See CompaniesDoingXp

I mean projects that were completed successfully using XP -- VhIndukumar

CompletionIsFailure?. -- KentBeck

Also keep in mind that projects that were "completed successfully" are a rarity for any methodology in existence. For any successful project completed using methodology X, you can probably find 10+ more that failed using the same methodology. According to Project Pathology (http://www.thomsett.com.au/main/articles/path/toc.htm), "... over the past 18 years, our group has reviewed over 20 major projects that were in the process of failing or had failed... What we have found is that most projects fail because of people and project management concerns rather than technical concerns such as development methodologies...".

Given that most project failures are not related to the methodologies used, the number of projects that succeeded is a poor measure of the suitability of any methodology in the RealWorld. As other pages have pointed out, sometimes one value of XP is in terminating doomed projects before they turn ugly. In the article, they found that when they used "What was the added value or business effectiveness of the delivered function points?" as a measure of success, "... an additional 20 successful projects were in fact failures!". So those 20 projects, even though they were "completed successfully", were in fact worse than not being started at all!

A more useful study would be to compare the resulting costs vs. benefits of using XP vs. other methodologies for different kinds of "unsuccessful" projects. "Will XP save you money in the cases you will most likely encounter (i.e. when your project 'fails')?" seems to be the more practical question. -- OliverChung


Anti XP or Anti Agile?

I am not sure if the article is AntiXp? or AntiAgile?. Some of the arguments focus specifically on XP practices, while others attack the general agile philosophy (the idea that requirements can somehow be stabilized seems to be contrary to Scrum, ASD, and FDD, for example). This is not to say that there are never any stable requirements; the agile approach is that there are situations where the requirements are not stable, so deal with that. Matt's article seems to say that, using a "proper" method, you can always stabilize the requirements.

And where does it seem to say that? [How about: "If you really want to plan a project accurately, you HAVE to know exactly what you are going to do in advance.", "Of course that doesn't mean that the project requirements cannot change .... These sort of changes should be minimal though, as all the really hard thinking has been done up front.", and "XP ... embracing change - i.e. the fallacy that in the modern age, business requirements are changing monthly if not weekly. Give me strength! This is the worst kind of project, and no self-respecting consultancy would 'embrace' such a fiasco of fickle mind-changers."]

It seems to be well researched, but I think the "angle" shows up in the "Songs of the XPers" link, and his view of the development process comes out in the RingOfSnakes? page, where the XP practices are criticized because they are all interdependent (a supports b supports c supports a) - "so what holds it all up?". He doesn't want to like XP, and he likes everything solid and controlled (an article critiquing the underlying ControlChaos? philosophy of Scrum would be interesting).

The weird thing is that he then reverses the process he describes in RingOfSnakes?. In the Continuous Integration section he argues that XPers only continuously integrate because they are changing everything, and the reason they have to do that is that they didn't get proper requirements and a design up front. If they had done that, they wouldn't have to integrate so often.

This illustrates the problem with the article:

But as ContinuousIntegration is easy, doesn't the above argument unravel?

and on unit testing:

I think there are some good points. Making the on-site customer work is tricky and requires active management. Getting Pair Programming to work can be very hard. And the practices do form a self-supporting framework which can be brittle (but it is self-supporting; there is no gravity in the process space).

I think it might have had more impact with a more thoughtful tone rather than the point-scoring rhetorical style he employs.


... Making the on-site customer work is tricky and requires active management. Getting Pair Programming to work can be very hard. ...

One very important point is that even though something is not easy in XP, the alternative is not necessarily easier (and is likely just as difficult) in other methodologies. An on-site customer is tricky, but getting a clear set of User Requirements is just as hard (if not harder). I don't have first-hand experience with Pair Programming, but training novice developers to work productively is certainly not easy, and code review has its own problems. Throughout Stephens' article is the hidden assumption that these crippling difficulties in competing methodologies are gone, and in that case, of course, XP will fall short.


According to your first point, the possibility that requirements might stabilize goes against the "general agile philosophy". I suspect this is a misunderstanding. What I got from the agile philosophy is that you should be ready for the situation where requirements are unstable and take advantage of those where they are clear (perhaps from the outset). I don't think that the proponents of agile methodologies deny that such situations exist (see MethodologicalPluralism). [Now noted in text - DeleteMe]

Don't See AntiXpRamblings


CategoryXpCritique CategoryPaper

