I think one objection that architects have (myself not necessarily included) to XP is that the incremental process will "stick" on a suboptimal solution. In other words, the effort required to move the team off of a particular design approach / architecture / technology, when the time comes, will be too high. Put another way, the threshold of payback needed to justify a change keeps rising, until the change never gets made.
In a perfect XP world, the refactoring "temperature" remains high enough to keep the system from settling too much (cf. SimulatedAnnealing). But in the real world, we know that some things do tend to be left alone. It's also human nature to think, "If it ain't broke, don't fix it." Plus, on some teams it's like herding cats just to reach consensus on a design approach, let alone to change one.
The alternative is the somewhat anti-XP approach of pushing the architecture a little further than necessary at the moment.
To make an analogy, it's sort of like a family in the early days of America that heads out West for California but decides to settle down in Illinois instead. They always look wistfully toward the West coast and wonder what opportunities would have opened up to them if they had just forged ahead.
-- JeffMantei
I'm not so sure about this. The simple fact is that many heavily-architected projects are at least as resistant to change at the fundamental level. With XP, at least, you're getting immediate feedback on how effective your design decisions are. With any approach that advocates heavy modelling up front, you're only guessing, and sometimes you guess wrong. I would have thought that leads to a suboptimal local maximum more frequently.
Given RefactorMercilessly, if a particular new technology or design approach is desired, it will be used. XPers aren't just rushing to the top of the mountain and collapsing; they're avid mountain-climbers always looking for a new peak to scale. -- RobertWatkins.
What I see... is that in an XP project, and in a very few other places/methodologies/circumstances, it is possible to change the architecture in a fundamental and meaningful way. Got the wrong templating language? Start changing it, writing tests for the changes, and dealing with the pain of having two languages for a while. Maybe there is even some code that never gets moved to the new architecture...
XP tends to favor a local maximum for a while. Once the team starts to stagnate, in my experience, that local maximum starts to seem low, and the whole team begins to seethe with a desire to be moving upward again. Finally, somebody has a light bulb flash, and *boom* - we go down a few feet, over ten feet, and we're scaling a whole new peak and can't see the top.
I've come to believe that you can only truly understand XP when you've been a part of a fully functioning XP team. Not "almost XP" in *any* sense of the words.
I'm talking about 90% of the practices in full-bore use, the last 10% being explored at the project's fringe, and code churning like a boiling pot.
When it is working like that, you don't even consider the question of XP favoring a local maximum. The first time you see a gelled team hit the wall of a set of local maxima, you understand all the way to your toes what they mean when they say "Necessity is the mother of invention".
Picture an irresistible force in a landscape where the only immovable objects can be dodged - and there aren't very many of them.
-- JeffBay
Actually, I think this depends on your definition of "sub-optimal." In XP, every single user requirement is defined (in a StoryCard), and every single EngineeringTask supports those requirements. If the system is developed and satisfies the user requirements, then nothing is sub-optimal. The system works well enough for the user, and that's what matters, even if certain parts of the system could have been implemented more efficiently.
If the system is sub-optimal in the sense that the customer will be unhappy with it, then either (A) that's a customer need that wasn't documented as a story card, or (B) the story cards don't reflect the customer's needs. Those cases result from a breakdown in implementing XP, not a weakness in XP itself. -- BrentNewhall
I have to admit that I've been using the "non-linear regression analogy" to describe iterative design for some time, so the reference to SimulatedAnnealing above caught my attention. Having considerable experience with various data-fitting algorithms and some (less considerable) experience with iterative design (through a couple of "almost XP" projects), I think the comparison is pretty apt. Having watched (and debugged) a lot of iterative fitting routines working to find an optimal solution to some fitting problem, I'm keenly aware of the problem of "sticking" in a local minimum. So far I haven't seen that happen much in iterative design situations. I suspect the reason is that the fitting algorithm is implemented by some decidedly non-linear processing units (aka people). Most design-savvy developers seem very comfortable looking beyond the current solution when a new feature doesn't seem to fit into the existing design. In stochastic fitting terms (like SimulatedAnnealing), this means that the "temperature distribution" of the fitting algorithm contains significant density at higher temperatures. That gives the fitting algorithm (aka the development team) enough energy to climb out of a local minimum and find other solutions in (hopefully deeper) minima. In some respects, a DesignSpike? could be seen as a particularly high-energy jump in the iteration process. -- GeoffSobering
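To make the temperature metaphor concrete, here is a minimal sketch of simulated annealing over a toy one-dimensional "design quality" landscape. Everything in it (the quality function, the step size, the cooling schedule) is an illustrative assumption, not something from the discussion above:

 import math
 import random

 def quality(x):
     # Toy landscape with a low local peak and a higher peak elsewhere.
     return math.sin(x) + 0.6 * math.sin(3 * x)

 def anneal(x, temperature=2.0, cooling=0.99, steps=500):
     # Simulated annealing: always accept improvements, but also accept
     # a *worse* candidate with probability exp(delta / temperature),
     # which shrinks as the system cools. High temperature early on is
     # what lets the search climb out of a local maximum.
     for _ in range(steps):
         candidate = x + random.uniform(-0.5, 0.5)
         delta = quality(candidate) - quality(x)
         if delta > 0 or random.random() < math.exp(delta / temperature):
             x = candidate
         temperature *= cooling
     return x

A team that keeps its refactoring "temperature" high behaves like the early, hot phase: willing to take a temporarily worse design to reach a better one.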
I think one objection that architects have (myself not necessarily included) to XP is that the incremental process will "stick" on a suboptimal solution.
So what's your point? :)
Please pardon my little attempt at humor - you raise a valid issue. I don't know the answer, and have not encountered that problem. Maybe if you wanted to make architects happy, too, one thing you could do is to team them up with the business people, and have the architects contribute stories to the project, too (approved by the business people, of course).
I wonder how many types of macro-structural decisions are hard to evolve your way out of. My immediate impulse is simply to say that almost no decision is difficult to change later, if your processes emphasize well-factored code (RefactorMercilessly, OnceAndOnlyOnce, blah blah blah). But perhaps some decisions are almost impossible to reverse? This question is asked a bit over in DistributionIsOptimization. To this day I have never heard anybody say "We had code that was well-factored and had tons of unit-tests, but when huge feature request X forced us to pursue macro-structure Y, it was impossible to change."
But then, what do I know? I still haven't figured out what people mean when they say "architect." -- FrancisHwang
Microscopically, RefactorMercilessly leads to HillClimbingDesign. If a valley lies between here and a better design, you might not notice that design. And without seeing it, you won't know to make the design temporarily worse just to then make it better. So HillClimbing doesn't work perfectly, because software design is not continuously deformable.
(One solution will be JoshuaKerievsky's RefactoringToPatterns.)
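For contrast with the annealing sketch above, here is greedy hill climbing over the same illustrative landscape. Because it only ever accepts improvements, it halts at whichever peak it starts near and never crosses the valley described above (again, all names and numbers are assumptions for illustration):

 import random
 from math import sin

 def quality(x):
     # Same toy landscape as in the annealing sketch.
     return sin(x) + 0.6 * sin(3 * x)

 def hill_climb(x, steps=500):
     # Greedy hill climbing: accept a candidate only when it improves
     # quality. The search stops at the nearest peak, even when a higher
     # peak lies across a valley of temporarily "worse" designs.
     for _ in range(steps):
         candidate = x + random.uniform(-0.5, 0.5)
         if quality(candidate) > quality(x):
             x = candidate
     return x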
Macroscopically, BusinessValueOrientedProgramming intends to raise end-users' productivity early and often. Sorting UserStories by BusinessValue amounts to selecting the steepest slope. There, HillClimbing works, because requirements are continuously deformable - as long as the cost of change stays low.
-- [PhlIp2004]
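A small sketch of that "steepest slope" selection, assuming each UserStory carries a BusinessValue estimate and a cost; ranking by value per unit of cost is my illustrative reading of "steepest," not something the contribution above specifies:

 # Hypothetical stories: (name, business_value, estimated_cost).
 stories = [
     ("export-report", 8, 2),
     ("single-sign-on", 9, 5),
     ("dark-mode", 3, 1),
 ]

 # Steepest slope first: most business value gained per unit of cost.
 for name, value, cost in sorted(stories, key=lambda s: s[1] / s[2], reverse=True):
     print(f"{name}: slope {value / cost:.1f}")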