Too Deep Into The Bag Of Tricks

A discussion has been taking place on SystemMetaphor about how ExtremeProgramming tries hard to make the system understandable. One of the ways it does this is by not digging TooDeepIntoTheBagOfTricks.

So, how deep is too deep?

-- HankRoark

In general: do the simplest thing that could possibly work. Do you have a specific example in mind? -- RonJeffries

How about this? Say the bag of tricks is the original Design Patterns book by the GangOfFour. Say you've discovered a need for a tree-structured hierarchy. Digging too deep would be exhaustively studying the Composite design pattern and all its ramifications, worrying about Decorator and other related patterns from the start, designing a tree that will accommodate a wide variety of possible future requirements, and specifying a mechanism that will take someone a month to code, and more than that to test. Not digging too deep would be quickly scanning the Composite pattern to extract the essence of it, then playing with it in your mind until you come up with the minimal set of things to do (new methods, new abstract base classes, inheritance rearrangements) to yield the desired capability, then doing it. If you need UML to consider this particular evolution, you're digging too deep. You have to proceed with confidence that the more complex requirements for your new tree-structured hierarchy can be factored in as you go, just like the first rudimentary capabilities. -- ScottJohnston


That's deeper than an XP person would go. The XP person would have written one class that did the first thing that the first UserStory said. Then she would have written the second class, without regard to any code-sharing. Then she would observe the common elements and refactor to remove them. In another couple of classes, the abstract class would probably emerge. Somewhere along there she'd need to build the class that contains the collection of other instances that is the essential problem solved by Composite. She'd write it as simply as she could, perhaps as a subclass of the abstract, perhaps free-standing. Then she'd refactor it into the hierarchy.

A really good refactoring person can discover Composite as quickly as I can look it up and refresh my memory ... and she never writes any extra code. I'm too smart for that - I can recognize that Composite is going to be needed, and go off investing in infrastructure, while the XPer has already released a bunch of working code. Embarrasses the hell out of me. -- RonJeffries
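
To make that concrete, here is a minimal Java sketch of the shape such a refactoring might arrive at. All names (Task, SimpleTask, TaskGroup) are hypothetical; the point is the history Ron describes, in which the abstract class and the collection-holding class emerge from refactoring last, not first.

 import java.util.ArrayList;
 import java.util.List;

 abstract class Task {
     abstract int cost();               // the protocol the refactoring extracted
 }

 class SimpleTask extends Task {        // the class the first UserStory produced
     private final int hours;
     SimpleTask(int hours) { this.hours = hours; }
     int cost() { return hours; }
 }

 class TaskGroup extends Task {         // the collection of other instances: Composite, discovered late
     private final List<Task> parts = new ArrayList<Task>();
     void add(Task t) { parts.add(t); }
     int cost() {                       // an aggregate answers the same protocol as a leaf
         int total = 0;
         for (Task t : parts) total += t.cost();
         return total;
     }
 }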

Ron, are you saying there is no need, if you are an XPer, to study and understand what others have done? That it is quicker/cheaper/better to re-discover already documented patterns? This seems to go against the direction of the pattern community, which wants to quickly educate people in common solutions to common software problems. -- HankRoark

No, I'm not saying that at all. The more you know, the better you'll do, given any consistent set of other behaviors. However, we've probably all met the person who reads GOF over the weekend and comes in on Monday looking for someplace to put Composite. This is the case of the small boy with a hammer.

Know everything. When you consider all the things you know that could work, choose the simplest one that could possibly do the job. -- RonJeffries

Oh, so it's not "discover Composite" but "discover the need for Composite." Sometimes I'm slow. -- HankRoark

It is possible that an extreme programmer would just know composite without ever having read the pattern. Presumably she would have learned it by PairProgramming with someone who knew it already. Immediate goals and pair programming create an ecology for patterns that Alexander called TheWay. He suggests written patterns as only TheGate to this ideal. -- WardCunningham


I am not sure how any of this answers the original question. What is too deep? If I go back to Ron's statement "the simplest thing that possibly works" my first guess would be that too deep means spending any extra time above and beyond what it takes to find the "simplest" thing that "possibly works".

This just begs the question of how I know that a thing is "simple" or will "possibly work". Often what is simple or possible is not immediately obvious. How much extra time is too much (or too deep) when deciding simplicity or feasibility?

Before I go into a death spiral of semantic hair-splitting, let me be more blunt about the point I am trying to make. "Simple", "possible", "deep" - all of these things are best evaluated in context. There seems to be something about ExtremeProgrammingZen? that makes TryingTooHard? to quantify these core concepts lead to the aforementioned death spiral. So back to "too deep": I would suggest you just make a quick assessment based on experience (guess, gut feel, intuition, whatever you want to call it) instead of an analysis, and if you turn out to be wrong, hopefully you'll capture the experience and do better next time.

-- MikeKrajnak


Surely the point is that after adding (say) the third feature to a part of the system you suddenly recognize that the code for the three features you've added separately would be better expressed as common code using a Composite pattern. You don't plan to use the Composite pattern in the first place, but you do need to know all about it so that you can refactor your code to use it when appropriate.

So the point is: if you plan for a pattern rather than letting it emerge during refactoring, then you've dug TooDeepIntoTheBagOfTricks?

This seems to be the essence of the refactoring part of XP. You don't plan any particular complex implementation, but you need to be able to recognize a pattern that emerges in your code and refactor your code to make that pattern explicit. Obviously the more reading and studying you do, the easier you'll find it to spot patterns in your code and make use of the standard solutions to those problems.


True, but it's surprising how often the pattern itself recedes into the woodwork. You know the patterns, but you don't go cruising around until the "Composite" light goes on. You just refactor and gosh golly gee whiz, there's Composite again, already in place.

It's as if the DesignPatterns are the common destinations for refactorings, typical low energy states for designs. Articulating them helps, but after they are second nature they don't need to be articulated any more.

Reminds me of Chaos theory's "strange attractors". -- Greg McPheat

I think of patterns not so much as low energy, but low entropy. Refactoring code into pattern form via repeated applications of DoTheSimplestThingThatCouldPossiblyWork is a minimum energy trajectory to reach the pattern form, rather than trying to refactor it out of a piece of (high entropy) SpaghettiCode.

I think I understand what you mean, but perhaps you used entropy with the opposite meaning? With both DoTheSimplestThingThatCouldPossiblyWork (low complexity) and SpaghettiCode (high complexity) you try to reach a state of equilibrium of complexity - that DesignPattern. You can change simpler things (greater entropy) much more easily than more complex ones (lesser entropy).


I would like to focus back on the original question: how deep is too deep? Not because I want a quantitative answer, but because I'm wondering if this represents a core principle of XP or if it's just a way of sneaking up on another idea like RefactoringLowersEntropy?, RefactoringLeadsToLowEnergyStates?, or something else...?

A related question comes from SystemMetaphor. Let's say a team is developing a metaphor and this is the first time anyone on the team has done so. They draw from a book (like SoftwareArchitectureInPractice or the PartyOfFive book) and develop a metaphor that describes the logical representation of the system, basing the metaphor on an architecture in the book. (To give a more concrete example, let's say this happened at C3. Let's also say there is not a lot of architectural expertise on the team (not true in real life). They read about data flow and said its ArchitecturalQualities? matched what they were trying to achieve. Now LinesStationsBinsParts is a logical metaphor that they can use to realize data flow.) Is that too deep? Why or why not? -- HankRoark

Yes, it is too deep. The Bins portion of LinesStationsBinsParts was developed exactly this way. It was not built XP style, it was not PairProgrammed, and it was BigDesignUpFront. It uses the CompositePattern to minimize and simplify. When it was done some people liked it a lot, others absolutely hated it. In retrospect, four years later, I can say with conviction that both these groups are wrong. It is in fact mediocre, a fair to average job. It works reliably, does not need many changes, and has a good suite of unit tests, which is good. But it is very hard for the C3 people to modify and extend, which is bad.

Fortunately, I have reflected enough to understand what is really wrong with it. It violates OnceAndOnlyOnce. But not in the traditional way. It is, in fact, just about the fewest lines of code possible and that is actually a bad thing. OnceAndOnlyOnce has two pieces. By using a pattern to its extreme, I was able to reduce the amount of code. In doing so, I violated the first portion of OnceAndOnlyOnce, that everything that can be said about a problem domain should be said. It breaks out of the SystemMetaphor. The variables and classes are all named well enough, but there is much missing that relates directly to the metaphor and the problem domain itself. I believe this is because it was built by pattern and not grown organically as XP suggests.

When I first read the GOF patterns book I had a feeling of deja vu. The patterns in this book are there exactly because everyone uses them, knowing them by name or not. I would then, in my humble opinion, offer Kent's SmalltalkBestPracticePatterns as the surface of the bag of tricks. If you read Kent's book carefully you will find that there is a lot more there than just how to indent your methods. There is advice on how to handle most problems in a uniform way. Thus, anything beyond that is digging into the bag of tricks.

So now we ask what is too deep? Anything that leads you away from your SystemMetaphor is going too deep. Anything that has functionality beyond your current needs is too deep. Anything to optimize prematurely is too deep. -- DonWells

Don, let me publicly express my great admiration for your having learned the above, and for having expressed it. Well said, well done! -- RonJeffries


Too Deep is a derivative rule of XP, but one that sometimes hits your mind before the others. It's derivative because Too Deep is a violation of DoTheSimplestThingThatCouldPossiblyWork. Should an XP programmer working on an application ever set out to implement something with, say, Composite? No, pretty much never. It is not at all likely that you have a business story that causes Composite to be the answer.

DoTheSimplestThingThatCouldPossiblyWork means just that: do you know two ways to do the thing, and could both of them possibly work? Then do the simpler. The opportunity to use Composite arises because you have things, and aggregates of things, and the aggregates need to act polymorphically like the things. However, the simplest way to implement the aggregates is just to replicate the necessary thing protocol onto a class for the aggregate. This code will [perhaps] contain redundancy. Then refactor to reduce it. At this point, knowing about Composite may help.

But, you say, when I set out to implement the things it was obvious that we were going to need to use Composite. So why wasn't it right to use it? It wasn't right because the rules are DoTheSimplestThingThatCouldPossiblyWork and YouArentGonnaNeedIt. Interpreting them absolutely is just fine. Save your judgment for something else: there will be plenty of places to use it. -- RonJeffries
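
To illustrate the intermediate state described above, here is a small Java sketch (all names hypothetical) of the aggregate implemented free-standing, with the thing protocol replicated by hand. That duplicated protocol is exactly what a later refactoring toward Composite would remove, by pulling both classes under a common abstract parent.

 import java.util.ArrayList;
 import java.util.List;

 class Item {
     private final double price;
     Item(double price) { this.price = price; }
     double price() { return price; }
 }

 class ItemBundle {                     // free-standing: not yet related to Item by type
     private final List<Item> items = new ArrayList<Item>();
     void add(Item i) { items.add(i); }
     double price() {                   // the thing protocol, replicated onto the aggregate
         double total = 0;
         for (Item i : items) total += i.price();
         return total;
     }
 }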


The XP practices are aimed at mitigating the risk that technology (invented or bought) will pull you off center from your true requirements. I pat them on the back for that, because this is in reality the biggest and most frequent problem I see in software applications development. Missed opportunity for reuse is also a biggy, but it has to take a back seat. As it turns out, these two problems exist in a tradeoff relationship. Generalizing too late means wasted motion, while generalizing too early means either incorrect behaviors or a different kind of wasted motion (re-use soured into ab-use).

When faced with balancing acts like this, I always work at finding the center, but I do something else as well. I assume my aim will never be perfect, and ask myself, in that case, which side would I rather err to. I think XP is giving a clear answer to that question. When evaluating XP, take your own requirements discipline into account. I think not everyone needs these rules all the time.

Ron's advice about saving your judgment for other problems is very clever, although laced with paradox. I don't want to spoil it by over analysis, so I'll keep mum.

-- WaldenMathews


A few specific questions. Is using Java's inner classes to implement something akin to Smalltalk closures (basically reimplementing the "select" method of Collection) digging too deep? I'm doing this to avoid exposing a collection instance variable and to reduce duplication of iterator code.
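
For reference, a minimal sketch of the inner-class technique being asked about, assuming a hypothetical Predicate interface and CandidatePool class. The collection stays private and the iteration code exists in one place:

 import java.util.ArrayList;
 import java.util.List;

 interface Predicate {
     boolean matches(Object each);
 }

 class CandidatePool {
     private final List candidates = new ArrayList();   // never handed out directly

     public List select(Predicate p) {                  // the one copy of the iteration code
         List result = new ArrayList();
         for (int i = 0; i < candidates.size(); i++) {
             Object each = candidates.get(i);
             if (p.matches(each)) result.add(each);
         }
         return result;
     }
 }

A caller supplies the "block" as an anonymous inner class, e.g. pool.select(new Predicate() { public boolean matches(Object each) { return ((Candidate) each).hasSkill("Java"); } }), much as one would write candidates select: [:each | ...] in Smalltalk. (Candidate and hasSkill are hypothetical here.)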

Is it digging too deep to use a small library that itself reaches fairly deep into the bag to do a simple task? I'm tempted to use a neat thingy called JSX that works just like Java serialization, transparently converting an object graph into a storable representation of the objects' states, but outputs XML, which has many advantages.

I have similar doubts with respect to my reluctance to go along with what appears to be a knee-jerk "reflex" of developers in the Web business, i.e. that the primary and preferred form in which one should model an application's classes is SQL tables, presupposing an SQL database.

I've been trying to "rediscover" the notion of database only when and as it became needed in the spike, going for ModelFirst; classes Candidate, Skill and SkillSet were born immediately, it was when we started writing the tests for a Database class that I reached for Java serialization as the "simplest" form of persistence. Then I started having cold feet about having to implement the kind of stuff a database does - indexing, partitioning into "pages" so you don't have to fit 1M instances in memory, and transactions. Thoughts?
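
For scale, the serialization-as-persistence step described above really is only a handful of lines. A sketch, with hypothetical names, assuming Candidate, Skill, and SkillSet implement java.io.Serializable:

 import java.io.*;
 import java.util.List;

 class SimpleStore {
     void save(List candidates, File file) throws IOException {
         ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file));
         try {
             out.writeObject(candidates);       // writes the whole object graph
         } finally {
             out.close();
         }
     }

     List load(File file) throws IOException, ClassNotFoundException {
         ObjectInputStream in = new ObjectInputStream(new FileInputStream(file));
         try {
             return (List) in.readObject();     // reads it back, graph and all
         } finally {
             in.close();
         }
     }
 }

Of course load reads the entire graph into memory, which is exactly the 1M-instances worry raised above.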

OffTopic, but I'll respond: There's a system that may interest you: ThePrevayler is an in-memory alternative to using databases (written in Java). It has some benefits and some drawbacks. Be warned that the author of the project tends to make rather sweeping claims; take it with a grain of salt. I like the system, but I don't believe it's right for everything; specifically, it doesn't seem scalable. If salability isn't critical in your project, it may be just what you're looking for. -- MatthewHall?

"If salability isn't critical in your project"? Is that a FreudianSlip?? :-)


I really think this point of view is wrong, but I suppose that is why they call it extreme programming. The CompositePattern is very simple to implement. Using it is much easier than replicating the same protocol (and possibly implementation) on two distinct classes (i.e. the node and the leaf). If you do indeed have a requirement that calls for nodes and leaves that support the same interface - say the Rules and Rulesets of a Business Workflow Engine - it is much easier, requires less code, and is easier to update/maintain (i.e. you don't need to identically modify duplicate code) if one simply uses the CompositePattern when it is called for.

I don't need to rediscover the wheel each time I use it.
A DesignPattern is not a trick lying deep within some bag of obscurity. Granted, if you are working with junior engineers, they may be unaware of this pattern. They may need to learn the hard way, but there is no excuse for a senior engineer to waste time, money, and code on an inferior solution to some problem for no reason other than following rules. That would eliminate many of the benefits of abstraction. What most rubbed me the wrong way was the assertion that using the CompositePattern is wrong for no other reason than the rules of DTSTTCPW and YAGNI, and that, further, it is okay to interpret these rules absolutely despite the fact that a composite was needed and that it represented the simplest solution. One can still DTSTTCPW and be smart at the same time. There is nothing wrong with being experienced and having instincts. In my opinion, there is no reason to throw all of that out in order to pedantically follow some process. -- RobertDiFalco
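
For the concrete case mentioned - Rules and Rulesets in a workflow engine - the pattern amounts to very little code. A sketch, with hypothetical names:

 import java.util.ArrayList;
 import java.util.List;

 interface Rule {
     boolean evaluate(Object context);
 }

 class RuleSet implements Rule {            // the composite: a Rule made of Rules
     private final List<Rule> rules = new ArrayList<Rule>();
     void add(Rule r) { rules.add(r); }
     public boolean evaluate(Object context) {
         for (Rule r : rules)
             if (!r.evaluate(context)) return false;   // here, every rule must pass
         return true;
     }
 }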

Do you find that taking such full advantage of your own knowledge means that people less knowledgeable than yourself find your code hard to work on? Because they need to rediscover each wheel you use that they don't already know? A ThreeStarPattern? that is used often might repay the reader for the investment they have to make to understand it. Used only once, not a chance. -- BenAveling

I don't find ThreeStarPatterns? to have wide circulation, but when they come up, they tend to intimidate the less competent programmers. An example: I had a bunch of records, and the client wanted them sorted and indented according to various properties of the records, with group headers and footers at each level of the hierarchy. But there was to be flexibility in what the levels of the hierarchy represented. I instantly saw that "LevelOfHierarchy" was a useful abstraction. It was the simplest thing that actually served all the requirements. Plug a few configured hierarchy levels together, give it a record set, and watch the report pop out the other end. Other things, though simpler, would only mostly work. But the PerlLanguage code made the other programmer go ballistic. The best I could interpret him was that he was afraid of not understanding.

