Oop Goes Half Way

ObjectOrientedProgramming as a software improvement technique is missing a big part. By itself, OOP is not competitive. Most proponents will probably agree that merely putting stuff in classes and using polymorphism and inheritance a lot is not sufficient to make the software better. One can use these concepts to make it worse. It is how one puts together the OOP concepts that makes the difference.

Thus, it's not OOP by itself that improves the result, but OOP plus some unnamed substance that (allegedly) makes the software better. But this undefined dark-matter is, well, undefined. Or at least very inconsistent among OOP proponents. It may be OOP + BoochMethod or something. But without that extra something, OOP is a two-legged table. One who agrees with the above premise cannot merely say "OOP improves software"; they must also cite or describe this extra dark-matter to make any meaningful claim about OOP's benefits.

For the sake of argument, OOP may be the "assembly language" upon which greater constructs are built, but assembly language by itself doesn't offer much. Another way of saying this is that OOP may be a prerequisite for making "good software", but it's not the full story by far. There's a huge gap in the puzzle. A door, perhaps, but not the destination. I'm running out of cliches.

--top

OOP is a tool, top. Just because you use a nail gun, it doesn't mean you will build a better house. You do have to know how and when to use it. That is the "undefined dark-matter" in your rant. Since it's such an obvious need, most people don't feel a need to explicitly say it.

But the "know how and when to use it" is the 10 ton gorilla in the room. It's not OOP, but the undocumented and/or inconsistent gorilla that is the key to the (alleged) improvement.

[For the majority of OOPers, knowing how and when to use it is self-evident. The interesting question is not knowing how and when to use it, but why there seems to be a small minority who don't intuitively grasp this.]

It's also the ten-ton gorilla for procedural, the ten-ton gorilla for functional, the ten-ton gorilla for using nail guns, etc. But, apparently, it's slipped your mind that knowing how/when to use OOP does you no good unless OOP is available.

Procedural constrains the design more than OOP does, for good or bad.

At least until you start using procedure-pointers and opaque data references. Once you add those, you have hybridized with the basic design properties of OOP, albeit minus any syntax support.
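
For instance, a minimal sketch in deliberately C-flavored C++ (all names hypothetical): a struct of procedure-pointers plus an opaque handle gives you dispatch and encapsulation by convention, with no class syntax anywhere.

  #include <cstdio>

  // An opaque handle: callers see only a typeless pointer, never the layout.
  typedef void *Account;

  // A table of procedure-pointers plays the role of a class's method table.
  struct AccountOps {
      void   (*deposit)(Account self, double amount);
      double (*balance)(Account self);
  };

  // The hidden representation, known only to this module.
  struct AccountImpl { double balance; };

  static void deposit_impl(Account self, double amount) {
      static_cast<AccountImpl *>(self)->balance += amount;
  }
  static double balance_impl(Account self) {
      return static_cast<AccountImpl *>(self)->balance;
  }

  static const AccountOps accountOps = { deposit_impl, balance_impl };

  int main() {
      AccountImpl acct = { 0.0 };          // "object" state
      accountOps.deposit(&acct, 100.0);    // dispatch through the table
      std::printf("balance: %.2f\n", accountOps.balance(&acct));
  }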

If I'm understanding you, I'd argue that those are FunctionalProgramming concepts, not OOP. But, at least it's fairly clear that we are not in "procedural land" anymore.


Procedural constrains the design more than OOP does, for good or bad. Generally it's based around StepwiseRefinement. The biggest differences are at the higher levels, where sometimes there's a central "main" and sometimes an event-driven structure, which is less centralized. But the mid and lower levels are guided primarily by StepwiseRefinement.

[Huh?]

I'll try again. With procedural, the division of "code chunks" is generally by tasks, sub-tasks, sub-sub-tasks, etc. In OOP, objects and classes may be grouped by task, by domain noun, by domain-noun taxonomies, by "noun managers", etc., all mixed together; there are more dimensions along which to divide code. Another way of saying this is to compare to grammatical parts of speech. With procedural, the code division is generally by a verb or verb-noun combination (printReportX(...), for example). In OOP it may be any part of speech: nouns, verbs, adjectives, etc. This can make it confusing and inconsistent because the same app can be partitioned differently: around a noun in one version/alternative, around a verb in another, etc. It's poor MentalIndexability. Procedural provides a semi-standard for code-chunking criteria. OoHasMoreDials and more potential combinations of dials (valid alternatives) for the same given application. ThereIsMoreThanOneWayToDoIt can be taken too far when the confusion it generates overpowers the benefits of choice. It needs an extra something to rein it in; and so far that extra something is missing or unstated.
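
A hypothetical sketch of the contrast (C++ used only for illustration; all names invented here): the same print-a-report behavior chunked the procedural way, and then three equally "valid" OO homes for it.

  #include <iostream>
  #include <string>
  #include <utility>
  #include <vector>

  struct ReportData { std::string title; };

  // Procedural chunking: code divided by task (verb or verb-noun),
  // refined stepwise into sub-tasks.
  void printReportX(const ReportData &r) { std::cout << r.title << "\n"; }

  // OO chunking: the same behavior can legitimately land in several places.
  class Report {                 // on the domain noun itself
  public:
      explicit Report(std::string t) : title_(std::move(t)) {}
      void print() const { std::cout << title_ << "\n"; }
  private:
      std::string title_;
  };

  class ReportPrinter {          // in a verb-turned-noun class
  public:
      void print(const Report &r) const { r.print(); }
  };

  class ReportManager {          // in a "noun manager"
  public:
      void add(Report r) { reports_.push_back(std::move(r)); }
      void printAll() const { for (const auto &r : reports_) r.print(); }
  private:
      std::vector<Report> reports_;
  };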

[Oh, you're right. Hang on, I'm just gonna delete the twenty-three years' worth of C++ and Java code I've got here on my hard drive and re-write it in Pascal. No! Better: I'll re-do it in FORTRAN. Why does that not seem like a good idea? What would I have gained by writing all that code in Pascal or FORTRAN or another procedural language?]

You think those two languages are the pinnacle of procedural? And I agree there are some "spots" in a given app where OOP helps some. It's overdoing OO that's the problem.

[What do you think is the pinnacle of procedural? How do you define "overdoing" OO?]

Overdoing it is, in general, using OO where another paradigm would be simpler and has no identified future-maintenance downsides (at least none weighty enough to override the benefits of simplicity).

As far as "best" procedural language, I don't think there are any because OOP hype killed their progress, creating a self-fulfilling prophecy in a way. They were not modernized to have features like flag-free typing, flexible mixing of fixed and named parameters (see UniversalStatement), associative arrays with flexible use of both periods (or something equivalent) and square brackets, nested function scope, name-spaces, etc. (See CeeIsNotThePinnacleOfProcedural) -t


It needs an extra something to rein it in; and so far that extra something is missing or unstated.

The ObjectCapabilityModel restrictions serve well in this role. Also, once you add enough concurrency to give you trouble, it becomes pretty darn obvious that ValueObjectsShouldBeImmutable or even have FirstClass support. Once that happens, there is no need for VisitorPattern. OopNotForDomainModeling follows as an indirect consequence of ValueObjectsShouldBeImmutable.
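
For illustration, a minimal sketch of an immutable value object (a hypothetical Money type, not from any cited codebase; currency-mismatch handling omitted):

  #include <string>
  #include <utility>

  // A value object with no mutators: instances can be shared freely across
  // threads, with no defensive copies or locks needed for the value itself.
  class Money {
  public:
      Money(long cents, std::string currency)
          : cents_(cents), currency_(std::move(currency)) {}

      // "Mutation" returns a fresh value rather than changing this one.
      Money plus(const Money &other) const {
          return Money(cents_ + other.cents_, currency_);
      }

      long cents() const { return cents_; }
      const std::string &currency() const { return currency_; }

  private:
      long cents_;
      std::string currency_;
  };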

And I agree that OOP is missing something important: procedures are traditionally short-lived (a single call) and conceptually perform computations and make decisions using 'snapshots' of a model. But objects are long-lived, and thus should conceptually perform computations and make decisions based on changes in a model over time (e.g. observing changes in the FileSystem, or databases, or user behavior). There has (to my knowledge) never been a large OO project that fails to use or reinvent ObserverPattern, hook periodic and event-driven callbacks into specific objects, and perform various kinds of explicit polling, computation, and caching. Further, these reinventions are terribly awkward (especially with regards to concurrency and GarbageCollection!). It is my impression that the missing feature is FirstClass support for these patterns: this reactivity to non-message events, this ability for an object to 'observe' the results of complex computations.
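
A rough sketch of the kind of ObserverPattern machinery such projects hand-roll (names hypothetical; the genuinely hard parts -- concurrency and lifetime management -- are deliberately omitted):

  #include <functional>
  #include <string>
  #include <utility>
  #include <vector>

  // A long-lived object reacting to changes over time rather than to a
  // single call: each subscriber is re-notified whenever the model changes.
  class FileSystemWatcher {
  public:
      using Callback = std::function<void(const std::string &path)>;

      void subscribe(Callback cb) { observers_.push_back(std::move(cb)); }

      // Invoked by some polling loop or OS hook.
      void fileChanged(const std::string &path) {
          for (const auto &cb : observers_) cb(path);
      }

  private:
      std::vector<Callback> observers_;
  };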

The missing "companion paradigm" for OOP is, essentially, FunctionalReactiveProgramming. And that would include FRP access to external databases (i.e. subscriptions for ad-hoc queries on a database, as opposed to snapshots).

Explicit caches, of anything, ever, should be considered a LanguageSmell.


I find that ObjectOriented design takes me about 98% of the way, but some problems are still better solved with a procedural implementation. Simple problems like reversing the elements of an array -- problems with a well-defined procedural solution -- are often better served by a simple procedural implementation than by OO. If you don't believe this, then try coding the ObjectMentorBowlingGame yourself. (You might learn something!) In practice, I find that about 98% of business code is best served by OO design, and the rest is usually small convenience and helper methods "along the edges" of the OO classes. -- JeffGrigg
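
For the array example, a minimal procedural sketch (hypothetical, in C++):

  #include <cstddef>
  #include <utility>

  // A well-defined procedural solution: no object, no state, just a step.
  void reverse(int *a, std::size_t n) {
      for (std::size_t i = 0, j = n ? n - 1 : 0; i < j; ++i, --j)
          std::swap(a[i], a[j]);
  }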

Normalization is mostly about removing duplication. How does encapsulation remove duplication? Further, I find that the relationship between verbs and nouns is often not one-to-one beyond basic getters and setters. Forcing them to be tightly coupled when they are not coupled in the domain can create maintenance headaches. --top

Mostly I use polymorphism to remove duplication. As I did this weekend on a "Refactoring Challenge" problem: http://tech.groups.yahoo.com/group/testdrivendevelopment/message/32214

I'm using the term Normalization in the sense that all attributes in a table should depend on only the fields in the primary key, and that they should depend on all the fields in the primary key. In a similar vein, all the fields and methods of an object should depend on that object's identity. FeatureEnvy is a common anti-pattern (or "OO-style denormalization"); it's code that depends on another object (and its properties or methods) more than on the object where it is found. -- JeffGrigg
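
To illustrate (a hypothetical shapes example, not the linked challenge code): without polymorphism, every routine that needs an area repeats the same switch over kinds, and adding a kind means editing every switch; with it, the dispatch lives in exactly one place.

  #include <memory>
  #include <vector>

  class Shape {
  public:
      virtual ~Shape() = default;
      virtual double area() const = 0;
  };

  class Circle : public Shape {
  public:
      explicit Circle(double r) : r_(r) {}
      double area() const override { return 3.141592653589793 * r_ * r_; }
  private:
      double r_;
  };

  class Square : public Shape {
  public:
      explicit Square(double s) : s_(s) {}
      double area() const override { return s_ * s_; }
  private:
      double s_;
  };

  double totalArea(const std::vector<std::unique_ptr<Shape>> &shapes) {
      double sum = 0.0;
      for (const auto &s : shapes) sum += s->area();  // single dispatch point
      return sum;
  }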

I don't know the context of your code and your assumed framework, so it's hard to make any generalizations from it. I don't even know if "types of" validators is a valid concept anyhow because often things don't fit nicely into hierarchical "types" (LimitsOfHierarchies). And, I generally prefer a predicate-based approach (surprise surprise):

  <fooInput name="myValue" valueType="integer" min="0" max="9999" required="true">
  <fooInput name="SSN" template="999-99-9999" templateLibrary="app77">
And it seems more reader-friendly than your approach. As far as the implementation of such predicates, there may indeed be some hierarchical tendencies; however, they are likely "soft" trees (EightyTwentyRule), and polymorphism is a fairly definitive design decision, expensive to undo if it's the wrong approach (or else one lives with duplication). If one studies the patterns of variations-on-a-theme seen in the real world, then I believe they will eventually agree that hierarchical taxonomies are not sufficient.
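
A rough sketch of how such predicates might be composed behind that markup (all names hypothetical; one possible implementation, not a prescription):

  #include <functional>
  #include <string>
  #include <vector>

  // Each attribute of the markup maps to an independent check, composed per
  // field; no validator class hierarchy is required. (Error handling for
  // non-numeric input is omitted: std::stol would throw.)
  using Predicate = std::function<bool(const std::string &value)>;

  struct FieldSpec {
      std::string name;
      std::vector<Predicate> checks;  // required, min, max, template, ...
  };

  bool validate(const FieldSpec &field, const std::string &value) {
      for (const auto &check : field.checks)
          if (!check(value)) return false;
      return true;
  }

  // The markup's required="true" min="0" max="9999" might become:
  const FieldSpec myValue{
      "myValue",
      {
          [](const std::string &v) { return !v.empty(); },            // required
          [](const std::string &v) { return std::stol(v) >= 0; },     // min
          [](const std::string &v) { return std::stol(v) <= 9999; },  // max
      }
  };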

This is partly why OOP has developed a fat set of "patterns" (DesignPatterns): ways to add indirection so that strict or simplistic hierarchies or "sub-types" are not required. Many experienced OOP developers will agree that "simplistic" sub-classing is not powerful enough to capture the richness of real-world variations-on-a-theme and sell the patterns instead. This is why OO by itself is not "sufficient" -- it goes half way. It needs (at least) DesignPatterns also, according to this view. The problem is that the patterns often make the code messier and more bureaucratic than what they replace. In other words, doing OO "right" makes OO ugly. Perhaps your view is different, and mere polymorphism is sufficient. If so, then you are not the audience of this topic. --top

[OOP hasn't developed patterns, patterns were discovered when examining OO programs. Patterns are documented strategies for solving common problems, they are not about "ways to add indirection so that strict or simplistic hierarchies or 'sub-types' are not required." Similar patterns -- plus ones unique to the given paradigm or even given languages -- exist in procedural programming, logic programming, and functional programming. When I studied AplLanguage in university, we were taught it with emphasis on common idioms that applied to a variety of problems. These too are patterns. Patterns exist in virtually any creative and/or engineering endeavour. Note that the notion of explicitly recognising and documenting DesignPatterns originated in building architecture.]

What you say does not really contradict my premise. They may indeed have been "discovered", but they were discovered because the simplistic approach was insufficient, so different approaches were taken. Let me reword it: OOP without patterns is not competitive, but OOP with patterns is. It's not OOP by itself that makes the code base competitive, but OOP plus an extra ingredient. Other paradigms may indeed have patterns of their own. However, this is why comparing just paradigm versus paradigm is not sufficient. To compare, specifics about the style of usage are also needed. Maybe it's a bigger problem of comparing paradigms in general. But for some reason it seems easier to go wayward in OOP. There are more ways to screw up but still be "OOP". OoHasMoreDials. It's a paradigm that raises more questions than answers. --top

[There is no "simplistic approach" that does not involve patterns. There are problems, and ways to solve problems that involve collections of classes interconnected in certain identifiable ways. These are OO design patterns. There is no such thing as "OOP without patterns". There is no such thing as programming without patterns. Patterns are everywhere, in every language, in every creative endeavour. There are always patterns. As for OOP being "a paradigm that raises more questions than answers", that just sounds bizarre. OOP is a tool. What you wrote makes no more sense than, say, "a screwdriver is a tool that raises more questions than answers."]

Thanks for giving me another analogy to play with. Suppose somebody used the screwdriver as a martial arts weapon. After decades of training, one can cut down enemies at lightning speed by throwing the screwdriver just right. But it's not the screwdriver that is the grand tool here, but the sophisticated training and extensive practice. Maybe if people documented and honed their procedural/relational knowledge, they could do things as well as or better than OOP. Or maybe the techniques of value are orthogonal to the root language or tool. The martial arts expert could perhaps do wonderful things with forks too, given enough time. Without the details of this dark matter, this mysterious eastern skill, we don't know. The hard questions are with and about the martial arts whiz, not so much the screwdriver.

[Suppose somebody used the screwdriver as a martial arts weapon? Bleh. Suppose someone used nonsense to construct a StrawMan... Good programmers tend to write good programs regardless of paradigm and language, bad programmers make a mess regardless of paradigm and language. It is notable that good programmers tend to choose OOP and functional languages over strictly procedural languages. Why is this?]

So then mastering the tool is the key, not the paradigm or language? As far as what "good programmers" choose, first we may have to define what we mean by "good programmer". You may be defining it by what tool they use. Further, some programmers are fast by themselves but cannot produce team-friendly or maintenance-friendly work. In my observation, good programmers are the ones who can cook up and mentally evaluate a lot of different design options against predicted future changes, predict those future changes fairly well based on domain and general experience, and understand the WetWare patterns of fellow and future maintainers, including knowing the personality types that a given shop tends to hire. --top

[Paradigm and language are components of the facility of a tool. However, neither are of any value if a given paradigm and language are not mastered. I define "good programmer" as one who consistently meets user requirements, on time and within budget on projects of arbitrary size, in a manner that draws the praise of colleagues including team members and maintainers.]

Keep in mind that "colleagues" may not have mastered multiple paradigms, and thus may be confused or slowed down by "fancy" code. It's a matter of knowing and fitting the environment. Programming tends to be a career for the young, meaning that many are not going to be around long enough to master multiple paradigms.

[Don't you mean you are confused or slowed down by "fancy" code? This is somewhat unusual, but not unknown. It doesn't mean you're a poor programmer; indeed, within your areas of expertise you may well be a master. However, among good programmers (see above), most are at the very least capable of appreciating multiple paradigms, even if their greatest facility lies in only one.]

I'm talking about a typical developer. To make sure that's clear: the "good programmer" has to produce code that is readable and maintainable by a typical developer. There may be cases where everybody on the team is the cream of the crop, but that is the exception, not the rule. I'm just the messenger; I don't control what god and society produce.

You give "typical developers" too little credit. No matter how much a man of your means might wish it, it is impossible to develop a FoolProof? language.

Most developers I know are more interested in keeping up with trends because that's where the money is and/or they are afraid of "falling behind". Exploring rarely-mentioned paradigms/languages that may have some conceptually "cool" features is not their top priority. If that differs from your experience, well, so be it. I report it as I see it. And, I don't remember asking for a fool-proof language. Usually it's the "type safety" proponents looking for that. --top

[Top, you're making deliberately provocative statements on known-contentious topics in order to spark argument. Stop it. Whenever you get the urge to troll, go write some code.]

"Fad chaser" was meant to be a compact description, not an insult. Honest. Anyhow, I changed it. Is it okay with you now?


See: MissingFeatureSmell, DesignPatternsAreMissingLanguageFeatures

CategoryObjectOrientation, CategoryOopDiscomfort



