Great Lisp War

Abstraction Versus Marketplace


Lisp (and related dialects) is perhaps the most flexible language known. You can make it be just about any paradigm or abstraction you can envision. Or at least its flexibility-to-syntactical-complexity ratio is near the top (power without clutter). It's a marvel to behold; a "language design formula" that is perhaps the E=mc^2 of programming. Lisp has been around for about 50 years, and enjoyed a renaissance of popularity in the 1980s during the AiBubble, which spilled over into non-AI projects.

However, despite plenty of opportunities in 50 years, it failed to catch on for rank-and-file development. Its ideas keep getting introduced into various languages over the years, but often fail to "stick" for common usage, partly because such languages or add-ons don't do it as well as Lisp (meta + complex existing syntax = mess), and partly because it can encourage MentalMasturbation and job-security "creativity" in programming, and raises the caliber of skill needed to read code using such techniques, which increases employee and recruiting costs beyond what companies want to bear.

One can argue that Lisp-like techniques are the way it "should" be done, but the industry keeps saying "no", for good or bad. The industry wants tools that cater to semi-transient PlugCompatibleInterchangeableEngineers. This often results in tools that do some amount of "herding" to provide consistency via pre-packaged semi-domain-specific idioms, and less catering to "abstraction gurus". It's not quite "lowest common denominator", but something closer to the 25th percentile, meaning that roughly 75% of those hired need to be able to grok the code in a timely manner. (The economics of hiring caliber are debated later.)

I've seen organizations complain even about "excessive function usage" with old-fashioned functions, because some developers in the past were confused by the level of factoring used, and the organizations didn't want to fire them: they otherwise did decent work under lower abstraction, and often had better people skills than the "highbrow" developers. (Perhaps they were partying in college while the gurus were writing compilers.) I've seen this phenomenon in multiple organizations and have at times been ordered to "dumb down" my code. I even limit my favorite abstraction, TableOrientedProgramming, because other maintainers are not used to it.

Duplication rarely gets somebody outright stuck; it's just more busywork, and perhaps more little bugs caused by accidentally missing one of the repeated segments. However, heavy abstraction can get developers outright stuck. Businesses often value consistency and predictability more than higher average productivity, and hum-drum code gives them that. Being able to make accurate predictions about scheduling and resources is highly valued in business, and owners and managers have repeatedly decided it matters more than higher average long-term productivity. (Start-ups and R&D-style projects may be an exception, because "rock-star" developers can rocket you past the competition in the shorter term, getting your company or product on the map.)

In short, Lisp Lost, and this includes Lisp-like techniques. You can bitch about the industry being "stupid" or "ill-informed", but the reality is what it is, and that's how it reacts to high-abstraction attempts.

I'd hardly describe being #13 on the TiobeIndex (as of February 2013) as "lost", especially compared to what LISP beats.

It's not a good index of "production" language usage, as described in that topic. I don't question that it's personally popular. It's also a great teaching language. I've seen no evidence of common usage for custom business applications (unless maybe AutoCad is involved). Look how many WikiWiki topics there are on it: CategoryLisp; it's a grand nerd conversation piece, cranking up its "mention" score. It's kind of like Lamborghinis: talked about far more often than purchased. TiobeIndex would be like estimating the number of Lamborghinis on the road by counting web pages that mention them.

I'm pretty confident in estimating that less than one percent of commercial app code is in Lisp, and would guesstimate the actual number is something like 1 in 1,000 lines.

What source of statistics -- outside of your inevitably-limited (and that's nothing personal -- all our personal experiences are limited) experience -- do you have for "common usage for custom business applications"?

I don't have anything approaching an OfficialCertifiedDoubleBlindPeerReviewedPublishedStudy, but neither do you. It's anecdote against anecdote.

Be that as it may, the fact that we're debating LISP's significance over fifty years after its inception, and the fact that it ranks 13th on the TiobeIndex, and the fact that modern implementations like Clojure are growing in popularity, and the fact that it's used as a scripting language in a variety of products, and the fact that some of its original concepts -- dynamic typing, higher-order functions, homoiconicity, functional programming -- have such an impact even on current products, means LISP lost nothing. It won. It didn't win by being a popular programming language like C# or Javascript. Those will eventually diminish in popularity. It won by being profoundly influential on every modern programming language, including C# and Javascript, and LISP will certainly influence the languages that follow them.

In short, if there was a GreatLispWar -- and I'm not sure there was -- LISP won. Not by being a popular language, mind you, but by being influential on every popular language.

I won't deny it's had some influence on a lot of languages. However, that's not the same as taking over production languages and/or production code design (language features actually used, not necessarily what's included in the language). I've already described the problems with your reference to TiobeIndex above. Lisp fans have been saying the mainstream breakthrough is "just around the corner" for several decades. I heard the same song and dance in the 1980s. It's like how religious groups keep predicting the imminent end of Earth, but we are still here. Semi-DomainSpecificLanguages are still in charge, and still pre-package common abstractions and idioms of the domain/focus to make them more digestible to fungible staff. Whether that's good or bad, it's what the industry likes and keeps backing despite 50 years of Lisp's mega-meta capabilities.

There are two simple reasons why LISP won't replace C# and Javascript: the perception that a festival of parentheses hurts readability, and the perception that prefix notation hurts readability. The concepts it demonstrates, however, will endure far longer than the languages that adopt them.

I do agree Lisp has what could be described as "syntax issues", as partly described in MaspBrainstorming. But that's not the entire problem. Over-meta-tizing mainstream languages, or trying to introduce such a language, has repeatedly failed to catch on, and I see no reason why that would change. The forces causing it are still in place. JS is in a unique position in that it's kind of forced on the industry due to historical issues akin to QWERTY keyboards.

Actually, the "syntax issues" are the entire problem. The "over-meta-tizing" you deprecate is being adopted by other languages. This will, no doubt, continue.

Until it becomes an expensive-specialist and convoluted product/stack, and then something like Html or VB comes along that's easy for average developers to pick up quickly and steals the market share from it, and the cycle continues. The basic AlgolFamily idioms continue to dominate in terms of actual code written, even if a particular language has additional meta capabilities. (Somewhere on this wiki is a lost topic about how languages "past their prime" add on features to try to stay competitive in the feature-brochure check-mark race, but that's not the same as actual feature usage.) {Text added after original dialog.} -t

HTML is a text markup language, closer to the old-school word processor WordStar than a programming language, so it doesn't count. You can't write programs in it. It's true that "easy entry" languages do pick up market share from professional languages, especially when a field is developing, but that's because beginners need an easy starting point and new fields attract a lot of beginners. Such languages inevitably gain features -- i.e., more "meta-tizing" -- as their users grow more proficient.

I don't believe they become more meta-tized. If anything they become less meta-tized because the common idioms are included in a declarative way (like common GUI idioms SHOULD be in HTML or its offshoot). Making common activities be paint-by-numbers (like "Wizard" dialogs) is how they get people to pay for upgrades to the next version.

C++ did not provide templates at first. Generics were added to Java. C# recently gained lambdas. "Meta-tizing" gets added because it adds power.

Often it's a feature-bullet-point pissing match, perhaps unnecessary. MS wants to make C# more of a SystemsSoftware language to compete with C/C++, not just an app language, and that's partly why they added it. Languages do tend to pick up clutter over time, and I'm not sure it's a good thing. C++ is known for its steep learning curve and piles of gotchas. It has a few too many +'s.

For those of us who use "clutter" and find it of value, we don't mind if you don't use it. We object if you try to take it away. What's clutter to you is a toolbox for us.

See the "assault weapons" analogy below.


Note that AutoCad added VBA scripting support, in addition to Lisp. If Lisp is so palatable, why would they feel compelled to do that?

There's frequently a perception that a festival of parentheses hurts readability, and a perception that prefix notation hurts readability -- especially among naive or substandard developers. Unfortunately, naive and substandard developers sometimes control the purse strings. (Note that the VBA support is now deprecated. There's a shift to .NET support, but that obviously is due to forces that have nothing to do with LISP vs VBA.)

It is true that "average" developers wag the industry's dog, but that's just the way it is. You need to see it from the economics and staffing perspective of org decision makers. Being frustrated that no support is given to higher-brow developers isn't productive. (I addressed the parenthesis issue above already.)

Us "high brow" developers aren't frustrated at all. We typically make many times the salaries of mediocre developers and retire wealthy. Nothing stops us from using the tools we prefer, and we succeed because we excel at using them.

Regardless, Mitt, there appear not to be enough of you to change the industry patterns. I've learned to avoid excess meta-tizing and indirection because of complaints about confusing follow-on developers.

I don't want to change industry patterns. It's you who seems to want languages that cater only to the lowest common denominator, developer-wise. The rest of us are content to allow you to use as few features as you like, as long as that doesn't impinge on our ability to use features that are as powerful as we like. Your concern for follow-on developers is laudable, but why do you assume they'll be incompetent?

They are not "incompetent", just often not skilled in meta techniques. You seem to be trying to paint a grey issue as black-or-white. There are plenty of developers who are quite productive under "regular" levels of abstraction. Sometimes they type more than necessary because of it, but that's their prerogative. Their code may be fairly tedious to maintain because of the duplication, but at least it's straightforward: no drama, no weirdness, no MentalMasturbation experiments.

Fine. Just don't advocate taking away our tools to cater to them.

That kind of reminds me of the assault weapon debate. One side argues that "assault weapons" are too dangerous to allow regular citizens to own, while assault weapon proponents argue they are useful more often than they are misused or cause problems. Lisp is the ultimate assault weapon of languages: compact, accurate, reconfigurable, few parts, and rapid-volume fire.

That's a bizarre analogy. Does it have a point?

Lisp kills :-)

FacePalm

It's a balance act between power and misuse or poor training.

LISP is a "balance act between power and misuse or poor training"? Huh? I'm still not clear on your point.

The industry choices, not Lisp itself.


Part Time "Jack" Programmers

A general trend in the industry is away from centralized specialists and toward "departmental generalists". In other words, instead of dividing up by skill, organizations are deciding that dividing up by organization structure (departmental) may be better, and to have a multi-hatter generalist IT person. An in-between division is "hardware" versus "software" where the hardware person does general trouble-shooting of hardware, OS, and basic installations; while the "software" person focuses on learning the department's purchased applications as a power-user-level trouble-shooter, a DBA, and a custom software developer.

This means that the person doing the programming will likely only be able to devote a fraction of his/her time to programming and the skill of programming.

It's not just about programmers being "dumb", it's that they are asked to be a jack of all IT trades, which makes them master of none.

Perfectly reasonable. Any skilled programmer who can't fix a PC, change printer toner and other gubbins, sort out network issues, make coffee, work with users, design a system, install a package, create a dynamic Web site, clean a keyboard, construct a database, and learn -- and quickly be productive in -- new languages, tools, and paradigms, should quit and take up knitting. I don't know about you, but to me that's exactly what being a "skilled programmer" means. As developers, we may be able to distinguish programming from printer repair, but the average user can't and is just looking for someone who is "good with computers." So that's exactly what we should be. The days are ending when all office work would stop to ring Tech Support and hope that the third or fourth call would finally get exactly the right kind of techie -- who, no matter how good he is, didn't have a clue who you were or what your office did -- to solve the (mysterious) computer problem at hand.

Your suggestion that they master everything in IT is unrealistic. Maybe a select few can -- the Mozarts with photographic memories and speed-reading capabilities -- but it takes experience for most to do things well. Without experience one wastes time getting stuck on gotchas and poorly-documented corners. It's more economical to have in-house or rented specialists for the times when the generalist gets stuck. Your false Mozart-or-knitting dichotomy is silly.

No, it's not. It's the skill profile of those who succeed in IT and keep their users happy. And it's not as uncommon as you think; indeed, it's a typical "computer geek" profile.

Somebody who is an interface designer whiz is not going to want to clean printers and crawl under dusty desks to string cable for a living.

Anyone who thinks they're too good to clean printers or string cable shouldn't be in this business, and that goes for everyone from junior developers to the company CTO. SpecializationIsForInsects, and elitism is disgusting.

{...and all us non-elitists are way better than the elitists, anyway.}

Hey, I'm just the messenger. I'm not smart enough to fix society and humanity, so I focus on tools instead.

[And your message is… what, exactly? That interface designers think they're too good to string cable? -DavidMcLean?]

From an economic standpoint, it doesn't make sense to pay a skilled specialist to do a semi-commodity activity that could be done by somebody who earns much less per hour. An extreme case is paying a $300/hr brain surgeon to clean and scrub his tools after surgeries. Cleaning would then cost the hospital $300/hr when it could cost them only about $20/hr. They would have to pass that extra cost on to consumers or patients. Whether a given person falls into that category has to be evaluated on a case by case basis. The human ego may mentally put oneself in that category, but outsiders may see it otherwise.

Medicine has not yet demanded, to use your phrase, "departmental generalists". However, every brain surgeon knows how to clean and scrub his tools after surgeries. It's an explicit part of his or her expertise, whether he or she does it regularly or not.

Knowing how to do something and actually performing it regularly are two different issues.

Not in this case, or the IT case.

Let me clarify: should the surgeon know how to clean surgical tools? Yes. Should an established surgeon regularly clean his/her surgical tools? No. One could argue that he/she should still do it on occasion to keep in practice.

We're talking about IT, not surgery. Specialisation in IT is baffling to non-IT people, who find it mysterious that the person who will write a program won't fix a printer and vice versa. From a non-IT point of view, it's all computers, isn't it?

There's no specialties within "computers"? I find that a rather curious claim. Anyhow, if it takes the programmer 3 times longer than the experienced shop "hardware guy" to fix the printer, then it may not be economical to the org for the programmer to fiddle with it. It's probably good to keep your skills fresh by trying variety, but that can be taken too far.

[There indeed are no specialities within the domain of "computers", from a non-IT point of view. In general, non-IT people don't recognise a distinction between "hardware guys", "software guys", etc., generally collapsing them under umbrella terms like a simple "computer guy" moniker. It's obvious to us that there's a difference between skill with hardware and skill with software, but it's not obvious to the rest of the world. -DavidMcLean?]


Why So Long?

HOFs have been around for about 50 years (longer if you count the fact that in machine language you can "point to" a subroutine). So why would they be mostly excluded from mainstream "production" languages for 50 years, and suddenly mainstream programmers are allegedly ready for them now and only now? Did the Internet change something specific? What? What Great Switch suddenly flipped? And why didn't the Lisp fad spike of the 80s trigger it?

Much of the industry was diverted into achieving language productivity through IDE evolution, rather than language evolution, until Javascript and Ruby -- both heavily influenced by LISP -- demonstrated that higher-order capabilities were worthwhile outside of functional languages.

By the way, the fact that machine language can "point to" a subroutine is not the same as a higher-order function. A higher-order function captures its calling scope. A subroutine generally does not.
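To make that distinction concrete, here is a minimal JavaScript sketch (JavaScript being one of the languages under discussion; the names are illustrative):

// A higher-order function can capture variables from its calling scope:
function makeCounter() {
  var count = 0;          // local to this particular call of makeCounter
  return function () {    // the returned function closes over 'count'
    count = count + 1;
    return count;
  };
}

var next = makeCounter();
next(); // 1
next(); // 2 -- the captured 'count' persists between calls

// A raw subroutine address, as in machine language, is just an entry
// point; it carries no captured environment along with it.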

I still believe their semi-prominence is a temporary blip until IDE-centric tools or a GUI markup-language standard pre-package GUI idioms. It's a blip comparable to C++'s heyday for Windows GUIs until VB, Delphi, MS-Access, etc. came along. The VB desktop approach to low-budget in-house apps was very popular and appeared to be solid until the Web's deployment ease bit into it. Shops didn't go web because web was easier to develop and maintain with existing or fungible staff, but because it reduced deployment costs and DllHell, which really ate up help-desk time. But, we'll see. (It's not that I love VB, but shops in general sure did, minus the deployment issues.) What the industry really wants is something like VB without the deployment issues. Shops are spending waaaay too much time on UI fiddling. Something has to give. -t

Of course, VB and Delphi are still there for the shops that love them. If developers are frustrated with Web UI fiddling, they appear to be missing an opportunity to fill a market void with a profitable product. But then, why aren't Web application builders catching on? Google "web application builder" to see acres of software that you've probably never seen before. Maybe it's because developers are realising that drag-n-drop interface builders with limited languages aren't as productive as powerful languages on their own.

I will agree with that last sentence generally. However, it doesn't matter enough for a combo of staffing and behavioral reasons already discussed. The industry repeatedly votes for pre-packaged idioms (be it IDE's or markup or DSL's). "Meta-friendly" languages often lead the bleeding edge, but pre-packaged idioms usually follow closely behind when the patterns become less murky. The web side has just been later than normal to finish the cycle. I suspect it's partly due to a misguided obsession with MVC. I have plenty of gripes about certain drag-and-drop IDE's and limited meta ability also, I would note. (Deployment of desktop apps is still a problem.)

Is the lateness of the Web side to finish the cycle really because it's later than normal, or because there's a permanent shift away from pre-packaged idioms to "meta-friendly" languages? Are meta-friendly facilities, in fact, the new pre-packaged idioms? Time will tell. As for MVC, I'm surprised you're concerned with the apparent obsession thereof. After all, aren't your favourite technologies the ultimate expression of MVC? I.e., model = database, view = UI, controller = code?

Well, that's one possibility. However, many programmers I know gripe about the HtmlStack. I see no love, other than from highly-paid Web UI specialists for trendy public front-ends. Many turned to Flash to get better control and consistency, but Apple, MS, & Google are trying to kill it now. As far as MVC, it perhaps has multiple interpretations.

Indeed, I see no particular love for the raw HtmlStack either. Love starts with languages and tools that wrap it, like Javascript and jQuery.

Most of the features/idioms could be packaged into something easier to digest. VB's timer is a case in point: drag the timer object with the stop-watch icon onto the form, right-click to set properties (cycle duration), double-click to open the event code editor, and type in the code that repeats. (The right-click and double-click are standard IDE behaviors in VB. There are other shortcuts as well, not mentioned here.) No "naked" HOF's or other implementation guts sticking out, and no need to dig in manuals for setup and parameter syntax, etc. Even a drunk programmer can do it. The markup version shown in BradyBunchDiscussion is almost as easy. -t
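For contrast, here is roughly what the same timer idiom looks like with the implementation guts showing, via JavaScript's standard setInterval API (a rough sketch; the tick-counting details are invented for illustration):

var ticks = 0;
var timerId = setInterval(function () {  // the callback is the "event code"
  ticks = ticks + 1;
  console.log("tick " + ticks);
  if (ticks >= 5) {
    clearInterval(timerId);              // stop the timer after 5 cycles
  }
}, 1000);                                // cycle duration: 1000 ms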

That's fine when the "packaged" materials precisely suit requirements. Unfortunately, they rarely do, and that's why we have higher-order functions and other mechanisms to help define what isn't pre-packaged. Drag'n'drop forms and click-able property lists encourage human interaction, which is an inefficient way to implement customisation and (especially) automate repetitive tasks. Language encourages automation; drag'n'drop interfaces do not.

HOF's don't provide any new functionality that cannot be done some other way (unless the root or base system is built around them.) A code "design" improvement is perhaps debatable, but do versus not-able-to-do is another thing. At the very least, we have TuringEquivalency. Generally the "idiom machine" will handle roughly 80% of our needs, and the rest needs to be custom coded. Ideally primitives or lower-level building blocks are available such that one can diddle individual bits or pixels if need be for special cases. You have not presented a CBA case where they offer a significant improvement (outside of client and SystemsSoftware-specific idiosyncrasies).

That's true -- HOFs don't provide functionality that cannot be done some other way. But, by the same token, 'for' loops don't provide functionality that cannot be done with 'while' loops, and we don't really need 'while' loops when we've got gotos. Functions & procedures aren't necessary -- they don't provide functionality that cannot be done with masses of in-line code. All you really need are gotos, 'if' statements -- no 'else' required -- I/O, variables and simple expressions. We don't even need multiply and divide; we can construct them from repeated addition or subtraction. Sounds like 6502 assembly language, doesn't it?
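To make the reductio concrete, here is multiplication built from repeated addition, sketched in JavaScript (assumes a non-negative b):

function multiply(a, b) {
  var product = 0;
  for (var i = 0; i < b; i++) {  // b repetitions of "add a"
    product = product + a;
  }
  return product;
}
multiply(6, 7); // 42 -- works, but 'a * b' is obviously saner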

Obviously, 6502 assembly is no longer a reasonable tool for developing business applications. For the same reason, "primitives or lower-level building blocks [that] are available such that one can diddle individual bits or pixels if need be for special cases" aren't reasonable either. Lower-level facilities are more complex, more awkward, and generally riskier in terms of reliability than higher-level abstractions. High-level abstractions are preferred, precisely because they make it easier to achieve reliable functionality than lower-level constructs.

Even if HOFs only offered a significant improvement in client-specific and SystemsSoftware-specific cases, that alone would be a reasonable justification for HOFs. Client-specific and SystemsSoftware-specific cases do appear when developing business software, and it's better to have facilities to easily and reliably handle them than not.

Again, the "advantage" is like saying SQL works better on SQL databases than Rel does, and Rel works better on Rel databases than SQL does. It's a UselessTruth. In a parallel universe, the designers of JS or its equivalent could just as well have made the native timer API be Eval-based or object-based INSTEAD OF HOF-based. If it were truly powerful in a general way, you should be able to find a more universal example. The fact that you can't is quite telling. You are grasping at a straw with that example, trying to milk it for every speck, which is about all it delivers as far as evidence.
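For illustration, such an object-based native timer API might have looked something like this (a hypothetical sketch -- 'Timer' and 'onTick' are invented names, not part of any real DOM standard; setInterval is used underneath only to keep the sketch runnable):

function Timer(intervalMs) {     // constructor for the invented API
  this.intervalMs = intervalMs;
  this.onTick = function () {};  // caller overrides this "event handler"
}
Timer.prototype.start = function () {
  var self = this;
  this.id = setInterval(function () { self.onTick(); }, this.intervalMs);
};
Timer.prototype.stop = function () {
  clearInterval(this.id);
};

var t = new Timer(1000);
t.onTick = function () { console.log("tick"); };
t.start();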

The universal example is HofPattern.

And you've been invited, several times, to pretend Javascript/DOM's native timer API is not based on a higher-order function, and demonstrate how you could implement the "Brady Bunch example" without HOFs. Show us how it would be better. Our car is in the race; it's time for you to RaceTheDamnedCar.

(define (divisible x n)
  (= 0 (remainder x n)))

(define (fb n)
  (if (= 0 n)
      #t
      (begin
        (fb (- n 1))
        (cond [(divisible n 15) (printf "FizzBuzz")]
              [(divisible n 5)  (printf "Buzz")]
              [(divisible n 3)  (printf "Fizz")]
              [else (print n)])
        (printf "\n"))))

(fb 100)

<define name="divisible" args="x n">
  <equals>
    <num>0</num>
    <remainder>
      <var>x</var>
      <var>n</var>
    </remainder>
  </equals>
</define>

<define name="fb" args="n"> <if> <equals><num>0</num> <var>n</var></equals> <then><bool>true</bool></then> <else> <fb><minus><var>n</var> <num>1</num></minus></fb> <cond> <case> <divisible><var>n</var> <num>15</num></divisible> <printf><string>FizzBuzz</string></printf> </case> <case> <divisible><var>n</var> <num>5</num></divisible> <printf><string>Buzz</string></printf> </case> <case> <divisible><var>n</var> <num>3</num></divisible> <printf><string>Fizz</string></printf> </case> <else><print><var>n</var></print></else> </cond> <printf><string>\n</string></printf> </else> </if> </define>

<fb><num>100</num></fb>
As far as what tools to include in the "tool-box", it's a matter of weighing the trade-offs. I won't go into those again here because I've already discussed it in multiple topics of late. (Frankly, I wouldn't personally miss FOR loops; I don't use them that much. I'd gladly trade them for far more useful features such as optional named parameters.)

No, as to what tools to include in the "tool-box", it should never be a matter of "weighing the trade-offs" because that's nothing but a euphemism for catering to the lowest common denominator. It should always be a matter of providing the most powerful tools possible within the given language paradigm.

I see no reason why we should ignore what the market wants and real-world staffing situations. And I'd call it "mid-common-denominator", not "lowest". You are exaggerating.

[Language design should focus on what programmers need, because programmers use programming languages. Managers do not use programming languages. Thus, when designing a language the primary goal is making it as useful as possible for programmers. -DavidMcLean?]

I'd estimate that at least half of development managers were programmers themselves. If their experience with the multiple programmers that have come and gone in their shop showed that heavy abstraction and functional indirection made for better projects, including maintenance, it would gain acceptance. And when they do fuss about such, it's often not "I don't get it", but more along the lines of "the last guy who did things like that made code too confusing for the rest of our team after he left. Many couldn't maintain the code, putting us in a bind during vacations."

If a shop wants to outlaw certain language features, that's fine. Some shops outlaw C++ multiple-inheritance and/or templates; other shops make effective use of them. That's no reason to consider excluding multiple-inheritance and/or templates from C++.

As for features that "gain acceptance", note that HOFs are available in Javascript and lambdas are appearing in C# and Java. It seems they are gaining acceptance.

I believe it's a FeatureCheckboxBrochureGame. Time will tell. New languages and fads will come along and people will start forgetting about Java etc. And if people cannot find a decent use for them in their domain, HOF's will mostly be ignored. Outside of existing API's forcing it on you, you haven't found decent cases.

...Except, of course, for every indivisible algorithm that requires customisation, as shown in HofPattern.
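The general shape of that pattern, as a JavaScript sketch (the names are invented for illustration, not taken verbatim from HofPattern): a fixed, indivisible algorithm skeleton whose variable step is passed in as a function.

// The retry loop is the fixed, indivisible part; 'action' is the
// caller-supplied variable step.
function retry(action, maxTries) {
  for (var attempt = 1; attempt <= maxTries; attempt++) {
    try {
      return action(attempt);            // the injected custom step
    } catch (e) {
      if (attempt === maxTries) throw e; // out of tries: give up
    }
  }
}

// Usage: the skeleton is reused as-is; only 'action' varies per caller.
retry(function (attempt) {
  if (attempt < 3) throw new Error("transient failure");
  return "succeeded on attempt " + attempt;
}, 5); // -> "succeeded on attempt 3"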

It's great because you say it's great. The pattern is rare in the wild, or at least uncompetitive with alternative solutions.

[What about qsort(), which is part of C's standard library? What about approximately half of Ruby's standard library? Are standard libraries rare in the wild? What about any genetic algorithm ever written, DijkstrasAlgorithm, AstarSearch, and so on? -DavidMcLean?]

I've never ever used qsort. I imagine it's useful for SystemsSoftware and embedded systems, but those are not my niche. And much of Ruby's library, and perhaps qsort's comparison techniques, could be implemented with objects. If you don't like Ruby's object syntax, complain to the Ruby makers, not me.

[If you use an object constructed on-the-fly with an associated callable method---the C++ functor pattern, effectively---then you're using higher-order functions. It's exactly the same thing, just with syntactic overhead. The minimal syntactic overhead on blocks in Ruby is vital to their utility in code; any syntax that required declaring an object-with-some-named-method to pass to things like Enumerable#each would be too verbose to be used as ubiquitously as Enumerable#each is and should be. JavaScript might be a better example to look at here, because its object syntax is extremely concise; despite this, one finds that the vast majority of functions-that-need-to-take-functions just take functions, not objects. -DavidMcLean?]
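Both forms can be shown side by side in JavaScript (a sketch; Array.prototype.sort and its comparator argument are standard, while the wrapper object is invented for illustration):

var nums = [10, 2, 33, 4];

// Function form -- what the built-in sort actually takes (like C's qsort):
nums.sort(function (a, b) { return a - b; });

// Functor-object form -- the same idea with syntactic overhead:
var comparator = { compare: function (a, b) { return a - b; } };
nums.sort(function (a, b) { return comparator.compare(a, b); });

// Either way, nums is now [2, 4, 10, 33].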


Re: "Abstraction Versus Marketplace"

Or is it TheoryVersusPractice???

FalseDichotomy -- The third point is, for lack of a better term, "elegance", which is always tied to the task at hand. LISP is the capstone for functional programming, but nowadays, it's about data -- a bunch of nested parens representing data relationships isn't suited for recursive solutions.

Recursion-centric techniques are not really "fighting" for territory with FP and "traditional Algol-style", at least not that I see. Perhaps it could become a visible competitor, but it's not there yet.


"Intelligence" is not so clear cut. I don't consider my wife very "intellectual"; however, she has a great street smarts that I admire much.


See also: IfFooIsSoGreatHowComeYouAreNotRich, WorseIsBetter, EconomicsOfAdvancedProgramming, OneMoreLevelOfIndirection, AynRandDesignPhilosophy, IndustrialSociology, StaffingEconomicsVersusTheoreticalElegance, SummaryOfHofExamples


CategoryLisp, CategoryIdealism


FebruaryThirteen JanuaryFourteen

