Languages that are "critic favorites" but fail in the marketplace perhaps fail for reasons that can be identified. The reasons are not meant as value judgements, but rather to identify patterns in popularity. Here are some suggested patterns:
- Deviate from C-style syntax
- Python, VB, and Pascal did OK. Not to mention PerlLanguage, with some of the ugliest syntax around.
- That's because they have C-style syntax. VB and Pascal are much more similar to C than they are to SmalltalkLanguage or LispLanguage. And as stated above, we're stating patterns, not laws.
- COBOL and SQL appear to have some market share
- It may be worth noting here that COBOL, BASIC, and Pascal all predate C. Maybe this is a factor?
- You could argue that it's "deviation from familiar syntax" that's important. Most programmers have two syntaxes they are familiar with: English and C. C++, Java, C#, JavaScript, and PHP are C-syntaxed. Python, VB, Pascal, COBOL, and SQL are English-syntaxed. Anything else fails.
- Python has English syntax? Wow. I wonder what you mean by English syntax then, especially compared to Smalltalk syntax.
- Perl does more or less follow C syntax.
- Don't market with sufficient power
- What about PHP, C, and Python?
- But in fact, neither PHP nor Python is sufficiently important. And C was boosted by Unix.
- Make them expensive
- Keep in mind that different kinds of expense, while equally costly, may be weighted differently. In large software houses, the cost to the end user must be considered as well as the cost to the developer; if, because of the choice of implementation tools, a product is too large or too slow to run on a majority of the target systems, or requires substantial retraining, then it is unlikely to succeed in the market. Contrariwise, in a contract programming project, the established user base may not be an issue, but the monetary cost of the development tools might be.
- Have a style that relies too much on ultra-meta techniques such as closures and lambdas. An "Eval" operation has almost the same power for occasional use and is easy to learn/understand.
- Since when are closures and anonymous functions considered ultra-meta techniques? JavaScript has these and it seems to succeed just fine!
- Why is that so? Who knows.
- What about JavaScript? It surely didn't fail, despite its flaws.
- I'm curious what flaws you think JavaScript has. Browser incompatibilities don't count; those are DOM libraries that have nothing to do with JavaScript. This deserves its own page: JavaScriptFlaws.
- People do their best to ignore closures and anonymous functions in JavaScript. Which suggests a new principle: if you must include advanced ultra-meta techniques, hide them so that average programmers don't need to deal with them.
- JavaScript is semi-popular because it's the only non-install way to control the DOM programmatically because customers/bosses want desktop-like behavior in browsers. It's an accident of history.
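For readers wondering why closures get labeled "ultra-meta" above: a closure is just a function that remembers the variables of the scope it was created in. A minimal JavaScript sketch (the name makeCounter is hypothetical, not from the discussion above):

```javascript
// makeCounter returns a function that remembers `count` through the
// enclosing scope -- no classes, no eval, no metaprogramming needed.
function makeCounter() {
  let count = 0;
  return function () {
    count += 1;
    return count;
  };
}

const next = makeCounter();
next(); // 1
next(); // 2
```

Each call to makeCounter produces an independent counter, which is the property that makes closures useful for callbacks and event handlers in ordinary JavaScript code.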
- Make it extensible by working programmers, fostering EmbeddedApplicationSpecificLanguages? and the ProgramIntoaLanguage style
- this creates the impression (for managers) that new hires will be expensive to train, and the fear (for developers) that experience is not portable
- Create them via large committees
- What about CommonLisp as compared to other Lisps?
- What about XSLT?
- CommonLisp and XSLT are successful?
- Didn't CL replace other Lisps?
Contested:
- Deviate from the AlgolLanguage-derived control structures (if, else, while, for). They are ingrained, for good or bad. Perhaps QwertySyndrome. If you want to move away from them, you'd better have "killer examples" that people can relate to rather than dogma to justify deviations.
- Examples please? I'm not even sure what it means to have control structures that aren't (if,else,while,for).
Smalltalk: the primary control structures are
<condition> ifTrue: [<true_case>]
<condition> ifFalse: [<false_case>]
<condition> ifTrue: [<true_case>] ifFalse: [<false_case>]
<condition> whileTrue: [<loop>]
<index> timesRepeat: [<loop>]
<collection> do: [<loop>]
(<start> to: <finish>) do: [<loop>] "extremely rare"
<start> to: <finish> by: <step> do: [<loop>] "almost never used"
(For examples of actual code, see
http://www.cc.gatech.edu/classes/AY2000/cs2803ab_fall/SqueakBasics.html). Furthermore, many basic conditional cases are factored out by means of polymorphism and/or overloaded methods, where possible.
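The point about factoring conditionals out by polymorphism can be sketched in any object language; here is a hypothetical JavaScript example (the shapes and names are illustrative, not from the page above):

```javascript
// The conditional version branches on a type tag...
function areaWithIf(shape) {
  if (shape.kind === "circle") return Math.PI * shape.r * shape.r;
  else return shape.w * shape.h;
}

// ...whereas with polymorphism each object supplies its own behavior,
// and the explicit conditional disappears from the calling code.
const circle = { r: 2, area() { return Math.PI * this.r * this.r; } };
const rect = { w: 3, h: 4, area() { return this.w * this.h; } };

[circle, rect].map(s => s.area()); // areas computed with no if at all
```

This is the same move Smalltalk idiom makes pervasive: dispatch replaces branching wherever the cases correspond to kinds of object.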
Scheme: The major conditional forms are
(if <condition>
    <true_case>
    <false_case>)
(cond
  (<cond_1> <block_1>)
  (<cond_n> <block_n>)
  (else <default_block>))
(case <variable>
  ((<value_1>) <block_1>)
  ((<value_n>) <block_n>)
  (else <default_block>))
(do ((<variable_1> <init_1> <step_1>)
     (<variable_n> <init_n> <step_n>))
    (<terminating_condition> <final_value>)
  <loop>)
The <step_n> and <final_value> parts are optional. The (do) loop is the only iteration construct; there is no equivalent of (while) in the standard, even though it is easy to code one as a macro. Recursion is strongly preferred, and there is a special form, the 'named let', for using
TailRecursion to implement iterative algorithms (see
SchemeIdioms for an example of this). Furthermore, higher-order functionals such as (map) are often used to eliminate the need for explicit iteration.
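The use of higher-order functionals such as (map) to eliminate explicit iteration carries over directly to other languages. A minimal JavaScript sketch of the contrast (the function names are illustrative):

```javascript
// Explicit iteration: the loop machinery (index, bounds, accumulator)
// is spelled out by hand.
function squaresLoop(xs) {
  const out = [];
  for (let i = 0; i < xs.length; i++) out.push(xs[i] * xs[i]);
  return out;
}

// Higher-order functional: the JavaScript analogue of Scheme's (map).
// The traversal is abstracted away; only the per-element work remains.
const squares = xs => xs.map(x => x * x);

squares([1, 2, 3]); // [1, 4, 9]
```

Both compute the same thing; the functional form simply has no loop left to get wrong.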
- So let's get this straight. Is "unconventional control structures" supposed to refer to a different type of syntax for (if, else, while, for) like you start off with [#ifTrue:ifFalse:, (if (condition) (statement))] or do you refer to genuinely different things like polymorphism and declaration? Because these are two radically different ideas.
- Primarily, I was explaining the different conditional syntactic forms; the original poster had stated that they weren't familiar with different approaches, so I provided some examples. However, that isn't enough to explain how the languages differ from Algol-like languages; as you know, Richard, there are differences in idiom and in the fundamental concepts as well as in syntax, ones which (as you point out) are far more fundamental than the particular way that one codes, say, an if statement. I felt that some indication of this was called for, lest the casual reader get the impression that the differences are in syntax alone. I did not mean it as a serious gloss on the use of polymorphism or higher-order functionals to eliminate the need for iterative structures (which is only incidental to their broader implications), just to indicate that they are part of the idioms of the respective languages. If you feel it would be better refactored into a separate discussion, by all means feel free to do so. Also, if I made any errors (especially regarding Smalltalk, which I do not know very well), I would be indebted for any corrections. -- JayOsako
- I'm the one who wrote asking about examples. And it's not that I didn't know the different approaches so much as I literally didn't know what the original poster was referring to. Until now, I only paid the usual lip-service to polymorphism being a control structure. Now, I'll probably add continuations, threads and every manner of concurrency to the list of control structures.
- Note well that the Smalltalk #ifTrue:ifFalse: form is crucially different from the (any?) C-style if(){}else{} form in that it is implemented by regular objects and not by opaque magic in the language implementation. As such, any other control structure devised by a programmer has equal footing with the ones provided by the language implementors. The Lisps also have this feature (achieved by macros). In Scheme "cond" is usually more fundamental than "if", since "cond" is implemented by the evaluator but "if" is a macro on top of cond. [ Not necessarily; many Schemes (including Steele's thesis) make "if" a language primitive but "cond" a macro that expands into a nest of "ifs"]
- Also note that understanding how Smalltalk implements true and false as objects, and ifTrue: and ifFalse: as methods on those objects, is a key insight into understanding what OO really means, and is one of the reasons Smalltalkers have every right to say Smalltalk is more object oriented than Java/C++/CSharp/etc.: if "if" and "else" can be implemented in the library as methods on objects, so can every other language feature, including those of your own design.
- Opaque magic? What is opaque or magical about a stinking boolean conditional? While I'm not criticizing the Smalltalk approach, Smalltalk fans have long overestimated the advantages of the "DynamicDispatch on the booleans" style of conditional logic over the "mere" procedure if/else clause. And in doing so, they obfuscate the language a bit. Sure, the Smalltalk style is easy to learn - but it's yet one more conceptual barrier, and one thing that Smalltalk does do a bit differently than the rest of the world. And the advantages - if any - seem to be minute.
- Because it's not just a stinking boolean conditional, and it's not about the advantages of dynamic dispatch on bools either. It's about understanding that language control structures can simply be methods on objects, even the most basic ones that most people think must be part of the language. That's well understood by all of us; most junior-level programmers probably don't make the distinction at all. "if" doesn't need to be part of the core language; it can be part of the library, just like every other control structure, and that's magical, because it means you can implement your own control structures that are on equal footing with the built-in ones, and that's the whole point: the language is extensible. All truly powerful languages are, everything else is shit! The procedural approach is inferior because it isn't extensible; I can't change it. The Smalltalk and Lisp approaches are far superior; they don't obfuscate the language, they enable it. The C approach obfuscates the programmer's abilities and cripples him. It's not Lisp and Smalltalk that are backwards, it's everything else! Uh, Lisp and Smalltalk use two completely different approaches to this. Lisp has a base conditional form (cond?) that has special semantics (it lazy-evaluates its arguments; normally Lisp is a strict language); other conditionals are implemented as macros. Completely different from Smalltalk; the only difference between Lisp and a "procedural" language here is a) the syntax is unified, and b) Lisp has much better macro capability. (Why bring up "procedural"? Nobody's arguing for it.)
- What is opaque about a stinking boolean conditional is that the implementation of it is hidden from the application developer, and what is magical is that the C-style "if" has a special status that application programmer constructs don't have. Try this experiment: find a computationally naive person and attempt to explain the C-style "if" to them; likely you will end up talking about some notion of a program counter skipping about, and likely they'll find this very confusing. Then find another (because you've spoiled the first one now) and explain the ifTrue:ifFalse: form to them; likely they will find this easier to understand, partly because it can all be described at one semantic level. I doubt any conclusions can be drawn from this argument; a "computationally naive" person (which I assume means one unfamiliar with programming) will probably have difficulty with both, and fail completely to grasp any difference in the approach. Of course, any discussion of conditionals in Smalltalk requires the introduction of blocks, as that's how Smalltalk implements the LazyEvaluation semantics that if statements (or things acting like if statements) require. See SmalltalkBlocksAreThunksInDisguise
- I really hate slippery slope arguments, but it's my observation that only designers who are fanatical about elegance and uniformity will create anything resembling a clean design. People willing to compromise are willing to compromise a lot, whereas people utterly unwilling to compromise will only do so in a very limited number of places, where they literally have no other choice. The internal logic of a set of requirements forces enough compromises, why be willing to add more?
- As a consequence, it's perfectly natural that Smalltalk's creators and users are all fanatics. They value the uniformity of Smalltalk's boolean conditionals far more than "rationality" dictates. Local rationalism doesn't imply global rationalism. These are the alternatives. You can look at every single design choice individually, compromising this principle for this feature and that principle for that feature and ..., or you can look at the grand picture as a whole, compromising on nothing. Your choice, clean pure water or excrement. It's a matter of taste, right?
- There is one caveat. #ifTrue:ifFalse: is optimized away by the compiler which typically assumes that all receivers of the message are either true or false. If you try to define #ifTrue:ifFalse: anywhere else, it won't work.
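The idea that booleans can be ordinary objects and "if" an ordinary method can be sketched in JavaScript (this is an illustration of the Smalltalk design, not real Smalltalk; the True/False objects and the ifTrueIfFalse name are invented for the example):

```javascript
// Booleans as plain objects: each carries its own ifTrueIfFalse method.
// The branches are thunks (zero-argument functions) so that only the
// chosen branch is evaluated -- the lazy evaluation that Smalltalk
// blocks provide.
const True  = { ifTrueIfFalse: (t, f) => t() };
const False = { ifTrueIfFalse: (t, f) => f() };

// A helper mapping host booleans onto our objects.
const toBool = b => (b ? True : False);

// User code dispatches on the receiver; no built-in `if` appears.
const max = (a, b) => toBool(a > b).ifTrueIfFalse(() => a, () => b);

max(3, 7); // 7
```

Because True and False are ordinary objects, a programmer could add further "control structure" methods to them on exactly the same footing as ifTrueIfFalse, which is the extensibility point being argued above.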
- Can you explain precisely why it is that Lisp macros are on the same footing as built-in constructs?
- They expand into built-in constructs which are then dealt with by the compiler as if they were what the user typed in the first place. And many of the "built in" constructs are macros anyway.
Reserving the right to change my mind: I used to think this way, but now (14 Dec 2004) I feel the argument is wrong. Just because a language is not super-popular doesn't mean it is better. In fact sometimes WorseIsBetter. On the other hand, I still think one premise of this page is not well defined. That is, what does it mean for a language to fail? -- JimmyCerra
Previous Discussion:
Backed by a small clique that seemingly claims the language is God's language. Of course, every modern language starts with a small group of programmers, by definition. They remain unpopular when the initial group actively retards the evolution of the language. -- JC
Examples please?
Evolution isn't necessarily progress. After all, hominids evolved towards stupidity to give rise to chimps. If someone doesn't control change to a language, it becomes a morass of contradictory features like C++.
- I think your example is wrong - the general opinion is that chimps split off from other hominids before they became smart. The principle is fine, though.
The real question is why users would prefer a crappy language whose evolution is "responsive" to their needs over a language that's already close to perfect. Why would you want to tie yourself to a language that needs to be improved to be usable over one that doesn't?
My key point here is that it's not a defect of the initial users or the language designer. It's a defect of the vast unwashed hordes of programmers. -- RK
How do you objectively say that any particular language is close to perfection? Suppose that most programmers can't program effectively in a programming language (for its domain); doesn't that imply it is flawed? After all, I think the ultimate purpose of any programming language is to make it easier to write and maintain programs. If it is a chore to write (reasonable) programs with a language unless one is an "elite" programmer, then it becomes costlier to develop in and thus less useful. In other words: If it ain't easy, it ain't working! How do you determine if a language fails? -- JC
End Previous Discussion
Give it a learning curve that is either too steep or too short.
This last part is likely to be controversial, but it is my experience that successful languages - C, Java, Visual Basic - are relatively easy to learn the basics of programming in, but are difficult to master to any meaningful degree. This means that once a critical mass of users begins to develop, you end up with two groups - a large number of inexpensive journeyman-level programmers, and one or two very expensive master-class programmers, who run the project, break log-jams in the project, fix the showstopping problems, etc. Conversely, many of the other languages have some BarrierToEntry, usually an unfamiliar conceptual framework, but can be quickly understood in detail once this obstacle is overcome. This results in a much smaller group of very expensive programmers, none of whom are necessarily master level. For large projects, this is often seen (sometimes correctly) as too expensive an approach, especially since managers as a rule would prefer a large number of PlugCompatibleInterchangeableEngineers to a small group of specialists.
''But does this relate to the language itself? Maybe a simple enough language can succeed if you provide the wanna-be gurus with large and complex libraries to learn?
What if the language is as simple as possible but with dark spots, i.e. provides call/cc?'' -- GabrieleRenzi
Hard to say. Part of the issue, I suspect, is economic rather than technical: if the language is too easy to work in effectively, and especially if it does not have a lot of places where hidden bugs can go unnoticed until after deployment (requiring repeated bug fix upgrades, which earn more money for the programmers on each iteration), then the programmers using it are risking putting themselves out of work, just as an auto manufacturer would if they built cars to last twenty years with minimal service. Thus, there is a market/evolutionary pressure that favors languages (and software products in general) that are 'good enough', but not 'too good'. If this is in fact what is happening, then it could be seen as an aspect of the WorseIsBetter phenomenon. -- JayOsako
- Nonsense. Better technology allows an immoral company to produce buggy software with less manpower, reaping even more undeserved profit. There are only a few explanations why this isn't done: a) there is no better technology, b) companies do not decide rationally or c) most managers and most programmers are dumb. Take your pick.
- Ah, the SilverBulletConspiracy. I've seen no evidence that such a conspiracy exists.
- Read it again; I was discussing the economic viability of the technology, and never claimed anything about a conspiracy. The reasoning goes like this: If a manufacturer who produces or uses a highly efficient technology (whether it is computer software or automobiles) does less repeat business because of it, they will lose market share to those who produce or use a slightly less efficient technology that generates more paid work (== income). As a result, the latter remain employed and hire others who can work with the less efficient technology, promoting it at the expense of the more efficient one (note that this phenomenon would bottom out once the overhead of using the less efficient technology exceeds the increase in paid work that results from it, at which point efficiency becomes adaptive again; the result is a homeostasis, where the most and least efficient technologies are selected against in favor of those of medium efficiency). Like with any other form of natural selection, the players adapt to the local maximum, not any idealized 'best'. Hence, 'worse' (more bugs, resulting in more bug patches; less flexibility, resulting in more upgrades) is 'better' (more work, thus more income for the programmers, and more incentive for new programmers to learn the language). This can be seen as related to the 'upgrade/replacement cycle' issue I mention in QualityIsntAlwaysCompetitive. Also, note that I was proposing that this might be one factor in the apparent failure of technologies that a minority of programmers considers superior. I never meant it as a one-size-fits-all explanation, just one aspect out of many, and one which I do not even know for certain occurs. -- JayOsako
- In many cases, what occurs is a PrisonersDilemma (from the point of view of the vendors). As long as all vendors make the "cooperative" choice (not providing the efficient technology to the market), the advantages of the efficient technology can be stopped and the vendors benefit from increased business. However, if any one market player makes the "treacherous" choice and provides the efficient technology, enhanced (if momentary) profits are his to capture. In general, the only way the treacherous choice is preventable is by agreements to restrain trade, outright monopoly, or some other means (such as patent protection) to prevent the efficient technology from being released.
- Er, while I don't deny that this can and does happen, it sounds like a different phenomenon from the one I'm talking about here.
- If you look at OpenSource, you see numerous software companies embracing a business model which - looked at from one perspective - is suicidal. With proprietary software, vendors can charge monopoly prices due to the legally-imposed scarcity that copyright provides. With open source, it all becomes a commodity. But numerous companies are racing ahead. Why? Because as soon as one company elects to commoditize the marketplace, they all are pressured to do so.
Put it in the hands of illiterate programmers whose ability to learn has been hobbled.
Put it in the hands of inexperienced programmers and hide the manual.
Add "compatibility" baggage to improve "interoperability" or "portability" and then make this baggage a requirement for all future development.
For the opposite view see e.g. BeingPopularEssay.
See also: IfSmalltalkIsSoGoodWhyDoesNobodyUseIt, SocialProblemsOfLisp