Language Trends

What long-term trends are at work regarding mainstream / business programming languages?


One scenario:

Important goals for languages are to be efficient, productive and scalable (to larger problems / teams). MooresLaw is making efficiency a secondary goal in more and more areas of application, so, relatively, productivity and scalability become more important. The importance of scalability depends on the size of the project, of course. Also, scalability features often come with an overhead (learning, complexity, syntax, etc.), so there is a bit of a tradeoff between scalability and productivity for small projects.

Following from this, we should expect to see a trend from languages that put the main emphasis on efficiency to languages that put their emphasis on productivity and scalability. Which of the two is valued more depends on whether the language is targeted mainly at small programs ("scripts" -> "scripting language" [ScriptingLanguage], focus on productivity) or large ones ("system language", focus on scalability).

Put in terms of examples, we saw a trend from efficiency (AssemblyLanguage, FortranLanguage, CeeLanguage, etc.) to productivity (PerlLanguage, PhpLanguage, VisualBasic, etc.) and scalability (CeePlusPlus, JavaLanguage, CeeSharp). Of course there are border cases, such as very productive system languages (e.g., SmalltalkLanguage, CLOS [CommonLispObjectSystem]) or relatively scalable ScriptingLanguages (e.g., PythonLanguage). These are sadly not mainstream, though.

How was the shift from efficiency to productivity achieved? If programming is the process of translating something from the problem domain to the machine domain, then more problem-oriented languages are more productive (and less efficient). In other words, languages become more abstract and high-level. Secondly, more powerful / advanced control structures make for more productive languages (for-each vs GoTo, etc.).
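As a rough illustration of that second point (not from the essay; PythonLanguage is used here, and since it has no GoTo, explicit index bookkeeping stands in for the low-level style; the function names and the prices list are invented):

 # Low-level control: explicit index bookkeeping, the nearest PythonLanguage
 # analogue to a GoTo-and-counter loop.
 def total_low_level(prices):
     total = 0
     i = 0
     while i < len(prices):
         total = total + prices[i]
         i = i + 1
     return total
 # Higher-level control structure: for-each states the intent directly.
 def total_for_each(prices):
     total = 0
     for price in prices:      # no indices to manage
         total += price
     return total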

How were languages made more scalable? Mostly by introducing features for more strongly structuring the system, such as ObjectOrientation, package systems, components, static type systems, etc.

Type systems are a nice case in point: StaticTyping done right helps scalability, especially on larger teams (reliable, statically-checked interfaces), but hurts productivity on smaller projects through its overhead and reduced flexibility. Thus, all the prevalent system languages are statically typed, and all the major scripting languages dynamically typed [see DynamicTyping].

What will happen next? In general, I expect languages to continue to get more abstract.

Regarding typical system languages, I assume languages will become even more scalable, through features such as AspectOrientedProgramming or DesignByContract.

On the other hand, I expect languages such as PerlLanguage to maintain their "high productivity for small scripts" niche, to which they are very well adapted. Adapting some features from FunctionalProgrammingLanguages (such as PythonLanguage is already doing) could make them even more productive.
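PythonLanguage has in fact already borrowed map, filter, lambda and list comprehensions from the functional world; a small sketch (the word list and variable names are made up):

 words = ["perl", "python", "ruby", "tcl"]
 # map/filter with lambda, borrowed from FunctionalProgrammingLanguages ...
 uppercased = list(map(lambda w: w.upper(), words))
 short_words = list(filter(lambda w: len(w) <= 4, words))
 # ... or the equivalent list comprehension
 short_uppercased = [w.upper() for w in words if len(w) <= 4]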

In the middle, driven by the economic advantages of highly productive languages, I expect dynamically typed languages traditionally used for scripting to be enhanced for better scalability through better package systems, better tools, or improvements in the type system (optional type declarations, TypeInference, or some compromise such as coarse-grained components having a hard, statically-typed outer shell and a soft, dynamically-typed interior). Such "enhanced" ScriptingLanguages are well positioned to take market share away from traditional system languages, as they have all of the productivity advantages of languages such as CLOS or Smalltalk, but none of their "academic image" problems.
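As a sketch of what such optional type declarations might look like, here is PythonLanguage code with annotations on the public "shell" function only; they are ignored at run time but can be checked by an external tool such as mypy. The module layout and names (monthly_report etc.) are invented for illustration:

 from __future__ import annotations
 # Hard outer shell: the public entry point carries optional type
 # declarations that a static checker can verify.
 def monthly_report(orders: list[dict], month: int) -> str:
     return _format(_select(orders, month))
 # Soft interior: ordinary dynamically typed scripting code.
 def _select(orders, month):
     return [o for o in orders if o["month"] == month]
 def _format(selected):
     return "\n".join(str(o) for o in selected)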


I wrote the above for a seminar paper (I study, but I work full-time as a developer as well), and now I'm interested in what others think:

PleaseComment

-- FalkBruegmann


I enjoyed this essay very much. As far as comments, here are some random thoughts about the dynamics of language invention:

  1. I don't think there is very much money to be made in computer languages, so there is not a great commercial pull to develop better languages.
  2. Languages are usually developed in universities and labs, where the background is ComputerScience theory, or in the trenches, where the background is ten people working on a one- to two-year project. Ideally you would want both at the same time.
  3. What does seem to work is OpenSource projects with champions (RubyLanguage, PerlLanguage, PythonLanguage) or internal projects at big companies (SunMicrosystems > JavaLanguage, MicrosoftCorporation > CeeSharp).
  4. The more popular a language is, the bigger the drag of "backward compatibility" [BackwardsCompatibility].
  5. Creating a new computer language is just a medium to large software project - thousands of people have the technology.
  6. Languages are collections of features. Like genes on a strand of DNA, these features can be borrowed, mixed and matched among languages.
  7. SourceForge is a new dynamic.
  8. I don't think anyone really knows what a good language is. It is possibly quite subjective. There are many paths to the lake.

Given these dynamics, my opinion is that
  1. We will continue stumbling along exactly as we have been, only faster.
  2. Languages created using an ExtremeProgramming-type process will start to dominate - languages that pragmatically combine the best parts from other languages, without regard to theories. Theory-based languages from the universities will be "feeder" languages to these pragmatic languages - contributing new features, but not supplanting them.


I think CharlesSimonyi's ideas on IntentionalProgramming are going to have a great influence. You can already see it in the MS .Net [MicrosoftDotNet] implementation so far, but it will probably go a lot further. Perhaps a way to think about what might happen is to look at what has happened with NaturalLanguages: until recently, each was spoken by a (more or less) disjoint group of speakers. Now English is becoming a new lingua franca, but in the process it is absorbing more and more idioms and dialects from other languages. Perhaps something like this will happen in programming, with one established language becoming the framework into which the useful abstractions (i.e., idioms) of other languages will be fitted.


Your essay seems to focus on business apps, and for those I think languages will become more abstract. It appears, however, that we will see a growth in EmbeddedSystems programming as well. What languages will be used to program the growing number of PersonalDigitalAssistant, cell phone [MobilePhone], etc. applications? JavaLanguage and CeeSharp seem likely candidates, but AssemblyLanguage and CeeLanguage are still heavily used as well. -- AndrewQueisser

Reading EmbeddedSoftware, it seems there are very different forces at work there than for business apps, so you are right, my essay doesn't apply to that. Good thing, too, as I know next to nothing about EmbeddedSystems... -- FalkBruegmann

I agree that different vertical domains will climb different abstraction ladders. For example, in the business apps domain, you might have orders with multiple orderlines. An advanced business programming language or tool should let me declare concisely that orderline.productNumber should be not only non-null but also unique across all its sibling orderlines. -- SteveYen
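No mainstream tool lets you declare that in one line today; as a contrast, here is a plain PythonLanguage sketch of what enforcing the same rule by hand looks like (Order, OrderLine and the attribute names are invented for illustration):

 class OrderLine:
     def __init__(self, product_number):
         if product_number is None:
             raise ValueError("productNumber must be non-null")
         self.product_number = product_number
 class Order:
     def __init__(self, lines):
         numbers = [line.product_number for line in lines]
         if len(numbers) != len(set(numbers)):
             raise ValueError("productNumber must be unique among sibling orderlines")
         self.lines = lines

The point of the contrast is how much ceremony the hand-written check needs compared to the one-line declaration asked for above.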


I would like to see more research into the melding or interchangeability of text programming code and DataStructures, as started by the LispLanguage and perhaps with variations on the theme such as TableOrientedProgramming. The concept appeals to me. I suppose it could be said that this is what ObjectOrientation has done, but OO seems to have pulled the data structures to the code side instead of the other way around. Time to play with the other side of that coin for a while.
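LispLanguage itself is the natural vehicle for this; to stay with the PythonLanguage used in the other sketches on this page, here is a toy approximation of the code-as-data idea, with a program represented as an ordinary nested list and walked by an evaluator (expr and evaluate are invented names):

 # A tiny expression "program" stored as a plain data structure:
 expr = ["+", 1, ["*", 2, 3]]
 def evaluate(node):
     if not isinstance(node, list):
         return node                      # a literal number
     op, left, right = node
     a, b = evaluate(left), evaluate(right)
     return a + b if op == "+" else a * b
 print(evaluate(expr))                    # prints 7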


I think you made some good points in the essay. However, I would like to add another dimension to your concept of "abstract": you say that for a language to be more abstract it must be closer to the ProblemDomain and more distant from the machine domain. Although this is a valid and important differentiation, I think we also have to consider that there is more than one problem domain. What is seen as very productive and highly abstracted for one problem domain may be quite awkward and low-level for some other problem domain. There are languages like XML [ExtensibleMarkupLanguage] that are very well suited to describing documents or DataStructures but are hard to use for mathematical expressions. There are languages like SQL [StructuredQueryLanguage] that are good for set-oriented operations but bad for ProceduralCode. Or take RegularExpression languages.
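To make the SQL point concrete, here is a small contrast of the set-oriented and the procedural style, using PythonLanguage's built-in sqlite3 module (the orders table and its columns are invented):

 import sqlite3
 conn = sqlite3.connect(":memory:")
 conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
 conn.executemany("INSERT INTO orders VALUES (?, ?)",
                  [("alice", 10.0), ("bob", 5.0), ("alice", 7.5)])
 # Set-oriented: one declarative SQL statement.
 totals_sql = conn.execute(
     "SELECT customer, SUM(amount) FROM orders GROUP BY customer").fetchall()
 # Procedural: the same result built up step by step.
 totals = {}
 for customer, amount in conn.execute("SELECT customer, amount FROM orders"):
     totals[customer] = totals.get(customer, 0) + amount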

So, I would agree with you when you define abstract as close to the ProblemDomain or close to the task. But in my view, the opposite of being close to one problem domain need not be closeness to the machine; it can also be closeness to some other problem domain.

My conclusion is that ProgrammingLanguages will also evolve according to the problems we want to solve with them. For example if mainstream programming one day moves away from the client/server [ClientServer] or web paradigm, maybe more in the direction of peer-oriented networks of distributed autonomous agents, there will be programming languages that have the right abstractions for this paradigm. And it will be exactly those special-purpose languages like PHP [PhpLanguage] that will not survive such a shift, whereas I think highly adaptable languages like LispLanguage could accommodate it. -- Alexander Jerusalem

It seems like LispLanguage never makes it to the mainstream despite being bendable to just about anything. Its bendability is both its power and its curse, it seems. It frightens people away somehow. LISP will never go out of style because it has never been in style.


See http://www.paulgraham.com/diff.html for a dangerously mind-blowing view on this topic. ;)


Back in the day when a computer filled an entire room, using *more* than one computer was unthinkable. Today every desktop PC includes a microcontroller in the keyboard, one in the mouse, a DSP in the hard drive, etc.

While parallel computing (clustering) for number-crunching has been foreseen for some time, people are starting to use multiple CPUs for other kinds of things that were unthinkable 20 years ago. (Does the Google cluster do *any* floating-point calculations?)

I'm starting to suspect that rather than try to approach a single PerfectLanguage that does everything (but would take years to really master), perhaps it's better to have a bunch of special-purpose LittleLanguages.

Then it doesn't matter much if a project gets dumped in your lap that uses a language you've never heard of -- hopefully the efficiency of breaking a program up into 4 parts, and using the best language for each part, more than compensates for the (short, because it's a "small" language) time spent learning 1 new language. It seems like OtherLanguagesForTheJavaVm, and DotNet, are steps in this direction. As do ExtensibleProgrammingLanguages with a focus on DomainSpecificLanguage extensions.
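As one small example of letting each part use its own little language, here is a PythonLanguage host program that delegates the text-matching part to the RegularExpression LittleLanguage via the standard re module (the log line format is made up):

 import re
 # The RegularExpression little language handles the matching part ...
 log_line = "GET /index.html 200 1024"
 match = re.match(r"(\w+) (\S+) (\d+) (\d+)", log_line)
 method, path, status, size = match.groups()
 # ... while the host language handles the rest of the program.
 print(method, path, int(status), int(size))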


See Also: IdealProgrammingLanguage, AlternateHardAndSoftLayers, FutureOfProgrammingLanguages.

