Can Programming Be Liberated From The Von Neumann Style

JohnBackus's classic TuringAwardLecture, introducing the functional programming language FP, which has no variables.

http://www.stanford.edu/class/cs242/readings/backus.pdf

FP has no variables, but the paper does not propose using FP. It proposes using extensions of it that do have variables.
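
For a taste of the style: the paper's well-known inner-product example is defined entirely without named arguments, roughly (transliterated into ASCII)

  Def InnerProduct = (Insert +) o (ApplyToAll x) o Transpose

i.e., transpose the pair of vectors, multiply the pairs elementwise, then fold the products with addition - the whole program is a composition of functions, with no variable ever naming the data.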

If you can show people how to do their work faster and better, then it can be. But this must be shown, not argued.

Claims made about FP were challenged at HowCanSomethingBeSuperGreatWithoutProducingExternalEvidence. There was disagreement about the applicability of toy examples to certain domains.


First, perhaps we need to ask *why* we need to get away from it.

Because it defines its architecture in terms of a bottleneck.

But perhaps this also makes it easier for humans ToGrok, especially us men who can only do one thing at a time :-) Actually, some things seem to work well under non-Von, while others don't, IMO. Relational notation, which is non-Von, is (or can be) quite powerful, going beyond the Von-ish NavigationalDatabase approach. However, I am not sure everything can be done (or easily learned) that way. If the machine can give us sufficient speed for Von, then why fuss? Overhauling/optimizing the machine to fit our heads is usually easier than the other way around. Now, maybe for things like weather prediction this does not apply.

For an excellent introduction to grokking non-Von algorithms, see GuySteele and DanielHillis's wonderful paper "Data Parallel Algorithms," http://cva.stanford.edu/cs99s/papers/hillis-steele-data-parallel-algorithms.pdf. See also DanielHillis's book (thesis) The ConnectionMachine for a discussion of the performance issues.
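
For a taste of how such algorithms look, here is a minimal C sketch (mine, not from the paper) of the data-parallel prefix sum ("scan") that the Hillis/Steele paper builds on, simulated sequentially. On a machine like the ConnectionMachine, each pass of the inner loop would execute in one step across all processors, giving O(log n) steps instead of O(n):

  #include <stdio.h>
  #include <string.h>

  #define N 8

  void scan(int x[N])                       /* inclusive prefix sum */
  {
      int tmp[N];
      for (int d = 1; d < N; d *= 2) {      /* log2(N) passes */
          memcpy(tmp, x, sizeof tmp);       /* snapshot of the previous step */
          for (int i = d; i < N; i++)       /* "for all i in parallel" */
              x[i] = tmp[i] + tmp[i - d];
      }
  }

  int main(void)
  {
      int x[N] = {1, 2, 3, 4, 5, 6, 7, 8};
      scan(x);
      for (int i = 0; i < N; i++)
          printf("%d ", x[i]);              /* prints: 1 3 6 10 15 21 28 36 */
      printf("\n");
      return 0;
  }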



Reading through it, one is struck by a sense of irony. Backus clearly hoped that FP would check the "cancerous" (I think that's his word) growth of programming languages, but given the proliferation of FunctionalProgramming languages and their associated arcana, it would seem that no programming paradigm is immune from metastasis.

There is significantly less unnecessary or frivolous variation in functional and declarative languages than in imperative languages. The variation tends to be due to differences in approach - for example, in how to model state or I/O.

Depends what you mean by functional. Lisp has arguably proven the opposite - there are many ways to redefine Lisp so that the same thing can be done many different ways. This does not lead to consistency; it leads to variation. Usually there are only one or two ways to do I/O in imperative languages (in C, for example, you usually do it one way), so I don't see what your point is.


How ironic; my computer science teachers always used to criticize me for doing things like:

  printf("1+2=%d", 1+2);

They would argue it's better to put it in a human-readable variable like:

  int result = 1 + 2;
  printf("1+2=%d", result);
Now I hear we need to get away from named variables?

Ha ha ha. The principle in FunctionalProgramming is not to avoid named values, but to avoid mutable named values, especially ones which cause side effects (i.e., global variables). In the case you gave, the latter code would be acceptable, so long as you didn't change the value of result later on. (Actually, the printf() function would itself be a problem in a purely functional language, as it causes a side effect - it has an output other than the function return - but most purely functional languages have ways around that.) Constants and function parameters (including functions themselves as values) are permissible, but not reassignment of variables.

To put it another way, FP is about modelling program functions after mathematical functions; the goal is for the functions to be stateless (if you run them with a given set of arguments, they will always return the same result) and declarative (they should as much as possible describe a relationship between the arguments and the result, not a series of actions to take to get the result). -- JayOsako
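
To illustrate the distinction in C (a minimal sketch, my own example, not from the paper): add_pure behaves like a mathematical function, always returning the same result for the same arguments, while add_impure's result depends on hidden mutable state:

  #include <stdio.h>

  int total = 0;                       /* mutable global state */

  int add_impure(int x)                /* side effect: mutates total */
  {
      total += x;
      return total;
  }

  int add_pure(int x, int y)           /* stateless, like a math function */
  {
      return x + y;
  }

  int main(void)
  {
      printf("%d\n", add_impure(2));   /* prints 2 */
      printf("%d\n", add_impure(2));   /* same call, now prints 4 */
      printf("%d\n", add_pure(1, 2));  /* prints 3, every time */
      printf("%d\n", add_pure(1, 2));  /* prints 3 again */
      return 0;
  }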


The interesting thing about Backus's paper is its focus on the limitations of von Neumann machines, and how language choices affect this. First, as noted above, the languages he was railing against were largely the industrial languages of the time, things such as AlgolLanguage, PlOneLanguage?, CobolLanguage, and Backus's own creation FortranLanguage - languages which largely mirrored the von Neumann machines they were executed on. Backus attacked them on two grounds: one, that they are inherently fragile and poor programming models; and two, that the von Neumann architecture does impose a performance bottleneck (the VonNeumannBottleneck) which has gotten worse over the years. (The CellProcessor is the latest in a series of attempts to get around some of the limitations of the von Neumann architecture.)

Of course, one may ask: "What bearing does the choice of programming language have on the hardware? A functional language which is compiled to run on a von Neumann machine will still suffer the bottleneck." The answer is ReferentialTransparency, which makes parallel computation much more tractable (and capable of being automated). Effectively parallelizing imperative languages is still an active research topic.
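
To make that concrete, here is a minimal C sketch (my example, not from the paper). Because f below is referentially transparent, the loop iterations are independent, so a compiler or a programmer (e.g., with OpenMP's "#pragma omp parallel for") could run them in any order, or all at once, without changing the result; with a mutable shared accumulator inside the loop, that reordering would be unsafe:

  #include <stdio.h>

  static int f(int x)                  /* pure: result depends only on x */
  {
      return x * x + 1;
  }

  int main(void)
  {
      int in[4] = {1, 2, 3, 4}, out[4];
      for (int i = 0; i < 4; i++)      /* iterations share no mutable state */
          out[i] = f(in[i]);           /* safe to evaluate in parallel */
      for (int i = 0; i < 4; i++)
          printf("%d ", out[i]);       /* prints: 2 5 10 17 */
      printf("\n");
      return 0;
  }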

Granted, we have a long way to go before production-grade compilers capable of efficiently distributing an algorithm across an arbitrary group of processing nodes come into being--so perhaps Backus is invoking the mythical SufficientlySmartCompiler. But it still is an interesting thing to consider.


Here you go: "How To Compute Without Numeric Variables in a non-Von Neumann Architecture" http://tinyurl.com/indiscretelogic


See VonNeumannBottleneck, AdvantagesOfFunctionalProgramming, DeclarativeProgramming, PurelyFunctionalOperatingSystem, KillMutableState, SemanticsDiscussion, AreCurrentLanguagesShapedByHardware, ProgrammingNotAboutMachines


CategoryPaper, CategoryConcurrency

