Em Expressions

M-expressions ('meta-expressions') were the originally intended external syntax for writing programs in LispLanguage. EssExpressions were originally devised as an internal representation only.


Example:

 maplist[x;f] = [null[x] -> NIL;
                 T -> cons[f[x]; maplist[cdr[x]; f]]]

The guarded expressions take the place of a (COND ....)
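
To make the correspondence concrete, here is one possible rendering of the definition above in modern CommonLisp -- only a sketch, not McCarthy's original translation. The name MY-MAPLIST is made up to avoid clashing with CommonLisp's built-in MAPLIST (whose arguments go the other way around), and FUNCALL is needed because F arrives as an ordinary argument:

 (defun my-maplist (x f)                      ; hypothetical name; CL already has MAPLIST
   (cond ((null x) nil)                       ; null[x] -> NIL
         (t (cons (funcall f x)               ; T -> cons[f[x]; ...]
                  (my-maplist (cdr x) f)))))  ; ... maplist[cdr[x]; f]

Each guard/value pair of the M-expression becomes one (test result) clause of the COND.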


The notes below sketch how to translate between S- and M-expressions:

It seems to me that ML syntax resembles Mexprs quite closely.

But Mexprs are more similar to traditional mathematical notation, so JohnMcCarthy and others originally presumed that they would be the better way to go. It appears that was incorrect.

But non-Lisp programmers dislike Sexprs. It's not clear that they like Mexprs any better, but Mexprs are at least closer to Algol-family languages than Sexprs are.


JohnMcCarthy writes (X . Y) as (X . Y) -- sexps denote themselves in M-notation; (A B C) was written (A, B, C) in the original S-expression notation as well; and the COND expression is written [P -> A; Q -> B; T -> C].
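
For reference, those rules work out to something like the following (only a sketch; the function-application rows aren't stated above but follow the bracket-and-semicolon style of the maplist example, and each piece of the conditional is translated recursively in the same way):

 M-expression                    S-expression
 (X . Y)                         (X . Y)
 (A, B, C)                       (A B C)
 car[x]                          (CAR X)
 f[x; y]                         (F X Y)
 [P -> A; Q -> B; T -> C]        (COND (P A) (Q B) (T C))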


JohnMcCarthy writes: we still expected to switch to writing programs as M-expressions. The project of defining M-expressions precisely and compiling them or at least translating them into S-expressions was neither finalized nor explicitly abandoned. It just receded into the indefinite future, and a new generation of programmers appeared who preferred internal notation to any FORTRAN-like or ALGOL-like notation that could be devised.

Imho, the early adoption of M-expressions might have helped Lisp gain a greater foothold in the industry.


More info is available from http://www-formal.stanford.edu/jmc/history/lisp/lisp.html

In particular, the following section seems worth quoting:

The programs to be hand-compiled were written in an informal notation called M-expressions intended to resemble FORTRAN as much as possible. Besides FORTRAN-like assignment statements and go tos, the language allowed conditional expressions and the basic functions of LISP. [...] M-notation was never fully defined, because representing LISP functions by LISP lists became the dominant programming language when the interpreter later became available.


Wow. I certainly would have understood and enjoyed Lisp programming a lot better had M-expressions been the standard syntax. I think it would have resonated much better with the mathematical logic and abstract algebra courses I was taking at the same time. Did anyone ever do a Lisp-variant fully implementing them? --KyleBrown

O yes. The idea of having a non-Lispy syntax for Lisp has been reinvented time and again. As a matter of fact, the CMU AI repository has a complete section with alternate Lisp syntaxes (syntaces?).

http://www-cgi.cs.cmu.edu/afs/cs/project/ai-repository/ai/lang/lisp/code/syntax/0.html

Also consider the GuileScheme project, whose original goal was to support many languages (e.g. TCL-like syntax, Perl-like syntax, Python-like syntax) on top of a common core. But as with the original M-expressions, this goal seems to have receded into the indefinite future. A C-like syntax is nevertheless available for Guile, but it doesn't seem to be in wide use. -- StephanHouben

In "TheEvolutionOfLisp", GuySteele and RichardGabriel go over several different Lisp implementations with Algol-style syntaxes. To end the section (section 3.7.1, page 85 of the unexpurgated version), they write:

On the other hand, precisely because Lisp makes it easy to play with program representations, it is always easy for the novice to experiment with alternative notations. Therefore we expect future generations of Lisp programmers to continue to reinvent Algol-style syntax for Lisp, over and over and over again, and we are equally confident that they will continue, after an initial period of infatuation, to reject it. (Perhaps this process should be regarded as a rite of passage for Lisp hackers.)

In other words, M-expressions lower the barrier to entry, therefore they are Evil.

I think their point is that M-expressions make it harder to perform the kind of code-as-data meta-programming that is used for advanced LISP programming. Therefore LISP programmers will come to prefer S-expressions as they become more experienced.


While it is true that EmExpressions might be easier to learn, because they leverage familiarity with conventional algebraic forms, the effect is overstated. First off, 'conventional' infix syntaxes differ from algebraic notations, and from each other, in striking and often surprising ways; the 'familiarity' is as often as not misleading. Second, mathematical InfixNotation is usually only a small part of most languages' syntax, and in most languages bears little resemblance to the way the rest of the language works.

In any case, what EssExpressions lose in familiarity, they gain back in simplicity. It shouldn't take anyone, even a computer neophyte, very long at all to grasp the basic structure of a LISP function. Surely no one in their right mind would assert that CeeLanguage syntax is easier to learn than EssExpressions, yet that is not usually seen as a barrier to entry, is it?

Surprise -- C is substantially easier for most people to learn, because its syntax more closely resembles the normal sentence structure used in human spoken languages. I remember when I first learned C, it shipped with a debugger which was written in a Lisp dialect of some kind (proprietary, but still Lisp). Learning C was way easier for me than learning to use the Lisp dialect. To this day, when describing a procedure to someone, I almost never make use of explicit recursion -- I use constructs like for (for all these ghosts, we need to make sure they boo at the right time) and while (while the program is still running, make sure we don't crash). CL has explicit iteration constructs like this, but the examples that I've seen seem to be more complicated than their counterparts in C or Pascal.

I guess the point I'm trying to make is that C and its ilk are more conversational, and less formal. This would appeal to the common programmer because it is a lot less to have to think about. I generally tend to find that those who prefer Lisp are more mathematically oriented to begin with; or, at the very least, tend to think in terms of formalisms more readily than most others who are trying to program a computer. After having learned ForthLanguage some years ago, and then taking the time to learn Haskell, I have a new appreciation for Lisp and, especially, Scheme. Though I still prefer RPN over s-expressions. :) --SamuelFalvo?

Is SamuelFalvo? confident that SamuelFalvo? is in his right mind? And how can he be sure? Maybe he is meant to be in someone else's mind. Maybe he is meant to Be John Malkovich. :)

I will gladly admit that, as an experienced programmer, I found going from C to LISP a bit jarring; however, the same was true in going from BASIC to Pascal, from Pascal to C, and from C to Java. Learning a new language always takes some adjustment. I will also admit that EssExpressions are not the ideal means of expressing programs. However, I find the power and flexibility of the LISP languages to be very desirable in general-purpose programming (though I do not particularly like either CommonLisp or SchemeLanguage, and hope that something like ArcLanguage will arise to supplant them).

On one last note: the choice of a language should be dictated by the needs of the program, not the other way around; a programmer should be flexible enough to use a language suited to a given task. The advantage of LISP is that very often it makes it easy to create the appropriate language as part of the solution itself (though one could point out that this is also true of the Unix environment -- tools like bash(1), yacc(1), etc. -- taken as a whole). - JayOsako

It's a matter of programming productivity though -- Lisp and Forth both allow effortless construction of domain-specific languages. Unix provides the tools, but often, you need tools to use the tools. With Lisp and Forth, the language is really the only tool you need. -- SamuelFalvo?


I don't believe that syntax is a very significant fraction of what it takes to 'learn' a language. Even after mastery of the syntax, you need to learn how that language is used to solve problems. This is affected (significantly) by the libraries one receives with it, the ability and culture for producing DSLs or TypefulProgramming of other sorts, the abstractions offered by the language and its environment, etc. Flexibility, I believe, is what makes it truly difficult to 'learn the language'. If you don't have flexibility, you can say you've 'learned the language' when you can solve problems by walking the narrow path the language forces on you. The solutions will rarely be 'optimal' (especially in the sense of direct expression of intent), but they'll get the job done. If you do have flexibility, however, you cannot say you've 'learned the language' until you've learned to choose, among many different approaches, how you're going to solve your problems, and you'll learn more and more that 'well, this approach isn't quite optimal for this problem for reasons XYZ' and 'well, that approach isn't very expressive... it'll make maintenance or explanation very difficult', etc. That is, you'll need to make choices, and they'll be difficult choices. In addition, if the language is flexible enough, the more you learn (and the broader your horizons grow) the more you'll see is available for learning. Even if the syntax is extremely simple, and the standard libraries are small, it can end up being one of those 'two days to learn, a lifetime to master' things. I think people feel their own mastery when they know how to make the right decisions, how to optimize every little detail to the degree the language allows, etc. For flexible languages, it takes a very long time to get there.

In relation to SamuelFalvo?'s experience: 'C' is one of those 'pragmatic' languages that constrains you rather severely in how you go about solving problems. It's possible to take different approaches, but you'll feel it... sort of like stepping off the trail to meander through a field of nettles. Scheme, relatively, is one of those almost 'research' languages. The concepts and primitives are relatively simple, but high level (continuations, conditionals, lambda, recursion, tail-recursion-optimization, etc.) and the traditional 'getting started' libraries are nice and tiny. Scheme is very nimble, very flexible, but not very pragmatic... not without grabbing more libraries. And Lisp is, of course, everything in Scheme (with slightly worse syntax and macro definition) plus massive, hulking libraries and GenericFunctions and objects and (more and more and more).

Prior to learning C, I was programming in assembly language (footnote: I started programming the Z-80 on the TRS-80 Model I at the age of 6, and moved on to the 6502 (well, the 6510A in the C64) at the age of 10 or so. From there, on to the 68000 with my Amiga 500, which was the first platform I'd used that encouraged C for anything nontrivial). Assembly is the most liberating of all programming languages in a very real sense. But, in other ways, it's also the ultimate in pragmatism. This just reinforces my point: a beginning coder will want to express the program in a style that is closest to what he already knows -- natural language. C satisfies this. (otherwise (would 'you (see 'sentences (like 'this)) (in (life 'real)))). Humans just don't think in terms of function application; if they did, we'd all be algebraic geniuses and speakers of fluent Lojban.

Those who work with Lisp successfully (any dialect) will generally tend to be mathematically inclined (even if they don't know it yet). I would be willing to bet that, given the same problem presented to an average SmugLispWeenie and an average SmugCeeWeenie, the former will almost certainly think about the problem longer than the latter. You'll see him identify things such as invariants, or he'll use recursion, since recursion and proof-by-induction are very closely related. The latter, however, will tend to be more pragmatic, whacking on it here, tweaking the code there, etc. until a solution is found, all the while making sure to use pointer post- or pre-increments for efficiency, writing functions to manage memory as needed, etc. In fact, this kind of pragmatism is formalized in TestDrivenDevelopment, where you always write just enough to satisfy the given set of unit tests.

Another way of looking at things is that Lispers generally look to satisfy entire properties (this is usually reflected in the unit test packages too; see QuickCheck for Haskell), while C coders look to solve specific problems.

Also, let's not forget that Lisp is older than C, and so it should have more libraries available for it. By the time 1985 rolled around, I'd wager that the number of useful libraries was about the same. After 1985, the year of the GUI, libraries for C started to take off. You would think that the slope of libraries-versus-time would be steeper for Lisp than for C, but experience has proven otherwise. Why? It cannot be because of academia, which was using Lisp at least as much as it was using Fortran. It cannot be because of business requirements, because most businesses didn't even have access to a computer, let alone own one. There is only one logical explanation: C is structured more like human natural languages, and is therefore easier for newcomers to learn, and therefore easier to make DisposableCoders with. --SamuelFalvo?


CategoryLisp

