Models Of Computation

ModelsOfComputation are abstract specifications of how a computation can progress, and they are often expressed as the description of some kind of conceptual automaton. ModelsOfComputation are very distinct from DecompositionParadigms, which are strategies for structuring programs, not the computations that executing those programs may occasion.

The single most important theoretical aspect of a model of computation is its power: the class of (mathematical) functions it can compute. Another important aspect, pragmatic and qualitative, is its expressiveness: how easily computations can be expressed in it.

A classic example of a model of computation is the TuringMachine model, which has great power but very limited expressiveness: it is notoriously awkward to express computations in it.
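
For concreteness, here is a small sketch of a one-tape TuringMachine interpreter in Haskell (the names - step, run, flipper - and the encoding are invented for this illustration, not taken from any library). Even a trivial computation, swapping two symbols everywhere on the tape, has to be spelled out as an explicit transition table, which is exactly the awkwardness referred to above.

 -- A minimal one-tape Turing machine. The tape is a zipper: the cells to the
 -- left of the head (nearest first), the cell under the head, and the cells
 -- to the right. The blank symbol is 0.
 import Data.Map (Map)
 import qualified Data.Map as Map

 data Move = L | R deriving (Eq, Show)
 data Tape = Tape [Int] Int [Int] deriving Show

 type State = String
 type Rules = Map (State, Int) (State, Int, Move)   -- (state, read) -> (state', write, move)

 step :: Rules -> (State, Tape) -> Maybe (State, Tape)
 step rules (q, Tape ls x rs) = do
   (q', w, m) <- Map.lookup (q, x) rules
   pure (q', move m (Tape ls w rs))
   where
     move L (Tape (l:ls') y rs') = Tape ls' l (y:rs')
     move L (Tape []      y rs') = Tape []  0 (y:rs')
     move R (Tape ls' y (r:rs')) = Tape (y:ls') r rs'
     move R (Tape ls' y [])      = Tape (y:ls') 0 []

 -- Run until no rule applies (the machine halts).
 run :: Rules -> State -> Tape -> Tape
 run rules q t = maybe t (uncurry (run rules)) (step rules (q, t))

 -- Swap every 1 for a 2 and vice versa, moving right until a blank is reached.
 flipper :: Rules
 flipper = Map.fromList
   [ (("s", 1), ("s", 2, R))
   , (("s", 2), ("s", 1, R)) ]

 -- ghci> run flipper "s" (Tape [] 1 [2,1])
 -- Tape [2,1,2] 0 []    -- the tape now reads 2,1,2 with the head on a blank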

From http://www.cs.umbc.edu/~squire/reference/computable.shtml

Church's hypothesis, ChurchTuringThesis

This is a mathematically unprovable belief that any reasonable intuitive definition of "computable" is equivalent to the following list of provably equivalent formal models of computation:

 * TuringMachines (Turing a-machines)
 * the LambdaCalculus
 * the general (mu-) recursive functions
 * and several others (register machines, Post systems, ...)


For a contrary view suggesting that interactive computation models, such as ActorsModel, PiCalculus, CommunicatingSequentialProcesses, etc., are more powerful than Turing a-machines and the pure LambdaCalculus, see InteractiveComputationIsMorePowerfulThanNonInteractive.

[Much content moved there]


Even for sequential computation, a TuringMachine is considerably "lower-level" than the LambdaCalculus, which explains why there are programming languages based almost directly on the latter, but not the former.

With regard to expressiveness: nobody ever claims TuringMachines are an expressive model. There isn't any modern, useful high-level language that limits itself to "Turing machine" features; even modern procedural languages - arguably the lowest-"level" kind - borrow much from the LambdaCalculus (they have functions, etc., even if they aren't full FunctionalProgrammingLanguages). In fact, we have a rather derogatory term for languages whose chief selling point is that they are TuringComplete - we call them TuringTarpits. The raw LambdaCalculus isn't a particularly programmer-friendly model either (though some functional languages provide little more than a thin veneer on top of it).
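
To show just how thin that veneer can be, here is a sketch of ChurchNumerals in Haskell: numbers and their arithmetic built from nothing but abstraction and application, written essentially as one would in the raw LambdaCalculus. (The names - zero, suc, add, mul, toInt - are chosen for this example, not drawn from any standard library.)

 {-# LANGUAGE RankNTypes #-}
 -- A Church numeral n is the function that applies f to x exactly n times.
 type Church = forall a. (a -> a) -> a -> a

 zero, one :: Church
 zero = \_ x -> x
 one  = \f x -> f x

 suc :: Church -> Church
 suc n = \f x -> f (n f x)

 add, mul :: Church -> Church -> Church
 add m n = \f x -> m f (n f x)
 mul m n = \f x -> m (n f) x

 -- Convert back to an ordinary Int, only so we can inspect results.
 toInt :: Church -> Int
 toInt n = n (+ 1) 0

 -- ghci> toInt (add one (mul (suc one) (suc one)))
 -- 5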

TuringMachines are not useful as high-level programming models.

Right; that was all I was trying to say with the 'a TuringMachine is considerably "lower-level"' comment. Note that it is independent of the arguments about interaction and encapsulation given in InteractiveComputationIsMorePowerfulThanNonInteractive, which deal with power, not expressiveness.

They are useful for two reasons: 1) they are easy to reason about mathematically, and much research and knowledge in computational theory is based on the concept, and 2) virtually all computing hardware that is built resembles TuringMachines. Building hardware that directly evaluates the LambdaCalculus isn't economical; it's more feasible to emulate it on a von-Neumann architecture (which, thanks to years of R&D and enormous economies of scale, is both fast and cheap these days).

Virtually all computing hardware that is built has considerable support for interactive computation, and that support does not resemble anything provided by a TuringMachine.

But discussing the "expressiveness" of a low-level computational model is a RedHerring anyway. This is why programming languages exist: to provide reasonable abstractions on top of the models, abstractions that are easier for humans to program with and reason about. Talking about the "expressiveness" of a low-level computational model misses the point; expressiveness matters as a figure of merit of a language (or another tool used by programmers). And since all modern languages are Turing-equivalent, expressiveness has almost nothing to do with computational power.

And keep in mind: unless you have very esoteric hardware, any program you write is ultimately run on a TuringMachine - more specifically, on a finite-memory approximation of one.


And I seem to recall that ConcurrentSystemsTheory? (maybe in RobinMilner's PiCalculus) does not necessarily rely on any of the automata classes above, but generalizes them in the sense that everything/anything can happen at once.

See also http://www.cs.brown.edu/people/mph/HerlihyRu00/focs.pdf

In the introduction, they say: "Unlike sequential notions of computability, in which all reasonable models are essentially equivalent, concurrent computability is highly sensitive to the underlying model." and then go on to build a hierarchy of models, each of which can be implemented on top of others.

The paper builds on an approach whose groundwork was laid in http://www.cs.brown.edu/people/mph/HerlihyS99/p858-herlihy.pdf (which appears to postdate the PiCalculus).


Note that "reasonable" here refers to how powerful a language is, not to whether anyone can actually use it. Some commonly used languages are not in fact TuringComplete, and some TuringComplete languages are far from what any sane person would call "reasonable". See EsotericProgrammingLanguage.


SlashDot recently ran an article (http://developers.slashdot.org/article.pl?sid=03/06/18/127214) about a book which describes several models of computation. See ConceptsTechniquesAndModelsOfComputerProgramming.

Awesome book.


Other models of computation, weaker than those given above, include (from weaker to stronger):

 * FiniteAutomata (finite state machines)
 * pushdown ("stack") automata

Note that deterministic and nondeterministic TuringMachines are equally powerful, and so are deterministic and nondeterministic FiniteAutomata. But nondeterministic stack automata are more powerful than deterministic ones.
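
The reason nondeterminism adds nothing at the finite-automaton level can be made concrete: an NFA can always be simulated by tracking the set of states it might currently be in, which is the idea behind the subset construction that turns any NFA into an equivalent DFA. A Haskell sketch follows (the names and the example automaton are invented for illustration).

 import Data.Set (Set)
 import qualified Data.Set as Set

 -- A nondeterministic finite automaton: a start state, a set of accepting
 -- states, and a transition *relation* (each step may yield several states).
 data NFA q c = NFA
   { start  :: q
   , accept :: Set q
   , delta  :: q -> c -> Set q
   }

 -- One step of the "set of possible states" simulation.
 stepAll :: Ord q => NFA q c -> Set q -> c -> Set q
 stepAll nfa qs c = Set.unions [ delta nfa q c | q <- Set.toList qs ]

 -- Accept iff some reachable final state is accepting.
 accepts :: Ord q => NFA q c -> [c] -> Bool
 accepts nfa input = not (Set.null (Set.intersection (accept nfa) finals))
   where
     finals = foldl (stepAll nfa) (Set.singleton (start nfa)) input

 -- Example: strings over {'a','b'} whose second-to-last symbol is 'a'.
 secondToLastA :: NFA Int Char
 secondToLastA = NFA 0 (Set.singleton 2) d
   where
     d 0 'a' = Set.fromList [0, 1]   -- guess: "this 'a' is second-to-last"
     d 0 _   = Set.singleton 0
     d 1 _   = Set.singleton 2
     d _ _   = Set.empty

 -- ghci> accepts secondToLastA "babab"   -- True
 -- ghci> accepts secondToLastA "babba"   -- False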


I seem to remember from University that PrimitiveRecursionTheory? is also one of the ModelsOfComputation.

Correct. Primitive recursion is recursion where the number of recursive calls is bounded a priori; as a consequence, every primitive recursive function is guaranteed to terminate. However, there are functions (such as AckermannsFunction?) which can be computed by a TuringMachine but not by PrimitiveRecursion? (Ackermann's function is total recursive, but not PrimitiveRecursive). Primitive recursion is therefore strictly less powerful than a Turing machine.
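
A sketch of the distinction, in Haskell (with invented names): a primitive-recursion combinator fixes the number of recursive steps in advance via its numeric argument, so everything built from it is total, whereas AckermannsFunction needs nested recursion on two arguments and provably grows faster than any primitive recursive function - yet it is still perfectly computable by a TuringMachine (or by ordinary general recursion, as below).

 import Numeric.Natural (Natural)

 -- Primitive recursion over the naturals as a combinator: exactly n steps,
 -- so every function defined with primRec terminates.
 primRec :: a -> (Natural -> a -> a) -> Natural -> a
 primRec base _    0 = base
 primRec base step n = step (n - 1) (primRec base step (n - 1))

 -- Addition and multiplication are primitive recursive:
 plus :: Natural -> Natural -> Natural
 plus m = primRec m (\_ acc -> acc + 1)

 times :: Natural -> Natural -> Natural
 times m = primRec 0 (\_ acc -> plus m acc)

 -- Ackermann's function: total and computable, but not primitive recursive.
 ackermann :: Natural -> Natural -> Natural
 ackermann 0 n = n + 1
 ackermann m 0 = ackermann (m - 1) 1
 ackermann m n = ackermann (m - 1) (ackermann m (n - 1))

 -- ghci> ackermann 2 3   -- 9
 -- ghci> ackermann 3 3   -- 61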


Let us look at the models of computation that have been influential in the design of programming languages. The most influential is the von-Neumann machine: a conventional CPU with only one register (called the accumulator) and load and store instructions. At most one memory cell changes during any one machine cycle, but machine cycles are fast.
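
A toy interpreter makes that model concrete. The following Haskell sketch (the instruction set and names are invented for the example) models a single-accumulator machine with load and store instructions; note that each instruction changes at most one memory cell.

 import Data.Map (Map)
 import qualified Data.Map as Map

 type Addr = Int

 data Instr
   = LoadI Int    -- acc := constant
   | Load  Addr   -- acc := mem[addr]
   | Store Addr   -- mem[addr] := acc
   | AddM  Addr   -- acc := acc + mem[addr]
   deriving Show

 data Machine = Machine { acc :: Int, mem :: Map Addr Int } deriving Show

 -- One machine cycle: at most one memory cell (or the accumulator) changes.
 step :: Machine -> Instr -> Machine
 step m (LoadI k) = m { acc = k }
 step m (Load a)  = m { acc = Map.findWithDefault 0 a (mem m) }
 step m (Store a) = m { mem = Map.insert a (acc m) (mem m) }
 step m (AddM a)  = m { acc = acc m + Map.findWithDefault 0 a (mem m) }

 run :: [Instr] -> Machine
 run = foldl step (Machine 0 Map.empty)

 -- mem[2] := mem[0] + mem[1], spelled out one cycle at a time:
 -- ghci> run [LoadI 7, Store 0, LoadI 35, Store 1, Load 0, AddM 1, Store 2]
 -- Machine {acc = 42, mem = fromList [(0,7),(1,35),(2,42)]}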

The second most influential model of computation underlying programming languages is the one where state is sequestered in objects. If you consider a variable as "in reality" a simple object that responds to get and set messages, you are definitely using the ObjectOriented model (or the ActorsModel).
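
A sketch of that view in Haskell (names invented for the example): a "variable" modelled as a record of two closures - its get and set messages - over a hidden piece of mutable state.

 import Data.IORef

 -- The record of closures is the "object"; the IORef is its hidden state.
 data Cell a = Cell { getCell :: IO a, setCell :: a -> IO () }

 newCell :: a -> IO (Cell a)
 newCell initial = do
   ref <- newIORef initial
   pure Cell { getCell = readIORef ref, setCell = writeIORef ref }

 main :: IO ()
 main = do
   counter <- newCell (0 :: Int)
   n <- getCell counter            -- "send" the get message
   setCell counter (n + 1)         -- "send" the set message
   getCell counter >>= print       -- prints 1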

The third most influential model is the LambdaCalculus, the conceptual foundation of FunctionalProgramming. The fourth most influential has been the PredicateCalculus?, a.k.a. predicate logic. Expressions in the predicate calculus are closer to NaturalLanguage than expressions in the other models are, which is an advantage, but IMO we do not yet know how to implement programming languages based on the PredicateCalculus? model well.


See also: ModelsOfComputationAndFormalLanguages, PrincipleOfLeastPower


CategoryModels

