Computer Science Version Two

...aka ComputerScienceRefactored. Noting the irresolvable debates (ThreadMess) and HolyWars revolving around Objects (ObjectsAndClosuresAreEquivalent, more, more) and terminology (TypeTheory, TypeSystem, TypeSystemCategoriesInImperativeLanguagesTwo, TypesAndAssociations, etc.), AND learning that the profession of ComputerScience has been fractured within itself and away from the programmer community, I declare that there's been a half-century of deferred maintenance. The whole field needs to be re-centered. LetsBlowUpTheUniverse.

The fulcrum of the whole confusion is conflating the digital logic of the VonNeumannArchitecture, the symbolic logic of PredicateCalculus, and the analog movements of Babbage's DifferenceEngine -- putting them all under the rubric of "computation". But there are major differences between them, and they create fundamentally different ModelsOfComputation. WikiStub.

After much analysis, I identify two major groups, heretofore un- and under-differentiated. Both have been working under the common banner of "Computer Science", but they work in two separate domains. Up until now, they have been unconsciously sharing a common lexicon, and this has been a major source of the confusion; "everyone has their own opinion" is not a sufficient excuse for an academic milieu. The language needs to be clarified so that this boundary becomes clearer and the field stronger. Once this is accomplished, it will resolve all the confusion between "Objects", "Types", "Classes", "Languages", etc. Can we agree that that is a good thing?

So, there are two major camps in Computer Science, separated by two different ModelsOfComputation. The idea that one can be expressed by the other (along the lines of the ChurchTuringThesis) actually misinforms the intuition. In fact, under (and after) this analysis, it probably renders that *thesis* invalid.

In the first camp, which I'll call AnalogComputerScience, we have what is nominally called "mathematics", but which historically originates *strictly* from philosophy, specifically SymbolicLogic. Here we find the LambdaCalculus and the desire for functional compaction. As such, a defining feature here is recursion. Arithmetic is not a defining feature, nor is there any metaphor of physical computation (the laws of physics do not apply, only the "logikos").
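To make that concrete, here is a minimal sketch (illustrative only; Python chosen merely for familiarity) of computation in this first camp's style: nothing but anonymous functions, no loops, no mutable state, with a fixed-point combinator supplying the recursion.

 # Camp-one computation: only anonymous functions; recursion comes
 # from the strict-language fixed-point (Z) combinator.
 Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))
 fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
 print(fact(5))  # 120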

The second major camp I'll call, for the moment, just ComputerScience. It originates from the computation that evolved mostly out of 20th-century military history, and is rooted in binary (or BooleanLogic) and TuringMachines (generally the VonNeumannArchitecture). This computation is rooted in friggin *physical reality*, not predicate calculus; i.e. the computational hardware has to respect the laws of physics. It even has a simile (a physical tape) rooted in physical reality. I argue that it is here, in the TuringMachine, that the field needs to be re-centered. Throw out symbolic logic completely, except as a cross-disciplinary interest and historical curiosity. Punt it back to Philosophy -- not because it's not useful, but because it confuses the thinking, like forgetting the i in a complex equation, or like mistaking English ("one", "plus", "two") for the realm of natural numbers.
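For contrast, a minimal sketch (again illustrative; the rule table is my own assumption, not from the original text) of camp-two computation: a finite control stepping a head along a tape, one concrete state change at a time.

 # Camp-two computation: a tiny Turing-machine simulator. The tape is
 # a sparse dict, blank cells read as "_".
 def run_tm(rules, tape, state="start", head=0):
     tape = dict(enumerate(tape))
     while state != "halt":
         symbol = tape.get(head, "_")
         write, move, state = rules[(state, symbol)]
         tape[head] = write
         head += 1 if move == "R" else -1
     return "".join(tape[i] for i in sorted(tape))

 # Example rules: invert every bit, halt at the first blank.
 rules = {
     ("start", "0"): ("1", "R", "start"),
     ("start", "1"): ("0", "R", "start"),
     ("start", "_"): ("_", "R", "halt"),
 }
 print(run_tm(rules, "1011"))  # -> "0100_"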

The first camp worries about provable code, the second camp does not -- it has type-checking and I/O to see whether programs work or not. (The former camp hardly ever works with I/O except at the very edge of its computations.)

-- MarkJanssen

Both camps have type-checking and (hopefully) tests.


Note: Terms to reconcile and grab back: Program, Object, Type, Specification, Language... more?

We're okay with them as they are, thanks. But feel free to make your own definitions and we'll let you know if we like them.

The phrase "tilting at windmills" comes to mind. Not a big fan of FunctionalProgramming, are we?

You know, I was actually going to use that phrase of Cervantes', but I didn't think of it in time before submitting. I actually am a big fan of functional programming, and count Lisp as one of my favorite languages, but this is the only way to make sense out of the two camps: taxonomize them at the outermost level, and start regrouping.

Good luck. Call us when you've succeeded at that.

Not to offend, but this seems rather a failure of CS departments versus Maths departments. For software, I will hire a maths grad over a CS grad every time. Perhaps CS v.2 should return CS to EE. CS/EE engineers for hardware design, and programmers should come from applied maths programmes.


This may be semi-related to the above dichotomy, but I see a split between "theorists" and "pragmatists". The theorists tend to use ArgumentByElegance, while the pragmatists tend to focus on economics and also consider programmer knowledge/skill a limited resource to manage and factor into design decisions. The theorists spend time looking for ideal abstractions, while the pragmatists look for the best "investment" given the existing environment and resources (human and capital). The pragmatists are a closer fit to the second camp, but it's still not really a focus on "hardware". (If step-based programming is favored, it's done for allegedly practical purposes, such as fitting common WetWare, not because it reflects hardware design such as VonNeumannArchitecture.) - t

Yes, that is the split; the question is which side is which... each camp sees itself as the pragmatists, it seems, and the other as the theorists...

You mean one side is "symbolic abstraction theorists" and the other "economic theorists"?

Yes, and the latter also refers to themselves as pragmatic programmers.

I still don't see where VonNeumannArchitecture fits in. (I added to the above paragraph.)

VonNeumannArchitecture is [loosely] a *physical implementation* of the *theoretical* TuringMachine. As such, it limits the style of computation. Gone is the ease of functional calculus, focussing as it does on recursion, and hello to iterative programming. Recursion is probably impossible to implement in VonNeumannArchitecture.

[It's quite reasonable to ask where VonNeumannArchitecture fits in, because VonNeumannArchitecture makes it possible to implement any other architecture on top of it, including the "functional calculus, focussing as it does on recursion". Thus, the characteristics of VonNeumannArchitecture become -- for most purposes -- irrelevant to the point of being ignorable. Obviously, there will be limits, but there are limits in all pragmatic computing, because it's all ultimately done on real machines. The fact that the real machines are running virtual machines is immaterial. Do you believe that hardware (real machines) is somehow superior to software (virtual machines)?]
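As a concrete illustration of that point, here is a minimal sketch (mine; no claim that this is any particular compiler's output) of recursion being hosted on iterative, von-Neumann-style machinery: the recursive definition lowered onto an explicit stack.

 # Recursive factorial, and the same computation "lowered" onto an
 # explicit stack, the way compilers target iterative hardware.
 def fact_recursive(n):
     return 1 if n == 0 else n * fact_recursive(n - 1)

 def fact_on_a_stack(n):
     stack, result = [], 1
     while n > 0:          # the "call" phase: push frames
         stack.append(n)
         n -= 1
     while stack:          # the "return" phase: pop frames
         result *= stack.pop()
     return result

 assert fact_recursive(5) == fact_on_a_stack(5) == 120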

The characteristics of VonNeumannArchitecture are significant because they dictate a kind of domain-specific thinking -- the key word here is DICTATE. It's like communicating in Greek vs communicating in English, but stricter, you see.

[Given that (say) Haskell, Prolog, Java and Python all run on the same implementation of VonNeumannArchitecture, how does VonNeumannArchitecture dictate "a kind of domain-specific thinking" to the Haskell, Prolog, Java and Python programmer?]

The languages you mention require very elaborate "abstraction engines" to adapt to the architecture of the machine; some, like Prolog, are borderline magic and difficult to fathom. Portuguese and English share the same alphabet, but they form entirely different semantic relations. That is, they dictate a style of thinking even though, in theory (given the energy/time), one could translate from one to the other.

Whether (full/true) recursion is possible on such machines may be secondary to whether it should be the default or common programming style even if it were possible. Recursion is not the natural/default thinking style of most human beings, at least not to a heavy extent. Some may argue it's superior ONCE one masters it, but that gets into issues of MindOverhaulEconomics (including the finding of future maintainers). Is the productivity increase, once recursion is mastered, sufficient to make up for the time spent learning it? Thus, there are two issues: 1) is it economically practical from a programming-labor standpoint, and 2) is it feasible to build hardware to support it (or close approximations)? - t

[That's why we have more than one programming language. It is good that we have a variety of languages to suit various purposes and programmer abilities or inclinations.]

But there's a certain pattern to which tend to have industry preference. See GreatLispWar.

[Yes, but the industry is broad. Even those with limited abilities or narrow inclinations can find employment using languages suited to particular proclivities.]


There are two camps:

Us should be supported, whilst Them should be condemned.

No, I'm sorry it seemed that way. There is a need to move the "Them" camp to one side of the partition, and the "Us" camp to the other side. The partition is only a practical device to maintain some order, not a judgement.

But you were thinking it the way I wrote it, and you know it. :-)

That's not true. My only judgement is for those who purposely added confusion from the second camp into the first.

What do you mean by "deliberately added confusion"?

Word switch: "purposely". They didn't necessarily "deliberate".

What do you mean by "purposely added confusion"? Who are they that did so? Can you show objective evidence of this purposeful confusing? Was confusion the purpose? Or was there another purpose, and confusion was the result?


Congratulations on rediscovering science vs. engineering, pure vs. applied science, theoretical vs. experimental physics, synthetic vs. descriptive biology, research vs. clinical medicine, climate vs. weather, art history vs. art, legislation vs. adjudication ... I could go on. The same split occurs in every sufficiently intellectualized field, and corresponds only to differences in human temperament. That is, there is nothing about the subject that splits naturally -- each side exists in microcosm in the other -- but some people gravitate one way, some the other, while a few move easily back and forth and are scorned by both sides. Insufficiently aware adherents of each side think theirs is the one worthy pursuit, but rarely get the upper hand for long because only periodic infusions from the other side can stave off sterility and decay.

Ha, except I didn't "rediscover" it. It will take some time before the lexicon gets sorted out. If it had already been seen, we wouldn't have this mess to begin with.

[I'm not clear what "mess" you feel there is? Confusing terminology?]

Refer to pages linked from ThreadMess. A good starting point: TypeTheory {edit: or TypeSystemCategoriesInImperativeLanguagesTwo}.

[Are you sure you're not conflating a perceived "mess" -- in which "mess" apparently implies something rather negative -- with what is actually healthy debate amongst enthusiasts over minutiae, which is positive and leads to reflection and clarification? Note that the "mess" linked from TypeTheory is largely the result of one participant's personal minority views pitched obstinately against an otherwise largely uncontroversial majority view. In what computer scientists and software engineers deal with on a daily basis, whether theoretical or pragmatic, there is no "mess".]


Nice set of dichotomies -- thanks. Yet I wonder if even this much distinction can be applied to CS. In particular, is a computer finite or infinite?

Last time I checked, there weren't any infinite computers for sale at the computer store, but maybe I didn't notice -- I do tend to buy my hardware a year or two behind the cutting edge.


1. You say: Computer Science... originates... Turing machine... friggin physical reality

2. Considering that Turing machines have an infinite tape, they are infinite objects.

3. Last I checked the world is finite -- in fact the universe is finite -- so... I was just trying to figure out the friggin alternate reality you are evidently living in.

Thanks for the opportunity to clarify. Turing machines have an infinite tape. Actual hardware approximates this with memory and swap space, which is adequate for the tasks requested. People have to buy more memory (or faster machines) if their tasks require more power than their current hardware allows.

BTW the match between Haskell and lambda-calculus is way closer than between TMs and von Neumann machines.

Hmm, I doubt that, but I would settle for equivalent.

Also lambda-calculus can be hardware-implemented in a way that TMs can never be. Search for "Reduceron". Remember the infinite tape?

I've seen very smart computer scientists unable to deal with the question of whether computers are finite or infinite. Because:

1. If you say it's finite, then it's a finite automaton, which is a ridiculously low-powered machine (see the sketch after this list).

2. If you say it's infinite, then it's just a mathematical abstraction, and so pragmatically useless.

3. And if you say it's both, you lose the possibility of making the division that you are trying to make. I dare say that the re-centering you are trying for is a laudable objective -- you just need to listen when people speak.
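For reference, the "finite automaton" of point 1 is nothing more than a fixed transition table; a minimal sketch (the states and alphabet are my own assumptions):

 # A finite automaton accepting binary strings with an even number of
 # 1s. Its entire "memory" is one of two states -- hence
 # "ridiculously low-powered" compared to a machine with a tape.
 def accepts(s, state="even"):
     table = {("even", "0"): "even", ("even", "1"): "odd",
              ("odd", "0"): "odd", ("odd", "1"): "even"}
     for ch in s:
         state = table[(state, ch)]
     return state == "even"

 print(accepts("1011"))  # False: three 1s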

It is finite when I'm in "hardware mode". But when in "software mode", ever since the advent of GUI computers, the computer is treated as practically infinite. It is a strange source of confusion (and one of the places where the truth is stranger than fiction), but fortunately, there are physical implementations stamped into reality we can actually point to and resolve any confusion.

[I presume by "practically infinite" "since the advent of GUI computers", you're referring to the fact that an addressable GUI lets us display a very large number of possible pictures on the screen. That can certainly be considered "practically infinite" -- though no more so than any arbitrary RAM in non-GUI computers, obviously -- as long as by "practically infinite" you mean "very large". Mathematically, it is finite. There are a finite number of pixels, and each can only display a finite number of colours, thus the number of images that can be displayed is finite. There is no such thing as an infinite computer. "Infinite" data structures are not actually infinite; they may be conceptually unbounded but are limited by available memory. Even the number of different programs you can run on a given computer is finite.]
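To put a rough number on that finiteness (the screen dimensions are assumed purely for illustration):

 # Distinct images on an assumed 1920x1080, 24-bit display:
 # finite, merely astronomical.
 pixels = 1920 * 1080
 distinct_images = 2 ** (24 * pixels)
 print(distinct_images.bit_length())  # 49766401 bits -- a finite number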


See ObjectOrientedRefactored, ProgrammingIsInTheMind, PragmaticThinkingAndLearning, ConfusedComputerScience.


AprilThirteen

