Confused Computer Science

ComputerScience has been confused. It has conflated two entirely different ModelsOfComputation and put them both in the lexicon of CS. For evidence of this confusion see TypeTheory, TypeSystem, TypeSystemCategoriesInImperativeLanguagesTwo (and One), TypesAndAssociations. If you can't make sense of it all, you now know why this page exists.

ComputerScience got confused when Babbage's DifferenceEngine got lumped together with modern-day computers (i.e., TuringMachines + VonNeumannArchitecture). In other words: a conflation of AnalogVsDigital? computers.

[Babbage's DifferenceEngine was not an analog computer NickKeighley]
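
For concreteness, here is a minimal sketch of the method of finite differences, the principle the DifferenceEngine mechanized: a polynomial is tabulated using nothing but repeated addition. Haskell is used here purely for brevity; the register layout and the sample polynomial are illustrative choices, not Babbage's.

  -- One "crank" of the engine: each register adds in the register to its right.
  -- A degree-n polynomial needs only n+1 registers and additions -- no
  -- multiplication, no binary logic.
  step :: [Integer] -> [Integer]
  step row = zipWith (+) row (tail row ++ [0])

  -- Initial registers for p(x) = x^2 + x + 41 at x = 0:
  -- [p(0), first difference, second difference] = [41, 2, 2].
  table :: [Integer]
  table = map head (iterate step [41, 2, 2])

  main :: IO ()
  main = print (take 8 table)   -- [41,43,47,53,61,71,83,97]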


As an aside, I believe there are three ModelsOfComputation that can be enumerated:

For examples of each: LambdaCalculus, DigitalComputers?, and for the last see StephenWolfram. I don't think many have considered what kind of computation would be natural for such a model; perhaps some unnamed visual (non-numeric, non-symbolic) computation.


The DifferenceEngine is in the same category as LambdaCalculus - neither uses digital logic nor BinaryArithmetic, yet these are the "must-have", defining features of modern-day computers.

[Note: Originally, the above read "... defining features of modern-day ComputerScience", and this was a response to that.] Neither "digital logic" nor binary arithmetic are "must-have", defining features of modern-day ComputerScience. ComputerScience is, by definition, the study of computation in all its forms. No basis in digital logic or binary arithmetic is required, though obviously both are as important as LambdaCalculus. You may be confusing ComputerScience with computer engineering. In computer engineering, the majority of modern-day computer designs do depend on digital logic and binary arithmetic.

Okay, I think you and I must have a talk. If I turn a gear wheel 'round that turns a second, smaller gear, am I "calculating" or is it physics? Because I think we must draw the line somewhere. They do not have the same behaviors at all. Though they may share the same name "computation", they do not share a common computational domain. One will be constrained by physics, the other is not.

IN OTHER WORDS: One of us here is the programmer and one of us is just a brogrammer.

[I guess someone is trying to be rude to someone else but it's hard to tell who is who (and I didn't understand the insult anyway!). Cogs and wheels *can* compute. Babbage's intent was to produce tables of numbers. He was implementing digital computations. He designed (but didn't build) a general purpose programmable digital computer. Can't get much more comp sci than that! NickKeighley]

[I'm pretty sure the originator of this page is the confused one. I think he has a variant of "vitalism". In this case "it isn't a computer/computer-science unless it computes the same way a contemporary computer computes". It *must* use electricity and it must use binary math. Seems like a failure of imagination (amongst other things). NickKeighley]

I'm not quite sure what you mean, or how it's relevant -- or why it matters -- to a discussion of what ComputerScience is about. ComputerScience is the study of computation, largely independent of (though sometimes studying as well) the physical mechanisms used to achieve it. Again, I think you're conflating computer engineering with ComputerScience.

Perhaps that is what you think ComputerScience is, along with Harvard it appears, but it cannot be the case at all. If you are going to call something Computer Science, you're going to have to specify the ModelOfComputation you're using as a foundation for the science; otherwise, this confusion will continue. (See the top paragraphs on ModelsOfComputation, specifically the counter-argument against the ChurchTuringThesis that begins with "the most important practical aspect of...".)

[No. And I have a degree in Computer Science. No you do not have to specify a model. There is no confusion (other than you) NK]

Actually, it can be the case, because it is the case. ComputerScience does not have to "specify the ModelOfComputation you're using for a Foundation for the science", any more than (or as ridiculous a notion as) biology has to specify a particular nematode to use as its "Foundation". Specifying a ModelOfComputation is a foundation for (say) engineering a new programming language, but ComputerScience studies and embraces all models of computation; it doesn't limit itself to one.

Well then, perhaps biology is behind too, because I'm quite sure the DNA structure of fungi is going to be radically different from that of the plant Kingdom, even though they share the generic name of "life". Frankly, I'm not even sure that fungi have DNA at all. So, yes, a biologist would have to specify the Kingdom of life that they're studying if they want to make sense of it.

[You are as well informed about biology as you are about computer science. DNA is DNA in pretty much every life form on the planet. I understand there are minor variations but they are rare. NK]

Sorry, I found out that Fungi don't actually have DNA, so we're both wrong -- but you more than me. Because DNA is not in every life form, insects don't have it, and reptiles don't have it, except perhaps in the blood.

Where do you find this stuff? Fungi, insects, and reptiles all have DNA.

Biology is the study of the mechanisms of "life", so fungi, plants, bacteria, viruses, diatoms, protozoa and creatures are all appropriate subjects of study. Fungi have DNA. What breadth of study is appropriate is up to the individual biologist, but it's all equally biology. Likewise, all models of computation are equally ComputerScience.

What has made it even more confusing are ideas (akin to the ChurchTuringThesis) that each can be implemented in the other, so that "it's all the same". This is wholly irrelevant except for ToyProblems? and endless argumentation. One can also map the imaginary numbers onto the reals, but this doesn't mean they are the same or interchangeable.
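
For readers wondering what "each can be implemented in the other" looks like in practice, here is a minimal sketch of the standard ChurchNumerals encoding: LambdaCalculus-style arithmetic expressed inside an ordinary programming language (Haskell here, chosen only for brevity). Whether such embeddings settle anything is exactly what is argued below.

  -- Church numerals: the number n is "apply a function n times".
  type Church a = (a -> a) -> a -> a

  zero, one :: Church a
  zero _ x = x
  one  f x = f x

  suc :: Church a -> Church a
  suc n f x = f (n f x)

  add, mul :: Church a -> Church a -> Church a
  add m n f x = m f (n f x)   -- apply f m times, then n more times
  mul m n f   = m (n f)       -- apply "n applications of f" m times

  -- Convert back to an ordinary integer by counting applications.
  toInt :: Church Integer -> Integer
  toInt n = n (+ 1) 0

  main :: IO ()
  main = print (toInt (mul (suc one) (add one (suc one))))   -- 2 * (1 + 2) = 6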

Of course LambdaCalculus and TuringMachines are part of the lexicon of ComputerScience -- they both belong there, and both are valid bases for theoretical study. Both are equivalent in terms of not presenting any particular obstacle (or help, for that matter) to the programmer implementing programming languages, operating systems, or other software.

Stop right there. You are making a claim that is not true. This confusion does present an obstacle, and the models are not equivalent; the exact nature of this is what I'm trying to elucidate. Specifically, it is enormously expensive to design certain kinds of languages on some models of computation. I'm not going to get much done with C on a LISP Machine -- if I could even write a compiler on it -- simply because its design is for recursive models (akin to LambdaCalculus).

I'm not convinced any serious confusion exists (casual pub-squabbles amongst equals don't count), and what is not an obstacle is basing a language on any particular ModelOfComputation. C++ and Haskell, for example, are based on different ModelsOfComputation and present very different approaches, but both can achieve equivalent ends.

I don't think you can do GenericProgramming (in the CeePlusPlus sense) in HaskellLanguage.

You don't have to. You do it in the Haskell sense, which is arguably more powerful.
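
A minimal sketch of what "the Haskell sense" can mean: a single definition constrained by a type class works for every type that provides the required operations, much as a C++ function template works for every type supporting the operators it uses. The functions below are invented for illustration, not taken from any particular library.

  -- One definition, many types: anything with a Num instance.
  sumOfThree :: Num a => a -> a -> a -> a
  sumOfThree x y z = x + y + z

  -- One definition, any ordered element type.
  largest :: Ord a => [a] -> a
  largest = foldr1 max

  main :: IO ()
  main = do
    print (sumOfThree (1 :: Int) 2 3)            -- 6
    print (sumOfThree (1.5 :: Double) 2.5 3.0)   -- 7.0
    print (largest "generic")                    -- 'r'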

The other thing is that this kind of argument leads to a TuringTarpit: just because you can do it (in "theory"), doesn't mean it is a viable alternative.

A TuringTarpit is a language that is TuringEquivalent but, usually for illustration or amusement, so awkward as to be effectively unusable. Haskell, for example, is obviously not such a language; it can be used wherever you'd use C++.

ComputerScience has no difficulty reconciling theoretical ModelsOfComputation with applied SoftwareEngineering. In short, there is no confusion.

Sorry, you just made a claim that does not have consensus. You say there is no confusion, but then where is the "software engineering" in Babbage's Computer? In any case, I will try to find the references that show the CS field has been fragmented and does not have consensus. Where you might be confused is that it has, instead, a long-standing tolerable disagreement which has gone silent; but this should not be confused with consensus.

Among computer scientists and experienced software engineers, I think my claim does have consensus. I work with computer scientists and experienced software engineers on a daily basis. Whilst there is good-natured ribbing between (and about) the imperative programming proponents, the logic programming proponents, and the functional programming proponents, we all get the job done using the tools we prefer and we recognize the equal validity of the ModelsOfComputation that underpin them.

Among some computer scientists, you are right. With software engineers, it's mostly irrelevant, because while they've learned various paradigms, they stick within their respective domains, and it isn't an issue. But these different domains, like say the one used for the JavaScript "engine" (a mirror in some way to the JavaVirtualMachine..?), were honed in the presence of these contradictions - they did not solve them. In any case, my consideration is for a UnifiedDataModel, and these old domain (mal)adaptations are no longer tolerable and can't simply be waved away to "get the job done". Come to think of it, the entire confusion only came after the web created its own little domain of "web programming" and, through slow attrition, usurped the lexicon of ComputerScience. Were they not once called ScriptKiddies?

I think your problem of creating a UnifiedDataModel, in light of contradictory views and redundant terminology, is trivially solved by implementing a notion of synonyms (to take care of redundant terminology) and microtheories, which are collections of knowledge that are not contradictory within themselves, but which may contradict other microtheories. The OpenCyc people have already explored some UnifiedDataModel notions, and it may be worth examining their work. That will certainly be easier than trying to reconcile (and then legislate?) terminology throughout the entirety of the computing field.
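
To make the synonyms-plus-microtheories idea concrete, here is a hypothetical sketch; every name below is invented for illustration and is not OpenCyc's vocabulary or API. Synonyms collapse redundant terms onto one canonical term, and assertions are grouped into microtheories that are internally consistent but may disagree with one another.

  import qualified Data.Map as Map

  type Term = String

  -- Redundant terminology: map each synonym onto one canonical term.
  synonyms :: Map.Map Term Term
  synonyms = Map.fromList [("procedure", "subroutine"), ("method", "subroutine")]

  canonical :: Term -> Term
  canonical t = Map.findWithDefault t t synonyms

  -- Assertions live inside named microtheories; two microtheories may
  -- contradict each other without poisoning the whole knowledge base.
  data Assertion = Assertion { subject :: Term, claim :: String } deriving Show

  microtheories :: Map.Map String [Assertion]
  microtheories = Map.fromList
    [ ("ImperativeMt", [Assertion (canonical "method")    "is a named block of statements"])
    , ("FunctionalMt", [Assertion (canonical "procedure") "is a mapping from inputs to outputs"])
    ]

  main :: IO ()
  main = mapM_ print (Map.toList microtheories)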

What you are suggesting is more like a distributed SemanticNetwork?. I actually want to create an ecosystem of working objects, like the DNA snippets inside an organism.

As for Java and Javascript, they share only a name and some syntax common to all C-derived languages. As is sometimes said, Java and Javascript are as similar as car and carpet. The name "Javascript" was chosen purely for marketing reasons, a deliberate attempt by Netscape to ride the coattails of Java popularity. It's not a source of confusion except among the naive, and is mainly a source of amusement, e.g., "Q: What are Java and Javascript? A: One is a light-weight toy language suitable only for trivial applications. The other runs in a Web browser. *groan*"


For the record, I did not create this topic nor the introduction. --TopMind


I think it's clear where the confusion about computer science lies here. And indeed about other fields of mathematics as well (for example, somehow blaming the behavior of exponentiation on Knuth). Computer science is the study of computation. Period. Lambda calculus, Turing machines, cellular automata, electrical, optical, mechanical, von Neumann cores and memory, FPGA computational fabric, neural networks...everything. MarkJanssen, your view of the field is cripplingly narrow and limited.

[addition: see http://en.wikipedia.org/wiki/Peano_axioms NK]


See also ComputerScienceVersionTwo, OneTruePath


CategoryMetaDiscussion


MayThirteen

