ComputerScience has been confused. It has conflated two entirely different ModelsOfComputation and put them both in the lexicon of CS. For evidence of this confusion see TypeTheory, TypeSystem, TypeSystemCategoriesInImperativeLanguagesTwo (and One), TypesAndAssociations. If you can't make sense of it all, you now know why this page exists.
ComputerScience got confused when Babbage's DifferenceEngine got lumped together with modern-day computers (i.e., TuringMachines + VonNeumannArchitecture). In other words: a conflation of AnalogVsDigital? computers.
[Babbage's DifferenceEngine was not an analog computer NickKeighley]
- Perhaps you're just being argumentative, but okay, let's not call it an analog computer then. In that case, it's not a computer at all, but a machine doing physics, not computation.
- All computers "do physics", if I understand what you mean by that, and they do computation. That's true whether they're electrical, mechanical, digital, analog or some combination of all of these.
- No, computers (hardware) obey physics, physics does not obey the computer. [Edit: Do you say a car transmission is a computer, for example?]
- Yes, (manual) car transmissions are analog computers that perform multiplication by a limited set of rational ratios.
- Okay, then. Apparently the source of confusion is the use of the word "computer". When I say "computer", I mean a GeneralPurposeComputer. When you say it, you mean something more like a calculator.
- When I say "computer", I mean a computational device, i.e., a thing that performs computation. You're thinking of a programmable computer.
- True. No one said otherwise.
- You did: you said Babbage's engine was not a computer. [Edit: Sorry, again, this confusion is rooted around different semantics for the word "computer". See comment above about GeneralPurposeComputer.]
- No, I didn't. Look above: NickKeighley says "Babbage's DifferenceEngine was not an analog computer." He didn't say it's not a computer. He's addressing your assertion that there's "a conflation between AnalogVsDigital?" computers and "Babbage's DifferenceEngine got lumped together with modern day computers". Actually, Babbage's DifferenceEngine was a mechanical calculator -- in other words, a constrained, non-programmable, digital computer.
- What I'm saying is that YOU are doing the computation: when you look at the state of an AnalyticalEngine, you are putting the "equations" together, coupled with the very design of the machine, which required so much flippin' foresight and "computation" that no one's been able to put one together.
- [That's just silly. The Difference Engine produced a perfectly clear result, no further computation needed (if you added the optional printer you got a printed page! (actually something more like a type-set block)). The Science Museum in London has built a Difference Engine (two actually). Really if you are going to use Babbage's machines as examples couldn't you do some minimal research first? Wikipedia is a good start. NickKeighley]
- Whether you use mechanical devices or electrical devices to create a computer, it's functionally equivalent. The only difference is that it's possible to make electrical computers that are much faster, smaller, lighter, and easier to build than those based on purely mechanical action.
- No, it's not. Because in the mechanical one, there's one more layer of processing done by the user. It's akin to a slide-rule. Does a slide-rule DO computation? And here you see why I reject your definition: not because you can't rationalize it, but because it simply is less informed. Now go away.
- A slide rule is a manually-operated analog calculator (i.e. a non-programmable computer). The AnalyticalEngine was a proposal for a digital, programmable computer. It would have been TuringComplete.
- Okay, now you're drinking the kool-aid, sold by Britain (with respect and credit to Wyler's for "kool-aid"(TM)). You're passing off the ChurchTuringThesis (which is really just Church's Thesis) as *fact* when it is a thesis, a theory. The model of computation you're describing has no symbols, so formally it can't be said to be the same ModelOfComputation as a TuringMachine, which is defined specifically with symbols.
- Huh? Please read some introductory texts on ComputerScience and computer architecture. It will help you understand our position, and why yours -- at least so far -- doesn't make any sense.
- And there you see the problem. If introductory texts were sufficient, we wouldn't have 200+ pages of raging debate. When you're ready for graduate school, find me.
- I teach graduate study, and you've yet to give me any justification to "find you". The 200+ pages of raging debate here are mostly between the people who have read the introductory texts vs those who haven't. There are worthwhile, meaningful and intelligent debates -- with useful resolutions -- amongst those who have not only read the introductory texts but the advanced ones too. Sadly, they aren't here, with the result that the lengthier debates tend to stay squarely at the 1st-year undergraduate level.
- Graduate "study" -- that's about as ambiguous as "tutoring help". You teach the status quo. The 200+ pages of debate are people who have one point of view and people who have another. There are not useful resolutions, only destructive ones. Again, you are talking like an expert, but are not. It doesn't matter if the whole Establishment agrees with you. But you would know that if you had a real PhilosophaeDoctorae?.
- Please identify where I (and the others who disagree with you on this page) are wrong, and back it up with logical or empirical evidence. If you are unable or unwilling to do so, we'll have to conclude that your criticisms are spurious and unfounded.
- "We'll have to conclude" (emphasis mine) is false. You won't have to conclude, you simply will because 1) you don't understand, 2) are too lazy, or 3) are weak. They are neither spurious (by fact that they arise after considerable and analysis and deduction), nor unfounded (by virtue that they are founded on a broader review of the subjects entailed). Your problem is that you want find someone else to back up what I say. And you don't feel secure enough to "jump ship", not because you can find anything wrong with it, prima facie. So STFU.
- No, I want you to back up what you say, because otherwise there is no evidence for your assertions. Why should anyone believe you without evidence?
- The evidence is someone using their life to propose something. It takes energy to do that and risks losing one's reputation. That is evidence enough. You ***"want""***. But you do not deserve.
- You're not making sense.
- To you.
- That's right, to me. I'm sure you make sense to yourself, otherwise you wouldn't be writing what you do. If you make sense to anyone else, I'm sure they'll be along shortly to clarify things. As it stands, I don't see what your comment -- which appears to be about personal sacrifice -- has to do with ComputerScience, computational models, or historical computing devices. Could you explain the relationship?
- Well, then, it seems just Top and me are "holding the fort".... The relevance of the last statement is not about personal sacrifice as much as showing that there's "just cause" to question. You want me to provide the evidence. I'm saying that my effort to explain is justification for you to examine the argument. If you want to lock yourself up in some private intellectual enclave of the Boundless, go ahead. My effort to dialog should be seen as respect for your personhood. I'm a bit aggrieved that you refuse to show the same. But that loss hasn't been tabulated. Anyway, moving along, tell me, how are you going to make a compiler for your Babbage Engine? (I'll be looking forward to your bank of car transmissions compiling my C source for the Linux kernel.)
- A bank of car transmissions is rather awkward to program. There are obvious practical distinctions between a programmable computer built with rods, gears and levers and one built with microscopic transistors, but there are no theoretical distinctions.
As an aside, I believe there are three ModelsOfComputation that can be enumerated. For examples of each: LambdaCalculus, DigitalComputers, and ? (for the last, see StephenWolfram). I don't think many have considered what kind of computation would be natural for such a model; perhaps some unnamed visual (non-numeric, non-symbolic) computation.
- Strangely, one finds this confusion surrounding the venerable DonaldKnuth's idea of exponentiation of the zero; that is, zero to the zeroth power, which he set equal to 1, but it cannot be. Zero times anything is zero. This is a disagreement between basic algebra and limit theory for calculus (which is to say the numeric/symbolic domain vs. a geometric domain). I put this little tidbit here to see if anyone else cares to examine why such a decision affects ComputerScience.
- Your misunderstanding lies in thinking 0 to the zeroth power is zero times anything. A simplistic explanation is that it isn't doing "times" by the time it gets to zero. 2^3 is 2 * 2 * 2; 2^2 is 2 * 2; 2^1 is 2; so 2^0 is ...? In one interpretation, it's 1 by definition rather than by derivation. Are you in fact referring to 2^0 (two to the zeroth power)? The fact that it's 1 makes it trivial to (for example) convert binary to decimal. E.g., 11001 = 1*2^4 + 1*2^3 + 0*2^2 + 0*2^1 + 1*2^0 = 16 + 8 + 0 + 0 + 1 = 25. (A short code sketch of this positional conversion, and of the 0^0 convention, follows this exchange.)
- Hmm, but what about this: ln(0^0) = 0 * ln(0) = 0 * (-inf) =? 0. Even if you don't want to accept that it's equal to zero, it's clear that it must be indeterminate, yes (since 0 * inf is indeterminate at best)? (Note: taken from njguliyev @ math.stackexchange.com)
- Zero to the zeroth power being equal to 1 pre-dates Knuth -- it long pre-dates ComputerScience, in fact -- and see http://www.askamathematician.com/2010/12/q-what-does-00-zero-raised-to-the-zeroth-power-equal-why-do-mathematicians-and-high-school-teachers-disagree/
- Thanks for the reference. I'm sure the question pre-dates Knuth, but it seems that he made the decision that affected Computer Science.
- He documented the definition, but didn't originate it. He did argue -- quite compellingly -- in favour of it. See http://mathscitech.org/articles/zero-to-zero-power How do you believe 0^0 negatively (I presume) affects computer science?
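An aside for concreteness, as a minimal sketch only (plain Haskell, chosen because it comes up again further down this page): the positional conversion from the exchange above, and the way the x^0 = 1 convention is what lets the lowest-order term and the constant term of a polynomial work without a special case. GHC's Prelude defines (^) so that any base raised to the exponent 0 is 1.

 -- A minimal sketch: positional binary-to-decimal conversion. The
 -- lowest-order digit relies on the convention that 2^0 = 1, and the
 -- same convention (x^0 = 1, even for x = 0) lets polynomial
 -- evaluation at zero pick out the constant term with no special case.
 binaryToDecimal :: [Int] -> Int
 binaryToDecimal bits = sum (zipWith digit (reverse bits) [0 ..])
   where
     digit b k = b * 2 ^ k          -- 2 ^ 0 evaluates to 1

 evalPoly :: [Integer] -> Integer -> Integer
 evalPoly coeffs x = sum (zipWith term coeffs [0 ..])
   where
     term c k = c * x ^ k           -- the constant term relies on x ^ 0 == 1

 -- binaryToDecimal [1, 1, 0, 0, 1]  ==>  25
 -- evalPoly [5, 0, 3] 0             ==>  5   (i.e. 5*0^0 + 0*0^1 + 3*0^2)
 -- (0 :: Integer) ^ 0               ==>  1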
The DifferenceEngine is in the same category as LambdaCalculus - neither uses digital logic or BinaryArithmetic, yet these are the "must-have", defining features of modern-day computers.
[Note: Originally, the above read "... defining features of modern-day ComputerScience", and this was a response to that.] Neither "digital logic" nor binary arithmetic are "must-have", defining features of modern-day ComputerScience. ComputerScience is, by definition, the study of computation in all its forms. No basis in digital logic or binary arithmetic is required, though obviously both are as important as LambdaCalculus. You may be confusing ComputerScience with computer engineering. In computer engineering, the majority of modern-day computer designs do depend on digital logic and binary arithmetic.
Okay, I think you and I must have a talk. If I turn a gear wheel 'round that turns a second, smaller gear, am I "calculating" or is it physics? Because I think we must draw the line somewhere. They do not have the same behaviors at all. Though they may share the same name "computation", they do not share a common computational domain. One will be constrained by physics, the other is not.
- Computation is constrained by physics. Information does not travel faster than the speed of light. Information always has a physical representation, with a minimum energy dependent on ambient temperature (cf. Landauer's principle). What's with this widespread myth that computation is independent of physics? Computation is mechanical. (related: http://calculemus.org/logsoc03/materialy/ConservativeLogic.pdf)
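As a rough illustration of the Landauer bound mentioned above (a sketch only, using the standard textbook constants): erasing one bit of information at absolute temperature T costs at least k_B * T * ln 2 of energy.

 -- A rough sketch of Landauer's bound: the minimum energy to erase one
 -- bit of information at absolute temperature t (in kelvin).
 landauerBound :: Double -> Double
 landauerBound t = kB * t * log 2
   where
     kB = 1.380649e-23              -- Boltzmann constant, J/K

 -- landauerBound 300  ~=  2.87e-21 joules per bit erased at room temperature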
IN OTHER WORDS: One of us here is the programmer and one of us is just a brogrammer.
[I guess someone is trying to be rude to someone else but it's hard to tell who is who (and I didn't understand the insult anyway!). Cogs and wheels *can* compute. Babbage's intent was to produce tables of numbers. He was implementing digital computations. He designed (but didn't build) a general purpose programmable digital computer. Can't get much more comp sci than that! NickKeighley]
- Sorry, I'm going to pull rank and decline your notion of computation. Despite its popularity in the canon of CS, it simply isn't computation.
- On what basis? NickKeighley and the "canon of CS" are correct and you're wrong, unless you can prove otherwise.
- Okay, as I asked above: Does a slide-rule DO computation? Because, boner, they are very different ModelsOfComputation, aren't they? Would you please make a TuringComplete slide-rule, because no one is able to make a functionally-equivalent Babbage Engine, despite the claim that it's the same.
- The process of computation is indeed different between the mechanical, manual, analog calculation of a slide-rule, vs the digital calculation of a DifferenceEngine or a modern pocket calculator, vs the programmable TuringMachine computation of an AnalyticalEngine or a modern digital electronic computer. They are all different ways of performing computation, but all computation is the essence of ComputerScience. A functionally-equivalent Babbage Engine to a modern computer is the AnalyticalEngine -- which certainly can be built but hasn't been built only due to cost, not due to infeasibility.
- Please tell me how to write a recursive function on your AnalyticalEngine. Then, tell me how your claim of irrelevance as to the question of ModelsOfComputation will allow you to make it equivalent to iterating from 1 to 10,000. Thanks bonehead.
- Are your insults necessary? None other than AdaLovelace, the first computer programmer, already solved this problem. See http://people.maths.ox.ac.uk/kar/AdaLovelace.html and search for "Ada's Notes to the translation of Menabrea's memoirs terminated with a step-by-step description ..." You appear to be conflating complexity with possibility. The fact that it's humanly complex to (say) write or implement a program to compute a recursive function, or even an iterative one, on a historical computing device (or a modern constrained one, like a PIC chip) has no bearing whatsoever on its computational validity. (A small sketch of the recursion-vs-iteration equivalence follows this exchange.)
- The point isn't complexity, it's that it simply isn't or won't be TuringComplete. Besides computation, there is the meta-computation of algorithms and arbitrary functions.
- Why do you think it wouldn't be TuringComplete?
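A minimal sketch of the recursion-vs-iteration point (the function is triangular numbers, chosen purely as an illustration, and the kind of equivalence the AdaLovelace reference above is pointing at): the recursively defined version and the "iterate from 1 to n" version compute the same result.

 -- A minimal sketch: a recursively defined function computed two ways,
 -- by direct recursion and by iterating over 1..n, with identical results.
 triRec :: Integer -> Integer
 triRec 0 = 0
 triRec n = n + triRec (n - 1)

 triIter :: Integer -> Integer
 triIter n = foldl (+) 0 [1 .. n]    -- the "iterate from 1 to 10,000" form

 -- triRec 10000 == triIter 10000    ==>  True (both give 50005000)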
[I'm pretty sure the originator of this page is the confused one. I think he has a variant of "vitalism". In this case "it isn't a computer/computer-science unless it computes the same way a contemporary computer computes". It *must* use electricity and it must use binary math. Seems like a failure of imagination (amongst other things). NickKeighley]
- NickKeighley is pretty sure. But that adds nothing. I'm pretty sure you're another spoiled benefactor of a country that has never taken responsibility for its history of near genocide and more recently torture. So you're speaking with power unearned. "What goes wrong with setting 0^0=1"? [Algebra] What goes wrong with setting 1/0=0? You are something like TheKooks who expect everyone else to answer you rather than just consider that a proposal (originated elsewhere) may actually be the best way forward.
- [I'm not sure what my ethnic/cultural origins or political opinions have to do with a technical discussion. NK]
- What does being a "spoiled benefactor of a country", etc., have to do with anything on this page? And, once again, what problem does 0^0 = 1 actually cause?
I'm not quite sure what you mean, or how it's relevant -- or why it matters -- to a discussion of what ComputerScience is about. ComputerScience is the study of computation, largely independent of (though sometimes studying as well) the physical mechanisms used to achieve it. Again, I think you're conflating computer engineering with ComputerScience.
Perhaps that is what you think ComputerScience is, along with Harvard it appears, but it cannot be the case at all. If you are going to call something Computer Science, you're going to have to specify the ModelOfComputation you're using as a Foundation for the science; otherwise, this confusion will continue. (See the top paragraphs on ModelsOfComputation, specifically the counter-argument against the ChurchTuringThesis that begins with "the most important practical aspect of...".)
[No. And I have a degree in Computer Science. No, you do not have to specify a model. There is no confusion (other than you). NK]
Actually, it can be the case, because it is the case. ComputerScience does not have to "specify the ModelOfComputation you're using for a Foundation for the science", any more than (or as ridiculous a notion as) biology has to specify a particular nematode to use as its "Foundation". Specifying a ModelOfComputation is a foundation for (say) engineering a new programming language, but ComputerScience studies and embraces all models of computation; it doesn't limit itself to one.
Well then, perhaps biology is behind too, because I'm quite sure the DNA structure of fungi is going to be quite radically different from that of the plant Kingdom, even though they share the generic name of "life". Frankly, I'm not even sure that fungi have DNA at all. So, yes, a biologist would have to specify the Kingdom of life that they're studying if they want to make sense of it.
[You're as well informed about biology as you are about computer science. DNA is DNA in pretty much every life form on the planet. I understand there are minor variations but they are rare. NK]
Sorry, I found out that Fungi don't actually have DNA, so we're both wrong -- but you more than me. Because DNA is not in every life form, insects don't have it, and reptiles don't have it, except perhaps in the blood.
Where do you find this stuff? Fungi, insects, and reptiles all have DNA.
- To you.
- Are you claiming that fungi, insects, and reptiles having DNA is dependent on one's personal viewpoint?
Biology is the study of the mechanisms of "life", so fungi, plants, bacteria, viruses, diatoms, protozoa and creatures are all appropriate subjects of study. Fungi have DNA. What breadth of study is appropriate is up to the individual biologist, but it's all equally biology. Likewise, all models of computation are equally ComputerScience.
What has made it even more confusing is the idea (akin to the ChurchTuringThesis) that people keep arguing that each can be implemented in the other, so that "it's all the same". This is wholly irrelevant except for ToyProblems? and endless argumentation. One can also map the imaginary numbers onto the reals, but this doesn't mean they are the same or interchangeable.
Of course LambdaCalculus and TuringMachines are part of the lexicon of ComputerScience -- they both belong there, and both are valid bases for theoretical study. Both are equivalent in terms of not presenting any particular obstacle (or help, for that matter) to the programmer implementing programming languages, operating systems, or other software.
Stop right there. You are making a claim that is not true. This confusion does present an obstacle, and the two are not equivalent; the exact nature of this is what I'm trying to elucidate. Specifically, it is enormously expensive to design certain kinds of languages on some models of computation. I'm not going to get much done with C on a LISP Machine -- if I could even write a compiler on it -- simply because its design is for recursive models (akin to LambdaCalculus).
I'm not convinced any serious confusion exists (casual pub-squabbles amongst equals don't count), and what is not an obstacle is basing a language on any particular ModelsOfComputation. C++ and Haskell, for example, are based on different ModelsOfComputation and present very different approaches, but both can achieve equivalent ends.
- Turing completeness ensures that you can write any computable function. It says nothing about equivalence with respect to extensibility, modularity, security, performance, complexity, and convenience. (To be equivalent with respect to modularity and extensibility is formally related to homotopy and homeomorphism.) Cellular automata are Turing complete, but it would be extremely difficult to write a usable operating system in one (see the Rule 110 sketch after this exchange). Why do you imply such considerations are not valid obstacles?
- I did not intend to imply that such considerations are not valid practical obstacles, so I'm not sure why you inferred it. My point was that ComputerScience -- in particular, as a theoretical field of study -- is not hindered by "any serious confusion" between LambdaCalculus and TuringMachines, as was alleged by my correspondent.
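To make the cellular-automaton point concrete, a sketch of a single Rule 110 update step (Rule 110 is the one-dimensional cellular automaton Matthew Cook proved TuringComplete). It shows how spartan a Turing-complete substrate can be, which is exactly why Turing completeness by itself says nothing about convenience.

 -- Rule 110: the next state of a cell from its left, centre and right
 -- neighbours (the truth table is the binary expansion of 110 = 01101110).
 rule110 :: Bool -> Bool -> Bool -> Bool
 rule110 l c r = case (l, c, r) of
   (True,  True,  True ) -> False
   (True,  True,  False) -> True
   (True,  False, True ) -> True
   (True,  False, False) -> False
   (False, True,  True ) -> True
   (False, True,  False) -> True
   (False, False, True ) -> True
   (False, False, False) -> False

 -- One synchronous update of a finite (non-empty) row, padding the edges
 -- with dead cells.
 step :: [Bool] -> [Bool]
 step row = zipWith3 rule110 (False : row) row (tail row ++ [False])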
I don't think you can do GenericProgramming (in the CeePlusPlus sense) in HaskellLanguage.
You don't have to. You do it in the Haskell sense, which is arguably more powerful.
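For what it's worth, a minimal sketch of what "the Haskell sense" looks like: parametric polymorphism gives functions that work for every type, and type classes play a role comparable to C++ templates constrained by concepts.

 import Data.List (foldl')

 -- Works for any element type whatsoever (parametric polymorphism).
 count :: [a] -> Int
 count = foldl' (\n _ -> n + 1) 0

 -- Works for any type with an ordering (a constrained generic, roughly
 -- analogous to a C++ template constrained by a concept).
 largest :: Ord a => [a] -> Maybe a
 largest []       = Nothing
 largest (x : xs) = Just (foldl' max x xs)

 -- largest [3, 1, 4, 1, 5]     ==>  Just 5
 -- largest ["ada", "babbage"]  ==>  Just "babbage"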
The other thing is that this kind of argument leads to a TuringTarpit: just because you can do it (in "theory"), doesn't mean it is a viable alternative.
A TuringTarpit is a language that is TuringEquivalent but, usually for illustration or amusement, so awkward as to be effectively unusable. Haskell, for example, is obviously not such a language; it can be used wherever you'd use C++.
ComputerScience has no difficulty reconciling theoretical ModelsOfComputation with applied SoftwareEngineering. In short, there is no confusion.
Sorry, you just made a claim that does not have consensus. There only seems to be no confusion because the question isn't confronted: where is the "software engineering" in Babbage's Computer? In any case, I will try to find the references that show the CS field has been fragmented and does not have consensus. Where you might be confused is that it has, instead, a long-standing tolerable disagreement which has gone silent; but this should not be confused with consensus.
- I think the problem is between a TypeSystem model (the view from above) vs a Hardware view (a view from below). The latter is tied to DigitalLogic. See also ClosuresConsideredHarmful.
- For most programming -- to the extent that this is a concern at all -- it's only a significant issue for the compiler or interpreter developer and certainly not a difficult or controversial one.
Among computer scientists and experienced software engineers, I think my claim does have consensus. I work with computer scientists and experienced software engineers on a daily basis. Whilst there is good-natured ribbing between (and about) the imperative programming proponents, the logic programming proponents, and the functional programming proponents, we all get the job done using the tools we prefer and we recognize the equal validity of the ModelsOfComputation that underpin them.
Among some computer scientists, you are right. With software engineers, it's mostly irrelevant, because while they've learned various paradigms, they stick within their respective domains, and it isn't an issue. But these different domains, like say the one used for the JavaScript "engine" (a mirror in some way to the JavaVirtualMachine..?), were honed in the presence of these contradictions - they did not solve them. In any case, my consideration is for a UnifiedDataModel, and these old domain (mal)adaptations are no longer tolerable and can't simply be waved away to "get the job done". Come to think of it, the entire confusion only came after the web created its own little domain of "web programming" and, through slow attrition, usurped the lexicon of ComputerScience. Were they not once called ScriptKiddies?
- [No, not really... unless you are intentionally conflating developing websites with breaking into websites.]
I think your problem of creating a UnifiedDataModel, in light of contradictory views and redundant terminology, is trivially solved by implementing a notion of synonyms (to take care of redundant terminology) and microtheories, which are collections of knowledge that are not contradictory within themselves, but which may contradict other microtheories. The OpenCyc people have already explored some UnifiedDataModel notions, and it may be worth examining their work. That will certainly be easier than trying to reconcile (and then legislate?) terminology throughout the entirety of the computing field.
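A minimal sketch of the synonym/microtheory idea, with purely hypothetical names (Term, SynonymMap, Microtheory, holdsIn are all made up for illustration): terms are canonicalised through a synonym table, and assertions are grouped into microtheories that only need to be consistent internally, so two microtheories are free to contradict each other.

 import qualified Data.Map as Map
 import qualified Data.Set as Set

 type Term        = String
 type SynonymMap  = Map.Map Term Term      -- alias -> canonical term
 type Assertion   = (Term, Term, Term)     -- simple subject/relation/object triple
 type Microtheory = Set.Set Assertion      -- internally consistent by convention

 canonical :: SynonymMap -> Term -> Term
 canonical syns t = Map.findWithDefault t t syns

 -- Queries are answered per microtheory; different microtheories may
 -- give different answers, and that is allowed.
 holdsIn :: SynonymMap -> Microtheory -> Assertion -> Bool
 holdsIn syns mt fact = Set.member (canon fact) (Set.map canon mt)
   where
     canon (a, b, c) = (canonical syns a, canonical syns b, canonical syns c)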
What you are suggesting is more like a distributed SemanticNetwork?. I actually want to create an ecosystem of working objects, like the DNA snippets inside an organism.
As for Java and Javascript, they share only a name and some syntax common to all C-derived languages. As is sometimes said, Java and Javascript are as similar as car and carpet. The name "Javascript" was chosen purely for marketing reasons, a deliberate attempt by Netscape to ride the coattails of Java popularity. It's not a source of confusion except among the naive, and is mainly a source of amusement, e.g., "Q: What are Java and Javascript? A: One is a light-weight toy language suitable only for trivial applications. The other runs in a Web browser. *groan*"
For the record, I did not create this topic nor the introduction. --TopMind
I think it's clear where the confusion about computer science lies here. And indeed about other fields of mathematics as well (for example, somehow blaming the behavior of exponentiation on Knuth). Computer science is the study of computation. Period. Lambda calculus, Turing machines, cellular automata, electrical, optical, mechanical, von Neumann cores and memory, FPGA computational fabric, neural networks... everything. MarkJanssen, your view of the field is cripplingly narrow and limited.
- Quite the opposite, sir. It's actually very broad. Do you know what you even mean when you say "computation"? Do you mean adding numbers? Do you mean interfacing with a user? That's what computers do, right? What does it mean to add a number? Is it a quantity being increased? I'm arguing that you do not know as much as the field would confer on you.
[addition: see http://en.wikipedia.org/wiki/Peano_axioms NK]
- Oh, and by the way, I'm only placing the blame on Knuth for infecting ComputerScience with this ridiculous notion of making the exponentiation of the zero equal to 1 when it should clearly remain indeterminate. If math wants to believe it, fine, but there is no rigor that can support it, only personal philosophies, at best.
- What is the (presumably undesirable) effect of making the exponentiation of zero equal to 1?
- About the same as making 1/0 = 0.
- That doesn't answer the question. Specifically, what effect does 0^0 = 1 have? What goes wrong because of it?
- In ways impossible to explain to you, the future goes wrong.
- If you can't rationally defend your assertions, I suggest you don't make them. Comments like "the future goes wrong" may be serious to you, but they're silly to us.
- You couldn't respond to my legitimate comparison between 1/0 and 0^0 having definite answers. I'm not going to do your thinking for you. Either you admit the comparison is apt, or you're just another TheKook?.
- No, I chose not to respond because the "legitimate comparison" is silly. The rationale for 0^0 = 1 and 1/0 = undefined are amply demonstrated all over the Internet and in math texts. If you have a genuine ComputerScience reason for 0^0 != 1, you should articulate it. Otherwise, we'll consider your claim to be spurious.
- Fine. x/x = x^1 / x^1 = x^(1-1) = x^0, therefore 0/0 must be treated the same as 0^0 (from Tim @ math.stackexchange.com).
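For reference, the limit forms and the discrete (empty-product) definition being appealed to in the exchange above are different objects; stated side by side in LaTeX notation:

 % As limits, x^y has no single value at (0,0):
 \[
   \lim_{x \to 0^{+}} x^{0} = 1, \qquad
   \lim_{y \to 0^{+}} 0^{y} = 0, \qquad
   \lim_{x \to 0^{+}} x^{x} = 1.
 \]
 % As a discrete definition (the empty product), used in power series,
 % binomial expansions and positional notation, x^0 = 1 for every x:
 \[
   x^{n} = \prod_{k=1}^{n} x, \qquad x^{0} = \prod_{k=1}^{0} x = 1.
 \]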
See also ComputerScienceVersionTwo, OneTruePath
CategoryMetaDiscussion
MayThirteen