Are Current Languages Shaped By Hardware

Are current languages and/or our language preferences shaped by hardware-centric concerns?


(Based on a discussion in ValueExistenceProofTwo)

{I think you'd find my writing style a great deal less "alien" if you had slightly more background in ComputerScience. I suspect I'm often writing assuming knowledge of computer fundamentals and computer architecture that you appear not to have.}

Your definition/view of "fundamental" is suspect, possibly polluted by hardware concerns/exposure. Software is not really about hardware ("computer architecture"). The fact that you brought computer architecture into this strongly suggests you are misguided. In fact, I'd say programming languages are more about communication with and between humans than machines. Their comparative lack of ambiguity is what allows them to be processed by machines, but also improves inter-human communication. If our languages are primarily shaped by machine concerns, then we are doing it wrong, because machines can be improved if they turn out to be the bottleneck. Ambiguity is the biggest shaper of programming languages, not machines. (See also ProgrammingNotAboutMachines). And you seem to lack fundamental knowledge of the scientific process, not understanding the role of models in it. Prove you are smart and objective, don't just claim it. Claims are cheap on the web: mere commodity hot air and bragging. ItemizedClearLogic not based on word-play would be a good start. Even with word-play it would be a step up from your fuzzy English "proofs".

{Popular imperative programming languages are an abstraction of the underlying machine. C/C++ makes this particularly obvious, but it's no less true of (say) Python and PHP. Software is not about hardware, but it runs on hardware, and popular imperative programming languages reflect this fact.}

They are NOT an "abstraction of the underlying machine". That's utter hogwash. And perhaps they were partly influenced by hardware concerns, but that doesn't mean one is forced to view them in terms of hardware. That they can be viewed in hardware terms is not the same as "must" be viewed in hardware terms, or even "best" viewed in hardware terms. If you can logically prove they are "best" viewed in terms of hardware, please do. But paint me skeptical.

{You don't think C, for example, is an abstraction of the underlying machine? That's exactly why it was created -- to abstract away machine specifics, whilst still acting effectively as a generic assembly language for operating systems development.}

C, perhaps, but that's a language-specific trait. Other Algol-influenced languages, especially dynamic ones, differ. The only machine-centric remnant I can identify in ColdFusion is a maximum precision (number of decimal places) in floating point numbers. One does not have to worry about 4-byte numbers versus 8-byte numbers, etc. But even as a practical "external" matter, machines aside, floating point precision should probably have some upper limit. Otherwise writing un-formatted test output would take up a gajillion screen-fulls. Thus, I'm not even sure I'd consider floating point precision limits to be a machine-centric issue. Maybe the existing limit is too small, but that's a minor thing to quibble about. For roughly 99.9% of programming thinking, I don't have to think about such machine issues (performance aside). (Note that I have not encountered issues with integer size in CF. There may be limits or conversion-to-floating issues with that, but I've never encountered them myself.)
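
To make the width-and-precision point concrete, here is a minimal C sketch (my own illustration, not from the original discussion, and not ColdFusion) of the machine-level decisions that a C programmer makes explicitly and a dynamic language such as ColdFusion mostly hides:

 #include <inttypes.h>
 #include <stdio.h>

 int main(void) {
     /* In C the programmer picks a machine width explicitly. */
     int32_t four_bytes  = INT32_MAX;   /* 2147483647 */
     int64_t eight_bytes = INT64_MAX;   /* 9223372036854775807 */
     printf("%" PRId32 " %" PRId64 "\n", four_bytes, eight_bytes);

     /* Floating point precision is capped by the hardware format: a
        64-bit IEEE 754 double carries only ~15-17 significant digits. */
     double third = 1.0 / 3.0;
     printf("%.30f\n", third);   /* digits past ~17 are noise, not thirds */
     return 0;
 }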

{VonNeumannArchitecture was influenced by algebraic notation, which in turn influenced language design. Languages that implement mutable variables and imperative statements are all abstractions of the machine. Languages like Prolog and Haskell, which implement significantly different semantics, are much less of a machine abstraction.}

That statement is difficult to verify. It appears more as if the marketplace determined which coding style became popular, not hardware builders, and we cannot rip open millions of consumers from that era to study their neuron patterns (if they are still alive). You are welcome to present indirect evidence of such an association, but it's probably only spotty evidence and not reliable enough to make such generalizations. BASIC was pushed by the early microcomputer hardware builders because it was cheap, relatively familiar, fit in RAM, and didn't require a RAM-hogging code editor. However, despite temporary popularity, programmers eventually drifted back to Algol style. Thus, hardware concerns were not sufficient to control language choice in the longer run in that case.

If programmers liked Prolog, it would have been more common by now. Rejection of Prolog has very little, if anything, to do with hardware. I suspect it has to do with the difficulty of incremental decomposition and re-composition of such logic languages. They are less subject to StepwiseRefinement. Humans like to break things into smaller parts for debugging and grokking and DivideAndConquer.


Re: "VonNeumannArchitecture was influenced by algebraic notation"

Exactly how was VonNeumannArchitecture influenced by algebraic notation? What are the alternatives that were considered and rejected for not "fitting" algebra, and where is this documented? It appears to me that early computers were largely shaped by resource constraints. If somebody found a way to do useful computations with cheaper hardware, it would have been done in a heartbeat, because the existing machines were extremely expensive compared to other factors (such as WetWare). Cost was THE driving factor. Only in the mid-to-late 1950's did WetWare-friendly notation start to become a subject of investigation and research (driven largely by GraceHopper). Being cross-vendor compatible was at least as much a factor driving higher-level language research as WetWare issues. If we ranked the importance of such issues around that time, the order would probably be:
 1. Hardware cost and machine efficiency
 2. Cross-vendor compatibility
 3. WetWare-friendliness of the notation

Solving #2 tended to also help with #3 since the language didn't have to target a specific machine. A period of heavy experimentation began; math-oriented notation (Fortran), English-like notation (COBOL), and many other flavors (such as APL, Lisp, and Prolog) were invented. Roughly around the early 1970's, WetWare issues overtook hardware costs in priority. The cost of programmers was growing larger than the cost of the computers they used. Thus, it made sense to buy more powerful machines instead of hiring more programmers. Shops could choose languages based on language design with only minor concern over hardware issues. Shops were free to choose from all the coding-style experiments of the golden era of language invention (roughly 1956 to 1976). Despite all that choice, the Algol style eventually prevailed and became dominant (with perhaps a bit of the logical/FP style surviving in SQL).


http://www.stanford.edu/class/cs242/readings/backus.pdf (Related: CanProgrammingBeLiberatedFromTheVonNeumannStyle)

The above paper has been suggested as evidence that existing languages are driven by hardware design. However, it does not provide much evidence. The similarities between machines and our languages may be coincidental, or due to some other force(s) besides "historical habit". A causal relationship has not been established in that article.

More math-like or formula-like languages similar to the alternatives described have been tried, such as ProLog, but they fail to catch on because programmers find them hard to grok. For one, they tend to resist or lack something comparable to StepwiseRefinement. StepwiseRefinement makes it possible to define high-level tasks via progressively lower-level tasks in a fractal kind of way. Most find it natural and intuitive, even programmers with little or no "hardware" education. There is no known equivalent of StepwiseRefinement in ProLog. Pieces of equations extracted in isolation rarely carry any meaning for humans. However, imperative steps taken one at a time usually do.
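
As an illustration of StepwiseRefinement (my own sketch with hypothetical function names, not something from the original discussion), here is a small C program whose top-level function reads like an outline and whose lower-level pieces can each be read, tested, or replaced in isolation:

 #include <stdio.h>

 /* Hypothetical report generator, refined step by step. */
 static double load_total(void)       { return 1234.56; }     /* stand-in for real input */
 static double apply_tax(double amt)  { return amt * 1.07; }  /* meaningful on its own   */
 static void   print_line(double amt) { printf("Total due: %.2f\n", amt); }

 static void produce_report(void) {
     double total = load_total();   /* step 1: gather data     */
     total = apply_tax(total);      /* step 2: transform it    */
     print_line(total);             /* step 3: emit the result */
 }

 int main(void) {
     produce_report();
     return 0;
 }

A comparable slicing of a Prolog rule or one large formula rarely yields pieces with stand-alone meaning, which is the grokking problem described above.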

The formula-like approach can almost be compared to working with a hologram rather than a clay model. Humans can generally relate to the clay model even if the model is broken or sliced into pieces (with the original context tracked), but not to a sliced hologram of the same object/scene. A sliced section of a hologram cannot be readily comprehended by most humans. (A couple of artists got the hang of simpler holograms and can carve their own with special "scratch brushes", but they are the exception, and it took practice[1].) A hologram is not the "wrong" model of a 3D scene, but it's not one whose construction and deconstruction humans can readily comprehend.

"One Big Formula" approaches tend not to be dissect-able in a way that most humans can relate to. Their useful granularity is stuck on too large a size. Maybe this can be solved, but nobody has done it yet. Thus, we stick with imperative. The large granularity often found in SQL is sometimes considered one of the SqlFlaws. Newer statements in the standard and/or suggested fixes tend to allow better granularity, but they also make it closer resemble imperative programming. Thus, the "solution" is to de-formula it.

The so-called VonNeumann-style languages remain because the alternatives keep failing in the market and in trials, NOT because we love machines and their history. The alternatives don't appear to match human WetWare.

Further, the alternatives have not appeared to significantly reduce code size even for those who master them, except for specialized uses or domains. Thus, they don't appear to provide any objective improvement over the imperative approach. (There are also techniques to reduce code size in imperative languages, but generally they harm readability and maintainability by most accounts.) -t

{Why do you assume that the cause of market failure is incompatibility with human wetware? It seems to me that you're trying to interpret the evidence to fit your theory, rather than attempting to find a theory to fit the evidence.}

That's a rather rude response. I gave evidence. If you have new or additional evidence, just present it, and then present your view of what it means.

{What did you offer that you believe was evidence of cause? Please be aware that "evidence of market failure" is completely different from "evidence that the cause of market failure was incompatibility with human wetware". So far, that's just your theory. You did say: "The alternatives don't appear to match human WetWare." But that isn't the same as evidence.}

Like I said, BASIC was a hardware-friendly language and was popular for a while, but programmers and shops went right back to block languages that had already been gaining traction. Thus, hardware-centric concerns don't appear to be a sufficiently powerful force to push choices. As far as an OfficialCertifiedDoubleBlindPeerReviewedPublishedStudy goes, I don't have one, and neither do you, so don't be a dick about that.

{I'm not understanding your logic. How does switching from one hardware friendly language to another lead to a rational conclusion that being hardware friendly isn't a big factor in language choice? I think most people would see the pattern: "people seem to choose hardware friendly languages" and call it evidence of the opposite conclusion.}

It wasn't a "switch". Read it again.

{How is the minor difference between 'switch' vs. 'went right back to' vs. 'eventually drifted back to' even relevant in context? Are you trying to suggest that compiled Algol-derived languages are less hardware friendly than interpreted BASIC? Nevermind, no reason to derail the conversation. Clearly, you don't have any evidence that rationally supports your pet theories.}

BASIC is more similar to machine language, and I believe most will agree even if you don't. And I did not say anything about "compiled".

{Comparing BASIC with C aptly demonstrates that 'similar to machine language' is very different from 'hardware-friendly' or addressing 'hardware concerns'. Is your argument even relevant to the topic?}

I'm not sure what your point is. Why C? Why not JavaScript?

{Programmers did not transition from BASIC to JavaScript. But even if they had, I could point out that the ability to eke performance out of JavaScript (via JIT, Emscripten, etc.) has been a very significant factor in its continuing success, though less so than other NetworkEffects. Meanwhile, fitting wetware is not something JavaScript does well (cf. http://zero.milosz.ca/). Do you somehow believe JavaScript is evidence in favor of your theory?}

I'm saying they went back to Algol-style/influenced languages, not JS specifically. And I'm generally considering flow control (nested blocks) and expression syntax, not type systems. And I don't know what eking out JS performance has to do with the topic. (JS's current success is largely a QwertySyndrome per browser standards, but that's another existing topic(s). And yes, JS has a shitty type system in my opinion.) The type systems of some Algol-style languages are close to the hardware (C) and some are not (ColdFusion). Being Algol-like does not imply a certain kind or style of type system (for root types).

From a theoretical point of view, all modern imperative programming languages are essentially indistinguishable. Their differences are almost entirely syntactic, plus some negligible semantic distinctions between their TypeSystems and related minutiae. C and ColdFusion have far more in common with each other -- and with the machines they run upon -- than either have with, say, Prolog and Haskell. The latter are semantically different from each other, different from the machines they run upon, and different from imperative programming languages which are collectively nothing more than a transparently-thin abstraction away from the underlying machine.

{(in reply to Top) The topic is "are current languages shaped BY hardware?" To me, this means: does hardware influence design and market selection of modern programming languages? Performance seems very relevant to this question. How are you reading the topic? Based on your comments, it seems you read: "are current languages shaped SIMILAR TO hardware?" That is a different question. Regardless, comparing a bunch of imperative languages (e.g. BASIC vs. JavaScript vs. ColdFusion) seems to offer a narrow view of the subject. Consider: every imperative language is both more hardware-like and hardware-friendly than Prolog.}

{The popularity of languages with incomprehensible type systems was mentioned only as evidence against your 'fits the wetware' hypothesis. It isn't clear to me that imperative programming or popular languages are a good fit for anybody's wetware.}

But the features/problems you listed are not specific to Algol influence, or at least not to what we are considering the AlgolStyle?. As a working assumption, how about we agree that "Algol-ness" here won't include the type system. WetWare-fitting of type systems can be a different topic.

{How about you provide evidence that supports your wetware-fitting hypothesis? All the evidence you've presented so far seems to contradict your hypothesis.}

Sorry, I don't see a contradiction. C's type system itself is certainly tuned for hardware performance, but that doesn't contradict anything I've stated. We have hardware-friendly AlgolFamily languages like C and not-so-hardware-friendly AlgolFamily languages like SmallTalk and ColdFusion, and many in-between. And I do agree that type systems are still fairly largely shaped by hardware concerns, but this is partly because the practical distinction between a human-friendlier type system and a compromise type system is small enough that speed perhaps overrides the minor human factor improvement. Plus, interfaces to other sub-systems, such as databases and OS API's, work better if we use machine-centric typing. There is kind of a QwertySyndrome in type systems per inter-language data sharing. The lesson here is that the industry will choose hardware-friendly if the human-side improvements of the alternative are small. I suppose one could argue that Prolog-like languages provide a slight human-side improvement at the expense of performance, and that is why the Prolog-style is mostly ignored. But, I doubt a large human-side improvement would be ignored. Orgs often complain about the cost of programming and would be happy to buy beefier hardware to reduce such cost if the trade-off was clear to them.


[1] One artist got the idea by studying the shiny streaks on newly waxed cars. He noticed that the streaks appeared to either "poke out" of the car, or appeared to be inside the car, depending on the curve of the wax marks and angle of the sun. He studied and learned the association patterns between the wax marks and the apparent visual location of shine streaks in 3D and applied this knowledge to draw 3D objects by brushing/scratching metal plates.


In languages in which data may be defined as machine words, like the CeeLanguage integer type, there is imho an obvious hardware influence. An integer is the same as the hardware word to which it maps. If it overflows, unexpected behaviour is likely. -- ChaunceyGardiner
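
A minimal C sketch of the point (my own illustration, not part of the original comment): a C int typically maps to a 32-bit machine word, and overflowing it does not promote the value to a bigger number the way a math-oriented language might; signed overflow is undefined behaviour, and on common hardware it silently wraps:

 #include <limits.h>
 #include <stdio.h>

 int main(void) {
     int n = INT_MAX;   /* typically the largest value of a 32-bit machine word */
     printf("INT_MAX     = %d\n", n);

     /* Signed overflow is undefined behaviour in C; on typical hardware it
        wraps to a large negative value instead of growing arbitrarily. */
     n = n + 1;
     printf("INT_MAX + 1 = %d (unexpected/undefined behaviour)\n", n);
     return 0;
 }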

As mentioned above, there are some AlgolFamily languages that have a type system that is close to the hardware and others that don't. However, this is orthogonal to the Algol-influenced style.


See also CanProgrammingBeLiberatedFromTheVonNeumannStyle, GreatLispWar


DecemberThirteen

