Programmers Refuse To Use State Of The Art

And when they don't refuse to use state of the art tools, they refuse to learn what the StateOfTheArt is, or refuse to accept that the tools are state of the art. Even in this day and age, otherwise intelligent and open-minded programmers still ask for proof that AccessControlLists (contrast CapabilitySecurityModels) don't work, or propose that users just "make backups" to guard against errors.
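To make the contrast concrete, here is a minimal, purely illustrative Python sketch (all names are hypothetical, not any real OS API): an ACL system re-checks a user's identity against a permission table on every access, while a capability system hands out an unforgeable reference whose possession is the permission.

 # Hypothetical sketch of ACL vs. capability access; names invented for illustration.
 ACL = {"report.txt": {"alice": {"read"}, "bob": {"read", "write"}}}

 def acl_read(user, filename):
     # ACL style: ambient identity is checked against a global table on every call
     if "read" not in ACL.get(filename, {}).get(user, set()):
         raise PermissionError(user + " may not read " + filename)
     return "<contents of " + filename + ">"

 class ReadCapability:
     # Capability style: holding this object is the authority; no identity check
     def __init__(self, filename):
         self._filename = filename
     def read(self):
         return "<contents of " + self._filename + ">"

 alice_cap = ReadCapability("report.txt")   # granted by whoever had the authority
 print(acl_read("alice", "report.txt"))     # ACL path
 print(alice_cap.read())                    # capability path

The point of the sketch is only the shape of the two models: in the ACL version, authority lives in a global table keyed by identity; in the capability version, authority travels with the reference, so it can be delegated or withheld like any other value.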

So do woodworkers, doctors (film X-rays vs digital ones), police, etc, etc, etc. Tell us something useful, Richard

Digital X-rays have only been around a few years, and so far as I know they don't provide any qualitative improvements, just cost and logistics improvements. IOW, they don't affect doctors' ability to do their jobs. In contrast, capabilities have been around for 30 years, Smalltalk for 20, and TransparentPersistence is even older than Smalltalk (since ST has it), and they all directly affect programmers' ability to perform their jobs.

How long did it take for MRI to be widely adopted? And yet, it's an obscenely expensive technology which is probably going to be made completely obsolete by 3D ultrasound in a few years. In contrast, programmers refuse to adopt tools and practices which incur even slight perceived costs, regardless of their qualitative benefits, regardless even of TCO.

Programmers whine that they shouldn't be held to the standards of other professions like medicine because CS is a "fast changing" field. If they knew how fast medicine changes, they wouldn't be BSing like that.

Try a dental office. It seems like they have new equipment every time I have my teeth cleaned. A bit of an exaggeration, but not by that much.

My U.S. dentist invests in bleeding-edge technology, and has done so for ages, but he is not typical; early adopters are the ones who take the risks. Financially, it is more prudent to wait for early adopters to take the risks, and then make a judgment. The question is what is being maximized and minimized; it is literally impossible to simultaneously maximize the possible health benefits of a new and unproven technology, maximize doctor revenue, and minimize financial risks (e.g. from lawsuits, if the new technology is highly deleterious in even a statistically small number of cases). Since it is impossible to do all of the above simultaneously, it becomes a matter of individual doctors' personalities, risk/reward attraction/aversion, personal interests/goals, etc. For doctors who set their own course, naturally, as with U.S. doctors in private practice.

But this needs to be brought back around to programmers. -- DougMerritt


I think we have a right to ask for evidence instead of ArgumentFromAuthority. New does not necessarily mean better. We want proof it is something really better instead of a catchy-sounding idea that the boss read about in an airline magazine. Related: HowToSellGoldenHammers.

Yeah, dragging your feet for 30 years after something has been proven beyond the shadow of a doubt. Yeah, that's really "asking for evidence".

Perhaps an example of some things that have been "proven beyond the shadow of a doubt" would be in order. I suspect the "30 years" reference is an exaggeration, because as of today, 30 years ago was 1976. I'm a little in the dark as to what was suggested in 1976 that was clearly valuable, but has not gained acceptance.

The Cambridge CAP computer demonstrated a CapabilitySecurityModel both in hardware and software. The paper on it was presented to the ACM Symposium on Operating Systems Principles in 1977. http://portal.acm.org/citation.cfm?id=806541

The problem is that you bought into the Belief In A Just World. The BJW is a commonplace assumption about the world that's so fucked up it's difficult for me to imagine people buying into it, but they do. You believe that whatever happens (such as the horrible state of computer science) must happen for a Good Reason, but it doesn't. It's the same dynamic that says that if the US government is murdering a million Iraqis then it must be because they Really Deserve It, or at least that the US government is well-intentioned but is making an "error". The sooner you give up the BJW and see the world for the truly fucked up place it really is, the better off the world will be.

Perhaps it wasn't your intent, but I find the juxtaposition of the Cambridge CAP computer (and its failure to gain acceptance) and the WarInIraq? to be... disturbing. Certainly you're not equating the two--and are instead merely using the latter to make the point that the world is often irrational.

At any rate, why do you suppose the CAP didn't go anywhere? Killed off by IBM or other industrial behemoth who would be threatened? Solved problems that the commercial computing world didn't believe it had? Researchers abandoned the work? I'm curious.

Unfortunately I don't know computing hardware history that far back. Since the CAP was partly done in hardware, that may have been one reason why it was abandoned. Innovative hardware has an extremely poor track record (eg, the iAPX 432 chip that Intel built to support OO, later abandoned). Mainstream chipmakers have a really poor understanding of what it is they're creating and what the customer really needs.

It's not merely that the world is irrational. There is a very specific dynamic that underlies pretty much all these phenomena, and it has to do with power & authority. People in power say that Iraq has WMDs, people in power say that Saddam is a big threat, people in power say the bombs the US is dropping on village markets and sewage treatment plants are "precision" bombs, people in power say you need a car, people in power say PeakOil isn't a problem, people in power say climate change isn't a problem, people in power say Kyoto is useless (well, it is but anyways), people in power say coal and gas are a great thing, people in power say that you need to go to college to get a white collar job to get a big house, people in power (Mayor of NYC) say that niggers (oops, can't say nigger) of the TWU are "thugs", people in power say that suburbs is a great way of life, and yes, people in power also say that you should use Unix or Windows but not ever anything else. And do you know what's the number 1 thing that people in power say? That you can't create a world without power. And that should really get you thinking about the symbolic and propaganda significance of ACLs.

You find this association disturbing? Good, you should be disturbed. Not by my making the association, but by the fact that it exists in your brain. Your brain is a marvellous pattern recognition machine and it's made this association between Power here and Power there a long time ago. That's what you should be disturbed about, not the fact that I brought it into the light. -- RK

It's traditional to not use analogies to (or otherwise appeal to) sex, politics, religion, etc, because they are volatile subjects that tend to immediately add unnecessary disagreement and heat to a conversation. -- Doug

ClayShirky has an essay on social compromises (eg, politics) embedded in groupware. Multi-user OSes are groupware. The entire reason why I studied security was because I detested Unix' fascist user structure. I can't not appeal to politics when the subject is politics. The InteractionDesign aspects of security models is pure undiluted 200 proof politics. -- RK

In all seriousness one can make an argument that almost all fields of human endeavour are politics. The very fact that politics is nearly universal is all the more reason for us to distinguish different sub-areas of politics. If I'm discussing office politics with a co-worker, it truly is a change of subject if they start talking about the politics between France and Iran. It's not the same topic, even though they are both politics. Same thing here. Calling the Unix approach "fascist" is not a change of topic, that's just descriptive. Talking about Kyoto and NYC is at least a brief change of topic. There are other ways to make parallels. -- Doug

I don't intend them as changes of topic but merely as parallel examples to support a useful generalization of the one topic, politics of mainstream OSes. Now, if there are other ways to make parallels then I would think there would be other things to make parallels to. But what parallels can you draw on politics other than more widely known politics? Unless you mean something completely different. (Is there any reason why we don't have a public ReverseIpLookup for members of WikiWiki? For example, I forgot who igate is.) -- RK

Because Ward practices DoTheSimplestThingThatCouldPossiblyWork, aside from the more obvious point that people are posting without UserName cookies on purpose.

Last time you asked about igate I said "first letter S.... :-)"

SJ?

And I wasn't talking about having a script, but literally just a page, ReverseIpLookup, where people could put their IP/name pairs if they so wish. It wouldn't have to be as obtrusive as having your name plastered all over RecentChanges, and it would be more reliable.


On the other hand, some programmers refuse to have so-called "state of the art" stuff such as DotNet shoved down their throats by MicroSoft, because they don't like it.


Created in JuneZeroFive; renewed interest in JanuaryZeroSix.


It may very well be that the computing StateOfTheArt is fine and worthy of swift adoption, but until it is proven and accepted by the cadre of programmers and users, its use might safely and economically be avoided, especially since history has taught us that today's state of the art is tomorrow's obsolescence, and that it will most likely be SoonSuperceded? by something vastly superior or far more correct and applicable.

Science as an example:

The StateOfTheArt in Cosmology now isn't the BigBang, but something more like a forever expanding universe. In a decade or two, the StateOfTheArt will have moved on from what is currently acceptable and useful; it is bound to change from the present view. Science isn't, and shouldn't pretend to be, an authoritative establisher of fact, but rather the forever inquisitive explorer of what can be known, proven, or even SeeminglyPossible?. -- DonaldNoyes.20080801.1554.m06

Few scientists would claim science is an authoritative establisher of fact; science is about finding evidence for or against theories. It is scientific journalism (which is frequently neither) that is fond of turning a scientist's most tentative theories into breathless claims of stunning breakthroughs or "Solution Found At Last!"


It is my opinion that it's realistic that "new" candidate technology be tested:

before being considered. (For domains that have to be nimble, some of these may be skipped, but with the knowledge that there is more risk. Being nimble sometimes requires more risk.) -t


See DesignerFraud


A bigger problem is when the state of the art is a giant leap backwards, causing a net loss, such as Xml, or new, poorly implemented yet ubiquitous tools (like TFS instead of Subversion).


CategoryEvidence

