Software Geniuses

... would be a proper subset of RespectedSoftwareExperts -

Well, in a perfect world maybe. On Wiki, the HumbleRefactorer works to make a WabiSabi world a little better ...


When RespectedSoftwareExperts was linked from WorldGeniuses, I was a bit surprised that anyone thought that old page was really relevant to the subject of WorldGeniuses. I had created it for a bit of a knockabout with sg, which led to an interesting interaction with AlistairCockburn on the maturity of the software industry.

My sense of the ridiculous was heightened further by reading the following by RogerPenrose. A phrase in ShadowsOfTheMind intended to caution physicists about possible arrogance was interpreted by one rather sensitive reviewer, DrewMcDermott of the Department of Computer Science, Yale University, as an attack on the AI community. Unfortunately for McDermott, that wasn't exactly what Penrose had in mind:

A new theory [unifying quantum mechanics and general relativity] is needed quite independently of any necessity for new laws to describe a universe that can support consciousness. However, physicists themselves often get carried away into thinking that they know everything that is needed - in principle, at least - for the behaviour of all things of relevance. There is a curious irony, here, in McDermott's quoting from Shadows p.373 "It is only the arrogance of the present age that leads so many to believe that we now know all the basic principles that can underlie all the subtleties of biological action." For he takes that remark to be aimed primarily at the AI community. In fact, the people I had primarily in mind were the (theoretical) physicists. I do not blame the biologists, or even AI researchers, when they take from the physicists a picture of the world commonly claimed to be almost final - bar some technical details that are irrelevant for the behaviour of macroscopic objects. But perhaps McDermott is right; some AI researchers seem to be nearly as arrogant as high-energy physicists (and with far less reason) - especially those AI researchers who claim that the deepest mystery of the physical world can be answered without any reference to the actual laws that govern that world!

from http://psyche.cs.monash.edu.au/v2/psyche-2-23-penrose.html, section 11.8

RogerPenrose is well known for his disbelief in AI. And given that he's known to have tried to invent new physics to undermine the AI community, I would also have interpreted that remark as aimed at them. In fact, Penrose is quite underhanded in his attacks on AI, using popular physics books as vehicles for his unsupported opinions and hare-brained speculation. So it is understandable that AI researchers, who do not have his stature, would feel defensive about their field when Penrose starts blasting away without being precise about his target. In any case, it is hypocritical of RogerPenrose to lecture anyone about arrogance.

Penrose's opinions are not unsupported. -- JasonGrossman


Candidates

0 = not a genius in any sense of the word, 9 = bona fide genius

Proposed voting guideline: anyone should feel free to add their votes on any of these characters. The wider the discrepancies of opinion, the more interesting this could get. We'll show the average once we have over five votes per person.

I just turned the list into a WikiWeightedVote. One of the strongest votes you can record is to remove all the ballot boxes. But I would be interested in any lobbying below on this subject (including any additional candidates).
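For what it's worth, a minimal sketch of the tallying rule proposed above, assuming a plain average over 0-9 votes that is only shown once a candidate has more than five votes (the function name and the sample ballots are my own; this is not the actual WikiWeightedVote mechanism):

 def average_rating(votes):
     """Votes are integers from 0 (not a genius) to 9 (bona fide genius).
     Return the mean only once a candidate has more than five votes."""
     if len(votes) <= 5:
         return None  # too few votes to publish an average yet
     return sum(votes) / len(votes)

 # Example: six hypothetical votes for one candidate.
 print(average_rating([9, 8, 7, 9, 6, 8]))  # -> 7.8333...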


Lobbying

I know someone who was an undergraduate student with RichardStallman at MIT. He said that if RichardStallman had stayed in physics, he would almost certainly have won the Nobel Prize, and it was a shame he got side-tracked into software, where he had to work with people not nearly so smart and ended up tilting at windmills.

I have heard a similar comment from a (now) mathematician who was in undergrad maths with Stallman: watching Stallman as an undergrad had made him realize (much later) what it would take to be a first-class mathematician.


Thanks for adding Stallman. We should have thought of him to start with. Has he only been tilting at windmills? I'd say that his voice has been really vital to the debate on the relationship between software and law (and money), even if I don't buy all his ideas. So I've added a second vote/opinion on the Stallman genius rating.

Maybe Stallman's lobbyists should point to the software work that qualifies as "genius level".

Maybe being the author of emacs, a SoftwareMasterpiece, is sufficient? Then again, perhaps not.

Are theoretical breakthroughs required to qualify?

His recognition that software is an idea, a form, and yet still only a number, and that these are mutually mutable, is a revolutionary insight in its own right. The FreeSoftware movement must be a consequence of this insight, not its cause. -- MartinSpamer
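To make that "idea, form, number" point concrete, here is a small Python illustration of my own (not anything Stallman or MartinSpamer wrote): a program's text, its bytes, and a single integer carry the same information and convert freely into one another.

 source = 'print("hello, world")'             # the program as an idea written down

 as_bytes = source.encode("utf-8")            # the same program as a sequence of bytes
 as_number = int.from_bytes(as_bytes, "big")  # and as one (very large) integer

 # The mapping is reversible: the number alone recovers the runnable program.
 width = (as_number.bit_length() + 7) // 8
 recovered = as_number.to_bytes(width, "big").decode("utf-8")
 assert recovered == source
 exec(recovered)                              # prints: hello, world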

Or is it his founding/leadership of the FreeSoftware movement that earns him the genius title?


Does the fact that JohnVonNeumann made the biggest blunder in the history of physics, setting quantum cosmology back 6 decades (and still counting as far as non-experts are concerned), work against him? If it does not, then is it on the basis that blunders don't count or work in non-software doesn't count? If the latter, then what has he done in the field of software?

Does automata theory count, or is it simply not useful enough?

What blunder? And how can a scientist set back a whole field of science? I guess he didn't force others to do what he said. If his ideas weren't useful, but it took 6 decades to find that out, then they were pretty close, don't you think?

Philosophically speaking, in science any falsifiable suggestion that fits the data is a useful thing to have lying about. The fact that the ideas were wrong is not the important thing, it's that that question was asked: "What if the universe were like this?"

There was no way to know a priori that the ideas were wrong. Every physicist who suggests anything risks sending the community off on a "wild goose chase" that ends with the ideas being proved wrong. It's the proving them wrong that makes for scientific progress. -- KatieLucas

His suggestion didn't fit the data, it wasn't good enough and there were a priori ways to know that his ideas were wrong. Unfortunately, his ideas also fit prevailing prejudices of the time and his enormous reputation prevented anyone from even thinking about double-checking his reasoning.

Von Neumann came up with a "proof" that the universe was non-deterministic. Unfortunately, that so-called proof was a piece of shit that assumed exactly what it was concluding. It took decades for anyone to bother double-checking von Neumann's bolstering of the common wisdom. It took many more decades to come up with conclusive proof that von Neumann's position wasn't merely false but ludicrous.

Von Neumann set back understanding of QuantumPhysics by at least half a century. The only one who did more damage was NielsBohr. Of course, Bohr also did it intentionally whereas Neumann only did it by mistake. -- RK

I'm curious why you end up sticking Bohr with all the blame. Usually the Copenhagen interpretation is attributed to him and Heisenberg, and it didn't seem clear to me which one derailed the model.

Derailed the model? The model was fucked from the get go. Heisenberg has his own list of sins, chief of which is the Uncertainty Principle. But anyways, one or the other of them had a massive reputation at the time and more or less single-handedly pushed through the fantastical Copenhagen gibberish onto a stupefied physics community. My guess is Bohr though I can't remember precisely.

That would've definitely been Bohr. I suppose I was thinking more of its development than acceptance, where I think Heisenberg had a greater role, both in developing the math (which was ok, hence derailed) and in coming up with the interpretation. So, fair enough, and thanks.


How could ArchimedesOfSyracuse be forgotten? His achievements were unmatched until the Renaissance, 1600 years after his death. Archimedes anticipated calculus and integration, using them to discover and develop the first algorithms for calculating pi, the area under a curve, and the volumes of geometric shapes. -- MartinSpamer
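For concreteness, a short Python sketch of the polygon-doubling method usually attributed to Archimedes (my reconstruction, not his wording): the half-perimeters of inscribed and circumscribed polygons squeeze pi from both sides as the number of sides doubles.

 import math

 def archimedes_pi(doublings=4):
     """Bound pi between the half-perimeters of inscribed and circumscribed
     regular polygons on a unit circle, starting from hexagons and doubling
     the number of sides each step."""
     a = 4 * math.sqrt(3)  # perimeter of the circumscribed hexagon
     b = 6.0               # perimeter of the inscribed hexagon
     for _ in range(doublings):
         a = 2 * a * b / (a + b)  # harmonic mean -> circumscribed 2n-gon
         b = math.sqrt(a * b)     # geometric mean -> inscribed 2n-gon
     return b / 2, a / 2          # lower and upper bounds on pi

 # Four doublings reach the 96-gon Archimedes used: roughly 3.1410 < pi < 3.1427,
 # bracketing his bounds of 3 10/71 and 3 1/7.
 print(archimedes_pi(4))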

Most problems in mathematics are solved more or less algorithmically. That doesn't make all mathematics into software.


I notice that LinusTorvalds was omitted. I submit that, while he may not be another DonaldKnuth, there is certainly genius at work here. Without it, I am certain that Linux would have been just another bright idea that fizzled.


GordonLetwin is retired now, but in addition to architecting OS/2 he has written some of the tightest firmware, software, languages, and OS code that I've ever seen. This guy was an idea factory with fast hands.

