Post Seventies Idea Slump

Have ideas stagnated since the 70's? By the 70's we had:

Most new ideas are just rehashes or refinements of these. Nothing really new has come since. So get out your plaid pants and afro wigs and shake your booty.

Methinks you're going to play the game called 'NoTrueScotsman' regarding what counts as a 'NewIdea?'. There is no idea that is not built upon or derived from other ideas, and it is impossible to express a new idea except in terms of old and better-understood ideas. I see, below, that many of your bold counters rely upon the notion that the idea has antecedents. To be frank, so does everything on your list. Put away your afro wig and plaid pants... put on a toga. -- Any true Scotsman knows that it's not an afro, it's a SeeYouJimmyHat?. :) By the way, I'm not in bold to SHOUT, just to distinguish myself from the person who was posting in italic. (More shortly, including hopefully some ViolentAgreement, the other WikiCliche? we can't go without.)


No, ideas have not ceased:

Most of the above are about hardware advances. The topic was meant to be about software engineering; I reworked the intro to make this clearer. Although it is true there have been new programming languages, none of them have any significant new ideas in them. At best, they are a good packaging of pre-known ideas.

The advances above are advances in "ideas"; hardware does not operate in a vacuum, it requires software to function. Software exists to make hardware "work". The advances behind the items above are primarily advances in Human/Machine interfacing. Software not only operates using algorithms and processes, it also operates by responding to "events". What makes most of the above ideas work is software handling events through interfaces to move data as directed by the user while the user's computer is running the software. If this is defined as merely good packaging of pre-known ideas, and not the implementation of new ideas, then what old, pre-known activity can be defined as doing the "packaging"? -- DonaldNoyes

InformalHistoryOfProgrammingIdeas suggests "patterns" and UML are revolutionary. Patterns are merely an attempt to classify certain coding idioms. Giving names to something does not by itself make it new. (Plus, some people, such as PaulGraham, feel that patterns are signs of a lacking language or paradigm.) And UML diagrams have similarities to ideas that have existed for decades; UML is merely an attempt at standardization.

I think that social processes as well as meta-topics around software-engineering might be new ideas.

Some might disqualify these as they are deemed not formal and fundamental enough, but then one could easily go the opposite way and exclude most of the list at the top and say that only electricity and math are formal enough.


Perhaps the post-70s period seems to suffer an "idea slump" because we're still too close to it to identify the ideas that are becoming, or will become, influential.

And also because contemplated ideas may take years, even centuries, to become implemented ideas, and because the implementation of the idea may take on facets and twists not contemplated fully by the many original ideas which become part of its realization. A perfect illustration of this can be seen in the ideas and sketches of LeonardoDaVinci regarding manned flight and submersible ships. An IdeaImplemented? is most often a conglomeration of more than one IdeaConceived?. --DonaldNoyes

But I have not seen any decent candidates even.

Only because you reject candidates that build upon, derive from, or advance ideas from pre-70s. You could do the same for pre-1870s. Every idea has antecedents.

Perhaps it could be said that 40's-to-70's is when the key software engineering ideas we use today were identified, described, and recognized as fairly distinct and powerful ideas. Darwinian evolution was indeed hinted at many times before Darwin, but he "opened the book" on it. Same with Dr. Codd (relational).

[Very true. The ElderDays established the essential foundations of computing. The "no, ideas have not ceased" list is one of relatively narrow iterative refinement and commercial application of ElderDays foundations -- or outright obviousness. There is not an item in that list that does not rely on ElderDays foundations, and there are few (if any) ideas in the list that are equal in generality or significance to any ElderDays foundations.]

[Every industry goes through a "Golden Age" of significant research, invention, and innovation, where genuinely new territory is explored. In the automotive industry, for example, its ElderDays were roughly from the late 1800s to the late 1930s. Virtually everything currently found in a modern car was either theoretically or practically explored at that time. Apparently-modern innovations like airbags, anti-lock braking, emissions controls, electronic fuel injection, lightweight materials, hybrid power, etc., are all iterative refinements to pre-existing foundational work, and only appear to be significant to those unfamiliar with the history of the foundations that made them possible. Current automotive "innovation" is little more than applications of basic ideas, the majority of which are nearly a century old. Similarly, modern computing is little more than application of the basic ideas established in the ElderDays.]

By this reasoning there have been no significant ideas in math and physics for hundreds of years. I think you have to refine your criterion for 'significance' somewhat, and you will realize that it's not as easy as you think. -- .gz

[There is a clear and obvious distinction between fundamental theoretical research and trivial technological development. In physics, for example, no one would confuse the development of (say) string theories with the invention of a better mouse trap. Yet, above, we have clear "mouse trap" entries like CascadingStyleSheets, RemoteProcedureCall, and ReallySimpleSyndication. None of these required extensive research or any significant intellectual effort. They might be somewhat clever applications of existing technology, but that does not make them important in any foundational sense. In scientific terms, they are trivial. This is not meant to diminish their industrial significance, but to regard CascadingStyleSheets as on par with (say) the RelationalModel is as ludicrous as treating the discovery of general & special relativity as on par with developing a slightly smaller portable MP3 player, or treating the invention of the internal combustion engine as equivalent to making a better windshield wiper. In terms of its pervasive foundational significance, there is an order of magnitude difference between the work done in the ElderDays and the bag of gadgets listed above. Admittedly, there are some fuzzy areas -- like the "advances in TypeTheory" -- but let's not insult the true innovators and researchers in computing by making the ludicrous claim that (say) ReallySimpleSyndication is somehow a significant idea, by any meaning of the term "significant".]

Only if you disregard the millions of computer users who daily employ the implementation of this idea (RSS). You might consider usage of idea implementations as one measure of significance. Success of an idea demonstrated by its widespread use is in my view significant. It is obvious to me that significance of ideas is based on ItDepends on who you are, and the value system you wish to use in determining what is significant, and what is "trivial". -- DonaldNoyes 20070712

I imagine that many of what you consider 'weak' ideas like CascadingStyleSheets will be derived from and built upon to a far greater extent. After all, CascadingStyleSheets is simply an implementation of an even greater idea: separation of content and presentation. If it can be done for 2D text, it can be done for 3D Virtual Reality objects and by Display Agents -- CascadingStyleSheets, or some derivative thereof, will become pervasive to the point of being part of every HumanComputerInterface? you care to name. Many ideas of the 70s have borne fruit that is visible today, and a great many more have fallen by the wayside and been forgotten except in computer lore (UseNet, anyone?). You can expect the same of many ideas of the 80s, 90s, and today -- some will advance beyond what you currently imagine, and others will fade away. Heck, it isn't too late for old 70s ideas to fade away. I expect that the RelationalModel will be surpassed by ideas already present, including RDF and much of what is listed in WhatIsData and KnowLedge... if only because computer agents (the future of 'Web3.0') must know what a 'tuple' means by context if they are to perform any sort of learning or DataMining across vast stores of information.
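The separation-of-content-and-presentation idea that CascadingStyleSheets implements can be shown in miniature (a minimal sketch; the markup, class name, and style rule are all invented for illustration):

```
<!-- content (HTML): says what the text IS, not how it should look -->
<p class="warning">Disk nearly full.</p>

/* presentation (CSS, in a separate stylesheet): decides how a "warning" looks */
.warning { color: red; font-weight: bold; }
```

Swap in a different stylesheet and every page restyles itself without the content changing -- the same move the paragraph above imagines generalizing to 3D worlds.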

The author of the bracketed argument above also grossly underestimates the relative 'intellectual effort' and 'ideas' that went into practical implementations of such things as RemoteProcedureCalls?, CascadingStyleSheets, and ReallySimpleSyndication. Fact is, a ballpark notion of where you want to be is just one idea... it takes finding and implementing detailed ideas to get there... ideas on how to combine one idea with another.

Modern TypeTheory will probably gain real use no earlier than twenty years down the line. Users of languages evolve far slower than the languages and language theory currently does, and code-base inertia prevents any rapid change. However, I expect that the LanguageOfTheFuture will let you use any damn syntax you please (from Befunge to Occam, and even graphical programming like SmallTalk) and will be tightly integrated with the OS of the future. Look at LanguageOfTheFuture and NewOsFeatures if you want a list of 'ideas' a long, long way from implementation. Just glancing at one -- the ExoKernel -- shows an idea that's been around since circa 1994 but not yet in common use that very well might be the future design-path of all OperatingSystems. And there are wild ideas, too, like KillMutableState. We don't know where all of them are going, and which will turn out to be duds... and which will disappear for fifty years only to appear again when another, newer, idea makes it practical.

[KillMutableState was well implied at least as far back as 1967, by D. L. Childs in "Description of a Set-Theoretic Data Structure". I don't deprecate the notion of CSS or the broader principle it represents. I simply reject the notion that it is anything new or innovative, or that it represents the starting point of separating data and presentation. Such notions have been implicit best practice for decades, with early browsers and HTML actually being a step backward in terms of separating data and presentation.]

[You young'ns who genuinely believe the above list represents worthwhile, significant, theoretical innovations would be wise to review some history, starting with reading some of the classic papers in computer science. You'll be surprised to see how little is new, and how much is simply old wine in new bottles.] A new bottle IS a new idea. An idea is just a way of looking at, packaging, or combining other ideas, after all.

[As for RDF, or the content of WhatIsData or KnowLedge having any significance at all, let alone surpassing the RelationalModel... I'll believe it when any of said content becomes the basis of a working multi-terabyte OLTP database or it's echoed in a Knuth volume. Until then, I'm not holding my breath.]

[Imagine that RSS magically disappeared tomorrow morning. Would the computing world grind to a halt? No. A few million users would be slightly inconvenienced by not receiving their daily force-feed of SlashDot posts or whatever, but computing as a whole would be unaffected and Something Else would be easily devised to replace RSS. The same goes for every item on the list above: take it away and there might be inconvenience, but that's it. Now imagine that the B-Tree algorithm -- an ElderDays invention from 1971 -- magically disappeared tomorrow morning. Furthermore, imagine that the B-Tree algorithm is magically replaced with something similar, but with slightly poorer performance on any criterion you like. Though this is obviously wild hypothesising, it is quite likely that the computing world would grind to a halt. Magically replace RSS with something similar but with poorer performance? I doubt anyone would even notice. That's why the B-Tree is a foundational ElderDays product, while RSS is a trivial and uninteresting grain of sand on the vast computing landscape. In ten years, I bet no one will even remember it.]

Imagine if the very idea of communicating just disa-

[Tee hee... :) ]

{RSS, XML, and CSS are commonly used because they are an industry standard, not because they are revolutionary. There are plenty of things I would change about CSS if given a choice. It is almost like listing the QWERTY keyboard as revolutionary because it is common. Being a standard does not count for much by itself in the context of this topic. By the way, some argue that Lisp EssExpressions would be better or equal to XML. Much of the popularity and utility of the web is driven by standardization. But none of those standards are fresh ideas.}
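As an aside on the EssExpressions point: the same toy record in XML and as a Lisp s-expression (element and field names invented for illustration) shows how closely the two notations mirror each other:

```
<!-- XML -->
<book><title>Example</title><year>1970</year></book>

;; the same labeled tree as an s-expression
(book (title "Example") (year 1970))
```

Both describe the same labeled tree; the argument is that XML's angle-bracket syntax adds ceremony, not structure that s-expressions lacked decades earlier.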

I, too, would change many things about CSS... but how, exactly, would I change them? I've put perhaps fifty hours of study into how one might go about using a variation on the idea behind CSS to automatically transform the artistic styles of virtual worlds (e.g. between cartoony, gritty-realistic, surrealistic like Salvador Dali, etc.). In particular, my goal was to allow user-constructed characters to traverse from one user-constructed 'world' to another and automatically adapt to that world's art style, along with all the items the user carries, and also to reduce rework with regard to the models used for constructing the worlds (e.g. so a chapel-model would also transform between worlds). This was 'inspired' by seeing the use of CSS. However, while I have an idea of what I want and several ideas for approaches, I've not yet been able to find what I'd consider a satisfactory approach to get what I want; at best, I've proven that it will require attaching a great deal of semantic information as hooks into model data, and that model-data must be expressed as constraints on the output (and animation) rather than exacting point data, i.e. so the model can be recognized as the same (or at least as a unique model in entirely surrealistic settings) from one world to another despite otherwise significant changes. Using those constraints intelligently, however? That's where the semantic information comes in (I want an object/man-made/furniture/table with an object/natural/plant/vine 'growing around' (or 'engraved upon' or 'engraved into') the table's base, and various other features and constraints...), but making use of that semantic information, expressing constraints, and performing transformations on the constraints/features is difficult, and I still lack ideas for it.

How, exactly, would YOU change CSS? And to what effect? Or are you out of ideas the moment that question is asked? Ideas, ideas, ideas... it takes a lot of ideas to get from an idea of what you want to an idea of how to get there. As for XML, I'd rather skip it entirely and use a typed language for data expression that handles both macro and functional expansion... and possibly syntax extension. You always need an understood language to initiate communication, but the language can self-extend during communication. Just recycle a language for macro expansion that handles the whole of KolmogorovComplexity and be done with it!

I didn't refer to the gadget list. My list would look more like this (from my not-so-small references repository): ... just ask for more -- .gz

I'll add wait-free atomic containers (heaps, lists, sets, maps, etc.). Those will be damn important when data is distributed widely on nodes across a network... since waiting on a lock held by a node that stops talking is ridiculous and insane. Oh, and I'll add all the newer theories regarding Network Survivability (which constitutes far more than failure tolerance... it constitutes resistance to -attack- and -natural disaster-) and Disruption Tolerant Networks. Many of those ideas will be necessary even in software engineering and protocols of the future, since overlay networks will become more and more common.

[My complaint was with the "ideas have not ceased" list. Your list is far more worthy than any that considers WebCams and Blogs to be examples of fertile idea-smithing and productive intellectual toil. Some day, portions of your list might become as influential and foundational as the work done in the ElderDays.]

Time as Judge

I reject your position. This page isn't: PostSeventiesWorthyIdeaSlump?. And you are not the correct judge as to whether a particular idea is worthy... only society and time will determine whether an idea sees use or becomes influential. And the vast majority of ideas cannot be foundational. WebCams and Blogs could just as easily influence future ideas (including the manner in which (associated or similar) software is engineered) as those from TypeTheory or any given model or approach to computation or information storage or processing.

It took about 15 or 20 years before most of the listed ideas were clear shiners. Thus if this pattern continues, 80s ideas should be starting to be revered by now. But they are not. Was the 80s just a coincidental gap? (I know I try to forget that decade :-)

Maybe not coincidental. I imagine that the 80s was a period of idea assimilation (that, and the microcomputer was the new wonder of the day), and the 90s were loaded with ideas regarding how to use the tool newly available to the majority of humanity: TheInternet. Both times were loaded with ideas and derivative ideas, making practical some of the concepts merely fancied in the seventies and earlier. Your dismissal of the intellectual effort that went into such products, and of their influence today, is in error, but such ideas are harder to point at or put in a list. The devil is in the details when you're the one implementing them.

[My "dismissal", as you call it, is not one of error vs. non-error, but simply a reiteration of an observation of evolution in an enduring industry. See, for example, http://lambda-the-ultimate.org/node/2059

The term "Elder Days" or "Golden Age" or whatever exists because it is commonly recognised that the computer industry, like many other industries, goes through distinct phases:

If computing follows the pattern of the automotive industry -- which has arguably reached stasis -- it will reach a point of negligible new development, where minor tweaks are heralded as breakthroughs by marketing departments, but no one (outside of Marketing or naive observers) would claim the ideas are, well, ideas. Of course, some theoretical innovation or discovery (to a limited degree, work on these always continues) may be sufficiently revolutionary to spark a new "Elder Days", and thus renew the cycle.

It must be emphasised that I do not deprecate any of these phases, nor do I attempt to put one above another in some fashion; I merely wish to highlight the fact that there is a PostSeventiesIdeaSlump (end of "Elder Days"), but it has been balanced by a PostSeventiesImplementationBoom (post-"Elder Days"). That boom is what the "ideas have not ceased" list is about, and there is clearly a qualitative difference between the nature of the ideas spawned in the ElderDays vs. most of the "ideas" since. While it's difficult to articulate precisely what makes the development of (say) the B-Tree or Prolog different from developing a Web camera or CSS in terms of "idea-ness", that difference is unquestionably there.]

"but no one (outside of Marketing or naive observers) would claim the ideas are, well, ideas." -- So you're saying that those who argue with you are either in Marketing or are naive observers? How very kind of you.

There are, of course, qualitative differences in how the ideas are applied and among those who recognize the ideas. However, I'm not all that convinced there is a qualitative difference in the development of the ideas, or their idea-ness. Nor am I convinced that the world of computation science has entered a stasis, at least among fields involving automated theorem proving and type theory, network survivability, and disruption tolerance. It seems to have entered a new phase of pre-implementation work on OS design, compiler optimizations, and HCI, as people try to get past what are currently rather stagnant forms of the same (i.e. there's a lot of talk in these fields about what ought to be done, but little actual change).

As far as phases go... development of new ideas slows down when building atop older ideas only because it takes people a long time to master the old ideas. There's a lot of educational territory to cover before you reach the frontier. What you seem to be looking for aren't new ideas, but new 'revolutionary breakthroughs' -- ideas (e.g. models and theories) that, by themselves, open entirely new fields (new frontiers) of study, even if they aren't at all practical until someone starts advancing them through the more normal evolutionary development.


Ancient Quotes (perhaps 600 BC):

Not so ancient quote (perhaps 2004):


Related:


JulyZeroSeven


CategoryHistory


EditText of this page (last edited November 5, 2014) or FindPage with title or text search