Entropic Law Of Complexity

The hypothesised EntropicLawOfComplexity -- all systems grow to be as complex as they can be, and no more.

This is very close to a statement in Gall's theory of GeneralSystemantics: all systems encroach. Not to mention some twenty or thirty variants of Murphy's Law.

Not quite. Isolated physical systems (those that can exchange neither heat nor matter with their surroundings) tend to increase their entropy to a maximum over time. In the case of a volume of gas, there's really not much more to this statement than that there are vastly more disordered than ordered states available.
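
A toy sketch makes that counting concrete. The model below is a pure illustration, not from this page: N idealised particles, each independently in the left or right half of a box. It compares the single "all on one side" microstate with the number of evenly-split microstates:

 # Count microstates for N particles in a two-sided box.
 from math import comb

 N = 100                       # number of gas particles (arbitrary)

 ordered = 1                   # one way to put all N on the left
 disordered = comb(N, N // 2)  # ways to split them evenly

 print("ordered microstates:   ", ordered)
 print("disordered microstates:", disordered)
 print("roughly 10^%d times as many" % (len(str(disordered)) - 1))

For N = 100 the even split is already favoured by a factor of about 10^29, which is why the gas "tends" toward disorder: it is simply overwhelmingly more likely to be found there.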

Corollary: the most complex a system can become is just beyond the ability of humans to understand it.

Why?

It may seem like a losing battle, but we fight entropy all the time. Perhaps the solution is to form a SelfOrganizingSoftware system. I wonder what that could mean...

Might not GeneticAlgorithms offer some of what you're talking about?
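
For the curious, here is a minimal GeneticAlgorithm sketch in Python. Everything in it is an illustrative assumption rather than anything from this page: the fitness function just counts one-bits (the classic "OneMax" toy problem), and the population size, genome length, and mutation rate are arbitrary choices.

 import random

 GENOME_LEN = 20       # bits per candidate solution (arbitrary)
 POP_SIZE = 30
 GENERATIONS = 100
 MUTATION_RATE = 0.01

 def fitness(genome):
     # Toy "OneMax" fitness: count the 1-bits. In a real system this
     # would measure how well a candidate solves the actual problem.
     return sum(genome)

 def crossover(a, b):
     # Single-point crossover: splice two parents at a random cut.
     cut = random.randrange(1, GENOME_LEN)
     return a[:cut] + b[cut:]

 def mutate(genome):
     # Flip each bit with a small probability.
     return [1 - bit if random.random() < MUTATION_RATE else bit
             for bit in genome]

 population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
               for _ in range(POP_SIZE)]

 for _ in range(GENERATIONS):
     # Keep the fitter half as parents, then breed a new generation.
     population.sort(key=fitness, reverse=True)
     parents = population[:POP_SIZE // 2]
     population = [mutate(crossover(random.choice(parents),
                                    random.choice(parents)))
                   for _ in range(POP_SIZE)]

 print("best fitness:", max(fitness(g) for g in population))

The connection to the discussion above: order (high fitness) emerges not because any single step fights entropy directly, but because selection keeps discarding the overwhelmingly more numerous low-fitness states.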


This discussion reminds me of one of the laws from GeraldWeinberg's QualitySoftwareManagement. The problem, I guess, with really seminal books is that many things frequently remind you of them. :)

Unfortunately my copy of the book is currently on the wrong side of the Atlantic. Or, at least, a different side of the Atlantic than me. But here's what I remember.

We attack a simple software project. We are successful. Our ambition rises; we look up to the next most complex software project. We tackle that one. And so on, we climb the scale until we try one too complex, and fail. This may be a little grim, and may be too much like the principle about rising to the level of our incompetence. But for me it captures a basic characteristic of development teams. -- BillBarnett

And so on, we climb the scale until we try one too complex, and fail.

At which point the PeterPrinciple applies?


See also EntropyReduction, OrganicSoftware

CategoryComplexity

