Failure Is Inevitable

FailureIsInevitable. That does not mean we should do nothing to try to prevent it (or, more accurately, delay it), but we should always remember that it is inevitable and be prepared, because failure will eventually reach the system, just as death will eventually reach all of us. If we are expecting it, we might be able to mitigate its consequences... if only for some time... or at least we can be psychologically prepared to accept it.


Is it helpful to worry that hard work might still lead to failure? Or that we may have problems that we don't know about?

Maybe, to some extent. But it might still be helpful to believe that hard work will make success more likely. And to be on the lookout for problems.

My experience has been that the problems that sink companies are well known to at least some of the people in the company. The problem is that their warnings are ignored. -- JeffGrigg

Maybe so (that in fact has been my experience too), but maybe that is because I see things from the perspective of an employee... let's try to see things from the perspective of the top guys: how many times were they told, possibly by many supposedly wise people, that what they were doing was going to lead them to catastrophic failure, in situations that ended up with them being successful? If a group of people is told 10000 times that they are going to fail, and they do not... can you really blame them for not listening to the 1 voice that is finally right? --LuxSpes

So the top guys at Enron had no idea that something fishy was going on? (That's what they said under oath! ;-)

The thing is that they probably "knew" there were lots of fishy things, and they failed to predict which one of them was going to lead them to catastrophic failure... in a company the size of Enron, there are probably many fishy things happening at the same time; some of them get fixed by human effort, some of them fix themselves, and some end up killing the company... the warning voices warn the top guys about many of them... how can they tell when the warning voices are right? Would you still believe the warning voices after you have survived their warnings without harm? Would you even be able to listen to the one voice that is right, in the middle of the noise produced by all the voices that are wrong? --LuxSpes

[The problem with Enron is that unethical people tend to hire unethical people. When the cheaters at the top believe they are the only ones cheating, then the house of cards soon falls. A corporation is only as good as its leadership.]

Is that so? Maybe we should ask ourselves first how much power the leadership of a company actually has... Enron, for example, can be seen as a really big team, and if you have n people on your team, there are (n^2-n)/2 communication paths in your team. That means the effort of coordinating something the size of Enron is huge, maybe plainly intractable; it just works... for a while... Since it is so big, there is no way each one of the "team voices" can effectively communicate with each one of the others... so a complex system like Enron has to create specialized sub-teams, and each of these sub-teams will only deal with (and see) a part of the big picture... This, of course, will work for a while, but then the little mistakes in each of the isolated sub-teams will accumulate, until sooner or later their combined toxic effect poisons the whole system... Are we really right to believe that Enron "leadership" should have been able to prevent its own demise? Or is that only WishfulThinking of the kind we humans typically use to deceive ourselves into believing we are in control of our environment when in fact we are not?
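The quadratic growth of coordination cost mentioned above can be sketched in a few lines (a minimal illustration; the sample team sizes are made up for this sketch, not taken from the page):

```python
def communication_paths(n):
    """Number of distinct pairwise communication paths in a team of n people:
    (n^2 - n) / 2, i.e. n choose 2."""
    return (n * n - n) // 2

# Paths explode quadratically as the "team" grows; the last figure is
# merely on the order of a large corporation's headcount, as an illustration.
for n in [2, 10, 100, 20000]:
    print(n, communication_paths(n))
# 2 1
# 10 45
# 100 4950
# 20000 199990000
```

A pair of people adds one path, but each new member adds a path to every existing member, which is why full mutual communication stops scaling long before a company reaches Enron's size.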

It is certainly easier to say that the leadership of Enron was ethically corrupt than to admit that maybe it is impossible to prevent corruption in systems of that size; that sometimes corruption does not lead to the immediate demise of the system; that probably their failure is just a normal part of their life-cycle, and we should prepare ourselves for their eventual failure instead of believing we can always do something to prevent it. That does not mean we should do nothing to try to prevent failure, but we should also be prepared, because failure will eventually reach the system, just as death will eventually reach all of us.

Since FailureIsInevitable, maybe what we should do is prevent systems from becoming so big that, when they fail, they put the very survival of humanity (or a large part of it) at risk; we need to become AntiFragile.


Take, for example, the failure of BlackBerry: was it really the case that they were "unethical"? Or perhaps "uncreative"? Or was it that they were just plain too big and complex to change?


See TheCemeteryOfUnknowns, FailFast, FaultTolerance, FaultIsolation, MakeFailureImpossible


EditText of this page (last edited January 22, 2014) or FindPage with title or text search