The management failures at Fukushima and both Space Shuttle disasters suggest a flaw in our management structures: the reward/blame culture set in place during "normal" operations may become counterproductive during high-risk situations and disasters. In bureaucracies, one is often taught to downplay, deflect, or hide mistakes because not making waves takes priority over honesty and creativity. That habit apparently doesn't go away during crunch time.
In the Columbia disaster, there were hints of tile problems while the shuttle was still in orbit. The manager was presented with the option of requesting that spy satellites image the underside of the orbiter. She turned it down, probably because making such a request would have been an embarrassing admission that the mission wasn't going perfectly, and because complacency led them to believe that chipped tiles were "normal and expected": prior missions had survived with chipped and missing tiles. They also believed that little could be done even if major damage were found. But it turned out that some risk mitigation was possible, had the engineers been asked to think about it hard enough. (Everyone in IT probably has WarStories of management complacency regarding file backups and security risks.)
The Japanese seemed to beat themselves up over their own culture for the management failures during the nuclear plant meltdowns, but the problem is not unique to Japan.
CategoryManagement, CategoryRisk?