Digital Woes

As amply demonstrated by history, software has bugs. Software has bugs because it is a structure of logic, embodied in a machine that resides in a noisy, messy physical environment.

The process of producing software is error-prone, time-consuming, and expensive. Producing safety-critical or mission-critical software, if it is done right, is even more arduous, and even then the gains in reliability will be modest.

We are planning an enormous number of digital systems. Software is not the appropriate medium for all of these applications. In particular, using software for safety-critical applications is irresponsible; for mission-critical applications, we should be thinking long and hard about it.

Machines with software in them are intrinsically different from machines without software. What are wise uses for smart stuff?

-- LaurenRuthWiener


Does this mean that safety-critical applications should be implemented in hardware?

-- OleAndersen


Two comments: first, the line between hardware and software is certainly blurring. Look at a modern CPU chip and tell me it's intrinsically different from software. The key issue here, imho, is complexity. Computers can execute software of basically unlimited complexity at ridiculously low cost. For things that came before computers, you actually had to do something physical to get all those fiddly bits in there, which constrained the complexity significantly in practice.

Second, I consider our ability to make trustworthy systems to be one of the great intellectual challenges of this century, or perhaps forever. VernorVinge, in A DeepnessInTheSky, listed a number of FailedDreams?, which included real ArtificialIntelligence and also the ability to make bug-free software. These, in the world of the book, are simply beyond the reach of humanity.

I am enough of a naive optimist that I don't agree with Vinge. I, along with a number of OldSchool ComputerScientists such as EwDijkstra and TonyHoare, feel that the techniques of mathematics are powerful enough to fully (ok, fully enough for AllPracticalPurposes) characterize the systems we want to build. If we simply had the will to apply these techniques, we wouldn't have to put up with random software failures.
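
To make the EwDijkstra/TonyHoare point a little more concrete, here is a minimal sketch (my own illustration, not anything the posters above wrote) of stating a precondition, loop invariant, and postcondition as executable assertions in Python. Real formal methods go much further: they prove these properties hold for all inputs rather than merely checking them at run time.

    def integer_sqrt(n):
        """Largest integer r with r*r <= n."""
        assert isinstance(n, int) and n >= 0         # precondition
        r = 0
        while (r + 1) * (r + 1) <= n:                # invariant: r*r <= n
            r += 1
        assert r * r <= n < (r + 1) * (r + 1)        # postcondition
        return r

The optimists' claim, as I read it, is that specifications like these can be written for whole systems and discharged by proof rather than by testing; the open question on this page is whether we have the will (and the economics) to do it.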

The other approach that I see is an ecological one, where people deploy buggy systems and apply a range of higher-level techniques for minimizing the impact of the bugs on their lives. This approach leads to VirusScanners?, among many other things.

It's sure going to be interesting to see how this stuff resolves. -- RaphLevien

