Pretext: Researchers have discovered a trigger that leads to urban decay (Wilson/Kelling, Atlantic Monthly, March '82). Once a window in a building is broken and left unrepaired, the building starts to go downhill rapidly. A car can be left on a street for a week, but break one of its windows, and it will be gutted in hours.
Note however that modern science is very sceptical of this concept. See http://en.wikipedia.org/wiki/Fixing_Broken_Windows -- MartinHaecker?
Context: Software seems to be subject to entropy. Things that used to work stop working. Ideas that seemed good at the time, seem poor 3 months later. Interfaces get ugly. It's easy to find yourself surrounded by code that has niggling things wrong with it, or that's just plain bad.
The problem: When you're surrounded by ugly things, your attitude and outlook change. You become pessimistic, and your expectations are lowered. Soon you start accepting that "that's just how things are." You start producing ugly things yourself.
Therefore: Don't live with broken windows. Fix things when you see them. Refactor when you can. If you can't make the change right then, put some kind of flag in place (like a FixmeComment) to at least acknowledge that you've recognized a problem, and to tell people that someone cares and is on top of it. Stop rot while it's isolated.
The Flipside: People respect well-maintained things. Beautifully restored vintage cars are treated with reverence. People take off their shoes when entering well-cared-for houses. A good way to get other developers to treat your systems with respect is to keep them looking cared for.
This is similar to saying RefactorMercilessly and make all test cases run all the time. But it's more general.
There's more to fixing broken windows than refactoring mercilessly. I think it's about maintaining a certain minimum standard. When I was learning C, we all used pre-ANSI compilers. The example code we were given produced warnings, and nobody advised us on how to configure the compiler's warning levels. When we did the undergraduate compiler coursework, I spent a long time sifting through the warnings and errors I got from lint, trying to work out which messages were "real errors" and which were kludge reports. Things got bad because that was the norm.
When I taught C, one of my first slides (entitled "how to cope") told the students how to turn on all the compiler checking. It told them that this was mandatory (i.e. skip it and you're on your own). This forces beginners especially to think about their code. After all, it's much easier to get a clean compile than to debug the aftermath of a dozen little hacks.
yes. Teach us (I speak as a student) to always compile with all warnings. Teach us to use lint. Teach us to keep the compile clean. But go one step further, too. One of the most valuable assignments I was given in my software engineering course was to take a truly ugly pile of code (code that passed all tests, but was nonetheless frighteningly poorly factored), and to bang on it until it passed lint. Besides the inherent value in learning how to repair code, the pure pain of this assignment was a wonderful lesson: "Don't let it get this bad in the first place." --AdamBerger
Shortly after I started as a professional programmer, another programmer was telling me about his relief at being moved from a maintenance nightmare of a project to a brand-new one. He told me of the months he had spent teasing out pieces of information from commentless SpaghettiCode, and that while he had written a few documents about what he had found, it was still an unwieldy, mysterious mess. After he described a particularly horrifying example, I asked him if he had fixed it. He explained that his mandate was fixing a specific list of bugs, not going around cleaning up the code. Furthermore, any code he touched would need to be retested. It made some sense at the time (and I didn't know any better), but the attitude grated on me. He also told me that the company would have been happier if he had done a complete rewrite, provided it didn't slow him down, but he wasn't going to stick his neck out. While his reasons have some validity, further interactions with him showed that his programming philosophy was anything but extreme. This helped convince me to FixBrokenWindows.
I'm in an easier situation than that: I'm working on a new system, so I do my best to keep any windows from breaking in the first place, and to see to it that our windows are well secured and not brittle.
Nonetheless, I'm not fond of the heroics the position above implies. If I couldn't convince my employer - or the GoldOwner in general - that fixing broken windows is worth the effort, I wouldn't force it on them. Rather, I'd consider quitting.
I had a captive job several years ago (it was a brief experiment with captivity -- blech!) for a firm that makes automotive exhaust gas analyzers. These guys had a product in place that met the California BAR 80-something specification and wanted to bring it up to BAR 90. They were using an off-the-shelf x86 PC as the compute engine and user interface. The product code was compiled with Lattice C of 1988 vintage. I was hoping to convince these guys to go with Borland, so I tried compiling the mess. 400 errors and 1200 warnings later, I saw that there was a small problem. I asked for and received permission to attempt to fix all the broken windows in the product code. Two weeks later there were no outright compiler errors that I could not explain, but I still had over 800 warnings. My boss told me to drop it.
Now, am I going to just plow ahead on a beater like that? Sorry, some windows are going to remain broken. I get paid to produce "results." These results are measured by people who stare out of broken windows all day long. -- MartySchrader
Had the product been "ok" in the first place, it would have saved you (and your company) two weeks of your time. It probably would have been cheaper to do it "right" then than to fix it "now". Assuming this assertion holds, the argument applies recursively to the decision your boss faced.
When we're talking about fixing broken windows, that's probably the wrong analogy for a big mess like that: It sounds like you attempted a major refurbishment of a derelict building :) Would it - in retrospect - not have made more sense to clean up each file as you work on it? You refactor in small pieces, and while you gain the benefits only incrementally, your work is also spread out. All the usual benefits of small work units apply: you keep track of what you are doing, it's a safer way of doing it, and if the project gets canned halfway through you've only lost half the effort. Oh, and your satisfaction of working in a nicer environment is spread, too -- BurkhardKloss
Sorry, folks. I guess I didn't clearly describe the situation. The code I was attempting to bring into the (at that time) 20th century was for a working product. It would compile and link with minimum fuss on the ancient tool they were using. The code was filled with stuff that any compiler with a grain of decency would consider errors; the warnings alone would make one go screaming off into the night. I never even got the chance to try refactoring anything. All of the work I put into that junk was just to remove straightforward bad coding practices so that the code would compile.
It clearly was not worth the effort to try pursuing this venture, so I went along with my boss and dropped back to the ancient tool set. <sigh> Ya gotta pick 'em. This was not one of the "ones." -- MS
The Atlantic Monthly article is at http://www.theatlantic.com/politics/crime/windows.htm
Whoops, that 1982 article may be feel-good public policy, but it may not have much of a theoretical leg to stand on, says this article in the Chronicle of Higher Education: http://www.chronicle.com/free/v47/i22/22a01401.htm
It probably makes poorer public policy than programming practice, and perhaps the difference is one of scale and diversity. One possible problem with the "broken windows" theory in criminal justice is that cities are large and varied, so their inhabitants have vastly different values. To pick one example, property owners in poor neighborhoods often see graffiti as vandalism, but younger residents often see it as an improvement. Many public-nuisance laws -- loitering, noise laws, graffiti, etc. -- can exclude a certain part of the population based on divergent values, and it's those laws that "broken windows" proponents are often trying to enforce heavily.
But consider, by way of contrast, a group of programmers. Their dedication to clean, maintainable code might vary from person to person, but it's extremely unlikely that the values are divergent. Sure, some programmers are bad at guarding against bad code. But they won't actually tell you "I like crufty, unreadable code, so there."
What programmers are you working with? Most of the ones I've met would say almost exactly that. Of course, I work with C++ so it comes with the territory. -- BrianRobinson
I agree with Brian; it's not that programmers don't want clean code, they just have vastly different ideas of what constitutes clean code. Coming from the Microsoft world myself, I find that most programmers think duplicating code is the proper way. For example, I actually had an argument with a guy who claimed cut and paste was superior due to the extra redundancy it provided; then he proceeded to explain how this was like the redundant systems in a battleship, as though that were obvious proof of his technique. What can you do? I laughed and gave up the argument; I've got better things to do with my time. -- RamonLeon
He's right, you know. It is a redundant system and it produces quite robust stuff: when you fix a bug, its backup comes on line ;) -- SethKlein
You could say, then, that FixBrokenWindows works best when the goal is shared, but commitment to that goal varies from person to person.
For the social aspect of the "broken window" theory, see also TheTippingPoint book.
Regarding compiling with warnings and fixing all of the warnings: what if you are given a library (set of headers, broken runtime environment, etc) that fails to run warning-free in and of itself? Example: when using the STL that ships with MSVC++ 6.0, the compiler generates many warnings about "unreferenced formal parameters" and "variable may have been used without being initialized." These are within the STL headers themselves, and articles from Microsoft even admit to the fact that they were never designed to compile cleanly with all the warnings on.
It might be pedantic, but it's still frustrating to get pagefuls of heinous STL warning messages for a batch of clean code that compiles silently under GCC.
I've seen some company specs that attempt to "solve" this problem by telling their coders to disable these warnings with a #pragma command so that it compiles cleanly, as code has to compile cleanly to conform to their ISO 9000 coding standard. Blech.
An actual honest-to-goodness CompilerBug makes it difficult to even #pragma out all of the STL-related warnings (C4786) that VC6 emits. The workaround is to make a "compiler.h" the first #include in each CPP file and put the #pragma there. Before the workaround was discovered, this was an interesting illustration of how FixBrokenWindows operates in reverse: with tons of STL-related warnings you know you can't do anything about, how likely is it that you will find the courage to fix all the other warnings?
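A sketch of that workaround header (the file name "compiler.h" follows the description above; layout details are my assumption):

```cpp
// compiler.h -- must be the FIRST #include in every CPP file, because the
// VC6 bug can reintroduce warning C4786 if STL headers are seen before the
// pragma takes effect.
#ifndef COMPILER_H
#define COMPILER_H

#ifdef _MSC_VER
// C4786: "identifier was truncated to '255' characters in the debug information"
#pragma warning(disable : 4786)
#endif

#endif // COMPILER_H
```

The #ifdef _MSC_VER guard keeps the pragma from cluttering builds with other compilers, which would otherwise warn about the unknown pragma and break a window of their own.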
Windows tend to get broken when you slice a system horizontally. See SliceSystemsVertically for more.
People often try to FixBrokenWindows on Wiki by doing a little WikiSpringCleaning. Improving the pattern, programming, computing, etc content that has been around for far too long without changes is also something to consider. Without FixBrokenWindows, Wiki may die of entropy.
The more broken windows you allow to accumulate, the more you will have to fight against people using the logic of the BrokenWindowFallacy when you finally try to fix something.
Besides WikiSpringCleaning by deleting pages, can I ask some of you who are highly knowledgeable and calm people to consider: