Forget The Debugger

A process cannot be understood by stopping it. Understanding must move with the flow of the process, must join and flow with it. -- F. Herbert (The First Law of Mentat in "Dune") See YouCantUnderstandaProcessByStoppingIt.

One might want to distinguish different types of flows and processes. Anyone who has debugged a system with an ICE (in-circuit emulator) knows the aphorism above isn't literally true.


Testing is one of the primary XP activities. It is first and foremost a part of the coding microcycle (test/code/refactor), and UnitTests are its most visible expression; but it extends well beyond that. For some reason, though, lots of people resist testing; let's face it, no one likes designing or writing tests.
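
For illustration only, a minimal sketch of the kind of test the microcycle starts with, in JUnit 3 style; the Money class and its add method are hypothetical, not from any particular codebase:

 import junit.framework.TestCase;

 // Written first, this test fails until Money.add() is implemented,
 // then stays around as a regression check after every refactoring.
 // Assumes Money overrides equals() so assertEquals can compare values.
 public class MoneyTest extends TestCase {
     public void testSimpleAddition() {
         Money five = new Money(5, "USD");
         Money seven = new Money(7, "USD");
         assertEquals(new Money(12, "USD"), five.add(seven));
     }
 }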

You need a convincing pitch to get people TestInfected.

Therefore,

Tell your colleagues you've forgotten what a debugger looks like. Ask developers how much of their time they spend debugging, as opposed to actually coding. Ask managers how confident they feel when giving a product demo; ask them how often they hear "um, we're still working on this bug". Compare with your own experiences using UnitTests.

You may still use a debugger for legitimate purposes (e.g. listening to your code more closely when a test fails, or before refactoring), but you might want to do so covertly at first.

Then,

Introduce the notion of AcceptanceTests. These work as functional tests (telling you when you're done), regression tests (telling you when you introduced a new bug) and acceptance tests (helping get that most important customer's signature on acceptance letters).
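
A hedged sketch of what such a test might look like when expressed as code, driving the system from the outside the way a customer story describes it; OrderSystem, startInMemory, placeOrder and statusOf are hypothetical names used only for illustration:

 import junit.framework.TestCase;

 // An AcceptanceTest exercises the system through its public interface,
 // so it doubles as a functional test and a regression test.
 public class PlaceOrderAcceptanceTest extends TestCase {
     public void testCustomerCanPlaceAnOrderAndSeeItAccepted() {
         OrderSystem system = OrderSystem.startInMemory();   // hypothetical in-memory fixture
         String orderId = system.placeOrder("customer-42", "widget", 3);
         assertEquals("ACCEPTED", system.statusOf(orderId));
     }
 }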

More generally, insist that TestabilityEnsuresFeedback?, not just in coding but in most goal-oriented activities.

How is this relevant? Debuggers are for finding where bugs occur; unit tests are for making bugs manifest themselves early. Sometimes good unit tests make debugging sessions easier, but when a test fails, it is not the test's job to show exactly where the bug is.

Rarely does a debugger show where the problem is either, unless the language is C/C++ and the problem is a smashed stack, overwritten memory, or some other pointer mess.

The debugger can show that there is a pointer mess, but it cannot (in general) show what caused the pointer mess. This is what Purify was written for. -- MatthewFarwell


Discussion

Maybe my immune system is fighting TestInfectedness. I've been making good use of UnitTests for about 3 months. I'm still trying to learn CodeTestFirst? and to work in very small units of work. I still find the Debugger useful to help me determine why a test is failing. -- DavidCorbin

Yeah. The primary reason I'm not missing the debugger much is that I work primarily with server-side code running under an application server; I generally can't step through code that is being exercised by an acceptance test or put breakpoints in it.

Never accept that you can't step through or use the debugger. There is almost always a creative way to do it. I use the debugger on a lot of server-side code. -- DavidCorbin

Oh, sure. I've done it too. There's always an entry point; I've used OptimizeIt on the same code that I'm saying I can't use a debugger on. In this particular instance though, I consciously decided to go with the XP wisdom that you reserve your creative faculties for those problems which demand it. I made a conscious decision to ForgetTheDebugger.


The tactic I adopted before finding out about UnitTests was to instrument the problematic portions of the code with System.out.println(expr) calls and figure out whether expr had the expected value, and if not, why. This tactic carried over to UnitTests well enough that I still favor it over stepping through code with a debugger, though I do occasionally find myself missing the debugger.
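
By way of illustration, a small sketch of the two forms of the tactic side by side; Order, Customer and discountFor are hypothetical, and the test is JUnit 3 style:

 import junit.framework.TestCase;

 public class DiscountTest extends TestCase {
     public void testGoldCustomerGetsTenPercentDiscount() {
         Order order = new Order(100.00);
         Customer gold = Customer.gold();
         // The old tactic: print the value and eyeball it once.
         System.out.println("discount = " + order.discountFor(gold));
         // The UnitTest version: assert the expected value, so the check re-runs forever.
         assertEquals(10.00, order.discountFor(gold), 0.001);
     }
 }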

That would seem to take a long time to me, compared to using a debugger. -- DavidCorbin

Not necessarily. It takes about four seconds to type System.out.println. With experience, it might take almost as little time to get to a breakpoint in an appropriate location. Generally, though, once you're there, what you're interested in is why a particular expression doesn't have the value you expect, and with either method it takes a few iterations through the cycle to find that out.

The advantage of a debugger is that it will show every value, not just the one (you think) you're interested in.

But far be it from me to be a bigot about such issues. The goal is to ListenToTheCode; how you go about achieving it is a matter of preference, local conditions, previous experience, and which side of the bed you got out of that morning.


I recently used the debugger in DolphinSmalltalk to learn about its classes, and that was a delight. Just possibly I'd been afraid to go back to using a debugger because I had such bad memories from the time when I had to use it to actually hunt bugs. -- LaurentBossavit

It can be a really nice tool for investigating how a language/library works. -- DavidCorbin

I stopped using debuggers when I switched from C to Java a few years back. The Java debuggers then were not worth the trouble of learning, so I relied on System.err.println(). I later thought about learning a debugger, but since I started developing test-first I haven't even been using System.err as much as before. But I will start using a debugger some day. -- AndersBengtsson

In my experience, stepping through code and automated UnitTests both illustrate what the code is doing. The big benefit of automated UnitTests is that they keep on giving that benefit long after the debugging session is over. -- RichardBash


Torvalds' Rant

See http://lwn.net/2000/0914/a/lt-debugger.php3

Oh, what a juicy rant! Linus gets my vote for GrumpiestProgrammerInTheWorld?! -- IanOsgood

Linus seems to dislike debuggers for different reasons than this page promotes. He doesn't want linux kernel coding to be easy, or done quickly, which is a rather anti-XP attitude. Although a kernel (which many people depend on) is very different than your average standalone application... it probably makes sense to not EmbraceChange when maintaining a kernel. :-) Personally, I'm generally pro-XP and pro-debugger. A good debugger is IMHO the best tool to use to become familiar with a new library or a new language. But I can see how having robust UnitTests can reduce the need for debugging of any sort. -- DougWay

The applicable point of Linus's great comments (and of UncleBob's below) is the following generalization, which my experience suggests is true: people who rely heavily on a debugger WHILE PROGRAMMING tend not to understand the higher-level structure of their code. I am still steaming over quite a few ProgrammersWhoWorkInDebugMode? who said they were "done" but whose code worked in one or two cases and failed _miserably_ in _all_ the other cases. Watch for this effect. Try to program without a debugger to ensure you create a logical high-level design. You will just KnowItWorks?. -- DaveEaton

I think that Linus described the reason behind YouCantUnderstandaProcessByStoppingIt: with a debugger you understand the code only from step to step, from line to line, not the tangle of cooperating threads as a whole (what Linus calls 'one level up'), which is the important thing. -- GunnarZarncke


I develop software for a 16MHz processor (software debugging would be awfully slow), with a debugging system that doesn't always break on the code where you place a breakpoint. We use a lot of interrupts, so sometimes breaking on a breakpoint will prevent things from happening that normally would have (due to timing)... I've found at this point that I do a lot less debugging than I ever did on the PC.

And it's amazing how annoying bugs in the compiler can be :-) I've had one recently that performs unsigned comparison on signed numbers that only occurs when optimisations are turned on (rendering tracing completely useless ;). -- MichaelAbbott


All programs are broken (have bugs); debuggers are programs, so...

There is a real issue here, however. Many debuggers have (sometimes nonobvious) dependencies on the software environment, and on certain software conventions being observed.

That's true, but it wasn't the issue I was alluding to. IMHO the practice of relying on a debugger is a broken practice. I have met teams whose first reaction to anything is to fire up the debugger instead of thinking. I once watched a developer painstakingly set up breakpoints and carefully step through the code spending fifteen or twenty minutes only to reach a line of code that he already knew was broken. He just couldn't shake his reliance on the debugger. That reliance had turned into a significant liability.

-- UncleBob

Also from UncleBob, "Debuggers are a wasteful Timesink": http://www.artima.com/weblogs/viewpost.jsp?thread=23476


Unit tests only detect problems at the unit level. Most interesting problems are at the system level. For example, in one stress test CPU usage was very high, which caused the socket buffer not to be read, which caused an overflow, which caused the retransmission of some messages, which caused messages to arrive out of order, which caused a higher-level protocol to shut off some services it shouldn't have. No reasonable unit test would ever have caught this. These absolute positions strike me as amazingly childish.

Is this a threading issue in the socket buffer? Either way, why wouldn't a test that exposes this in the socket buffer be a reasonable unit test?
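
For what it's worth, a sketch of how such a test might look, assuming the socket-reading code were factored behind a small buffer abstraction so the overflow path can be driven without a loaded CPU or a real network; BoundedBuffer and its methods are hypothetical:

 import junit.framework.TestCase;

 // Drives the overflow path directly: a stalled reader, a full buffer,
 // and a writer that must be told its message was dropped.
 public class BoundedBufferOverflowTest extends TestCase {
     public void testWriterSeesOverflowWhenReaderStalls() {
         BoundedBuffer buffer = new BoundedBuffer(4);   // capacity of four messages
         for (int i = 0; i < 4; i++) {
             assertTrue(buffer.offer("msg-" + i));      // fits while the reader is stalled
         }
         assertFalse(buffer.offer("msg-4"));            // fifth message is reported as dropped
         assertEquals(1, buffer.overflowCount());       // and counted, so retransmission logic can be tested too
     }
 }

That doesn't exercise the whole cascade, of course, but it does pin down the piece that started it.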


The debugger, UnitTests, printf statements, assertions, contracts, etc., are all tools we employ to write better software. Each has its place. For someone to suggest that we shouldn't use a debugger is like suggesting that we shouldn't use a hammer while building a house. Sure, we could use a good heavy wrench to drive nails, but hammers are far more suited to the task.

I do agree, though, that debuggers frequently aren't the best tool for the job. If a debugger is a hammer, it's great at driving nails but less so at cutting wood. However, the more tools available, the better.


The most useful thing I find in debugging is not the debugger, but logging. Using a debugger when you're developing is great, and easy. But when you're trying to figure out what's happening on a production system you can't attach a debugger to, or a problem that actually happened last week, a debugger is no help.

If, however, while debugging, you've added sensible logging (very hard to do), then maybe you can debug after the fact. Don't forget the debugger, but you can leave logging there for someone who comes after. -- MatthewFarwell
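
A sketch of what "sensible logging" might look like in Java, using the standard java.util.logging API; the OrderProcessor class and its method are hypothetical:

 import java.util.logging.Level;
 import java.util.logging.Logger;

 // Record what the code decided and why, at a level you can leave
 // switched on in production and read back a week later.
 public class OrderProcessor {
     private static final Logger LOG = Logger.getLogger(OrderProcessor.class.getName());

     public void process(String orderId, double amount) {
         LOG.info("processing order " + orderId + " amount=" + amount);
         if (amount <= 0) {
             LOG.log(Level.WARNING, "rejecting order " + orderId + ": non-positive amount " + amount);
             return;
         }
         // ... real work would go here ...
         LOG.fine("order " + orderId + " accepted");    // detail level, off by default
     }
 }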


See UseTheDebugger, IfIdeVendorsConcentratedOnTestRigsInsteadOfDebuggers


AdoptingXpPatternLanguage


CategoryPattern CategoryExtremeProgramming CategoryDebugging

