One of the best lessons I learnt from my first boss was: “when your code doesn’t behave as expected, don’t use the debugger, think.”
Can't entirely agree. I'm not a grizzled programmer, so sometimes when I'm coding for microcontrollers the debugger winds up teaching me about some new quirk in the architecture. It's often something I was aware of (endianness, some of the finer details of addressing) but had never actually dealt with firsthand.
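Endianness is a good concrete illustration of what I mean. Here's a minimal Python sketch (using the struct module, since it makes the byte order directly visible):

    import struct

    # The same 32-bit value packed with each explicit byte order:
    little = struct.pack('<I', 0x11223344)  # little-endian, as on most MCU cores
    big = struct.pack('>I', 0x11223344)     # big-endian

    print(little.hex())  # 44332211 -- least significant byte first in memory
    print(big.hex())     # 11223344 -- most significant byte first in memory

Staring at a raw memory view in the debugger and seeing the bytes come out "reversed" is exactly the dealt-with-it-firsthand moment I'm talking about.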
A poor developer uses his tools to cover up the symptoms. A good developer fixes the root cause. The ability of the good developer to fix the root cause quickly and efficiently is bounded by the quality of their tools. If you want to claim that a good developer will find it no matter what, fine, but don't tell me the developer with better tools won't be able to do it faster; that's almost the definition of "better tool". I find it very strange how people just write off using a debugger, ever, for any purpose, and think it marks them as a better developer, as if a carpenter who refuses to use a plane under any circumstances is thus a better carpenter for resisting the temptation of easily smoothed wood. (The fact that a plane is good for more than that is deliberately part of my point.) It isn't the best solution for every problem, and I certainly don't reach for it first, but when you need it, you need it.
The root cause of every problem is that I don't fully understand my code. A debugger can only show you what the code happens to do in one particular scenario. The fear is that people do just that, fixing only the obvious mistakes, as a substitute for static analysis and the kind of diligent review that builds a correct mental model of how the code would handle any scenario. It's like declaring a bridge sound because you successfully sent one truck across.
"Doctor, it hurts when I do this." "Don't do that."
The fact that other people can misuse a tool is a really stupid reason not to use it yourself. I've had great success using debuggers to identify which assumption I was wrong about when the code did something unexpected. If you're so awesome that you never make bad assumptions about your code... you're probably not pushing yourself hard enough. Perhaps that's your job's fault. I'm pushing myself to my limits in my job, and I'll take all the help I can get.
The problem isn't using a debugger, it's being overconfident in how much you can infer from any given debugging run. The solution is to have realistic levels of confidence, not to swear off the use of debuggers. Obviously, if you're trying to understand how the code works, having one empirical sample is better than having zero empirical samples.
If said developer can match his peers without using a debugger, then he'd do even better with one. A tool is just that: it can only assist you in performing a task.
I don't agree 100%, but I think debuggers are for code that you didn't write yourself and that has insufficient documentation.
When it comes to a dynamic language, knowing how and what led to a particular function being called is important, and all of that information isn't saved in the stack trace; you have to step through to find it.
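For instance, with Python's pdb (a minimal sketch; handler and greet are made-up names) you can stop inside a call and walk up the live stack, inspecting each frame's locals, which is the part a printed traceback throws away:

    import pdb

    def handler(callback):
        # Which function did we actually receive? The printed trace alone
        # won't say; break here and inspect the live frame.
        pdb.set_trace()
        callback()

    def greet():
        print("hello")

    handler(greet)

    # At the (Pdb) prompt:
    #   p callback   -> shows the object actually bound here: <function greet ...>
    #   u            -> moves up to the caller's frame to see how we got here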
The debugger vs no debugger "problem" has been discussed through and through countless times with little result. I don't think we'll reach new insight this time. People just work differently.
I agree. There are times when you need a debugger and times when you don't. Learning to recognize those situations is valuable, in my opinion.
Sometimes good old pen and paper is the best solution when you find yourself getting into a bind. Other times, stepping through the code and watching what happens to the data/memory is the way to go.
It depends on the type of bug. Sometimes it's just a simple quirk or something you've done incorrectly, like wrong pointer arithmetic or trying to write to a null pointer, or something along those lines (a quick sketch of that class of bug follows below).
However, a lot of the biggest headaches are conceptual, in which case I find that taking a break from the computer almost always helps.
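For the simple-quirk case, here's a rough Python analogue of the null-pointer dereference (a hypothetical example; re.search returning None stands in for the bad pointer):

    import re

    # re.search returns None when nothing matches; forgetting to check for
    # that is the Python cousin of dereferencing a null pointer.
    match = re.search(r'\d+', 'no digits here')
    value = match.group()  # AttributeError: 'NoneType' object has no attribute 'group'

A debugger (or even the traceback) lands you straight on the offending line; the conceptual bugs are the ones where that single stopping point tells you much less.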