Suppose we could compile and execute our code so fast that a given line of code runs the very instant after we press run. In that case, a well-placed printf executes at the same speed as typing print in our debugger. The benefit of the printf is that we are in effect scripting our debugger to print some data every time that line is reached, without having to type the command over and over.

For one thing, this makes it much easier to go backwards in the execution, since we are really rerunning the program from scratch each time. We can also much more easily emit a whole litany of prints, which we can then scan for the data we expect.

The debugger retains the advantage, first, in languages that lack strong compile-time reflection, where only the debugger can reliably print the data we want to see, and second, in the myriad cases where we cannot get the compile-and-execute time down to something acceptable. Debuggers are also useful when you have a core dump to analyze. Beyond that, I would consider the printf strictly superior.
Adjacent comment pretty much explained it. Sometimes it's printf, sometimes it's subtle tweaks to program behavior in strategic places, sometimes it's adding asserts (that might very well call functions not intended for public consumption).
I always start debugging by coming up with a theory about why the observed behavior occurred. The fastest way to validate that is generally one or two tiny changes to the program.
I only break out the debug tooling for the serious stuff that has me scratching my head in confusion: either because the behavior has me thinking "there's no way that's possible" (spoiler: almost always memory corruption), or because I'm segfaulting in a shared library I didn't compile, or because I'm well into the obvious badlands of concurrency and the like.
What are your primary languages? Even in 2025, C and C++ debuggers are light-years ahead of where they were 15 years ago. And the Java / DotNet debuggers are so good they're mind-blowing; plus, "memory corruption" isn't possible (as far as I know) in vanilla code running in a VM. I almost never bother with your techniques anymore.