Thursday, November 17, 2005

Why does a profiler almost always point out that your intuition is wrong?

Nine times out of ten, profiling reveals your code is not spending its time where you think it is. Why? Usually you don't start profiling until testing or a production environment reveals a performance problem, and most of the time after running a profiler in that situation I'm left thinking, "I never would have guessed that was going on!" The likely reason is that if I had thought of that particular set of circumstances earlier, I would have designed and coded for it.

This phenomenon is part of a larger problem facing the software developer: how do you know your software will function properly when your customer begins to use it? The truthful answer is that you don't, not really. You can derive a level of confidence based on how closely the customer's environment matches your test environment, but reality dictates that when you deploy a complex software system to a new customer environment, something is going to break. Maybe the failure will be due to some low-level network parameter, or perhaps the database needs a slightly different configuration because the RAID device in use is faster or slower than the one in the reference test system. The list of possibilities is virtually endless. It all comes back to the fact that your testing accounts only for the situations you thought of, and is valid only within the environment you controlled.

Does this mean testing is useless? Of course not. It simply means you need to be prepared to deal with unexpected failures. This is where debugging skills become critical, and knowing which tools to use can make all the difference. In these circumstances, you sometimes need to let your intuition take a back seat to the hard data that debugging and profiling tools can provide.
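To make the "let the data speak" point concrete, here is a minimal, hypothetical sketch using Python's standard cProfile module. The function names and sizes are invented for illustration; the point is only that the profiler's numbers, not intuition, identify where the time goes.

    import cProfile
    import pstats

    def find_matches(items, targets):
        # Looks harmless, but 'in' on a list is a linear scan,
        # so this is O(len(items) * len(targets)).
        return [t for t in targets if t in items]

    def run():
        items = list(range(20000))
        targets = list(range(0, 20000, 2))
        find_matches(items, targets)

    profiler = cProfile.Profile()
    profiler.enable()
    run()
    profiler.disable()

    # Sort by cumulative time and show the top few entries.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)

Run against code like this, the report typically shows nearly all of the time inside the membership tests in find_matches, which is rarely where a first guess would have pointed; the fix (using a set for items) only becomes obvious once the numbers are in front of you.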
