Tuesday, August 26, 2008

Performance Testing Confusion

Among the huge number of things I have little knowledge of is performance testing. Interestingly, I currently find myself in a position to observe the development and execution of a performance test. I'll concede that the parties I'm using as models are probably not of the highest caliber, but you use what you have. Anyway, these are some of the observations I've made:
  • It requires a special skill
  • Strategy documents are superior to check lists
  • Results don't need to be in context
  • Execution only needs to be outlined
  • Data should be someone else's problem
I've never encountered a document that could replace the convenience and ease of reference of a checklist when the information being tracked is growing and evolving at each step. Consequently, I disagree with the notion that strategy documents are superior.

No contextualization of the results? Kidding... surely? A summary may be sufficient, but then a summary should actually summarize the explicitly rendered results. Wait, though. There are more graphs than text. That must count, right? I mean, each has a useful description like "the transactions per second graph shows the number of transactions per second"... epic.
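To illustrate the point about raw graphs: a transactions-per-second figure is trivial to produce, and on its own it says almost nothing. Here is a minimal sketch (using entirely hypothetical timestamps) that computes TPS the way such a graph would, with comments on the context the number still lacks:

```python
# Minimal sketch with hypothetical data: computing a transactions-per-second
# series is trivial; the graph of it is not, by itself, a result.
from collections import Counter

# Hypothetical transaction completion times (seconds since test start).
completions = [0.2, 0.5, 0.9, 1.1, 1.4, 2.3, 2.6, 2.7, 2.9, 3.8]

# Bucket completions into one-second windows and count each window.
per_second = Counter(int(t) for t in completions)
tps = {second: count for second, count in sorted(per_second.items())}
print(tps)  # {0: 3, 1: 2, 2: 4, 3: 1}

# These numbers only mean something alongside context: the load that was
# applied, the baseline run they are compared against, and the target the
# system was expected to meet. That context is the summary's job.
```

The computation is the easy part; the captioning discipline ("what load, against what baseline, against what target") is what turns the graph into a result.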

Baseline, load and soak will be run. That is an execution plan? Sure. Short and to the point. What, though, will happen as time gets shorter and problems refuse to be resolved? Contingencies? Eerie.

Data obviously can't belong to someone else. Surreal.

Skills. My current view is... that I need more input here, although it's probably QED.
