Greg Margolin:
While working on one of the first e-commerce internet systems, back before the dot-com boom, I remember a conversation with my development counterpart Ken. We were talking about software quality, and specifically about the need for performance testing. With his characteristic dry humor, Ken quoted a sign he had seen at his dentist’s office -- “If you don’t like to floss, floss at least those teeth that you intend to keep.”

I have thought about this expression many times since. By now, everybody knows that you have to do performance testing, and everybody says that they do it all the time. But there are probably more anti-patterns than proper patterns in this domain. To enumerate just a few -- performance testing done only right before deployment, testing only after major outages, testing only when the growth in traffic takes on threatening dimensions, assigning performance testing to relatively junior engineers, getting excited about scripting while neglecting analysis, not giving proper time to developing workload/user simulation models, and so on.

New open source and off-the-shelf performance testing tools put performance testing within reach of anybody and everybody. But being able to do it and doing it right are two different stories. And, as usual, real, meaningful performance testing very often ends up in the bin of technical debt -- something to take care of one day, once tasks of higher priority are accomplished. The price we pay for such an approach is staggering and keeps growing. Alas, we just keep kicking the can down the road. So what would be a proper approach?
At GQP, our recommended approach to performance testing is through Integrated Quality™, something I demonstrate in today’s instructional video -- it is about 40 minutes long. If you watched David’s previous videos (the last two videos on the page), you should be all set as far as the environment is concerned. In the course of this video I will show you how to drive JMeter functional and performance tests from Eclipse, using Maven and Cucumber, and writing feature/story step definition scripts in Groovy.

I believe there is a direct advantage in such an approach -- instead of making testing tool-centric, we make it design-specification-centric… literally in plain old English. The specification sits in front of the whole team -- stakeholders, Dev, QA, and Ops -- and the behavior-driven model forces the team to make an explicit decision about the performance requirements it is willing to live with. The next logical step would be to include such performance tests in a continuous integration/delivery pipeline, so that performance is evaluated with each and every build/release.

As far as what to test, I refer you back to my friend Ken’s quote -- if you don’t test the performance of all your projects, at least test those that you intend to keep. And now, click here for today’s instructional video -- it is the one at the top. Enjoy!
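For readers who want a concrete feel for the approach before watching, here is a minimal sketch of a Cucumber step definition script in Groovy that runs a JMeter test plan headless and asserts on the results. It assumes the cucumber-groovy bindings are on the classpath and the jmeter binary is on the PATH; the scenario text, file paths, the “users” property, and the percentile threshold are illustrative assumptions, not the exact code from the video.

```groovy
// A minimal sketch, not the exact code from the video: a Cucumber step
// definition script in Groovy that runs a JMeter test plan in non-GUI mode
// and asserts on the results.
//
// The feature it backs might read:
//   Scenario: Checkout stays fast under load
//     Given the "checkout" JMeter test plan
//     When I run it with 50 concurrent users
//     Then the 95th percentile response time is under 800 ms

this.metaClass.mixin(cucumber.api.groovy.EN)   // cucumber-groovy bindings (assumed on the classpath)

def testPlan
def samples = []

Given(~/^the "([^"]*)" JMeter test plan$/) { String name ->
    testPlan = new File("src/test/jmeter/${name}.jmx")   // assumed project layout
    assert testPlan.exists()
}

When(~/^I run it with (\d+) concurrent users$/) { int users ->
    def results = File.createTempFile('results', '.jtl')
    // Run JMeter in non-GUI mode; -J passes a property the test plan is assumed
    // to read for its thread-group size. Assumes 'jmeter' is on the PATH
    // (launching it through the jmeter-maven-plugin is an alternative).
    def proc = ['jmeter', '-n', '-t', testPlan.path, '-l', results.path, "-Jusers=${users}"].execute()
    proc.waitFor()
    assert proc.exitValue() == 0

    // Default JTL output is CSV with a header row; 'elapsed' holds the sample
    // response time in milliseconds.
    def lines = results.readLines()
    def elapsedIdx = lines.head().split(',').findIndexOf { it == 'elapsed' }
    samples = lines.tail().collect { it.split(',')[elapsedIdx] as long }
}

Then(~/^the 95th percentile response time is under (\d+) ms$/) { long limit ->
    def sorted = samples.sort()
    def p95 = sorted[(int) Math.ceil(sorted.size() * 0.95) - 1]
    assert p95 < limit
}
```

Because the threshold lives in the scenario text rather than in the tool, changing the performance requirement is a visible, reviewable decision by the whole team -- which is the point of making the test specification-centric.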
posted by: Greg Margolin