Stormy Seas Brighton Beach by Les Chatfield
No, it is not. But it is in an exciting, promising, and potentially fateful transition. In this post I will try to outline the scope of both the problem and the opportunity, and will start charting a course (as we at GQP, LLC see it) for navigating the uncertain waters of today's software quality environment.
One can say that this is the worst of times and the best of times for software quality assurance, both as a functional discipline and as a profession. On the one hand, as IT and the software that powers it move to the core of every business, the importance of, and the attention given to, the quality of these systems have never been higher. As reported, President Obama spent many a meeting asking and worrying about the quality of Healthcare.gov. That is our Commander-in-Chief, taking time out of his busy schedule for the issue of software quality. Outages, security breaches, malfunctions, and everything else the popular media charitably calls software glitches make the front pages of newspapers and web sites. At the same time, the business and craft of software development is going through its own revolution, and great new technologies, methods, and practices have emerged. To name just a few: agile development, agile testing, continuous integration, devops, and continuous delivery. Great open source testing tools and frameworks like JMeter, Cucumber, Jenkins, and many others have been developed, improved, and popularized. Automated testing and continuous integration have become part and parcel of the software development life cycle. Then what is lacking in this paradise?
To begin with, we are very far from paradise. The fact that the Chief Executive of the country spends time knocking on wood while listening to reports on key IT systems, or that a major retailer changes its CEO over a cybersecurity breach, shows that for all the advances in software development, IT systems remain very vulnerable and often fragile. It is no secret that for years the main focus of rapid software development has been a race to deliver new and exciting features to end users. Agile techniques turned out to be the right fit for a quick turnaround in producing new features: product owners and developers who adopted agile could release new products really fast, and continuous integration and devops made frequent updates possible. But all these advances came at a price. Simplistic application of the agile methodology led many organizations into quick death marches and mini-waterfall cycles instead of real agile sprints. Core system improvements that had to compete for resources with new features would invariably lose in priority and be relegated to the “we gotta do it one day” category, contributing to ever-growing technical debt.
The move to a new QA organizational culture -- away from standard, centralized groups and toward a Google-like "developer-in-test" model -- also came at a price. There is no doubt that the move to automated testing as part of continuous integration is a great practice. Likewise, the slogan that quality should be everybody’s business is also very true. Nevertheless, there is a lot to be said for the deep domain knowledge, the QA mindset, and the broad view of the system under test that was, and still is, the best value contributed by the best traditional QA groups. Sometimes it seems that in the rush to make software quality everybody’s business, the industry is about to make quality everybody’s business in general and nobody’s in particular. Getting rid of Software Quality Assurance as such could lead to getting rid of software quality. Compare it to a health care system that gets rid of primary care physicians and entrusts the care of the patient exclusively to specialists: with many great doctors treating specific ailments with great attention but without coordinating with each other, you might end up with lots of success for the specialists and a dead patient as the result. The fact of the matter remains that many automated test suites executed as part of continuous integration are simply not effective. Martin Fowler calls such tests “non-deterministic” and calls for regular work on cleaning up and updating them; alas, most teams do not make that cleanup part of their regular testing practice. So-called non-functional requirements -- performance, security, and scalability -- are tested for, in most places, only as a reaction to a crisis, an outage, a breach, a data theft, or impending government regulation. While fire-fighting is unavoidable, it is neither prudent nor productive to rely on such measures as a matter of doing business.
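To make Fowler’s point concrete, here is a minimal, self-contained JUnit 4 sketch (the class, job, and timings are ours, purely for illustration): the first test guesses how long a background task takes and so passes or fails depending on machine load, while the second polls for the expected state up to a timeout, removing the guesswork.

```java
import static org.junit.Assert.assertEquals;

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadLocalRandom;
import java.util.concurrent.atomic.AtomicInteger;

import org.junit.Test;

public class AsyncJobTest {

    private final ExecutorService pool = Executors.newSingleThreadExecutor();
    private final AtomicInteger completed = new AtomicInteger();

    // Simulated background job whose duration varies with load (here, randomly).
    private void submitJob() {
        pool.submit(() -> {
            try {
                Thread.sleep(ThreadLocalRandom.current().nextInt(50, 150));
            } catch (InterruptedException ignored) { }
            completed.incrementAndGet();
        });
    }

    // Non-deterministic: assumes the job always finishes within 100 ms,
    // so on a busy CI agent this assertion fails intermittently.
    @Test
    public void flakyVersion() throws InterruptedException {
        submitJob();
        Thread.sleep(100);                       // arbitrary fixed wait
        assertEquals(1, completed.get());
    }

    // Deterministic in intent: poll for the expected state up to a generous
    // timeout instead of guessing how long the work takes.
    @Test
    public void stableVersion() throws InterruptedException {
        submitJob();
        long deadline = System.currentTimeMillis() + 5_000;
        while (completed.get() < 1 && System.currentTimeMillis() < deadline) {
            Thread.sleep(50);                    // short poll interval
        }
        assertEquals(1, completed.get());
    }
}
```

Regularly quarantining and rewriting tests of the first kind is exactly the housekeeping most teams skip.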
Hence the brittle and unsatisfactory state of software quality today. With all the great progress, software today is neither reliable nor secure. The Healthcare.gov web site could not be launched on time, cars are routinely recalled due to software defects, and massive data breaches threaten to take whole industries down. Government shows its concern by issuing edicts, but compliance has a tendency to live in a separate world, mostly slowing the process down and not really contributing to the quality, security, and robustness of the systems. In other words, to borrow a question asked in a different context: if everything is so good, why is everything so bad?
In our analysis, there is a lot of promise and room for concrete, practical improvement in the way we build and test our systems. Five years ago, we at GQP (Global Quality Partners) developed the concept of SIO -- Systems Investment Optimization. This approach was based on the notion that IT systems are among the most vital assets of a modern enterprise and, as such, need to be accounted for, developed, and protected in an effective and efficient manner. We saw the key in an integrated approach, one where the quality of the user experience -- the most visible and appreciated of system quality attributes -- would be coordinated and integrated with other less visible but no less vital attributes of the system, such as robustness, scalability, security, and endurance.
The good news is that today, with the great advances in tools, testing frameworks, and techniques, an integrated quality approach can be implemented in an effective, efficient, and practical manner. Such an approach, which we call Integrated Quality™ and which we will discuss and develop on these pages, focuses on the best ways to integrate software quality assurance across teams and on the methods, tools, and practices that best complement one another. The emphasis is on practical integration that allows the industry to produce robust and secure systems efficiently and effectively. In the course of this discussion we will explore how BDD could help integrate teams, where QA fits in the devops movement, how security and privacy, if designed in from the ground up, could become an advantage instead of a cost, what the best ways are to incorporate performance testing, and how to do agile testing right and integrate it into an organization of any size and maturity (the BDD sketch at the end of this post gives a small taste of what we mean). We see this as an exciting journey, and we invite our readers, business associates, and colleagues to join us.
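As that small taste of the BDD discussion to come, here is a minimal, hypothetical Cucumber-JVM sketch (the feature text, step class, and the tiny CheckoutSession stand-in are ours, and annotation package names vary between Cucumber versions): the plain-language scenario is what product owners and testers read and review, while the annotated Java methods make it executable in continuous integration.

```java
// A business-readable scenario, normally kept in a .feature file, might read:
//
//   Scenario: Returning customer sees saved shipping address
//     Given a customer with a saved shipping address
//     When the customer starts checkout
//     Then the saved shipping address is pre-filled
//
// The product owner can review that text; the steps below make it executable.

import static org.junit.Assert.assertEquals;

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

public class CheckoutSteps {

    // Minimal stand-in for the application under test (ours, for illustration only).
    static class CheckoutSession {
        private String savedAddress;
        private String prefilled;
        void saveAddress(String address) { savedAddress = address; }
        void startCheckout()             { prefilled = savedAddress; }
        String prefilledAddress()        { return prefilled; }
    }

    private final CheckoutSession session = new CheckoutSession();

    @Given("a customer with a saved shipping address")
    public void customerWithSavedAddress() {
        session.saveAddress("10 Main Street");
    }

    @When("the customer starts checkout")
    public void customerStartsCheckout() {
        session.startCheckout();
    }

    @Then("the saved shipping address is pre-filled")
    public void savedAddressIsPrefilled() {
        assertEquals("10 Main Street", session.prefilledAddress());
    }
}
```

The value is not the automation alone; it is that the same sentence the business signs off on is the one the build verifies, which is precisely the kind of cross-team integration we mean.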