At the Agile Testing Days, David Evans presented on how to hit a moving target: fixing quality on unfixed scope.
David described the problems he would like to solve: user story dysfunction, failure to see the whole, and the fact that good testers might not be good Agile testers. Accepting stories is not quite the same as testing a system; a system is more than the sum of its features. Consistency, coherence, elegance, and simplicity help us see the whole, and qualities such as usability, security, and performance matter there as well. On user story dysfunction, David stated that stories often include too much presumed design, that we overstuff the suitcase, and that the backlog degenerates into a mere "parts list". He finished by stating that good testers in a more traditional context are not necessarily also good testers in an Agile context.
David continued on risk. Building the thing wrong, as opposed to building the wrong thing, is a risk you have to tackle. He referenced Mary Poppendieck from last year's conference: a late change in requirements is a competitive advantage. David also mentioned Kent Beck's safety assumption that code which isn't tested is not working. In the end, the product of testing is confidence, a principle Evans coined. He illustrated this with a picture that Mike Scott had also used in his presentation.
David referred to Agile QA not as Quality Assurance but as Questions and Answers. In Quality Assurance we are mostly concerned with dotting the i's and crossing the t's. What we should use in an Agile environment instead are questions and answers. Asking the right questions is the test analysis and design part; the second part, the answers, should be obtained quickly and consistently through test execution.
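As a minimal sketch of what "fast, consistent answers through test execution" can look like, here is a small automated check. The discount rule and function below are hypothetical examples for illustration only, not something taken from the talk.

```python
# A minimal sketch: a question from test analysis & design turned into a
# fast, repeatable answer via automated test execution.
# The discount rule is a hypothetical example, not from the talk.
import pytest


def discounted_total(prices: list[float], discount_percent: float) -> float:
    """Hypothetical production code under test."""
    return sum(prices) * (1 - discount_percent / 100)


def test_how_much_does_a_ten_percent_discount_save():
    # The question: what should the total be with a 10% discount?
    # The answer comes back quickly and consistently on every run.
    assert discounted_total([100.0, 80.0], 10) == pytest.approx(162.0)
```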
After all, the question we should ask is: how good is it? He went from diamonds, which are beautiful in themselves, to something more complex like a current smartphone. How do we measure what good is? It is a subjective statement we make about the quality of these complex things.
Evans presented the jigsaw misconception. The pieces of an Agile project are treated like a jigsaw puzzle that we merely have to assemble correctly, but this is a misconception, since in the end there is always a piece missing. David explained that we don't want to treat stories like jigsaw pieces; we want to have the customer involved instead.
David's second point was the issue of done vs. improved. A unit of change is targeted at improving the product. In order to do this, we need to ask "why": how well does the product do what it is supposed to do, and how can the product be improved? He drew the picture of car windshield wipers evolving from manual cleaning to self-parking, intermittent, and automatically activated wipers. The qualities of the wiper found in most cars nowadays are control, convenience for the driver, and all-weather effectiveness. But why did we improve this? Because of the things stakeholders actually care about, which are the 'real' requirements. Safety is the underlying main concern; after that, visibility comes into play, and finally keeping that visibility even in wet weather. Ironically, there is an alternative design solution: shaping the windshield so that rain simply blows off while driving.
On common threads, David reported statements like "We can't find enough testers to resource all our agile development teams", "our testers are not able to keep up with the output from development", "there is not enough time to complete regression testing", and finally "developers complete their stories in a sprint, but the testers always overrun". In an Agile world you cannot keep up the fight of test managers versus development managers over whose work is more important. He addressed the claim that Acceptance-TDD slows down development, just as passengers slow down the bus: we should rather measure the right thing in this regard, and the speed of the bus is not it, because local optimization fails. Instead, Agile testing shapes and validates our mental model of the evolving system.
He mentioned Tom Gilb, Jeff Patton, and Alistair Cockburn, referring to their work. From Gilb he cited that we should identify the 'real' requirements. Any project sponsor should be able to list the 'real' requirements; there should be no more than a dozen of them, they should fit on a single sheet of A4 paper, and each should be objectively quantifiable. Project requirements are expressed as stakeholder values: identify the stakeholders of your project, then start to understand what value means to them, then document the values your project is about to improve, including quantified goals where appropriate. For product requirements, identify which product qualities will address stakeholder value, and focus on those.
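To make "objectively quantifiable stakeholder values" concrete, here is a minimal sketch in the spirit of Gilb-style quantified requirements. The field names, example values, and numbers are my own illustrative assumptions, not content from the talk.

```python
from dataclasses import dataclass


# Minimal sketch of a quantified stakeholder value, loosely in the spirit of
# Gilb's quantified requirements. Names and numbers are illustrative only.
@dataclass
class StakeholderValue:
    name: str        # what the stakeholder actually cares about
    scale: str       # how it is measured, so the goal is objectively testable
    baseline: float  # current measured level
    goal: float      # level the project aims to reach


real_requirements = [
    StakeholderValue(
        name="Checkout responsiveness",
        scale="seconds from 'pay' click to confirmation, 95th percentile",
        baseline=4.0,
        goal=1.5,
    ),
    StakeholderValue(
        name="Support effort",
        scale="support tickets per 1000 orders per month",
        baseline=30,
        goal=10,
    ),
]

# A dozen or fewer of these should fit comfortably on one A4 page.
assert len(real_requirements) <= 12
```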
Continuing with Jeff Patton's view, he cited Jeff with "Flat backlogs just don't cut it for me". Traditional story backlogs lose the context from which they came. Patton says that we shouldn't use a flat backlog but a two-dimensional map, arranging the mandatory user activities and tasks horizontally and the details, subtasks, and improvements vertically, as in the sketch below.
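Here is a minimal sketch of such a two-dimensional story map as a plain data structure. The shopping domain and the story names are assumptions chosen for illustration, not examples from the talk.

```python
# Minimal sketch of a two-dimensional story map.
# The backbone (dict keys) runs horizontally: user activities and tasks.
# Each column (list) runs vertically: details, subtasks, and improvements,
# ordered roughly by priority. The domain is an illustrative assumption.
story_map = {
    "Find product":   ["Search by keyword", "Filter by category", "Sort by rating"],
    "Choose product": ["View details", "Compare items", "Read reviews"],
    "Pay":            ["Pay by card", "Pay by invoice", "Save payment method"],
    "Track order":    ["Order confirmation email", "Live delivery tracking"],
}

# A first release slice can take just the top row of every column, keeping the
# end-to-end flow intact instead of finishing one column at a time.
first_release = {activity: details[:1] for activity, details in story_map.items()}
print(first_release)
```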
David continued with Alistair Cockburn, who simplified Kano: categorize all options into A, B, or C classes. A means mandatory, B means features, the could-haves, and C means pleasant surprises, the delighters, which the customer doesn't necessarily come and ask you for. Over time, more features tend to move from the C category into the other two. In order to thin requirements, don't lump all potential feature aspects into one story; for any mandatory A item, move aspects out into B and C stories as well.
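As a minimal sketch of this simplified Kano categorization and of "thinning" an overstuffed story, consider the snippet below. The wiper-style feature aspects and their A/B/C labels are assumptions for illustration, not an example given in the talk.

```python
# Minimal sketch of Cockburn's simplified Kano classes (A/B/C) used to thin
# one overstuffed story. The aspects and labels are illustrative assumptions.
feature_aspects = {
    "Driver can switch wipers on and off":       "A",  # mandatory
    "Intermittent wiping with adjustable speed": "B",  # could-have feature
    "Rain sensor switches wipers automatically": "C",  # delighter / pleasant surprise
}


def thin(aspects: dict[str, str]) -> dict[str, list[str]]:
    """Split one big story into separate A, B, and C stories."""
    backlog = {"A": [], "B": [], "C": []}
    for aspect, category in aspects.items():
        backlog[category].append(aspect)
    return backlog


# Build and test the A slice first; B and C stay visible as thin stories
# instead of hiding inside the mandatory one.
print(thin(feature_aspects))
```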
Evans came back to the windshield analogy: the "What" is clearing water from the windscreen, and the "How well" is where the first car designers started with manual approaches. After all, we should move stories from simplifying "What?" to improving "How well?". This gives the customer much more power to drive the product towards what he or she finds valuable.