Are Agile Testers Different?
In her keynote, Lisa Crispin went into great detail about the skills Agile testers need. Her definition includes some interesting skills which can also be found in her book Agile Testing:
- Continuously Learn
- Be proactive
Gojko Adzic did a good write-up of the session. Lisa opened with a quote that I need to remember:
We don’t break software, it comes to us already broken.
She raised the point of the Whole-Team Approach that Janet Gregory and Lisa follow in their book over and over again. The Power of Three is essential to Agile development.
When she mentioned that Agile testers help developers fix problems, I was reminded of a personal anecdote. For one of the bugfixes we delivered to a customer this year, I was able to take the bug report, build a failing test in our FitNesse environment, and walk over to the developer assigned to the bug. We sat together and pair programmed the bugfix. I was able to refactor the complicated conditionals that were sitting in that area and show him how I would treat that code. In the end we delivered the bugfix the same evening. All of this in a traditional environment.
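The rhythm of that bugfix, pin the reported behavior down in a failing test first, then refactor the conditionals while the test keeps you honest, could be sketched roughly like this. The function, the pricing rules and the bug are invented for illustration, not the actual customer code:

```python
# Hypothetical example: a bugfix driven by a failing test, followed by a
# refactoring of nested conditionals into flat guard clauses.
# shipping_cost and its pricing rules are invented for illustration.

def shipping_cost(weight_kg, express=False, international=False):
    """Refactored version: guard clauses instead of nested if/else."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if international:            # the buggy version let 'express' win here
        return 25.0 + 4.0 * weight_kg
    if express:
        return 10.0 + 2.0 * weight_kg
    return 5.0 + 1.0 * weight_kg

def test_international_overrides_express():
    # Written from the bug report before touching the code: express
    # international orders were billed at the domestic express rate.
    assert shipping_cost(2, express=True, international=True) == 33.0

test_international_overrides_express()
```

In our case the check lived in a FitNesse table rather than a unit test, but the flow was the same: reproduce the report as a failing test, fix it pairing with the developer, and clean up the surrounding conditionals on the way out.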
Lisa mentioned a personal anecdote regarding communication at her current company. She is the only remote team member. The team decided to set her up on a portable laptop with a web camera. She is able to control the web camera and inspect the room. When the team meets up, the laptop is taken along, so she really feels colocated even though she is not even in the same town.
Ulrich Freyer-Hirtz came up with an approach for team assessment. He analyzed the value statements and the principles from the Agile Manifesto and brought them together in an assessment spreadsheet. He stated that so far the method has not been applied to a real project. The flow of his approach is that each of the principles has some amount of relevance to one or more of the values. By adding another level, the practices the team follows, and relating them to the principles, it is possible to find out where problems arise. His work reminded me of the ShuHaRi idea of connecting practices with principles and values. I noticed, though, that the criticism of his approach mainly related to fears of transitioning from first-order measures of Agility to his second-order measurement.
Overall I noticed a problem with the applicability of Ulrich's approach for Agile teams. The assessment starts with an evaluation meeting where the principles and the values are related to the practices the team actually agreed on. By giving them weights and regularly answering questions in the retrospectives about the fulfillment of the practices, one gets a picture of troubling areas. This contradicts the High Touch, High Tech nature of other Agile approaches; the overall approach seems too heavyweight to me. There are lighter alternatives from James Shore, Scott Ambler and Alistair Cockburn (in his book Agile Software Development – The Cooperative Game). In addition, the one I had read from Brian Marick is also very interesting on the topic. Overall, the approaches look very similar in trying to get from gut feelings about the state of Agility in the team to a numeric measurement.
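To make the two-level roll-up concrete, here is a minimal sketch of how I picture the scoring working. The practices, principles and weights below are invented examples, not Ulrich's actual spreadsheet:

```python
# Sketch of a two-level Agility assessment: practices contribute with
# weights to principles, and principles contribute with weights to values.
# All items and weights below are invented for illustration.

practice_to_principle = {
    "daily standup":          {"frequent delivery": 0.25, "face-to-face": 0.75},
    "continuous integration": {"frequent delivery": 1.0},
}
principle_to_value = {
    "frequent delivery": {"working software": 1.0},
    "face-to-face":      {"individuals and interactions": 1.0},
}

def score_values(practice_scores):
    """Roll retrospective answers (0..1 per practice) up to value scores."""
    principles = {}
    for practice, score in practice_scores.items():
        for principle, weight in practice_to_principle[practice].items():
            principles[principle] = principles.get(principle, 0.0) + weight * score
    values = {}
    for principle, score in principles.items():
        for value, weight in principle_to_value[principle].items():
            values[value] = values.get(value, 0.0) + weight * score
    return values
```

A retrospective would then feed in how well each practice was followed, and a low score on a value points back, through the principles, at the practices worth discussing.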
Ulrich and I completely agree that dealing with the Agile Manifesto and the principles behind it in a constructive manner is crucial, and his approach is truly a first step in that direction.
How to develop a common sense of “DONE”?
Alexander Schwartz presented an idea for reaching a common understanding of “done” at his company. Alexander used a Done chart as a checklist to build a common understanding of what “We’re ready to put it into production” means. One interesting quote was:
You would not use untested parts, if you build a car.
In particular, they had the issue that some of the teams had only part-time testers. If this turns out to be a problem, I would prefer to coach developers to step in, so the testing activities get done before the story is claimed to be done.
During the talk I noticed that it could be a good thing to raise the trouble the team seems to have with delivering software as a team issue. As Michael Bolton taught me, software testing, and Exploratory Testing in particular, is a constant questioning of “Problem or not a problem?” This can be applied to team issues as well. “We don’t have a common understanding of ‘done’. Problem or not a problem?” If you can answer this question with “Not a problem”, why bother with it now? More likely, though, you will notice your differing understanding of “done” precisely because it differs between individuals, so you answer the question with “We’ve got a problem”. Then it might be time to start a dialog about it: discuss what it means for you and your team, identify where you are and where you want to be, and bridge the gap between the two.
Despite the recurring queue at lunch – there were three equal-sized queues, so we actually had learned from the day before – I sat together with Tom Gilb, Mary & Tom Poppendieck, Lisa Crispin and Gojko Adzic. Gojko had previously worked in the printing business, and during the discussion we got to talking about deadlines in printing. Mary Poppendieck pointed out that
You never miss a deadline in printing.
Why? Well, if you miss it, you are not able to print your magazine for the next weeks, months, or whatever your scheduled release cycle is. If you miss it, you won’t ship. Period. Why? You get assigned timeslots for printing your particular magazine or book on the press. If you fail to deliver when your timeslot comes, nothing gets printed. So missing the deadline is fatal in this particular business, and everyone knows it. Please compare this to software deadlines yourself; it’s too obvious for me to do it for you.
Another thing I learned during that lunch break is that telling highly skilled people what to do is a problem. Gojko noted it down as
Systems should support intelligent people doing their work rather than trying to replace and de-skill workers.
Gojko had an anecdote about a company which hired him for their test automation problems. When he started to raise the point that the unit tests were crap, the technical leader cried for two minutes, went out, came back with her supervisor, and asked the supervisor to tell Gojko that he had been hired for test automation, not for unit testing.
A final anecdote from the lunch break came from Gojko, too. He explained that he had recently been helping on a project with Mike Scott. They pair-programmed a rather complex fixture, so Gojko test-drove it. When they committed it to version control, Mike said that Gojko had added ten percent of code coverage to the project.