Testing Dojos

About a year ago, I heard about Coding Dojos for the first time. Back at work, I tried out the idea with some colleagues almost immediately. The sessions went great, and we had a lot of fun. Ever since, I have wondered how to get testers involved in this kind of deliberate learning. It took me some time and thought, but I ended up with Testing Dojos. We have run several of them at work so far, and I decided to provide you with some know-how that should get you started as well. In case you’re looking for personal experience, I will be presenting the topic together with a sample session at the XP 2010 conference in Trondheim in June this year.

What is a Testing Dojo?

A Testing Dojo is a meeting where testers come together to work on a testing challenge. The challenge can consist of testing a product, generating test ideas for a particular piece of software, or even exercising bug reporting. Mostly the testing challenges will use Exploratory Testing.

Testing Dojos take testing into a safe environment without schedule pressures and deadlines. Testing Dojos are a way to train testers new to the profession in a collaborative manner.

Equipment

For a Testing Dojo there is little you need to prepare:

  • a meeting room large enough
  • access to a computer
  • a video projector so everyone can see what’s happening
  • pen and paper, a flipchart or a whiteboard to take notes

A meeting room with a computer and a projector is usually available. Pen and paper for note taking are easy to organize.

Mechanics

The facilitator introduces the rules of the Testing Dojo, provides a mission, and clarifies the structure of the session. The testing may be done by a single tester in front of the computer or in a paired setup. Missions may vary from “test this product” or “evaluate the usage of the following tool” up to “use this new approach to check whether we should incorporate it into our testing process”.

Single tester

In the single tester setup, one tester exercises the product in front of an audience. At a previously agreed time, that tester is replaced by a tester from the audience, who then continues to follow the mission against the product under test. If the tester gets stuck, he or she may ask for support from the audience. Otherwise the remaining participants stay silent.

Paired session

In a paired session, two participants sit in front of the computer. One tester works the keyboard, while the other takes notes on test ideas and bugs found. After a previously agreed timeframe (e.g. 10 minutes) the pair switches: the tester at the keyboard returns to the audience, the note taker takes over the keyboard, and the empty seat is filled by a tester from the audience who takes notes.

During a paired session the testers in front of the computer need to explain the steps they take on the mission, so that everyone in the audience understands what they are doing. They should talk at least as much as they test, and probably talk more than test. Just as in the single tester setup, the pair may involve the other participants when they get stuck.
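As a side note, the rotation in a paired session can be sketched as a small schedule generator. Here is a minimal sketch in Python; the participant names, number of rounds, and the function name `pairing_schedule` are purely illustrative:

```python
from collections import deque


def pairing_schedule(participants, rounds):
    """Yield (driver, note_taker) pairs, one per timebox.

    After each round the driver returns to the audience, the note
    taker takes over the keyboard, and the next person from the
    audience joins as the new note taker.
    """
    queue = deque(participants)
    driver = queue.popleft()
    note_taker = queue.popleft()
    for _ in range(rounds):
        yield driver, note_taker
        queue.append(driver)  # driver goes back to the audience
        driver, note_taker = note_taker, queue.popleft()


# Example: four participants, four 10-minute rounds
for driver, notes in pairing_schedule(["Ann", "Ben", "Cid", "Dee"], 4):
    print(f"{driver} drives, {notes} takes notes")
```

Running this prints who drives and who takes notes in each timebox; the facilitator's timer still decides when to switch.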

Missions

In this section we will take a brief look into possible missions for a testing dojo and give some recommendations.

Test this

The classical mission is to test an application. The variety of applications may include open source programs, commercial software available in your organization, or even your company’s latest product; such a session may well end up as a bugfest. The facilitator needs to check that everything is properly available before the session starts. For example, the company’s firewall may block some content on a web page, and with an unstable network connection, testing a web page may become cumbersome.
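For web-based targets, that pre-session availability check can be as simple as poking each URL once from the meeting room. A minimal sketch, assuming the product under test is reachable over HTTP; the URL in the example is a placeholder, not a real target:

```python
from urllib.request import urlopen


def reachable(url, timeout=5):
    """Return True if the URL answers within the timeout.

    This only shows that firewall, DNS, and network let us reach the
    target at all; it does not prove the application itself works.
    """
    try:
        with urlopen(url, timeout=timeout):
            return True
    except OSError:  # covers URLError, refused connections, timeouts
        return False


# Run this from the dojo meeting room before the session starts.
if not reachable("http://example.org/product-under-test"):
    print("Warning: target not reachable -- pick a different mission")
```

A blocked or flaky target found five minutes before the dojo is much cheaper than one found five minutes into it.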

Evaluate tools

A mission to evaluate a tool may include using mind maps for test ideas, or trying out a particular test tool for the whole session. For such a mission the product under test is usually the tool itself, but you may also run the mission against a common program that you test at work and compare the results directly with your daily work. Evaluating tools mainly serves to decide whether or not to use the tool on a more regular basis.

Learn new approaches

There are many testing approaches to try out. The mission might focus on a particular mnemonic like FCC CUTS VIDS, or use soap operas to generate test ideas. Similarly to evaluating tools, this type of mission serves to try out and learn about new approaches to testing. After these sessions the whole team has gained some experience and can make a more informed decision, such as whether and when the approach is useful for them.

This is the concept behind Testing Dojos as it stands today. Please let me know when you have tried them out at your company: what worked for you, what didn’t, and what you found worth adding or trying beyond that.


11 thoughts on “Testing Dojos”

  1. I like this idea, except that it is focused only on post-development testing. I feel it’s more important to know what tests to write before coding, what questions to ask the customers, how to get useful examples from the customer, how to gain domain expertise so you can help the business focus on what is going to help the business be successful.

    While I feel it’s important to hone our skills at testing a finished product (or at least, a checked-in one), we can add more value to our team and our business by helping them figure out what features to deliver next and how those features should work, then by finding ‘bugs’ in delivered functionality which may not help the business even if it technically is bug-free. Does that make sense? How could we do that sort of testing dojo?

    1. Those could make great missions. Basically, you may run any testing challenge in a collaborative session, discussing the approaches and answers collectively and thereby training your testers as well. Evaluating an application for missing functionality is also a topic I put in the category of “evaluate tools”. Instead of delivering a list of bugs, some recommendations on the next features to incorporate could then be the outcome.

      Currently for Weekend Testing we’re evaluating what we would need for a mission on test automation. The time constraint of a single hour makes this a challenging task, so for some missions you may need to extend the scheduled time of the dojo. From my perception, this is the main difference to a Coding Dojo: for the coding part you get something done in a very short period of time, while for testing a lot of preparation is necessary, like downloading the software, maybe installing it, or starting up a back-end system for your company’s enterprise application in order to test its GUI. Unfortunately this is nothing a Testing Dojo will solve for you. :)

  2. I did these kinds of test sessions with my team a year or two ago with different goals in mind: to strengthen people’s ET skills and discuss approaches, or to see how they would take notes and write reports and then compare them and look at the strengths and weaknesses of each. The reports all looked radically different, which surprised me. If you haven’t looked at it yet, head to James Lyndsay’s site; he’s got some Flash applications with “bugs” in them that can be used for ET testing in-house.
    Figuring out which features to test next and how they should work is probably more valuable in an agile environment. It would be quite hard to do this kind of thing in a limited EWT or testing dojo; at least I can’t think of an approach off the top of my head. Mmmh, interesting..

  3. This is a great training idea. I know testers can learn a lot from team testing but I think that expanding that to include an audience and a time limit both makes it fun and adds good pressure/importance.

  4. I have tried testing dojos for like 6-7 sessions so far and blogged about it. I’m trying to make it a weekly event. It’s a pity that not everyone cares about testing craftsmanship, and even when you have someone in there, they’re not really in there.
    For the first 2-3 sessions it looked cool. Later nobody wanted to test in front of everyone, and even if they did, they did not apply or follow the advice when they got back to their own tasks.
    How would you handle this?

    1. Hi Jarodzz,

      There are many things I may suggest here. One is to hold short retrospectives after each Dojo in order to see what can be improved at the meta-level. Try to find out what everyone’s perception was, and try to change your Dojo process accordingly.

      A second option is to apply the patterns from Fearless Change. In the beginning you mostly deal with Early Adopters. Over time, though, as the process starts to establish itself, you have to find challenges that are interesting for the people attending. You may also want to hand over the organization of the Dojo to your colleagues. If you manage to engage them in organizing the events, you will more easily get their buy-in, since they are now emotionally attached to it.

      A third option might be that you need to change your company’s culture to transform it into a learning culture. This surely takes more time and energy from you, but a missing learning culture may be the root cause behind the little buy-in from your colleagues. Just keep going, bring in a variety of dojos and missions to engage them, and show them that learning in a team is way more fun.

      Hope this gives you some pointers.

  5. I’ve been reading about Testing Dojos with interest, and once I have presented Exploratory Testing and Session-Based Test Management to my colleagues, I will look to implement Testing Dojos. There are some people in my team who are fresh from the business side of our industry and therefore do not have a rich education in software testing.

    Also, I will report back to you ASAP on how it went.
