The other day on Twitter I asked:
What questions do you ask during an Exploratory Testing session debriefing?
Since I didn’t get any replies at all, I figured it was time to come up with some ideas of my own. Letting others do the work doesn’t work on this one.
A quick Google search turned up some references. One in particular I would like to point out is a blog entry from Michael Kelly on IT Knowledge Exchange. He lists some useful questions there:
- What was your mission for this session?
- What did you test and what did you find?
- What did you not test (and why)?
- How does your testing affect the remaining testing for the project? Do we need to add new charters or re-prioritize the remaining work?
- Is there anything you could have had that would have made your testing go faster or might have made your job easier?
- How do you feel about your testing?
Another reference comes from Wikipedia. It lists the PROOF mnemonic from James Bach:
- Past. What happened during the session?
- Results. What was achieved during the session?
- Obstacles. What got in the way of good testing?
- Outlook. What still needs to be done?
- Feelings. How does the tester feel about all this?
The ones from Michael Kelly seem to be based on the PROOF mnemonic.
Among the search results these seem to be the only two references with any substance. There may be more, but I gave up wading through useless links after ten results or so.
Certainly the questions based on PROOF are very good. I like them a lot; I read over the list three times. But for me something seems to be missing. Based on the session debriefs we did in Weekend Testing, here are some ideas I have tried out over the past year. Please bear in mind that this is a raw list from a brainstorming activity. Feel free to shape it into a mnemonic if you like.
- How did you work the product?
- What did you observe?
- What didn’t you observe?
- Which important bugs did you find?
- Which problems did you notice?
- What did you find?
- How much time did it take you to set up the program?
- How much time did you spend on testing related tasks?
- How much time did you spend on note-taking?
- How much time did you spend on bug reporting?
- How much time did you spend on bug pinpointing?
- How much of the program did you cover?
- Suppose you were to test this program again now. What would you do differently?
- What would you say to the next tester testing this program?
- What would you tell the programmer of this program?
- What would you like to tell the architect/designer of this program?
- How many bugs do you think are still lurking in there?
- Do you think this program was sufficiently covered with programmer tests? Why?
- Do you think this program needs more testing? Why?
- Do you think we can ship this program? Why?
- What would you like to tell the project manager of this program?
- Which test charters would you set up for this program next?
- Which areas should be covered better by follow-up test charters? Why?
- Which areas should be covered better by programmer tests? Why?
- Suppose I were a potential customer of this program. Would you sell it to me? How? Why?
- Suppose I were providing all the money for the project to create this program. Where should I spend the next pile of money in this project?
Please note that all the questions mentioned are open questions. They don’t ask for a yes/no answer. Instead they aim at revealing as much information as possible. During a session debrief I want to get as much information as I can. It gives me the warm and cozy feeling that everything is under control. This might be a trap, though. That’s why I also ask about traps.
Which questions have I missed? And I am sure that Michael Bolton and James Bach have some more questions in their Rapid Software Testing classes.
3 thoughts on “Questions to ask during Debriefs”
Great post! I agree that there is a lot more to software testing than looking for bugs. Good testers get a feel for software. They develop a sense for the overall quality of the application.
It also helps if the testers have been involved in evaluating customer issues. They need to have a sense for how customers use the software and what is important to them.
This is a good post, and I like the debriefing questions. However, there are some aspects that I think are important to point out.
Is it really important to specify the time spent on different subtasks in the session? I can understand wanting some hunch of the time proportions, but 6 out of 26 is almost 25% of the questions asked. Depending on the purpose of those kinds of numbers, I would stick to a subset of them.
I really like taking the aspect and viewpoint of the user when I test. You do mention a customer you want to sell the product to, but that is not enough of that focus.
What would the user do?
What would he feel about the product?
How would the user react to the functionality? etc.
And last but not least, I would be careful asking debriefing questions whose answers are already found in the notes or other reports. “Which important bugs did you find?” is one example. They will be listed in the notes. I would instead ask “Why are these bugs important?”
Fantastic list :)