The other day on Twitter I asked
What questions do you ask during an Exploratory Testing session debriefing?
Since I didn’t get any replies on this at all, I figured it was time to come up with some ideas of my own. Letting others do the work doesn’t work on this one.
- What was your mission for this session?
- What did you test and what did you find?
- What did you not test (and why)?
- How does your testing affect the remaining testing for the project? Do we need to add new charters or re-prioritize the remaining work?
- Is there anything you could have had that would have made your testing go faster or might have made your job easier?
- How do you feel about your testing?
- Past. What happened during the session?
- Results. What was achieved during the session?
- Obstacles. What got in the way of good testing?
- Outlook. What still needs to be done?
- Feelings. How does the tester feel about all this?
The ones from Michael Kelly seem to be based on the PROOF mnemonic.
Among the search results, these seem to be the only two references with any substance. There may be more, but I gave up wading through useless links after ten results or so.
The questions based upon PROOF are certainly very good. I like them a lot; I read over the list three times. But for me there still seems to be something missing. Based on the session debriefs we did in Weekend Testing, here are some ideas that I have tried out over the past year. Please keep in mind that this is a raw list from a brainstorming activity. Feel free to shape it into a mnemonic if you like.
- How did you work the product?
- What did you observe?
- What didn’t you observe?
- Which important bugs did you find?
- Which problems did you notice?
- What did you find?
- How much time did it take you to set up the program?
- How much time did you spend on testing related tasks?
- How much time did you spend on note-taking?
- How much time did you spend on bug reporting?
- How much time did you spend on bug pinpointing?
- How much of the program did you cover?
- Suppose you were to test this program again now. What would you do differently?
- What would you say to the next tester testing this program?
- What would you tell the programmer of this program?
- What would you like to tell the architect/designer of this program?
- How many bugs do you think are still lurking in there?
- Do you think this program was sufficiently covered with programmer tests? Why?
- Do you think this program needs more testing? Why?
- Do you think we can ship this program? Why?
- What would you like to tell the project manager of this program?
- Which test charters would you set up for this program next?
- Which areas should be covered better by follow-up test charters? Why?
- Which areas should be covered better by programmer tests? Why?
- Suppose I were a potential customer of this program. Would you sell it to me? How? Why?
- Suppose I were providing all the money for the project to create this program. Where should I spend the next pile of money in this project?
Please note that all the questions mentioned are open questions. They don’t ask for a yes/no answer; instead they aim at revealing as much information as possible. During a session debrief I want to get as much information as I can. It gives me the warm and cozy feeling that everything is under control. This might be a trap, though. That’s why I also ask about traps.
Which questions have I missed? I am sure that Michael Bolton and James Bach have some more questions in their Rapid Software Testing classes.