How I would test this – Part I – The basic idea

Matt Heusser from time to time comes up with challenges related to testing. The latest one introduces a web application and asks for a strategy to stress it out. Here is my (late) response to the challenge.

First of all, there are some questions I would need to ask in order to prepare my testing and checking activities for the delivery of business value.

Who is the customer here? The customer seems to be a product owner or product manager, but it could also be a real end customer who wants to get a new business webpage.

How do the developers work together with the QA team? Are they ignoring them? Are they using TDD to develop the software? Will the testers be able to walk through the delivered product like a hot knife through butter or will there be a hard time for the testers on the team with a real bugfixing phase? Which process does the Agile team follow? Scrum? XP? Crystal? Some mix? The iteration length is two weeks, which is a pretty fair schedule. Do the testers participate in the planning game or are they kept separate?

What about easy access to expert users? Who is available for the clarification of open questions? Is the product owner taking care of this? Is there an on-site customer? Is there a customer proxy?

Which equipment and tools do the developers use? Which equipment and tools do the testers have access to? Which tools did they successfully use in the past? Is the team new to Agile or did they deliver continuously over the last five years? Did they have success with the Agile adoption? Which practices do they use? Which practices do they not use?

What is going to be delivered? Is a test report necessary? In which form? Which documentation needs to be created? Which documentation can be used as a basis for more informed testing? Which similar products exist on the market? Matt already named a few, but is this application like those mentioned, or something completely new? Which features are incorporated into the product as highlights so that this particular product will outperform the existing ones? Is there any new technology included for the team? Has the team – the developers and the testers – dealt with the technology involved beforehand?

What do the binary deliverables look like? Is the webpage hosted on an internal server, or is there an installer or even a packaged CD going to be sold? Is there a bug backlog that the testers have to deal with? Are team members assigned to the project on a 100% basis? Are there intervening projects that might ask for particular specialists from this project? Is the product already available? Is there a successful build at least once a day? More often? Less often? What do the release plans look like? Which overall timeframe is taken, and does the testing have to happen in parallel? Is the design test-friendly? Is it possible to test behind the GUI?

Ok, this is the basic brainstorming and these are the questions I would start asking in order to make a more informed decision on the test approach to follow. While these questions get answered, let me start drafting the approach, which makes some assumptions about the questions above, but which should be easy to change during the course of the testing.

Based on the problem description, my past experience, and the fact that I have to deal with an Agile team, I would start the first steps with timeboxed Exploratory Testing sessions on the product if the testing takes place as a separate phase. Mainly this will serve as a learning exercise and as an information-gathering process. Ideally this would be done as a pairing session, so that both team members go through a similar learning curve. During the initial two to three days there will be a mix of learning in ET sessions and preparations for the test automation part to follow.

If testing and development are running in parallel, there is a need for pairing with customer representatives during the first days. For each story planned for the upcoming iteration, I would try to prepare at least one acceptance test, and more for the harder business conditions. Of course, the assumption here is that it is possible to some degree to test behind the GUI; a rough sketch of such a check follows below. Some framework for testing the GUI might also be necessary for the UI checks.
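To make this a bit more concrete, here is a minimal sketch of what such a below-the-GUI acceptance check could look like as a Perl .t file, in the TAP style the developers already use for their unit tests. The endpoint, query parameters, and environment variable are assumptions for illustration, not the product's actual API:

    # t/acceptance/activity_summary.t -- illustrative sketch only; the endpoint
    # and field names are assumed, not taken from the real product.
    use strict;
    use warnings;
    use Test::More;
    use HTTP::Tiny;
    use JSON::PP qw(decode_json);

    # Base URL of the appliance under test; override via the environment.
    my $base = $ENV{APP_BASE_URL} || 'http://localhost:8080';
    my $http = HTTP::Tiny->new;

    # One acceptance check for a story: "a user sees a summary of recent activity".
    my $res = $http->get("$base/api/activity?user=alice&limit=10");
    ok($res->{success}, 'activity summary endpoint answers with a 2xx status');

    my $events = decode_json($res->{content});
    is(ref $events, 'ARRAY', 'summary is returned as a list of events');
    cmp_ok(scalar @$events, '<=', 10, 'the limit parameter is respected');

    done_testing();

Such a check could run in the continuous build alongside the developers' unit tests, and the same idea extends to the harder business conditions with one .t file per condition.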

So far, this is the high-level idea I can provide. For the particular business conditions mentioned, I will follow up with a blog entry later next week. Maybe most of my questions will be answered by that time, so I can come up with some more informed approaches, too.


3 thoughts on “How I would test this – Part I – The basic idea”

  1. Wow, Markus. Well, here you go, I hope these answers help you develop your strategy.

    >Who is the customer here?

    Any company developing intellectual property. Most likely law offices, publishers, record labels, even coordination companies. Anyone where ‘production’ work is project work.

    You do have a product owner; you also have 2,000 companies who are customers.

    >How do the developers work together with the QA team?

    All the technical staff is on IRC; the whole team has a standup every two days.

    > Are they ignoring them?

    No; it’s collegial. Testers do story review, kickoffs with tech staff, and retrospectives.

    > Are they using TDD to develop the software?

    Yes, at the unit level. A widget is basically a bunch of JavaScript doing back-end API calls to get the data.

    >Will the testers be able to walk through the delivered product like a hot knife
    >through butter or will there be a hard time for the testers on the team with
    >a real bugfixing phase?

    You don’t know. In reality, it took about two weeks to shake out all the bugs, while you also worked on other projects.

    >Which process does the Agile team follow? Scrum? XP? Crystal? Some mix?

    It’s pretty close to XP with a testing team and QA ‘phase’ for each story.

    >The iteration length is two weeks, which is a pretty fair schedule. Do the testers participate
    >in the planning game or are they kept separate?

    Estimation is typically done without the testers, but testers can contribute on stories as much as their time permits.

    >What about easy access to expert users?
    >Who is available for the clarification of open questions?

    For purposes of this exercise, you can talk to me. In reality, the test team is expected to /become/ expert users. For example, I prepare my writing on the wiki.

    >Is the product owner taking care of this?
    >Is there an on-site customer?

    Well, the PO is a full-time employee, and you can ask her questions, sure.

    >Is there a customer proxy?

    That would be the PO/PO.

    >Which equipment and tools do the developers use?
    >Which equipment and tools do the testers have access to?

    Git, linux, apache, screen sharing with VNC, screen sharing with UNIX, voice over IP phones, conference calling, wikis, IRC

    Developers write Perl tests (.t files) that produce output in TAP format (Test Anything Protocol), so they can be easily parsed by any parser (Test::More or prove, for example).
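    To illustrate (a generic sketch, not one of the product's actual test files), a minimal .t file that emits TAP could look like this:

        # t/example.t -- generic illustration of a TAP-emitting test, not product code
        use strict;
        use warnings;
        use Test::More tests => 2;

        ok(1 + 1 == 2, 'arithmetic sanity check');
        is(lc('Widget'), 'widget', 'lc() lowercases the widget name');

    Running it with perl directly (or with prove -v) prints TAP lines such as "1..2" and "ok 1 - arithmetic sanity check", which any TAP consumer can parse.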

    Testers can use whatever they want. You define the test strategy. If you want to use some framework that takes a month to set up, you can claim it is set up before this story begins. If your test framework takes more than a month to set up, justify why and you can use it.

    >Which tools did they successfully use in the past? Is the team new to Agile or did they deliver
    >continuously over the last five years?

    Greenfield development, all agile, some tech debt from the older stuff.

    >Did they have success with the Agile adoption?
    >Which practices do they use?

    TDD, story-driven development, refactoring, CI for unit tests and continuous builds, story-test driven development, pair programming, velocity measurement with yesterday’s weather.

    >Which practices do they not use?

    Most other practices. :-)

    >What is going to be delivered? Is a test report necessary? In which form?
    >Which documentation needs to be created?
    >Which documentation can be used as a basis for more informed testing?

    Well, you could tell me, or you could just produce a set of bug reports along with a notice of whether the story passed, failed, or “passed with some concerns”, for the PM to decide if it’s really done yet. If it fails, re-test when the devs are done fixin’.

    >Which similar products exist on the market? Matt already named a few,
    >but is this application like those mentioned, or something completely new?

    Main competitors are Yammer and Atlassian.

    > Which features are incorporated into the product as highlights so that this
    > particular product will outperform the existing ones? Is there any new technology included for the team?

    This feature set – the summary of activity – is a new killer feature. We also have an awesome REST API so 3rd party devs can extend our stuff.

    >Has the team – the developers and the testers – dealt with the technology involved beforehand?

    Sure

    >What do the binary deliverables look like?

    It’s a build you install on a server

    >Is the webpage hosted on an internal server, or is there an installer or even a packaged CD going to be sold?

    Both. There is a production host running you can buy logins for, or you can lease a physical appliance from us. We also have a virtual (vmware) appliance you can get on DVD.

    >Is there a bug backlog that the testers have to deal with?

    There is a list of bugs. PM can pick some bugs each iteration to be assigned as story work

    >Are team members assigned to the project on a 100% basis?

    For purposes of this exercise, yes

    >Are there intervening projects that might ask for particular specialists from this project?

    For purposes of this exercise, no

    >Is the product already available?

    Yes. You can build an appliance in about 20 minutes

    >Is there a successful build at least once a day? More often? Less often?

    Continuous; typically 30 minutes after the latest checkin

    >What do the release plans look like? Which overall timeframe is taken,
    >and does the testing have to happen in parallel?

    The iteration branch is cut Friday. Candidate testing happens over the weekend and into the next week; we expect to ship by the first Friday of the following iteration. By Monday of the 2nd week of the iteration, new stories will be in QA …

    >Is the design test-friendly? Is it possible to test behind the GUI?

    You could, conceivably, write tests against the REST API.
