Wednesday, April 18, 2012

Transforming the way testing is performed...part two

    It's now been almost two years since I took charge of testing at the eCommerce website where I work.
    My first act was to attempt to abolish test cases. They managed to survive under wraps for about a year till I could put a final nail in their coffin. We still have test cases, but only in instances where writing the test cases out is faster or simpler than writing out our test strategy.
    My second act was to dictate that we should still write documentation, but that it should take less time than the actual testing performed. (If it takes you 5 minutes to test it, it probably doesn't even need to be documented.) The idea being that we need a knowledge repository of what we did, but that repository is for the tester's use.
    At first lots of the test strategies looked like just cut-down test plans where test cases were the only thing used, but as I pressed people for more abstraction, it got easier and easier. The idea of abstracting is that you talk about what you think needs to be done in testing, using shortcut naming conventions (tours, in this case) like those recommended by Whittaker in "Exploratory Software Testing" and talked about more here: http://www.developsense.com/blog/2009/04/of-testing-tours-and-dashboards/
    So we now write documentation as a high-level abstract of what we are testing. (I like to call these 10-minute test plans, but in reality they take between 15 and 60 minutes to write up.) Here is a small sample to give you an idea.

***** NAME    DATE *****
Basic Profile tests in ATG
STRATEGY:
  Create documentation on the basic testing strategies for profile tests.  This should include login, logout, and new user creation.
TOURS:
    FEDEX - Verify that usernames are passed between the ATG and the _censored_ databases, including making sure that invalid and duplicate usernames cannot be used
    LANDMARK - Create users in all locations and make sure that those users can log in to all locations including in My and CSC.

    ANTISOCIAL - See how the system handles illegal inputs such as short usernames, long usernames, duplicate usernames, legacy usernames, etc.  Try to break the system.
NOTES:
  The requirements for basic login and user creation are the following:
    - usernames must be between 3 and 15 characters in length
    - new usernames can only contain numbers and letters
    - legacy characters (. _ - and space) are allowed on legacy users only
    - log out should log user out
    - duplicate usernames should not be allowed (non-case sensitive)
    - 3 places to create a username: ATG, My, and CSC Tools:
        _censored_
        Get with QA if you do not have the password for it.
Database:    _censored_
IGNORED:
    Anything beyond the basic login and user creation functionality of the profile. This will not include the weirds, such as what happens when e-mail addresses are modified, usernames are changed, or slugs are changed.
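
To make the NOTES above concrete, here is a rough sketch of how the username rules could be checked in an automated test harness. This is purely illustrative (in Python): `validate_username` and `is_duplicate` are hypothetical helpers I'm inventing for the example, not part of the actual ATG system or our tooling.

```python
import re

def validate_username(username, is_legacy=False):
    """Check a username against the basic profile requirements.

    Hypothetical helper for illustration -- not real ATG code.
    """
    # Usernames must be between 3 and 15 characters in length.
    if not 3 <= len(username) <= 15:
        return False
    if is_legacy:
        # Legacy users may also use . _ - and space.
        allowed = re.compile(r"^[A-Za-z0-9._\- ]+$")
    else:
        # New usernames can only contain numbers and letters.
        allowed = re.compile(r"^[A-Za-z0-9]+$")
    return bool(allowed.match(username))

def is_duplicate(username, existing_usernames):
    # Duplicate check is non-case-sensitive.
    return username.lower() in {u.lower() for u in existing_usernames}
```

A tester running the ANTISOCIAL tour could drive these same checks by hand: "ab" and a 16-character name should be rejected, "user.name" should only pass for a legacy user, and "Bob" should count as a duplicate of "bob".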

The reason we chose to do this is that my company hires people who are experienced in testing. I think they all have the ability to take a strategy from anyone and come up with 80% of the test cases that the writer did.

Is this perfect? No, but…
It solves two frequent problems.
    1: Testers writing days of documentation that no one ever reads, or, if they do, the documentation is too old to matter.
    2: Testers never documenting what they do.
I think this is a nice, happy medium solution to the problem and it works here.


Why is it not perfect? i.e., What failed?
    People get attached to processes, particularly those who like to have organization around their tasks. They got attached to the strategies, so much so that they would use them as templates…not as opportunities to think about the problem. The original idea was presented as just one of many ways to document testing. However, it has evolved into a more static "This is how we test here." I'm not really sure how to fix this yet. Just by writing this blog, I'm hoping people will re-evaluate why they write strategies.

What I didn't expect to happen:
    I expected a significant amount of pushback from upper management about metrics and numbers (passing rates, number of test cases run, etc.). After a very enlightening conversation with the company COO, he told us that if it slows us down, we should stop doing it. And we haven't been tracking test cases since.