Towards the end of last year (2009) I was delivering a one-day training course on Agile Test Management, and as we were wrapping up one of the students asked, "What do you do if there are no documented requirements?" It's a question that has been asked many a time, and I came across the answer I gave well before I understood what Agile was.
I was working on a project that I now understand was using one of the Agile methodologies, Feature Driven Development. I clearly remember enjoying a coffee with the test manager mid-afternoon and discussing our approach to documenting the test cases and our analysis of the requirements specifications; one of the comments has stuck with me ever since: "Well, you know that documenting test cases is really just further derivation of the requirements, because by documenting positive and negative tests you are refining the requirement to say that not only does the system act in accordance with the requirement, but it also does or doesn't behave in line with the test conditions you've specified." Now, several years on, I have found many a time that this is exactly the case.
Whilst testing on a subsequent Agile project, this time following a Scrum methodology, I found that it was my test cases that modelled much of the system's behaviour. Where the story said something like "must be able to save an Agreement with the minimum data set: date, amount, client...", it was the testing I did with fields missing, maximum text entered, alphabetic characters in numeric fields and the like that formed the basis of the error checking implemented. So the derivation of the requirement now meant that the requirement was "must be able to save an Agreement with the minimum data set, including a positive integer value for the agreement dollar amount and valid Australian-format dates in the future for the beginning and end of the agreement." All of these test cases were derived from the fields present on the screen and a little discussion with the business and the developers, and they read quite naturally as executable checks, as the sketch below shows.
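To make that concrete, here is a minimal sketch of what those derived-requirement test cases might look like as pytest checks. The `save_agreement` function, field names and error messages are all hypothetical stand-ins for the system under test on that project, not its actual code.

```python
from datetime import date, timedelta

import pytest

# Hypothetical validation routine standing in for the real "save Agreement"
# screen; returns a list of error messages, empty when the save succeeds.
def save_agreement(client, amount, start_date, end_date):
    errors = []
    if not client:
        errors.append("client is required")
    if not isinstance(amount, int) or amount <= 0:
        errors.append("amount must be a positive integer")
    today = date.today()
    if start_date <= today or end_date <= today:
        errors.append("agreement dates must be in the future")
    if end_date <= start_date:
        errors.append("end date must follow start date")
    return errors

TOMORROW = date.today() + timedelta(days=1)
NEXT_YEAR = date.today() + timedelta(days=365)

def test_minimum_valid_data_saves():
    # Positive test: the minimum data set with valid values should save.
    assert save_agreement("Acme Pty Ltd", 1000, TOMORROW, NEXT_YEAR) == []

@pytest.mark.parametrize("client, amount", [
    ("", 1000),             # field missing: no client supplied
    ("Acme Pty Ltd", -5),   # negative dollar amount
    ("Acme Pty Ltd", "x"),  # alphabetic characters in a numeric field
])
def test_invalid_minimum_data_is_rejected(client, amount):
    # Negative tests: each one documents a refinement of the original story.
    assert save_agreement(client, amount, TOMORROW, NEXT_YEAR) != []
```

Each negative case here is doing exactly what the test manager described: refining the story into a statement of what the system must not accept.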
My answer to the student also included another school of thought that I subscribe to: "requirements are everywhere and are not always contained in a BRS document". One of the lessons I learnt early on was that, in the absence of specific system documentation, there are many other sources of system requirements. The system itself contains almost all of the requirements in implemented form, and by exercising it in a structured manner you will be able to discover them. A simple example I use is to just click the Save or Submit button on a screen: there is a pretty good chance you'll find out what the minimum data requirements to save are!
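As a sketch of that technique, the snippet below probes a hypothetical web form by submitting it empty and reading back the validation messages; the URL, form fields and error markup are all assumptions for illustration, so you would swap in whatever your system actually exposes.

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

# Hypothetical endpoint behind the Save button; substitute the real form URL.
SAVE_URL = "http://test-env.example.com/agreement/save"

# Submit the form with no data at all and harvest the validation errors:
# each message is, in effect, a discovered requirement.
response = requests.post(SAVE_URL, data={})
soup = BeautifulSoup(response.text, "html.parser")
for error in soup.select(".field-error"):  # assumed CSS class for errors
    print("Discovered requirement:", error.get_text(strip=True))
```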
Another potentially untapped requirements repository is the system's Help and Training documentation, which will often tell you what to do as well as what not to do. My response on the day also included speaking to the 'power users' of the system, asking them how they use it on a day-to-day basis, and then reverse engineering the requirements from there.
While researching another presentation I re-read the marvellous book "Lessons Learned in Software Testing" by Kaner, Bach and Pettichord (2002) and came across lesson 179: "Take advantage of other sources of information. You aren't helpless if no one gives you a specification. Plenty of other sources of information can help you steer your thinking..." The lesson goes on to list many sources of information to assist with testing, and it further demonstrates my point that requirements are abundant; you just have to know where to look.
The final part of my response was to leverage the knowledge capital in your organisation by having some of the more experienced people on the team or project review and agree to your test cases, thereby testing your derived requirements.