Blueprint for Test Strategy Creation

Toyer Mamoojee - Oct 14 '22 - Dev Community

The context

Having a strategy or plan can be the key to unlocking success. This holds true in most contexts in life, whether sport, business, education, or much more, and it is equally true for any company or organisation that delivers software or application solutions to its end users and customers. Narrow that down from engineering to agile delivery, and then further to testing or quality engineering, and strategy and planning remain key at every level.

Let’s zoom into the software testing or quality engineering context. Being tasked with creating a test strategy can be extremely daunting, whether at a product, domain, project, or company-wide level, as there are many aspects and factors to consider and cover, which could be enough to strike fear into the heart of anyone who needs to carry out the task. In this post, I will hopefully make the task much easier by setting out a potential blueprint, and the areas to consider, when drawing up a test strategy.

The plan

Keeping in mind that there are many different contexts, settings, technologies, and structures that different people operate in, coming up with a generic test strategy might seem impossible. However, if one zooms out and extracts the key factors to consider, some paths to a tailored test strategy become clearly visible. Over the years, my experience across many different industries, company sizes, methodologies, and technologies has allowed me to extract the factors to trace in order to gain maximum coverage in a test strategy.

As quality engineers and testers, the expectation is to gain as much test coverage as possible across every area, to ultimately ensure maximum quality of the solutions delivered. So where do we begin?

When tackling test strategy creation, I usually try to visualize what I am trying to cover by using tools to brainstorm ideas. I start my visualization on a blank canvas and then expand to cover the following key areas:

  1. People

  2. Process

  3. Technology

Once I have these, I expand further by applying the 5W1H concept to each area above: typically I ask What, Why, Who, When, Where, and How? of each one, although this step is optional. It helps me validate and support my approach, and ensures that every possible gap is covered when addressing an ideal test strategy.
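As a rough illustration, the 5W1H expansion over the three focus areas can be sketched as a simple cross-product checklist. The area and question lists come straight from the approach above; the data structure itself is just one hypothetical way to hold them.

```python
# Sketch: expanding the three focus areas with the 5W1H prompts.
AREAS = ["People", "Process", "Technology"]
FIVE_W_ONE_H = ["What", "Why", "Who", "When", "Where", "How"]

def build_checklist(areas, questions):
    """Cross every focus area with every 5W1H prompt."""
    return {area: [f"{q}?" for q in questions] for area in areas}

checklist = build_checklist(AREAS, FIVE_W_ONE_H)
for area, prompts in checklist.items():
    print(area, prompts)
```

The output is simply a starting canvas: eighteen prompts you then answer for your own context.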


The strategy evolving

  1. People: I usually like to start my strategy with people, as they are key to any successful strategy. Applying the 5W1H technique to support my thinking, it could start to shape up like this:

  • What are the expectations of our QA engineers? What will they be covering? What are the different roles we would have in our QA team/chapter? This helps you make sure you know what you need to cover.

  • Why do we need them to cover these roles and tasks? Here you validate why you are doing certain things.

  • Who will carry out the work? This helps you address any shortages, and shows how you can strategically structure the team to be effective.

  • When are the QA engineers supposed to carry out their tasks? This puts the focus on, for example, the stages at which more QA involvement is required.

  • Where are our QA engineers based? Where should they be physically located? Where will we source QA engineers from, if we need to hire? This covers everything from the physical location of the QA team/chapter, so we can work effectively, to where we would actually hire from.

  • How is the QA chapter structured? And how will they move across projects?

2. Process: In this section, the focus is on practical aspects and regular day-to-day activities. When it comes to Process and Technology/Tools, I split each further into two sub-sections, ‘Delivery’ and ‘Craft’. This ensures more coverage of a practical nature. You could once again apply the 5W1H technique to each area here, but I will not go into that level of detail in the next two sections, as I covered the concept above.

1- Delivery: Within this section, I focus on the actual implementation aspects. Whatever process/framework is used, I try to cover the practical aspects of QA and testing related to it. For example, if Scrum is used, I would cover the following related to the QA and test process:

Sprint Level:

  • Pre-sprint:

  • What are the entry criteria required for testing?

  • In-sprint:

  • QA scenario identification and documentation, with traceability back to stories

  • In-sprint test automation

  • Defect logging cycle and process, defect dashboards

  • Exit criteria

  • Post-sprint:

  • Post-sprint QA activities, risks
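To make the entry and exit criteria idea concrete, here is a minimal sketch of a sprint quality gate. The criteria names are hypothetical placeholders, not a prescribed definition of done; the point is simply that a gate passes only when every listed criterion is met.

```python
# Sketch of an entry/exit quality gate for a sprint.
# Criteria names below are hypothetical -- adapt to your own process.
ENTRY_CRITERIA = ["stories refined", "acceptance criteria agreed", "test environment ready"]
EXIT_CRITERIA = ["scenarios executed", "automation updated", "open defects triaged"]

def gate_passed(criteria, completed):
    """Return (passed, missing): passed only if every criterion is met."""
    missing = [c for c in criteria if c not in completed]
    return (len(missing) == 0, missing)

ok, missing = gate_passed(ENTRY_CRITERIA, {"stories refined", "test environment ready"})
print(ok, missing)  # the gate fails and reports what is outstanding
```

A dashboard or checklist template serves the same purpose; what matters is that the criteria are explicit and checked at the same points every sprint.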

We can further expand to other aspects of delivery, such as releases, and/or other delivery enablers such as test environments, test data, etc.

  • Release Level:

  • What are QA’s sign-off criteria for releases?

  • Pre-release QA and testing activities

  • Post-release QA and testing activities

  • Release QA checklist

  • Release/Launch risks/mitigation suggestions

  • Test environments (this could either be part of the delivery or the technology sections):

  • Define the test environments where specific types of testing will occur

  • Availability of the test environments and the pre-requisites required for sign-off

  • Test Data (this could either be part of the delivery or the technology sections):

  • Aspects covering test data

  • Test data availability, masking, and anonymization.

  • Documentation and results:

  • How are we planning to keep results, in what format, and for how long?

  • Is there a legal requirement to keep them?
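The test data masking and anonymization point above can be sketched in a few lines. This assumes a hypothetical record shape with name and email fields; real anonymization needs a proper policy and tooling, but the core idea is to keep data realistic while removing anything personally identifiable.

```python
# Minimal test-data masking sketch: hypothetical record shape,
# stable hashing so masked data stays consistent across runs.
import hashlib

def mask_email(email):
    """Replace the local part with a stable hash, keep the domain."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"{digest}@{domain}"

def anonymize(record):
    masked = dict(record)  # leave the original record untouched
    masked["name"] = "REDACTED"
    masked["email"] = mask_email(record["email"])
    return masked

row = {"name": "Jane Doe", "email": "jane@example.com", "plan": "trial"}
masked = anonymize(row)
print(masked)
```

Hashing rather than randomising keeps referential integrity: the same source email always maps to the same masked value, which matters when test data spans several tables or systems.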

2- Craft: To complement the thinking above, the next step to ensure full coverage is to define aspects of QA/testing as a craft. This incorporates best practices of the QA/testing discipline into the custom delivery aspects (which vary from company to company). Once again, you can supplement the thought process with the 5W1H technique here too. Some of the key aspects to cover or keep in mind here are:

  • Test levels: Are we looking to cover unit tests, component tests, integration tests, and UI tests? If so, who would be responsible for each? How will it be done? At which stage of the process would each type of test happen (when)? Where will it happen? What will be covered by each type?

  • Types of testing: I split this into functional and non-functional testing, covering aspects such as test documentation (if any) and exploratory testing on the functional side, and load/performance/security/accessibility testing etc. on the non-functional side. I later overlay and complement this part with the actual tools/tech used for each test type.
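To illustrate the difference in scope between test levels, here is a toy sketch. The price-calculator functions are invented purely for the example; the point is that a unit test exercises one function in isolation, while a component/integration test exercises units working together.

```python
# Toy example contrasting two test levels (functions are hypothetical).
def net_price(gross, discount):
    """Unit under test: price after a fractional discount."""
    return round(gross * (1 - discount), 2)

def checkout(items, discount):
    """Component combining units: total a basket, then discount it."""
    return net_price(sum(items), discount)

# Unit level: one function, in isolation.
assert net_price(100.0, 0.2) == 80.0

# Component/integration level: units working together.
assert checkout([40.0, 60.0], 0.2) == 80.0
print("both levels pass")
```

Answering the who/when/where questions per level (e.g. developers at commit time for unit tests, QA in-sprint for integration and UI tests) is what turns this classification into strategy.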

3. Technology and Tools: Finally, the last focus area I consider as part of my planning is the Technology and Tools side of a comprehensive test strategy, from a QA/testing point of view. Once again, to ensure no stone is left unturned, I incorporate the two sections mentioned above, i.e. ‘Delivery’ and ‘Craft’, and validate my thinking once more with the 5W1H technique.


1- Delivery: As mentioned above, in this section I focus on practical aspects. A few points related to delivery from a tech and tool perspective include:

  • What tool(s) am I using to capture my test scenarios/cases (e.g. TestRail, XRay)?

  • What tools/software will I use to send my API requests?

  • What other tools would be needed to support my test delivery tasks and activities?

  • Test automation tools required for each test type (based on test types mentioned)

  • What access is required for logs/monitoring to support testing?

2- Craft: From a tech and tool perspective, I usually focus on best practices in these areas, which again might be totally independent of the company and can be thought of as plug-and-play concepts for tech/tool implementation. I could, for example, split further here into test automation and technical testing, then have further sub-categories of test automation covering functional and non-functional tool-specific best practices. Some of these might include:

  • Functional test automation

  • UI coverage: Selenium WebDriver following a BDD style (then further elaborate on the details)

  • API coverage: JS Jest with SuperTest, then go into further detail

  • Mobile test automation: Appium, etc.

  • Non-functional test automation

  • Load/performance tests: JMeter, with naming best practices around this area

  • Security testing: OWASP, plus expanding on best practices here

  • Test automation execution

  • Cloud execution: LambdaTest, providing further details here

  • Technical testing aspects:

  • Best practices for API testing with Swagger, Postman, etc.

  • Testing message brokers such as RabbitMQ, Kafka, etc.
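As a toy illustration of the load/performance idea, in the spirit of (but in no way a replacement for) a JMeter plan, the sketch below measures latency under concurrent calls. A local stand-in function takes the place of the system under test; a real run would target an actual endpoint with a dedicated tool.

```python
# Toy concurrent load sketch: a local function stands in for the
# system under test, and we report simple latency statistics.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def system_under_test():
    time.sleep(0.01)  # simulate ~10 ms of service latency
    return "ok"

def timed_call(_):
    start = time.perf_counter()
    system_under_test()
    return time.perf_counter() - start

# 20 requests spread across 5 concurrent workers.
with ThreadPoolExecutor(max_workers=5) as pool:
    latencies = list(pool.map(timed_call, range(20)))

print(f"p50={statistics.median(latencies) * 1000:.1f} ms, "
      f"max={max(latencies) * 1000:.1f} ms over {len(latencies)} calls")
```

Even a sketch like this surfaces the questions a real tool answers properly: how many concurrent users, which percentiles matter, and what thresholds constitute a pass.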


Conclusion

A test strategy can feel like a massive spiralling portal that takes you down paths through many different dimensions. However, if the aspects mentioned above are followed, solid progress on test strategy coverage is much closer. The approach above can be used to kick off and create the content that goes into your final test strategy (in your chosen format); by touching on each of the sections mentioned, and validating each with the 5W1H technique, you should be well on your way to creating a solid test strategy.
