“If you fail to plan, you are planning to fail.” — Benjamin Franklin
Just as any software development project should have a project plan to guide it to a successful outcome, it should also have a test plan to ensure adequate and effective verification of what is being developed, as part of the overall project quality assurance. Developing the test plan should be a collaborative effort by the project team, beginning with general project and testing information, and the project testers should be involved from the very start.
Essential components of a test plan include the following.
Project Overview
It is very important for the development team, including the project testers, to understand the overall project. This overview sets the stage for the project and for every testing round.
- What is the purpose of this software, business or otherwise?
- What need does it fulfill?
- How does this software fit into the overall picture?
- Who are the stakeholders in the project?
- Who are the decision-makers?
- Who are the development team members and what are their roles in the project?
Testing Approach
Describe the general testing approach for the project.
- What testing methodologies are used?
- What types of testing rounds are performed, and at what frequency?
- How is stress or load testing done?
- What is the project tester’s general availability?
- How much time per week does the project tester spend on this project?
- How does the project tester coordinate with the rest of the development team?
- If there are multiple project testers, describe how they coordinate efforts to ensure full testing coverage.
Assumptions
List any general assumptions that may be useful in testing the software.
- Who is the intended user?
- Who else might conceivably use the software in the future?
- Who is the intended beneficiary of the software?
Dependencies
Document any dependencies for the project that will affect testing. For example, list any server, database, web service, etc. that must be running in order to perform a testing round.
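To illustrate, the documented dependencies can be checked automatically before a testing round begins. The host names, ports, and URLs below are hypothetical placeholders, not part of any real test plan; a minimal sketch in Python using only the standard library:

```python
import socket
import sys
import urllib.request

# Hypothetical dependencies; replace with the servers, databases, and
# web services that the test plan actually documents.
TCP_DEPENDENCIES = {
    "application server": ("app.example.test", 8080),
    "database": ("db.example.test", 5432),
}
HTTP_DEPENDENCIES = {
    "web service": "http://api.example.test/health",
}

def main() -> int:
    failures = []
    for name, (host, port) in TCP_DEPENDENCIES.items():
        try:
            # A plain TCP connect confirms the service is at least listening.
            with socket.create_connection((host, port), timeout=5):
                pass
        except OSError as exc:
            failures.append(f"{name} ({host}:{port}): {exc}")
    for name, url in HTTP_DEPENDENCIES.items():
        try:
            # urlopen raises an error for unreachable hosts and HTTP error statuses.
            with urllib.request.urlopen(url, timeout=5):
                pass
        except OSError as exc:
            failures.append(f"{name} ({url}): {exc}")
    if failures:
        print("Dependencies not ready:")
        for failure in failures:
            print(f"  - {failure}")
        return 1
    print("All documented dependencies are up; the testing round can begin.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

Running such a check at the start of each round saves time chasing failures that are really just a database or web service that was never started.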
Testing Environment
List everything needed to set up the testing environment:
- All supported versions of any operating systems, browsers, applications, etc. for this software
- Any supported devices, along with required storage space, memory, processor speed, screen resolutions, special hardware requirements, etc.
- The most common configurations, noting whether the remaining configurations will only be spot-checked
- Any server URLs, user names, passwords, etc. needed to test the software
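As one way to make this section concrete, the supported configurations can be kept as structured data and turned into a per-round checklist. The platforms and version numbers below are hypothetical; a minimal sketch in Python:

```python
from dataclasses import dataclass

@dataclass
class Configuration:
    """One supported configuration from the testing environment matrix."""
    os: str
    browser: str
    spot_check_only: bool = False  # True for less common configurations

# Hypothetical environment matrix; replace with the project's supported set.
CONFIGURATIONS = [
    Configuration("Windows 11", "Chrome 126"),
    Configuration("Windows 11", "Firefox 127"),
    Configuration("macOS 14", "Safari 17"),
    Configuration("Ubuntu 24.04", "Chrome 126", spot_check_only=True),
]

def print_checklist() -> None:
    """Print the configurations to cover in a testing round."""
    for config in CONFIGURATIONS:
        note = " (spot-check only)" if config.spot_check_only else ""
        print(f"[ ] {config.os} / {config.browser}{note}")

if __name__ == "__main__":
    print_checklist()
```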
Testing Tools
List the testing tools used on the project, such as tools for automated testing, load testing, and screen capture. Specify the bug tracking tool, along with any access information and instructions for its use.
Testing Schedule
List all anticipated milestone names and dates. These may be adjusted as the project progresses, but for scheduling purposes it is preferable to list all milestones for the entire project as currently understood.
Build Acceptance Criteria
List the general acceptance criteria for all builds. Following are some examples.
Continuous Builds: Each source commit triggers an automated process that checks out the full source, runs the full unit testing suite, and builds the full project, reporting immediate feedback to the project team. The project manager ensures that any problems are corrected immediately.
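A continuous build like the one just described is normally configured in a CI server, but the essential steps can be sketched as a script to make the sequence clear. The repository URL, test command, and build command below are hypothetical placeholders; a minimal sketch in Python:

```python
import subprocess
import sys

# Hypothetical repository and commands; substitute the project's own.
REPO_URL = "https://example.test/project.git"
WORK_DIR = "ci-checkout"

STEPS = [
    # Check out the full source for the commit that triggered the build.
    ["git", "clone", "--depth", "1", REPO_URL, WORK_DIR],
    # Run the full unit testing suite.
    ["python", "-m", "pytest", WORK_DIR],
    # Build the full project (placeholder build command).
    ["python", "-m", "build", WORK_DIR],
]

def main() -> int:
    for step in STEPS:
        print("Running:", " ".join(step))
        result = subprocess.run(step)
        if result.returncode != 0:
            # Immediate feedback: report the failing step so it can be fixed right away.
            print("Build failed at step:", " ".join(step), file=sys.stderr)
            return result.returncode
    print("Build succeeded.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```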
Release Documentation: Release documentation is issued for every new build. It should include the build number and all changes and new features since the previous build. For a component release, it should indicate which areas are ready for testing and which are not. Any new bugs found during development and unit testing should be reported in the Known Issues section.
Sanity Testing: The project manager performs a brief testing round of every new build before releasing it to the project tester. Any obvious problems may result in rejection of the build.
References
List any reference documents or e-mails used in writing the test plan, generally including the requirements document and project plan.
Glossary
If nontechnical personnel may read the test plan, define any technical terms used in it.
Test Cases
A list of all the test cases for a test plan may be quite long, so it may be best to keep it in a database, with the following details for each test case.
- Test Case ID
- Milestone
- Feature
- Corresponding Development Task ID
- Description
- Steps to Test
- Expected Outcome
- OS+Browser Version
- Pass/Fail
- Test Dates
The test case shouldn’t include any defect reports, which belong in the defect tracking tool. It is useful to simply sort the list of test cases by Pass/Fail to quickly determine if a build should be rejected.
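As an illustration of keeping these details in a database, the fields above map directly onto a simple table. The table and column names below are hypothetical; a minimal sketch in Python using the standard-library sqlite3 module:

```python
import sqlite3

# Hypothetical schema mirroring the test case fields listed above.
SCHEMA = """
CREATE TABLE IF NOT EXISTS test_cases (
    test_case_id     TEXT PRIMARY KEY,
    milestone        TEXT,
    feature          TEXT,
    dev_task_id      TEXT,
    description      TEXT,
    steps_to_test    TEXT,
    expected_outcome TEXT,
    os_browser       TEXT,
    pass_fail        TEXT,  -- 'Pass', 'Fail', or NULL if not yet run
    test_date        TEXT
);
"""

def build_should_be_rejected(connection: sqlite3.Connection) -> bool:
    """Filter on Pass/Fail to see quickly whether the build has any failures."""
    cursor = connection.execute(
        "SELECT test_case_id FROM test_cases WHERE pass_fail = 'Fail'"
    )
    failures = [row[0] for row in cursor.fetchall()]
    for test_case_id in failures:
        print("Failed:", test_case_id)
    return bool(failures)

if __name__ == "__main__":
    conn = sqlite3.connect("test_plan.db")
    conn.executescript(SCHEMA)
    print("Reject build:", build_should_be_rejected(conn))
```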
Test Data
Include any data required or helpful for testing.
- Database tables to preload
- User names and passwords with specific roles or data sets
- Input data to test edge or other special cases
- Input files for the software to operate on
- Output files to compare with testing results
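To illustrate, the items above can be wired into a small setup helper that preloads a table and compares an output file against the expected result. The file names, table, and columns below are hypothetical fixtures; a minimal sketch in Python using only the standard library:

```python
import csv
import filecmp
import sqlite3

# Hypothetical test data files; replace with the project's own fixtures.
USERS_CSV = "test_data/users.csv"                   # table rows to preload
EXPECTED_OUTPUT = "test_data/expected_case_01.txt"  # stored expected result
ACTUAL_OUTPUT = "results/output_case_01.txt"        # produced by the software under test

def preload_users(connection: sqlite3.Connection) -> None:
    """Load the users table, including accounts with specific roles or data sets."""
    connection.execute(
        "CREATE TABLE IF NOT EXISTS users (user_name TEXT, password TEXT, role TEXT)"
    )
    with open(USERS_CSV, newline="") as handle:
        rows = [(r["user_name"], r["password"], r["role"]) for r in csv.DictReader(handle)]
    connection.executemany("INSERT INTO users VALUES (?, ?, ?)", rows)
    connection.commit()

def output_matches_expected() -> bool:
    """Compare the software's output file with the stored expected result."""
    return filecmp.cmp(ACTUAL_OUTPUT, EXPECTED_OUTPUT, shallow=False)

if __name__ == "__main__":
    preload_users(sqlite3.connect("test.db"))
    print("Output matches expected:", output_matches_expected())
```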
Following agile software development methodologies, a just-in-time approach is effective for developing a test plan. As milestones, features, and tasks are solidified in the project plan, corresponding detail should be added to the test plan. Individual test cases may be added as the corresponding features are finalized and their implementation begins.
Adding test cases before the corresponding features are finalized may waste time and budget (or worse) as project details continue to evolve. And of course, waiting too long to add test cases for finalized features leaves the test plan useless.
A good test plan helps testers adequately verify the project and helps avoid a project disaster. It guides all testing activities for the project, formal or informal, including acceptance testing, regression testing, user testing, and more.