
Tuesday, January 13, 2009

Details (sub-parts) of a test plan

A software test plan is a document critical to any software project. It is the reference for the process that QE follows during testing. Having a software test plan is a key requirement of quality processes, and the document itself takes time to prepare (given its importance for the overall process). So let us talk in some more detail about what a test plan should look like:
The software project test plan describes properties of the testing effort such as the objectives, scope, approach, and focus. One advantage of preparing a test plan is that it provides a useful way to think through the effort needed to validate the acceptability of a software product; a consequence is that the completed document helps people outside the test group understand the 'why' and 'how' of product validation. Such a document needs to be thorough enough to be useful, yet succinct enough that people outside the test group will actually read it. The following are some of the items that might be included in a test plan, depending on the particular project:
• Title
• Identification of software including version/release numbers: Identifies the software for which this document captures the test process
• Revision history of document including authors, dates, approvals: A document could go through many versions, so this field captures the current version number
• Table of Contents: Very useful, and needed for most documents
• Purpose of document, intended audience
• Objective of testing effort: What the goal of the testing effort should be
• Software product overview
• Relevant related document list, such as requirements, design documents, other test plans, etc.: A test plan by itself is not complete, since a project has different areas covered by different documents
• Relevant standards or legal requirements
• Traceability requirements: Traceability determines how the various requirements are mapped to the different test plans
• Relevant naming conventions and identifier conventions: Very useful for people not involved in the preparation of this test plan
• Overall software project organization and personnel/contact-info/responsibilities: Such a section provides contact details of the key people in the group
• Test organization and personnel/contact-info/responsibilities: The same as above, but covers people in the testing organization
• Assumptions and dependencies: Assumptions can make a lot of difference to the success and failure of a project (and its test plan) and need to be carefully validated
• Project risk analysis: Risk analysis provides a good list of items that could cause risk to the project
• Testing priorities and focus: Any testing process has certain areas of focus (high risk areas, high impact areas, high change areas), and these need to be highlighted
• Scope and limitations of testing: Some parts of the software may not be possible to cover in testing; this section identifies them
• Test outline - a breakdown of the test approach by test type, feature, functionality, process, system, module, etc. as applicable
• Data structures - Outline of data input, equivalence classes, boundary value analysis, error classes
• Details of the Test environment - hardware, operating systems, other required software, data configurations, interfaces to other systems; all these items should be detailed so that anybody picking up the test plan will have enough information
• Test environment validity analysis - differences between the test and production systems and their impact on test validity.
• Test environment setup and configuration issues
• Software migration processes
• Software CM processes: CM stands for Configuration Management
• Test data setup requirements
• Database setup requirements: If the software requires a database, then these instructions will be needed
• Outline of system-logging/error-logging/other capabilities, and tools such as screen capture software, that will be used to help describe and report bugs
• Discussion of any specialized software or hardware tools that will be used by testers to help track the cause or source of bugs
• Test automation - justification and overview
• Test tools to be used, including versions, patches, etc.
• Test script/test code maintenance processes and version control
• Problem tracking and resolution - tools and processes
• Project test metrics to be used
• Reporting requirements and testing deliverables
• Software entrance and exit criteria
• Initial sanity testing period and criteria
• Test suspension and restart criteria
• Personnel allocation
• Personnel pre-training needs: For some cases, people doing the testing may need some special training
• Test site/location: Where will the testing be done? For some specialized equipment, the testing would need to be done at the location where the equipment is based.
• Outside test organizations to be utilized and their purpose, responsibilities, deliverables, contact persons, and coordination issues
• Relevant proprietary, classified, security, and licensing issues - These are very important from a legal point of view
• Open issues
• Appendix - glossary, acronyms, etc.
All of these sections, when completed, will provide a test plan that should be the single document guiding the testing process.
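The "data structures" item in the list above (equivalence classes, boundary value analysis) can be made concrete with a small sketch. This is a minimal, hypothetical example: the `validate_age` function and its 0–120 range are illustrative assumptions, not part of the post.

```python
# Hypothetical function under test: accepts ages in the range 0..120 inclusive.
def validate_age(age):
    return 0 <= age <= 120

# Equivalence classes: one representative value per class is usually enough.
valid_representative = 30   # class: valid ages
invalid_low = -5            # class: below the valid range
invalid_high = 200          # class: above the valid range

# Boundary value analysis: test at and just outside each boundary,
# where off-by-one errors typically hide.
boundary_cases = {
    -1: False,   # just below the lower boundary
    0: True,     # lower boundary
    120: True,   # upper boundary
    121: False,  # just above the upper boundary
}

assert validate_age(valid_representative)
assert not validate_age(invalid_low)
assert not validate_age(invalid_high)
for value, expected in boundary_cases.items():
    assert validate_age(value) == expected
```

The idea is that a few well-chosen values (one per equivalence class, plus the boundaries) give nearly the same defect-finding power as exhaustively testing every input.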
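The traceability item above (mapping requirements to test plans and test cases) is often maintained as a simple matrix. A minimal sketch, using hypothetical requirement and test-case IDs:

```python
# Hypothetical requirement-to-test-case traceability matrix:
# each requirement ID maps to the test cases that cover it.
traceability = {
    "REQ-001": ["TC-001", "TC-002"],
    "REQ-002": ["TC-003"],
    "REQ-003": [],  # no tests yet: a coverage gap
}

# A quick audit: which requirements are not yet covered by any test?
uncovered = [req for req, tests in traceability.items() if not tests]
print(uncovered)  # → ['REQ-003']
```

Keeping this mapping in the test plan (or alongside it) makes coverage gaps visible early, before execution starts.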
