Process impact: This document
specifies "why" and "when" the testing will be done for this project. The "how" is described in the individual Test Cases
that this Test Plan links to.
Introduction
Why is this Test Plan needed?
A Test Plan is needed to focus limited project testing resources and expertise intelligently on the highest-risk areas of the
application and on the "bread and butter" functionality of the system. A Test Plan is also needed to determine when the testing
is complete.
What Test lessons were learned in previous releases?
None yet. This is the first release.
Test Plan Synopsis
Testing objectives and their rationale (risk, requirements, etc.)
The software system must print Completion Certificates correctly on the DSA continuous-form Certificate blanks,
as if the Certificates had been produced on a typewriter. Failure to do so puts the DSA at risk of fines, and
possible loss of its business license, imposed by state agencies.
The software system must assist DSA clerks in increasing the number of Completion Certificates produced
per unit time by a factor of four. Failure to do so defeats the original DSA automation intent of leveling
the staffing necessary to support current and increased Certificate production.
Scope and Limitations
Only the DSA procedures and activities supported directly by the new software system will be tested.
The adjunct manual procedures and activities of instructor certification and classroom material presentation are out of
testing scope for this project.
Testing strategy
Identify all the support software layers to be tested as well as the project software.
Identify the general testing approach to be taken for each software layer during each
development phase.
Cross-reference the testing strategy to the Test Cases to validate testing approach
coverage.
DSA Testing Strategy Chessboard
Sources of business expertise for test planning, test case development, and test execution
The DSA management will be the primary source for business expertise.
Sources of development expertise for test planning, test case development, and test execution
The CPI development team will be the primary source for technical expertise beyond
the test team's experience with this project's technology.
The Database Management software vendor will be a secondary source for technical expertise.
Test Plans for Design
Sources of test data for Design testing
A small sample of real class rosters and associated Completion Certificates from earlier this year
will be used for initial testing cycles to demonstrate software stability.
A full month's worth of real class rosters and associated Completion Certificates from earlier this year
will be used for full
software capability testing and software performance testing.
The list of all DSA certified instructors will be used to validate instructor code/name lookups and
instructor name use in the student certificate records.
The list of all DSA classroom sites will be used to validate roster input data edits.
The US Postal Service list of all US ZIP Codes will be used to validate roster input data edits.
The Social Security Administration list of valid Social Security number-component ranges will be used
to validate roster input data edits (a sketch of these data edits follows this list).
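As an illustration only, the roster input data edits against these reference lists could be automated during test preparation along the lines of the Python sketch below. The file names, column names, and the validate_roster_row function are hypothetical; they are not part of the DSA system or its test tools.

# Minimal sketch, assuming the reference lists are exported as CSV files.
# All file names, column names, and field names are illustrative placeholders.
import csv
import re

def load_reference_set(path, column):
    """Load one column of a reference CSV file into a set for fast lookups."""
    with open(path, newline="") as f:
        return {row[column].strip() for row in csv.DictReader(f)}

instructors = load_reference_set("dsa_instructors.csv", "instructor_code")
sites = load_reference_set("dsa_classroom_sites.csv", "site_code")
zip_codes = load_reference_set("us_zip_codes.csv", "zip")

# Format check only; a full edit would also check the number components
# against the Social Security Administration valid-range list.
SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def validate_roster_row(row):
    """Return the list of edit failures for one class roster row."""
    errors = []
    if row["instructor_code"] not in instructors:
        errors.append("unknown instructor code")
    if row["site_code"] not in sites:
        errors.append("unknown classroom site")
    if row["zip_code"] not in zip_codes:
        errors.append("invalid ZIP Code")
    if not SSN_PATTERN.match(row["ssn"]):
        errors.append("malformed Social Security number")
    return errors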
Design test environments and their management
The Design test environment will be two identically configured DSA workstations.
One test workstation will be designated a Clerk workstation and will be used for all Certificate
production testing.
One test workstation will be designated an Administrator workstation and will be used for all
testing that involves workstation/workstation communication like file transfers, Master Certificate
File searching, and archiving.
Test tools for functional and performance testing will be installed on the test environment
workstations.
How can you tell when you are ready to start Design testing?
The test environment has been set up and verified ready for test execution.
The test data for Design has been collected.
How can you tell when you are finished with Design testing?
100% of the Design Test Cases have been attempted.
100% of the Design Test Cases are successful.
100% of the Design defect backlog has been corrected and re-verified by testing.
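These three criteria can be checked mechanically from the test case statuses recorded in the testing schedule that follows. The Python sketch below is illustrative only and assumes a simple record format using the schedule legend's status codes; it is not part of the ReadySET templates.

# Hypothetical check of the Design testing completion criteria.
# Each test case record is a dict with at least a "status" key using the
# schedule legend codes: w, as, au, c, crt.
def design_testing_complete(test_cases):
    attempted    = [tc for tc in test_cases if tc["status"] != "w"]
    successful   = [tc for tc in test_cases if tc["status"] in ("as", "crt")]
    open_defects = [tc for tc in test_cases if tc["status"] in ("au", "c")]
    return (len(attempted) == len(test_cases)       # 100% of Test Cases attempted
            and len(successful) == len(test_cases)  # 100% successful (or corrected and retested)
            and not open_defects)                   # defect backlog corrected and re-verified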
Testing schedule during Design
This schedule represents the master schedule of testing activity that the test team must
manage.
This schedule is the composite of all test case documentation, execution, and reporting
schedules for all development phases.
Each schedule entry represents one Test Case and links to that Test Case in the Test Case Suite.
Each entry contains: Test Case ID, name, status (w = written, as = attempted and successful, au = attempted and unsuccessful, c = corrected, crt = corrected and retested), status date, and execution date. An example entry is sketched below.
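The field names and placeholder values in this example are assumptions for illustration, not a ReadySET requirement.

# Hypothetical representation of one testing-schedule entry.
schedule_entry = {
    "test_case_id":   "design-functional-001",  # also the name of the linked .html Test Case file
    "name":           "Certificate alignment on DSA continuous forms",
    "status":         "as",                      # w, as, au, c, or crt (see legend above)
    "status_date":    "YYYY-MM-DD",
    "execution_date": "YYYY-MM-DD",
}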
Test Case execution results analysis and reporting schedule
This schedule extends the test execution schedule to include time to analyze and report
test results.
Usually, some analysis time and minimal report-drafting time are associated with
each test execution, so that the testers can begin to see the nature of the testing
outcome early in the test execution schedule. Then, if the test execution needs to be
adjusted to accommodate surprises, there is schedule remaining in which to make
those tactical adjustments.
A first allotment of analysis and reporting time can range from a half day to a couple
of days per test case.
A second allotment of time is made for analysis of results across multiple test case
executions for more pervasive trends and completion of the drafted report. This
second allotment of time can range from two to four weeks depending on the size and
complexity of the testing project.
Test Case summary list (ID, title, brief description)
This list replaces test-suite in the ReadySET standard templates.
This is the summary list of test cases that you currently think are needed to adequately
cover all the testing situations in this development phase.
Each Test Case ID in this table should ultimately point to a .html file of the same name.
This list is refined a number of times as more details about the application design
become available.
Consider using a Test Case ID naming standard like [devel phase]-[test approach]-[sequence number].
This naming standard will cause the test case .html files to be sorted first by development
phase, next by test approach (functional, structural, performance), and finally by sequence.
This naming standard also allows sequence numbers to be reused across development phases (a minimal sketch of the standard follows).
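In the sketch below, the helper function name and the three-digit zero padding are assumptions; only the [devel phase]-[test approach]-[sequence number] format comes from the suggested standard.

# Hypothetical helper that builds Test Case IDs of the form
# [devel phase]-[test approach]-[sequence number], e.g. "design-functional-003".
def test_case_id(phase, approach, sequence):
    return f"{phase}-{approach}-{sequence:03d}"

# Sorting the IDs (or the matching .html file names) as plain strings groups them
# by development phase, then by test approach, then by sequence number.
ids = sorted(test_case_id("design", approach, n)
             for approach in ("functional", "performance", "structural")
             for n in range(1, 4))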
Have human resources been allocated to carry out the Test activities?
Yes, human resources have been allocated. They are listed in
the Resource Needs document.
No, human resources have not been allocated. They are listed as
"pending" in the Resource Needs document.
Have machine and software resources been allocated as needed for
the Test activities?
Yes, the Test team will use desktop machines and servers that are
configured like those of a "typical" business user.
Yes, a Test Lab has been set up. The needed machine and software
resources are listed in the Resource
Needs document.
No, needed machine and software resources are listed as pending
in the Resource Needs document.
Has this Test Plan been communicated to the development team, business expert team, and
other stakeholders?
Yes, everyone is aware of our prioritized quality goals for this
release and understands how their work will help achieve those
goals. Feedback is welcome.
Yes, this document is being posted to the project website.
Feedback is welcome.
No, some developers or business experts are not aware of the testing goals, test activities, or
testing schedules for this release. This is a risk that is noted in the Risk Management section
of the Project Plan.