Test Plan

Release Information

Project: Drive Safe America 2.1
Internal Release Number: 1.0.0
Related Documents:
LINKS TO RELEVANT STANDARDS
LINKS TO OTHER DOCUMENTS
Process impact: This document specifies "why" and "when" the testing will be done for this project. The "how" is described in the individual Test Cases that this Test Plan links to.

Introduction

Why is this Test Plan needed?
A Test Plan is needed to focus limited project testing resources and expertise on the highest-risk areas of the application and on the "bread and butter" functionality of the system. A Test Plan is also needed to determine when the testing is done.
What Test lessons were learned in previous releases?
None yet. This is the first release.

Test Plan Synopsis

  1. Testing objectives and their rationale (risk, requirements, etc.)
    • The software system must print Completion Certificates correctly on the DSA continuous-form Certificate blanks, as if the Certificates had been produced on a typewriter. Failure to do so puts the DSA at risk of fines and possible loss of its business license imposed by state agencies.
    • The software system must assist DSA clerks in increasing the number of Completion Certificates produced per unit time by a factor of four. Failure to do so defeats the original DSA automation intent of leveling the staffing necessary to support current and increased Certificate production.
  2. Scope and Limitations
    • Only the DSA procedures and activities supported directly by the new software system will be tested.
    • The adjunct manual procedures and activities of instructor certification and classroom material presentation are out of testing scope for this project.
  3. Testing strategy
    • Identify all the support software layers to be tested as well as the project software.
    • Identify the general testing approach to be taken for each software layer during each development phase.
    • Cross-reference the testing strategy to the Test Cases to validate testing approach coverage.
    • DSA Testing Strategy Chessboard (an illustrative sketch follows this synopsis)

  4. Sources of business expertise for test planning, test case development, and test execution
    • The DSA management will be the primary source for business expertise.
  5. Sources of development expertise for test planning, test case development, and test execution
    • The CPI development team will be the primary source for technical expertise beyond the test team's experience with this project's technology.
    • The Database Management software vendor will be a secondary source for technical expertise.
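
  To make item 3 concrete, the following is a minimal sketch of a testing strategy chessboard expressed as a data structure: software layers on one axis, development phases on the other, and the planned test approaches in each cell. The layer names, phase names, and approach assignments below are illustrative assumptions, not the actual DSA chessboard.

        # Illustrative sketch of a testing strategy "chessboard". All layer,
        # phase, and approach names are assumptions for illustration only.
        chessboard = {
            "DSA application": {"Design": "functional", "Construction": "functional, performance"},
            "DBMS layer":      {"Design": "structural", "Construction": "structural"},
            "Workstation/OS":  {"Design": "structural", "Construction": "structural, on change only"},
        }

        for layer, phases in chessboard.items():
            for phase, approaches in phases.items():
                print(f"{layer:<16} | {phase:<12} | {approaches}")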

Test Plans for Design

  1. Sources of test data for Design testing
    • A small sample of real class rosters and associated Completion Certificates from earlier this year will be used for initial testing cycles to demonstrate software stability.
    • A full month's worth of real class rosters and associated Completion Certificates from earlier this year will be used for full software capability testing and software performance testing.
    • The list of all DSA certified instructors will be used to validate instructor code/name lookups and instructor name use in the student certificate records.
    • The list of all DSA classroom sites will be used to validate roster input data edits.
    • The Post Office list of all US ZIP Codes will be used to validate roster input data edits.
    • The US Social Security office list of valid Social Security number-component ranges will be used to validate roster input data edits.
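
    As one illustration of how these reference lists drive the roster input data edits, here is a minimal validation sketch in Python. The field names, the in-memory lists, and the Social Security number component rules shown are assumptions for illustration, not the actual DSA edit specifications.

        # Minimal sketch of roster input edits against the reference lists
        # named above. All field names and rules here are illustrative
        # assumptions, not the actual DSA edit specifications.
        VALID_ZIPS = {"30303", "30305", "30309"}   # from the Post Office ZIP Code list
        VALID_SITES = {"ATL-01", "ATL-02"}         # from the DSA classroom site list

        def edit_roster_row(row: dict) -> list:
            """Return the list of edit failures for one roster row (empty = clean)."""
            errors = []
            if row["zip"] not in VALID_ZIPS:
                errors.append("unknown ZIP Code " + row["zip"])
            if row["site"] not in VALID_SITES:
                errors.append("unknown classroom site " + row["site"])
            area, group, serial = row["ssn"].split("-")
            # Hypothetical component-range checks; the real ranges come from
            # the Social Security office list named above.
            if not (1 <= int(area) <= 899) or int(area) == 666:
                errors.append("SSN area number out of valid range")
            if int(group) == 0 or int(serial) == 0:
                errors.append("SSN group/serial numbers may not be zero")
            return errors

        print(edit_roster_row({"zip": "30303", "site": "ATL-01", "ssn": "123-45-6789"}))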
  2. Design test environments and their management
    • The Design test environment will be two identically configured DSA workstations.
    • One test workstation will be designated a Clerk workstation and will be used for all Certificate production testing.
    • One test workstation will be designated an Administrator workstation and will be used for all testing that involves workstation-to-workstation communication, such as file transfers, Master Certificate File searching, and archiving.
    • Test tools for functional and performance testing will be installed on the test environment workstations.
  3. How can you tell when you are ready to start Design testing?
    • The test environment has been set up and verified ready for test execution.
    • The test data for Design has been collected.
  4. How can you tell when you are finished Design testing?
    • 100% of the Design Test Cases have been attempted.
    • 100% of the Design Test Cases are successful.
    • 100% of the Design defect backlog has been corrected and re-verified by testing.
  5. Testing schedule during Design
    • This schedule represents the master schedule of testing activity that the test team must manage.
    • This schedule is the composite of all test case documentation, execution, and reporting schedules for all development phases.
  • Each schedule entry represents one Test Case -> link to Test Case Suite
  • Each entry records: test case ID, name, status (w = written, as = attempted/successful, au = attempted/unsuccessful, c = corrected, crt = corrected/retested), status date, and execution date
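
  A minimal sketch of how these schedule entries and status codes could be tracked, and how the "100% attempted / 100% successful" completion criteria in item 4 above might be computed from them. The record layout and the sample data are illustrative assumptions.

        # Minimal sketch of schedule-entry tracking using the status codes
        # above. The sample entries and dates are illustrative assumptions.
        from dataclasses import dataclass

        @dataclass
        class ScheduleEntry:
            case_id: str        # e.g. "FTPOS-03.0"
            name: str
            status: str         # w, as, au, c, or crt
            status_date: str
            execution_date: str

        entries = [
            ScheduleEntry("FTPOS-03.0", "Roster Input (positive)", "as", "wk 1", "wk 1"),
            ScheduleEntry("FTNEG-03.0", "Roster Input (negative)", "au", "wk 1", "wk 1"),
            ScheduleEntry("FTDAT-03.0", "Roster Input ADDs",       "w",  "wk 1", ""),
        ]

        attempted  = [e for e in entries if e.status in ("as", "au", "crt")]
        successful = [e for e in entries if e.status in ("as", "crt")]
        print(f"{len(attempted)}/{len(entries)} attempted, "
              f"{len(successful)}/{len(entries)} successful")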
  • Test Case execution results analysis and reporting schedule
    • This schedule extends the test execution schedule to include time to analyze and report test results.
    • Usually, some analysis time and minimal report-drafting time are associated with each test execution, so that the testers can begin to see the nature of the testing outcome early in the test execution schedule. If adjustments then need to be made in the test execution to accommodate surprises, there is test execution schedule remaining for those tactical adjustments.
    • A first allotment of analysis and reporting time can range from a half day to a couple of days per test case.
    • A second allotment of time is made for analysis of results across multiple test case executions for more pervasive trends and completion of the drafted report. This second allotment of time can range from two to four weeks depending on the size and complexity of the testing project.
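
  A back-of-the-envelope sketch of what these two allotments imply for the total analysis and reporting budget. The test case count and the working-day conversions are illustrative assumptions.

        # Hypothetical analysis/reporting budget from the allotment ranges
        # above; the test case count is an illustrative assumption.
        test_cases = 40
        first_low,  first_high  = 0.5 * test_cases, 2.0 * test_cases  # per-case allotment
        second_low, second_high = 2 * 5, 4 * 5                        # 2-4 weeks of 5 days

        print(f"Total analysis and reporting budget: "
              f"{first_low + second_low:.0f} to {first_high + second_high:.0f} working days")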
  • Test Case summary list (ID, title, brief description)
    • This list replaces test-suite in the ReadySET standard templates.
    • This is the summary list of test cases that you currently think are needed to adequately cover all the testing situations in this development phase.
    • Each Test Case ID in this table should ultimately point to a .html file of the same name.
    • This list is refined a number of times as more details about the application design become available.
    • Consider using a Test Case ID naming standard like [devel phase]-[test approach]-[sequence number]. This naming standard will cause the test case .html files to be sorted first by development phase, next by test approach (functional, structural, performance), and finally by sequence number. It also allows sequence numbers to be reused across development phases; see the sketch below.
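
    A minimal sketch of the suggested naming standard and the directory sort order it produces. The phase and approach abbreviations below are illustrative assumptions rather than an established DSA convention.

        # Hypothetical IDs in [devel phase]-[test approach]-[sequence] form.
        # A plain lexical sort groups the .html files by phase, then by
        # approach, then by sequence; note that sequence number 01 is
        # safely reused across phases.
        ids = ["DES-FT-02", "DES-PT-01", "CON-FT-01", "DES-FT-01", "CON-PT-01"]

        for case_id in sorted(ids):
            print(case_id + ".html")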
  • Design Testing Schedule

      Test Result Codes
  • SND = Successful, no defect correction required
  • UNS = Unsuccessful, awaiting defect correction
  • SAD = Successful after defect correction
  • Each row: Test Case ID: Title | Date to Write (done?) | Date to Attempt (done?) | Test Result Code
  • (The Test Result Code column is blank for every entry until that Test Case has been attempted.)
    . . .
    STHWR-30: Workstation Computing Platform | Done | Design wk 1 |
    STHWR-31: Workstation Connectivity | Done | Design wk 1 |
    STSWR-40: DBMS Print Capabilities | Done | Design wk 1 |
    STSWR-41: DBMS File Storage Capacities | Done | Design wk 1 |
    STSWR-42: DBMS Screen Navigation/Input/Output/Display Capabilities | Done | Design wk 1 |
    . . .
    FTPOS-04.0: Certificate Printing positive path testing | Design wk 1 | Prelim Constr |
    FTNEG-04.0: Certificate Printing negative path testing | Design wk 1 | Prelim Constr |
    FTDAT-04.0: Certificate Record Printout | Design wk 1 | Prelim Constr |
    PT-04.0: Certificate Record Printout Cycle | Design wk 1 | Prelim Constr |
    FTPOS-04.1: Certificate Printing - Printer Fault on Original (positive) | Design wk 1 | Prelim Constr |
    FTNEG-04.1: Certificate Printing - Printer Fault on Original (negative) | Design wk 1 | Prelim Constr |
    FTDAT-04.1: Certificate Printing - Printer Fault on Original | Design wk 1 | Prelim Constr |
    PT-04.1: Certificate Printing Cycle - Print Fault on Original | Design wk 1 | Prelim Constr |
    . . .
    FTPOS-03.0: Roster Input (positive) | Design wk 1 | Prelim Constr |
    FTNEG-03.0: Roster Input (negative) | Design wk 1 | Prelim Constr |
    FTDAT-03.0: Roster Input Screen Certificate Record ADDs | Design wk 1 | Prelim Constr |
    PT-03.0: Roster Input Screen Entry Cycle | Design wk 1 | Prelim Constr |
    FTPOS-09.0: Instructor Input (positive) | Design wk 1 | Prelim Constr |
    FTNEG-09.0: Instructor Input (negative) | Design wk 1 | Prelim Constr |
    FTDAT-09.0: Instructor Input Certificate Record ADDs | Design wk 1 | Prelim Constr |
    PT-09.0: Instructor Input Entry Cycle | Design wk 1 | Prelim Constr |
    FTPOS-10.0: Instructor Update (positive) | Design wk 1 | Prelim Constr |
    FTNEG-10.0: Instructor Update (negative) | Design wk 1 | Prelim Constr |
    FTDAT-10.0: Instructor Update Certificate Record MODIFYs | Design wk 1 | Prelim Constr |
    PT-10.0: Instructor Update Entry Cycle | Design wk 1 | Prelim Constr |
    . . .
    FTPOS-05.0: Certificate Search (positive) | Design wk 2 | Prelim Constr |
    FTNEG-05.0: Certificate Search (negative) | Design wk 2 | Prelim Constr |
    FTDAT-05.0: Certificate Search finds correct results | Design wk 2 | Prelim Constr |
    PT-05.0: Certificate Search Cycle | Design wk 2 | Prelim Constr |
    . . .
    FTPOS-07.0: Certificate Record Backup (positive) | Design wk 2 | Prelim Constr |
    FTNEG-07.0: Certificate Record Backup (negative) | Design wk 2 | Prelim Constr |
    FTDAT-07.0: Certificate Record Backup - record origin | Design wk 2 | Prelim Constr |
    STINT-07.0: Certificate Record Backup - record transfer | Design wk 2 | Prelim Constr |
    STBAK-07.0: Certificate Record Backup - record destination | Design wk 2 | Prelim Constr |
    PT-07.0: Certificate Record Backup Cycle | Design wk 2 | Prelim Constr |
    FTPOS-08.0: Certificate Record Archive (positive) | Design wk 3 | Prelim Constr |
    FTNEG-08.0: Certificate Record Archive (negative) | Design wk 3 | Prelim Constr |
    FTDAT-08.0: Certificate Record Archive - record origins | Design wk 3 | Prelim Constr |
    STINT-08.0: Certificate Record Archive - record transfer | Design wk 3 | Prelim Constr |
    STBAK-08.0: Certificate Record Archive - record destination | Design wk 3 | Prelim Constr |
    PT-08.0: Certificate Record Archive Cycle | Design wk 3 | Prelim Constr |
    . . .
    UIFTPOS-20: Screen Navigation (positive) | Design wk 3 | Prelim Constr |
    UIFTNEG-20: Screen Navigation (negative) | Design wk 3 | Prelim Constr |
    UIERMSG-20: Error messages from the screens | Design wk 3 | Prelim Constr |
    . . .
    FTPOS-06.0: Certificate Duplicate Printing (positive) | Design wk 4 | Prelim Constr |
    FTNEG-06.0: Certificate Duplicate Printing (negative) | Design wk 4 | Prelim Constr |
    FTDAT-06.0: Certificate Duplicate Record Printout | Design wk 4 | Prelim Constr |
    PT-06.0: Certificate Duplicate Record Printout Cycle | Design wk 4 | Prelim Constr |
    FTPOS-06.1: Certificate Duplicate Printing - Printer Fault on Original (positive) | Design wk 4 | Prelim Constr |
    FTNEG-06.1: Certificate Duplicate Printing - Printer Fault on Original (negative) | Design wk 4 | Prelim Constr |
    FTDAT-06.1: Certificate Duplicate Printing - Printer Fault on Original | Design wk 4 | Prelim Constr |
    PT-06.1: Certificate Duplicate Printing Cycle - Print Fault on Original | Design wk 4 | Prelim Constr |

    Test Plan Checklist

    Have human resources been allocated to carry out the Test activities?
      • Yes, human resources have been allocated. They are listed in the Resource Needs document.
      • No, human resources have not been allocated. They are listed as "pending" in the Resource Needs document.
    Have machine and software resources been allocated as needed for the Test activities?
      • Yes, the Test team will use desktop machines and servers that are configured like those of a "typical" business user.
      • Yes, a Test Lab has been set up. The needed machine and software resources are listed in the Resource Needs document.
      • No, needed machine and software resources are listed as "pending" in the Resource Needs document.
    Has this Test Plan been communicated to the development team, business expert team, and other stakeholders?
      • Yes, everyone is aware of our prioritized quality goals for this release and understands how their work will help achieve those goals. Feedback is welcome.
      • Yes, this document is being posted to the project website. Feedback is welcome.
      • No, some developers or business experts are not aware of the testing goals, test activities, or testing schedules for this release. This is a risk that is noted in the Risk Management section of the Project Plan.
    Company Proprietary
    Copyright © 2003-2004 Jason Robbins. All rights reserved. License terms. Retain this copyright statement whenever this file is used as a template.
    Copyright © 2005 Jerry Everett, Ken Everett, and Ray McLeod. All rights reserved. License terms. Retain this copyright statement whenever this file is used as a template.