
Test Planning

Readings: Ron Patton chs. 17, 20 & Daniel Galin ch. 10

Objectives of Test Planning

- Record, organize, and guide testing activities
- Schedule testing activities according to the test strategy and project deadlines
- Describe how confidence in the product will be demonstrated to customers
- Provide a basis for re-testing during system maintenance
- Provide a basis for evaluating and improving the testing process

IEEE Std. 829-1998 Test Plan

Describes the scope, approach, resources, and schedule of intended testing activities. Identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring contingency planning.

IEEE Std. 829-1998 Test Plan Outline

1. Test plan identifier
2. Introduction (refers to project plan, quality assurance plan, configuration management plan, ...)
3. Test items - identify test items, including version/revision level (e.g. requirements, design, code, ...)
4. Features to be tested
5. Features not to be tested
6. Testing approach
7. Significant constraints on testing (test item availability, testing-resource availability, deadlines)
8. Item pass/fail criteria
9. Suspension criteria and resumption requirements
10. Test deliverables (e.g. test design specifications, test case specifications, test procedure specifications, test logs, test incident reports, test summary report)
11. Testing tasks
12. Environmental needs
13. Responsibilities
14. Staffing and training needs
15. Schedule
16. Risks and contingencies
17. Approvals
18. References

IEEE Std. 829-1998 Test Design Specification Outline

1. Test design specification identifier
2. Features to be tested - features addressed by the document
3. Approach refinements
4. Test identification - identifier and description of the test cases associated with the design
5. Feature pass/fail criteria - pass/fail criteria for each feature

IEEE Std. 829-1998 Test Case Specification Outline

1. Test case specification identifier
2. Test items - items and features to be exercised by the test case
3. Input specifications - input required to execute the test case (databases, files, etc.)
4. Output specifications
5. Environmental needs
6. Special procedural requirements
7. Intercase dependencies

IEEE Std. 829-1998 Test Procedure Specification Outline

1. Purpose
2. Special requirements
3. Procedure steps
   - Log: how to log results
   - Set Up: how to prepare for testing
   - Start: how to begin procedure execution
   - Proceed: procedure actions
   - Measure: how test measurements will be made
   - Shut Down: how to suspend the testing procedure
   - Restart: how to resume the testing procedure
   - Stop: how to bring execution to an orderly halt
   - Wrap Up: how to restore the environment
   - Contingencies: how to deal with anomalous events during execution

IEEE Std. 829-1998 Test Log Outline

1. Test log identifier
2. Description - information on all the entries in the log
3. Activity and event entries
   - Execution description
   - Procedure results - observable results (e.g. messages), outputs
   - Environmental information specific to the entry
   - Anomalous events (if any)
   - Incident report identifiers (identifiers of any test incident reports generated)

IEEE Std. 829-1998 Test Incident Report Outline

1. Test incident report identifier
2. Summary - items involved, references to linked documents (e.g. procedure, test case, log)
3. Incident description - inputs, expected results, actual results, anomalies, date and time, procedure step, environment, attempts to repeat, testers, observers
4. Impact on testing process
   - S: Show Stopper - testing totally blocked, bypass needed
   - H: High - a major portion of the test is blocked, testing can continue with severe restrictions, bypass needed
   - M: Medium - testing can continue with minor restrictions, no bypass needed
   - L: Low - testing not affected, problem is insignificant

IEEE Std. 829-1998 Test Summary Report Outline

1. Test summary report identifier
2. Summary - summarize the evaluation of the test items, with references to plans, logs, and incident reports
3. Variances - of test items (from specification), plan, procedure, ...
4. Comprehensiveness assessment - of the testing process against the comprehensiveness criteria specified in the test plan
5. Summary of results - issues (resolved, unresolved)
6. Evaluation - overall evaluation of each test item, including its limitations
7. Summary of activities
8. Approvals - names and titles of all persons who must approve the report

Test Implementation Phase

Assessing Testing effort

How do we know when testing can stop? How do we assess how successful testing was at removing defects? Examples of approaches:

- Independent test groups
- Fault seeding technique
- Mutant-based testing
- Defect plotting

Estimation using Independent Test Groups

Evaluates how many defects are in a software product and the efficiency of defect removal. Uses independent groups testing the same program.

Example with 2 groups:

x = number of faults detected by Test Group 1
y = number of faults detected by Test Group 2
q = number of common faults detected by both test groups
n = total number of faults in the program

E1 = effectiveness of Group 1 = x/n
E2 = effectiveness of Group 2 = y/n
n = (x*y)/q = q/(E1*E2)
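The two-group estimate can be sketched in a few lines of Java. This is a minimal illustration; the class and method names, and the example numbers, are my own:

```java
// Capture-recapture style estimate of the total number of faults n from
// two independent test groups: n = (x*y)/q.
public class FaultEstimate {
    // x = faults found by group 1, y = faults found by group 2,
    // q = faults found by both groups.
    static double totalFaults(int x, int y, int q) {
        return (double) x * y / q;            // n = (x*y)/q
    }
    static double effectiveness(int found, double n) {
        return found / n;                     // E = fraction of all faults found
    }
    public static void main(String[] args) {
        // Example: group 1 finds 25 faults, group 2 finds 30, 15 are common.
        double n = totalFaults(25, 30, 15);
        System.out.println("estimated total faults n = " + n);   // 50.0
        System.out.println("E1 = " + effectiveness(25, n));      // 0.5
        System.out.println("E2 = " + effectiveness(30, n));      // 0.6
    }
}
```

Note that both forms of the formula agree: with E1 = 0.5 and E2 = 0.6, q/(E1*E2) = 15/0.3 = 50 as well.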

Fault seeding technique

To estimate the number of faults in a program:

- Before testing, seed the program with a number of typical faults
- After a period of testing, compare the numbers of seeded and nonseeded faults detected

N = total number of nonseeded (unintentional) faults
S = number of seeded faults placed in the program
n = number of nonseeded faults detected during testing
s = number of seeded faults detected during testing

N = S*(n/s)
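The same estimate in Java, as a minimal sketch (names and example numbers are illustrative; the technique assumes seeded faults are as likely to be found as real ones):

```java
// Fault-seeding estimate of the number of unintentional faults: N = S*(n/s).
public class SeedingEstimate {
    // S = seeded faults planted, n = nonseeded faults detected,
    // s = seeded faults detected.
    static double estimateN(int S, int n, int s) {
        return (double) S * n / s;            // N = S*(n/s)
    }
    public static void main(String[] args) {
        // Example: 20 faults seeded; testing finds 15 of them plus 30 real faults.
        // Detection rate on seeded faults is 15/20 = 75%, so the 30 real faults
        // found are assumed to be 75% of the total.
        System.out.println(estimateN(20, 30, 15));   // 40.0
    }
}
```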

Mutant-based testing

To assess the effectiveness of a test suite at detecting defects. A mutant version of a program is obtained by replicating the original program except for one small change (a mutation) that corresponds to a typical error.

Examples of mutations:

- Value: change constants, subscripts, parameters by adding/subtracting one, etc.
- Decision: modify the sense of decisions (e.g. < to >=)
- Statement: delete/exchange statements

A mutant is killed by a test suite if the change is revealed (some test fails).
Kill ratio of a test suite = # of mutants killed / # of mutants; the higher the kill ratio, the better the test suite.
Needs tool support (e.g. Mothra).

Mutant-based testing - Example


/* returns index of 1st occurrence of maximum element in an array */
int max(int[] input) {
    int result = 0;
    for (int j = 1; j < input.length; j++) {
        if (input[j] > input[result])
            result = j;
    }
    return result;
}

Original test suite:

Test Case   Input[0]   Input[1]   Input[2]   Expected
1           1          2          3          2
2           3          1          2          0
3           1          3          2          1

Examples of mutants:

Mutant   Statement changed to          Result
1        input[j] < input[result]      Killed
2        input[j] == input[result]     Killed
3        input[j] <= input[result]     Killed
4        input[j] >= input[result]     Live

A test case must be added to kill mutant #4 (e.g. one whose maximum value occurs more than once).
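The live mutant can be demonstrated directly. The sketch below runs the original function and mutant #4 against a tie-breaking input; the input {2, 2, 1} is my own choice, not from the source:

```java
// Mutant #4 (> changed to >=) returns the index of the *last* occurrence of
// the maximum, so any input whose maximum value appears more than once
// distinguishes it from the original and kills it.
public class MutantDemo {
    static int max(int[] input) {             // original program
        int result = 0;
        for (int j = 1; j < input.length; j++)
            if (input[j] > input[result]) result = j;
        return result;
    }
    static int maxMutant4(int[] input) {      // mutant: > replaced by >=
        int result = 0;
        for (int j = 1; j < input.length; j++)
            if (input[j] >= input[result]) result = j;
        return result;
    }
    public static void main(String[] args) {
        int[] tie = {2, 2, 1};                // maximum occurs at indices 0 and 1
        System.out.println(max(tie));         // 0 (first occurrence)
        System.out.println(maxMutant4(tie));  // 1 (last occurrence) -> mutant killed
    }
}
```

On the three original test cases ([1,2,3], [3,1,2], [1,3,2]) the maximum is unique, so both versions agree, which is why the mutant survived the original suite.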

Mutant-based testing - Equivalent programs

An equivalent mutant produces exactly the same outputs as the original program for every input.

Example:

int max(int a, int b) {
    int max;
    if (a > b)
        max = a;
    else
        max = b;
    return max;
}

Changing a > b to a >= b produces an equivalent program: when a == b, both versions return the same value.
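No test case can kill an equivalent mutant. The check below illustrates this by exhaustively comparing the original and the mutant over a small input domain (the domain bounds are an arbitrary choice for the demonstration; the actual equivalence argument is the a == b case above):

```java
// Equivalent-mutant demonstration: the original max(a, b) and the mutant
// (a >= b) never disagree, because when a == b both return the same value.
public class EquivalentMutant {
    static int maxOrig(int a, int b)   { return (a > b)  ? a : b; }
    static int maxMutant(int a, int b) { return (a >= b) ? a : b; }

    // Returns true if any input pair in [-10, 10] x [-10, 10] kills the mutant.
    static boolean differsSomewhere() {
        for (int a = -10; a <= 10; a++)
            for (int b = -10; b <= 10; b++)
                if (maxOrig(a, b) != maxMutant(a, b)) return true;
        return false;
    }
    public static void main(String[] args) {
        System.out.println(differsSomewhere());   // false
    }
}
```

Equivalent mutants are a practical nuisance: they can never be killed, so they drag down the kill ratio unless they are identified and excluded from the mutant count.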

Defect Plotting

To help decide when to stop testing: plot the number of defects found per period of testing. The graph is expected to peak, then drop and plateau.
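One possible way to operationalize the idea: tabulate defects found per period and stop once the count has fallen below some threshold after the peak. The stopping rule, threshold, and sample numbers below are illustrative assumptions, not part of the source:

```java
// Defect-plotting heuristic (illustrative): find the peak period, then report
// the first period after the peak in which fewer than `threshold` defects
// were found - a crude signal that the curve has dropped and plateaued.
public class DefectPlot {
    static int stopPeriod(int[] defectsPerPeriod, int threshold) {
        int peak = 0;
        for (int p = 1; p < defectsPerPeriod.length; p++)
            if (defectsPerPeriod[p] > defectsPerPeriod[peak]) peak = p;
        for (int p = peak + 1; p < defectsPerPeriod.length; p++)
            if (defectsPerPeriod[p] < threshold) return p;
        return -1;                            // no plateau yet: keep testing
    }
    public static void main(String[] args) {
        int[] weekly = {4, 9, 14, 10, 5, 2, 1};    // rises, peaks, drops, plateaus
        System.out.println(stopPeriod(weekly, 3)); // 5
    }
}
```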