
Project Name Functional Test Plan

For

Client Company
Prepared by Real-Time Technology Solutions, Inc.
Author name

02/2002. Real-Time Technology Solutions, Inc., 55 John Street, 12th Floor, New York, NY 10038-3712. Phone 212.240.9050; fax 212.240.9020.


Revision History

Date       | Version | Description | Author
00/00/2000 | 1.0     | First draft |


RTTS Test Plan for Client Company


Table of Contents

1 Introduction
  1.1 Document Purpose
  1.2 Objectives
2 Project Scope
  2.1 In Scope
  2.2 Out of Scope but Critical to Project Success
  2.3 Out of Scope
3 Project Resources
4 Test Strategies/Techniques
  4.1 Test Design
  4.2 Test Data
5 Automation Coding Strategy
6 Test Suite Backup Strategy
7 Test Suite Version Control Strategy
8 Metrics Table
9 Project Tasks/Schedule
10 Tool Inventory
11 Hardware/Software Configuration
12 Naming Conventions
13 Defect Responsibility/Resolution
14 Exit Criteria
15 Goals and Deliverables
16 Glossary of Standard Terms

1 Introduction
1.1 Document Purpose

Document overview; high-level summary of major issues addressed.



This Test Plan reviews:
- Existing project information
- Business Requirements and critical transactions to be tested
- Testing types and strategies to be implemented
- A proposed testing schedule

1.2 Objectives

State the objective of the testing project, its duration, and its justification. General comments concerning the objective of testing are appropriate (e.g., make the QA function more efficient, lower testing cycle time, improve software quality, enhance the testing process).

2 Project Scope
2.1 In Scope

State the scope and the duration of the process in detail.

2.2 Out of Scope but Critical to Project Success

State any out-of-scope critical project dependencies, e.g., database snapshots for the test system that accurately reflect the current user population.

2.3 Out of Scope

State in detail any out-of-scope activities. (e.g., Performance, stress, and volume testing (beyond the gathering of timing information from automated script executions) are out of project scope.)

3 Project Resources
Table 3.1. Project Roles and Responsibilities

Role | Responsibilities | Resource Name(s)
Testers | Plan testing activities; execute Test Cases; automate Test Cases; find, report, and track defects; measure test effort; analyze results |
Developers | Deliver complete builds of the application; provide Testers with feedback regarding changes and new functionality; provide expertise and knowledge of the application-under-test; eliminate agreed-upon defects |
Business Analysts | Interview Users; create Business Requirements; create Test Scenarios and Test Cases |
Users | Describe and review Business Requirements; describe and review user profiles; perform User Acceptance Testing (UAT) |
DBA | Provide access rights to the database; assist with extraction of data for testing purposes; provide a stable testing environment; assist with returning the database instance to a known preferred state; provide troubleshooting and knowledge |
Network Administrator | Provide network access privileges; general troubleshooting and knowledge |
Desktop Administrators | Installation of software; troubleshooting of hardware/software; information regarding the standard desktop |
Management | High-level problem solving; mediation of issues; interface of activities with different business units |


4 Test Strategies/Techniques
4.1 Test Design

Describe the test types that are relevant to the project and provide justification for their relevance.

Table 4.1. Summary of Test Types

Test Type | Definition
Unit Test | Verifies the program (or module) logic; based on knowledge of the program structure. Performed by programmers using the White Box technique.
Integration Test | Verifies the entire system's functionality (including feeds to and from the system) according to the business and design specifications.
Business Requirements | Verifies that the specific requirements of the user are met. Also known as Business Rules testing.
Acceptance Testing | Verifies that the system meets its initial objectives and the users' expectations. Used to prove that the system works; also known as positive testing.
Regression Testing | Verifies that fixes/modifications are correct and that no other parts of the system have been affected.
System Test | Tests the application architecture in a production-simulated environment for normal and worst-case situations. Includes: volume testing (determine whether the program can handle the required volume of data, requests, etc.); load testing (identify the peak load conditions at which the program will fail to handle required processing loads within the required time span); performance testing (determine whether the program meets its performance requirements); resource usage testing (determine whether the program uses resources such as memory and disk space at levels that exceed expectations); interoperability testing (assure the application can co-exist on the desktop with other running applications; also known as compatibility testing); security testing (show that the program's security requirements have been met); and concurrency testing (verify that the system does not corrupt data when two or more users attempt to update the same record at the same time, or when two or more users update different records at the same time and set a unique field to the same value).
Graphical User Interface (GUI) Test | Verifies GUI features and elements and compares them to GUI standards and the test design.
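The concurrency case in Table 4.1 can be exercised with a simple optimistic-locking check. The sketch below is a generic illustration in plain Python, not tied to any particular database or tool; the record fields and values are placeholders.

```python
# Generic sketch of the concurrency scenario above: two "users" read
# the same record, both attempt an update, and a version check rejects
# the stale write instead of silently corrupting data.
record = {"id": 1, "value": "initial", "version": 1}

def update(record, new_value, read_version):
    """Apply the update only if no one else changed the record meanwhile."""
    if record["version"] != read_version:
        return False                        # stale write rejected
    record["value"] = new_value
    record["version"] += 1
    return True

v = record["version"]                       # both users read version 1
assert update(record, "user A's change", v) is True
assert update(record, "user B's change", v) is False   # B's write is stale
print(record["value"])                      # user A's change
```

A concurrency test case would assert exactly this: the second write is refused and the surviving value is unambiguous.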

Summarize the types of test strategies and their respective emphases, e.g., critical requirements testing (comprising X% of Y total requirements); X user paths through the application that comprise Y% of the paths through the application; transaction-based testing (testing includes X critical transactions, which are Y% of the total transactions performed by the application). How will the test types proposed above be tested?

Table 4.2. Use Case listing with brief description and Test Case mapping.

Use Case ID | Description | Test Cases
UC-1 | Use Case 1 description | TC-1a, TC-1b, TC-1c

Table 4.3. Test Case listing with mapping to generative Use Case, description, and Requirement reference.

Test Case ID | Use Case ID | Description | Requirements
TC-1a | UC-1 | Test Case 1a description | R1.1-R5.3
TC-1b | UC-1 | | R6.1-R10.3
TC-1c | UC-1 | | R10.3-R11
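Mappings like those in Tables 4.2 and 4.3 can be checked mechanically. The sketch below uses plain Python; the IDs are the sample placeholders from the tables, not real project data.

```python
# Minimal traceability check: every Test Case must map to a known
# Use Case and carry at least one requirement reference. The IDs
# mirror the sample rows of Tables 4.2/4.3 (placeholders only).
use_cases = {"UC-1": ["TC-1a", "TC-1b", "TC-1c"]}

test_cases = {
    "TC-1a": {"use_case": "UC-1", "requirements": ["R1.1-R5.3"]},
    "TC-1b": {"use_case": "UC-1", "requirements": ["R6.1-R10.3"]},
    "TC-1c": {"use_case": "UC-1", "requirements": ["R10.3-R11"]},
}

def untraced(test_cases, use_cases):
    """Return Test Case IDs that lack a valid Use Case or requirements."""
    bad = []
    for tc_id, tc in test_cases.items():
        if tc["use_case"] not in use_cases or not tc["requirements"]:
            bad.append(tc_id)
    return bad

print(untraced(test_cases, use_cases))  # [] when traceability is complete
```

Running such a check on each build keeps the requirement-to-test mapping honest as cases are added.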

4.2 Test Data

Describe the data sets to be used in testing: the origin of each data set, the purpose for using each set (e.g., different user data for different user permissions), where the data sets were obtained, whose expertise guided data set selection, etc.
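As a purely illustrative example (the data set names, origins, and purposes below are hypothetical), a test data catalog can be kept in a simple structure so that origin and purpose stay documented alongside the data itself:

```python
# Hypothetical test data catalog; every field is illustrative only,
# since the template leaves the actual data sets to the project.
test_data_sets = [
    {
        "name": "standard_user",
        "origin": "sanitized production snapshot",
        "purpose": "exercise common user paths with default permissions",
    },
    {
        "name": "admin_user",
        "origin": "hand-built fixture",
        "purpose": "verify permission-gated screens and actions",
    },
]

# Emit a one-line summary per data set for the test plan appendix.
for ds in test_data_sets:
    print(f"{ds['name']}: {ds['purpose']} (source: {ds['origin']})")
```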

5 Automation Coding Strategy


This section describes the automation coding strategy that will be used for every test script. Generic examples follow.

Automation of the test suite for the XX application will be performed using XX Software's XX suite (automation tool: XX; scripting language: XX). The automation coding strategy used in test suite building will include the following rules:

- Start and Stop Point: All Test Script navigation will start and finish on the XX window/page of the XX application.
- Browser Caption Verifications: Browser captions will be verified on every window encountered in the application. These verifications will execute immediately after each window is loaded.
- Object Properties: Properties of objects that must be verified will be retrieved from application objects using the test tool's data capture functionality. The retrieved data will then be compared against validated data in test suite files, and results will be output to the test log.
- Maintainability: Scripting will adhere to modular coding practices and will follow the test strategy described above. Test suite builds will employ the RTTS proprietary language extension (rttsutil.dll).
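In tool-neutral terms, the verification rules above might look like the following sketch. Plain Python stands in for the XX scripting language; the window caption, field name, and baseline value are placeholders, not project artifacts.

```python
# Sketch of the modular scripting pattern described above: small,
# reusable verification functions that append PASS/FAIL entries to a
# shared test log (standing in for the tool's log output).
def verify_caption(actual_caption, expected_caption, log):
    """Browser-caption verification performed after each window loads."""
    ok = actual_caption == expected_caption
    log.append(("caption", expected_caption, "PASS" if ok else "FAIL"))
    return ok

def verify_property(name, captured_value, baseline, log):
    """Compare a captured object property against validated baseline data."""
    ok = captured_value == baseline.get(name)
    log.append((name, captured_value, "PASS" if ok else "FAIL"))
    return ok

log = []
baseline = {"status_field": "Ready"}   # validated data from test suite files
verify_caption("Login - MyApp", "Login - MyApp", log)
verify_property("status_field", "Ready", baseline, log)
print(log)
```

Keeping each verification in its own function is what makes the suite maintainable: when a window caption changes, only one baseline value changes, not every script.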

6 Test Suite Backup Strategy


List all paths to test artifacts here. How will test suite data (code, external files, tool data, etc.) be backed up for storage? How often? Where will backups be stored? How accessible will backups be? How many backups will be kept at any given time? How long will backup data be kept? Will it be archived?
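One possible answer to the questions above, sketched in Python with placeholder paths and an assumed retention count, is a timestamped archive with simple pruning:

```python
# Sketch of a test suite backup routine: archive the suite directory
# into a timestamped .tar.gz and keep only the newest `keep` copies.
# Directory names and the retention count are assumptions for the
# project to replace.
import tarfile
import time
from pathlib import Path

def backup_test_suite(suite_dir, backup_dir, keep=5):
    """Archive suite_dir into backup_dir, pruning to the newest `keep` copies."""
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = backup_dir / f"testsuite-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(suite_dir, arcname=Path(suite_dir).name)
    # Prune: timestamped names sort chronologically, so drop the oldest.
    for stale in sorted(backup_dir.glob("testsuite-*.tar.gz"))[:-keep]:
        stale.unlink()
    return archive
```

The same routine can be scheduled (e.g., nightly) to answer the "how often" question, with `keep` encoding the retention policy.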

7 Test Suite Version Control Strategy


As test suites are modified for each build, how will test suite version control/change management be addressed? Will an external software tool be used for version control? Will there be a need to run archived test suites against old builds of the application? If so, how will this be facilitated?

8 Metrics Table
A central part of test planning is the gathering of metrics. An accurate collection of metrics for all key project activities provides an assessment of the total effort required for the project.

Table 8.1. Project Metrics

Activity | Metric
Interview with a knowledgeable User to characterize one user transaction |
Walkthrough of the valid test case |
Creation of a written test case by a business analyst/SME |
Automation and debugging of the script reflecting the test case |
Extraction of one requirement from requirements documentation |
Extraction of one requirement from user guide documentation |
Extraction of one requirement from release notes |
9 Project Tasks/Schedule
Table 9.1. Project Schedule

Task | Resources | Comments | Projected Completion
Test Plan Completed | | None |
Test Environment Prepared | | Installation of Automated Tool |
Requirements Processed by Tool | | Sectioned by user paths through the application |
Test Cases Planned | | One per requirement |
Test Cases Created | | One per requirement |
Test Cases Recorded and Executed | | Recorded in Tool; executed against each build and release of PROJECT |
Defects Submitted and Tracked | | Submitted and tracked in Defect Tracking Tool |
Test Cycle Evaluation | | Continuous effort |
Test Suite Backup Strategy | | Continuous effort |
Test Suite Version Control | | Continuous effort |

10 Tool Inventory
Table 10.1. Software tools to be used in the Automated Testing of PROJECT

Function | Tool Name | Vendor | Version
Project Administration | | |
Test Management | | |
Capture/Playback | | |
Defect/Issue Tracking | | |
Requirements Management | | |
Team Communications (email, WebEx) | | |
Utilities (RTTS Utilities) | | |

11 Hardware/Software Configuration

Table 11.1. Hardware/Software System Resources

Resource | Details
Test PC(s) |
Network OS |
Communication Protocol |
Server - Database |
Server - Web |
Applications Server |
Database |
Automation Software |
Other Software |
Front-End Development Tools |

12 Naming Conventions
All Test Cases and Test Scripts created for this project will adhere to the following naming convention. Each Test Script will have the same name as its respective Test Case. We will use the following scheme:

- Field 1 (one to three characters): the defined user stream through PROJECT, by name (e.g., QBC = Quote, Binder, Cert stream).
- Separation character: an underscore.
- Field 2: the area of application where the transaction is performed (e.g., NC = Name Clearance; Q = Quote).
- Field 3: the type of transaction being tested (I = Issue; R = Release).
- Additional character (if needed): a numeric counter, 1 through 9, for multiple scripts of the same type and name (only used in the case of more than one).
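A small validator can enforce the convention automatically at check-in time. The sketch below assumes a composite name such as "QBC_NCI2" (stream, underscore, area, transaction type, optional counter); the exact separator placement beyond the first underscore is an assumption, as are the example names.

```python
# Validator for the naming scheme described above. The field values
# (streams, areas, transaction types) come from the convention text;
# the overall layout "STREAM_AREA + TYPE + counter" is an assumption.
import re

STREAMS = {"QBC"}            # QBC = Quote, Binder, Cert stream
AREAS = {"NC", "Q"}          # NC = Name Clearance, Q = Quote
NAME_RE = re.compile(
    r"^(?P<stream>[A-Z]{1,3})_(?P<area>[A-Z]{1,2})(?P<txn>[IR])(?P<n>[1-9])?$"
)

def valid_name(name):
    """Return True if name matches the project naming convention."""
    m = NAME_RE.match(name)
    return bool(m and m["stream"] in STREAMS and m["area"] in AREAS)

print(valid_name("QBC_NCI2"))  # True
print(valid_name("XYZ_NCI"))   # False: unknown stream
```

Rejecting nonconforming names early keeps Test Case and Test Script names in lockstep, as the convention requires.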

13 Defect Responsibility/Resolution
Possible defects identified through automated or manual testing will be discussed with development team members and/or the Project Manager to verify that the observed behavior constitutes a defect. Identified defects will be logged in the defect tracking software. Defects found manually will be coded into the relevant automated test scripts for inclusion in future regression testing. Once the development team has corrected a defect, the defect will be retested using the same Test Script that detected it. Validated fixes will be entered into the defect tracking tool. Accurate defect status data will be maintained in the defect tracking tool by . In order to preserve data quality in the defect tracking process,  will serve as gatekeeper for the defect database. Responsibilities include: evaluation of all reported defects to verify the conditions under which they occur; reproducibility of reported defects; accuracy of defect descriptions; and uniqueness of logged defects.

14 Exit Criteria
The following exit criteria will be used for each stage of the testing process. Testing can proceed to the next stage of the process when a sufficient proportion of the current stage has been completed (for example, test case preparation need not be completed before automated coding begins). All exit criteria should be satisfied by the end of the project.

Stage 1: Test Process Assessment. Exit criterion: delivery of a written assessment of the current test process with recommendations for improvement.
Stage 2: Test Planning. Exit criterion: Test Plan delivered.
Stage 3: Test Design. Exit criterion: the application hierarchy, requirements hierarchy, defined transactions, and detailed, written test cases approved.
Stage 4: Test Automation, Execution, and Defect Tracking. Exit criteria: 100% of test cases scripted and executed, 100% of produced documents verified, and 100% of defects retested and removed.
Stage 5: Evaluation and Improvement. Exit criterion: automated suite evaluated and improved.

15 Goals and Deliverables


Sample generic goals and deliverables follow.

Goals

The following list describes the defined goals for the test process:
1. To accomplish all tasks described in this test plan.
2. To install a measurable, improvable, repeatable, and manageable test process at Client Company.
3. To decrease the time necessary to test new builds of Client Company's PROJECT.
4. To verify the functionality and content of the current version of the PROJECT application.
5. To reduce the frequency of error associated with manual testing.
6. To find and successfully track 100% of defects present along the user path defined in this plan.

Deliverables

The following list describes the defined deliverables for each stage of the testing process:

Test Process Assessment - An assessment of the current test process with recommendations for improvement.
Test Planning Stage - A complete Test Plan, including preliminary Test Requirements.
Test Design Stage - Test Cases describing input, conditions, and expected output for each requirement; verified Test Requirements.
Test Automation, Execution, and Defect Tracking Stage - Test Scripts, logged test results, defect/issue reports.
Evaluation and Improvement - Metrics proving the efficiency and benefit of automated testing, Test Cycle Evaluation, Project summary/evaluation.

16 Glossary of Standard Terms


Table 16.1. Glossary

Term | Definition
Test Scenario | A path through an application to elicit the normal functioning of the application. The path may be a user path, a path defined by specific requirements, or a path examining back-end functionality. Examples: make a deposit (path = common user path); send a request to the server for the cost of mailing a package from point A to point B (path = back-end path).
Test Case | A text document that states the objectives and details of a test scenario: the steps taken, specific test data used, test conditions, and expected results.
Test Script | A script containing the Automation Tool code that executes the Test Scenario described in the corresponding Test Case.

