
Table of Contents Page

1. Introduction ............................................................................................................... 4
1.1. Overview of This New System .................................................................................. 4
1.2. Purpose of this Document ........................................................................................ 4
1.3. Objectives of System Test ...................................................................................... 4
1.4. Software Quality Assurance Involvement .................................................................... 4

2. Scope and Objectives .................................................................................................. 5


2.1. Scope of Test Approach - System Functions .............................................................. 5
2.1.1. Inclusions ........................................................................................................ 5
2.1.2. Exclusions ....................................................................................................... 5
2.2. Testing Scope ......................................................................................................... 6
2.2.1. Functional Testing ............................................................................................ 6
2.2.2. Interface Testing .............................................................................................. 7
2.2.3. Acceptance Testing ......................................................................................... 7
2.2.4. Final Acceptance Testing ................................................................................ 7
2.3. Testing Process ..................................................................................................... 7
2.4. System Test Entrance/Exit Criteria.......................................................................... 8
2.4.1. Entrance Criteria ............................................................................................. 8
2.4.2. Exit Criteria ..................................................................................................... 8

3. Test Phases and Cycles ............................................................................................. 9


3.1. Project Integration Test ......................................................................................... 9
3.2. Operations Acceptance Test.................................................................................. 10

4. System Test Schedule ................................................................................................ 11

5. Resources ................................................................................................................... 12
5.1. Human ................................................................................................................... 12
5.2. Hardware .............................................................................................................. 12
5.2.1. Hardware components required ........................................................................ 13
5.3. Software ................................................................................................................ 13
5.3.1. Test Host environments .................................................................................... 13
5.3.2. Test Branch Software ...................................................................................... 13
5.3.3. Error Measurement System .............................................................................. 13

6. Roles and Responsibilities ......................................................................................... 14


6.1. Management Team ................................................................................................ 14
6.2. Testing Team ......................................................................................................... 14
6.3. Business Team ....................................................................................................... 14
6.4. Testing Support Team ............................................................................................ 14
6.5. External Support Team .......................................................................................... 14

7. Error Management/Configuration Management ........................................................... 15


8. Reviewing & Status Reporting .................................................................................. 16
8.1. Status Reporting .................................................................................................... 16
8.2. Formal Review Process.......................................................................................... 16
8.2.1. Review Points .................................................................................................. 16

9. Issues/Risks/Assumptions .......................................................................................... 17

10. Signoff ........................................................................................................................ 19

11. Appendices ................................................................................................................ 20

12. Control Documentation .............................................................................................

1. INTRODUCTION

1.1. Overview of System X

The aim of this phase of the project is to implement a new System X platform that will enable:

• Removal of legacy office systems
• Introduction of ABC
• Processing of Special Transactions
• No constraint on location of capture
• Capture of transactions for other processing systems
• New Reconciliation Process
• Positioning for European ECU Currency and future initiatives

This programme will result in significant changes to the current departmental and inter-office
processes. The functionality will be delivered on a phased basis.
Phase 1 will incorporate the following facilities :

• Replacement of the legacy System A


• New Reconciliation System
• Outsourcing system for departments in different European countries.
• New/Revised Audit Trail & Query Facilities

[Detailed inclusions are listed later in this document]

1.2. Purpose of this Document

This document is to serve as the Draft Test Approach for the Business Systems Development
Project.
Preparation for this test consists of three major stages:-

• The Test Approach sets the scope of system testing, the overall strategy to be adopted, the
activities to be completed, the general resources required and the methods and processes
to be used to test the release.

• Test Planning details the activities, dependencies and effort required to conduct the
System Test.

• Test Conditions/Cases documents the tests to be applied, the data to be processed, the
automated testing coverage and the expected results.

1.3. Formal Reviewing


There will be several formal review points before and during system test. This is a vital element
in achieving a quality product.
1.3.1. Formal Review Points

1. Design Documentation
2. Testing Approach
3. Unit Test Plans
4. Unit Test Conditions & Results
5. System Test Conditions
6. System Test Progress
7. Post System Test Review

1.4. Objectives of System Test

At a high level, this System Test intends to prove that :-

• The functionality, delivered by the development team, is as specified by the business in
the Business Design Specification Document and the Requirements Documentation.

• The software is of high quality: it will replace/support the intended business functions
and achieve the standards required by the company for the development of new systems.

• The software delivered interfaces correctly with existing systems, including Windows 98.

[Detailed objectives are listed later in this document.]

1.4.1. Software Quality Assurance involvement

The above V-Model shows the optimum testing process, where test preparation commences as
soon as the Requirements Catalogue is produced. System Test planning commenced at an early
stage, and for this reason the System Test will benefit from quality initiatives throughout the
project lifecycle.
The division of testing responsibility between the Project and Software Quality Assurance
(S.Q.A.) is as follows:

• Unit Test is the responsibility of the Development Team


• System Testing is the responsibility of SQA
• User Acceptance Testing is the responsibility of the User Representatives Team

• Technology Compliance Testing is the responsibility of the Systems Installation &
Support Group.

2. SCOPE AND OBJECTIVES

2.1. Scope of Test Approach - System Functions

2.1.1. INCLUSIONS

The contents of this release are as follows :-


Phase 1 Deliverables

o New & revised Transaction Processing with automated support


o New Customer Query Processes and systems
o Revised Inter-Office Audit process
o Relocate Exceptions to Head Office
o New centralised Agency Management system
o Revised Query Management process
o Revised Retrievals process
o New International Reconciliation process
o New Account Reconciliation process

2.1.2. EXCLUSIONS

When the scope of each Phase has been agreed and signed off, no further changes will be
considered for inclusion in this release, except:

(1) Where there is the express permission and agreement of the Business Analyst
and the System Test Controller;

(2) Where the changes/inclusions will not require significant effort on behalf of
the test team (i.e. requiring extra preparation - new test conditions etc.) and will
not adversely affect the test schedule.

[See Section 9.1.]


2.1.3. SPECIFIC EXCLUSIONS

• Cash management is not included in this phase


• Sign On/Sign Off functions are excluded - this will be addressed by existing processes
• The existing Special Order facility will not be replaced
• Foreign Currency Transactions
• International Data Exchanges
• Accounting or reporting of Euro transactions

Reference & Source Documentation:


1. Business Processes Design Document - Document Ref: BPD-1011
2. Transaction Requirements for Phase 1 - Document Ref: TR_PHASE1-4032
3. Project Issues & Risks Database - T:\Data\Project\PROJECT.MDB

4. The System Development Standards - Document Ref: DEVSTD-1098-2
5. System Development Lifecycle - Document Ref: SDLC-301

2.2. Testing Process

The diagram above outlines the Test Process approach that will be followed.

a. Organise Project involves creating a System Test Plan, Schedule & Test Approach, and
requesting/assigning resources.
b. Design/Build System Test involves identifying Test Cycles, Test Cases, Entrance & Exit
Criteria, Expected Results, etc. In general, test conditions/expected results will be identified
by the Test Team in conjunction with the Project Business Analyst or Business Expert. The
Test Team will then identify Test Cases and the Data required. The Test conditions are
derived from the Business Design and the Transaction Requirements Documents.
c. Design/Build Test Procedures includes setting up procedures such as Error Management
systems and Status reporting, and setting up the data tables for the Automated Testing Tool.
d. Build Test Environment includes requesting/building hardware, software and data set-ups.
e. Execute Project Integration Test - See Section 3 - Test Phases & Cycles
f. Execute Operations Acceptance Test - See Section 3 - Test Phases & Cycles
g. Signoff - Signoff happens when all pre-defined exit criteria have been achieved. See Section
2.4.

2.2.1. Exclusions
SQA will not deal directly with the business design team regarding any design or functional
issues or queries.
The development team is the supplier to SQA; if design or functional issues arise, they should
be resolved by the development team and its suppliers.

2.3. Testing Scope

Outlined below are the main test types that will be performed for this release. All system test
plans and conditions will be developed from the functional specification and the requirements
catalogue.

2.3.1. Functional Testing

The objective of this test is to ensure that each element of the application meets the functional
requirements of the business as outlined in the :

• Requirements Catalogue
• Business Design Specification
• Year 2000 Development Standards
• Other functional documents produced during the course of the project, e.g. resolutions to
issues, change requests and feedback.

This stage will also include Validation Testing, which is intensive testing of the new front-end
fields and screens: Windows GUI standards; valid, invalid and limit data input; screen and field
look and appearance; and overall consistency with the rest of the application.
The third stage includes Specific Functional Testing - low-level tests which aim to test the
individual processes and data flows.

2.3.2. Integration Testing

This test proves that all areas of the system interface with each other correctly and that there are
no gaps in the data flow. The Final Integration Test proves that the system works as an integrated
unit once all the fixes are complete.

2.3.3. Business (User) Acceptance Test

This test, which is planned and executed by the Business Representative(s), ensures that the
system operates in the manner expected and that any supporting material, such as procedures and
forms, is accurate and suitable for its intended purpose. It is high-level testing, ensuring that there
are no gaps in functionality.

2.3.4. Performance Testing

These tests ensure that the system provides acceptable response times (which should not exceed
4 seconds).
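
Purely as an illustration of how such a check could be automated (the transaction driver and
timing harness below are assumptions, not part of the delivered system), a response-time test
might be scripted as follows:

```python
import time

RESPONSE_TIME_LIMIT_SECONDS = 4.0  # acceptable response time stated in this plan


def measure_response_time(transaction):
    """Time a single transaction and return the elapsed seconds."""
    start = time.perf_counter()
    transaction()
    return time.perf_counter() - start


def submit_sample_transaction():
    # Placeholder for driving the application under test; simulated here.
    time.sleep(0.5)


if __name__ == "__main__":
    elapsed = measure_response_time(submit_sample_transaction)
    verdict = "PASS" if elapsed <= RESPONSE_TIME_LIMIT_SECONDS else "FAIL"
    print(f"Response time {elapsed:.2f}s (limit {RESPONSE_TIME_LIMIT_SECONDS}s): {verdict}")
```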

2.3.5. Regression Testing

A Regression test will be performed after the release of each Phase to ensure that:

• there is no impact on previously released software, and
• there is an increase in the functionality and stability of the software.

The regression testing will be automated using the automated testing tool.

2.3.6. Bash & Multi-User Testing

Multi-user testing will attempt to prove that it is possible for an acceptable number of users to
work with the system at the same time. The object of Bash testing is an ad-hoc attempt to break
the system.

2.3.7. Technical Testing

Technical Testing will be the responsibility of the Development Team.

2.3.8. Operations Acceptance Testing (OAT)

This phase of testing is to be performed by the Systems Installation and Support group, prior to
implementing the system in a live site. The SIS team will define their own testing criteria, and
carry out the tests.

2.4. System Test Entrance/Exit Criteria

2.4.1. Entrance Criteria

The Entrance Criteria specified by the System Test Controller should be fulfilled before System
Test can commence. In the event that any criterion has not been achieved, the System Test may
commence if the Business Team and the Test Controller are in full agreement that the risk is
manageable.

• All developed code must be unit tested. Unit and Link Testing must be completed and
signed off by development team.
• System Test plans must be signed off by Business Analyst and Test Controller.
• All human resources must be assigned and in place.
• All test hardware and environments must be in place, and free for System test use.
• The Acceptance Tests must be completed, with a pass rate of not less than 80%.

Acceptance Tests:
25 test cases will be performed for the acceptance tests. To achieve the acceptance criteria, 20 of
the 25 cases must be completed successfully - i.e. a pass rate of 80% must be achieved before
the software will be accepted for System Test proper to start. This means that any errors found
during acceptance testing must not prevent the completion of 80% of the acceptance test cases.

Note: These tests are not intended to perform in depth testing of the software.
[For details of the acceptance tests to be performed see
X:\Testing\Phase_1\Testcond\Criteria.doc]
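
A minimal sketch of the 80% gate described above, assuming results are simply recorded as
pass/fail per acceptance test case; the outcomes shown are illustrative only:

```python
ACCEPTANCE_PASS_RATE = 0.80  # 20 of the 25 acceptance cases must pass


def acceptance_gate(results):
    """Return (pass_rate, gate_met) for a list of boolean test outcomes."""
    if not results:
        return 0.0, False
    rate = sum(1 for passed in results if passed) / len(results)
    return rate, rate >= ACCEPTANCE_PASS_RATE


if __name__ == "__main__":
    # Illustrative outcomes for the 25 acceptance test cases.
    outcomes = [True] * 21 + [False] * 4
    rate, met = acceptance_gate(outcomes)
    print(f"Pass rate {rate:.0%}: {'accept for System Test' if met else 'reject delivery'}")
```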

Resumption Criteria
In the event that system testing is suspended, resumption criteria will be specified and testing will
not re-commence until the software meets these criteria.

2.4.2. Exit Criteria

The Exit Criteria detailed below must be achieved before the Phase 1 software can be
recommended for promotion to Operations Acceptance status. Furthermore, I recommend that
there be a minimum of 2 days' Final Integration testing AFTER the final fix/change has been
retested. [See section 9.3]

• All High Priority errors from System Test must be fixed and tested
• If any medium or low-priority errors are outstanding - the implementation risk must be
signed off as acceptable by Business Analyst and Business Expert
• Project Integration Test must be signed off by Test Controller and Business Analyst.
• Business Acceptance Test must be signed off by Business Expert.

3. TEST PHASES AND CYCLES

There will be two main stages of testing for the new application during System Test :-

• System Testing
• Operations Acceptance Testing

3.1. System Testing Cycles

The main thrust of the approach is to intensively test the front end in the first two releases, thus
raising approximately 80% of errors in this period. With the majority of these errors fixed,
standard and/or frequently used actions will be tested to prove individual elements and total
system processing in Release v0.3. Regression testing of outstanding errors will be performed on
an ongoing basis.
When all errors (which potentially impact overall processing) are fixed, an additional set of test
cases is processed in Release v0.4 to ensure the system works in an integrated manner. Release
v0.4 is intended to be the final proving of the system as a single application. There should be no
A or B class errors outstanding prior to the start of Release v0.4 testing.
Test Cases by Release Version (testing by phase):

Release v0.1:  Acceptance 1; Functional 1; User Acceptance
Release v0.2:  Acceptance 2; Functional 2; Regression 1
Release v0.3:  Acceptance 3; Functional 3; Performance 1; Bash & Multi-User Testing;
               Regression 1; Regression 2
Release v0.4:  Integration 1; Technical 1; Regression 1; Regression 2; Regression 3;
               Installation Test
Contingency:   Per Bug Fix Test only

3.1.2. Automated Testing


Automated testing tools will be used in the test environment for functional and regression
testing. The main focus of the automated testing will be the regression testing of the previously
delivered functionality - i.e. when development version 0.2 of the software is delivered the
majority of the regression testing of the functionality delivered in development version 0.1 will
be automated. It is estimated that the full benefit of the automated testing will only occur when
the tests have been executed three or more times.
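
Purely to illustrate the data-driven style such tools typically support (the case names and the
execute_case stub below are assumptions, not the tool's actual tables or API), a regression run
over previously delivered functionality might be organised as follows:

```python
# Hypothetical regression cases for functionality delivered in earlier versions.
# In practice these would live in the automated testing tool's own data tables.
REGRESSION_CASES = [
    {"id": "REG-001", "delivered_in": "v0.1", "function": "Function A - standard capture"},
    {"id": "REG-002", "delivered_in": "v0.1", "function": "Process B - inter-office audit"},
    {"id": "REG-003", "delivered_in": "v0.2", "function": "Reconciliation summary report"},
]


def execute_case(case):
    """Stub for replaying a recorded script and comparing actual v. expected results."""
    return True  # a real run would return the tool's pass/fail verdict


def run_regression(current_version):
    """Re-run every case delivered in a version earlier than the current release."""
    return {
        case["id"]: execute_case(case)
        for case in REGRESSION_CASES
        if case["delivered_in"] < current_version  # lexical compare is fine for v0.x labels
    }


if __name__ == "__main__":
    for case_id, passed in run_regression("v0.2").items():
        print(f"{case_id}: {'PASS' if passed else 'FAIL'}")
```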

3.2. Software Delivery

During System Test, the release of new versions of the software will be co-ordinated between the
Development Team Leader and the System Test Controller. However, unless it concerns a fix to a
very serious error, new versions should only be released when agreed targets have been reached
(i.e. the next version contains fixes to X or more bugs).
Release Schedule:

v0.1 - 1st May     Functionality items 1-7 below, delivered by priority across v0.1 to v0.3*
v0.2 - 17th May    (as above)
v0.3 - 31st May    (as above)
v0.4 - 18th June   No new functionality to be delivered in this release
v1.0 - 29th June   Bug fix contingency release only

Functionality to be delivered*: 1. Function A; 2. Process B; 3. Euro Reqs'; 4. Y2K Reqs.;
5. Inter Office Trans; 6. International Trans.; 7. Other
* (per functional spec, by priority)
Notes:
It is intended that 80% of the functionality will have been tested in full prior to the Phase 3
Release.
All the functionality must be present in the Phase 3 Release.
No previously undelivered functionality will be accepted for testing after Phase 3.

3.3. Formal Reviewing

There will be several formal review points before and during system test, including the review of
this document. This is a vital element in achieving a quality product.

3.3.1. Formal Review Points

1. Design Documentation - Requirements Specification & Functional Specification


2. System Test Plan
3. Unit Test Plans & Test Conditions
4. Unit Test Results
5. System Test Conditions
6. System Test Progress/Results
7. Post System Test Review
8. Integration Test Results
9. Pilot Implementation Review
10. Project Review
The diagram above outlines the Test Approach. Boxes 1 - 6 show the major review stages prior
to test execution; boxes 7 - 10 show the reviews planned during and after test execution.
While the above diagram concentrates on the testing aspect of SQA's role, there is an ongoing
role also, in ensuring the quality of the major deliverables throughout the lifecycle of the project.
SQA's role will be to ensure that all Quality Inspections occur for all the agreed deliverables and
that follow up actions and initiatives are pursued.

3.3.2. Progress/Results Monitoring

• Acceptance Test 1 Results


• Test Results - Release v0.1
• Test Results - Release v0.2
• Test Results - Release v0.3
• Performance Test 1 Results
• Regression 1 & 2 Results
• Test Results - Release v0.4
• Technical Test Results

5. RESOURCES

5.1. Human

Resource Type            Resource Title                   No.  Date Req'd  Who         Status

Project Mgmt/Functional  Business Analyst                 1    -           A.N. Other  Assigned
Testing                  Test Controller                  1    -           A. Smith    Assigned
Testing                  Testers                          4    1st May     -           To be Assigned
Test Support Team        Support Programmers              4    15th May    -           To be Assigned
Test Support Team        Technical Support                1    1st May     -           To be Assigned
Test Support Team        WAN Support                      1    25th May    -           To be Assigned
Technical - External     CIS Support                      1    25th May    -           To be Assigned
Technical - External     Bookkeeping Support              1    15th May    -           To be Assigned
Technical - External     External Liaison Support         1    25th May    C. Jones    Assigned
Business                 Business Expert /
                         Business Representative          1    1st May     -           To be Assigned

5.2. Hardware

One separate, controlled system will be required for the initial phase of testing, set up as one
standard, complete office environment. In order to maintain the integrity of the test environment,
this network will not be accessible to anybody outside this project. The printers are also
exclusively for the use of the test network.


5.2.1. Hardware components required

• 1 Network Controller
• 6 Networked PCs (see below)
• 1 DAP Workstation
• 1 Motorola 6520
• 1 Alpha AXP Server
• 1 Batch Waste Printer
• 1 HP LaserJet 4v Printer

PC Specifications
The 6 PCs required for the test environment will include the following:
1 x P100, 1Gb HD, 16Mb RAM [Current Minimum Specification]
3 x P166, 1.5Gb HD, 32Mb RAM [Current Standard Specification]
1 x P333, 2.5Gb HD, 64Mb RAM [Current Maximum Specification]
These are the specifications currently in use in the different branches.
1 x Pentium running Windows NT is also required as the test centre for controlling and
executing the automated testing.

5.3. Software

5.3.1. Test IMS Environments

Test IMS region X will be required for System Testing. Additional or amended
data will be populated where required.

5.3.2. Test Environment Software

System Test will be run on the following Software Versions :-


Custom Desktop Vers. 97.0.1
Windows 95 Operating System
Visual Basic 5 Runtime Files
MS Office 97
Novell Netware

5.3.3. Error Measurement System

This system test will use a bespoke MS Access database Error Management system.
A new database will be implemented for the sole use of this project.
[See Chapter x ]

6. ROLES AND RESPONSIBILITIES

6.1. Management Team

Project Leader - B. Ruthlenn


Ensure Phase 1 is delivered to schedule, budget & quality
Ensure Exit Criteria are achieved prior to System Test Signoff
Regularly review Testing progress with Test Controller.
Liaise with external Groups e.g. New Systems
Raise and manage issues/risks relating to the project or outside the Test Team's control.
Review & sign off Test approach, plans and schedule.
SQA Project Leader - C. Nicely
Ensure Phase 1 is delivered to schedule, budget & quality
Regularly review Testing progress
Manage issues/risks relating to System Test Team
Provide resources necessary for completing system test.

6.2. Testing Team

Test Planner / Controller - D. Everyman


Ensure Phase 1 is delivered to schedule, budget & quality
Produce High Level and Detailed Test Conditions
Produce Expected Results
Report progress at regular status reporting meetings
Co-ordinate review & signoff of Test Conditions
Manage individual test cycles & resolve tester queries/problems.
Ensure test systems outages/problems are reported immediately and followed up.
Ensure Entrance criteria are achieved prior to System Test start.
Ensure Exit criteria are achieved prior to System Test signoff.
Testers
Identify Test Data
Execute Test Conditions and mark off results
Raise Software Error Reports
Administer Error Measurement System

6.3. Business Team

Business Analyst - E. Showman


Review high level / detailed test plans for System Test
Define Procedures
Resolve design issues
Resolve Business issues
Take part in daily test Error Review Team meetings

Business Representative - ?? (To be Assigned)


Execute User Acceptance Testing
Define Test Conditions/Expected Results for Business Acceptance Test
Resolve user issues
Resolve Design issues

6.4. Testing Support Team

Support Programmers
Take part in daily Error Review Team meetings
Co-ordinate/provide support for system test.
Resolve errors
Re-release test software after amendments
Support Systems Testers

6.5. External Support Team

CIS Support
Provide CIS support, if required.
Resolve CIS queries, if required.

IMS Support
Provide System Test Support
Support IMS Regions
Resolve Spooling Issues (if necessary)
Bookkeeping Integration & Compliance (if necessary)
Resolve queries arising from remote backup

Bookkeeping Support
Provide Bookkeeping Technical support, if required.
Resolve queries, if required.

Technical Support
Provide support for hardware environment
Provide support for Test software
Promote Software to system test environment

Access Support
Provide and support Test Databases

7. Error Management & Configuration Management

During System Test, errors will be recorded as they are detected on Error Report forms. These
forms will be input into the Error Management System each evening with status "Error Raised"
or "Query Raised". The Error Review Team will meet each morning (10am, Conference Room)
to review and prioritise the DNs raised the previous day, and assign or drop them as
appropriate. This team will consist of the following representatives:-

• A. Boring - Development Team Leader


• B. Curie - Business Analyst
• C. Durine - Test Controller
• D. Ewards - Business Representative

Errors which are agreed as valid will be categorised by the Error Review Team as follows :-

• Category A - Serious errors that prevent System Test of a particular function continuing,
or serious data errors.
• Category B - Serious or missing-data related errors that will not prevent implementation.
• Category C - Minor errors that do not prevent or hinder functionality.

Category A errors should be turned around by the Bug Fix Team within 48 hours (that is, from
the time raised at the Error Review Team meeting to the time the fix is released to the System
Test environment). In the event of an A error that prevents System Test continuing, the
turnaround should be within 4 hours.
Category B errors should be turned around within 1 day, and Category C errors within 3 days.
However, the release of newer versions of the software will be co-ordinated with the Test
Controller - new versions should only be released when agreed, and where there is a definite
benefit (i.e. the version contains fixes to X or more bugs).
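
The turnaround targets above could be monitored mechanically; the sketch below is one possible
representation, with category labels and field names chosen for illustration rather than taken
from the Error Management System:

```python
from datetime import datetime, timedelta

# Turnaround targets per category, as agreed above (field names are illustrative).
TURNAROUND = {
    "A": timedelta(hours=48),          # from Error Review Team meeting to fix released
    "A-blocking": timedelta(hours=4),  # Category A errors that prevent System Test continuing
    "B": timedelta(days=1),
    "C": timedelta(days=3),
}


def is_overdue(category, raised_at, now):
    """True if an error of the given category has exceeded its turnaround target."""
    return now - raised_at > TURNAROUND[category]


if __name__ == "__main__":
    raised = datetime(1998, 5, 4, 10, 0)   # raised at the 10am Error Review Team meeting
    checked = datetime(1998, 5, 7, 10, 0)  # three days later
    print(is_overdue("A", raised, checked))  # True - the 48-hour target has been exceeded
```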

8. STATUS REPORTING

8.1. Status Reporting

Test preparation and Testing progress will be formally reported during a weekly Status Meeting.
The attendees at this meeting are :-

• Byron Ruthlenn - Project Manager


• Dion Ryan - Business Design Team
• Pat Smith - Development Team Leader

A status report will be prepared by the Test Controller to facilitate this meeting. This report will
contain the following information :-

1. Current Status v. Plan (Ahead/Behind/On Schedule)

2. Progress of tasks planned for previous week
3. Tasks planned for next week including tasks carried from previous week
4. Error Statistics from Error Measurement system
5. Issues/Risks
6. AOB.
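
One possible way for the Test Controller to assemble the report is sketched below; the structure
simply mirrors the headings listed above and is illustrative only:

```python
# Skeleton of the weekly status report; field names simply mirror the headings above.
def build_status_report(week_ending, status_v_plan, done_last_week,
                        planned_next_week, error_stats, issues_risks, aob):
    return {
        "week_ending": week_ending,
        "current_status_v_plan": status_v_plan,   # Ahead / Behind / On Schedule
        "progress_previous_week": done_last_week,
        "tasks_next_week": planned_next_week,     # including tasks carried forward
        "error_statistics": error_stats,          # from the Error Measurement System
        "issues_risks": issues_risks,
        "aob": aob,
    }


if __name__ == "__main__":
    report = build_status_report(
        "8th May", "On Schedule",
        ["Functional 1 complete"], ["Start Regression 1"],
        {"A": 2, "B": 5, "C": 9}, ["WAN support not yet assigned"], [])
    print(report)
```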

9. Issues, Risks and Assumptions

9.1. Issues/Risks

1. No further changes or inclusions will be considered for this release except (1) where there is
the express permission and agreement of the Business Analyst and the System Test Controller;
(2) where the changes/inclusions will not require significant effort on behalf of the test team and
will not adversely affect the test schedule. This is a potentially serious issue, as any major
changes to the design will entail additional time to re-plan testing and to create or amend test
conditions.
Resp : Byron Ruthlenn
Final list of inclusions to be signed off.

2. The design of the software must be final, and design documentation must be complete,
informative and signed off by all parties before System Test proper commences.
Resp : D.A. Stone

3. A weakness in the 'phased delivery' approach is that the high degree of interdependency in
the code means that the smallest changes can have serious effects on areas of the application
which apparently have not been changed. The assumption of the test team is that previously
delivered and tested functionality will only require regression testing to verify that it 'still' works,
i.e. this testing will not be intended to discover new errors. Because of this I recommend that
there be a minimum of 2 days' regression testing AFTER the final fix/change has been retested.
This, however, imposes a fixed time constraint on the completion of system testing which
requires the agreement of the Project Leader.
Resp : Byron Ruthlenn

4. Automated Testing
The majority of the Regression testing will be performed using the automated test tool. However,
due to the workload required to implement (and debug) the test tool fully, it is likely that the
return will only be maximised after the third run of the regression test suite for each release.
The other major uses of the test tool are for (1) Load Testing, (2) Multi-User Testing, and (3)
Repetitive data entry.
Resp : Test Controller

9.2. Assumptions

• Software will be delivered on time.


• Software is of the required quality.
• The software will not be impacted by impending Y2K compliance changes to the external
software infrastructure - i.e. any external software changes will have to be compatible
with this application.
• All "Show-Stopper" bugs receive immediate attention from the development team.
• All bugs found in a version of the software will be fixed and unit tested by the
development team before the next version is released.
• Functionality is delivered to schedule.
• Required resources will be available.
• All service agreements will be met.
• The automated test tool will function & interface correctly with the software.
• All documentation will be up to date and delivered to the system test team.
• Functional and technical specifications will be signed off by the business.
• The Intranet will be fully functional prior to project commencement.

10. Formal Signoff

This document must be formally approved before System Test can commence. The following
people will be required to sign off :-

Group Signatures:
Project Manager Byron Ruthlenn
SQA Colm Jones
Testing Team Dion Hais
Development Team Erwin Smith

11. APPENDICES

11.1. Purpose of Error Review Team.

Ensure maximum efficiency of the development and system testing teams for the release of the
new office software through close co-operation of all involved parties.
This will be achieved through daily meetings whose function will be to

• Agree the status of each raised Error.
• Prioritise valid Errors.
• Ensure that enough documentation is available with each Error.
• Agree content and timescale for software releases into System Test.
• Ensure one agreed source of Error reporting information.
• Identify any issues which may affect the performance of system testing.

11.2. Error Review Team Meeting Agenda.

• Review any actions from the last meeting.
• Classify and prioritise each Error.
• Review Errors raised for duplicates etc.
• Agree the priority of each Error.
• Determine the adequacy of documentation associated with raised Errors.
• Agree release content and timescale.
• Review actions assigned during the meeting.
• AOB

11.3. Classification of Bugs

1. An "A" bug is a either a showstopper or of such importance as to radically affect the


functionality of the system i.e. :
- Examples of showstoppers
- If, because of a consistent crash during processing of a particular type of application, a
user could not complete that type of
application.
- Incorrect data is passed to legacy system resulting in corruption or system crashes
- Example of severally affected functionality
- Calculation of repayment term/amount are incorrect
- Incorrect credit agreements produced
2. Bugs would be classified as "B" where :
- a less important element of functionality is affected
- Example : a value is not defaulting correctly and it is necessary to input the correct
value
- data is affected which does not have a major impact
- Example : where, for instance, some element of client capture was not propagated to the
database
- there is an alternative method of completing a particular process
- Example : a problem might occur reading all the details of a credit - this change can be
manually input.
3. "C" type bugs are mainly cosmetic bugs i.e. :
- incorrect / no helptext on screens
- drop down lists repeat an option

11.4. Procedure for maintenance of Error Management system.

1. The Test Controller will refer any major error/anomaly to either the Development Team Leader
or a designated representative on the development team before raising a formal error record.
This has several advantages :-
- it prevents the testers trying to proceed beyond 'showstoppers'
- it puts the developer on immediate notice of the problem
- it allows the developer to put on any traces that might be necessary to track down the error.
2. All bugs raised will be on the correct Error form, and contain all relevant data.
3. These errors will be logged on the day they occur with a status of 'RAISED'
4. There will be a daily 'System Test Support Group' meeting to discuss, prioritise and agree all
logged errors.
During this meeting some errors may be dropped, identified as duplicates, passed to
programmer, etc.
5. The Error Log will be updated with the status of all errors after this meeting, e.g. 'with pgmr',
'dropped', 'duplicate'.
6. Once errors have been fixed and rebundled for a release, the paper forms must be passed to
the Test Controller, who will change their status to 'Fixed to be retested'.
7. Once the error has been retested and proven to be corrected the status will be changed to
'Closed'
8. Regular status reports will be produced from the Error system, for use in the Error Review
Team meetings.
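
The status values used in steps 3 to 7 imply a simple lifecycle. A possible encoding of the
allowed transitions is sketched below; the transition back to the programmer after a failed retest
is an assumption, not stated above:

```python
# Allowed status transitions for an error record, derived from steps 3-7 above.
ERROR_STATUS_FLOW = {
    "RAISED": {"WITH PGMR", "DROPPED", "DUPLICATE"},
    "WITH PGMR": {"FIXED TO BE RETESTED"},
    "FIXED TO BE RETESTED": {"CLOSED", "WITH PGMR"},  # assumed: back to programmer if retest fails
    "DROPPED": set(),
    "DUPLICATE": set(),
    "CLOSED": set(),
}


def change_status(current, new):
    """Validate a status change against the agreed flow before updating the Error Log."""
    if new not in ERROR_STATUS_FLOW.get(current, set()):
        raise ValueError(f"Illegal status change: {current} -> {new}")
    return new


if __name__ == "__main__":
    status = "RAISED"
    for next_status in ("WITH PGMR", "FIXED TO BE RETESTED", "CLOSED"):
        status = change_status(status, next_status)
        print(status)
```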

11.5. Overnight Processing - Checking Accounting & Audit & CIS

The checks below are set out as: Test Requirement, Check Items, and Level of Testing.

1. Accounting: when spooling is complete, the Summary report should be checked against
   (a) similar Legacy Transactions and (b) Test Input forms.
   Check Items: (a) Legacy transactions on report v. office transactions on report;
   (b) Summary report v. applic input forms.
   Level of Testing: (a) checking at field level; (b) checking at field level.

2. Accounting: after open/amend, the amendment report should be checked
   (a) for rejected open/amend instructions and (b) that detail corresponds to the input Applic Forms.
   Check Items: (a) Amendment report; (b) Amendment report v. test input forms.
   Level of Testing: (a) satisfy as to reasons for rejection; (b) checking at field level.

3. Print off Account and Customer records and check field detail against applic input
   forms / branch summary report.
   Check Items: Bookkeeping input transactions v. test input forms/amendment report.
   Level of Testing: checking at field level.
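
As a sketch of what checking at field level could mean in automated form (the record layouts
below are invented for illustration), a report entry can be compared against its input form field
by field:

```python
def compare_at_field_level(report_record, input_form, fields):
    """Return a list of (field, report_value, form_value) mismatches."""
    return [
        (field, report_record.get(field), input_form.get(field))
        for field in fields
        if report_record.get(field) != input_form.get(field)
    ]


if __name__ == "__main__":
    # Illustrative records: a summary-report line v. its test input form.
    report_line = {"account": "00123456", "tx_type": "OPEN", "amount": "150.00"}
    input_form = {"account": "00123456", "tx_type": "OPEN", "amount": "155.00"}
    for field, got, expected in compare_at_field_level(
            report_line, input_form, ["account", "tx_type", "amount"]):
        print(f"Mismatch in {field}: report={got}, input form={expected}")
```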

11.7. SOFTWARE QUALITY ASSURANCE MEASURES

(i) DATES.

- Start date of SQA involvement.

(ii) EFFORT.

- No. of SQA Man Days Test Planning


- No. of SQA Man Days Reviewing Test Plans
- No. of SQA Man Days Executing Tests

(iii) VOLUME.

- No. of Tests Identified

(iv) QUALITY.

- No. of Tests Passed First Time
- Percentage of Tests Passed First Time
- No. of Errors Raised During Regression Testing
- No. of Errors Generated as a Result of Incorrect Fixes
- No. of Errors Raised by Category (A/B/C)
- No. of Errors Raised by Reason Code
- No. of Errors Raised by High Level Business Function

(v) TURNAROUND.

- Average Error Turnaround Time
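
Most of these measures can be derived directly from the Error Measurement System and the test
log. A rough sketch of the calculations follows; the record layout is assumed, not prescribed:

```python
from collections import Counter
from statistics import mean


def sqa_measures(tests, errors):
    """Compute a subset of the SQA measures listed above.

    tests  - list of dicts with a boolean 'passed_first_time'
    errors - list of dicts with 'category' (A/B/C) and 'turnaround_days'
    """
    passed_first = sum(1 for t in tests if t["passed_first_time"])
    return {
        "tests_identified": len(tests),
        "tests_passed_first_time": passed_first,
        "pct_passed_first_time": 100.0 * passed_first / len(tests) if tests else 0.0,
        "errors_by_category": Counter(e["category"] for e in errors),
        "average_turnaround_days": mean(e["turnaround_days"] for e in errors) if errors else 0.0,
    }


if __name__ == "__main__":
    tests = [{"passed_first_time": True}] * 18 + [{"passed_first_time": False}] * 7
    errors = [{"category": "A", "turnaround_days": 2},
              {"category": "B", "turnaround_days": 1},
              {"category": "C", "turnaround_days": 3}]
    print(sqa_measures(tests, errors))
```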


12. CONTROL DOCUMENTATION

12.1. Online Error Input Form

12.2. Check-Off Control Documentation

12.3. Verification/Checkoff & Output Testing

12.4. Online Non Fixed Error Report

12.5. Errors Assigned to Development Team

12.6. SQA Lines of Communication

12.7. Error Process Paths

12.8. System Test Support

12.9. Error Status Flow

