
Chapter 12

System Testing
and Quality Assurance

Introduction

Why System Testing?

What Do We Test For?

THE NATURE OF TEST DATA

The Test Plan

ACTIVITY NETWORK FOR SYSTEM TESTING
Prepare Test Plan
Specify Conditions for User Acceptance Testing
Prepare Test Data for Program Testing
Prepare Test Data for Transaction Path Testing
Plan User Training
Compile/Assemble Programs
Prepare Job Performance Aids
Prepare Operational Documents
AT A GLANCE

No system design is ever perfect. Communication problems, programmers'
negligence, or time constraints create errors that must be eliminated
before the system is ready for user acceptance testing. A system is
tested for online response, volume of transactions, stress, recovery
from failure, and usability. Then comes system testing, which verifies
that the whole set of programs hangs together. Following system
testing is acceptance testing, or running the system with live data by
the actual user.

System testing requires a test plan that consists of several key
activities and steps for program, string, system, and user acceptance
testing. The system performance criteria deal with turnaround time,
backup, file protection, and the human factor.

For a system to be viable, controls have to be developed to ensure a
quality system. Quality assurance cuts across the system life cycle
and is especially involved in implementation. Quality assurance
specialists go through system testing and validation before they grant
certification. Quality assurance and DP audit go hand in hand. The
role of the auditor is to make sure that adequate controls are built
into the system for integrity and reliability.
By the end of this chapter, you should know:

1. Why systems are tested.
2. The activity network for system testing.
3. What steps are taken to test systems.
4. The goals of quality assurance in the system life cycle.
5. The role of the DP auditor in system testing.

SYSTEM TESTING
Types of System Tests

Quality Assurance
QUALITY ASSURANCE GOALS IN THE SYSTEMS LIFE CYCLE
Quality Factors Specifications
Software Requirements Specifications
Software Design Specifications
Software Testing and Implementation
Maintenance and Support
LEVELS OF QUALITY ASSURANCE

Trends in Testing
Role of the Data Processing Auditor
THE AUDIT TRAIL

PART FOUR / SYSTEM IMPLEMENTATION

INTRODUCTION

No system design is ever perfect; communication between the user and
the designer is not always complete or clear, and time is usually
short. The result is errors. The number and nature of errors in a new
design depend on several factors:

1. Communication between the user and the designer.
2. The programmer's ability to generate code that reflects exactly the
   system specifications.
3. The time frame for the design.

Theoretically, a newly designed system should have all the pieces in
working order, but in reality, each piece works independently. Now is
the time to put all the pieces into one system and test it to
determine whether it meets the user's requirements. This is the last
chance to detect and correct errors before the system is installed for
user acceptance testing. The purpose of system testing is to consider
all the likely variations to which it will be subjected and then push
the system to its limits. It is a necessary step in system
development.

This chapter reviews the process of system testing and the steps taken
to validate and prepare a system for final implementation. You should
be familiar with the following basic terms:

1. Unit testing is testing changes made in an existing or a new
   program.
2. Sequential or series testing is checking the logic of one or more
   programs in the candidate system, where the output of one program
   will affect the processing done by another program.
3. System testing is executing a program to check logic changes made
   in it with the intention of finding errors, making the program
   fail. Effective testing does not guarantee reliability. Reliability
   is a design consideration.
4. Positive testing is making sure that the new programs do in fact
   process certain transactions according to specifications.
5. Acceptance testing is running the system with live data by the
   actual user.

WHY SYSTEM TESTING?

Testing is vital to the success of the system. System testing makes a
logical assumption that if all the parts of the system are correct,
the goal will be successfully achieved. Inadequate testing or
nontesting leads to errors that may not appear until months later.
This creates two problems: the time lag between the cause and the
appearance of the problem (the longer the time interval, the more
complicated the problem has become), and the effect of system errors
on files and records within the system. A small system error can
conceivably explode into a much larger problem.

early
in the process trans lirectly into long-tem cost savings
number of erors
educed
nother reason for system testing is its utility as a user-oriented vehicle
ementation. The best program is worthless if it does
before
i m p l e m not meet
Unfortunately, the user'sdemands areoften
program or design efñiciency in tems of
compromised
by
efortsie
facilitate processing
ten the computer technician and the
mory utilization. Often
time
user have
unication baiers due to different backgrounds, interests, priorities,
andperha languages. The system tester (designer, programmer, or user)
developed some computer mastery can bridge this barier.
whohas

WHAT DO WE TEST FOR?

The first test of a system is to see whether it produces the correct
outputs. No other test can be more crucial. Following this step, a
variety of other tests are conducted:

1. Online response. Online systems must have a response time that will
not cause a hardship to the user. One way to test this is to input
transactions on as many CRT screens as would normally be used in peak
hours and time the response to each online function to establish a
true performance level.

2. Volume. In this test, we create as many records as would normally
be produced to verify that the hardware and software will function
correctly. The user is usually asked to provide test data for volume
testing.

3. Stress testing. The purpose of stress testing is to prove that the
candidate system does not malfunction under peak loads. Unlike volume
testing, where time is not a factor, we subject the system to a high
volume of data over a short time period. This simulates an online
environment where a high volume of activities occurs in spurts.

4. Recovery and security. A forced system failure is induced to test a
backup recovery procedure for file integrity. Inaccurate data are
entered to see how the system responds in terms of error detection and
protection. Related to file integrity is a test to demonstrate that
data and programs are secure from unauthorized access.

5. Usability documentation and procedure. The usability test verifies
the user-friendly nature of the system. This relates to normal
operating and error-handling procedures, for example. One aspect of
user-friendliness is accurate and complete documentation. The user is
asked to use only the documentation and procedures as a guide to
determine whether the system can be run smoothly.
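The online-response and stress tests above both come down to timing transactions under load. A minimal sketch of that idea in Python, where the hypothetical `process_transaction` stands in for the candidate system's online handler:

```python
import time

def process_transaction(txn):
    # Hypothetical stand-in for the candidate system's online handler.
    return sum(txn)

def measure_response(transactions):
    """Time each transaction; report worst-case and average response."""
    timings = []
    for txn in transactions:
        start = time.perf_counter()
        process_transaction(txn)
        timings.append(time.perf_counter() - start)
    return {"worst": max(timings), "average": sum(timings) / len(timings)}

# Volume test: as many records as would normally be produced.
# Stress test: the same records pushed through in one short burst.
result = measure_response([[i, i + 1] for i in range(10_000)])
print("worst %.6fs  average %.6fs" % (result["worst"], result["average"]))
```

A real online-response test would drive as many CRT screens (terminal sessions) as peak hours require; the timing logic stays the same.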
The Nature of Test Data

The proper choice of test data is as important as the test itself. If
test data provided by the user are not valid or representative of the
data to be processed, then the reliability of the output is suspect.

Test data may be artificial (created solely for test purposes) or live
(taken from the user's actual files). Properly created artificial data
should provide all combinations of values and formats and make it
possible to test all logic and transaction path subroutines. Unlike
live data, which are biased toward typical values, artificial data
provide extreme values for testing the limits of the candidate system.

For large, complex systems, a computer program is used to generate the
necessary test data. Data-generating programs save substantial time
for both the programmer and the test itself. A familiarity with the
system and its parameters, however, is necessary for writing an
effective data-generating program.
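A data-generating program of the kind described can be sketched in a few lines. The field definitions here are hypothetical; the point is that artificial data enumerate boundary and extreme values that live data rarely contain:

```python
import itertools

# Hypothetical field definitions: boundary, extreme, and invalid values.
FIELDS = {
    "quantity": [0, 1, 999_999, -1],
    "amount": [0.00, 0.01, 9_999_999.99],
    "code": ["A", "Z", "", "##"],
}

def generate_test_data():
    """Yield one test record for every combination of field values."""
    names = list(FIELDS)
    for combo in itertools.product(*(FIELDS[n] for n in names)):
        yield dict(zip(names, combo))

records = list(generate_test_data())
print(len(records))  # → 48 (4 quantities x 3 amounts x 4 codes)
```

Every combination of values and formats is produced mechanically, which is exactly what makes such programs a time saver for large systems.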

THE TEST PLAN

The first step in system testing is to prepare a plan that will test
all aspects of the system in a way that promotes its credibility among
potential users. There is psychology in testing:

1. Programmers usually do a better job in unit testing because they
are expected to document and report on the method and extent of their
testing.

2. Users are involved, which means communication is improved between
users and the designer group.

3. Programmers are involved when they become aware of user problems
and expectations. The user also becomes more aware of the complexity
of programming and testing. The upshot of all this is a more
realistic, appreciative, and cooperative user for successful testing.
Activity Network for System Testing

A test plan entails the following activities (see Figure 12-1):

1. Prepare test plan.
2. Specify conditions for user acceptance testing.
3. Prepare test data for program testing.
4. Prepare test data for transaction path testing.
5. Plan user training.
6. Compile/assemble programs.
7. Prepare job performance aids.
8. Prepare operational documents.

Prepare Test Plan

A workable test plan must be prepared in accordance with established
design specifications. It includes the following items:
FIGURE 12-1 Activity Network for System Testing. Test planning
activities (TP 010-090) lead from the detailed system design through
preparing the test plan, test data, user training, and job performance
aids; system test activities (T 010-050) run from testing programs and
transaction paths through the total system test, concluding system
documentation, and the user acceptance test.
1. Outputs expected from the system.
2. Criteria for evaluating outputs.
3. A volume of test data.
4. Procedure for using test data.
5. Personnel and training requirements.



Specify Conditions for User Acceptance Testing

Planning for user acceptance testing calls for the analyst and the
user to agree on the conditions for the test. Many of these conditions
may be derived from the test plan. Others are an agreement on the test
schedule, the test duration, and the persons designated for the test.
The start and termination dates for the test should also be specified
in advance.
Prepare Test Data for Program Testing

As each program is coded, test data are prepared and documented to
ensure that all aspects of the program are properly tested. After the
testing, the data are filed for future reference.

Prepare Test Data for Transaction Path Testing

This activity develops the data required for testing every condition
and transaction to be introduced into the system. The path of each
transaction from origin to destination is carefully tested for
reliable results. The test verifies that the test data are virtually
comparable to live data used after conversion.
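A transaction path test of this kind can be sketched as a record pushed through every stage from origin to destination, with the result checked at the end. The stages below (capture, validate, post) are hypothetical; a real path would mirror the candidate system's flow:

```python
# Hypothetical stages a transaction passes through on its path.
def capture(txn):
    txn["captured"] = True
    return txn

def validate(txn):
    txn["valid"] = txn.get("amount", 0) > 0
    return txn

def post(txn):
    txn["posted"] = txn["captured"] and txn["valid"]
    return txn

PATH = [capture, validate, post]

def run_path(txn):
    """Trace one transaction through every stage of its path."""
    for stage in PATH:
        txn = stage(txn)
    return txn

# Every condition gets its own test transaction: one that should reach
# its destination, and one that should be rejected along the way.
assert run_path({"amount": 100.0})["posted"] is True
assert run_path({"amount": 0.0})["posted"] is False
```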

Nan User Training and converting


designed to prepare the user for testing
User training is
take place parallel with program-
the system. User involvement and training
ming for three reasons:
training while the
1. The system group has time available
to spend on

programs are being written.


a clearer
2. user-training program gives the systems group
Initiating a
the new system.
image of the user's interest in
3. A trained user participates more effectively in system testing

For user training, preparation of a checklist is useful (see Figure
12-2). Included are provisions for developing training materials and
other documents to complete the training activity. In effect, the
checklist calls for a commitment of personnel, facilities, and efforts
for implementing the candidate system.

The training plan is followed by preparation of the user training
manual and other text materials. Facility requirements and the
necessary hardware are specified and documented. A common procedure is
to train supervisors
FIGURE 12-2 User Training Checklist

Company ______  Project Name ______  Date ______

Activity                        Start  Completion  Staff in   Department
                                Date   Date        Charge     in Charge         Comments
1. Notification
   Announcement to officers     10/06  10/20       P. Solen   Personnel Mgr.
   Announcement to employees    10/06  10/20       J. Hill    Auditing
   Coordinated customer
     activities                 10/06  10/29       D. Stang   Cashier
   Coordinate computer service  10/06  10/29       C. Sibley  Sr. Vice Pres.
2. Procedures
   Interdepartmental            10/14  11/01       A. Blake   Systems
   Interdepartmental            10/14  11/01       J. Hill    Auditing
3. Forms
   Design                       11/01  11/14       A. Blake   Systems
   Printing                     11/01  11/20       A. Blake   Systems
4. Equipment
   Terminals                    11/01  12/15
5. Training and orientation
   Manuals                      12/01  12/16       A. Blake   Systems
   Training aids                12/01  12/16       A. Blake   Systems
   Special workshops            12/10  12/14       A. Blake   Systems
6. Lobby layout                 12/10  12/30       D. Stang   President
7. Supplies                     12/10  12/15       M. Steed   Purchasing Agent
8. Personnel
   Transfers                    12/10  12/12       P. Solen   Personnel Mgr.
   New hires                    12/15  12/30       P. Solen   Personnel Mgr.

Approved by: ______ (Project leader)  ______ (Systems Department)

and department heads who, in turn, train their staff as they see fit.
The reasons are obvious:

1. User supervisors are knowledgeable about the capabilities of their
staff and the overall operation.

2. Staff members usually respond more favorably and accept
instructions better from supervisors than from outsiders.

3. Familiarity of users with their particular problems makes them
better candidates for handling user training than the systems group.

The analyst gets feedback to ensure that proper training is provided.
Compile/Assemble Programs

All programs have to be compiled/assembled for testing. Before this,
however, a complete program description should be available, detailing
the purpose of the program, its use, the person(s) who prepared it,
the programmer, and the amount of computer time it takes to run it.
Program and system flowcharts of the project should also be available
for reference.

In addition to these activities, desk checking the source code
uncovers programming errors or inconsistencies. Before actual program
testing, a run order schedule and test scheme are finalized. A run
order schedule specifies the transactions to test and the order in
which they should be tested. High-priority transactions that make
special demands on the candidate system are tested first. In contrast,
a test scheme specifies how program software should be debugged. A
common approach, called bottom-up programming, tests small-scale
program modules, which are linked to a higher-level module, and so on
until the program is completed. An alternative is the top-down
approach, where the general program is tested first, followed by the
addition of program modules, one level at a time, to the lowest level.
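The bottom-up scheme described above can be illustrated with two small-scale modules linked into a higher-level one (the module names are hypothetical):

```python
# Small-scale modules, tested first under the bottom-up scheme.
def parse_amount(text):
    """Convert an input field to a monetary amount."""
    return round(float(text), 2)

def apply_discount(amount, rate):
    """Apply a fractional discount to an amount."""
    return round(amount * (1 - rate), 2)

# Higher-level module that links the two modules above.
def price_order(text, rate):
    return apply_discount(parse_amount(text), rate)

# Step 1: test each low-level module in isolation.
assert parse_amount("19.99") == 19.99
assert apply_discount(100.00, 0.25) == 75.00

# Step 2: only after both pass, test the linked higher-level module.
assert price_order("100.00", 0.25) == 75.00
print("bottom-up tests passed")
```

Under the top-down alternative, `price_order` would be tested first against stubs standing in for the two lower modules.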

Prepare Job Performance Aids

In this activity the materials to be used by personnel to run the
system are specified and scheduled. This includes a display of
materials such as program codes, a list of input codes attached to the
CRT terminal, and a posted instruction schedule to load the disk
drive. These aids reduce the training time and employ personnel at
lower positions.

Prepare Operational Documents

During the test plan stage, all operational documents are finalized,
including copies of the operational formats required by the candidate
system. Related to operational documentation is a section on the
experience, training, and educational qualifications of personnel for
the proper operation of the new system.

System Testing

The purpose of system testing is to identify and correct errors in the
candidate system. As important as this phase is, it is one that is
frequently compromised. Typically, the project is behind schedule or
the user is eager to go directly to conversion.

In system testing, performance and acceptance standards are developed.
Substandard performance or service interruptions that result in system
failure are checked during the test. The following performance
criteria are used for system testing:

1. Turnaround time is the elapsed time between the receipt of the
input and the availability of the output. In online systems,
high-priority processing is handled during peak hours, while
low-priority processing is done later in the day or during the night
shift. The objective is to decide on and evaluate all the factors that
might have a bearing on the turnaround time for handling all
applications.
2. Backup relates to procedures to be used when the system is down.
Backup plans might call for the use of another computer. The software
for the candidate system must be tested for compatibility with a
backup computer. In case of a partial system breakdown, provisions
must be made for dynamic reconfiguration of the system. For example,
in an online environment, when the printer breaks down, a provisional
plan might call for automatically "dumping" the output on tape until
the service is restored.

3. File protection pertains to storing files in a separate area for
protection against fire, flood, or natural disaster. Plans should also
be established for reconstructing files damaged through a hardware
malfunction.

4. The human factor applies to the personnel of the candidate system.
During system testing, lighting, air conditioning, noise, and other
environmental factors are evaluated with people's desks, chairs, CRTs,
etc. Hardware should be designed to match human comfort. This is
referred to as ergonomics. It is becoming an extremely important issue
in system development.
Types of System Tests

After a test plan has been developed, system testing begins by testing
program modules separately, followed by testing "bundled" modules as a
unit. A program module may function perfectly in isolation but fail
when interfaced with other modules. The approach is to test each
entity with successively larger ones, up to the system test level.

System testing consists of the following steps:

1. Program(s) testing.
2. String testing.
3. System testing.
4. System documentation.
5. User acceptance testing.

Each step is briefly explained here.
Program Testing. A program represents the logical elements of a
system. For a program to run satisfactorily, it must compile and test
data correctly and tie in properly with other programs. Achieving an
error-free program is the responsibility of the programmer. Program
testing checks for two types of errors: syntax and logic. A syntax
error is a program statement that violates one or more rules of the
language in which it is written. An improperly defined field dimension
or omitted key words are common syntax errors. These errors are shown
through error messages generated by the computer. A logic error, on
the other hand, deals with incorrect data fields, out-of-range items,
and invalid combinations. Since diagnostics do not detect logic
errors, the programmer must examine the output for them.

When a program is tested, the actual output is compared with the
expected output. When there is a discrepancy, the sequence of
instructions must be traced to determine the problem. The process is
facilitated by breaking the program down into self-contained portions,
each of which can be checked at certain key points. The idea is to
compare program values against desk-calculated values to isolate the
problem.
String Testing. Programs are invariably related to one another and
interact in a total system. Each program is tested to see whether it
conforms to related programs in the system. Each portion of the system
is tested against the entire module with both test and live data
before the entire system is ready to be tested.
System Testing. System testing is designed to uncover weaknesses that
were not found in earlier tests. This includes forced system failure
and validation of the total system as it will be implemented by its
user(s) in the operational environment. Generally, it begins with low
volumes of transactions based on live data. The volume is increased
until the maximum level for each transaction type is reached. The
total system is also tested for recovery and fallback after various
major failures to ensure that no data are lost during the emergency.
All this is done with the old system still in operation. After the
candidate system passes the test, the old system is discontinued.
System Documentation. All design and test documentation should be
finalized and entered in the project library for future reference. The
library is the central location for maintenance of the new system. The
format, organization, and language of each documentation should be in
line with system standards.
User Acceptance Testing. An acceptance test has the objective of
selling the user on the validity and reliability of the system. It
verifies that the system's procedures operate to system specifications
and that the integrity of vital data is maintained. Performance of an
acceptance test is actually the user's show. User motivation and
knowledge are critical for the successful performance of the system.
Then a comprehensive test report is prepared. The report indicates the
system's tolerance, performance range, error rate, and accuracy.
QUALITY ASSURANCE

The amount and complexity of software produced today stagger the
imagination. Software development strategies have not kept pace,
however, and software products fall short of meeting application
objectives. Consequently, controls must be developed to ensure a
quality product. Basically, quality assurance defines the objectives
of the project and reviews the overall activities so that errors are
corrected early in the development process. Steps are taken in each
phase to ensure that there are no errors in the final software.
Quality Assurance Goals in the Systems Life Cycle

The software life cycle includes various stages of development, and
each stage has the goal of quality assurance. The goals and their
relevance to quality assurance of the system are summarized next.
Quality Factors Specifications

The goal of this stage is to define the factors that contribute to the
quality of the candidate system. Several factors determine the quality
of a system:

1. Correctness - the extent to which a program meets system
specifications and user objectives.
2. Reliability - the degree to which the system performs its intended
functions over a time.
3. Efficiency - the amount of computer resources required by a program
to perform a function.
4. Usability - the effort required to learn and operate a system.
5. Maintainability - the ease with which program errors are located
and corrected.
6. Testability - the effort required to test a program to ensure its
correct performance.
7. Portability - the ease of transporting a program from one hardware
configuration to another.
8. Accuracy - the required precision in input editing, computations,
and output.
9. Error tolerance - error detection and correction versus error
avoidance.
10. Expandability - the ease of adding to or expanding the existing
database of the system.
11. Access control and audit - control of access to the system and the
extent to which that access can be audited.
12. Communicativeness - how descriptive or useful the inputs and
outputs of the system are.
Software Requirements Specifications

The quality assurance goal of this stage is to generate the
requirements document that provides the technical specifications for
the design and development of the software. This document enhances the
system's quality by formalizing communication between the system
developer and the user and provides the proper information for
accurate documentation.
Software Design Specifications

In this stage, the software design document defines the overall
architecture of the software that provides the functions and features
described in the software requirements document. It addresses the
question: How will it be done? The document describes the logical
subsystems and their respective physical modules. It ensures that all
conditions are covered.
Software Testing and Implementation

The quality assurance goal of the testing phase is to ensure the
completeness and accuracy of the system and minimize the retesting
process. In the implementation phase, the goal is to provide a logical
order for the creation of the modules and, in turn, the creation of
the system.
Maintenance and Support

This phase provides the necessary software adjustment for the system
to continue to comply with the original specifications. The quality
assurance goal is to develop a procedure for correcting errors and
enhancing software. This procedure improves quality assurance by
encouraging complete reporting and logging of problems, ensuring that
reported problems are promptly forwarded to the appropriate group for
resolution, and reducing redundant effort by making known problem
reports available to any department that handles complaints.
Levels of Quality Assurance

There are three levels of quality assurance: testing, validation, and
certification.

In system testing, the common view is to eliminate program errors.
This is extremely difficult and time-consuming, since designers cannot
prove 100 percent accuracy. Therefore, all that can be done is to put
the system through a "fail test" cycle: determine what will make it
fail. A successful test, then, is one that finds errors. The test
strategies discussed earlier are used in system testing.
System validation checks the quality of the software in both simulated
and live environments. First the software goes through a phase (often
referred to as alpha testing) in which errors and failures based on
simulated user requirements are verified and studied. The modified
software is then subjected to phase two (called beta testing) in the
actual user's site or a live environment. The system is used regularly
with live transactions. After a scheduled time, failures and errors
are documented and final correction and enhancements are made before
the package is released for use.

The third level of quality assurance is to certify that the program or
software package is correct and conforms to standards. With a growing
trend toward purchasing ready-to-use software, certification has
become more important. A package that is certified goes through a team
of specialists who test, review, and determine how well it meets the
vendor's claims. Certification is actually issued after the package
passes the test. Certification, however, does not assure the user that
it is the best package to adopt; it only attests that it will perform
what the vendor claims.

In summary, the quality of an information system depends on its
design, testing, and implementation. One aspect of system quality is
its reliability or the assurance that it does not produce costly
failures. The strategy of error tolerance (detection and correction)
rather than error avoidance is the basis for successful testing and
quality assurance.
TRENDS IN TESTING

In the future, we can expect unparalleled growth in the use of
automated tools and software aids for development and testing. One
such tool is the functional tester, which determines whether the
hardware is operating up to a minimal standard. It is a computer
program that controls the complete hardware configuration and verifies
that it is functional. For example, it can test computer memory by
performing read/write tests, and it tests each peripheral device
individually.

The functional tester is of great value when minute hardware problems
are disguised as software bugs. For example, hardware faults are
generally repeatable, whereas software bugs are usually erratic.
Problems arise when delicate interaction between hardware and software
causes a hardware problem to appear as an erratic software bug. A
functional tester determines immediately that the problem is in the
hardware. This saves considerable time during testing.

Another software aid is the debug monitor. It is a computer program
that regulates and modifies the applications software that is being
tested. It can also control the execution of functional tests and
automatically patch or modify the application program being tested.
The use of these tools will increase as systems grow in size and
complexity and as the verification and insurance of software become
increasingly important.
Gher
DATA PROCESSING AUDITOD
ROLE OF THE
system ought to include
test of any
The planned
introduce control elements unigue to the
epT
technique and slem
auditor should be involved in most nh most phases
the systTheem
Dethi

processing (DP)
system testing.
In the past, auditors
have ot pulpr
cycle, especially
been installed. Then the cost is often too
audited tat
are

after they have


the system to incorporate adequate
ibite sysen Aures
back and modify
audit contols must be built into the system design and ntrol. Thersel
ested
the analyst and the DP. The auditor. auditor's with
ole t
Rees

cooperation of both
and make recommendations to the ws
judge the controls
jnils.

charge of the project. The user department ould participate in Hev

for the system to ensure that adeat spE


specifications
the control
has been provided.
For testing programs, test data must include transactions specifically designed to violate the control procedures incorporated in the program, as well as valid transactions to test their acceptance by the system. The setup control must be carefully examined. At the time programs are tested, all required files are accumulated and set out in the proper order and format for final testing.
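The idea of pairing valid transactions with transactions built to violate the program's controls can be sketched as follows. The field names and control rules here are illustrative assumptions, not taken from the text:

```python
def edit_controls(txn):
    """Control procedures an update program might enforce; the returned
    list is empty when the transaction passes every control."""
    errors = []
    if txn.get("account") is None:
        errors.append("missing account number")
    if not isinstance(txn.get("amount"), (int, float)) or txn["amount"] <= 0:
        errors.append("amount must be a positive number")
    if txn.get("code") not in {"DEP", "WDL"}:
        errors.append("unknown transaction code")
    return errors

# Test data: one valid transaction plus one designed to violate each control.
test_transactions = [
    {"account": 1001, "amount": 250.0, "code": "DEP"},   # valid
    {"account": None, "amount": 250.0, "code": "DEP"},   # violates account control
    {"account": 1001, "amount": -5.0,  "code": "WDL"},   # violates amount control
    {"account": 1001, "amount": 10.0,  "code": "XXX"},   # violates code control
]

results = [edit_controls(t) for t in test_transactions]
print(results[0])          # [] -- the valid transaction is accepted
print(all(results[1:]))    # True -- every deliberately bad one is caught
```

A test set built this way exercises both the acceptance path and each rejection path, which is exactly what the auditor needs to see demonstrated.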

The Audit Trail

An important function of control is to provide for the audit trail. In designing tests, the auditor is concerned about the changing nature of the audit trail. In an online environment, the form, content, and accessibility of records are such that the auditor often has difficulty following a single transaction completely through the system. Some of the following changes in the audit trail confront the auditor:
1. Source documents are no longer used by the system after they are transcribed onto a machine-readable medium. They are often filed in areas or ways that make later access difficult. In direct data entry, traditional source documents are simply eliminated.
2. Files stored on tape or disk can be read only by a computer, which limits the auditing function. A data dump is possible, though, to compare the data against a data map.
3. Processing activities are difficult to observe, since most of them occur within the system. It is possible, however, to get a trace when required.
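The dump-versus-map comparison mentioned in item 2 can be sketched briefly. The fixed-length record layout below is an assumption chosen for illustration:

```python
# A "data map" describing a fixed-length record layout (assumed for illustration):
# each entry gives a field name and its start/end byte offsets in the record.
DATA_MAP = [("account", 0, 5), ("code", 5, 8), ("amount", 8, 14)]

record = b"10017DEP004250"   # one raw record as it might appear in a data dump

def apply_data_map(raw, data_map):
    """Decode a dumped record field by field so the auditor can compare
    the stored data against what the map says should be there."""
    return {name: raw[start:end].decode("ascii") for name, start, end in data_map}

decoded = apply_data_map(record, DATA_MAP)
print(decoded["code"])      # 'DEP'
print(decoded["amount"])    # '004250'
```

Without such a map, the dumped bytes are unreadable to the auditor — which is precisely why machine-readable files limit the auditing function.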

One way to maintain a viable audit trail is to keep a detailed file of transactions as they occur. The file can then be the input for an audit program that extracts the transactions for selected accounts and prints them so that the auditor can trace the status of an account in detail.

CHAPTER 12 / SYSTEM TESTING AND QUALITY ASSURANCE    373
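The technique just described — a detailed transaction file fed to an audit program that extracts the activity for selected accounts — can be sketched like this. The record layout and account numbers are assumptions for illustration:

```python
# Detailed file of transactions kept as they occur (layout assumed).
transaction_file = [
    {"account": "A-100", "seq": 1, "type": "DEP", "amount": 500.0},
    {"account": "B-200", "seq": 2, "type": "DEP", "amount": 75.0},
    {"account": "A-100", "seq": 3, "type": "WDL", "amount": 120.0},
    {"account": "A-100", "seq": 4, "type": "DEP", "amount": 40.0},
]

def audit_extract(transactions, selected_accounts):
    """Audit program: pull every transaction for the selected accounts,
    in order of occurrence, so the account's status can be traced."""
    return [t for t in transactions if t["account"] in selected_accounts]

trail = audit_extract(transaction_file, {"A-100"})
for t in trail:
    print(t["seq"], t["type"], t["amount"])   # the auditor steps through the account
print(len(trail))                             # 3
```

Printing the extracted transactions in sequence reconstructs, for one account, the trail that the online environment otherwise obscures.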
Given the important role of the auditor, he/she is expected to be part of the development team, which includes the user. As an independent adviser, the role is to judge the controls and make recommendations to the team. Three important steps are considered in the evaluations:

1. Define the control objectives as separate design and test requirements. Input preparation and transmission by the user are important control areas, with an emphasis on audit trails, error-correction procedures, and adequate documentation during testing. These areas should always be present and well documented.
2. Reexamine costs to see whether system testing is within the budget limits.
3. Review specifications. The auditor should evaluate program acceptance test specifications and assist the systems programmer in developing test controls, standards, various levels of testing, and actual test conditions. He/she should also evaluate the actual system acceptance test to ensure an acceptable level of confidence and reliability.
In summary, it is the auditor's responsibility to build controls into candidate systems to ensure integrity, reliability, and confidence of the users at all levels. The auditor should be called in during design as well as testing so that any suggestions he/she has can be considered before implementation, when changes are less costly. Including the auditor in the system development team makes it easier for him/her to monitor testing procedures and consider the acceptance of new controls to replace those changed by the new design.
Summary

1. Inadequate testing or nontesting leads to errors that may be costly when they appear months later. Effective testing translates into cost savings from reduced errors. It also has utility as a user-oriented vehicle before implementation.
2. A candidate system is subject to a variety of tests: online response, volume, stress, recovery and security, and usability tests. Each test has a unique benefit for a successful installation.
3. Test data may be artificial or live (taken from the user's files). In either case, they should provide all combinations of values or formats to test all logic and transaction path subroutines.
4. The activity network for system testing entails the following:
   a. Prepare test plan.
   b. Specify conditions for user acceptance testing.
   c. Prepare test data for program testing.
