
Verification and Validation


Objectives
Ô To introduce software verification and validation and
to discuss the distinction between them
Ô To describe the program inspection process and its
role in V & V
Ô To explain static analysis as a verification technique
Ô To describe the Cleanroom software development
process


Topics covered
Ô Verification and validation planning
Ô Software inspections
Ô Automated static analysis
Ô Cleanroom software development


Verification vs validation
Ô Verification:
"Are we building the product right?"
Ô The software should conform to its
specification.
Ô Validation:
"Are we building the right product?"
Ô The software should do what the user really
requires.


The V & V process
Ô Is a whole life-cycle process - V & V must be
applied at each stage in the software process.
Ô Has two principal objectives
‡ The discovery of defects in a system;
‡ The assessment of whether or not the system is
useful and usable in an operational situation.


V & V goals
Ô Verification and validation should establish
confidence that the software is fit for purpose.
Ô This does NOT mean completely free of defects.
Ô Rather, it must be good enough for its intended use,
and the type of use will determine the degree of
confidence that is needed.


V & V confidence
Ô Depends on the system's purpose, user
expectations and marketing environment
‡ Software function
The level of confidence depends on how critical the
software is to an organisation.
‡ User expectations
Users may have low expectations of certain kinds of
software.
‡ Marketing environment
Getting a product to market early may be more
important than finding defects in the program.


Static and dynamic verification
Ô Software inspections: Concerned with analysis of
the static system representation to discover
problems (static verification)
‡ May be supplemented by tool-based document and code
analysis
Ô Software testing: Concerned with exercising and
observing product behaviour (dynamic verification)
‡ The system is executed with test data and its operational
behaviour is observed


Static and dynamic V&V


Program testing
Ô Can reveal the presence of errors, NOT their
absence.
Ô The only validation technique for non-functional
requirements, as the software has to be executed to
see how it behaves.
Ô Should be used in conjunction with static verification
to provide full V & V coverage.


Types of testing
Ô Defect testing
‡ Tests designed to discover system defects.
‡ A successful defect test is one which reveals the
presence of defects in a system.
‡ Covered in Chapter 23
Ô Validation testing
‡ Intended to show that the software meets its
requirements.
‡ A successful test is one that shows that a requirement
has been properly implemented.
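The distinction can be illustrated with a small sketch in Python (the `discount` function and its required behaviour are invented for illustration):

```python
# Hypothetical function under test: apply a percentage discount to a price.
def discount(price, percent):
    return price * (100 - percent) / 100

# Validation test: demonstrates that a stated requirement
# ("a 10% discount reduces 200 to 180") has been implemented.
assert discount(200, 10) == 180

# Defect tests: probe boundary values, where defects typically hide.
# A successful defect test is one that makes the program fail.
assert discount(200, 0) == 200     # boundary: no discount
assert discount(200, 100) == 0     # boundary: full discount
```

The same function passes both kinds of test here, but the intent differs: the validation test traces back to a requirement, while the defect tests deliberately seek out inputs likely to expose faults.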


Testing and debugging
Ô Defect testing and debugging are distinct
processes.
Ô Verification and validation is concerned with
establishing the existence of defects in a program.
Ô Debugging is concerned with locating and repairing
these errors.
Ô Debugging involves formulating hypotheses about
program behaviour, then testing these hypotheses
to find the system error.


The debugging process


V & V planning
Ô Careful planning is required to get the most out of
testing and inspection processes.
Ô Planning should start early in the development
process.
Ô The plan should identify the balance between static
verification and testing.
Ô Test planning is about defining standards for the
testing process rather than describing product tests.


The V-model of development


The structure of a software test plan

Ô The testing process.
Ô Requirements traceability.
Ô Tested items.
Ô Testing schedule.
Ô Test recording procedures.
Ô Hardware and software requirements.
Ô Constraints.


The software test plan
The testing process
A description of the major phases of the testing process.

Requirements traceability
Users are most interested in the system meeting its requirements, and
testing should be planned so that all requirements are individually
tested.

Tested items
The products of the software process that are to be tested should be
specified.

Testing schedule
An overall testing schedule and resource allocation for this schedule.
This is linked to the more general project development schedule.

Test recording procedures
It is not enough simply to run tests; the results of the tests must be
systematically recorded. It must be possible to audit the testing
process to check that it has been carried out correctly.

Hardware and software requirements
This section should set out the software tools required and estimated
hardware utilisation.

Constraints
Constraints affecting the testing process, such as staff shortages,
should be anticipated in this section.

Software inspections
Ô These involve people examining the source
representation with the aim of discovering anomalies
and defects.
Ô Inspections do not require execution of a system, so
they may be used before implementation.
Ô They may be applied to any representation of the
system (requirements, design, configuration data,
test data, etc.).
Ô They have been shown to be an effective technique
for discovering program errors.


Inspection success
Ô Many different defects may be discovered in a single
inspection. In testing, one defect may mask another,
so several executions are required.
Ô Inspections reuse domain and programming
knowledge, so reviewers are likely to have seen the
types of error that commonly arise.


Inspections and testing
Ô Inspections and testing are complementary, not
opposing, verification techniques.
Ô Both should be used during the V & V process.
Ô Inspections can check conformance with a
specification but not conformance with the
customer's real requirements.
Ô Inspections cannot check non-functional
characteristics such as performance, usability, etc.


Program inspections
Ô A formalised approach to document reviews.
Ô Intended explicitly for defect detection (not
correction).
Ô Defects may be logical errors, anomalies in the code
that might indicate an erroneous condition (e.g. an
uninitialised variable) or non-compliance with
standards.

Inspection pre-conditions
Ô A precise specification must be available.
Ô Team members must be familiar with the
organisation's standards.
Ô Syntactically correct code or other system
representations must be available.
Ô An error checklist should be prepared.
Ô Management must accept that inspection will
increase costs early in the software process.
Ô Management should not use inspections for staff
appraisal, i.e. finding out who makes mistakes.


The inspection process

Inspection procedure
Ô A system overview is presented to the inspection
team.
Ô Code and associated documents are distributed to
the inspection team in advance.
Ô The inspection takes place and discovered errors
are noted.
Ô Modifications are made to repair discovered errors.
Ô Re-inspection may or may not be required.

Inspection roles

Author or owner
The programmer or designer responsible for producing the program or
document. Responsible for fixing defects discovered during the
inspection process.

Inspector
Finds errors, omissions and inconsistencies in programs and
documents. May also identify broader issues that are outside the
scope of the inspection team.

Reader
Presents the code or document at an inspection meeting.

Scribe
Records the results of the inspection meeting.

Chairman or moderator
Manages the process and facilitates the inspection. Reports process
results to the chief moderator.

Chief moderator
Responsible for inspection process improvements, checklist updating,
standards development, etc.

Inspection checklists
Ô A checklist of common errors should be used to
drive the inspection.
Ô Error checklists are programming-language
dependent and reflect the characteristic errors that
are likely to arise in the language.
Ô In general, the 'weaker' the type checking, the larger
the checklist.
Ô Examples: initialisation, constant naming, loop
termination, array bounds, etc.
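Some checklist items can be partially automated. A minimal sketch, assuming Python as the inspected language (the two checks and their thresholds are invented for illustration; real checklists are far richer and language-specific):

```python
import ast

def unnamed_constants(tree):
    """Checklist item 'have all constants been named?':
    flag numeric literals other than 0 and 1 ('magic numbers')."""
    return [n.value for n in ast.walk(tree)
            if isinstance(n, ast.Constant)
            and isinstance(n.value, (int, float))
            and not isinstance(n.value, bool)
            and n.value not in (0, 1)]

def nonterminating_loops(tree):
    """Checklist item 'is each loop certain to terminate?':
    flag 'while True' loops that contain no break statement."""
    return [n.lineno for n in ast.walk(tree)
            if isinstance(n, ast.While)
            and isinstance(n.test, ast.Constant) and n.test.value is True
            and not any(isinstance(c, ast.Break) for c in ast.walk(n))]

source = "while True:\n    x = 42\n"
tree = ast.parse(source)
print(nonterminating_loops(tree))   # [1] - the loop on line 1 never exits
print(unnamed_constants(tree))      # [42] - an unnamed 'magic number'
```

Checks like these only flag candidates for the human inspectors; they do not replace the meeting itself.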

Inspection checks 1

Data faults
Are all program variables initialised before their values are used?
Have all constants been named?
Should the upper bound of arrays be equal to the size of the array or
Size -1?
If character strings are used, is a delimiter explicitly assigned?
Is there any possibility of buffer overflow?

Control faults
For each conditional statement, is the condition correct?
Is each loop certain to terminate?
Are compound statements correctly bracketed?
In case statements, are all possible cases accounted for?
If a break is required after each case in case statements, has it been
included?

Input/output faults
Are all input variables used?
Are all output variables assigned a value before they are output?
Can unexpected inputs cause corruption?

Interface faults
Do all function and method calls have the correct number of
parameters?
Do formal and actual parameter types match?
Are the parameters in the right order?
If components access shared memory, do they have the same model of
the shared memory structure?

Inspection checks 2

Storage management faults
If a linked structure is modified, have all links been correctly
reassigned?
If dynamic storage is used, has space been allocated correctly?
Is space explicitly de-allocated after it is no longer required?

Exception management faults
Have all possible error conditions been taken into account?

Inspection rate
Ô 500 statements/hour during overview.
Ô 125 source statements/hour during individual
preparation.
Ô 90-125 statements/hour can be inspected.
Ô Inspection is therefore an expensive process.
Ô Inspecting 500 lines costs about 40 man-hours of
effort - about £2800 at UK rates.
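The 40-hour figure can be reproduced from the rates above, assuming a four-person team and a fully-loaded cost of £70 per person-hour (both are assumptions; the slide states only the totals):

```python
lines = 500
team_size = 4        # assumed: a typical four-person inspection team
rate_gbp = 70        # assumed: per person-hour, implied by 2800 / 40

overview_h = lines / 500     # 500 statements/hour during overview
prep_h     = lines / 125     # 125 statements/hour individual preparation
meeting_h  = lines / 100     # ~90-125 statements/hour in the meeting

person_hours = team_size * (overview_h + prep_h + meeting_h)
cost = person_hours * rate_gbp
print(person_hours, cost)    # 40.0 person-hours, 2800.0 GBP
```

Every team member pays the preparation and meeting cost, which is why the total climbs so quickly.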

Automated static analysis
Ô Static analysers are software tools for source text
processing.
Ô They parse the program text and try to discover
potentially erroneous conditions, bringing these to
the attention of the V & V team.
Ô They are very effective as an aid to inspections -
a supplement to, but not a replacement for,
inspections.


Static analysis checks

Data faults
Variables used before initialisation
Variables declared but never used
Variables assigned twice but never used between assignments
Possible array bound violations
Undeclared variables

Control faults
Unreachable code
Unconditional branches into loops

Input/output faults
Variables output twice with no intervening assignment

Interface faults
Parameter type mismatches
Parameter number mismatches
Non-usage of the results of functions
Uncalled functions and procedures

Storage management faults
Unassigned pointers
Pointer arithmetic
Memory leaks

Stages of static analysis
Ô Control flow analysis: Checks for loops with
multiple exit or entry points, finds unreachable
code, etc.
Ô Data use analysis: Detects uninitialised variables,
variables written twice without an intervening
assignment, variables which are declared but never
used, etc.
Ô Interface analysis: Checks the consistency of
routine and procedure declarations and their use.
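A toy data-use analysis can be sketched in a few lines of Python. This is a sketch only: it walks top-level statements in source order and ignores control flow, scoping and builtins, all of which a real analyser must handle:

```python
import ast

def used_before_assignment(src):
    """Report (name, line) pairs where a variable is read before
    any assignment to it - a classic data-use anomaly."""
    assigned, anomalies = set(), []
    for stmt in ast.parse(src).body:      # top-level statements, in order
        # The right-hand side of an assignment is evaluated first,
        # so check loads before recording this statement's stores.
        for node in ast.walk(stmt):
            if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load):
                if node.id not in assigned:
                    anomalies.append((node.id, node.lineno))
        for node in ast.walk(stmt):
            if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
                assigned.add(node.id)
    return anomalies

buggy = "total = count + 1\ncount = 0\n"
print(used_before_assignment(buggy))   # [('count', 1)]
```

Here `count` is read on line 1 but only assigned on line 2, exactly the kind of anomaly a data-use stage brings to the attention of the V & V team.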


Stages of static analysis
Ô Information flow analysis: Identifies the
dependencies of output variables. Does not detect
anomalies itself but highlights information for code
inspection or review.
Ô Path analysis: Identifies paths through the program
and sets out the statements executed in each path.
Again, potentially useful in the review process.
Ô Both these stages generate vast amounts of
information, so they must be used with care.

LINT static analysis

138% more lint_ex.c
#include <stdio.h>
printarray (Anarray)
int Anarray;
{
   printf("%d", Anarray);
}

main ()
{
   int Anarray[5]; int i; char c;
   printarray (Anarray, i, c);
   printarray (Anarray);
}

139% cc lint_ex.c
140% lint lint_ex.c

lint_ex.c(10): warning: c may be used before set
lint_ex.c(10): warning: i may be used before set
printarray: variable # of args. lint_ex.c(4) :: lint_ex.c(10)
printarray, arg. 1 used inconsistently. lint_ex.c(4) :: lint_ex.c(10)
printarray, arg. 1 used inconsistently. lint_ex.c(4) :: lint_ex.c(11)
printf returns value which is always ignored

Use of static analysis
Ô Particularly valuable when a language such as C is
used, which has weak typing and hence many errors
are undetected by the compiler.
Ô Less cost-effective for languages like Java that have
strong type checking and can therefore detect many
errors during compilation.


Verification and formal methods
Ô Formal methods can be used when a mathematical
specification of the system is produced.
Ô They are the ultimate static verification technique.
Ô They involve detailed mathematical analysis of the
specification and may develop formal arguments
that a program conforms to its mathematical
specification.


Arguments for formal methods
Ô Producing a mathematical specification requires a
detailed analysis of the requirements, and this is
likely to uncover errors.
Ô They can detect implementation errors before
testing, when the program is analysed alongside the
specification.


Arguments against formal methods

Ô Require specialised notations that cannot be
understood by domain experts.
Ô Very expensive to develop a specification and even
more expensive to show that a program meets that
specification.
Ô It may be possible to reach the same level of
confidence in a program more cheaply using other
V & V techniques.


Cleanroom software development
Ô The name is derived from the 'Cleanroom' process
in semiconductor fabrication. The philosophy is
defect avoidance rather than defect removal.
Ô This software development process is based on:
‡ Incremental development;
‡ Formal specification;
‡ Static verification using correctness arguments;
‡ Statistical testing to determine program reliability.
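Statistical testing can be sketched as follows. The operational profile weights and the seeded defect below are invented for illustration; a real certification team derives the profile from observed or predicted usage:

```python
import random

def component(request_kind):
    """Hypothetical component, seeded with a defect that only
    rare request kinds trigger (True = success, False = failure)."""
    return request_kind != "rare"

# Assumed operational profile: how often each request kind occurs in use.
profile = {"common": 0.90, "occasional": 0.09, "rare": 0.01}

random.seed(1)                    # fixed seed so the run is repeatable
kinds, weights = list(profile), list(profile.values())

runs = 10_000
failures = sum(not component(random.choices(kinds, weights)[0])
               for _ in range(runs))

# Probability of failure on demand, estimated from the statistical run.
print(f"estimated POFOD: {failures / runs:.4f}")   # close to 0.01
```

Because inputs are drawn according to the profile, the estimate reflects the reliability users will actually experience, not reliability over a uniform input space.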


The Cleanroom process


Cleanroom process characteristics

Ô Formal specification using a state transition model.
Ô Incremental development where the customer
prioritises increments.
Ô Structured programming - limited control and
abstraction constructs are used in the program.
Ô Static verification using rigorous inspections.
Ô Statistical testing of the system (covered in Ch. 24).


Formal specification and inspections

Ô The state-based model is a system specification,
and the inspection process checks the program
against this model.
Ô The programming approach is defined so that the
correspondence between the model and the system
is clear.
Ô Mathematical arguments (not proofs) are used to
increase confidence in the inspection process.


Cleanroom process teams
Ô Specification team: Responsible for developing and
maintaining the system specification.
Ô Development team: Responsible for developing and
verifying the software. The software is NOT
executed or even compiled during this process.
Ô Certification team: Responsible for developing a set
of statistical tests to exercise the software after
development. Reliability growth models are used to
determine when reliability is acceptable.


Cleanroom process evaluation
Ô The results of using the Cleanroom process have
been very impressive, with few discovered faults in
delivered systems.
Ô Independent assessment shows that the process is
no more expensive than other approaches.
Ô There were fewer errors than in a 'traditional'
development process.
Ô However, the process is not widely used. It is not
clear how this approach can be transferred to an
environment with less skilled or less motivated
software engineers.


Key points
Ô Verification and validation are not the same thing.
Verification shows conformance with the
specification; validation shows that the program
meets the customer's needs.
Ô Test plans should be drawn up to guide the testing
process.
Ô Static verification techniques involve examination
and analysis of the program for error detection.


Key points
Ô Program inspections are very effective in
discovering errors.
Ô Program code in inspections is systematically
checked by a small team to locate software faults.
Ô Static analysis tools can discover program
anomalies which may be an indication of faults in
the code.
Ô The Cleanroom development process depends on
incremental development, static verification and
statistical testing.
