
Software Development Life Cycle (SDLC): specifies the various stages of software development.

1. System Requirements Analysis
2. Feasibility Study
3. Systems Analysis and Design
4. Code Generation
5. Testing
6. Implementation
7. Maintenance

Software Testing Life Cycle (STLC): specifies the various stages of testing.

1. Requirements Stage
   a. Requirement Specification documents
   b. Functional Specification documents
   c. Use Case documents
   d. Test Traceability Matrix for identifying test coverage (a minimal sketch follows this list)
2. Test Plan
   a. Test scope and test environment
   b. Different test phases and test methodologies
   c. Manual and automation testing
   d. Defect management, configuration management, risk management, etc.
3. Test Design
   a. Test case preparation
   b. Test Traceability Matrix for identifying test cases
   c. Test case reviews and approval
4. Test Execution
   a. Executing test cases
   b. Capturing, reviewing, and analyzing test results
5. Defect Tracking
   a. Finding defects and tracking them to closure
6. Bug Reporting
   a. Reporting defects in a tool or in Excel sheets
7. Regression/Retesting
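A test traceability matrix like the one referenced in the Requirements and Test Design stages links each requirement to the test cases that cover it, so coverage gaps surface immediately. The sketch below is a minimal illustration in Python; the requirement and test case IDs are hypothetical.

    # Minimal test traceability matrix: requirements mapped to the test
    # cases that cover them. All IDs are hypothetical placeholders.
    traceability = {
        "REQ-1 login accepts valid credentials": ["TC-01", "TC-02"],
        "REQ-2 login rejects invalid credentials": ["TC-03"],
        "REQ-3 account locks after 3 failed attempts": [],  # coverage gap
    }

    # Report coverage: any requirement with no test cases is a gap.
    for requirement, test_cases in traceability.items():
        status = ", ".join(test_cases) if test_cases else "NOT COVERED"
        print(f"{requirement}: {status}")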

Bug Life Cycle
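A defect typically moves through states such as New, Assigned, Open, Fixed, Retest, Verified, and Closed, with a Reopened path when a fix fails retesting. State names vary between defect-tracking tools, so the Python sketch below models one common life cycle as a small state machine rather than a standard.

    # One common bug life cycle, modeled as allowed state transitions.
    # State names and flow differ by organization and tool; this is an
    # illustrative sketch.
    TRANSITIONS = {
        "New":      {"Assigned"},
        "Assigned": {"Open"},
        "Open":     {"Fixed", "Deferred", "Rejected"},
        "Fixed":    {"Retest"},
        "Retest":   {"Verified", "Reopened"},
        "Reopened": {"Assigned"},
        "Verified": {"Closed"},
        "Deferred": {"Assigned"},
        "Rejected": set(),
        "Closed":   set(),
    }

    def move(state: str, new_state: str) -> str:
        # Advance a defect, rejecting transitions the cycle forbids.
        if new_state not in TRANSITIONS[state]:
            raise ValueError(f"illegal transition {state} -> {new_state}")
        return new_state

    state = "New"
    for nxt in ("Assigned", "Open", "Fixed", "Retest", "Verified", "Closed"):
        state = move(state, nxt)
        print(state)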

V&V Technique Taxonomy

This document describes over seventy-five V&V techniques and eighteen statistical techniques that can be used for model validation. Most of these techniques are derived from software engineering; the remainder are specific to the modeling and simulation (M&S) field. The software V&V techniques applicable to M&S V&V are presented in terms understandable by an M&S technical person. Some software V&V techniques have been modified for use in M&S V&V. The term testing is used frequently when referring to the implementation of these techniques because V&V involves testing the model or simulation to assess its credibility.

The V&V techniques discussed in this document are separated into four categories: informal, static, dynamic, and formal. The specific techniques in each category are discussed in separate write-ups accessible by the hot links listed.

Informal V&V techniques are among the most commonly used. They are called informal because their tools and approaches rely heavily on human reasoning and subjectivity without stringent mathematical formalism. See the hotlink Informal V&V Techniques for a discussion of the specific techniques in this category.

Static V&V techniques assess the accuracy of the static model design and source code. They do not require machine execution of the model, although mental execution can be used. These techniques are very popular and widely used, and many automated tools are available to assist in the V&V process. Static techniques can reveal a variety of information about the structure of the model, the modeling techniques used, data and control flow within the model, and syntactical accuracy (Whitner and Balci, 1989). See the hotlink Static V&V Techniques for a discussion of the specific techniques in this category.
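As a hedged illustration of one static technique, calling structure analysis, the Python sketch below uses the standard ast module to recover which functions each function calls without ever executing the analyzed code; the analyzed source is invented for the example.

    import ast
    import textwrap

    # Sample source under analysis (invented for illustration). Static
    # techniques inspect this text without executing it.
    SOURCE = textwrap.dedent("""
        def load(path):
            return open(path).read()

        def report(path):
            data = load(path)
            print(len(data))
        """)

    tree = ast.parse(SOURCE)

    # Calling structure analysis: map each function to the names it calls.
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            calls = [n.func.id for n in ast.walk(node)
                     if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)]
            print(f"{node.name} calls: {calls}")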

Dynamic V&V techniques require model execution; they evaluate the model based on its execution behavior. Most dynamic V&V techniques require model instrumentation: the insertion of additional code (probes or stubs) into the executable model to collect information about model behavior during execution. Dynamic V&V techniques are usually applied in three steps:

1. The executable model is instrumented.
2. The instrumented model is executed.
3. The model output is analyzed and dynamic model behavior is evaluated.

See the hotlink Dynamic V&V Techniques for a discussion of the specific techniques in this category.
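A minimal sketch of the three steps, assuming the model exposes a callable step function (the model below is an invented stand-in): the probe is implemented as a Python decorator that records each call so the collected trace can be analyzed after execution.

    import functools

    trace = []  # information collected by the probe during execution

    def probe(fn):
        # Instrumentation probe: record each call's inputs and output.
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            trace.append((fn.__name__, args, result))
            return result
        return wrapper

    # Step 1: instrument the executable model (a trivial stand-in here).
    @probe
    def model_step(state):
        return state * 2 + 1

    # Step 2: execute the instrumented model.
    state = 1
    for _ in range(3):
        state = model_step(state)

    # Step 3: analyze the output and evaluate dynamic behavior.
    for name, args, result in trace:
        print(f"{name}{args} -> {result}")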

Formal V&V techniques (or formal methods) are based on formal mathematical proofs of correctness and are the most thorough means of model V&V. The successful application of formal methods requires the model development process to be well defined and structured. Formal methods should be applied early in the model development process to achieve maximum benefit. Because formal techniques require significant effort, they are best applied to complex problems that cannot be handled by simpler methods. See the hotlink Formal V&V Techniques for a discussion of the specific techniques in this category.
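Formal proofs are carried out symbolically, but the flavor of the inductive assertions technique can be previewed in code: state a loop invariant, check it on entry and after every iteration, and conclude the postcondition at exit. The Python asserts below only check one execution; a formal proof would discharge the same conditions for all inputs.

    # Inductive-assertion sketch for a summation loop. Invariant:
    # after processing i items, total equals the sum of the first i.
    def checked_sum(items):
        total, i = 0, 0
        assert total == sum(items[:i])      # invariant holds on entry
        while i < len(items):
            total += items[i]
            i += 1
            assert total == sum(items[:i])  # preserved by each iteration
        assert total == sum(items)          # postcondition at exit
        return total

    print(checked_sum([3, 1, 4, 1, 5]))  # prints 14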

Although these categories share many of the same characteristics and individual V&V techniques can overlap from one to another, the complexity and the mathematical and logical formalism involved increase as the category becomes more formal.

The taxonomy table below lists all the techniques discussed. They are grouped according to the categories described above and hot links are provided for each category. To access a specific technique, the reader should select the category hot link. Links for individual techniques are provided at the beginning of the category write-up.

Verification and Validation Technique Taxonomy

Informal Techniques
- audit
- desk check
- face validation
- inspection
- review
- Turing test
- walk-through

Static Techniques
- cause-effect graphing
- control analyses: calling structure analysis, concurrent process analysis, control flow analysis, state transition analysis
- data analyses: data dependency analysis, data flow analysis
- fault/failure analysis
- interface analyses: model interface analysis, user interface analysis
- semantic analysis
- structural analysis
- symbolic evaluation
- syntax analysis
- traceability assessment

Dynamic Techniques
- acceptance test
- alpha test
- assertion check
- beta test
- bottom-up test
- comparison test
- compliance tests: authorization test, performance test, security test, standards test
- debugging
- execution tests: execution monitoring, execution profiling, execution tracing
- fault/failure insertion test
- field test
- functional test (black box test)
- graphical comparison
- interface tests: data interface test, model interface test, user interface test
- object-flow test
- partition test
- predictive validation
- product test
- regression test
- sensitivity analysis
- special input tests: boundary value test, equivalence partitioning test, extreme input test, invalid input test, real-time input test, self-driven input test, stress test, trace-driven input test
- statistical techniques
- structural tests (white box tests): branch test, condition test, data flow test, loop test, path test, statement test
- submodel/module test
- symbolic debugging
- top-down test
- visualization/animation

Formal Techniques
- induction
- inductive assertions
- inference
- lambda calculus
- logical deduction
- predicate calculus
- predicate transformation
- proof of correctness
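To make one taxonomy entry concrete: a boundary value test (one of the dynamic special input tests) exercises inputs at and just beyond the edges of each valid range. The Python sketch below tests a hypothetical validator with an assumed valid range of 0 to 130.

    # Boundary value test sketch for a hypothetical validator that
    # accepts ages in the range 0..130 inclusive (an assumed spec).
    def is_valid_age(age: int) -> bool:
        return 0 <= age <= 130

    # Exercise values at, just inside, and just outside each boundary.
    cases = {-1: False, 0: True, 1: True, 129: True, 130: True, 131: False}
    for value, expected in cases.items():
        assert is_valid_age(value) == expected, f"failed at {value}"
    print("all boundary cases passed")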

Guidelines for Selecting V&V Techniques

In the overall problem solving process diagram shown below, the V&V Process is depicted as a subprocess of the M&S Use Process that interacts with both the M&S Development/Preparation Process and the Accreditation Process. (See the diagrams for VV&A and the New Development Process, VV&A and Legacy M&S Preparation, and VV&A and Federation Construction for a more detailed view of these interactions.)

Any V&V process involves a series of activities and tasks that are selected to address the particular needs of the application and to map to the phases and activities of the particular development or preparation process involved. What tasks are selected and what techniques are chosen to accomplish them depend upon a number of factors, such as

- type of simulation (legacy, new M&S, federation)
- problem to be solved
- objectives and requirements and their acceptability criteria
- risks and priorities of the User
- constraints (time, money, personnel, equipment)
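As a hedged sketch of how such factors can narrow the choice of techniques, the Python snippet below filters an invented technique catalog against invented constraints; real selection also weighs objectives, risks, and the development phase.

    # Hypothetical technique catalog; names, attributes, and constraint
    # values are invented for illustration.
    catalog = [
        {"name": "Face validation", "cost": "low", "needs_execution": False},
        {"name": "Regression test", "cost": "medium", "needs_execution": True},
        {"name": "Proof of correctness", "cost": "high", "needs_execution": False},
    ]

    # Example constraints: a modest budget and no executable model yet.
    constraints = {"max_cost": "medium", "model_executable": False}

    cost_rank = {"low": 0, "medium": 1, "high": 2}
    candidates = [
        t["name"] for t in catalog
        if cost_rank[t["cost"]] <= cost_rank[constraints["max_cost"]]
        and (constraints["model_executable"] or not t["needs_execution"])
    ]
    print(candidates)  # ['Face validation']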

See the core documents V&V Agent Role in the VV&A of New Simulations, V&V Agent Role in the VV&A of Legacy Simulations, and V&V Agent Role in the VV&A of Federations for additional information about specific V&V activities and tasks.

In the table below, the specific informal, static, dynamic, and formal V&V techniques, listed in the taxonomy table and discussed in the individual hotlinks, are mapped to the basic phases of simulation development and use (i.e., requirements definition, conceptual model development, design, implementation, use, and assessment). Additional columns also indicate whether a technique is used primarily to support verification, validation, or both. Selecting the best technique to apply to a given V&V task in a given situation is not always straightforward (see the example at the hotlink, Selecting V&V Techniques for Defect Detection). The reference document on V&V Tools provides a discussion of the types of tools that can be used to perform various V&V techniques.

Common V&V Technique Applications

Class      V&V Technique
--------   ----------------------------------
Dynamic    Acceptance test [RPG2]
Dynamic    Alpha test [RPG3]
Dynamic    Assertion check [RPG4]
Informal   Audit [RPG5]
Dynamic    Authorization test [RPG6]
Dynamic    Beta test [RPG7]
Dynamic    Bottom-up test [RPG8]
Dynamic    Boundary value test [RPG9]
Dynamic    Branch test [RPG10]
Static     Calling structure analysis [RPG11]
Static     Cause-effect graphing [RPG12]
Dynamic    Comparison test [RPG13]
Static     Concurrent process analysis [RPG14]
Dynamic    Condition test [RPG15]
Static     Control flow analysis [RPG16]
Static     Data dependency analysis [RPG17]
Static     Data flow analysis [RPG18]
Dynamic    Data flow test [RPG19]
Dynamic    Data interface test [RPG20]
Dynamic    Debugging [RPG21]
Informal   Desk check [RPG22]
Informal   Documentation check
Dynamic    Equivalence partitioning test [RPG23]
Dynamic    Execution monitoring [RPG24]
Dynamic    Execution profiling [RPG25]
Dynamic    Execution tracing [RPG26]
Dynamic    Extreme input test [RPG27]
Informal   Face validation [RPG28]
Static     Fault/failure analysis [RPG29]
Dynamic    Fault/failure insertion test [RPG30]
Dynamic    Field test [RPG31]
Dynamic    Functional test [RPG32]
Dynamic    Graphical comparison [RPG33]
Formal     Induction
Formal     Inductive assertions
Formal     Inference
Informal   Inspection [RPG34]
Dynamic    Invalid input test [RPG35]
Formal     Lambda calculus
Formal     Logical deduction
Dynamic    Loop test [RPG36]
Static     Model interface analysis [RPG37]
Dynamic    Model interface test [RPG38]
Dynamic    Object-flow test [RPG39]
Dynamic    Partition test [RPG40]
Dynamic    Path test [RPG41]
Dynamic    Performance test [RPG42]
Formal     Predicate calculus
Formal     Predicate transformation
Dynamic    Predictive validation [RPG43]
Dynamic    Product test [RPG44]
Formal     Proof of correctness
Dynamic    Real-time input test [RPG45]
Dynamic    Regression test [RPG46]
Informal   Review [RPG47]
Dynamic    Security test [RPG48]
Dynamic    Self-driven input test [RPG49]
Static     Semantic analysis [RPG50]
Dynamic    Sensitivity analysis [RPG51]
Dynamic    Standards test [RPG52]
Static     State transition analysis [RPG53]
Dynamic    Statement test [RPG54]
Dynamic    Statistical techniques [RPG55]
Dynamic    Stress test [RPG56]
Static     Structural analysis [RPG57]
Dynamic    Submodel/module test [RPG58]
Dynamic    Symbolic debugging [RPG59]
Static     Symbolic evaluation [RPG60]
Static     Syntax analysis [RPG61]
Dynamic    Top-down test [RPG62]
Static     Traceability assessment [RPG63]
Dynamic    Trace-driven input test [RPG64]
Informal   Turing test [RPG65]
Static     User interface analysis [RPG66]
Dynamic    User interface test [RPG67]
Dynamic    Visualization/animation [RPG68]
Informal   Walk-through [RPG69]

[The source table also marks each technique against the M&S phases it supports (requirements, conceptual model, design, development, use, and assessment) and against verification, validation, or both; those markings are not legible in this copy and are omitted.]

Conducting an effective V&V effort is extremely important for the successful completion of complex and large-scale simulation applications and for the resolution of complex problems. How much to test, or when to stop testing, depends on the requirements of the application or problem involved. The V&V effort should continue until the User obtains sufficient confidence in the credibility and acceptability of the simulation results.
