
Software Testing: Base Course
(2nd edition)
Version 2.0.1 of 13.09.2017
Svyatoslav Kulikov
EPAM Systems

Contents

To the readers: what this book is about and for whom

Section 1: Getting acquainted with testing
1.1. What testing is
1.2. How to become a tester
1.3. Where a tester can grow
1.4. What you might be afraid of in vain

Section 2: Testing fundamentals
2.1. Software development models and the testing lifecycle
2.1.1. Software development models
2.1.2. The testing lifecycle
2.2. Requirements
2.2.1. What requirements are
2.2.2. Why requirements matter
2.2.3. Where requirements come from
2.2.4. Levels and types of requirements
2.2.5. Properties of good requirements
2.2.6. Techniques for testing requirements
2.2.7. A worked example: analyzing and testing requirements
2.2.8. Typical mistakes in requirements analysis and testing
2.3. Kinds and areas of testing
2.3.1. A simple classification of testing
2.3.2. A detailed classification of testing (by code execution; by access to code and architecture; by automation level; by testing levels; by importance of tested functions; by application nature; by architecture tier; by end-user participation; by formalization level; by aims and goals; by techniques and approaches; by chronology)
2.4. Checklists, test cases and test suites
2.5. Defect reports (errors, defects, failures; defect report attributes and lifecycle)
2.6. Planning and reporting
2.7. Process and quality topics

Section 3: Test automation

Section 4: Appendices (including useful Windows and Linux console commands and a table matching English terms with their translations)

Section 5: Conclusion



To the readers, or what this book is about and for whom

This book grew out of the training course for beginner software testers taught in the EPAM Software Testing Division, and it is addressed first of all to those who are making (or only planning) their first steps in testing; more experienced colleagues may still find convenient formulations, examples and training tasks here.

Why one more book about testing? Firstly, much of the available literature is either heavily academic or, on the contrary, superficial; this book tries to remain practical without sacrificing strict definitions. Secondly, translated testing terminology is notoriously inconsistent, so key terms are always accompanied by their English originals, mostly as defined by the ISTQB glossary. Thirdly, reading alone does not make anyone a tester, so the book is full of small tasks: please do them, not just read about them.

Conventions used throughout the book:
* Superscript numbers12345 refer to footnotes, which contain original (mostly English) definitions and links to sources.
* Numbers in curly braces {12345} are cross-references to the pages of this book; a table matching the English terms with their translations is gathered in the appendix (see {276}).
* Particularly important warnings are marked with '(!)'. Do not skip them!

Section 1: Getting acquainted with testing

1.1. What testing is

Testing is one of the most widespread quality assurance activities in software development, and for many people it becomes the entry point into the profession. The everyday understanding of the word, however, has little in common with what testers actually do, so it is worth starting from the strict definitions of the ISTQB1 glossary and from a brief look at how testing (testing2) evolved together with software engineering itself.

In the 1950s and 60s testing was not yet separated from programming: finding and eliminating the causes of failures was seen as a single activity now called debugging (debugging3). It was also widely believed that exhaustive testing (exhaustive testing4) was achievable, i.e. that one could try all combinations of input values and preconditions; very soon it became obvious that even for tiny programs the number of such combinations is astronomical.

Task 1.1.a: see for yourself how realistic exhaustive testing is. Imagine an application with a single input field accepting an 8-character string, and a tool that can perform 100 checks per second. Estimate how long trying every possible input would take. Does exhaustive testing still look feasible, and what does the answer mean for how testers must select their checks?

In the 1970s two complementary ideas appeared that still underlie test design: testing aimed at showing that software works correctly under expected conditions, positive testing (positive testing5), and testing aimed at showing that software fails or behaves incorrectly, especially in unexpected situations, negative testing (negative testing6).
1 International Software Testing Qualifications Board Glossary. [http://www.istqb.org/downloads/glossary.html]


2 Testing. The process consisting of all lifecycle activities, both static and dynamic, concerned with planning, preparation and evalu-
ation of software products and related work products to determine that they satisfy specified requirements, to demonstrate that
they are fit for purpose and to detect defects. [ISTQB Glossary]
3 Debugging. The process of finding, analyzing and removing the causes of failures in software. [ISTQB Glossary]
4 Complete testing, exhaustive testing. A test approach in which the test suite comprises all combinations of input values and
preconditions. [ISTQB Glossary]
5 Positive Testing. Testing aimed at showing software works. Also known as test to pass. [aptest.com]
6 Negative testing. Testing aimed at showing software does not work. Also known as test to fail. [aptest.com]



It was also in the 1970s that a fundamental asymmetry was clearly formulated: testing can demonstrate the presence of defects, but can never prove their absence.

(!) Remember this: no matter how long and carefully an application has been tested, we may never claim that no errors are left in it; we can only speak about what was checked and what was found. This thought will return many times throughout the book (see also checklists, test cases and related artifacts{110}).

The classic book of that period, "The Art of Software Testing" (Glenford J. Myers), was first published in 1979 and proved so relevant that it was re-issued in 2004 and 2011 with comparatively minor updates. In short, the 1970s left us two complementary views of the craft:
* testing as a way to demonstrate that the application works correctly in expected situations;
* testing as a way to demonstrate that the application does not work correctly.

In the 1980s the most important shift happened: testing stopped being a phase performed only after coding and became integrated into the whole software lifecycle (software lifecycle7) (see the discussion of development models{18}); the idea of preventing defects, rather than merely finding them, gained weight, and test-related activities moved as early as the requirements stage.

In the 1990s testing finally matured into a separate discipline within the broader notion of quality assurance (quality assurance8), permeating the entire development process: test planning, test analysis and design, creation and maintenance of test documentation became recognized activities in their own right, and many specialized kinds of testing arose and took shape during that period.

7 Software lifecycle. The period of time that begins when a software product is conceived and ends when the software is no longer
available for use. The software lifecycle typically includes a concept phase, requirements phase, design phase, implementation
phase, test phase, installation and checkout phase, operation and maintenance phase, and sometimes, retirement phase. Note
these phases may overlap or be performed iteratively. [ISTQB Glossary]
8 Quality assurance. Part of quality management focused on providing confidence that quality requirements will be fulfilled. [ISTQB
Glossary]



A systematic view of organizing testing as a set of processes is given in "Critical Testing Processes" (Rex Black).

At the beginning of the 2000s the understanding of testing as an activity that supplies the whole team with vital information kept developing, and approaches that explicitly drive development by tests became popular: the idea9 of test-driven development (test-driven development10, TDD), in which test cases are created (and often automated) before the code they are meant to verify; testing tools grew into powerful integrated environments; specializations multiplied: test analysts and designers, test automation engineers, performance specialists and more (the division of roles is discussed later in this book).

A far more detailed timeline of these events (covering, by some accounts, the period from 1822 to our days) can be found in "The History of Software Testing"11 on the Testing References site, and a classic retrospective in "The Growth of Software Testing"12 (David Gelperin, Bill Hetzel).

Task 1.1.b: using additional sources, find out what hides behind the abbreviations TDD, BDD, DDT, KDT; be prepared to meet several different interpretations of some of them.
9 This term is rendered into Russian in many different ways; whichever translation is used, the essence stays the same: the tests are designed before the code that they are intended to check.
10 Test-driven development. A way of developing software where the test cases are developed, and often automated, before the
software is developed to run those test cases. [ISTQB Glossary]
11 The History of Software Testing [http://www.testingreferences.com/testinghistory.php]
12 The Growth of Software Testing, David Gelperin, Bill Hetzel [http://www.clearspecs.com/down-
loads/ClearSpecs16V01_GrowthOfSoftwareTest.pdf]



1.2. How to become a tester

The tester's profession today is officially recognized (it has, for example, been added to official qualification listings by a 2009 regulatory order, No. 148) and is steadily in demand on the labor market; at the same time, the popular idea that "testing is the easiest entrance into IT, no knowledge needed" is a dangerous myth.

A look at real vacancy announcements (figure 1.2.a) shows what employers actually expect: the formal list may be short, but behind each line ("understanding of the development process", "ability to work with requirements", "attention to detail", "basic SQL", "English") stands a concrete skill that is checked at the interview.

Figure 1.2.a — A typical tester vacancy announcement

So the honest answer to "is it hard to become a tester?" is: it is perfectly realistic, but it is work. Below is a practical sequence of steps; note that step zero is not optional.


0) A genuine desire to learn and to work in IT. This sounds banal, but without internal motivation everything below turns into torture; if the only driver is "everybody goes to IT", it is better to stop right here.

Task 1.2.a: honestly answer yourself why you want to work in testing; write the answer down and return to it after you finish reading this book.

1) Books and articles on testing itself. Start with the fundamentals: what testing is, what checklists, test cases and defect reports are, how testing is organized on real projects. IT changes quickly (and testing changes with it), so keep following fresh articles, blogs and conference talks as well.

2) Programming basics. At interviews testers are regularly asked whether they can code. A tester does not have to be a professional developer, but understanding how programs are written helps enormously. Any mainstream language will do for a start: C/C++/C#, Java, PHP, JavaScript, Python, Ruby and so on; if you hesitate, JavaScript is a reasonable first choice (to start experimenting you need nothing but a browser).

3) SQL and databases. A huge share of applications store their data in databases, and the ability to write at least simple queries is among the most frequently checked practical skills.

4) English. Most documentation, most professional literature and, in many companies, most project communication is in English; reading skills are a must, and writing and speaking skills raise your value dramatically.

5) Resume and interview preparation. Learn to present honestly what you already know and how you learn; a well-prepared beginner with modest knowledge but a clear growth vector is often preferred to a careless "experienced" candidate.

Besides knowledge as such, employers value the personal qualities that make a good tester:
1) attentiveness to details;
2) the ability to communicate constructively, to ask questions and to report problems without blaming people;
3) accuracy, patience and persistence;
4) a healthy dose of curiosity, including the destructive kind ("and what if I do this?");
5) the habit of permanent self-education.

If this list has not scared you away, you have every chance: thousands of testers started exactly like this, without any special background, and grew into strong professionals. The rest of this book is meant to be your first big step on that road.


1.3. Where a tester can grow

A widespread myth says that testing is a dead-end occupation, interesting only as a springboard into development. In reality the profession offers several rich directions of growth: deeper technical specialization, management, business analysis and many adjacent IT roles; the so-called soft skills developed in testing are valuable in any of them.

Task 1.3.a: while reading the tables below, mark the qualities and skills you already possess and those you still lack; the result is, in effect, your personal development plan.


Table 1.3.a — Personal qualities and general skills valuable for a tester. For each quality the original table explains why it matters and points to the chapters of this book in which the corresponding activities are developed, for example: understanding of the development process (development models{18}); the ability to work with requirements{29}; test planning and reporting{203}; designing checklists and test cases{110}; navigating the kinds and areas of testing{64}; writing defect reports{162}; communication skills for constructive interaction with colleagues and customers; the deeper the mastery of each item, the higher the seniority level it corresponds to (see also planning and reporting{203}).


Table 1.3.b — Technical knowledge and skills valuable for a tester (the deeper the knowledge, the higher the seniority level it corresponds to); a fragment:
* operating systems: confident use and basic administration of Windows, Linux, Mac OS;
* virtualization and working with virtual machines and test environments;
* databases and SQL: enough to create, query and modify test data13;
* networking basics: the TCP/IP stack and the related tooling;
* web technologies, including HTML and CSS.

13 Working with MySQL, MS SQL Server and Oracle is covered, for example, in the book available at [http://svyatoslav.biz/database_book/]



Continuation of table 1.3.b:
* deeper networking knowledge: the OSI model and protocol analysis;
* the client-server architecture of web applications;
* mobile platforms: Android, iOS, Windows Phone.

Table 1.3.c — Knowledge of everyday working tools: confident use of e-mail and e-mail clients; team communication tools; bug-tracking and test-management systems; document collaboration tools; and, above all, the ability to learn a new tool quickly.

Whatever direction attracts you, three classic growth paths are usually named: a) deep engineering specialization; b) management; c) test automation, discussed in detail later (see {252}). Do not expect to master everything at once: a realistic horizon for growing from a novice into a strong specialist is about 3–5 years of deliberate work and study. The lists above are not an entrance exam; they are a map.


1.4. What you might be afraid of in vain

Beginners' fears about the tester's everyday life deserve a separate conversation: some are reasonable, some are pure mythology, and it is better to sort them out before investing months into entering the profession.

A good deal of the "scary truth about testing" circulating on the Internet goes back to the famous series of notes "The 7 Plagues of Software Testing"14 (James Whittaker): repetitiveness, amnesia, boredom, homelessness, blindness, entropy and the mysterious seventh plague. Read them: they are short, witty and honest about the real difficulties (and about how mature teams overcome them).

Is testing monotonous? Sometimes, yes: some checks have to be repeated again and again. But it is exactly the monotonous part that is being automated more and more, while inventing good checks is a deeply creative activity.

Is testing "not real programming"? Testing is simply a different activity: nobody becomes a tester because they "failed to become a developer", just as writing a "Hello, world"-level program does not make anyone a developer. Strong testers write code too, and their analytical work is frequently harder than the coding itself.

Do testers just click buttons all day? No: clicking is the visible tip; beneath it lie analysis of requirements, design of checks, investigation of failures and communication of results.
14The plague of repetitiveness, James Whittaker [http://googletesting.blogspot.com/2009/06/by-james.html]


The Plague of Amnesia, James Whittaker [http://googletesting.blogspot.com/2009/07/plague-of-amnesia.html]
The Plague of Boredom, James Whittaker [http://googletesting.blogspot.com/2009/07/plague-of-boredom.html]
The Plague of Homelessness, James Whittaker [http://googletesting.blogspot.com/2009/07/plague-of-homelessness.html]
The Plague of Blindness, James Whittaker [http://googletesting.blogspot.com/2009/07/plague-of-blindness.html]
The 7th Plague and Beyond, James Whittaker [http://googletesting.blogspot.com/2009/09/7th-plague-and-beyond.html]
The Plague of Entropy, James Whittaker [http://googletesting.blogspot.com/2009/09/plague-of-entropy.html]



Is it easy to switch from testing to development, and is it necessary? Such transitions happen, but in both directions: many developers move into testing because they find it more interesting (a mature view of testing implies analysis, design, programming and research at once), and the idea of testing as a merely "transitional" occupation is slowly dying out.

Are testers paid less than developers? Averaged across the IT industry a difference exists, but it is far smaller than folklore claims, and a strong automation engineer or test manager earns at the level of strong developers; IT as a whole remains among the best-paid industries, so "tester or developer" is a question of taste, not of survival.

Is it true that testers are not respected? In a healthy team the tester is the person who saves everyone else from public shame in production, i.e. a colleague whose work is respected exactly as much as anyone's. Where testers are treated as second-class citizens, that characterizes the particular company, not the profession.

So: testing is interesting and difficult, it demands permanent learning, and it rewards that learning well. If, having read all this, you still want in: welcome!

And one more small request. You can influence what the continuation of this book will look like: which topics to expand, what was unclear, which examples are missing. Please share your opinion in the poll:
http://svyatoslav.biz/software_testing_book_poll/


Section 2: Testing fundamentals

2.1. The place of testing in software development
2.1.1. Software development models

To understand where testing lives inside a project, we first need to recall how development is organized as a whole: every project follows some lifecycle model (lifecycle model15) (not to be confused with the software lifecycle (software lifecycle16) itself). A development model (Software Development Model, SDM) is, in essence, a high-level partitioning of the work into phases and the rules of moving between them; among other things, it determines when testing starts, what is tested, and by whom.

There are many such models; here we consider only the ones most important for a tester: the waterfall model, the V-model, iterative and incremental models, the spiral model and agile models. A brief overview of others (and of the selection criteria between them) can be found in the article "What are the Software Development Models?"17.

Real projects rarely follow any model in its pure textbook form: models are combined and adapted to the circumstances. (!) Remember: a model is a tool, not a dogma; adapting a model to a project (see, for example, "Scrum Tailoring"18) is normal engineering practice, while blind ritual-following is not.
15 Lifecycle model. A partitioning of the life of a product or project into phases. [ISTQB Glossary]
16 Software lifecycle. The period of time that begins when a software product is conceived and ends when the software is no longer
available for use. The software lifecycle typically includes a concept phase, requirements phase, design phase, implementation
phase, test phase, installation and checkout phase, operation and maintenance phase, and sometimes, retirement phase. Note
these phases may overlap or be performed iteratively. [ISTQB Glossary]
17 What are the Software Development Models? [http://istqbexamcertification.com/what-are-the-software-development-models/]
18 Scrum Tailoring, [http://cartmendum.livejournal.com/10862.html]



The waterfall model (waterfall model19) assumes that the phases follow one another strictly sequentially: a new phase begins only when the previous one is fully completed, and its results "flow down" to the next one (figure 2.1.a). At the end of each phase a review takes place to determine whether the project is on the right path and whether it should continue at all; in its pure form the model suits small projects with well-understood, stable requirements.

Figure 2.1.a — The waterfall model

The key problem for quality is that in this model testing begins only after implementation is finished, so defects introduced at the requirements or design phases are discovered extremely late, when fixing them is most expensive.
19 In a waterfall model, each phase must be completed fully before the next phase can begin. This type of model is basically used
for the project which is small and there are no uncertain requirements. At the end of each phase, a review takes place to deter-
mine if the project is on the right path and whether or not to continue or discard the project. [http://istqbexamcertifica-
tion.com/what-is-waterfall-model-advantages-disadvantages-and-when-to-use-it/]



For this reason the strict waterfall is rarely used today: requirements change too often, and feedback is needed too early; still, it is worth understanding, because the terminology and the phase structure of most later models (analysis, design, implementation, testing, maintenance and so on) grow out of it. A balanced discussion of its pros and cons is given in "What is Waterfall model advantages, disadvantages and when to use it?"20, and a critical historical perspective in "The Rise And Fall Of Waterfall"21.

The V-model (V-model22) develops the waterfall idea by bending the sequence of phases into the letter "V" (figure 2.1.b): each development phase on the left branch has a corresponding testing level on the right branch, and, most importantly, the test activities for each level (planning, analysis, test design) start as early as the corresponding left-branch phase, not after coding. The V-model thus illustrates the key modern principle: testing begins not when the code is ready, but when the corresponding documents are ready, which allows many defects to be prevented before they ever reach the code.

Figure 2.1.b — The V-model

20 What is Waterfall model advantages, disadvantages and when to use it? [http://istqbexamcertification.com/what-is-waterfall-
model-advantages-disadvantages-and-when-to-use-it/]
21 A discussion in Russian: [http://cartmendum.livejournal.com/44064.html]
22 V-model. A framework to describe the software development lifecycle activities from requirements specification to maintenance.
The V-model illustrates how testing activities can be integrated into each phase of the software development lifecycle. [ISTQB
Glossary]



More on the V-model can be found in "What is V-model advantages, disadvantages and when to use it?"23; a modern reinterpretation of V-models from the testing standpoint is given in "Using V Models for Testing"24.

Iterative and incremental models (iterative model25, incremental model26) bring two related ideas (different sources define the terms slightly differently; the ISTQB definitions in the footnotes are worth reading carefully):
* incremental development splits the project into a series of increments, each delivering a portion of the overall required functionality, in priority order;
* iterative development splits the project into iterations, each producing a complete working (internal or external) release; the executable product grows from iteration to iteration toward its final form.

Inside a single iteration a small "mini-project" takes place, which may itself follow a waterfall-like or V-like sequence of phases (figure 2.1.c); the result of an iteration (the compiled, linked, consistent system including all the latest changes) is called a build (build27).

Figure 2.1.c — Iterative and incremental development

23 What is V-model advantages, disadvantages and when to use it? [http://istqbexamcertification.com/what-is-v-model-advantages-


disadvantages-and-when-to-use-it/]
24 Using V Models for Testing, Donald Firesmith [http://blog.sei.cmu.edu/post.cfm/using-v-models-testing-315]
25 Iterative development model. A development lifecycle where a project is broken into a usually large number of iterations. An
iteration is a complete development loop resulting in a release (internal or external) of an executable product, a subset of the
final product under development, which grows from iteration to iteration to become the final product. [ISTQB Glossary]
26 Incremental development model. A development lifecycle where a project is broken into a series of increments, each of which
delivers a portion of the functionality in the overall project requirements. The requirements are prioritized and delivered in priority
order in the appropriate increment. In some (but not all) versions of this lifecycle model, each subproject follows a 'mini V-model'
with its own design, coding and testing phases. [ISTQB Glossary]
27 Build. A development activity whereby a complete system is compiled and linked, so that a consistent system is available including
all latest changes. [ daily build ISTQB Glossary]



For testing, such models mean, on the one hand, early and regular access to an executable product and, on the other hand, regular re-testing of what was already tested in previous iterations (this is where regression testing gains its importance); they also mean that test documentation must be maintained continuously rather than written once.

Details: "What is Iterative model advantages, disadvantages and when to use it?"28 and "What is Incremental model advantages, disadvantages and when to use it?"29.

The spiral model (spiral model30) is a special case of iterative development with the emphasis on risk management. Figure 2.1.d shows the classical spiral; each of its coils includes these key stages:
* determining objectives, alternatives and constraints;
* risk analysis and prototyping;
* development (of the next part) of the product;
* planning of the next coil.

The spiral model lets the team detect and mitigate the heaviest project risks as early as possible, and, if the risks prove fatal, stop the project early at minimal cost (features are defined and implemented in order of decreasing priority). The classical works here belong to Barry Boehm31, 32, who both defined the model and later refined its principles.

Details: "What is Spiral model- advantages, disadvantages and when to use it?"33 and "Spiral Model"34.

28 What is Iterative model advantages, disadvantages and when to use it? [http://istqbexamcertification.com/what-is-iterative-
model-advantages-disadvantages-and-when-to-use-it/]
29 What is Incremental model advantages, disadvantages and when to use it? [http://istqbexamcertification.com/what-is-incremen-
tal-model-advantages-disadvantages-and-when-to-use-it/]
30 Spiral model. A software lifecycle model which supposes incremental development, using the waterfall model for each step, with
the aim of managing risk. In the spiral model, developers define and implement features in order of decreasing priority. [http://dic-
tionary.reference.com/browse/spiral+model]
31 A Spiral Model of Software Development and Enhancement, Barry Boehm
[http://csse.usc.edu/csse/TECHRPTS/1988/usccse88-500/usccse88-500.pdf]
32 Spiral Development: Experience, Principles, and Refinements, Barry Boehm. [http://www.sei.cmu.edu/reports/00sr008.pdf]
33 What is Spiral model- advantages, disadvantages and when to use it? [http://istqbexamcertification.com/what-is-spiral-model-
advantages-disadvantages-and-when-to-use-it/]
34 Spiral Model, Amir Ghahrai. [http://www.testingexcellence.com/spiral-model/]





Figure 2.1.d — The spiral model (each coil develops the product while explicitly analyzing and reducing risks)

Agile models (agile model35) are a family of approaches based on the values of the Agile Manifesto36:
* individuals and interactions over processes and tools;
* working software over comprehensive documentation;
* customer collaboration over contract negotiation;
* responding to change over following a plan.

Since agile approaches form a whole universe of ideas and techniques, only two books out of the enormous literature will be recommended here:
* "Agile Testing" (Lisa Crispin, Janet Gregory);
* "Essential Scrum" (Kenneth S. Rubin).

35 Agile software development. A group of software development methodologies based on EITP iterative incremental development,
where requirements and solutions evolve through collaboration between self-organizing cross-functional teams. [ISTQB Glos-
sary]
36 The Agile Manifesto (Russian version) [http://agilemanifesto.org/iso/ru/manifesto.html]



From the tester's point of view agile models combine features of everything said above: development proceeds in short iterations, requirements are expected to change, working software is produced constantly, so testing is continuous, tightly woven into development (up to test-driven development, TDD, in which tests precede code: figure 2.1.e37) and heavily automated; the tester becomes a full member of a cross-functional team rather than a separate "quality gate" at the end of the pipeline.

Figure 2.1.e — A TDD-style development rhythm

37 Source: http://www.gtagile.com/GT_Agile/Home.html




Figure 2.1.f shows a typical scrum flow: the product backlog is refined into a sprint backlog, a sprint (lasting up to 4 weeks) produces a potentially shippable product increment, and daily stand-up meetings keep the team synchronized.

Figure 2.1.f — A typical scrum workflow

No model is "the best": each has its area of applicability, and the choice is always a trade-off; a good overview of how testing activities map onto agile lifecycles is "The Agile System Development Life Cycle"38. Table 2.1.a summarizes the typical strengths and weaknesses of the models discussed.

Table 2.1.a — Development models: typical strengths and weaknesses
* Waterfall model: easy to understand, plan and control; documentation is complete and consistent; but it is intolerant of requirement changes, testing starts late, and working software appears only at the very end.
* V-model: test planning and test design start early, and every development phase has a matching testing level; but the model largely inherits the waterfall's rigidity toward changing requirements.

38 The Agile System Development Life Cycle [http://www.ambysoft.com/essays/agileLifecycle.html]



* Iterative and incremental models: working builds appear early and regularly, changes are absorbed between iterations, risks surface early; but they demand mature planning and constant regression testing, and the overall architecture may degrade without deliberate care.
* Spiral model: risks are explicitly analyzed and reduced, the costliest mistakes are caught on the earliest coils; but the model is heavyweight and expensive in analysis effort, and is justified mainly for large, risky projects.
* Agile models: rapid feedback, high tolerance to change, constant delivery of working software, close customer collaboration; but they demand a disciplined, experienced team, and scanty documentation places special demands on the organization of testing.

More comparisons: "Project Lifecycle Models: How They Differ and When to Use Them"39 and an overview in Russian40; see also "What are the Software Development Models?"41.

Task 2.1.a: for each model discussed, think up (or recall) a project for which this model would fit best, and one for which it would fit worst; be ready to justify your choice.

39 Project Lifecycle Models: How They Differ and When to Use Them [http://www.business-esolutions.com/islm.htm]
40 An overview in Russian: [http://megamozg.ru/post/23022/]
41 What are the Software Development Models? [http://istqbexamcertification.com/what-are-the-software-development-models/]



2.1.2. The testing lifecycle

Just as development as a whole, testing on a project is organized as a repeating cycle of activities (figure 2.1.g). The names and boundaries of the stages differ slightly between sources (compare, for example, the popular descriptions42, 43), but the essence is the same; and, importantly, the stages repeat for every iteration (build, release) of the product, overlap in time and form a continuous process, not a one-shot sequence (compare with the development models above, especially the iterative ones).

Figure 2.1.g — The testing lifecycle: 1 — general planning and requirements analysis; 2 — clarification of acceptance criteria; 3 — refinement of the test strategy; 4 — test case development; 5 — test execution; 6 — defect reporting; 7 — analysis of results; 8 — reporting

Stage 1 (general planning and requirements analysis) gives testing its foundation: without understanding what the product must do and which resources, deadlines and risks we face, further activity degenerates into chaotic clicking; here the requirements are studied, the scope is outlined and the first estimations are made.

Stage 2 (clarification of acceptance criteria) fixes the formal conditions of the testing process itself: entry criteria (entry criteria44), suspension criteria (suspension criteria45) and resumption

42 Software Testing Life Cycle [http://softwaretestingfundamentals.com/software-testing-life-cycle/]


43 Software Testing Life Cycle [http://www.softwaretestingmentor.com/software-testing-life-cycle/]
44 Entry criteria. The set of generic and specific conditions for permitting a process to go forward with a defined task, e.g. test phase.
The purpose of entry criteria is to prevent a task from starting which would entail more (wasted) effort compared to the effort
needed to remove the failed entry criteria. [ISTQB Glossary]
45 Suspension criteria. The criteria used to (temporarily) stop all or a portion of the testing activities on the test items. [ISTQB
Glossary]



criteria (resumption criteria46), as well as exit criteria (exit criteria47) that tell when a task may be considered officially completed.

Stage 3 (refinement of the test strategy) turns general ideas into concrete decisions: what will be tested, at which levels, with which techniques, what will be automated; the test strategy (test strategy48) and the test plan take their shape here.

Stage 4 (test case development) produces the working materials: checklists, test cases and test suites, test data and test environments are created, prioritized and organized.

Stages 5 (test execution) and 6 (defect reporting) run in close interaction: test cases are executed, actual results are compared with the expected ones, and every discovered discrepancy is documented in a defect report.

Stages 7 (analysis of results) and 8 (reporting) close the loop: the collected information shows what was achieved, what the product quality is and what should be corrected in the process; the conclusions feed stages 1, 2 and 3 of the next cycle. Note that stage 8 is explicitly connected back to stages 1–3: reporting is not bureaucracy but the mechanism that makes the next cycle better.

Test planning and reporting are discussed in detail in their own chapter{203}; for now it is enough to see the overall picture of the cycle.
.

46 Resumption criteria. The criteria used to restart all or a portion of the testing activities that were suspended previously. [ISTQB
Glossary]
47 Exit criteria. The set of generic and specific conditions, agreed upon with the stakeholders for permitting a process to be officially
completed. The purpose of exit criteria is to prevent a task from being considered completed when there are still outstanding
parts of the task which have not been finished. Exit criteria are used to report against and to plan when to stop testing. [ISTQB
Glossary]
48 Test strategy. A high-level description of the test levels to be performed and the testing within those levels for an organization or
program (one or more projects). [ISTQB Glossary]



2.2. Requirements
2.2.1. What requirements are

Testing is always a comparison of what we have with what we should have, and the description of "what we should have" is given by requirements. A requirement (requirement49) is a condition or capability needed by a user to solve a problem or achieve an objective, which the system or its component must satisfy.

Many beginners sincerely believe that requirements matter only on large projects with teams of 10–20–30 people, and that a small product can be built "from the head". Practice shows the opposite: even a single developer writing a tool for personal use benefits from writing down what the tool must do, and on any commercial project requirements become the main communication medium between the customer, the developers and the testers.

Requirements (and documentation in general) are also a test object in their own right; for a first impression of what documentation testing is, see "What is documentation testing in software testing"50.

49 Requirement. A condition or capability needed by a user to solve a problem or achieve an objective that must be met or possessed
by a system or system component to satisfy a contract, standard, specification, or other formally imposed document. [ISTQB
glossary]
50 What is documentation testing in software testing. [http://istqbexamcertification.com/what-is-documentation-testing/]



2.2.2. Why requirements matter

Good requirements are equally important for all project participants, not only for the analysts who write them; testers perhaps depend on them most of all, because a requirement is the expected result against which every check is performed.

In a frequently cited lecture51, Brian Hanks notes, among other things, that requirements:
* let the team understand what to build and why, and agree on it with the customer;
* form the basis for estimating the scope, cost and schedule of the work;
* allow planning the order of development and the verification of the result;
* reduce the probability of expensive rework;
* serve as the baseline for test design.

Requirements exist on every project whether or not they are written down: if nothing is documented, they simply live (and silently mutate) in the heads of the participants; the real question is not "to have or not to have requirements" but whether to manage them consciously.

The famous cost-of-defect curve (figure 2.2.a) shows why this matters so much: the later a mistake in the requirements is discovered, the more expensive its correction becomes, growing from a quick edit of the text at the requirements stage to orders of magnitude more after release.

Figure 2.2.a — The relative cost of fixing a defect grows dramatically (by orders of magnitude, up to a factor of 1000) the later in development it is discovered

You may object that some teams seem to work fine without written requirements. Look closer: either the product is trivial, or the requirements do exist, just in an informal shape, and the team pays for that shape with rework and arguments.
51 Requirements in the Real World, Brian Hanks [https://classes.soe.ucsc.edu/cmps109/Winter02/notes/requirementsLecture.pdf]



A natural question: who works with requirements if the project has no dedicated analyst? The answer: everyone, including testers; requirements analysis skills are a direct part of the tester's profession, as the following sections will show in detail.

Task 2.2.a: pick any application you use daily (a web site, a messenger, a player, anything) and, for a single small feature of it (a button, a menu item, a field), try to write down the requirements it implements. How many did you get, and how hard was it to make them complete and unambiguous?

Requirements also connect the whole pyramid of project artifacts: a change in one requirement propagates into design, code, tests and user documentation, which is schematically shown in figure 2.2.b.

Figure 2.2.b — A requirement change propagates through all the dependent project artifacts


The documentation in which requirements live can be divided into two large groups (the classification and terminology below follow common practice, in particular Wiegers').

Product (development) documentation (product documentation, development documentation52) is created and used by the project team itself; it includes:
* the project management plan (project management plan53) and, as its part or as a separate artifact, the test plan (test plan54);
* the product requirements document (product requirements document, PRD55) and the functional specifications56 document (FSD57; software requirements specification, SRS58);
* architecture and design documents (architecture and design59);
* test cases and test suites (test cases60, test suites61);
* technical specifications (technical specifications62): scripts, source code, data definition language and so on.

52 Development documentation. Development documentation comprises those documents that propose, specify, plan, review, test,
and implement the products of development teams in the software industry. Development documents include proposals, user or
customer requirements description, test and review reports (suggesting product improvements), and self-reflective documents
written by team members, analyzing the process from their perspective. [Documentation for Software and IS Development,
Thomas T. Barker, Encyclopedia of Information Systems (Elsevier Press, 2002, pp. 684694.)]
53 Project management plan. A formal, approved document that defines how the project is executed, monitored and controlled. It
may be summary or detailed and may be composed of one of more subsidiary management plans and other planning documents.
[PMBOK, 3rd edition]
54 Test plan. A document describing the scope, approach, resources and schedule of intended test activities. It identifies amongst
others test items, the features to be tested, the testing tasks, who will do each task, degree of tester independence, the test
environment, the test design techniques and entry and exit criteria to be used, and the rationale for their choice, and any risks
requiring contingency planning. It is a record of the test planning process. [ISTQB Glossary]
55 Product requirements document, PRD. The PRD describes the product your company will build. It drives the efforts of the entire
product team and the companys sales, marketing and customer support efforts. The purpose of the product requirements doc-
ument (PRD) or product spec is to clearly and unambiguously articulate the products purpose, features, functionality, and be-
havior. The product team will use this specification to actually build and test the product, so it needs to be complete enough to
provide them the information they need to do their jobs. [How to write a good PRD, Martin Cagan]
56 Specification. A document that specifies, ideally in a complete, precise and verifiable manner, the requirements, design, behavior,
or other characteristics of a component or system, and, often, the procedures for determining whether these provisions have
been satisfied. [ISTQB Glossary]
57 Functional specifications document, FSD. . Software requirements specification, SRS.
58 Software requirements specification, SRS. SRS describes as fully as necessary the expected behavior of the software system.
The SRS is used in development, testing, quality assurance, project management, and related project functions. People call this
deliverable by many different names, including business requirements document, functional spec, requirements document, and
others. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
59 Architecture. Design. A software architecture for a system is the structure or structures of the system, which comprise elements,
their externally-visible behavior, and the relationships among them. Architecture is design, but not all design is architecture.
That is, there are many design decisions that are left unbound by the architecture, and are happily left to the discretion and good
judgment of downstream designers and implementers. The architecture establishes constraints on downstream activities, and
those activities must produce artifacts (finer-grained designs and code) that are compliant with the architecture, but architecture
does not define an implementation. [Documenting Software Architectures, Paul Clements .]
Architecture, Design, Implementation, Ammon Eden, Rick Kazman.
[http://www.eden-study.org/articles/2003/icse03.pdf]
60 Test case. A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular
objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement.
[ISTQB Glossary]
61 Test suite. A set of several test cases for a component or system under test, where the post condition of one test is often used as
the precondition for the next one. [ISTQB Glossary]
62 Technical specifications. Scripts, source code, data definition language, etc. [PMBOK, 3rd edition] . Specification.
63 Project documentation. Other expectations and deliverables that are not a part of the software the team implements, but that are
necessary to the successful completion of the project as a whole. [Software Requirements (3rd edition), Karl Wiegers and Joy
Beatty]



Project documentation (project documentation63) covers everything needed for the successful completion of the project as a whole, beyond the software itself; besides the artifacts listed above it includes, for example:
* user and accompanying documentation (user and accompanying documentation64): manuals, help systems, installation and deployment guides and the like;
* the market requirements document (market requirements document, MRD65), which goes into the target market segments and the issues of commercial success (such documents are usually produced and consumed on the business side).

Figure 2.2.c shows how these groups relate: product documentation is, in essence, a subset of project documentation, and the requirements documents form its core, influencing, directly or indirectly, everything else: design, code, tests and user documentation.

Figure 2.2.c — Kinds of project documentation

64 User documentation. User documentation refers to the documentation for a product or service provided to the end users. The
user documentation is designed to assist end users to use the product or service. This is often referred to as user assistance.
The user documentation is a part of the overall product delivered to the customer. [ doc-department.com]
65 Market requirements document, MRD. An MRD goes into details about the target market segments and the issues that pertain
to commercial success. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]



2.2.3. Where requirements come from

Requirements do not appear out of thin air: they are the product of systematic work, in which two activities are traditionally distinguished: the collection (gathering) and the elicitation (elicitation)66 of requirements (figure 2.2.d). Gathering emphasizes harvesting what stakeholders can state directly; elicitation emphasizes extracting what they know but do not think to state, and even what they do not yet realize they need: true needs are often hidden behind stated wishes.

Figure 2.2.d — Sources and techniques of obtaining requirements

Commonly used elicitation techniques include the following.

Interviews. The analyst talks to stakeholders (the customer, future users, domain experts) one-on-one; the quality of the result depends directly on the preparation of the questions and the choice of interviewees.

Questionnaires. These collect information from many respondents at reasonable cost; they work well for quantifiable questions and poorly for discovering the unexpected.

Observation. The analyst watches (or records) how users actually work, discovering details that no interview reveals, because the participants consider them self-evident and never mention them.
66 On the difference between the two terms see: Requirements Gathering vs. Elicitation (Laura Brandenburg): http://www.bridging-the-gap.com/requirements-gathering-vs-elicitation/



Document analysis. Studying existing regulations, standards, descriptions of business processes and of legacy systems; often the only way to learn rules that no single person knows completely.

Prototyping. Even the simplest mock-up of the future system, shown to users, produces feedback ("no, not like this: like that!") that is almost impossible to obtain from abstract discussions.

Brainstorming and joint workshops. Facilitated sessions gathering stakeholders and the project team together; expensive to organize, but extremely productive for resolving conflicts between different stakeholder groups and for generating alternatives.

In real projects these techniques are combined; the choice depends on the availability of stakeholders, the maturity of the domain and the project constraints. The deeper subject of requirements development belongs to business analysis; if you want to go further, start with these books:
* "Business Analysis Techniques. 72 Essential Tools for Success" (James Cadle, Debra Paul, Paul Turner);
* "Business Analysis" (Second Edition) (Debra Paul, Donald Yeates, James Cadle).


2.2.4. Levels and types of requirements

Requirements are usually considered at several levels of abstraction67, shown schematically in figure 2.2.e68.

Business requirements (business requirements69) express the financial, marketplace or other business benefit that the customers or the developing organization wish to gain from the product: they answer the question "why is the product needed at all?". They are typically collected in a vision and scope (vision and scope70) document, which sets the stage for all subsequent work; business requirements say nothing about buttons and fields.

Typical examples of business requirements:
* reduce the cost of processing a customer order by an agreed percentage;
* enter a new market segment by offering customers a service unavailable from competitors;
* cut the preparation time of the periodic report from days to hours.

Figure 2.2.e — Levels and types of requirements

67 A good overview of this subject: Software Requirements Engineering: What, Why, Who, When, and How, Linda Westfall [http://www.westfallteam.com/Papers/The_Why_What_Who_When_and_How_Of_Software_Requirements.pdf]
68 The figure follows the classification from Software Requirements (3rd edition) (Karl Wiegers and Joy Beatty).
69 Business requirement. Anything that describes the financial, marketplace, or other business benefit that either customers or the
developing organization wish to gain from the product. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
70 Vision and scope. The vision and scope document collects the business requirements into a single deliverable that sets the stage
for the subsequent development work. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]



User requirements (user requirements71) describe the goals and tasks that users must be able to accomplish with the product. They are typically expressed as use cases (use cases72), user stories (user stories73) and user scenarios (user scenarios74). (These forms are discussed in more detail together with checklists and test cases; see {141}.)

Typical examples of user requirements:
* a visitor can find a product by name and place an order without registration;
* a manager can view all orders for a selected period and change their statuses;
* an administrator can block a user account.

Business rules (business rules75) define or constrain particular aspects of the business: conditions, sequences of actions, rules of creation, update and removal of persistent data. They often look small, but each of them is a ready source of checks, which makes them especially valuable for testers.

Typical examples of business rules:
* an order above a defined amount requires a manager's confirmation;
* a discount is not combined with other special offers;
* a removed account is kept in the archive and can be restored within a defined period.

Quality attributes (quality attributes76) characterize how well the system performs its functions: reliability, usability, performance, scalability, security and so on (extensive catalogs of such attributes are easy to find77); they expand the functional picture with the "how well" dimension.

71 User requirement. User requirements are general statements of user goals or business tasks that users need to perform. [Soft-
ware Requirements (3rd edition), Karl Wiegers and Joy Beatty]
72 Use case. A sequence of transactions in a dialogue between an actor and a component or system with a tangible result, where an
actor can be a user or anything that can exchange information with the system. [ISTQB Glossary]
73 User story. A high-level user or business requirement commonly used in agile software development, typically consisting of one
or more sentences in the everyday or business language capturing what functionality a user needs, any non-functional criteria,
and also includes acceptance criteria. [ISTQB Glossary]
74 A scenario is a hypothetical story, used to help a person think through a complex problem or system. An Introduction to Scenario
Testing, Cem Kaner. [http://www.kaner.com/pdfs/ScenarioIntroVer4.pdf]
75 Business rule. A business rule is a statement that defines or constrains some aspect of the business. It is intended to assert
business structure or to control or influence the behavior of the business. A business rule expresses specific constraints on the
creation, updating, and removal of persistent data in an information system. [Defining Business Rules What Are They Really,
David Hay .]
76 Quality attribute. A feature or characteristic that affects an items quality. [ISTQB Glossary]
77 See, for example: http://en.wikipedia.org/wiki/List_of_system_quality_attributes



Typical examples of quality attributes:
* the interface must remain responsive while a long operation runs in the background;
* the system must remain available to users an agreed percentage of time per month;
* stored passwords must be protected by strong one-way hashing.

Functional requirements (functional requirements78) describe the observable behaviors the system must exhibit and the actions it must let users take: the classical "the system must..." statements. They make up the bulk of a typical specification and map most directly onto test cases; besides plain prose, they are often written as condition-action descriptions (for example: "if an order has not been paid within the defined interval, the system must cancel it and release the reserved goods").

Typical examples of functional requirements:
* the system must allow saving the current document under a new name;
* the system must notify the operator about every failed import;
* the e-mail field must be validated for conformance with RFC822.
Non-functional requirements (non-functional requirements79) describe the properties the system must possess (performance, reliability, portability, maintainability and so on) rather than the functions it performs; in essence, they quantify the quality attributes and thereby make them verifiable.

Typical examples of non-functional requirements:
* the system must support 1000 registered users and remain operational with 100 of them working concurrently;
* under the declared load, the response time of typical operations must not exceed 2 seconds;
* after a failure, the system must restore its operation within 5 to 15 minutes.

A practical note: the boundary between functional and non-functional requirements is not absolute, and arguing about the classification of a particular statement is far less useful than making sure the statement itself is complete, unambiguous and verifiable.

78 Functional requirement. A requirement that specifies a function that a component or system must perform. [ISTQB Glossary]
Functional requirements describe the observable behaviors the system will exhibit under certain conditions and the actions the
system will let users take. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
79 Non-functional requirement. A requirement that does not relate to functionality, but to attributes such as reliability, efficiency,
usability, maintainability and portability. [ISTQB Glossary]



Limitations, or constraints (limitations, constraints80), legitimately restrict the options available to designers and developers (and thereby to testers as well).

Typical examples of constraints:
* the web interface must display correctly at screen resolutions from 800x600 to 1920x1080;
* Flash technology must not be used;
* the client part must remain functional with JavaScript disabled.

External interfaces requirements (external interfaces requirements81) describe the connections between the system and the rest of the world: its users, hardware and other software systems.

Typical examples of external interfaces requirements:
* client-server data exchange must be performed with AJAX requests carrying JSON;
* the mail subsystem must support RFC3207 (SMTP over TLS).

Data requirements (data requirements82) describe formats, data types, allowed and default values, the composition of business data structures and the rules of their storage.

Typical examples of data requirements:
* all user-created content must be stored together with its full modification history;
* the system must be able to use either MySQL or MongoDB as its storage back end.

80 Limitation, constraint. Design and implementation constraints legitimately restrict the options available to the developer. [Soft-
ware Requirements (3rd edition), Karl Wiegers and Joy Beatty]
81 External interface requirements. Requirements in this category describe the connections between the system and the rest of the
universe. They include interfaces to users, hardware, and other software systems. [Software Requirements (3rd edition), Karl
Wiegers and Joy Beatty]
82 Data requirements. Data requirement describe the format, data type, allowed values, or default value for a data element; the
composition of a complex business data structure; or a report to be generated. [Software Requirements (3rd edition), Karl
Wiegers and Joy Beatty]



All these levels and types are eventually collected in the software requirements specification (software requirements specification, SRS83): the document that describes, as fully as necessary, the expected behavior of the system and accompanies development, testing, quality assurance and project management (the same artifact goes by many names: business requirements document, functional spec, requirements document and others).

On larger projects requirements are kept not in plain documents but in dedicated requirements management tools (requirements management tools84, 85), which store requirements with their attributes, support traceability and manage changes.

The single best in-depth source on this whole subject remains "Software Requirements (3rd Edition) (Developer Best Practices)" (Karl Wiegers, Joy Beatty); a significant part of the terminology of this chapter follows that book.

83 Software requirements specification, SRS. SRS describes as fully as necessary the expected behavior of the software system.
The SRS is used in development, testing, quality assurance, project management, and related project functions. People call this
deliverable by many different names, including business requirements document, functional spec, requirements document, and
others. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
84 Requirements management tool. A tool that supports the recording of requirements, requirements attributes (e.g. priority,
knowledge responsible) and annotation, and facilitates traceability through layers of requirements and requirements change
management. Some requirements management tools also provide facilities for static analysis, such as consistency checking and
violations to predefined requirements rules. [ISTQB Glossary]
85 A list of such tools: http://makingofsoftware.com/resources/list-of-rm-tools



2.2.5. Properties of good requirements

Testing requirements means, in essence, checking them against a well-known set of properties of good requirements (figure 2.2.f); for each property below, typical violation symptoms are listed with examples.

Figure 2.2.f — Properties of good requirements

Completeness (completeness86). A requirement is complete if it contains all the information necessary to implement and to verify it.
Typical violation symptoms:
* the requirement mentions behavior or data that are described nowhere (for example, a "list of supported formats" that no document actually defines);
* enumerations are left open: "the application must export documents to PDF, PNG etc." — what exactly stands behind "etc."?;
* the requirement references a nonexistent item (for example, "see paragraph 123.45.b", while no such paragraph exists).

Atomicity (atomicity87). A requirement is atomic if it cannot be split into smaller separate requirements without loss of detail and does not glue together descriptions of several independent behaviors.
Typical violation symptoms:
* one sentence specifies several independent features at once (for example, a requirement simultaneously describing what the Restart button does and the rule that the Log file keeps the 20 most recent records: these are two different requirements);
86 Each requirement must contain all the information necessary for the reader to understand it. In the case of functional requirements,
this means providing the information the developer needs to be able to implement it correctly. No requirement or necessary
information should be absent. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
87 Each requirement you write represents a single market need that you either satisfy or fail to satisfy. A well written requirement is
independently deliverable and represents an incremental increase in the value of your software. [Writing Good Requirements
The Big Ten Rules, Tyner Blain: http://tynerblain.com/blog/2006/05/25/writing-good-requirements-the-big-ten-rules/]




* one requirement describes several alternative situations in a single breath ("if..., but if..., and otherwise...") that deserve to be separate, individually identifiable requirements;
* parts of the requirement cannot be referenced individually, although tests and design decisions depend on them separately.

Consistency (consistency88). A requirement must not contradict itself, other requirements of any level, or higher-level documents.
Typical violation symptoms:
* internal contradiction (for example, the beginning of a paragraph allows resizing a window while its end declares the window size fixed: which is true?);
* two requirements describe the same element differently (for example, paragraph 712.a states that the Close button closes the current document, while paragraph 36452.x states that the Close button terminates the application);
* a requirement conflicts with a declared constraint (for example, an interface layout that physically cannot fit into the minimal supported resolution of 800x600).

Unambiguousness (unambiguousness89, clearness). A requirement must allow one and only one interpretation, and must be written in language understandable to all of its readers.
Typical violation symptoms:
* the requirement honestly admits several readings, and (which is worse) different people confidently read it differently;
* the requirement uses vague "weasel words" that sound good and verify nothing, such as:
adequate, be able to, easy, provide for, as a minimum, be capable of, effectively, timely, as applicable, if possible, to be determined (TBD), as appropriate, if practical, but not limited to, capability of, capability to, normal, minimize, maximize, optimize,

88 Consistent requirements dont conflict with other requirements of the same type or with higher-level business, user, or system
requirements. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
89 Natural language is prone to two types of ambiguity. One type I can spot myself, when I can think of more than one way to interpret
a given requirement. The other type of ambiguity is harder to catch. Thats when different people read the requirement and come
up with different interpretations of it. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]



rapid, user-friendly, simple, often, usual, large, flexible, robust, state-of-the-art, improved, efficient.
(A separate remark about "to be determined, TBD": a specification that still contains TBD is unfinished by definition; every TBD must be resolved before the affected part goes into development.)
* the requirement leaves details to the reader's imagination (for example, "the application must save the document as PDF and PNG": per page or as a whole? and if per page, how are the resulting files named: page-1, page-2 and so on, or somehow else? none of this is defined).

Ambiguity problems are among the most frequent and the most expensive, because every participant sincerely implements and checks their own version of the requirement.

Feasibility (feasibility90). A requirement must be implementable within the known capabilities and limitations of the system's environment and within the project constraints of time, budget and staff.
Typical violation symptoms:
* gold plating: features maximally pleasant to invent and minimally useful in practice, whose cost of implementation and support exceeds any benefit they will ever bring;
* the requirement demands what the chosen technologies or the current state of the art cannot provide;
* the requirement demands what the team cannot deliver with the available resources.

Obligatoriness and up-to-dateness (obligatoriness91, up-to-date). If a requirement is not obligatory for implementation, it should not exist as a requirement; if it has become outdated, it must be reworked or archived. (A requirement that "may optionally be implemented someday" is a wish, not a requirement.)
Typical violation symptoms:
* the requirement was added "just in case", and no stakeholder can name the business value it carries or the authority it originates from;

90 It must be possible to implement each requirement within the known capabilities and limitations of the system and its operating
environment, as well as within project constraints of time, budget, and staff. [Software Requirements (3rd edition), Karl Wiegers
and Joy Beatty]
91 Each requirement should describe a capability that provides stakeholders with the anticipated business value, differentiates the
product in the marketplace, or is required for conformance to an external standard, policy, or regulation. Every requirement
should originate from a source that has the authority to provide requirements. [Software Requirements (3rd edition), Karl
Wiegers and Joy Beatty]



* the requirement describes a long-abandoned decision that was never cleaned out of the specification;
* nobody on the project can state the requirement's applicability or priority.

Traceability (traceability92, 93). Distinguish vertical (vertical traceability94) and horizontal (horizontal traceability95) traceability: the former links requirements across levels (business — user — functional), the latter links a requirement to the test plans, test designs, test cases and other artifacts that implement and verify it.

Traceability is supported by unique identifiers, disciplined cross-referencing, requirements management tools (requirements management tool96) and/or traceability matrices (traceability matrix97).
Typical violation symptoms:
* it is impossible to tell which high-level requirement a given low-level one details, and vice versa;
* it is impossible to tell which test cases cover a given requirement;
* a change in one requirement leaves an unknown set of dependent artifacts silently outdated.

Modifiability (modifiability98). Requirements will change; the specification must make changes cheap and safe: related items are cross-referenced rather than copied, and one logical statement lives in exactly one place.
Typical violation symptoms:
* the same requirement is stated redundantly in several places (see completeness and consistency above), so the copies inevitably diverge after the first edit;
* the structure and tooling of the document make edits painful (no styles, no numbering, no navigation);
* after a change it is impossible to determine what else must be re-read and re-verified (and this is where poor traceability hits hardest).

92 Traceability. The ability to identify related items in documentation and software, such as requirements with associated tests.
[ISTQB Glossary]
93 A traceable requirement can be linked both backward to its origin and forward to derived requirements, design elements, code that
implements it, and tests that verify its implementation. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
94 Vertical traceability. The tracing of requirements through the layers of development documentation to components. [ISTQB Glos-
sary]
95 Horizontal traceability. The tracing of requirements for a test level through the layers of test documentation (e.g. test plan, test
design specification, test case specification and test procedure specification or test script). [ISTQB Glossary]
96 Requirements management tool. A tool that supports the recording of requirements, requirements attributes (e.g. priority,
knowledge responsible) and annotation, and facilitates traceability through layers of requirements and requirements change
management. Some requirements management tools also provide facilities for static analysis, such as consistency checking and
violations to predefined requirements rules. [ISTQB Glossary]
97 Traceability matrix. A two-dimensional table, which correlates two entities (e.g., requirements and test cases). The table allows
tracing back and forth the links of one entity to the other, thus enabling the determination of coverage achieved and the assess-
ment of impact of proposed changes. [ISTQB Glossary]
98 To facilitate modifiability, avoid stating requirements redundantly. Repeating a requirement in multiple places where it logically
belongs makes the document easier to read but harder to maintain. The multiple instances of the requirement all have to be
modified at the same time to avoid generating inconsistencies. Cross-reference related items in the SRS to help keep them
synchronized when making changes. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]



Ranking (ranked99 for importance, stability, priority). Requirements are not equal: some are vital, others merely "nice to have"; some are stable, others will certainly be refined. Explicit ranking by importance, stability and priority lets the team implement and verify the most valuable and the most stable things first.
Typical violation symptoms:
* all requirements implicitly share "the same" priority, so the order of work is chosen by accident;
* priorities formally exist but are not maintained, so they contradict the actual decisions of the project;
* importance is confused with urgency, or with the loudness of a particular stakeholder.

Correctness (correctness100) and verifiability (verifiability101). A requirement must accurately describe a genuinely needed capability, and it must be possible to determine objectively whether it has been implemented; note that requirements that are incomplete, inconsistent, infeasible or ambiguous are automatically unverifiable.
Typical violation symptoms:
* checking the requirement is a matter of opinion rather than objective analysis ("the interface must be convenient and fast");
* the requirement contains no measurable criterion (compare "the report must be generated quickly" with "the report must be generated within the agreed number of seconds under the declared load");
* the requirement records somebody's passing wish rather than an actual, sourced stakeholder need.

A good compact summary of these properties, with examples, is given in "Writing Good Requirements — The Big Ten Rules"102.

99 Prioritize business requirements according to which are most important to achieving the desired value. Assign an implementation
priority to each functional requirement, user requirement, use case flow, or feature to indicate how essential it is to a particular
product release. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
100 Each requirement must accurately describe a capability that will meet some stakeholders need and must clearly describe the
functionality to be built. [Software Requirements (3rd edition), Karl Wiegers and Joy Beatty]
101 If a requirement isnt verifiable, deciding whether it was correctly implemented becomes a matter of opinion, not objective analysis.
Requirements that are incomplete, inconsistent, infeasible, or ambiguous are also unverifiable. [Software Requirements (3rd
edition), Karl Wiegers and Joy Beatty]
102 Writing Good Requirements The Big Ten Rules, Tyner Blain [http://tynerblain.com/blog/2006/05/25/writing-good-require-
ments-the-big-ten-rules/]



2.2.6. Techniques for testing requirements

Since requirements are documentation rather than executable code, their testing belongs to non-functional testing (non-functional testing103) and relies mostly on static techniques. The most formalized family of such techniques is peer review (peer review104); its three classical forms differ in formality and cost, from the cheapest to the most expensive:

* Walkthrough (walkthrough105): the author presents the document step by step to colleagues in order to gather information and establish a common understanding of its content; the problems discovered along the way are recorded, but shared comprehension is the main goal. Best suited for early versions of documents and for onboarding new team members.
* Technical review (technical review106): a group of qualified specialists examines the document with the goal of reaching consensus on the technical approach to be taken; each reviewer studies the document beforehand, and the meeting concentrates on the discovered discrepancies.
* Inspection (inspection107): the most formal kind of review, always based on a documented procedure, with defined roles, entry and exit criteria and collected metrics; inspections are expensive, but they find the largest share of defects, including violations of standards and inconsistencies with higher-level documentation.

These forms are usually combined on a project: cheap walkthroughs for drafts, reviews and inspections for the most critical documents; the choice is a matter of cost versus the price of a missed defect.

Besides formal reviews, the tester's everyday instrument is asking questions: every property of good requirements from the previous section generates them naturally (what exactly? how much exactly? where is this defined? what happens otherwise?). Formulating such questions, and insisting that the answers make it into the document, is, in effect, requirements testing.

103 Non-functional testing. Testing the attributes of a component or system that do not relate to functionality, e.g. reliability, efficiency,
usability, maintainability and portability. [ISTQB Glossary]
104 Peer review. A review of a software work product by colleagues of the producer of the product for the purpose of identifying
defects and improvements. Examples are inspection, technical review and walkthrough. [ISTQB Glossary]
105 Walkthrough. A step-by-step presentation by the author of a document in order to gather information and to establish a common
understanding of its content. [ISTQB Glossary]
106 Technical review. A peer group discussion activity that focuses on achieving consensus on the technical approach to be taken.
[ISTQB Glossary]
107 Inspection. A type of peer review that relies on visual examination of documents to detect defects, e.g. violations of development
standards and non-conformance to higher level documentation. The most formal review technique and therefore always based
on a documented procedure. [ISTQB Glossary]

. . EPAM Systems, 20152017 : 46/296


, -
. 2.2.a -
, .
, .
2.2.a

? ( -
. ,
, - -
, -
).
? ? -
( -
- ?
.) -
? (, . ,
?) ? -
, -
?
? ( -
, , - PDF ? ,
- -
PDF. .) ? PDF
PDF -
? (
, ?
,
.) (, PDF-
? (! ) -
, PDF?
.)
? ( - , ,
, . , ?) -
. - , ? ,
? ( - -
, - ? ,
- -
.) ? P.S. -
? , -
( , , ?
,
.
,
: -
, -
.)

- -. ,
, ,
, . - -
- ,
.
-, , (-
, - ).
.

. . EPAM Systems, 20152017 : 47/296


Further useful techniques of requirements testing:

* Writing checklists and test cases against the requirement. If a requirement cannot be covered by concrete checks, it is not verifiable; the very attempt to design tests ("and what is the expected result here?") reveals incompleteness and ambiguity faster than any amount of re-reading.
* Prototyping. A quick mock-up, even a paper one, materializes the requirement and instantly shows what its authors forgot.
* Graphical representation. Re-drawing the textual requirements as diagrams (for example, UML diagrams or mind maps108) mercilessly exposes gaps and contradictions: a missing arrow or an unreachable state is visible at a glance, while in the text it hides between paragraphs. The reverse technique works too: restating an existing diagram in words often uncovers that team members understood it differently.

These techniques complement rather than replace one another; in practice they are combined, and the result of applying them is a stream of questions, remarks and suggested corrections addressed to the authors of the requirements.

108 Mind map [http://en.wikipedia.org/wiki/Mind_map]



2.2.7. A worked example: analyzing and testing requirements

Let us walk the whole path on a small, realistic example: from a customer's vague wish to a reviewed set of requirements. (The general approach follows "Software Requirements (3rd Edition) (Developer Best Practices)", Karl Wiegers, Joy Beatty.)

A few preliminary notes:
* real analysis is iterative: questions produce answers, answers produce new questions;
* everything said below about the properties of good requirements and the techniques of their testing is now applied in practice;
* the customer's answers are written down immediately: an undocumented answer is a lost answer.

The initial statement from the customer sounds like this: "We need a tool that converts our text files into a single encoding." A tester joining the analysis starts with questions.

Task 2.2.b: before reading further, write down your own questions to this statement; then compare them with the list below (the customer's answers are given in parentheses).

* In which formats are the source files: plain text only, or also HTML, MD, something else? (Plain text, HTML and MD all occur.)
* Where do the files come from and where should the results go? (They accumulate in a directory on a server; the results should go to another directory.)
* Must the original files be preserved? (Yes, nothing may be lost.)
* How many files are processed at a time? (Usually about 200–300.)
* How is the conversion done now? (Manually: each file is opened and re-saved in Notepad++.)

Even these few questions have already turned "converts our files" into a much more concrete picture: a server-side console tool, batch processing, two directories, originals preserved.


The first draft of the requirements, written down right after the conversation, may look like this:

* The application converts text files from one encoding to another.
* The application is written in PHP.
* The application runs from the command line.

Every line here is a dense lump of incompleteness, but the dialogue continues, and the questions about formats and encodings receive their first concrete answers:

* Supported file formats: plain text, HTML, MD.
* Possible source encodings: CP1251, UTF8, CP866, KOI8R.
* Target encoding: UTF8.
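Even these three short lists are enough material for a first technical probe. What follows is only a minimal sketch, not part of the requirements and not the future converter.php: it assumes the mbstring PHP extension (which does surface later in this example) and shows the core operation, re-encoding text to UTF8; the helper name is invented. It also demonstrates why "the source encoding is determined automatically" deserves a tester's question: one-byte Cyrillic encodings overlap, so automatic detection can guess wrong.

<?php
// Hypothetical sketch: re-encode one file's contents to UTF-8
// using the mbstring extension. Not the real converter.php.
function convertFileToUtf8(string $path, string $sourceEncoding): string
{
    $text = file_get_contents($path);
    // mbstring understands the encodings listed in the draft:
    // 'Windows-1251' (CP1251), 'CP866', 'KOI8-R' and 'UTF-8'.
    return mb_convert_encoding($text, 'UTF-8', $sourceEncoding);
}

// Automatic detection over one-byte Cyrillic encodings is ambiguous:
// many byte sequences are valid in several of them at once, so
// mb_detect_encoding() may return a wrong guess -- a ready test idea.
$bytes = file_get_contents('sample.txt');
$guess = mb_detect_encoding($bytes, ['UTF-8', 'Windows-1251', 'KOI8-R', 'CP866'], true);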
Requirements, as we remember, form a hierarchy (see levels and types of requirements{36}), so the next step is to restructure the draft by levels; a use case diagram is sketched along the way (figure 2.2.g below). The draft takes the following shape.

Attention! The text below contains intentionally imperfect requirements: we will improve them step by step.

Business requirements:
* BR-1: The files must be converted to a single encoding.
* BR-2: The application must be implemented in PHP.
* BR-3: The application must run from the command line.

User requirements. The user can:
* UR-1: Launch and stop the conversion process.
 o UR-1.1: The conversion is started by running the PHP script converter.php from the command line.


 o UR-1.2: The conversion is stopped by pressing Ctrl+C.
* UR-2: Rely on defined conversion rules.
 o UR-2.1: The source encoding is determined automatically.
 o UR-2.2: The files are converted to the UTF8 encoding.
* UR-3: Observe the results.
 o UR-3.1: The application writes a log file of the performed operations.
 o UR-3.2: Processed files are moved to the destination directory.

Figure 2.2.g — Use case diagram of the future application (uc Primary Use Cases)

The business requirements also receive their rationale:
* BR-1: Reduce the manual labor of bringing the files to one encoding.
 o BR-1.1: The time spent by the personnel on re-encoding must drop to almost nothing, since the process becomes automatic.
 o BR-1.2: All the files must become readable in a single encoding on all the customer's systems.



Non-functional requirements appear as well:
* NF-1: Performance.
 o NF-1.1: The application must process files at a rate of at least 5 files per second.
* NF-2: Input limits.
 o NF-2.1: The maximum size of a processed file is 50 MB.
 o NF-2.2: Files exceeding this limit are skipped, with a corresponding record in the log.

As the set grows, form starts to matter: as discussed with the typical mistakes{58}, requirements are far easier to maintain in a properly structured document (hierarchical numbered headings, styles, an automatically collected table of contents; figure 2.2.h shows this technique applied to a Word DOCX document).

Figure 2.2.h — Structuring a requirements document with styles in Word

Now the testing of this draft begins: every statement gets questions, and every answer is immediately written down (in parentheses below).

Task 2.2.c: before reading further, apply the properties of good requirements{41} to the draft above and write down your own questions and remarks; then compare them with the walkthrough below.

* BR-2: The application must be implemented in PHP.
 o Which PHP version is assumed? (5.5.x.)
 o Which PHP extensions may be relied upon? (Among others, mbstring.)


 o Why PHP at all? (The customer's administrators know PHP and want to be able to maintain the tool themselves.)
 o May third-party PHP libraries be used? (Yes.)
* BR-3: The application must run from the command line.
 o Under which operating systems? (Windows and Linux: anywhere the required PHP is available.)
 o Who launches it: a person or a scheduler? (Both must be possible.)
* UR-1.1: Started by running converter.php: with which parameters? (The source directory, the destination directory and, optionally, a log file name.) And what if the parameters are missing or wrong? (The application must report the problem and show a usage hint; three distinct error situations were identified:
 o a directory does not exist or is inaccessible;
 o the destination directory is located inside the source directory tree;
 o the log file name is wrong or its path is inaccessible.)
 o UR-1.2: Stopping with Ctrl+C: is it safe in the middle of a file? (The file being processed at that moment may remain unconverted; this is accepted.)
* UR-2.1: The source encoding is "determined automatically": from which list, and how reliably? (The list of supported encodings must be fixed explicitly; files whose encoding cannot be handled are skipped and logged.)
 o UR-2.2: The target is UTF8: what about files that are already in UTF8? (They must pass through intact; converting UTF8 to UTF8 must not corrupt them.)
* UR-3.1: The log file: what exactly goes into it, and where does it live? (Every operation with its result; by default, converter.log next to converter.php.)
 o UR-3.2: The files are "moved": what happens on a name collision in the destination directory? (To be clarified with the customer.)


* NF-1.1: "At least 5 files per second": on which hardware? (A reference configuration was agreed: an i7 CPU and 4 GB of RAM.)
* NF-2.1: What exactly happens to files larger than 50 MB? (They are skipped, with a record in the log.)
* Are subdirectories of the source directory processed too? (Yes, the whole tree.)

And so on: each answer either refines an existing requirement or spawns a new one. With the answers collected, the requirements can be rewritten. Note the techniques applied below:
* every requirement received a unique identifier, so that anything can reference it precisely (traceability);
* vague words were replaced with measurable formulations (unambiguousness, verifiability);
* composite statements were split into atomic ones (atomicity);
* rationale and technical detail moved into explicitly separated blocks of additional information and limitations, keeping the requirements themselves short (modifiability);
* cross-references now bind the user requirements to the limitations that detail them (consistency, traceability).

The additional-information blocks also answer the eternal question "why was it decided this way?", which otherwise gets asked anew on every iteration of the project.


All the levels and types discussed earlier (see {36}) are now put to work, and some answers required going back to the sources of the requirements (see {34}). The improved requirements (a fragment) follow; compare every line with its earlier version.

Business requirements:
* BR-1: Reduce the labor costs of converting the files to a single encoding.
* BR-2: The application is implemented in PHP (the rationale for choosing PHP is given in INFO-1 and INFO-2; the exact PHP version and extensions are fixed in LIM-1).
* BR-3: The application runs from the command line; INFO-3 and INFO-4 describe the operational environment.

User requirements. The user can:
* UR-1: Launch and stop the conversion.
 o UR-1.1: The conversion is started with the command "php converter.php SOURCE_DIR DESTINATION_DIR [LOG_FILE_NAME]" (the parameters are detailed in LIM-2.1, their validation in LIM-2.2, LIM-2.3 and LIM-2.4).
 o UR-1.2: The conversion is stopped with Ctrl+C; the file being processed at that moment may remain unconverted.
* UR-2: Rely on defined conversion rules.
 o UR-2.1: The source encoding is determined automatically within the list fixed in LIM-5.
 o UR-2.2: The files are converted to UTF8 (the rationale is given in INFO-5).
* UR-3: Observe the results.
 o UR-3.1: The application logs its operations into the log file defined in LIM-4, including the parameter problems described in LIM-2.1.
 o UR-3.2: Log records follow LIM-4.1; error situations are reported with the messages defined in LIM-4.2 and LIM-4.3.

Business requirement details:
* BR-1.1: The time spent by the personnel on re-encoding is reduced to the time of launching the application (see UR-2.1 and UR-3.2).
* BR-1.2: All converted files become readable in a single encoding on the customer's systems (see UR-2.1 and UR-3.2).



Non-functional requirements:
* NF-1: Performance.
 o NF-1.1: The application processes files at a rate of at least 5 files per second on the reference configuration: an i7 CPU, 4 GB of RAM and a disk subsystem with a throughput of at least 30 MB/s; see also INFO-6.
* NF-2: Processing limits.
 o NF-2.1: The supported encodings are limited by LIM-5.1.
 o NF-2.2: The maximum file size is limited by LIM-5.2.
 o NF-2.3: The behavior for unsupported inputs is defined by LIM-5.3.

Additional information:
* INFO-1: PHP was chosen because the customer's IT staff can maintain tools written in it.
* INFO-2: The PHP implementation (BR-2) also frees the customer from additional licensing costs.
* INFO-3: The application must not depend on components requiring a graphical environment.
* INFO-4: The application runs under Windows and Linux: on any platform where the PHP version required by LIM-1.1 is available.
* INFO-5: UTF8 was chosen as the single target encoding because all of the customer's systems support it.
* INFO-6: The rate defined in NF-1.1 is understood as an average over a representative set of files, not as a guarantee for every individual file.

Limitations:
* LIM-1: PHP.
 o LIM-1.1: PHP version 5.5 or higher is used.
 o LIM-1.2: The mbstring extension must be installed and enabled.
* LIM-2: Command-line parameters.
 o LIM-2.1: The start command takes the following parameters:
 - SOURCE_DIR: the directory (with its subdirectories) containing the files to convert;
 - DESTINATION_DIR: the directory for the converted files; it may not reside within the SOURCE_DIR tree (see UR-1.1 and UR-1.2);
 - LOG_FILE_NAME: an optional log file name; if omitted, the file converter.log in the directory of converter.php is used.


 o LIM-2.2: If launched without parameters, the application terminates, showing the usage hint (LIM-3.1).
 o LIM-2.3: If launched with invalid parameter values, the application terminates, reporting the problem per LIM-2.1.
 o LIM-2.4: If launched with excess parameters, the application ignores them, shows the usage hint (LIM-3.1) and continues its work (see LIM-3.2).
* LIM-3: Messages.
 o LIM-3.1: The usage hint reads: "USAGE: converter.php SOURCE_DIR DESTINATION_DIR LOG_FILE_NAME".
 o LIM-3.2: The error messages are:
 - "Directory not exists or inaccessible."
 - "Destination dir may not reside within source dir tree."
 - "Wrong file name or inaccessible path."
* LIM-4: The log.
 o LIM-4.1: Each log record has the form: "YYYY-MM-DD HH:II:SS operation_name file_name operation_result".
 o LIM-4.2: New log records are appended to the end of the log file.
 o LIM-4.3: If the log file cannot be created or written, the application terminates with the corresponding message from LIM-3.2.
* LIM-5: Supported inputs.
 o LIM-5.1: The supported source encodings are WIN1251, CP866, KOI8R. The supported file types are:
 - Plain Text (TXT);
 - Hyper Text Markup Language Document (HTML);
 - Mark Down Document (MD).
 o LIM-5.2: Files larger than 50 MB (regardless of type and encoding) are not processed and are reported in the log as skipped.
 o LIM-5.3: Files violating LIM-5.1 are not processed and are reported in the log as skipped.
Task 2.2.d: are these improved requirements perfect; is there anything left to ask, clarify or fix? (Of course there is: perfection is unreachable, and that is normal.) Find the remaining problems and formulate your questions and suggestions.

Task 2.2.e: starting from the initial dialogue, develop your own version of these requirements and then test it with the techniques from this chapter; compare the result with the variant above.


2.2.8. Typical mistakes in requirements analysis and testing

Since analyzing and testing requirements is one of the tester's most intellectually demanding activities, it has its own catalogue of typical mistakes; below are the ones most often seen in beginners' work.

Leaving remarks in the wrong place and form. Requirements usually live in files (most often Word or Excel documents, sometimes exported to read-only formats such as PDF) or in requirements management tools; remarks must be left through the mechanisms agreed on the project. The natural choice for a DOCX document is Word's built-in review and commenting mode (figure 2.2.i); a remark typed straight into the text, or sent in a chat message with no reference to the exact place, loses its authorship, its context and, eventually, itself (figure 2.2.j).

Figure 2.2.i — Leaving remarks with Word's review tools
Figure 2.2.j — A remark that is easy to lose

Typical problems with the remarks themselves begin with verdicts instead of questions: "this is wrong", "this will not work", with no explanation of what exactly is unclear or contradictory and no proposal; the author of the requirement can do nothing useful with such a remark.


A question must move the work forward. Compare "What is this?!" with "Should the list of supported formats from the earlier paragraph also apply here? If yes, let us add a cross-reference." A good remark names the exact place, the problem, the property it violates (see the techniques{46} and the sample questions in table 2.2.a{47}) and, ideally, a proposed solution.

Further frequent mistakes in formulating remarks:
* Vague interrogatives of the "what-where-when" kind ("and what about this?", "where is it?", "when does it happen?") with no indication of what answer is expected and what will be done with it.
* Binary questions where an open question is needed, and vice versa: "Will the application support other encodings?" invites a useless "yes/no", while "Which encodings exactly must be supported?" produces the list that is actually needed; conversely, when the set of options is already known, the question should enumerate them so the author can simply choose.
* Unanchored references: "somewhere above it contradicts itself" forces the author to search; every remark must point to the exact paragraph and quote the conflicting fragments.
* Emotions and evaluations of the author instead of the text: the goal of requirements testing is a better document, not a won argument; remarks addressed to the text are accepted easily, remarks addressed to the person are resisted.


The quantity and granularity of remarks matter too. Having received 20–30 comments per page, an author simply stops reading them; if a document is that raw, it is better to return it with 5–10 of the most significant, system-level remarks (and the advice to self-review it against the properties of good requirements) than to bury the author under dozens of small comments that will all become obsolete after the first serious rewrite.

Numbers deserve special vigilance. Readers tend to skip figures as "obviously correct", while inconsistent values are among the most dangerous defects: if one place in the document says 20 and another says 30, one of them is wrong, and it is the reviewer's duty to ask which. The same applies to units, boundary values and everything countable.

Finally, do not check only the sentences: check the whole. Individually well-written requirements may contradict one another or leave gaps between themselves (recall the lost-remark example of figure 2.2.j: problems of the whole are invisible at the level of a single line), so part of every review must be performed at the level of the document: its structure, the completeness of coverage, the consistency of its cross-references.


A remark also gains weight when it classifies the problem it reports: "not verifiable: no measurable criterion", "not atomic: three independent behaviors in one sentence". The vocabulary of the properties of good requirements exists exactly for this, and using it turns a vague complaint into an actionable diagnosis. And do not stop after the first problem found in a fragment: a fragment may violate several properties at once, and the author is best served by a complete diagnosis rather than a drip of remarks, one per review round.

To summarize: test requirements the way you test applications: systematically, constructively, and with documented, reproducible, precisely addressed results. This skill pays off throughout the whole project, because a defect stopped at the requirements stage is the cheapest defect you will ever find:
* it is found before any code depends on it;
* it is fixed by editing text, not by re-architecting a system;
* it never reaches the users (nor the support service, nor the reputation of the product).


2.3. Kinds and areas of testing
2.3.1. A simple classification of testing

There are very many kinds of testing, and without some classification it is hard both to navigate the terminology and to choose what a particular project actually needs; classifications group kinds of testing by a common attribute and, which is even more valuable, show when and why each of them applies.

Let us start with a simple scheme (figure 2.3.a) built along a few practically important dimensions.

Figure 2.3.a — A simple classification of testing

* By code execution:
 o static testing: the code is not executed;
 o dynamic testing: the code is executed.
* By access to the application's code and architecture:
 o white box testing;
 o black box testing;
 o gray box testing, combining features of both.
* By automation level:
 o manual testing;
 o automated testing.
* By formalization level:
 o testing based on test cases and other prepared documentation;
 o exploratory testing;
 o ad hoc testing, performed without any plan or documentation at all.
* By (decreasing) importance of the tested functions (by functional testing levels):
 o smoke testing (the term comes from hardware engineering109): checking the most important, vital functionality, whose failure makes any further work with the build pointless;

109 Smoke test, Wikipedia [http://en.wikipedia.org/wiki/Smoke_testing_(electrical)]



 o critical path testing: checking the functionality used by typical users in their typical day-to-day scenarios;
 o extended testing: checking all the remaining (declared) functionality.
* By the positivity of scenarios:
 o positive testing: checking the application's behavior in expected, "correct" situations, when the user does everything right;
 o negative testing: checking the behavior in unexpected, "incorrect" situations (wrong data, wrong actions, environment failures), where a disproportionate share of defects lives.

(!) Important! These dimensions are not mutually exclusive "departments": one and the same check is characterized along every dimension at once (it may be, say, a manual, negative, smoke-level check), and the application areas overlap rather than divide the work.

Also be prepared for terminological fuzziness: in the literature the words "kind", "type" and "direction" of testing (testing type, testing kind) are used loosely and interchangeably, and various classifications partially overlap, just as classifications coexist in any other field110, 111; do not be surprised to meet the same activity under different names in different sources.
110 , Wikipedia [https://ru.wikipedia.org/wiki/]


111 , Wikipedia [https://ru.wikipedia.org/wiki/]



2.3.2. A detailed classification of testing
2.3.2.1. The big picture

Now we are ready for the detailed classification. Figures 2.3.b and 2.3.c present it in two languages: figure 2.3.b in Russian and figure 2.3.c in English (which is also a good way to learn the terminology; dashed links between items mean that the same kind of testing appears in several classification branches).

This is not "the one true classification": other authors build different ones, and comparing them is instructive; see, for example:
* the overview article cited below112;
* the classification in "A Practitioner's Guide to Software Test Design" (Lee Copeland);
* "Types of Software Testing: List of 100 Different Testing Types"113.

Why learn a classification at all? Not for its own sake: it gives the tester a vocabulary for communicating with colleagues and customers, a map for choosing what a particular project needs, and a checklist of viewpoints ("have we thought about this kind of testing?"). The ISTQB glossary and syllabi114 remain the baseline reference: when in doubt about a term, check there first.

One more time: do not try to memorize the figures. Use them as maps, returning to them as the corresponding sections of this chapter are read; what matters is understanding why each branch exists, not reciting its leaves.

112 An overview of testing classifications (in Russian) [http://habrahabr.ru/company/npo-comp/blog/223833/]
113 Types of Software Testing: List of 100 Different Testing Types [http://www.guru99.com/types-of-software-testing.html]
114 International Software Testing Qualifications Board, Downloads. [http://www.istqb.org/downloads.html]



When studying figures 2.3.b and 2.3.c, keep two things in mind:
1) The classification dimensions are not mutually exclusive: one and the same testing activity belongs to several branches simultaneously (it may at once be manual, negative and smoke-level), and some kinds of testing genuinely appear in several branches: the cross-links in the figures mark exactly such cases.
2) The figures are deliberately wide rather than deep: each item is discussed in its own section below, and the figures serve as the map of those sections.

The full-size images of both figures are available online115.
115 Full-size images: figure 2.3.b [http://svyatoslav.biz/wp-pics/software_testing_classification_ru.png]; figure 2.3.c [http://svyatoslav.biz/wp-pics/software_testing_classification_en.png]





Figure 2.3.b — The detailed classification of testing (Russian-language version; its content mirrors the English-language figure 2.3.c below)


Figure 2.3.c — Software testing classification (English-language version). Its main branches:
* By code execution: static testing, dynamic testing.
* By access to application code and architecture: white box, black box, gray box (static and dynamic methods cut across these).
* By automation level: manual, automated, manual + automatic.
* By specification level (testing levels): unit testing, integration testing, system testing.
* By functions under test importance (functional testing levels, decreasingly): smoke testing, critical path testing, extended testing.
* By application nature: web app, mobile app, desktop app testing, and so on.
* By architecture tier: presentation tier, business-logic tier, data tier testing.
* By end users participation: alpha, beta, gamma testing.
* By formalization level: test case based, exploratory, ad hoc.
* By aims and goals: functional and nonfunctional testing (installation, usability, accessibility, interface, security, internationalization and localization testing; reliability, recoverability, failover, data quality and database integrity, resource utilization testing; performance testing: load (capacity), scalability, volume, stress, concurrency testing; compatibility and configuration testing: cross-browser, parallel, comparison, qualification testing), positive and negative testing, regression testing, re-testing, acceptance testing, operational and development testing, A/B testing, exhaustive testing.
* By techniques and approaches: positive and negative testing; by error source (knowledge): error guessing, error seeding, mutation testing; based on the tester's experience, scenarios and checklists: exploratory and ad hoc testing, heuristic evaluation; by intrusion to the application's work process: intrusive and nonintrusive testing; by automation techniques: data-driven, keyword-driven, behavior-driven testing; by input data selection techniques: equivalence partitioning, boundary value analysis, domain testing, pairwise testing, orthogonal array testing, random testing; by models of application behavior: decision table testing, state transition testing, specification-based testing, model-based testing, use case testing; by code structures: statement, branch, condition, multiple condition, modified condition decision, decision, path testing, control flow testing, data flow testing; code review.
* By chronology: the general chronology of levels; by component hierarchy: bottom-up, top-down, hybrid testing; by attention to requirements and their components: positive/negative (simple and complex), functional and nonfunctional components testing; general typical scenarios combining smoke, critical path, extended, unit, integration, system, alpha, beta, gamma, acceptance and operational testing.


2.3.2.2. Classification by code execution

This dimension answers a simple question: is the application's code executed during testing?

Static testing (static testing116) does not involve executing the code. It includes:
 o testing of documentation (requirements, test plans, architecture descriptions and so on);
 o static analysis of the code by tools (style checkers, compilers' diagnostics and the like);
 o review of code and documents by humans: the classical code review117 and the peer review techniques already familiar from requirements testing{46}: walkthroughs, technical reviews, inspections; related code techniques are also discussed with white-box structures{92};
 o verification of non-executable artifacts such as configuration and data;
 o any other form of checking that does not require a running application.

Dynamic testing (dynamic testing118) requires executing the code. It may be performed at the unit level ({72}), the integration level ({72}) or the system level ({73}), but its essence is always the same: the behavior of the running application (or of its part) is stimulated and observed, and the actual results are compared with the expected ones.

2.3.2.3. Classification by access to the application's code and architecture

White box testing (white box testing119, also open box, clear box, glass box testing) is the method in which the tester has full access to the source code and the architecture of the application and uses this knowledge to design the checks; test design driven by the architecture and the detailed design is also called design-based testing (design-based testing120). The corresponding techniques of covering code structures are discussed later ({91}), together with static code techniques ({91}, {92}); note that white-box testing correlates strongly (though not identically) with unit-level testing ({72}): the access dimension and the level dimension are independent.

116 Static testing. Testing of a software development artifact, e.g., requirements, design or code, without execution of these artifacts,
e.g., reviews or static analysis. [ISTQB Glossary]
117 Jason Cohen, Best Kept Secrets of Peer Code Review (Modern Approach. Practical Advice.). -
: http://smartbear.com/SmartBear/media/pdfs/best-kept-secrets-of-peer-
code-review.pdf
118 Dynamic testing. Testing that involves the execution of the software of a component or system. [ISTQB Glossary]
119 White box testing. Testing based on an analysis of the internal structure of the component or system. [ISTQB Glossary]
120 Design-based Testing. An approach to testing in which test cases are designed based on the architecture and/or detailed design
of a component or system (e.g. tests of interfaces between components or systems). [ISTQB Glossary]

. . EPAM Systems, 20152017 : 68/296


Black box testing (black box testing121, also closed box testing, specification-based testing) is the method in which the tester relies on the external descriptions of the application: requirements, specifications, user documentation, without access to (or without using knowledge of) the code. In figures 2.3.b and 2.3.c this method is connected with requirements-based testing (requirements-based testing122): test objectives and test conditions are derived from the requirements, which, among other things, means that the quality of the requirements directly determines the quality of the tests.

Gray box testing (gray box testing123) is the combination of the two: the internal structure is known partially, for example the tester can study the data structures and algorithms while designing tests, but executes them at the user, black-box level. In practice most real testing is gray-box to some degree: even a purely "external" tester usually knows something about the internals and legitimately uses that knowledge124.

(!) Note: some sources insist that, since the end user does not see the internals, "honest" testing must be strictly black-box; this is a misconception. The goal of testing is to find as many problems as possible as cheaply as possible, and any knowledge that helps is legitimate.

Table 2.3.a summarizes the strengths and weaknesses of the approaches.

121 Black box testing. Testing, either functional or non-functional, without reference to the internal structure of the component or
system. [ISTQB Glossary]
122 Requirements-based Testing. An approach to testing in which test cases are designed based on test objectives and test condi-
tions derived from requirements, e.g. tests that exercise specific functions or probe non-functional attributes such as reliability or
usability. [ISTQB Glossary]
123 Gray box testing is a software testing method, which is a combination of Black Box Testing method and White Box Testing
method. In Gray Box Testing, the internal structure is partially known. This involves having access to internal data structures
and algorithms for purposes of designing the test cases, but testing at the user, or black-box level. [Gray Box Testing Funda-
mentals, http://softwaretestingfundamentals.com/gray-box-testing].
124 Gray box testing (gray box) definition, Margaret Rouse [http://searchsoftwarequality.techtarget.com/definition/gray-box]



Table 2.3.a — Advantages and disadvantages of white box and black box testing (summary):
* White box testing: the tester sees exactly what the code does, coverage can be measured precisely against code structures, and defects are localized quickly; but the view is biased by the implementation (errors of omission are easy to miss), it requires programming qualification, and it verifies the code against itself rather than against the user's needs; the related test design techniques are discussed with checklists and test cases{141}.
* Black box testing: it reproduces the user's viewpoint, requires no access to the code, and directly verifies the requirements; its test design techniques are likewise covered later{141}; but coverage is hard to measure, the causes of failures are harder to localize, and some defects are practically unreachable without knowledge of the internals.
* Gray box testing combines the advantages (and, partially, the costs) of both approaches.

2.3.2.4. Classification by automation level

Manual testing (manual testing125) is performed by a human: the tester carries out all the actions on the application step by step, personally observes the results and decides whether each check passed or failed. Manual testing is always a part of any testing effort; it is especially useful early on, while the application and its interface are not yet stable enough for automation to pay off.

125 Manual testing is performed by the tester who carries out all the actions on the tested application manually, step by step and
indicates whether a particular step was accomplished successfully or whether it failed. Manual testing is always a part of any
testing effort. It is especially useful in the initial phase of software development, when the software and its user interface are not
stable enough, and beginning the automation does not make sense. (SmartBear TestComplete user manual, http://sup-
port.smartbear.com/viewarticle/55004/)



Automated testing (automated testing, test automation126) uses software to control the execution of tests, to compare the actual outcomes with the predicted ones, to set up test preconditions and to report the results. Despite the name, it does not remove the human from the process: people design the checks, implement and maintain the test code, and analyze the results; the machine only executes. (A sensible discussion of the related myths can be found in the overview article cited earlier112.)

Both approaches have well-known strengths and weaknesses, summarized in table 2.3.b.

Table 2.3.b — Advantages and disadvantages of manual and automated testing (summary):
* Manual testing: flexible and immediately applicable; excellent for exploratory work, usability evaluation and one-off checks; requires no development investment; but it is slow, expensive in repetition, error-prone in monotonous operations and practically useless for load and performance measurement.
* Automated testing: fast and repeatable once implemented; ideal for regression checks; precise where humans are careless (comparison of large outputs, computations); irreplaceable where humans physically cannot act (load testing with thousands of simultaneous virtual users); but it demands significant initial and ongoing investment in developing and maintaining the test code, finds only what it was programmed to find, and raises false alarms whenever the application legitimately changes.

The key criterion for deciding what to automate is economical: automation pays off where the same checks are repeated many times, and where the required test coverage (test coverage127) is unattainable manually; detailed automation strategies are a large separate topic.

126 Test automation is the use of software to control the execution of tests, the comparison of actual outcomes to predicted outcomes,
the setting up of test preconditions, and other test control and test reporting functions. Commonly, test automation involves
automating a manual process already in place that uses a formalized testing process. (Ravinder Veer Hooda, An Automation of
Software Testing: A Foundation for the Future)
127 Coverage, Test coverage. The degree, expressed as a percentage, to which a specified coverage item has been exercised by a
test suite. [ISTQB Glossary]

. . EPAM Systems, 20152017 : 71/296


2.3.a:
. : -
-
.

2.3.2.5. Unit, integration, and system testing (classification by the level of detail of the application)

! Reminder: since different authors interpret the terms differently, here and below we assume that:
module = component;
module (component) testing = unit testing.

Unit (module, component) testing (unit testing, module testing, component testing128) checks the individual components (modules) of the application separately from one another. As a rule it is performed with the help of automation{71} (unit-testing frameworks) and mostly by the developers themselves, since it requires access to the code and an understanding of its internal logic; nevertheless, a tester benefits greatly from understanding this level, because many checks are cheapest to perform exactly here.

Integration testing (integration testing129, component integration testing130, pairwise integration testing131, system integration testing132, incremental testing133, interface testing134, thread testing135) checks the interaction between the components of the application, and between the application and other systems (as the footnoted names show, this level has many variations depending on what exactly is being integrated and in what order).
128 Module testing, Unit testing, Component testing. The testing of individual software components. [ISTQB Glossary]
129 Integration testing. Testing performed to expose defects in the interfaces and in the interactions between integrated components
or systems. [ISTQB Glossary]
130 Component integration testing. Testing performed to expose defects in the interfaces and interaction between integrated com-
ponents. [ISTQB Glossary]
131 Pairwise integration testing. A form of integration testing that targets pairs of components that work together, as shown in a call
graph. [ISTQB Glossary]
132 System integration testing. Testing the integration of systems and packages; testing interfaces to external organizations (e.g.
Electronic Data Interchange, Internet). [ISTQB Glossary]
133 Incremental testing. Testing where components or systems are integrated and tested one or some at a time, until all the compo-
nents or systems are integrated and tested. [ISTQB Glossary]
134 Interface testing. An integration test type that is concerned with testing the interfaces between components or systems. [ISTQB
Glossary]
135 Thread testing. An approach to component integration testing where the progressive integration of components follows the im-
plementation of subsets of the requirements, as opposed to the integration of components by levels of a hierarchy. [ISTQB
Glossary]



The ways of organizing integration testing (bottom-up, top-down, and their combination) are considered separately below, in the section on combining and sequencing the kinds of testing{96}.

System testing (system testing136) checks the integrated application as a whole and verifies that it meets the specified requirements. Two things are characteristic of this level: the application is examined in an environment maximally close to the real one, and the checks can be built purely on the requirements, without access to the code - that is, the black-box approach is fully applicable here.

The correlation of these levels with each other and with the development process is shown schematically in figure 2.3.d.

Figure 2.3.d - Testing levels

Note that ISTQB defines a test level (test level137) more broadly: as a group of test activities that are organized and managed together and linked to certain responsibilities in the project; component, integration, system, and acceptance{82} testing are the classical examples. A good short overview is the article "What are Software Testing Levels?"138. Figure 2.3.e shows one more view of how the levels correlate with each other.

136 System testing. The process of testing an integrated system to verify that it meets specified requirements. [ISTQB Glossary]
137 Test level. A group of test activities that are organized and managed together. A test level is linked to the responsibilities in a
project. Examples of test levels are component test, integration test, system test and acceptance test. [ISTQB Glossary]
138 What are Software Testing Levels? [http://istqbexamcertification.com/what-are-software-testing-levels/]



Figure 2.3.e - Testing levels

2.3.2.6. Smoke testing, critical path testing, extended testing (classification by the importance of the checked functions)

! Reminder: since different authors interpret the terms differently, here and below we assume that:
module = component;
module (component) testing = unit testing.

Smoke testing (smoke test139, intake test140, build verification test141) checks the most important, most critical functionality of the application. It answers one question: does the product work at all in its most fundamental scenarios, and is the build worth any further testing
139 Smoke test, Confidence test, Sanity test. A subset of all defined/planned test cases that cover the main functionality of a
component or system, to ascertaining that the most crucial functions of a program work, but not bothering with finer details.
[ISTQB Glossary]
140 Intake test. A special instance of a smoke test to decide if the component or system is ready for detailed and further testing. An
intake test is typically carried out at the start of the test execution phase. [ISTQB Glossary]
141 Build verification test. A set of automated tests which validates the integrity of each new build and verifies its key/core function-
ality, stability and testability. It is an industry practice when a high frequency of build releases occurs (e.g., agile projects) and it
is run on every new build before the build is released for further testing. [ISTQB Glossary]



(smoke testing is typically performed on every new build, immediately after it appears).

! A very common mistake! Do not confuse smoke testing with acceptance testing: smoke test{74} and acceptance test{82} answer different questions at different moments of the project, even though both serve as "gates". Smoke testing decides whether the build is suitable for further testing; acceptance testing decides whether the product is acceptable to the customer.

Because smoke tests must be executed very often and very fast, they are the first candidates for automation; in a well-organized process, automated smoke tests cover 100 % of the key functionality and are run on 100 % of new builds.

It is often said that smoke test and sanity test are the same thing, and ISTQB supports this view directly: "sanity test: See smoke test." Some authors142, 143, however, distinguish the two (figure 2.3.f).

Figure 2.3.f - One of the views on the difference between smoke test and sanity test


Critical path testing (critical path144 test) checks the functionality that typical users use in their typical day-to-day activities. The rationale: most users, most of the time, exercise a rather limited subset of the application's functions and scenarios; it is these functions, and the most probable ways of using them, that must be verified first, as soon as the smoke test has passed
142 Smoke Vs Sanity Testing Introduction and Differences [http://www.guru99.com/smoke-sanity-testing.html]


143 Smoke testing and sanity testing Quick and simple differences [http://www.softwaretestinghelp.com/smoke-testing-and-san-
ity-testing-difference/]
144 Critical path. Longest sequence of activities in a project plan which must be completed on time for the project to complete on due
date. An activity on the critical path cannot be started until its predecessor activity is complete; if it is delayed for a day, the entire
project will be delayed for a day unless the activity following the delayed activity is completed a day earlier. [http://www.business-
dictionary.com/definition/critical-path.html]



(see figure 2.3.g). Checks at this level go noticeably deeper than the smoke test: they vary scenarios and data and include the most probable negative situations; as a rule, they cover on the order of 70-90 % of the application's functionality.

Figure 2.3.g - Critical path testing
Extended testing (extended test145) checks the functionality declared in the requirements that did not make it into the critical path. This does not mean that unimportant things are checked here: the checks concern functions that are used rarely, or whose incorrect behavior is not critical for the user, as well as the less probable combinations of data and conditions (they often add on the order of another 30-50 % of checks).

Note also that smoke, critical path, and extended testing correlate with positive{77} and negative{77} testing: the deeper the level, the greater the share of negative checks; the smoke test, by contrast, usually consists of strictly positive checks of the most important functions. The classification as a whole is summarized in figure 2.3.h.

145 Extended test. The idea is to develop a comprehensive application system test suite by modeling essential capabilities as ex-
tended use cases. [By Extended Use Case Test Design Pattern, Rob Kuijt]



Figure 2.3.h - Smoke testing, critical path testing, extended testing (classification by the importance of the checked functions)

2.3.2.7. Positive and negative testing

Positive testing (positive testing146) checks the application in a situation where everything goes right: only valid data is used, only expected actions are performed. Its main goal is to show that the application fulfills its purpose under normal use; if the application does not work correctly even in this gentle mode, nothing else matters yet.

Negative testing (negative testing147, invalid testing148) checks the application in situations where something goes wrong: invalid (unacceptable) data is supplied, incorrect actions are performed, the expected workflow is violated. Its main goal is to verify robustness: the application must react to incorrect situations in a predictable, controlled way (for example, by showing an adequate error message), without crashing or corrupting data. Negative testing is extremely important, because a large share of serious failures happens precisely in unforeseen situations.

146 Positive testing is testing process where the system validated against the valid input data. In this testing tester always check for
only valid set of values and check if application behaves as expected with its expected inputs. The main intention of this testing
is to check whether software application not showing error when not supposed to & showing error when supposed to. Such
testing is to be carried out keeping positive point of view & only execute the positive scenario. Positive Testing always tries to
prove that a given product and project always meets the requirements and specifications. [http://www.softwaretest-
ingclass.com/positive-and-negative-testing-in-software-testing/]
147 Negative testing. Tests aimed at showing that a component or system does not work. Negative testing is related to the testers
attitude rather than a specific test approach or test design technique, e.g. testing with invalid input values or exceptions. [ISTQB
Glossary]
148 Invalid testing. Testing using input values that should be rejected by the component or system. [ISTQB Glossary]



2.3.2.8. Testing by the type of the application

Depending on the kind of application under test, specific kinds of testing with their own emphases are distinguished; the three most common today are the following.

Web application testing (web-applications testing) pays special attention to compatibility testing{84} (in particular, cross-browser testing{85}) and to performance testing{86}, since a web application must work correctly in many environments and serve many users at once.

Mobile application testing (mobile applications testing) likewise emphasizes compatibility{84} and resource utilization{86} (a mobile application has to run on a multitude of devices and operating system versions under tight resource constraints).

Desktop application testing (desktop applications testing) is the classical, most traditional case; inside it, further subdivisions are used depending on the application's interface and role: console application testing (console applications testing) and GUI application testing (GUI-applications testing), server application testing (server applications testing) and client application testing (client applications testing), and so on.

2.3.2.9. Testing by the architectural tier of the application

In multitier applications, testing is often organized by tier.

Presentation tier testing focuses on the part of the application responsible for interaction with the outside world (with the user or with other systems). Typical defects here are interface flaws and errors in displaying and accepting data.

Business logic tier testing focuses on the tier that implements the key algorithms and business rules of the application; most functional defects live here, which makes this tier the main object of functional checks.

Data tier testing focuses on the part of the application responsible for storing and accessing data (most often, a database). Typical checks here concern data integrity, the correctness of create-read-update-delete operations, and the behavior under concurrent access.

More about multitier architecture can be found in the dedicated literature149.

149 Multitier architecture, Wikipedia [http://en.wikipedia.org/wiki/Multitier_architecture]



2.3.2.10. Alpha, beta, and gamma testing

This classification correlates with acceptance testing{83}.

Alpha testing (alpha testing150) is performed while the product is still being actively developed, at the developer's site, by potential users/customers or by an independent test team; for off-the-shelf software it often plays the role of internal acceptance testing{82}. At this stage the feature set is still changing, so alpha testing mostly reveals high-level problems: missing or incorrectly working functions, usability flaws, and the like.

Beta testing (beta testing151) is performed on an almost finished product by potential and/or existing users/customers outside the development organization{82}. Its goal is to get feedback from the market: whether the product satisfies user needs and fits into their business processes.

Gamma testing (gamma testing152) is the final check before release: the product is complete, no new features are developed, only tightly scoped bug fixes are allowed{82}, and the check confirms that the product is ready for release against the specified requirements.

2.3.2.11. Testing by the degree of formalization (scripted, exploratory, ad hoc)

Scripted testing (scripted testing153, test case based testing) is performed according to previously documented test cases and scenarios: the tester follows the prepared sequence of steps and compares the actual results with the documented expected ones. Deviations from the documentation are minimal, which makes the process repeatable and verifiable.

150 Alpha testing. Simulated or actual operational testing by potential users/customers or an independent test team at the developers
site, but outside the development organization. Alpha testing is often employed for off-the-shelf software as a form of internal
acceptance testing. [ISTQB Glossary]
151 Beta testing. Operational testing by potential and/or existing users/customers at an external site not otherwise involved with the
developers, to determine whether or not a component or system satisfies the user/customer needs and fits within the business
processes. Beta testing is often employed as a form of external acceptance testing for off-the-shelf software in order to acquire
feedback from the market. [ISTQB Glossary]
152 Gamma testing is done when software is ready for release with specified requirements, this testing done directly by skipping all
the in-house testing activities. The software is almost ready for final release. No feature development or enhancement of the
software is undertaken and tightly scoped bug fixes are the only code. Gamma check is performed when the application is ready
for release to the specified requirements and this check is performed directly without going through all the testing activities at
home. [http://www.360logica.com/blog/2012/06/what-are-alpha-beta-and-gamma-testing.html]
153 Scripted testing. Test execution carried out by following a previously documented sequence of tests. [ISTQB Glossary]



Exploratory testing (exploratory testing154) is testing in which the tester simultaneously designs and executes checks{141}, using the information obtained from the checks just performed to design new and better ones. There is no detailed pre-written documentation: the tester actively controls the process, guided by the actual behavior of the application. A more formalized variant organizes the work as uninterrupted sessions of test design and execution (session-based testing155); a popular lightweight support for exploratory testing is the checklist (checklist-based testing156). A good introduction to the approach is James Bach's article "What is Exploratory Testing?"157.

Ad hoc testing (ad hoc testing158) is carried out with no formal preparation at all: no recognized test design techniques, no documented expectations; improvisation guides the process. It is a form of experience-based testing (experience-based testing159), relying on the tester's experience, knowledge, and intuition. Ad hoc testing is sometimes (undeservedly?) considered the least productive form of testing; in skilled hands it reveals defects that formalized approaches miss.

The three approaches thus differ mainly in how strictly the tester's actions are prescribed by documentation.

2.3.2.12. Functional and non-functional kinds of testing

Positive testing (see above{77}).
Negative testing (see above{77}).

Functional testing (functional testing160) checks the correctness of the application's functions, i.e., verifies the functional requirements{38}. It can be performed both from the black-box{69} position (on the basis of the specification) and from the white-box{68} position (on the basis of the code).
154 Exploratory testing. An informal test design technique where the tester actively controls the design of the tests as those tests
are performed and uses information gained while testing to design new and better tests. [ISTQB Glossary]
155 Session-based Testing. An approach to testing in which test activities are planned as uninterrupted sessions of test design and
execution, often used in conjunction with exploratory testing. [ISTQB Glossary]
156 Checklist-based Testing. An experience-based test design technique whereby the experienced tester uses a high-level list of
items to be noted, checked, or remembered, or a set of rules or criteria against which a product has to be verified. [ISTQB
Glossary]
157 What is Exploratory Testing?, James Bach [http://www.satisfice.com/articles/what_is_et.shtml]
158 Ad hoc testing. Testing carried out informally; no formal test preparation takes place, no recognized test design technique is
used, there are no expectations for results and arbitrariness guides the test execution activity. [ISTQB Glossary]
159 Experience-based Testing. Testing based on the testers experience, knowledge and intuition. [ISTQB Glossary]
160 Functional testing. Testing based on an analysis of the specification of the functionality of a component or system. [ISTQB
Glossary]

. . EPAM Systems, 20152017 : 80/296


, -
(functional testing160) -
(functionality testing161). -
What is Functional testing (Testing
of functions) in software?162,
What is functionality testing in software?163.
, :
( -
) ,
, ;
,

, .
Non-functional testing (non-functional testing164) checks the attributes of the application that do not relate to functionality{38}: reliability, efficiency, usability, maintainability, portability, and so on. Many kinds of non-functional testing are considered below.

Installation testing (installation testing, installability testing165) checks that the application can be successfully installed (and removed) in the specified environment. A thorough scenario verifies, among other things:
o installation on a "clean" environment, strictly following the installation instructions;
o installation over an existing older version (upgrade);
o installation of an older version over a newer one (downgrade);
o repeated installation over the same version;
o interruption and resumption of the installation process;
o installation with insufficient resources (e.g., disk space);
o removal of the application;
o the state of the environment after removal.

161 Functionality testing. The process of testing to determine the functionality of a software product (the capability of the software
product to provide functions which meet stated and implied needs when the software is used under specified conditions). [ISTQB
Glossary]
162 What is Functional testing (Testing of functions) in software? [http://istqbexamcertification.com/what-is-functional-testing-test-
ing-of-functions-in-software/]
163 What is functionality testing in software? [http://istqbexamcertification.com/what-is-functionality-testing-in-software/]
164 Non-functional testing. Testing the attributes of a component or system that do not relate to functionality, e.g. reliability, efficiency,
usability, maintainability and portability. [ISTQB Glossary]
165 Installability testing. The process of testing the installability of a software product. Installability is the capability of the software
product to be installed in a specified environment. [ISTQB Glossary]



Regression testing (regression testing166) checks that the changes made to the application or its environment (new features, fixes, configuration changes) have not broken the functionality that worked before. Frederick Brooks wrote in "The Mythical Man-Month"167 that fixing a defect has a substantial (20-50 %) chance of introducing a new one, which makes regression testing one of the key activities of any iterative development process.

Re-testing (re-testing168, confirmation testing) is the re-execution of the test cases that previously revealed a defect, in order to confirm that the defect has been fixed. It is tied to the life cycle of the defect report{165}: on the result of re-testing, the fix is either confirmed or the defect is reopened.

Acceptance testing (acceptance testing169) is formal testing with respect to user needs, requirements, and business processes, conducted to determine whether the system satisfies the acceptance criteria and to enable the customer (user, authorized entity) to decide whether to accept the system. Depending on who performs it and where, the following kinds are distinguished:
o factory acceptance testing170 - performed at the developer's site by the supplier's own employees; compare with alpha testing{79};
o operational acceptance testing171 (production acceptance testing) - operational testing{83} in the acceptance phase, typically performed by administrators in a (simulated) operational environment, focusing on recoverability, resource behavior, installability, technical compliance, and the like;
o site acceptance testing172 - performed by users/customers at their own site, to determine whether the system satisfies their needs and fits into their business processes.

166 Regression testing. Testing of a previously tested program following modification to ensure that defects have not been introduced
or uncovered in unchanged areas of the software, as a result of the changes made. It is performed when the software or its
environment is changed. [ISTQB Glossary]
167 Frederick Brooks, The Mythical Man-Month.
168 Re-testing, Confirmation testing. Testing that runs test cases that failed the last time they were run, in order to verify the success
of corrective actions. [ISTQB Glossary]
169 Acceptance Testing. Formal testing with respect to user needs, requirements, and business processes conducted to determine
whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorized entity to determine
whether or not to accept the system. [ISTQB Glossary]
170 Factory acceptance testing. Acceptance testing conducted at the site at which the product is developed and performed by
employees of the supplier organization, to determine whether or not a component or system satisfies the requirements, normally
including hardware as well as software. [ISTQB Glossary]
171 Operational acceptance testing, Production acceptance testing. Operational testing in the acceptance test phase, typically
performed in a (simulated) operational environment by operations and/or systems administration staff focusing on operational
aspects, e.g. recoverability, resource-behavior, installability and technical compliance. [ISTQB Glossary]
172 Site acceptance testing. Acceptance testing by users/customers at their site, to determine whether or not a component or system
satisfies the user/customer needs and fits within the business processes, normally including hardware as well as software.
[ISTQB Glossary]



Operational testing (operational testing173) evaluates the application in its operational environment (operational environment174): on real (or maximally realistic) hardware and software, with realistic data, operating systems, database management systems, and so on.

Usability testing (usability175 testing) evaluates how easy the application is to understand, learn, and operate (understandability176, learnability177, operability178), and how attractive it is to the user (attractiveness179) under the specified conditions. It is performed from the position of the end user; dedicated techniques exist for it, from expert reviews to experiments with user groups.

! Usability testing (usability175 testing) is NOT the same as GUI testing (GUI testing184)! The former evaluates convenience for the user, the latter evaluates the interface's conformance to requirements.

Accessibility testing (accessibility testing180) determines how easily users with disabilities (visual, hearing, motor impairments, and so on) can use the application.

Interface testing (interface testing181) checks the interfaces through which the application interacts with the outside world. Although the ISTQB definition ties this term to integration testing{72}, in practice it most often means API testing (API testing182) and command-line interface testing (CLI testing183): checks performed by submitting commands to the application through its programming or command-line interfaces, bypassing the graphical interface. Testing through the graphical interface is a separate kind (GUI testing184).

173 Operational testing. Testing conducted to evaluate a component or system in its operational environment. [ISTQB Glossary]
174 Operational environment. Hardware and software products installed at users or customers sites where the component or system
under test will be used. The software may include operating systems, database management systems, and other applications.
[ISTQB Glossary]
175 Usability. The capability of the software to be understood, learned, used and attractive to the user when used under specified
conditions. [ISTQB Glossary]
176 Understandability. The capability of the software product to enable the user to understand whether the software is suitable, and
how it can be used for particular tasks and conditions of use. [ISTQB Glossary]
177 Learnability. The capability of the software product to enable the user to learn its application. [ISTQB Glossary]
178 Operability. The capability of the software product to enable the user to operate and control it. [ISTQB Glossary]
179 Attractiveness. The capability of the software product to be attractive to the user. [ISTQB Glossary]
180 Accessibility testing. Testing to determine the ease by which users with disabilities can use a component or system. [ISTQB
Glossary]
181 Interface Testing. An integration test type that is concerned with testing the interfaces between components or systems. [ISTQB
Glossary]
182 API testing. Testing performed by submitting commands to the software under test using programming interfaces of the applica-
tion directly. [ISTQB Glossary]
183 CLI testing. Testing performed by submitting commands to the software under test using a dedicated command-line interface.
[ISTQB Glossary]
184 GUI testing. Testing performed by interacting with the software under test via the graphical user interface. [ISTQB Glossary]



! GUI testing (GUI testing184) is NOT the same as usability testing (usability175 testing)! GUI testing verifies that the graphical interface conforms to its requirements; by itself it says nothing about how convenient the interface is.

Security testing (security testing185) evaluates how well the application is protected against attacks, unauthorized access to data and functionality, and other security threats. A good overview is the article "What is Security testing in software testing?"186.

Internationalization testing (internationalization testing, i18n testing, globalization187 testing, localizability188 testing) checks that the application is ready to be adapted to different languages and regional standards without changes to the source code. Attention is paid not only to the translatability of texts (no hard-coded strings, support for the needed scripts) but also to locale-dependent data processing (date, time, and number formats; sorting rules; currencies; and so on).

Localization testing (localization testing189, l10n) checks the quality of the product's adaptation to a particular language and culture: the correctness and appropriateness of the translation, the formats, and the other locale-specific details. It can be executed only on the localized version of the product (compare with the previous kind, which checks the readiness for such adaptation).

Compatibility testing (compatibility testing, interoperability testing190) checks the application's ability to work in a given environment and/or together with other applications. It includes, in particular:
o checking the behavior of the application on different hardware and software platforms (in this sense it is also called configuration testing191, portability testing);

185 Security testing. Testing to determine the security of the software product. [ISTQB Glossary]
186 What is Security testing in software testing? [http://istqbexamcertification.com/what-is-security-testing-in-software/]
187 Globalization. The process of developing a program core whose features and code design are not solely based on a single
language or locale. Instead, their design is developed for the input, display, and output of a defined set of Unicode-supported
language scripts and data related to specific locales. [Globalization Step-by-Step, https://msdn.microsoft.com/en-
us/goglobal/bb688112]
188 Localizability. The design of the software code base and resources such that a program can be localized into different language
editions without any changes to the source code. [Globalization Step-by-Step, https://msdn.microsoft.com/en-
us/goglobal/bb688112]
189 Localization testing checks the quality of a product's localization for a particular target culture/locale. This test is based on the
results of globalization testing, which verifies the functional support for that particular culture/locale. Localization testing can be
executed only on the localized version of a product. [Localization Testing, https://msdn.microsoft.com/en-us/li-
brary/aa292138%28v=vs.71%29.aspx]
190 Compatibility Testing, Interoperability Testing. The process of testing to determine the interoperability of a software product
(the capability to interact with one or more specified components or systems). [ISTQB Glossary]
191 Configuration Testing, Portability Testing. The process of testing to determine the portability of a software product (the ease
with which the software product can be transferred from one hardware or software environment to another). [ISTQB Glossary]



o checking correct operation in different browsers (cross-browser testing192) (see also web application testing{78});
o checking operation on mobile devices (mobile testing193) (see also mobile application testing{78});
o checking compliance with standards, conventions, laws, and other regulations (compliance testing194, conformance testing, regulation testing).

Good introductions to the mobile specifics are "What is Mobile Testing?"195 and "Beginners Guide to Mobile Application Testing"196.

Data quality testing (data quality197 testing) and database integrity testing (database integrity testing198) check the data and the methods and processes used to access and manage it: correctness with respect to business expectations, integrity, consistency, and the absence of corruption or unexpected deletion, updating, or creation of data during operations.

Resource utilization testing (resource utilization testing199, efficiency testing200, storage testing201) determines how the application consumes hardware resources (memory, disk space, processor time, and so on) and whether the consumption matches expectations. It is closely related to performance testing{86}.

192 Cross-browser testing helps you ensure that your web site or web application functions correctly in various web browsers.
Typically, QA engineers create individual tests for each browser or create tests that use lots of conditional statements that check
the browser type used and execute browser-specific commands. [http://support.smartbear.com/viewarticle/55299/]
193 Mobile testing is a testing with multiple operating systems (and different versions of each OS, especially with Android), multiple
devices (different makes and models of phones, tablets, phablets), multiple carriers (including international ones), multiple
speeds of data transference (3G, LTE, Wi-Fi), multiple screen sizes (and resolutions and aspect ratios), multiple input controls
(including BlackBerrys eternal physical keypads), and multiple technologies GPS, accelerometers that web and desktop
apps almost never use. [http://smartbear.com/all-resources/articles/what-is-mobile-testing/]
194 Compliance testing, Conformance testing, Regulation testing. The process of testing to determine the compliance of the
component or system (the capability to adhere to standards, conventions or regulations in laws and similar prescriptions). [ISTQB
Glossary]
195 What is Mobile Testing? [http://smartbear.com/all-resources/articles/what-is-mobile-testing/]
196 Beginners Guide to Mobile Application Testing [http://www.softwaretestinghelp.com/beginners-guide-to-mobile-application-
testing/]
197 Data quality. An attribute of data that indicates correctness with respect to some pre-defined criteria, e.g., business expectations,
requirements on data integrity, data consistency. [ISTQB Glossary]
198 Database integrity testing. Testing the methods and processes used to access and manage the data(base), to ensure access
methods, processes and data rules function as expected and that during access to the database, data is not corrupted or unex-
pectedly deleted, updated or created. [ISTQB Glossary]
199 Resource utilization testing, Storage testing. The process of testing to determine the resource-utilization of a software product.
[ISTQB Glossary]
200 Efficiency testing. The process of testing to determine the efficiency of a software product (the capability of a process to produce
the intended outcome, relative to the amount of resources used). [ISTQB Glossary]
201 Storage testing. This is a determination of whether or not certain processing conditions use more storage (memory) than esti-
mated. [Software Testing Concepts And Tools, Nageshwar Rao Pusuluri]



Comparison testing (comparison testing202) compares the strengths and weaknesses of the application with those of competing products.

Qualification testing (qualification testing203) is formal testing, usually conducted by the developer for the customer, to demonstrate that the software meets its specified requirements. Compare with acceptance testing{82}, which is driven by the customer's side.

Exhaustive testing (exhaustive testing204) is an approach in which the test suite covers all possible combinations of input values and preconditions. For any real application it is unattainable in practice and is of purely theoretical interest.

Reliability testing (reliability testing205) checks the application's ability to perform its functions under stated conditions for a specified period of time or number of operations.

Recoverability testing (recoverability testing206) checks the application's ability to re-establish the specified level of performance and to recover the data affected in case of a failure.

Failover testing (failover testing207) checks, by simulating or actually causing failures in a controlled environment, that the failover mechanisms work: data is not lost or corrupted, and the agreed service levels (function availability, response times) are maintained.

Performance testing (performance testing208) investigates how the performance characteristics of the application behave under various kinds of load. It includes, in particular:
o load testing (load testing209, capacity testing210) - evaluating the behavior of the application under increasing (expected, planned) load, to determine what load it can handle while still meeting its performance goals;
202 Comparison testing. Testing that compares software weaknesses and strengths to those of competitors' products. [Software
Testing and Quality Assurance, Jyoti J. Malhotra, Bhavana S. Tiple]
203 Qualification testing. Formal testing, usually conducted by the developer for the consumer, to demonstrate that the software
meets its specified requirements. [Software Testing Concepts And Tools, Nageshwar Rao Pusuluri]
204 Exhaustive testing. A test approach in which the test suite comprises all combinations of input values and preconditions. [ISTQB
Glossary]
205 Reliability Testing. The process of testing to determine the reliability of a software product (the ability of the software product to
perform its required functions under stated conditions for a specified period of time, or for a specified number of operations).
[ISTQB Glossary]
206 Recoverability Testing. The process of testing to determine the recoverability of a software product (the capability of the software
product to re-establish a specified level of performance and recover the data directly affected in case of failure). [ISTQB Glossary]
207 Failover Testing. Testing by simulating failure modes or actually causing failures in a controlled environment. Following a failure,
the failover mechanism is tested to ensure that data is not lost or corrupted and that any agreed service levels are maintained
(e.g., function availability or response times). [ISTQB Glossary]
208 Performance Testing. The process of testing to determine the performance of a software product. [ISTQB Glossary]
209 Load Testing. A type of performance testing conducted to evaluate the behavior of a component or system with increasing load,
e.g. numbers of parallel users and/or numbers of transactions, to determine what load can be handled by the component or
system. [ISTQB Glossary]
210 Capacity Testing. Testing to determine how many users and/or transactions a given system will support and still meet perfor-
mance goals. [https://msdn.microsoft.com/en-us/library/bb924357.aspx]



o scalability testing (scalability testing211) - checking the application's ability to be upgraded (e.g., by adding resources) to accommodate increased load;
o volume testing (volume testing212) - checking the application's behavior when it has to process large volumes of data;
o stress testing (stress testing213) - evaluating the application at or beyond the limits of its anticipated or specified workloads, or with reduced availability of resources (memory, servers, and so on). Stress testing answers the questions: how does the application behave in extreme conditions, at what point does it fail, how does it fail, and how does it recover. Testing aimed deliberately at breaking the application is also called destructive testing (destructive testing214) and correlates with negative testing{77};
o concurrency testing (concurrency testing215) - checking the application's behavior when two or more activities occur within the same interval of time (by interleaving or by simultaneous execution): simultaneous access of many users to the same data, parallel execution of operations, and so on.

In the broad sense, performance testing also involves the kinds mentioned above: resource utilization testing{85}, reliability testing{86}, recoverability testing{86}, failover testing{86}, and others.

A more detailed practical introduction to performance testing can be found in the dedicated article216.

211 Scalability Testing. Testing to determine the scalability of the software product (the capability of the software product to be
upgraded to accommodate increased loads). [ISTQB Glossary]
212 Volume Testing. Testing where the system is subjected to large volumes of data. [ISTQB Glossary]
213 Stress testing. A type of performance testing conducted to evaluate a system or component at or beyond the limits of its antici-
pated or specified workloads, or with reduced availability of resources such as access to memory or servers. [ISTQB Glossary]
214 Destructive software testing assures proper or predictable software behavior when the software is subject to improper usage or
improper input, attempts to crash a software product, tries to crack or break a software product, checks the robustness of a
software product. [Towards Destructive Software Testing, Kiumi Akingbehin]
215 Concurrency testing. Testing to determine how the occurrence of two or more activities within the same interval of time, achieved
either by interleaving the activities or by simultaneous execution, is handled by the component or system. [ISTQB Glossary]
216 Performance testing: an introduction [http://svyatoslav.biz/technologies/performance_testing/]



2.3.2.13. Other classifications of testing

(Positive and negative testing - see above{77}.)

By the degree of documentation, the following were considered above:
o scripted testing (see above{80});
o exploratory (and ad hoc) testing (see above{80}).

By the degree of intrusion into the application:
o intrusive testing (intrusive testing217) - testing in which the behavior of the application may differ from its behavior in the real environment because the testing itself affects it: additional code is embedded for testability, or monitoring processes run concurrently with the application and collect timing and processing information (see level of intrusion218). Intrusive testing can be considered a form of interrupt testing219 and correlates with negative testing{77} and stress testing{87};
o nonintrusive testing (nonintrusive testing220) - testing that is transparent to the application under test, i.e., does not change its timing or processing characteristics (information is usually collected by additional hardware or on another platform).
By the way test scenarios are described and executed (mostly in automated testing):
o data-driven testing (data-driven testing221) - a scripting technique in which the test input and the expected results are stored in a table or spreadsheet, so that a single control script can execute all the tests in the table;
o keyword-driven testing (keyword-driven testing222) - a scripting technique in which the data files contain not only test data and expected results but also keywords describing the actions to perform; the keywords are interpreted by special supporting scripts called by the control script;
o behavior-driven testing (behavior-driven testing223) - an approach focused on the behavior of the application rather than on its technical implementation: tests are written in a natural-language "given-when-then" style, understandable to non-technical participants, which lets analysts and managers take part in test creation and review.

A minimal data-driven sketch is shown right after this list.
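A minimal data-driven sketch (Python + pytest; the function and the data are illustrative, not from this book): the test data lives in a table, and one parametrized control test executes every row.

    import pytest

    def to_celsius(fahrenheit: float) -> float:
        """Illustrative function under test."""
        return (fahrenheit - 32) * 5 / 9

    # The "table": each row is (input, expected result).
    CASES = [
        (32.0, 0.0),
        (212.0, 100.0),
        (-40.0, -40.0),
    ]

    @pytest.mark.parametrize("fahrenheit, expected", CASES)
    def test_to_celsius(fahrenheit, expected):
        # One control script runs every row of the data table.
        assert to_celsius(fahrenheit) == pytest.approx(expected)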

217 Intrusive testing. Testing that collects timing and processing information during program execution that may change the behavior
of the software from its behavior in a real environment. Intrusive testing usually involves additional code embedded in the software
being tested or additional processes running concurrently with software being tested on the same processor. [http://encyclope-
dia2.thefreedictionary.com/intrusive+testing]
218 Level of intrusion. The level to which a test object is modified by adjusting it for testability. [ISTQB Glossary]
219 Intrusive testing can be considered a type of interrupt testing, which is used to test how well a system reacts to intrusions and
interrupts to its normal workflow. [http://www.techopedia.com/definition/7802/intrusive-testing]
220 Nonintrusive Testing. Testing that is transparent to the software under test, i.e., does not change its timing or processing char-
acteristics. Nonintrusive testing usually involves additional hardware that collects timing or processing information and processes
that information on another platform. [http://encyclopedia2.thefreedictionary.com/nonintrusive+testing]
221 Data-driven Testing (DDT). A scripting technique that stores test input and expected results in a table or spreadsheet, so that a
single control script can execute all of the tests in the table. Data-driven testing is often used to support the application of test
execution tools such as capture/playback tools. [ISTQB Glossary]
222 Keyword-driven Testing (KDT). A scripting technique that uses data files to contain not only test data and expected results, but
also keywords related to the application being tested. The keywords are interpreted by special supporting scripts that are called
by the control script or the test. [ISTQB Glossary]
223 Behavior-driven Testing (BDT). Behavior-driven Tests focuses on the behavior rather than the technical implementation of the
software. If you want to emphasize on business point of view and requirements then BDT is the way to go. BDT are Given-when-
then style tests written in natural language which are easily understandable to non-technical individuals. Hence these tests allow
business analysts and management people to actively participate in test creation and review process. [Jyothi Rangaiah,
http://www.womentesters.com/behaviour-driven-testing-an-introduction/]



By the (experience-based) way the checks are invented:
o error guessing (error guessing224) - a technique in which the tester's experience is used to anticipate what defects might be present in the application as a result of typical mistakes, and to design checks specifically to expose them; a closely related idea is failure-directed testing (failure-directed testing225), based on the knowledge of the types of errors made in the past;
o heuristic evaluation (heuristic evaluation226) - a usability{83} review technique in which reviewers examine the interface and judge its compliance with recognized usability principles ("heuristics");
o mutation testing (mutation testing227) - executing two or more variants of the application (or of a component) with the same inputs and comparing the outputs, with the discrepancies analyzed; in the classical use, small deliberate changes (mutations) are introduced into the code to check whether the test suite notices them;
o error seeding (error seeding228) - intentionally adding known defects to those already present in the application, in order to monitor the rate of their detection and removal and to estimate the number of defects remaining (compare with the previous technique).

A sketch of the mutation idea follows this list.
By the technique of choosing the checked values:
o equivalence partitioning (equivalence partitioning229) - the input (and/or output) domain is divided into classes whose elements, from the point of view of the checked logic, are processed identically, and test cases are designed so that each class is covered at least once (one representative value per class is usually enough);

224 Error Guessing. A test design technique where the experience of the tester is used to anticipate what defects might be present
in the component or system under test as a result of errors made, and to design tests specifically to expose them. [ISTQB
Glossary]
225 Failure-directed Testing. Software testing based on the knowledge of the types of errors made in the past that are likely for the
system under test. [http://dictionary.reference.com/browse/failure-directed+testing].
226 Heuristic Evaluation. A usability review technique that targets usability problems in the user interface or user interface design.
With this technique, the reviewers examine the interface and judge its compliance with recognized usability principles (the heu-
ristics). [ISTQB Glossary]
227 Mutation Testing, Back-to-Back Testing. Testing in which two or more variants of a component or system are executed with the
same inputs, the outputs compared, and analyzed in cases of discrepancies. [ISTQB Glossary]
228 Error seeding. The process of intentionally adding known faults to those already in a computer program for the purpose of
monitoring the rate of detection and removal, and estimating the number of faults remaining in the program. [ISTQB Glossary]
229 Equivalence partitioning. A black box test design technique in which test cases are designed to execute representatives from
equivalence partitions. In principle test cases are designed to cover each partition at least once. [ISTQB Glossary]



o boundary value analysis (boundary value analysis230) - test cases are designed on the basis of boundary values: values on the edge of an equivalence partition or at the smallest incremental distance on either side of the edge (e.g., the minimum and maximum of a range); experience shows that defects cluster at boundaries, which makes this technique the natural complement of equivalence partitioning (a sketch of both follows this list);
o domain analysis (domain analysis231, domain testing) - a technique for identifying efficient and effective test cases when multiple variables can or should be tested together; it builds on and generalizes equivalence partitioning and boundary value analysis;
o pairwise testing (pairwise testing232) - test cases are designed to cover all possible discrete combinations of the values of each pair of input parameters; the generalization to combinations of any n parameters is called n-wise testing (n-wise testing233) (it yields more combinations, though still far fewer than the full Cartesian product).

Do not confuse pairwise testing (pairwise testing232) with pair testing (pair testing234), in which two people work together at one computer to find defects!

o orthogonal array testing (orthogonal array testing235) - a systematic way of covering all pairwise combinations of variables using orthogonal arrays; it significantly reduces the number of combinations needed to cover all the pairs.

By the way, do not confuse an orthogonal array236 with an orthogonal matrix237: they are different mathematical objects.
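A minimal sketch of equivalence partitioning and boundary value analysis (Python; the pricing rule is illustrative, not from this book): the valid age domain 0..120 splits into classes, and the checked values are one representative per class plus the boundaries of each class and of the domain itself.

    def ticket_price(age: int) -> int:
        """Illustrative rule: 0-12 pay 5, 13-59 pay 10, 60-120 pay 7."""
        if not 0 <= age <= 120:
            raise ValueError("invalid age")
        if age <= 12:
            return 5
        if age <= 59:
            return 10
        return 7

    # Equivalence classes: invalid-low, 0-12, 13-59, 60-120, invalid-high.
    # One representative per valid class:
    assert ticket_price(6) == 5
    assert ticket_price(30) == 10
    assert ticket_price(80) == 7

    # Boundary values: the edges of each class.
    for age, expected in [(0, 5), (12, 5), (13, 10), (59, 10), (60, 7), (120, 7)]:
        assert ticket_price(age) == expected

    # Just outside the valid domain (negative checks):
    for bad_age in (-1, 121):
        try:
            ticket_price(bad_age)
            raise AssertionError("invalid age must be rejected")
        except ValueError:
            pass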

230 Boundary value analysis. A black box test design technique in which test cases are designed based on boundary values (input
values or output values which are on the edge of an equivalence partition or at the smallest incremental distance on either side
of an edge, for example the minimum or maximum value of a range). [ISTQB Glossary]
231 Domain analysis. A black box test design technique that is used to identify efficient and effective test cases when multiple varia-
bles can or should be tested together. It builds on and generalizes equivalence partitioning and boundary values analysis. [ISTQB
Glossary]
232 Pairwise testing. A black box test design technique in which test cases are designed to execute all possible discrete combinations
of each pair of input parameters. [ISTQB Glossary]
233 N-wise testing. A black box test design technique in which test cases are designed to execute all possible discrete combinations
of any set of n input parameters. [ISTQB Glossary]
234 Pair testing. Two persons, e.g. two testers, a developer and a tester, or an end-user and a tester, working together to find defects.
Typically, they share one computer and trade control of it while testing. [ISTQB Glossary]
235 Orthogonal array testing. A systematic way of testing all-pair combinations of variables using orthogonal arrays. It significantly
reduces the number of all combinations of variables to test all pair combinations. See also combinatorial testing, n-wise testing,
pairwise testing. [ISTQB Glossary]
236 Orthogonal array, Wikipedia. [http://en.wikipedia.org/wiki/Orthogonal_array]
237 Orthogonal matrix, Wikipedia. [http://en.wikipedia.org/wiki/Orthogonal_matrix]



These techniques are closely related to combinatorial testing{102}, which is considered below among the less frequently mentioned kinds of testing.

The listed techniques are described in detail, with worked examples, in Lee Copeland's book "A Practitioner's Guide to Software Test Design":
chapter 3 - equivalence class testing;
chapter 4 - boundary value testing;
chapter 8 - domain analysis testing;
chapter 6 - pairwise testing.

! Reading this book is strongly recommended: it covers the classical test design techniques better than most other sources. A sketch of the pairwise idea follows.

By the moment and the performer of the testing:
o development testing (development testing238) - formal or informal testing conducted during implementation, usually in the development environment and by the developers themselves; compare with alpha testing{83};
o acceptance testing (see above{83}).

By the use of the application's code, code-based testing (code based testing) is distinguished. Its test cases are designed from the knowledge of the internal structure of the application, and the names of its techniques usually point at the element of the structure being covered. In particular:
o control flow testing (control flow testing239) - test cases are designed to execute specific sequences of events (paths through the control flow); the concrete techniques (decision testing, condition testing, path testing, and others) are considered below{92};
o data flow testing (data-flow testing240) - test cases are designed to execute definition-use pairs of variables; it is considered in more detail below (see also the ISO/IEC/IEEE 29119-4 classification{103});

238 Development testing. Formal or informal testing conducted during the implementation of a component or system, usually in the
development environment by developers. [ISTQB Glossary]
239 Control Flow Testing. An approach to structure-based testing in which test cases are designed to execute specific sequences of
events. Various techniques exist for control flow testing, e.g., decision testing, condition testing, and path testing, that each have
their specific approach and level of control flow coverage. [ISTQB Glossary]
240 Data Flow Testing. A white box test design technique in which test cases are designed to execute definition-use pairs of variables.
[ISTQB Glossary]



o state transition testing (state transition testing241) - test cases are designed to execute valid and invalid transitions between the states of the application; the model used is a state diagram (state diagram242) or a state table (state table243). A good introduction is the article "What is State transition testing in software testing?"244. The technique is also associated with finite state machine testing (finite state machine245 testing): the application is treated as a finite automaton, which the tester may study with or without looking at the code, so the technique serves both white-box and black-box contexts (a small sketch follows this list);
o code review (code review, code inspection246) - a systematic examination of the source code aimed at finding defects and improving quality; it exists in many forms, from an informal walkthrough to a formal, documented inspection, and can be considered the "most static" of the code-based techniques: the code is read, not executed.
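A minimal sketch of state transition testing (Python; the two-state model of a document, draft/published, is illustrative): the model lists the valid transitions, and the checks exercise one valid and one invalid transition.

    # Illustrative state model: {state: {event: next_state}}
    TRANSITIONS = {
        "draft":     {"publish": "published"},
        "published": {"withdraw": "draft"},
    }

    def apply_event(state: str, event: str) -> str:
        """Return the next state or reject an invalid transition."""
        try:
            return TRANSITIONS[state][event]
        except KeyError:
            raise ValueError(f"invalid transition: {state} --{event}-->")

    # Valid transition (positive check):
    assert apply_event("draft", "publish") == "published"

    # Invalid transition (negative check): publishing twice is rejected.
    try:
        apply_event("published", "publish")
        raise AssertionError("expected the invalid transition to be rejected")
    except ValueError:
        pass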
Structure-based techniques (structure-based techniques) are usually described through the element of the code structure that the test cases must cover:
o statement testing (statement testing247) - test cases execute the statements of the code (a statement is the smallest indivisible unit of execution); coverage is measured as the share of executed statements;
o branch testing (branch testing248) - test cases execute the branches of the code (a branch is a block that can be selected for execution by a control construct such as if-then-else, case, jump);

241 State Transition Testing. A black box test design technique in which test cases are designed to execute valid and invalid state
transitions. [ISTQB Glossary]
242 State Diagram. A diagram that depicts the states that a component or system can assume, and shows the events or circumstances
that cause and/or result from a change from one state to another. [ISTQB Glossary]
243 State Table. A grid showing the resulting transitions for each state combined with each possible event, showing both valid and
invalid transitions. [ISTQB Glossary]
244 What is State transition testing in software testing? [http://istqbexamcertification.com/what-is-state-transition-testing-in-soft-
ware-testing/]
245 Finite State Machine. A computational model consisting of a finite number of states and transitions between those states, possibly
with accompanying actions. [ISTQB Glossary]
246 Inspection. A type of peer review that relies on visual examination of documents to detect defects, e.g. violations of development
standards and non-conformance to higher level documentation. The most formal review technique and therefore always based
on a documented procedure. [ISTQB Glossary]
247 Statement Testing. A white box test design technique in which test cases are designed to execute statements (statement is an
entity in a programming language, which is typically the smallest indivisible unit of execution). [ISTQB Glossary]
248 Branch Testing. A white box test design technique in which test cases are designed to execute branches (branch is a basic block
that can be selected for execution based on a program construct in which one of two or more alternative program paths is
available, e.g. case, jump, go to, if-then-else.). [ISTQB Glossary]



o condition testing (condition testing249) - test cases execute the outcomes of each elementary logical condition (an expression such as a > b that evaluates to true or false);
o multiple condition testing (multiple condition testing250) - test cases execute all the combinations of outcomes of the elementary conditions within one statement;
o modified condition decision coverage testing (modified condition decision coverage testing251, MC/DC) - test cases execute those condition outcomes that independently affect the decision outcome, discarding the combinations that do not affect the final result;
o decision testing (decision testing252) - test cases execute the outcomes of decisions (a decision is a point at which the control flow has two or more alternative routes); decision coverage is stronger than statement coverage;
o path testing (path testing253) - test cases execute the paths through the code; full path coverage is the strongest (and, in the general case, unattainable) of the listed criteria.

The terminology here varies noticeably from author to author, and even within ISTQB materials the terms "condition" and "decision" are easy to confuse; a good explanation of the difference is the article "What is the difference between a Decision and a Condition?"254.

These techniques are compared on small code examples in table 2.3.c.

249 Condition Testing. A white box test design technique in which test cases are designed to execute condition outcomes (condition
is a logical expression that can be evaluated as True or False, e.g. A > B). [ISTQB Glossary]
250 Multiple Condition Testing. A white box test design technique in which test cases are designed to execute combinations of single
condition outcomes (within one statement). [ISTQB Glossary]
251 Modified Condition Decision Coverage Testing. Technique to design test cases to execute branch condition outcomes that
independently affect a decision outcome and discard conditions that do not affect the final outcome. [Guide to Advanced Soft-
ware Testing, Second Edition, Anne Mette Hass].
252 Decision Testing. A white box test design technique in which test cases are designed to execute decision outcomes (decision is
program point at which the control flow has two or more alternative routes, e.g. a node with two or more links to separate
branches). [ISTQB Glossary]
253 Path testing. A white box test design technique in which test cases are designed to execute paths. [ISTQB Glossary]
254 What is the difference between a Decision and a Condition? [http://www-01.ibm.com/support/docview.wss?uid=swg21129252]



Table 2.3.c - Structure-based test design techniques (synonymous English names are given in parentheses)

Statement testing - covers individual statements, e.g., x = 10.
Branch testing - covers the branches of control structures.
Condition testing (Branch Condition Testing) - covers the outcomes of each elementary condition, e.g., both outcomes of a == b in if (a == b).
Multiple condition testing (Branch Condition Combination Testing) - covers all combinations of outcomes of the conditions within one statement, e.g., all four combinations in if ((a == b) || (c == d)).
Modified Condition Decision Coverage Testing (MC/DC) - covers the condition outcomes that independently flip the decision, e.g., for if ((x == y) && (n == m)) it is enough to check the combinations in which each condition alone turns the result false, instead of all combinations.
Decision testing - covers the outcomes of decision points, e.g., all the branches of a switch.
Path testing - covers the paths through the code.
By the use of models of the application's behavior (application behavior/model-based testing):
o decision table testing (decision table testing255) - test cases are designed to execute the combinations of inputs/stimuli (causes) and outputs/actions (effects) recorded in a decision table; the table itself is a compact way to express business rules (a sketch is given after this list);
o state transition testing (see above{92});
o specification-based testing (specification-based testing, black box testing) (see above{69});
o model-based testing (model-based testing256) - the generalization: testing based on any model of the application, e.g., reliability growth models, usage models such as operational profiles{95}, or behavioral models such as decision tables{94} or state transition diagrams{92};
o use case testing (use case testing257) - test cases are designed to execute the scenarios of use cases; since use cases describe the interaction of the user with the application in terms of user-visible value, the resulting checks are close to real-life usage
255 Decision Table Testing. A black box test design technique in which test cases are designed to execute the combinations of
inputs and/or stimuli (causes) shown in a decision table (a table showing combinations of inputs and/or stimuli (causes) with their
associated outputs and/or actions (effects), which can be used to design test cases). [ISTQB Glossary]
256 Model-based Testing. Testing based on a model of the component or system under test, e.g., reliability growth models, usage
models such as operational profiles or behavioral models such as decision table or state transition diagram. [ISTQB Glossary]
257 Use case testing. A black box test design technique in which test cases are designed to execute scenarios of use cases. [ISTQB
Glossary]



{89}. A use case is a requirements artifact{37}, so this technique connects testing directly with the requirements. In agile projects, the similar technique built on user stories is called user story testing (user story testing258);
o parallel testing (parallel testing259) - a new or altered data processing system is tested with the same source data that is used in another system, and the other system serves as the standard of comparison; do not confuse this with the parallel execution of tests, and compare with comparison testing{89};
o random testing (random testing260) - test cases are selected, possibly by a pseudo-random generation algorithm, to match an operational profile (operational profile261): a representation of the distribution of the tasks the application performs; the technique is also used for non-functional attributes such as reliability and performance. Its informal variant, with random inputs and random button-pushing regardless of how the product is really used, is known as monkey testing (monkey testing262);
o A/B testing (A/B testing, split testing263) - a controlled experiment for establishing a causal relationship between changes and their influence on user-observable behavior{83}: two comparable groups of users are given two variants of the product, and the difference in their behavior shows which variant is better.

Several of these techniques are also described in detail in Lee Copeland's "A Practitioner's Guide to Software Test Design":
chapter 5 - decision table testing;
chapter 7 - state-transition testing;
chapter 9 - use case testing.
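A minimal sketch of decision table testing (Python; the business rule is illustrative, not from this book): the decision table is written down as data, and the checks iterate over its rules.

    def shipping_cost(total: float, is_member: bool) -> float:
        """Illustrative business rule implementation."""
        if is_member or total >= 100:
            return 0.0
        return 9.99

    # Decision table: one row per rule (conditions -> expected action).
    DECISION_TABLE = [
        # total, is_member, expected cost
        (150.0, False, 0.0),   # rule 1: big order => free shipping
        (50.0,  True,  0.0),   # rule 2: member => free shipping
        (50.0,  False, 9.99),  # rule 3: small order, non-member => paid
        (100.0, False, 0.0),   # rule 4: boundary of "big order"
    ]

    for total, member, expected in DECISION_TABLE:
        assert shipping_cost(total, member) == expected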

258 User story testing. A black box test design technique in which test cases are designed based on user stories to verify their correct
implementation. [ISTQB Glossary]
259 Parallel testing. Testing a new or an altered data processing system with the same source data that is used in another system.
The other system is considered as the standard of comparison. [ISPE Glossary]
260 Random testing. A black box test design technique where test cases are selected, possibly using a pseudo-random generation
algorithm, to match an operational profile. This technique can be used for testing non-functional attributes such as reliability and
performance. [ISTQB Glossary]
261 Operational profile. The representation of a distinct set of tasks performed by the component or system, possibly based on user
behavior when interacting with the component or system, and their probabilities of occurrence. A task is logical rather that physical
and can be executed over several machines or be executed in non-contiguous time segments. [ISTQB Glossary]
262 Monkey testing. Testing by means of a random selection from a large range of inputs and by randomly pushing buttons, ignorant
of how the product is being used. [ISTQB Glossary]
263 Split testing is a design for establishing a causal relationship between changes and their influence on user-observable behavior.
[Controlled experiments on the web: survey and practical guide, Ron Kohav]



2.3.2.14. Ways of combining and sequencing the kinds of testing

When an application consists of many components, the question arises in what order to develop, integrate, and test them. The classical illustration: a high-level component calls a lower-level one, which calls a yet lower one; if development goes "from the top", the missing lower-level components have to be replaced with stubs, and if "from the bottom", the missing higher-level components have to be replaced with drivers. At any moment each component is in one of four states:
1) ready and tested;
2) ready but not tested;
3) being developed;
4) not yet started.

Depending on the direction in which this process moves, the following approaches to integration testing{72} are distinguished:
o bottom-up testing (bottom-up testing264) - the lowest-level components are developed and tested first and are then used to facilitate the testing of higher-level ones; the process repeats until the component at the top of the hierarchy is tested;
o top-down testing (top-down testing265) - the component at the top of the hierarchy is tested first, with the lower-level components simulated by stubs; the tested components are then used to test the lower-level ones, and the process repeats down the hierarchy;
o hybrid testing (hybrid testing266, sandwich testing) - the combination of the two: integration proceeds from both ends toward the middle, which, among other things, supports the early release of limited functionality.

Which approach to choose is decided per project, taking into account which components are ready, how expensive the stubs and drivers are, and how early the integrated functionality must be demonstrable. A sketch of the stub idea follows.
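A minimal sketch of top-down integration with a stub (Python; all names are illustrative): the high-level component is tested before the real low-level one exists, with a stub standing in for it.

    class PriceServiceStub:
        """Stub for a lower-level component that is not ready yet."""
        def get_price(self, item_id: str) -> float:
            return 10.0                  # canned answer, no real logic

    class Cart:
        """Higher-level component under test (top of the hierarchy)."""
        def __init__(self, price_service):
            self.price_service = price_service
            self.items = []

        def add(self, item_id: str):
            self.items.append(item_id)

        def total(self) -> float:
            return sum(self.price_service.get_price(i) for i in self.items)

    # Top-down integration check: Cart is real, PriceService is a stub.
    cart = Cart(PriceServiceStub())
    cart.add("a")
    cart.add("b")
    assert cart.total() == 20.0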

264 Bottom-up testing. An incremental approach to integration testing where the lowest level components are tested first, and then
used to facilitate the testing of higher level components. This process is repeated until the component at the top of the hierarchy
is tested. [ISTQB Glossary]
265 Top-down testing. An incremental approach to integration testing where the component at the top of the component hierarchy is
tested first, with lower level components being simulated by stubs. Tested components are then used to test lower level compo-
nents. The process is repeated until the lowest level components have been tested. [ISTQB Glossary]
266 Hybrid testing, Sandwich testing. First, the inputs for functions are integrated in the bottom-up pattern discussed above. The
outputs for each function are then integrated in the top-down manner. The primary advantage of this approach is the degree of
support for early release of limited functionality. [Integration testing techniques, Kratika Parmar]



In real projects, testing most often proceeds through sequences of stages in which the kinds of testing follow one another in a natural order of increasing depth and scope. Three such typical sequences (which can be combined with one another, e.g., sequences 1 and 2) are:
o Sequence 1:
1) smoke testing{74};
2) critical path testing{75};
3) extended testing{76}.
o Sequence 2:
1) unit testing{72};
2) integration testing{72};
3) system testing{73}.
o Sequence 3:
1) alpha testing{79};
2) beta testing{79};
3) gamma testing{79}.

Thus the kinds of testing considered above do not exist in isolation: a real process combines them, and the choice of the combination is made per project.


2.3.3. Testing classifications from other well-known sources

To complete the picture, it is worth looking at how the kinds of testing are classified in other well-known sources. The first pair of figures (2.3.i and 2.3.j) reproduces the classification from the book "Foundations of Software Testing: ISTQB Certification" (Erik Van Veenendaal, Isabel Evans); the second pair (2.3.k and 2.3.l) reproduces the classification from the ISO/IEC/IEEE 29119-4 standard. The original terminology is preserved; note that the sources group the same techniques differently, which once again shows that no single "correct" classification exists.

Figure 2.3.i - Classification of testing from "Foundations of Software Testing: ISTQB Certification" (Erik Van Veenendaal, Isabel Evans) (part one)



Testing Techniques
  Static
    Informal Review
    Walkthrough
    Technical Review
    Inspection
    Static Analysis
      Control Flow
      Data Flow
  Dynamic
    Experience-based
      Error Guessing
      Exploratory Testing
    Specification-based
      Equivalence Partitioning
      Boundary Value Analysis
      Decision Tables
      State Transition
      Use Case Testing
    Structure-based
      Statement
      Decision
      Condition
      Multiple Condition

Figure 2.3.j - Classification of testing techniques from "Foundations of Software Testing: ISTQB Certification" (Erik Van Veenendaal, Isabel Evans) (part two)

The classification from the ISO/IEC/IEEE 29119-4 standard is shown in figures 2.3.k and 2.3.l.







Figure 2.3.k - Classification of testing from ISO/IEC/IEEE 29119-4 (part one)

Figure 2.3.l - Classification of testing from ISO/IEC/IEEE 29119-4 (part two)



Classification tree method (classification tree267 method268) - test cases (often built with specialized tools) are designed on the basis of a classification tree: a hierarchically ordered set of equivalence partitions of the input and/or output domains.

Syntax testing (syntax testing269) - test cases are designed based upon the formal definition (grammar) of the input and/or output domain.

Combinatorial testing (combinatorial testing270) - a family of techniques for identifying a suitable subset of test combinations when the parameters and their values give rise to more combinations than it is feasible to execute in the time allowed. It includes:
o all combinations testing (all combinations testing271) - all possible combinations of all values of all parameters (the theoretical maximum, rarely feasible);
o pairwise testing (see above{90});
o each choice testing (each choice testing272) - each value of each parameter must be used in at least one test case;
o base choice testing (base choice testing273) - a "base" value is chosen for each parameter, forming a base test; subsequent tests are obtained by varying one parameter at a time while holding the others at their base values; the base choice is usually the most typical, frequent, or important value.

These techniques generalize what was said above about pairwise and orthogonal array testing{89}.

Cause-effect graphing (cause-effect graphing274) - test cases are designed from cause-effect graphs: graphical representations of inputs/stimuli (causes) with their associated outputs (effects); building the graph also helps discover contradictions and gaps in the requirements.

267 Classification tree. A tree showing equivalence partitions hierarchically ordered, which is used to design test cases in the classi-
fication tree method. [ISTQB Glossary]
268 Classification tree method. A black box test design technique in which test cases, described by means of a classification tree,
are designed to execute combinations of representatives of input and/or output domains. [ISTQB Glossary]
269 Syntax testing. A black box test design technique in which test cases are designed based upon the definition of the input domain
and/or output domain. [ISTQB Glossary]
270 Combinatorial testing. A means to identify a suitable subset of test combinations to achieve a predetermined level of coverage
when testing an object with multiple parameters and where those parameters themselves each have several values, which gives
rise to more combinations than are feasible to test in the time allowed. [ISTQB Glossary]
271 All combinations testing. Testing of all possible combinations of all values for all parameters. [Guide to advanced software
testing, 2nd edition, Anne Matte Hass].
272 Each choice testing. One value from each block for each partition must be used in at least one test case. [Introduction to
Software Testing. Chapter 4. Input Space Partition Testing, Paul Ammann & Jeff Offutt]
273 Base choice testing. A base choice block is chosen for each partition, and a base test is formed by using the base choice for
each partition. Subsequent tests are chosen by holding all but one base choice constant and using each non-base choice in
each other parameter. [Introduction to Software Testing. Chapter 4. Input Space Partition Testing, Paul Ammann & Jeff Offutt]
274 Cause-effect graphing. A black box test design technique in which test cases are designed from cause-effect graphs (a graphical
representation of inputs and/or stimuli (causes) with their associated outputs (effects), which can be used to design test cases).
[ISTQB Glossary]



Data flow testing (data-flow testing275) - a structure-based technique in which test cases are designed to cover the "lifecycle events" of variables: the places where a variable receives a value and the places where that value is used. The intuition behind it: defects are most likely where a value is defined and then used incorrectly (defined but never used; used before being defined; overwritten before being used; and so on).

The classical lifecycle stages of a variable (shown for a variable x in C-like pseudocode):
declaration: int x;
definition (d-use): x = 99;
computation use (c-use): z = x + 1;
predicate use (p-use): if (x > 17) { };
kill (k-use): x = null;

On the basis of these events, chapters 3.3 and 5 of "Software Testing Techniques" (Second Edition, Boris Beizer) define a hierarchy of data-flow coverage strategies:
o all-definitions testing (all-definitions testing276) - the test set covers every definition of every variable by at least one use (c-use or p-use) of that variable;
o all-c-uses testing (all-c-uses testing277) - for every variable and every definition of it, at least one definition-free path from the definition to every computation use is included;
o all-p-uses testing (all-p-uses testing278) - likewise, to every predicate use;
o all-uses testing (all-uses testing279) - the test set includes at least one path segment from every definition to every use reachable from that definition;
o all-du-paths testing (all-du-paths testing280) - the test set includes every definition-use path from every definition of every variable to every use of that definition (the strongest of the listed strategies).

A small sketch of a definition-use pair follows this list.

275 Data flow testing. A white box test design technique in which test cases are designed to execute definition-use pairs of variables.
[ISTQB Glossary]
276 All-definitions strategy. Test set requires that every definition of every variable is covered by at least one use of that variable (c-
use or p-use). [Software Testing Techniques, Second Edition, Boris Beizer]
277 All-computation-uses strategy. For every variable and every definition of that variable, include at least one definition-free path
from the definition to every computation use. [Software Testing Techniques, Second Edition, Boris Beizer]
278 All-predicate-uses strategy. For every variable and every definition of that variable, include at least one definition-free path from
the definition to every predicate use. [Software Testing Techniques, Second Edition, Boris Beizer]
279 All-uses strategy. Test set includes at least one path segment from every definition to every use that can be reached by that
definition. [Software Testing Techniques, Second Edition, Boris Beizer]
280 All-DU-path strategy. Test set includes every du path from every definition of every variable to every use of that definition.
[Software Testing Techniques, Second Edition, Boris Beizer]




Beizer's book also gives a diagram of the relative strength of these strategies ("Figure 5.7. Relative Strength of Structural Test Strategies"); its essence is reproduced in figure 2.3.m: each strategy subsumes the ones below it.

Figure 2.3.m - Relative strength of structural test strategies (from the strongest to the weakest):
All-Paths
All-DU-Paths
All-Uses
All-C-Uses/Some-P-Uses and All-P-Uses/Some-C-Uses
All-C-Uses, All-Defs, All-P-Uses
Branch
Statement


2.3.4. Summary table of the kinds of testing

Table 2.3.d below gathers in one place the kinds of testing considered in this chapter, with their English names and references to the pages where they are described.

! Remember that this table reflects the terminology of this book and of ISTQB materials; other sources may name and group the same kinds of testing differently. If a term is unclear, follow the page reference and re-read the corresponding description; the table is convenient for self-checking and for preparing for interviews.

Table 2.3.d - Summary of the kinds of testing (English names with page references)
Static testing {68}
Dynamic testing {68}
Manual testing {70}
Automated testing {71}
Unit testing (module testing, component testing) {72}
Integration testing {72}
System testing {73}
Smoke test (intake test, build verification test) {74}
Critical path test {75}
Extended test {76}
Positive testing {77}
Negative testing (invalid testing) {77}
Web applications testing {78}
Mobile applications testing {78}
Desktop applications testing {78}
Presentation tier testing {78}
Business logic tier testing {78}
Data tier testing {78}
Alpha testing {79}
Beta testing {79}
Gamma testing {79}
Scripted testing (test case based testing) {79}
Exploratory testing {80}
Ad hoc testing {80}
Functional testing {80}
Non-functional testing {81}
Installation testing {81}
Regression testing {82}
Re-testing (confirmation testing) {82}
Acceptance testing {82}
Operational testing {83}
Usability testing {83}
Accessibility testing {83}
Interface testing {83}
Security testing {84}
Internationalization testing {84}
Localization testing {84}
Compatibility testing {84}
Configuration testing {84}
Cross-browser testing {85}
Data quality testing and database integrity testing {85}
Resource utilization testing {85}
Comparison testing {86}
Qualification testing {86}
Exhaustive testing {86}
Reliability testing {86}
Recoverability testing {86}
Failover testing {86}
Performance testing {86}
Load testing (capacity testing) {86}
Scalability testing {87}
Volume testing {87}
Stress testing {87}
Concurrency testing {87}
Intrusive testing {88}
Nonintrusive testing {88}
Data-driven testing {88}
Keyword-driven testing {88}
Error guessing {89}
Heuristic evaluation {89}
Mutation testing {89}
Error seeding {89}
Equivalence partitioning {89}
Boundary value analysis {90}
Domain testing (domain analysis) {90}
Pairwise testing {90}
Orthogonal array testing {90}
Development testing {91}
Control flow testing {91}
Data flow testing {91}
State transition testing {92}
Code review (code inspection) {92}
Statement testing {92}
Branch testing {92}
Condition testing {93}
Multiple condition testing {93}
Modified condition decision coverage testing {93}
Decision testing {93}
Path testing {93}
Decision table testing {94}
Model-based testing {94}
Use case testing {94}
Parallel testing {95}
Random testing {95}
A/B testing (split testing) {95}
Bottom-up testing {96}
Top-down testing {96}
Hybrid testing {96}
Classification tree method {102}
Syntax testing {102}
Combinatorial testing {102}
All combinations testing {102}
Each choice testing {102}
Base choice testing {102}
Cause-effect graphing {102}
All-definitions testing {103}
All-c-uses testing {103}
All-p-uses testing {103}
All-uses testing {103}
All-du-paths testing {103}



2.4. Checklists, test cases, test suites

2.4.1. Checklists

Sooner or later, the ideas about what should be checked in an application have to be written down in an orderly way; the simplest artifact for that is the checklist.

A checklist (checklist281) in testing is a list of checks282: an enumeration of what should be verified, formulated at any convenient level of detail.

A checklist becomes truly useful when its items follow a clear logic:
the items are grouped by the feature or area they relate to, so that related checks sit together;
the items are ordered so that executing them one after another is convenient (for example, following the user's typical workflow);
the level of detail is consistent within one list (its items answer questions of the same scale).

Checklists are often created collectively, for example during brainstorming sessions; tools such as mind maps283 and concept maps284 are convenient for gathering and structuring the ideas before they become a linear list.

A checklist should be kept free of duplicate and useless items: every line costs execution time. On the other hand, it is dangerous to compress a checklist so much that its items stop being understandable to anyone but the author.

The main danger when writing a checklist is the temptation to record not checks but fragments of the requirements: a checklist answers the question "what shall we check?", not "how does the application work?" (see also the section on requirements testing{41}).

281 In the broad sense, a checklist is any list of items or actions to be verified; checklists are used far beyond testing (in aviation, medicine, construction, and so on). In testing, the items of a checklist are checks.
282 Formally, the structure of a checklist can even be described with a grammar; see Extended Backus-Naur form, Wikipedia [https://en.wikipedia.org/wiki/Extended_Backus%E2%80%93Naur_form].
283 Mind map, Wikipedia [http://en.wikipedia.org/wiki/Mind_map]
284 Concept map, Wikipedia [http://en.wikipedia.org/wiki/Concept_map]


(Where useful, checklist items may be marked with the kind of check they represent: negative{77} checks, performance checks, and so on.)

The level of detail of the items is a matter of agreement: the same checklist may exist in a compressed form (a few words per item, enough for an experienced tester) and in an expanded form approaching full test cases. What matters is that the chosen level is applied consistently and suits the people who will execute the list.

Task 2.4.a: re-read the section on requirements testing{41} - many of the checks formulated there are, in essence, ready checklist items. Using the description of the file converter application (see {49} and {55}), think about which checks you would add to the checklists shown below.

Before turning to the example, recall smoke testing, critical path testing, and extended testing (see {74}): a checklist is conveniently organized along these three levels:
the smoke-test section verifies that the application is operational at all (see {74});
the critical-path section verifies the typical scenarios of its use (see {75});
the extended section verifies everything else declared in the requirements (see {76}).


Checklist for testing the file converter application

Smoke test
The application starts.
The application converts files of all supported formats from all supported encodings, e.g.:

          TXT   HTML   MD
WIN1251    +     +     +
CP866      +     +     +
KOI8R      +     +     +

The application writes records of its work to the log file.
The application stops.

Comments on the smoke-test section: at this level we vary neither parameters, nor data volumes, nor environments; we only convince ourselves that the application is operational in principle. The format-and-encoding matrix is the heart of the main function, which is why all nine combinations appear even at the smoke level; note also how compactly a table expresses what would otherwise take nine separate items.

Critical path
Launch:
o launch with correct values of the parameters SOURCE_DIR, DESTINATION_DIR, LOG_FILE_NAME (paths in both Windows and *nix style, i.e., with both / and \ separators);
o LOG_FILE_NAME pointing to a non-existent file (the file must be created).
o Launch with missing parameters.
o Launch with incorrect parameters:
  non-existent SOURCE_DIR;
  non-existent path in DESTINATION_DIR;
  non-existent path in LOG_FILE_NAME;
  DESTINATION_DIR located inside SOURCE_DIR;
  DESTINATION_DIR coinciding with SOURCE_DIR.
Conversion:
o files of each format in each encoding, with varying sizes, e.g.:

          TXT    HTML    MD
WIN1251   100     50     10
CP866      10    100     50
KOI8R      50     10    100
an empty file (0)
files at the maximum allowed size and just over it (50 + 1)

o Placement of the results:
  converted files appear in DESTINATION_DIR;
  processed files disappear from SOURCE_DIR;
  records about each processed file appear in LOG_FILE_NAME.
Stopping:
o the application stops correctly on the stop command.
Comments on the critical-path section: here parameters, data, and scenarios are varied within the limits of what a typical user does; both positive and the most probable negative checks appear.

Extended
Parameter values:
o SOURCE_DIR, DESTINATION_DIR, LOG_FILE_NAME:
  paths mixing the Windows and *nix styles (with both / and \ separators in one value);
  UNC paths;
  LOG_FILE_NAME located inside SOURCE_DIR;
  LOG_FILE_NAME located inside DESTINATION_DIR.
o LOG_FILE_NAME of different lengths:
  very short names (2-4 characters);
  long names (4+ characters, up to the file-system limit).
o Coinciding parameter values:
  SOURCE_DIR, DESTINATION_DIR, and LOG_FILE_NAME all pointing to one place;
  SOURCE_DIR coinciding with LOG_FILE_NAME, with a separate DESTINATION_DIR;
  DESTINATION_DIR coinciding with LOG_FILE_NAME, with a separate SOURCE_DIR.
File names:
o names with unusual characters, very short names (2-4 characters) and long names (4+ characters).
Comments on the extended section: these checks concern rare and exotic situations; they are valuable, but only after the previous sections pass. (The format-and-encoding matrix above is also a natural candidate for automation; a sketch follows.)
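The matrix of formats and encodings is a natural candidate for a data-driven automated check. A minimal sketch (Python + pytest; convert_file is a stand-in based on the description above, not the converter's real API):

    import pytest

    FORMATS = ["txt", "html", "md"]
    ENCODINGS = ["windows-1251", "cp866", "koi8-r"]

    def convert_file(data: bytes, encoding: str) -> bytes:
        """Stand-in for the converter: re-encode the payload into UTF-8."""
        return data.decode(encoding).encode("utf-8")

    @pytest.mark.parametrize("fmt", FORMATS)      # fmt would select a fixture
    @pytest.mark.parametrize("encoding", ENCODINGS)  # file in a real suite
    def test_conversion_matrix(fmt, encoding):
        # One parametrized check covers all nine cells of the matrix.
        source = "Проверка".encode(encoding)
        assert convert_file(source, encoding) == "Проверка".encode("utf-8")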
Task 2.4.b: as you may have noticed, the checklist above is incomplete: some properties of the application described in the requirements are left without checks, and the grouping of some items can be argued with. Improve the checklist: add the missing checks and reorganize the items where that makes the list more logical.

When a checklist of this kind is ready, it becomes the skeleton on which full-fledged test cases grow; writing and formalizing them is the subject of the next section.


2.4.2. Test cases

The central artifact of test documentation is the test case; several related terms are defined around it.

A test (test285) is a set of one or more test cases.

A test case (test case286) is a set of input values, execution preconditions, expected results, and execution postconditions, developed for a particular objective or test condition (for example, to exercise a particular program path or to verify compliance with a specific requirement). Put simply, a test case describes one check: what to do and what should happen. Its life cycle, together with the typical workflow (creation, execution, and the resulting defect reports), is considered below{147}.

Note the terminology: strictly speaking, "test" means a set of test cases, but in everyday speech the English word "test" is very often used in the meaning of "test case". The colloquial usage is so widespread that in this book, too, "test" may occasionally mean "test case" where no ambiguity arises; the ISTQB glossary (see the definitions under the letter T) keeps the terms distinct.

A high-level test case (high level test case287, logical test case) is a test case without concrete (implementation-level) values of input data and expected results: logical operators are used, and the actual values are not yet defined. High-level test cases are typical for integration{72} and system{73} testing, are convenient at the early stages when the application is still changing{74}, and often serve as a basis for exploratory testing{80}.

285 Test. A set of one or more test cases. [ISTQB Glossary]


286 Test case. A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular
objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement.
[ISTQB Glossary]
287 High level test case (logical test case). A test case without concrete (implementation level) values for input data and expected
results. Logical operators are used; instances of the actual values are not yet defined and/or available. [ISTQB Glossary]



A low-level test case (low level test case288) is a test case with concrete (implementation-level) values of input data and expected results; the logical operators of a high-level test case are replaced with actual values. Low-level test cases are convenient for novice testers (everything is explicit), but they are numerous, and keeping all their concrete values up to date takes noticeable effort.

A test case specification (test case specification289) is a document specifying a set of test cases (objective, inputs, test actions, expected results, execution preconditions) for a test item (test item290, test object291).

A test specification (test specification292) is a document that consists of a test design specification (test design specification293), a test case specification (test case specification289), and/or a test procedure specification (test procedure specification294).

A test scenario (test scenario295, test procedure specification, test script) is a document specifying a sequence of actions for the execution of a test (also known as a manual test script).

! Do not confuse the test scenario in this sense with the script (program code) of an automated test case{141}.

Test cases are the tester's main working instrument, so their quality matters directly. Good test cases, among other things:
are well structured (organized into meaningful groups and sequences);
give measurable test coverage (test coverage296 metrics can be computed over them: which requirements, functions, and code elements are exercised);
are maintainable (a change in the application does not force rewriting everything from scratch);
are repeatable (the same test case executed under the same conditions yields the same result);
trace to requirements{82} and to defect reports{82}, so it is always known why each test case exists;
(see also the properties of good requirements{47}, many of which apply to test cases as well).

288 Low level test case. A test case with concrete (implementation level) values for input data and expected results. Logical operators
from high level test cases are replaced by actual values that correspond to the objectives of the logical operators. [ISTQB Glos-
sary]
289 Test case specification. A document specifying a set of test cases (objective, inputs, test actions, expected results, and execution
preconditions) for a test item. [ISTQB Glossary]
290 Test item. The individual element to be tested. There usually is one test object and many test items. [ISTQB Glossary]
291 Test object. The component or system to be tested. [ISTQB Glossary]
292 Test specification. A document that consists of a test design specification, test case specification and/or test procedure specifi-
cation. [ISTQB Glossary]
293 Test design specification. A document specifying the test conditions (coverage items) for a test item, the detailed test approach
and identifying the associated high level test cases. [ISTQB Glossary]
294 Test procedure specification (test procedure). A document specifying a sequence of actions for the execution of a test. Also
known as test script or manual test script. [ISTQB Glossary]
295 Test scenario. A document specifying a sequence of actions for the execution of a test. Also known as test script or manual test
script. [ISTQB Glossary]
296 Coverage (test coverage). The degree, expressed as a percentage, to which a specified coverage item (an entity or property
used as a basis for test coverage, e.g. equivalence partitions or code statements) has been exercised by a test suite. [ISTQB
Glossary]

. . EPAM Systems, 20152017 : 116/296


-

(- -
, ).

(
).
{82} {82} (-
- ).
( : -
- {47}).
,
.

-
, -
{165}, - (.
2.4.a), ( -
).

2.4.a ( ) -
(new) -
. - .
(planned, ready for testing) - -
, ,
.



Not tested - the test case has not been executed yet (and its execution has not started).
Work in progress - the execution has started but not finished (relevant for long-running test cases; for short ones this transient state is often skipped).
Skipped - it has been decided not to execute the test case in this iteration (for example, for lack of time or because its result is not relevant now).
Failed - during execution, a discrepancy between the actual and the expected result was found; as a rule, this state is accompanied by the creation of a defect report (unless the discrepancy is caused by the test case itself or by the environment, in which case the test case or the environment is fixed instead).
Passed - the execution finished, and the actual result matches the expected one.
Blocked - the test case cannot be executed: something prevents it (for example, a defect that breaks the functionality on which the test case depends, or an unavailable environment).
Closed - the test case has been executed and/or reviewed, and no further work on it is planned in this iteration; in many tools this state is absent, and passed/failed is final.
Not ready - the test case itself requires rework (the requirements have changed, an error was found in it, or it has become outdated) and must not be executed until it is fixed.

As a rule, the current state of each test case is tracked by a test management tool, which also stores the history of transitions and the results of each execution (who ran it, when, and with what outcome).


2.4.3. The attributes (fields) of a test case

Over time every team settles on its own set of test case attributes; nevertheless, a fairly standard core exists and is supported by most test management tools. Table 2.4.b shows an example of a filled-in test case; the meaning of its fields is explained below.

Table 2.4.b - An example of a test case (schematically; the wording of the steps is abridged)

Identifier: UG_U1.12. Priority: A. Requirement: R97.
Title: the application correctly processes files with special characters in the name (e.g., #$%^&.jpg).
Steps 1-5: start the application; perform the documented actions with the file #$%^&.jpg; confirm with OK; observe the result; stop the application.
Expected results: for each step, the corresponding expected reaction of the application.

Now the fields themselves.

The identifier (identifier) is a unique value that distinguishes the test case from any other. Identifiers may be meaningless (e.g., sequential numbers generated by a test management tool) or meaningful, encoding, for instance, the requirement, the subsystem, and the kind of testing (example: UR216_S12_DB_Neg).

The priority (priority) shows how important the test case is. It can be expressed with letters (A, B, C, D, E), numbers (1, 2, 3, 4, 5), words ("highest", "high", "medium", "low", "lowest"), or in any other convenient way. The priority of a test case usually correlates with:
the importance of the requirement, feature, or function it covers{141};
the potential severity of the defects it is aimed at{174};
the risks associated with the functionality it checks.


In many teams the priority directly affects the order of execution: the higher the priority, the earlier the test case is run and the smaller the chance that it is skipped under time pressure (high-priority test cases also form the natural core of smoke and regression suites).

The related requirement (requirement) points at the requirement that the test case checks (as a rule, one test case checks one requirement or a part of it, while a single requirement is usually covered by many test cases). This field provides traceability{138} between tests and requirements. Two frequent questions about it:
Why fill it in at all? Because during maintenance it is the cheapest way to know which test cases to revise when a requirement changes (and, conversely, which requirements a failing test case affects).
What if a test case relates to several requirements? Most often this means either that the requirements are fine-grained parts of a larger one (e.g., R56.1, R56.2, R56.3 are parts of R56, and the test case may reference R56), or that the test case is too large and should be split.

The module and submodule (module and submodule) indicate the part of the application, and the part of that part, to which the test case relates. They help to see which areas are covered well and which poorly, and to organize the test cases into a logical structure: the module is the larger unit, the submodule the smaller one inside it.
An example for the file converter application (see its description{55}); one reasonable breakdown into modules (with submodules) is:

Launch and shutdown:
o launch with valid parameters;
o launch with invalid parameters;
o shutdown.
Parameter processing:
o SOURCE_DIR;
o DESTINATION_DIR and LOG_FILE_NAME.
File conversion:
o formats;
o encodings;
o sizes.
Logging:
o log file creation;
o log records;
o logging errors.

Another breakdown is equally possible, for example by the subsystems of the application:

User interaction:
o command-line parameters;
o console messages;
o return codes.
File processing:
o reading;
o conversion.
Results:
o output files;
o log.

Which breakdown to prefer, and how fine-grained it should be, is decided per project: the structure must help to find and group test cases, not hinder that.

! There is no single correct decomposition! The module/submodule structure is a working convention of the team, and it is normal for it to evolve as the application grows (new modules appear, old ones are merged or split).

(A rough analogy: a module is like a chapter of a book, and a submodule is like a section inside the chapter; both exist to make navigation easier.)

Like the requirement field, the module and submodule fields support traceability{138}.


() -

() - (title)
() - .
-
.
.

1 , ,
2
78 () , ,
Ctrl+C


- , , -
, :
.
( -).
! ! -
, . - -
,
.
,
. test case
(). , (),
.. -, .
, -
(precondition, preparation, initial data, setup), ,
-, :
.
.
.
, ,
, , , -
. ,
. , .
- -
. , , -
.. ,
, .

.
, .
: preparation,
initial data setup -
, precondition -
.
-, -
, ,
.

. . EPAM Systems, 20152017 : 122/296


() -

- (steps) ,
-.
:
,
( , ..);
- , ( -

);
, (,
, , , , -
), to (.. -
start application, to start application);
-,
, {74} ..
-
;

(, 35 );
,
.
! !
- - : , -
, -
, ,
- ,
-.
(expected results) - -
, -.
.
:
, -
(, ,
);
,
, -
(
- ,
);
, ;
.
! ! -
.
-
.


. ,
:



() -

, -
. ( ) -
, / ,
.
- -

-, - -{155}.



2.4.4.
(test management
tool297) 298,
.
, - -
, -
. ,
,
(, -
/ ):
- -;
,
, , ;
, -
, ;
,
- ;
;
, -
-.
, -
,
{27}.

-
, .., -

.
--
-
{119}

.

, -
.
,
(: -
, .., -
).

297 Test management tool. A tool that provides support to the test management and control part of a test process. It often has several
capabilities, such as testware management, scheduling of tests, the logging of results, progress tracking, incident management
and test reporting. [ISTQB Glossary]
298 Test management tools [http://www.opensourcetesting.org/testmgt.php]



QAComplete299
5
1 2 4
3 6

7
9
10 8
11
12
13 15

14
16
17

18
19

2.4.c - QAComplete

1. Id (), , -
.
2. Title (), , -
.
3. Priority () high (), me-
dium (), low ().
4. Folder name () -
-
, , -.
5. Status () -: new (), ap-
proved (), awaiting approval ( ), in design ( -
), outdated (), rejected ().
6. Assigned to () , -
- (
, , -).
7. Last Run Status ( ) ,
(passed) (failed).
8. Last Run Configuration (, -
) , - -

299 QAComplete [http://smartbear.com/product/test-management-tool/qacomplete/]



.
9. Avg Run Time ( ) -
, -.
10. Last Run Test Set ( ) -
-, - -
.
11. Last Run Release ( ) -
() , - -
.
12. Description ()
- ( , ..).
13. Owner () - ( -
).
14. Execution Type ( ) -
manual ( ), -
( -
automated ( )).
15. Version () - (
, - ).
,
.
16. Test Type ( ) , negative
(), positive (), regression (), smoke-test
().
17. Default host name ( ) -
- -
, .
18. Linked Items ( ) -
, ..
19. File Attachments () ,
, ..
-
- -
:

2.4.d - QAComplete

,
.



TestLink300

3 4

5 6

2.4.e - TestLink
1. Title () .
2. Summary ()
- ( , ..).
3. Steps ( ) .
4. Expected Results ( ) -
, .
5. Available Keywords ( )
, - -
-.
( ).
6. Assigned Keywords ( ) -
, -.
, -
.

300 TestLink [http://sourceforge.net/projects/testlink/]



TestRail301

2 3 4 5

10

2.4.f - TestRail
1. Title () .
2. Section () ,
, -
.
3. Type () : auto-
mated (), functionality ( ), per-
formance (), regression (), usability (-
), other ().
4. Priority () ,
: must test ( ), test if
301 TestRail [http://www.gurock.com/testrail/]



time (, ), dont test ( ).


5. Estimate () ,
-.
6. Milestone ( ) , -
- -
( ).
7. References () , -
, ,
( ).
8. Preconditions () -
-
.
9. Step Description ( )
-.
10. Expected Results ( ) -
.

2.4.c: 35
-, , -
-.



-

2.4.5. -
- ,
.
, -
. , -,
.
(. () -{119}), -
:
, ;
(, -
);
-
;
(,
, , , -
);
(,
- -
, , ..
, );
-
( : -
,
, ).
. -
, ,
.., .. . ,
- , .
- (-
, - , ):
- 1:

- 1. -
_ started,
: source dir c:/a, destination dir c:/b, log file
C:/A, C:/B, C:/C, C:/D. c:/c/converter.log, C:/C
C:/D 1.html, 2.txt, converter.log, -
3.md . _ started, source dir c:/a,
1. , destination dir c:/b, log file c:/c/converter.log.
php converter.php c:/a c:/b c:/c/con- 2. 1.html, 2.txt, 3.md
verter.log. C:/A,
2. 1.html, 2.txt, 3.md C:/B.
C:/D C:/A. C:/C/converter.log
3. Crtl+C. () _ processing 1.html
(KOI8-R), _ processing 2.txt
(CP-1251), _ processing
3.md (CP-866).
3. C:/C/converter.log -
_ closed.
.



-

- 2:

- 1. -, -
UTF-8.
1. -

.

- ,
, : - ,
, . ,
{116} {115} -
.
(- 1):
-
,
;
, -;

, -
, .. ,
.
(- 2):
-
, ;
--
;
, -, , -
( -
).
, -
(, , - , -
). :
- 3:

- 1. -
-
: .
- 2. -
, - ,
, -
.
- .
- 3. -
.
1. , .

( -
).
2. -
.
3. .



-

- ,
,
, - (
) ,
.
: -
- ,
-.
. -
, , - -
( ), -
; - -
.
-:
, ;
;
( ,
,
);
, .. .
-:
-
;
, , , -
;
(
).
.
-:

1. .
1. .

-:

2. -
: ,
- -
, .
, . 3. -
,
-
- .
, -
5. -
,
,
.
-
1. ,
-

-
( -
.
).



-

2. 6. -
. ,
3. -
-
. -
4. .

.
5.
-
.
6.

.

-
.
2.4.d: -, ,
( -
).
- - 3{132}
.
- :

, - 3. ,

: .
-
5.
, -
,
, .
-
-
, ()

:

a. source file inaccessible, retrying.
.
b. destination file inaccessible, retrying.
1. ,
c. log file inaccessible, retrying.

( -
). ,
2.
(. 1).
3. (
(. 1). ).
4.
(high) (low) . ()
5. - ,
- -
. ,
-
.

- ,
, . -
( -
, ..
,
).



-

, -
- - ( -
- ,
),
-.
( ). -
{74}, , -
, (
). -
- .
() -:

1. .
1. - 2. .
.
2. .

() -:

, - 1. -
, .
1. - :
(SOURCE_DIR, DESTINA- a. SOURCE_DIR: directory not exists or inac-
TION_DIR, LOG_FILE_NAME), - cessible.
b. DESTINATION_DIR: directory not exists or
(: z:\src\, inaccessible.
z:\dst\, z:\log.txt , c. LOG_FILE_NAME: wrong file name or inac-
z). cessible path.

, - - -
, ,
, , -
-.
, - -
, .. ,
(:
,
, ,
, -
, ,
).



-

.
, -
.
-
-. -
:

- 1. -
-
: .
- 2. -
, - ,
, -
.
- .
- 3. -
.
1. , .

5. -
( -
-
).
.
2. -
6. -
.

3. .
.
4. .
5.
.
6. .

35 -, -
,
.
. ,
- , -
:

1. - 1. -
. (, c:\src\,
2. - c:\dst\, c:\log.txt , -
. -
3. - ).
.
4. .

- -
.
-
,
( - ), -
, -
, , .



-

- ,
:

1. . 1. DOCX- -
2. . .
3. .
4. ,
DOCX
.

-
- (,
, ). -
{141} , ,
- .
( )
302 -
JUnit TestNG -
(fixture), -
( -
) .
-. -
- ,
- ,
, . -
- -
{62} (. , -
{89} {90}).
-,
, , , , -
-
-.
(
).
, -
, . -
-.
-:

5. - 6. .
, 7. -
. - UTF-8 .
KOI8-R (
).
6. test. txt -
.
7. test.txt.

302 Unit testing (component testing). The testing of individual software components. [ISTQB Glossary]



-

-:

5. 6. : .
. ( - ( UTF8).
. -
KOI8-R, CP866).
6. .

-
UTF-8 ,
:
KOI8-R;
, ;
;
- -
, .
-
( -
) , , -
.
. - -
, , -
. -
-{119} ( , , ),
- , ..
, ,
-, , -
.
-:
- -
-
-

-4 - 1. -
: - ,
.
-
.
1. -
.

, - ( - -
), , -
:
( , -4
{55}).
(
-
), .
, -
-5.1 -5.3,
.



-

-:

-2.4, - 1. -
-3.2 , -
,
. -
1. -
a. SOURCE_DIR: direc-
, - tory not exists or inac-
- cessible.
- b. DESTINATION_DIR:
directory not exists or
. inaccessible.
c. LOG_FILE_NAME:
wrong file name or in-
accessible path.

, - -2 -3 ,
, -
, - .
, -
{137} - (
-,
).
. -
-{116}, -
{115} , :
- -
;
- -
-
.
-, , -
- .
, -,
, -
:

, 1. , -
1. ,
-
. .

. - ,
.
:
-, , -
-,
(: - ,
-273 +500 );
() - -
( ) (: ,
..) -



-

, -
-; -
, - -
(, ).
. -
, , : -

-. , -
, ,
- .
, ..
-,
,
-, - -{155}.



-

2.4.6. -


- (test case suite303, test suite, test set)
-, -
. -
- -
-.
! - -
.
, ,
.
-,
( , !) -
.
- -
. -.
- (-
- ) (
- ).
:
- ,
.
- - , -
-.
:
- -
-,
-.
-
, -
.

( )
use cases ( ),
{36}. -
,
.
- (
-, , -)
304 ( ), -
,
.

303 Test case suite (test suite, test set). A set of several test cases for a component or system under test, where the post condition
of one test is often used as the precondition for the next one. [ISTQB Glossary]
304 A scenario is a hypothetical story, used to help a person think through a complex problem or system. [Cem Kaner, An Introduction
to Scenario Testing, http://www.kaner.com/pdfs/ScenarioIntroVer4.pdf]



-

, .
,
, ! :
1) .
2) ( ).
3) .
4) .
5) .
6) (, ).
7) .
,
- -.
,
,
:
-
( -, -
).
-
.
, -
( ) .
(
, ,
- ).
( -)
, -
, .
. -
(, -
, )
, -
.
2.4.a


, -
, , -
.
, , -
, -
:



-

2.4.b
-

- , -

? ! ! !
- - !
. - !

, ,
. -
, -
.
, -
, , -
An Introduction to Scenario Testing305.

-
-
(
-{145}). , -
, .
- -
.
2.4.c -
-


-
-

-


- ( 2.4.g):
-,
- .
- ( 2.4.h): -
(
-), - .
- ( 2.4.i): -
--
, - .
- ( 2.4.j):
( -
-), - -
.

305 An Introduction to Scenario Testing, Cem Kaner [http://www.kaner.com/pdfs/ScenarioIntroVer4.pdf]



-

: -
, -.
:
( ).
:
-, .. -
.
: - -
, , - ( -
) - - .

2.4.g -

3
1

2.4.h -



-

2.4.i -

2.4.j -

-
: -. -
: . . -
-
-, -
, {141}
- ..
- - , - -
, , -
.

-, :
-. -{111},
{110}: -
- -
.



-

{120}.
( ) -
.
, -
( --
{111}).
- -
{36}, .
-
(, , -
, , ,
).
(. 149 -
):
, -,
.
, : -, -
, -,
, -, , ..
(. -
{64}).
. ,
, . : ,
- , .
: --
- , .. -
, ,
, ..



2.4.7.
, -{110} -
-{119}, -{131}, -
- {145}, , -
, , ,
, .
: - ,
, / -
-. . ,
, -
,
.

(, ..) ? ,
306, user/customer needs and
expectations ( /).
:
(). -
, , - (
, , -
, ). , (,
)?
, -
, ( ..) ,
.
-
,
, -
.
-
( ), ,
, .
:
.
.
,
.
, -
. -
, , ..
,
.
:
-
, ( ) -
, ,
:
();
306 Quality. The degree to which a component, system or process meets specified requirements and/or user/customer needs and
expectations. [ISTQB Glossary]



( ) -

, .
, -
.
-, - -,
:
? , ,
.
( )? -
-
{141} , .
?
{77} ( --
).
, .. ? -
, -
{77} ( -).


, :
-
, -
-, ,
..
- ,
,
307 .. ,
-
.
-. , -
, .
.
-, - ..
. ,
, , -
.
-
(
),
, , ..
{46}
.
- ( , -
..).
( ) -
. -
, {77}, -
{77} .

307 Functional decomposition, Wikipedia [http://en.wikipedia.org/wiki/Functional_decomposition]



, .
-, -
-.
, -. -
{89}, -
{90}, {90}.
{135} - ,
, .
, - , -
-.
-{77} ,
-{77} .
, - (
- ..) , .
,
( -
).
2.4.e: ,
, ..


-{110} -
{55}. :
, .
-, , -
, , -
, .
? ?
. , -
,

.
:
, , :
2.4.d ,

TXT HTML MD
WIN1251 100 50 10
CP866 10 100 50
KOI8R 50 10 100
- 0
50 + 1 50 + 1 50 + 1
-



- ( )?
. :
;
.
.
:
2.4.e ,

TXT HTML MD
WIN1251 100 50 10
- CP866 10 100 50
KOI8R 50 10 100
18 9 + 9 ( -
),
.
:
0 ( ,
). 0 .
50 MB + 1 byte (a value just beyond the upper size limit; see the check after this list).
This equals 52428801 bytes.
:
o ( , .txt, .html, .md).
, ( 1
50 , .jpg).
o (, .jpg .txt). -
.txt.
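As noted in the list above, the 50 MB + 1 byte boundary corresponds (assuming the binary megabyte of $2^{20}$ bytes) to:

$50 \cdot 2^{20} + 1 = 52\,428\,800 + 1 = 52\,428\,801$ bytes.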
. . .
-
,
,
-
.
? 22 (-
, , -
, ).



2.4.f

1 WIN1251.txt WIN1251 100
2 CP866.txt CP866 10
3 KOI8R.txt KOI8R 50
4 win-1251.html WIN1251 50
5 cp-866.html CP866 100
6 koi8-r.html KOI8R 10
7 WIN_1251.md WIN1251 10
8 CP_866.md CP866 50
9 KOI8_R.md KOI8R 100
10 WIN1251.txt UTF8 100
11 CP866.txt UTF8 10
12 KOI8R.txt UTF8 50
13 win-1251.html UTF8 50
14 cp-866.html UTF8 100
15 koi8-r.html UTF8 10
16 WIN_1251.md UTF8 10
17 CP_866.md UTF8 50
18 KOI8_R.md UTF8 100
19 .md - 0
20 .txt - 52428801
21 .jpg - ~ 1
22 TXT.txt - ~ 1

-
-. -
. Windows Linux, -
{279} , -
{74}
22 .
2.4.f: {279}
,
,
, . -
, -
.
-, ,
{74}
{75}.
.
, , - , -
: .
{86}
,
. -
? -1.1
5 /.
-1.1 ,
(, -
) ( ). ? -
. , -



, -
.
-:
:
o :
SOURCE_DIR, DESTINATION_DIR, LOG_FILE_NAME
(-
Windows *nix ,

(/ \)). ( -
22 .)
LOG_FILE_NAME . ( -
.)
o .
o .
o :
SOURCE_DIR.
DESTINATION_DIR.
LOG_FILE_NAME.
DESTINATION_DIR SOURCE_DIR.
DESTINATION_DIR SOURCE_DIR .
:
o , . ( .)
o :
.
.
.
:
o . (. -
, PHP
.)
:
o ( ), -
.
o ( ) , -
.
:
o . ( , -
-
.)
, -
{75}. ! -!
-, .



2.4.g -

. .
-
. .
:
o SOURCE_DIR. , -
o DESTINATION_DIR.
o LOG_FILE_NAME. .
o DESTINATION_DIR
SOURCE_DIR.
o DESTINATION_DIR
SOURCE_DIR .
:
o . , -
o . .
o .
:
o (
), . .
o ( )
, -
.

, {76}.
, ,
.
:
o SOURCE_DIR, DESTINATION_DIR, LOG_FILE_NAME:
(Windows- + *nix-)
, .
UNC-.
LOG_FILE_NAME SOURCE_DIR.
LOG_FILE_NAME DESTINATION_DIR.
o LOG_FILE_NAME :
24 .
4+ .
o :
SOURCE_DIR, DESTINATION_DIR,
LOG_FILE_NAME.
SOURCE_DIR LOG_FILE_NAME,
DESTINATION_DIR.
DESTINATION_DIR LOG_FILE_NAME, -
SOURCE_DIR.
:
o ,
.
o :
24 .
4+ .
, - .
,
.



,
,
{141} .
- (
) :
1) (. 2.4.f).
2) (. -
Windows Linux, -
{279}).
3) 1
( 2.4.h).

2.4.h

. .
-
. .
:
o SOURCE_DIR. , -
o DESTINATION_DIR.
o LOG_FILE_NAME. .
o DESTINATION_DIR
SOURCE_DIR.
o DESTINATION_DIR
SOURCE_DIR .
:
o . , -
o . .
o .
:
o (
), . .
o ( ) -
, .

4) --
{76} -
{80}.
. , -
, ,
, -
-
.
2.4.g: , 2.4.h -
.
.



-, - -

2.4.8. -, -
-


- . -
-
.
- ,
N -,
.
.
- -
,
, .
, -
- . , -
, - .
-,
- , -.
, , ,
.. -, .. -
, . . -
-
( ) -.
/ (
). - -
, , -
(, {41} -
-) , -
. - , -
.
. -{115}
, -
23 (
-), (..
, , 5.7.1, 5.7.2, 5.7.3, 5.7.7, 5.7.9, 5.7.12,
5.7, ). -
- -
, .
.
, , , -
.. -
, ,

.
-
. ,
,
.



-, - -

( ) -
. - , -
, . - !
? :

. .
. .
. .
. .
. .

-
( , ,
): , -
, .
, , (
): ( Division by zero detected),
(
), ( , ).

.
(system button Close), - -
(double click), -
, -
(hint).
, ,
. .


- -. -
-
-{143} . ,
-, , ,
. , -, -
, - -
, , .
, -
{74}. , {74} -
-
,
-, ,
{75}
{76} ( , ,
).



-, - -

-
. , -
( -), , . -
, :
C. (.. C:\? ? ?)
. (, ico-
, ? ?)
. (?)
. (! , , ?)
OK. (? OK?)
. ( ?)
. ( -
? ? ?)
/.
, . -
{120} , . :
, .
-
. , : X.
-, ,
() X.
, .
, : X. ?
, , , , -
(, ), - - -
? - , ..
.
. , -
( , -
) , (, )
, , .
, , -
.
( , ) -
, .

( ,
, - ?). -
-
C:/SavedDocuments ( C:/SavedDocuments, ,
.. , , ).



-, - -

-.
- -
. , ,
. , -
, .. -
. ~200? ?

1. - 1.
. -
2. C:/MyData. .
2. -
~200.

- ,
: C:/MyData
( ).
1000 , 200 , -
.
() -
, , -
.
{74}, , . -
:
- -

. . .
. - -
. .
-
. .

. -
.

-
.

, () -
. , -
. :


1. . 1. DOCX-,
2. -> . .
3. DOCX-, 2.
.
4. . .
5. -> . 3. ,
6. .
.
7. .
8. .
9. .



-, - -

, -
, -. -
, , -
( ), -
( , , -
) ,
.
.
, - , -
. , - -
"Close" "Close window".
, , .
( -) : "Close
window". ! . Close window , -
.
:
. . ? ? , ,
, ? ,
(, ?),
?
, , , -
: -
. , ? ,
, .
-
-. :
, (: , -
(system tray, taskbar notification area)), -
, -
, , .
.
- -
- , -
, , - ..
. ,
-
{75}, - .
.
. -

. -
-.



-, - -

-. -
, :


4. Alt+F4. 4. .
5. . 5.
, -
999.99.

, -
, , -
. ? , -
.
. -
, {89} .
. , -
, 10 100 (). -
09 , 10100 , 101+ , .. -
. , ,
9.5 , 100.1 , 100.7 .. -
: 0 ≤ value < 10, 10 ≤ value ≤ 100, 100 < value.
: [0, 10), [10, 100], (100, ∞)
, .
-, . ,
. ,
- ( , -
, .. ,
, ):
.
.
, .
, .
, .
.
.
/ .
-. ,
, ,
, :
.
getMessage().
.
.
.
.
,
, .
, .. -
-. .



-, - -

. -{111} -
{110}.


-{119}, -{131} -
{147} - -.



2.5.
2.5.1. , , , ..


( -
!), : -
(, , .., -
) (, , ..,
).
.
.
, .
,
.
! (
) . -
. .
-
, -
.

,
, -
, -
.
ISTQB 308, , -
, , ,
( - -
, ..)
, :

2.5.a , ,

308 A human being can make an error (mistake), which produces a defect (fault, bug) in the program code, or in a document. If a
defect in code is executed, the system may fail to do what it should do (or do something it shouldn't), causing a failure. Defects
in software, systems or documents may result in failures, but not all defects do so. Defects occur because human beings are
fallible and because there is time pressure, complex code, complexity of infrastructure, changing technologies, and/or many
system interactions. Failures can be caused by environmental conditions as well. For example, radiation, magnetism, electronic
fields, and pollution can cause faults in firmware or influence the execution of software by changing the hardware conditions.
[ISTQB Syllabus]



, , , ..

,
ISTQB , :
Defect
Failure,
Error (Mistake) (Problem, Bug,
Interruption
Fault)

Anomaly, Incident (Deviation)

2.5.b
.
(error309, mistake) , -
.
, -
( , , -
, , ,
..) , ,
. , ,
.
(defect310, bug, problem, fault) -
, .
, -
, , .. -
? -
,
, -
.
(interruption311) (failure312) -
.
27.002-89 :
,
.
, -
.

, ,
( ,
).

309 Error, Mistake. A human action that produces an incorrect result. [ISTQB Glossary]
310 Defect, Bug, Problem, Fault. A flaw in a component or system that can cause the component or system to fail to perform its
required function, e.g. an incorrect statement or data definition. A defect, if encountered during execution, may cause a failure of
the component or system. [ISTQB Glossary]
311 Interruption. A suspension of a process, such as the execution of a computer program, caused by an event external to that
process and performed in such a way that the process can be resumed. [http://www.electropedia.org/iev/iev.nsf/display?open-
form&ievref=714-22-10]
312 Failure. Deviation of the component or system from its expected delivery, service or result. [ISTQB Glossary]



, , , ..

(anomaly313) (incident314, deviation) -


() , , ,
, , -
, , -
.
, , ,
. , , , ..
-
. , -
, ..
-
.
, , -
, :
(deviation314) (actual re-
sult315) (expected result316),
, , -
.
,
, , ,
.
,
, -
. ,
, -
,
.

-
,
, , -
(Rudolph Frederick Stapelberg,
Handbook of Reliability, Availability, Maintainability and Safety in Engineering
Design).
, -
IEEE 1044:2009
Standard Classification For Software Anomalies.

313 Anomaly. Any condition that deviates from expectation based on requirements specifications, design documents, user documents,
standards, etc. or from someones perception or experience. Anomalies may be found during, but not limited to, reviewing,
testing, analysis, compilation, or use of software products or applicable documentation. See also bug, defect, deviation, error,
fault, failure, incident, problem. [ISTQB Glossary]
314 Incident, Deviation. Any event occurring that requires investigation. [ISTQB Glossary]
315 Actual result. The behavior produced/observed when a component or system is tested. [ISTQB Glossary]
316 Expected result, Expected outcome, Predicted outcome. The behavior predicted by the specification, or another source, of
the component or system under specified conditions. [ISTQB Glossary]



2.5.2.
, -
.
(defect report317) , -
, -
.
, -
:

, ;

;


, -
.
. , -
-
. , (
{197}), , , -
,
,
, ,
, .
! -
( -
). , -
.
( )
, ( 2.5.c):
(submitted) (
(new)), .
(draft)
.
(assigned) , -
-
. -
, , ,
-
.
(fixed) -
-
.
(verified) , -
, . ,
, .

317 Defect report, Bug report. A document reporting on any flaw in a component or system that can cause the component or system
to fail to perform its required function. [ISTQB Glossary]



,
, ,
, . -
, ,
,
.
, -
, : -
-
, , -
, -
.

2.5.c -

,
-
.
.
2.5.c .
(closed) , ,
. -
, -
:
o
, ,
- (, -
..), -
, -
.
o ( -
).



o
-
:
,
.
.
-
.
, - -
.

,

, -
.
, -
,
,
, .
(reopened) ( , -
) , , -
, -
.
(to be declined) -
-
.
,
(. ).
(declined) , -
,

.
(deferred) , -

, , -
( , -
,
..)

, -
JIRA318 ( 2.5.d).

318 What is Workflow. [https://confluence.atlassian.com/display/JIRA/What+is+Workflow]



2.5.d JIRA



()

2.5.3. ()
-
,
, .
2.5.e.

19 - 1. --
, - -
- .
- 2.
: ,
, - .
- 3. .
-
:
-
-
.
-,
: -
-
- -,
- -- -
. -
: - ( -
- -
-, ).
-.
: -2.1.

-
-

--

,
-
.

2.5.e
2.5.a: ,
-
?



()

(identifier) , -

.
, ( -
) : -
, , -
( ), .
(summary)
? ?
?. :
, :
? .
? .
? -
.

, :
,
;
(,
) 12 , -
;
, (
,
);
,
;
(
), -
;
( )
, .
-
:
1. . , -
, , ,
.
2. (description)
.
3. , -
.
4. (, ),
, , .
5. 4 -
.
6. , ,
( ,
). ,
.



()

.
1. -, -
250 ; -
, .
1. : , , -
, /
.
2. :
,
http://testapplication/admin/goods/edit/.
3. :
: ( ,
http://testapplication/admin/goods/edit/) -
(MAX=250 ).
: 251+ -
.
4. , :
: .
: , , http://testapplication/ad-
min/goods/edit/.
: ( ,
).
5. : -
, .
6. ( ):
. : no check for max
length.
2.

.
1. :
, , , ;
, -
, ;
(
, ).
2. : -
-
.
3. :
: -
( ) -

(. BR852345).
:
; -
.
4. , :
: .
: ( ).
: .



()

5. :
-
.
6. ( ):
. : client crash and data
loss on damaged/empty files opening.
3.
( ,
, .. -
).
1. : , , -
; ,
, ,
.
2. : -
-
.
3. :
:

, .
:
( ).
4. , :
: .
: ( ).
: , -
.
5. : -
.
6. ( ):
. : wrong presenta-
tion of Russian text in case of external fonts inaccessibility.

2.5.a.



()

2.5.a -

-

- - No check for
, max length.
250 .
; -
, .
Client crash and data loss
- on damaged/empty files
. opening.
.
- Wrong presentation of
Russian text in case of
( . external fonts inaccessi-
, bility.
, -
.. -
).

.
(description) -
, (!)
, ( ).
:
, -
.
: -
.
: -
.
: R245.3.23b.

, , , -
, .
( ) -
, .
(steps to reproduce, STR) ,
.
-, : -
,
, ..
.
:
1. http://testapplication/admin/login/.
2. defaultadmin dapassword.
: (
logo).



()

(reproducibility) , -
.
: (always) (sometimes).
, ,
.
:
,
(.. -
).
, -
.
-
, .. -
, (,
1020 ).
,
: -
100 ,
, 101- .
, ,
,
.
(severity) ,
.
:
(critical) -
, : , -
, -
..
(major)
, : -
,
, -
.
(medium) -
, /
, : -
OK/Cancel,
, -
.
(minor) -
() , -
: ,
(
),
.



()

(priority) , .
:
(ASAP, as soon as possible) -
, . -
, -
.
(high) , -
, .. ,
.
(normal) , -
.
.
(low) ,

.

.
, .
. -
X Y . -
, ,
.
, -
: , ,
? .
,
, .. ,
..
.
: ?
, . ,
.
?, , ? -
, .
, ,
.

2.5.b.
2.5.b


-
-
. .
-
- ,
, .

-
-
.



()

(symptom) -
. . -
,
, , , .
.
(cosmetic flaw) -
, (,
).
/ (data corruption/loss) -
, ( )
(, -
).
(documentation issue)
, (,
).
(incorrect operation) -
(, 17 -
2 3).
(installation problem) -
/ (.
{81}).
(localization issue) - -
(. -
{84}).
(missing feature) -
(,
,
).
(scalability) -
-
(. {86} -
{87}).
(low performance)
(. -
{86}).
(system crash)
( -
, - ..).
(unexpected behavior) -
(
) (, -
, ).
(unfriendly behavior)
(, -
OK Cancel).
(variance from specs) -
, ,
, .
(enhancement)



()

, .. -
: ,
,
.
, -
. , . ,
. -
,
.
, -
, (, ,
, -
, ;
, ).
(workaround) , -
, -
(, Ctrl+P
, ,
).
,
. ,
.
(comments, additional info) -
. ,
, .
(attachments) ,
( ,
..)
:
, , .
.. (, ,
).
:
o (Alt+PrintScreen),
(PrintScreen).
o ( Snipping Tool Paint Windows,
Pinta XPaint Linux).
o (,
, , -
).
o -
( ), -
. -
, ,
, : br_9_sc_01.png,
br_9_sc_02.png, br_9_sc_03.png.
o JPG (
) PNG ( -
).



()

, -
,
(
). ,
-
.

. -
-
.


-
{197}.



2.5.4. -

-
- -
, - .. -
.
(bug tracking
system, defect management tool319) 320, -
. -

{125}.
,
,
. , , -
:
,
, -
.
, -
.
,
.
, -, -
-
.
.
.
, -

, -
, -
.

-
.

, -
.
,
(: -
, .., -
).

319 Defect management tool, Incident management tool. A tool that facilitates the recording and status tracking of defects and
changes. They often have workflow-oriented facilities to track and control the allocation, correction and re-testing of defects and
provide reporting facilities. See also incident management tool. [ISTQB Glossary]
320 Comparison of issue-tracking systems, Wikipedia [http://en.wikipedia.org/wiki/Comparison_of_issue-tracking_systems]



Jira321
1. Project () , .
2. Issue type ( /) , -
. JIRA -
, 322,
323. :
Improvement ( )
, (.
, {176}).
New feature ( ) -
, , .
Task () -
.
Custom issue ( ) ,
JIRA , , -
Issue.
3. Summary ( ) .
4. Priority () .
JIRA :
Highest ( ).
High ( ).
Medium ( ).
Low ( ).
Lowest ( ).
: severity () .
.
5. Components () , -
( ).
6. Affected versions ( ) ,
.
7. Environment () -
, .
8. Description ( ) -
.
9. Original estimate ( )
, .
10. Remaining estimate ( ) ,
.
11. Story points ( ) (
) , -
.
12. Labels () (, ),
.
13. Epic/Theme (/) ,
,
, , -
..

321 JIRA Issue & Project Tracking Software [https://www.atlassian.com/software/jira/]


322 What is an Issue [https://confluence.atlassian.com/display/JIRA/What+is+an+Issue]
323 Defining Issue Type Field Values [https://confluence.atlassian.com/display/JIRA/Defining+Issue+Type+Field+Values]



5
4
6

10
11

12

13

14

15

16

17

18
19

2.5.f JIRA



14. External issue id ( ) -


.
15. Epic link ( /) /
(. 13), .
16. Has a story/s () /
, ( ,
).
17. Tester () .
18. Additional information ( ) -
.
19. Sprint () (24-
),
.

( -
, ..)

Bugzilla324
1. Product () , ()
.
2. Reporter ( ) e-mail .
3. Component () , -
.
4. Component description ( ) -
, . -
.
5. Version () , -
.
6. Severity () .
:
Blocker ( ) -
.
Critical ( ).
Major ( ).
Normal ( ).
Minor ( ).
Trivial ( ).
Enhancement ( )
, (.
, {176}).
7. Hardware ( ) -
, .
8. OS ( ) ,
.

324 Bugzilla [https://www.bugzilla.org]



1 2

4
3 6

5 8

10
11 9
12

13

14 15

16
17

18

19

20

21
22

2.5.g Bugzilla
9. Priority () .
Bugzilla :
Highest ( ).
High ( ).
Normal ( ).
Low ( ).
Lowest ( ).



10. Status () .
Bugzilla :
Unconfirmed ( ) , -
, .
Confirmed () ,
.
In progress ( ) -
.

Bugzilla -
.
11. Assignee () e-mail , -
.
12. CC () e-mail -
, -
.
13. Default CC ( ) e-mail ()
,
( e-mail
).
14. Original estimation ( )
, .
15. Deadline ( ) , -
.
16. Alias () -
(, )
.
17. URL (URL) URL, (-
-).
18. Summary ( ) .
19. Description ( ) -
.
20. Attachment ()
.
21. Depends on ( ) ,
.
22. Blocks () , -
.



Mantis325

1 2
3
5 4
6
7
9
8 10

11

12

13

14
15
16

2.5.h Mantis
1. Category () -
, .
2. Reproducibility () . Mantis
:
Always ().
Sometimes ().
Random ( ) ,
.
Have not tried ( ) ,
, Mantis .
325 Mantis Bug Tracker [https://www.mantisbt.org]



Unable to reproduce ( ) -
, ,
Mantis .
N/A (non-applicable, ) , -
(, -
).
3. Severity () .
:
Block ( ) -
.
Crash ( ) , ,
.
Major ( ).
Minor ( ).
Tweak () , .
Text () , ( ..).
Trivial ( ).
Feature () -
, /
.
4. Priority () .
Mantis :
Immediate ().
Urgent ( ).
High ( ).
Normal ( ).
Low ( ).
None () .
5. Select profile ( ) -
- ,
. -
, 678 (. ).
6. Platform () ,
.
7. OS ( ) ,
.
8. OS Version ( ) -
, .
9. Product version ( ) ,
.
10. Summary ( ) .
11. Description ( ) -
.
12. Steps to reproduce ( ) -
.
13. Additional information ( ) -
,
.
14. Upload file ( ) -
, .



15. View status ( ) -


:
Public ().
Private ().
16. Report stay ( )
-
.
2.5.b: 35
, ,
.



2.5.5.
( , -
),
.
.
:
, , ..
:
.
,
.
.
, ,
: .
(,
, ..)
( -
copy-paste ).
( , ) ,
. -
, , (!) -
.
: -
, ,
. : .
( : .)
-
. : ,
?!
, -
. -
- , -
, -
..
( , ) -
. , -
,
, .
(
) . :
,
.
.
, ,
: Not keyboard in parameters accepting values ( -
; , ).



.
, -, ,
. {131}.
:

,
- , ,
. SOURCE_DIR.
Act: -
( )
SOURCE_DIR .
Exp: , SOURCE_DIR
-
, -
Non-empty subfolder [ ] in
SOURCE_DIR folder detected. Remove it manu-
ally or restart application with --force_file_opera-
tions key to remove automatically.
Req: UR.56.BF.4.c.

. -, ,

. , , -
: -
. , -
, - , , .
:

1. - 1.
, HTML 100 1 , -
.
UTF8 (10 100 ) WIN-1251
: -
(20 100 ).
.
: ,
UTF8, (
).

, .. -

,
( ),
.
: - -
, , -
- -
.



/ .
,
, :

1. - 1. -
. (
2. - , SOURCE_DIR DESTINA-
. TION_DIR
3. - ).
.
:
4. .
SOURCE_DIR and DESTINA-
: - TION_DIR may not be the same! (
,
,
. .)

-
.
-
( -)
() .
:

1. : -
. 30 .
2. 30 . 1. -
3. - .
.
:
: .
.

. -
, ,
. ,
, -
, . -
:
, ,
-
.
(..
). -
,
, -
.
-
: , -
, ..
.
,
. ,
.
..



(clarification meetings326), .. -
, -
- , -
, .
- ,
( ),
.
. ,
, -
. -
,
.
:


SOURCE_DIR. SOURCE_DIR -
, -

.
Act: () -
, SOURCE_DIR
.
Exp: SOURCE_DIR

, -
( ?),
,

Non-empty subfolder [ ] in
SOURCE_DIR folder detected. Remove it manu-
ally or restart application with --force_file_opera-
tions key to remove automatically.
Req: UR.56.BF.4.c.

-
: ? ?
, -
. ,
( -
),
: .. ,
.
. -
, ,
. -
-
: ,
, -, ( ; -
), ..

,

326 Clarification meeting. A discussion that helps the customers achieve advance clarity consensus on the desired behavior of
each story by asking questions and getting examples. [Agile Testing, Lisa Crispin, Janet Gregory]




.
. -
:
( -
, -
).
. -
, -
.
,
: ,
, , . -
? .
: ,
,
.

.
.
-,
: -
.
,
, : -
, . .



2.5.6.
-
:
0. .
1. .
2. .
3. -
.
4. , , -
.
5. , .
6. , -
.
7. , .. 6 - .
.


.
-
, -
. -
, .
(
,
).
, :

, -
- ,
SOURCE_DIR.
SOURCE_DIR. .
Act:
SOURCE_DIR.
Exp:
,
SOURCE_DIR.
Req: -2.1.
- - -
- -





, :

- SOURCE_DIR - 1. -
,
- :
. : /SRC/
) - /DST/
SOURCE_DIR, /X/
, 2. SRC X
DESTINATION_DIR -
,
. .
3. SRC
) -
: )
SOURCE_DIR, -
-
,
SRC; )

X.
DESTINATION_DIR, -
4. .

, - : DST -
, -
. ; -
Act: - X
- DST.
(. ).
Exp:
SOURCE_DIR -
, -

Symbolic link [
] in SOURCE_DIR folder de-
tected. Remove it manually or restart ap-
plication with --force_file_operations key
to remove automatically.
Req: UR.56.BF.4.e.

- - -
-

- ,
is_file()
file_exists(). ,
. -
-
(. BR-999.99).

SOURCE_DIR -
: -
-
,
SOURCE_DIR, .. -

.





{174}, ,
, (
) - -,
.


, -
, . -
, , -
, .


( ),
( ), -
( ) -
, .
, , -
:
: , -
, -
.
: , ,
,
, .
:
, ,
- , .
:
, , -
,
.
:
-
, -
-
.
, ,
.


{169}, -
, , ..
,
.
, - -
, . ,



( ), -
, ( ) , ,
, ( -
,
).
, -
, ,
:
, .

( )
, , -
. , -
- -
, - , - - .
, ,
.

-
,
. , .



2.5.7.
-
- -{155}, ..
( ,
).


(summary).
, :
. -
:
?, ?, ?.
.
.

, , , -
:
.
19 .
.
.
.
Error when entering just the name of computer disk or folder.
No reaction to 'Enter' key.
, -
, -
, , -
.
-
.
{170}

(summary description). ,
, -
(, File (
Fille)), - -
, :
(
);
(
);
-
.
,
(-
, , -
).



-
, , ,
.
, (
: ).
,
OpenXML-?
, ? ? , -
? , (!),
,
. .
, , .
() , . -
:
(not a bug).
, . -
, ? -
? , ,
. , -
, , .. .

. ,
, (
):
Enter.
+.
"-
".
.
The application doesn't work with the from keyboard by the user in the field "What
to search".
. -
, ,
, :

1. . 1.
2. . .
3. . 2. -> -> -
4. . -> -> .
5. . 3. , -
6. . .
7. .
: - .
8.
.
9. , -
.
: - .



.
- ,
Alt+PrintScreen. , , -
-
.
, . -
,
.
, -
. : -


, .. :

/ -
() ,
;

, -
.

.
, ,
(
, ).
.
, , - -
( !) .
, ,
. : , -
.
, ,
. .


. -

(not a bug), - -
.
,
. , -
, - ,
. -
,
- , ,
. ? ? .



.
, -
,
.
, -
, ( -

-
). ,
? ?! .
. , -
, ,
.
( ) .
-
( -
), , -
, , -
.
. -
, ,
,
( , -
, ). , -
, - -
, ,
.
. , . -
, -
( -
, .. ): -
. , ? -
, .
( ,
):
: . (
.)
: -
. (. .)
:
. (! ,
, ,
.)



, -
.
,
. :

1. J: Data. 1.
2. Data ( ) -
song1.mp3 - .mp3.
999.99 Kb song2.mp3 2. ( -
888.88 Kb. -> , ->
3. J:\Data. ()
4. - 1).
. 3. .
5. .
:
: 2 .mp3.
.

, -
J:?
? ,
, . , , -
, .
.
, -
. , , -
. . . -
- ,
. (-
, ):
,
( ).
,
, -
, ,
memory_limit .
,
, -
(NULL- , ).
, ? -
. -
, . -
.
.
, , , -
, -
, , .. .. ,
,
.



.. .
(, -
, ). ,
, .
, .. : -
.
.



,

2.6. ,
2.6.1.
{147} -
,
. , ..
. , -
,
. -
{64} -
( , ,
) , ( -
) .
, -
, , :
?
? ,
?
?
?
?
?
, ?
, -
?

. ,
,
, .
{27}:
, -
(. 2.6.). ,
,
,
.

2.6.a ()



, :
, .
, , .
, -
, .
-
.
-
.
. . .
, ?
. , -
-
. , , ,
,
. ( , ,
,
.)
,
, () -
(,
, , ; , -
, ).
, . .
(planning327) -

-
.
:
;
;
;
.

(reporting328) -
( ,
).
:
, -
;
( );
( );

.
327 Planning is a continuous process of making entrepreneurial decisions with an eye to the future, and methodically organizing the
effort needed to carry out these decisions. There are four basic reasons for project planning: to eliminate or reduce uncertainty;
to improve efficiency of the operation; to obtain a better understanding of the objectives; to provide a basis for monitoring and
controlling work. [Project Management: A Systems Approach to Planning, Scheduling, and Controlling, Harold Kerzner]
328 Reporting: collecting and distributing performance information (including status reporting, progress measurement, and forecasting). [PMBOK, 3rd edition]



,
, .
, -
:
Project Management: A Systems Approach to Planning, Scheduling, and
Controlling, Harold Kerzner.
PMBOK (Project Management Body of Knowledge).
, -
( , ) -
.



-

2.6.2. -

-
- (test plan329) ,
, -
, , , , -
.
:
;
;
, ;
;
;

, .
, - -
. - -
{41}, :
( ).
( -
, ,
-
).

(, ).
- -
-
,
. -, ,
(-).
- (
, ).
(purpose).
( -{36}, -
,
-
).
, (features to be tested).
/ ,
.
.
, (features not to be tested). -
/ ,
. -

329 Test plan. A document describing the scope, approach, resources and schedule of intended test activities. It identifies amongst
others test items, the features to be tested, the testing tasks, who will do each task, degree of tester independence, the test
environment, the test design techniques and entry and exit criteria to be used, and the rationale for their choice, and any risks
requiring contingency planning. It is a record of the test planning process. [ISTQB Glossary]



-

-
-
. , -
, -
- -
.
(test strategy330) (test approach331).
, , -
, , ..
(criteria). :
o , (acceptance criteria332)
,
-
, .
o (entry criteria333) ,
. -

, .
o (suspension criteria334) -
, -
. -
, -
.
o (resumption criteria335) -
, -
( , ).
o (exit criteria336)
, . -
-
, ,
.
(resources). - -
, -
:
o ( ,
(
));

330 Test strategy. A high-level description of the test levels to be performed and the testing within those levels (group of test activities
that are organized and managed together, e.g. component test, integration test, system test and acceptance test) for an organi-
zation or program (one or more projects). [ISTQB Glossary]
331 Test approach. The implementation of the test strategy for a specific project. It typically includes the decisions made that follow
based on the (test) project's goal and the risk assessment carried out, starting points regarding the test process, the test design
techniques to be applied, exit criteria and test types to be performed. [ISTQB Glossary]
332 Acceptance criteria. The exit criteria that a component or system must satisfy in order to be accepted by a user, customer, or
other authorized entity. [ISTQB Glossary]
333 Entry criteria. The set of generic and specific conditions for permitting a process to go forward with a defined task, e.g. test phase.
The purpose of entry criteria is to prevent a task from starting which would entail more (wasted) effort compared to the effort
needed to remove the failed entry criteria. [ISTQB Glossary]
334 Suspension criteria. The criteria used to (temporarily) stop all or a portion of the testing activities on the test items. [ISTQB
Glossary]
335 Resumption criteria. The criteria used to restart all or a portion of the testing activities that were suspended previously. [ISTQB
Glossary]
336 Exit criteria. The set of generic and specific conditions, agreed upon with the stakeholders for permitting a process to be officially
completed. The purpose of exit criteria is to prevent a task from being considered completed when there are still outstanding
parts of the task which have not been finished. Exit criteria are used to report against and to plan when to stop testing. [ISTQB
Glossary]



-

o ( , -
);
o ( -
-
);
o (
);
o ( -
,
);
, .. -
.
(test schedule337). , ,
.
.. (milestones),
.
(roles and responsibility).
(, , -
) ,
.
(risk evaluation). , -
.
-
.
(documentation). -
, .
(metrics338). , -
, .. , ,
-.
, -
. .
(metric338) . -
.
. ,
-
. -
, , , , , ..
.
: 79 % ( .. 94 % -
), 63 % 71 %,
- 85 % 89 %. ,
,
.

337 Test schedule. A list of activities, tasks or events of the test process, identifying their intended start and finish dates and/or times,
and interdependencies. [ISTQB Glossary]
338 Metric. A measurement scale and the method used for measurement. [ISTQB Glossary]



-

( -
, ), - -
. .
:
, ,
(. -);
-
;
;
, ;
;
;
..
( ),
( ). -
-, .. -
,
(. 2.6.1).
2.6.1

( )
=

100%, =
,

- -
-, ,
- -,
-, -,
- ,
-. -,
, --
:
.
: 10 %.
: 40 %. :
: 85 %.
.
-
15 % -
-
.

, -
-
. 339:
() - ;
- (.
2.6.1);
-;
;
;
;
..

339 Important Software Test Metrics and Measurements Explained with Examples and Graphs [http://www.softwaretest-
inghelp.com/software-test-metrics-and-measurements/]



-

,
, ,
(
).
,
:

= 1,

,
,
, .
.

, ,
. :
, -
,
, , . ,
2.6.1
.
, -
.
, , , ..
.
(coverage340) , -
(coverage item341)
-.
:
( ,
-):

= 100%,

,
, -,
.
(, -
):

= 100%,
,
-, i- ,
-,
.

340 Coverage, Test coverage. The degree, expressed as a percentage, to which a specified coverage item has been exercised by a
test suite. [ISTQB Glossary]
341 Coverage item. An entity or property used as a basis for test coverage, e.g. equivalence partitions or code statements. [ISTQB
Glossary]



-

(,
-).

= 100%,
,
, -,
.
(,
-).

= 100%,
,
, -,
.
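All the variants above reduce to one generic formula implied by the ISTQB definitions of coverage and coverage item cited in the footnotes; a sketch in LaTeX notation (the symbol names are mine):

$\text{Coverage} = \frac{N_{covered}}{N_{total}} \times 100\%$

Here $N_{covered}$ is the number of coverage items (requirements, code statements, etc.) exercised by at least one test case, and $N_{total}$ is the total number of coverage items.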
-.
, (-
, , , ..) ,
-.
, ISTQB-
. -
, ISTQB- coverage.
-
- -
{55}. , , -
(, ,
).

-
-, -
, -
. - ,
, , .. .

-
, -
.
,
(. .)
-1.*: .
-2.*: , .
-3.*: .
-1.*: , .
-2.*: , .
-4: .
-5: .
-*: , .



-

,
-1: .
-2, -1, -2: PHP .
-1.1: -
, .
-3: .
-6: .


.

,
--
. , .. -
.
:
:
Windows Linux.
: .
, .. -
.

,
.
.


: 100 % - -
90 % - (. -
-) 100 %
(.
). - (. -
-) 80 %.
: .
: -
100 % -
(. -);
, 25 % -
- 50 % -
(. -).
: 50 % -
(.
).
: 80 % -
- (. -).

: ( Windows 7
Ent x64, Linux Ubuntu 14 LTS x64), PHP Storm 8.
: (8GB RAM, i7 3GHz).



-

:
o (100%-
). : ,
.
o PHP (100%-
). : .
: (40 ).
: .
.

25.05 .
26.05 - -
.
27.0528.05 ( -, -
).
29.05 .

: ,
.
: , -
, .

( ): -
-
( -
).
( ):
01.06, . -
, 28.05
, (29.05) .
: .

. , 25.05.
- . , -
26.0528.05.
. , -
29.05.

-:

= 100%,

-,
-,
-.
:
o : 10%.
o : 40%.
o : 80%.



-

:


= 100%,






,


.
:


10% 40% 50% 80%

15% 50% 75% 90%

20% 60% 100% 100%
:


= 100%,



, ,

,

.
:


60% 60% 60% 60%

65% 70% 85% 90%

70% 80% 95% 100%
-:
, 25% && < 50%
={ ,
, < 25% || 50%
,
,
.
-:

= 100%,

-,
-,
-, .
():
o : 80 %.
o : 95100 %.
-:

= 100%,

-,
- ,
.
:
o : 40 %.
o : 60 %.
o : 80 % ( 90 % ).



-

2.6.a: -
. , ,
.. ( ) -, , -
.
,
.


(test progress report342, test summary
report343) ,
, -
- .
:
;
- ( -
);

;
, ,
, -
.
,
. -
{41},
:
(
, -
).
(
,
).
-
( )
, .

.
, , (-). -
.
-
:
-
;
(-)
;

342 Test progress report. A document summarizing testing activities and results, produced at regular intervals, to report progress of
testing activities against a baseline (such as the original test plan) and to communicate risks and alternatives requiring a decision
to management. [ISTQB Glossary]
343 Test summary report. A document summarizing testing activities and results. It also contains an evaluation of the corresponding
test items against exit criteria. [ISTQB Glossary]



-

(-) -
-
, -
;
, -
, .
-
( , -
).
! - -
- , -
(, -
). -
,
.
(summary). -
, , .

, -
( , .. -
).
! -
{170}!

!
(test team). ,
, -
.
(testing process description). -
, -
.
(timetable). -
/ .
(new defects statistics). ,

( ).
(new defects list). -
.
(overall defects statistics). ,
-
( ).
, , -
.
(recommendations).
( -,
..)
, (summary),
, .



-

(appendixes). ( , -
).


-
, -
(. 2.6.b), -
, (summary)
(recommendations):
( ).
.
, .
.



2.6.b

:
. :

1.17. - 1.11. -
, (. 2.12.2).

1.23.
, , -
-
, -
(. 2.32.4).
-
- 1.28.
. Linux - -
SR-85 (. 2.5).

. :

1.8. - 1.8.
, (.
, - BR 834).
.
1.9.
1.9. -
(. BR 745,

BR 877, BR 878).
.
1.10.
1.10. ,
, -
.
.



-

. :

1.18. -
. !

1.19. -
-
.

1.20. , -
, -
.

1.21. - -
.

1.22. ,
.

:
. , , ..
. :

2.98. - 2.98. -
- . -
:
[, , ] -
- ;
[
, : - ] -
- cflk_n_r_coding (-
- , ).
.

. :

2.107. 2.107.
, -
Google. (. )
2.304. 2.304.
- 40-50% (
. 512 ).
2.402. 2.402.
.

cflk_n_r_flstm.



-

, , -
. :

2.212. - 2.212. :
. )
) [!]
2.245. -
)
.
2.245. -
2.278.


-
.
.
2.278.
, -
,
.

-
-
. :
?
?!
?
:

4.107. - 4.107. -
. (
63 % -
4.304. -
60 % ).
.
4.304. -
4.402.
, ..
.
21 -
(. 5.43)
, -
.
4.402.
, ..
30 -
-
R84.* R89.*.

, -
. ,
,
.. ,
( ),
, -
, ..
-

{55}. , ,
.



-


, ,
-
. , -
, , ..
.

. 2628 ,
100 % -
76 % - . 98 % -
. ,
( -
).
(29 ) -.
.

-

-

. -
(36) Windows 7 Ent x64
Linux Ubuntu 14 LTS x64 PHP 5.6.0. (.
http://projects/FC/Testing/SmokeTest) -
(. \\PROJECTS\FC\Testing\Aut\Scripts). -
(. http://projects/FC/Testing/CriticalPathTest) -
. -
( ), -
( 83 % -
).
.
,
27.05.2015 - 2
27.05.2015 2
27.05.2015 - 1

27.05.2015 2
27.05.2015 1
27.05.2015 2
28.05.2015 - 3
28.05.2015 1
28.05.2015 2
28.05.2015 - 1

28.05.2015 1
28.05.2015 1



-

.


23 2 12 7 2
17 0 9 6 2
13 0 5 6 2
- 1 0 0 1 0

3 0 2 1 0

.

BR 21
.
BR 22 .md -
.
23 .

.


34 5 18 8 3
25 3 12 7 3
17 0 7 7 3
- 1 0 0 1 0

4 0 3 1 0


40
35
30
25
20
15
10
5
0
26.05.2015 26.05.2015 27.05.2015 27.05.2015 28.05.2015 28.05.2015
1 2 3 4 5 6

. .



-

. .


200
180
160
140
120
100
80
60
40
20
0
26.05.2015 26.05.2015 27.05.2015 27.05.2015 28.05.2015 28.05.2015
1 2 3 4 5 6

Tsp Dftp Dfcp S Te Rc

2.6.b:
. ,
, .. ( ) , , -
.



2.6.3.
,
.
(man-hours344) , -
( -).

, - ,
:
?
?
- ?
?

,
.
. -
, -
, ,
.
. ,
, .


, ,
, -
, , -
, ,
- ( , ..).
, ,
100 % ,
( , ,

). ,
, (
).
. ,
, ,
, . -
, ,
. -,
, -
.

344 Man-hour. A unit for measuring work in industry, equal to the work done by one man in one hour. [http://dictionary.refer-
ence.com/browse/man-hour]



. -
(. ) ,
. -
-
-
, .
? .
. . -
. . ,
, , , -
.

, -
:
The Mythical Man Month, Frederick Brooks.
Controlling Software Projects, Tom De Marco.
Software engineering metrics and models, Samuel Conte.
:
. ,
, -
. .
. . -
: (-
, ), , -
.
.
, -
, , .
-
,
.
.
. -
, , -
.

. ,
.
:
( , -
) . -
, 510 % 30
40 %. .
: -
,
. -
, ,
. , ,
1.3 .
.



. ,
, N -
-, .. ,
- . -
,
.
. , , -
( !) ( ) ,
.. -
, ,
.
.
, - -
(, -, -
).
, -
.
, -
.
, ( ),
.
, -
. -
, -
.

:
, -
.


:
Essential Scrum, Kenneth Rubin.
Agile Estimating and Planning, Mike Cohn.
Extreme programming explained: Embrace change, Kent Beck.
PMBOK (Project Management Body of Knowledge).

Software Estimation Techniques Common Test Estimation Tech-
niques used in SDLC345.
(work breakdown structure, WBS346) -
-
, -
.

345 Software Estimation Techniques - Common Test Estimation Techniques used in SDLC [http://www.softwaretest-
ingclass.com/software-estimation-techniques/]
346 The WBS is a deliverable-oriented hierarchical decomposition of the work to be executed by the project team, to accomplish the
project objectives and create the required deliverables. The WBS organizes and defines the total scope of the project. The WBS
subdivides the project work into smaller, more manageable pieces of work, with each descending level of the WBS representing
an increasingly detailed definition of the project work. The planned work contained within the lowest-level WBS components,
which are called work packages, can be scheduled, cost estimated, monitored, and controlled. [PMBOK, 3rd edition]




, :
,
, -
;
-
( );

, , -
..

--
.
-
:
Test Effort Estimation Using Use Case Points347, Suresh Nageswaran.
Test Case Point Analysis348, Nirav Patel.
,
:
,
-;
-
( -, -,
..);
.
-2.4{57}:
-
, (-3.1),
, (.
-3.2).

, ,
:
,
.
,
, ( ) -
, :
o SOURCE_DIR DESTINATION_DIR: Directory
not exists or inaccessible.
o DESTINATION_DIR SOURCE_DIR: Destination dir
may not reside within source dir tree.
o LOG_FILE_NAME): Wrong file name or inaccessi-
ble path.

347 Test Effort Estimation Using Use Case Points, Suresh Nageswaran [http://www.bfpug.com.br/Artigos/UCP/Nageswaran-
Test_Effort_Estimation_Using_UCP.pdf]
348 Test Case Point Analysis, Nirav Patel [http://www.stickyminds.com/sites/default/files/article/file/2013/XUS373692file1_0.pdf]



- -
,
:
{1 -}.
/ :
o SOURCE_DIR {3 -};
o DESTINATION_DIR {3 -}.
LOG_FILE_NAME {3 -}.
SOURCE_DIR DESTINATION_DIR -
, DESTINATION_DIR
SOURCE_DIR {3 -}.
/
{5 -}.
SOURCE_DIR DESTINATION_DIR /-
, DESTINATION_DIR
SOURCE_DIR {3 -}.
22 -. -
, - (, 5)
.
2.6.a, -
. , -
, -
; -
, .
-, ,
.

:
: 11.5 ( ).
: 2.
: 35.
2.6.a -

12 22
() 1 1.2
12 26.4
-

, -
-. , -

,
( , ):
;
-;
;
;
;
.




,
: -
, --
, , , ..
, -
, .
, -
(28 ) :
300 - ( 11 - , 1.4 ).
1000 - ( 36 - , 4.5 ).
2.6.a 2.6.b.
2.6.b

12 22
() 1 1.2
12 26.4
-, 0.7 0.2
, 8.4 5.2
13.6
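Reading the rows above as number of test cases, average execution time, total time, automation coefficient, and remaining effort (my interpretation of the row labels), the totals are arithmetically consistent:

$12 \cdot 1 = 12$, $22 \cdot 1.2 = 26.4$, $12 \cdot 0.7 = 8.4$, $26.4 \cdot 0.2 \approx 5.2$, and $8.4 + 5.2 = 13.6$.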


,
, .. -
, , -
. , 28
,
.
,
,
- 8 , (-
, 6).
13.6 , 1.7 . -
, ,
.
,
-
,
. .
2.6.c: -{154}, -
2.4, -
.



2.7.
2.7.1. -
{148} -,
-
:
?
( )?
?
, .. ?
, -
, .. -
{77} {77} -. -
{55},
SOURCE_DIR , ,
.
? . , , ,
{56} Windows
Linux, -
. .
( )?
, .. -
(,
, -
, ). -
, .. -
.
? -
.
:
o Windows:
X:\dir
X:\dir with spaces
.\dir
..\dir
\\host\dir
\ .
X:\
o Linux:
/dir
/dir with spaces
host:/dir
smb://host/dir
./dir
../dir
/ .
/



-

, ..
-
( ).
- .
? , .
, {232}.
,
-, ..
. ,
, ..
.
, .. ?
- ( ) , -
. , - ( -
) ?
:
o ().
o :
Windows: 256 . (! -
256 ,
, ..
, .)
Linux: 4096 .
o , : ? < > \ * | " \0.
o , : ....\dir.
:
o .
o .
, .
:
o .
o .
o .
:
o Windows: com1-com9, lpt1-lpt9, con, nul, prn.
o Linux: ...
, : , -
.

, .
:
?
- ?
, ,
{147}. -
, , .. -
,
.
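Several of the checks just listed can be expressed as a compact PHP sketch (an illustration only: real Windows naming rules have further corner cases, and the function name is hypothetical):

function looks_like_valid_windows_name(string $name): bool
{
    // the length budget discussed above
    if ($name === '' || mb_strlen($name) > 256) {
        return false;
    }
    // characters forbidden in names: ? < > \ * | " and the NUL byte
    if (preg_match('/[?<>\\\\*|"\x00]/', $name)) {
        return false;
    }
    // reserved device names: com1-com9, lpt1-lpt9, con, nul, prn
    if (preg_match('/^(com[1-9]|lpt[1-9]|con|nul|prn)$/i', $name)) {
        return false;
    }
    return true;
}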



-

2.7.a: , -
, SOURCE_DIR
DESTINATION_DIR?



2.7.2.
-
{89} {90}.
, :
(equivalence class349) , -
.
(border condition, boundary condition350) ,
.
-, -
. -
, .. ,
.
. ,
, ,
.
:
.
, ,
.
,
[3, 20] ( 18-
63- , .. 2.4441614509104E+32). -
, -
21 , 100, 10000, , ..

(. 2.7.a).

3 20

0 2 21

2.7.a -

-
, , ,
(. 2.7.b).
:
[0, 2] ;
[3, 20] ;
[21, ] .

349 An equivalence class consists of a set of data that is treated the same by a module or that should produce the same result. [Lee
Copeland, A practitioner's guide to software test design]
350 The boundaries are the edges of each equivalence class. [Lee Copeland, A practitioner's guide to software test design]



3 20

0 2 21

2.7.b

, [0, 2] [21, ]
, ..
.
2.7.b 2, 3, 20 21. -
0 , .. -
, NULL, .. .
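For the [3, 20] interval this yields exactly the four boundary values named above (2, 3, 20, 21); a data-driven PHP sketch, where is_accepted() is a hypothetical call into the application under test:

// classes: [0, 2] invalid, [3, 20] valid, [21, ∞) invalid
$checks = [
    2  => false, // closest invalid neighbour of the left boundary
    3  => true,  // the left boundary itself
    20 => true,  // the right boundary itself
    21 => false, // closest invalid neighbour of the right boundary
];
foreach ($checks as $value => $expected) {
    assert(is_accepted($value) === $expected); // hypothetical application call
}
// empty input, NULL and other special values are checked separately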
-
( -
, , ..
, , ).
2.7.a - (
)
- -
- AAA 123_zzzzzzzzzzzzzzzz AA 1234_zzzzzzzzzzzzzzzz

- - -
-
- ,

-

. ,
, , . ,
.. -
( 2.7.c ASCII-),
,
.
, -
. ,
-
,
.

A-Z _ a-z
48 57 65 90 95 97 122

0 47 58 64 91 94 96 123 255

2.7.c
( ASCII-
)



,
, - (
). , ,
, (. 2.7.d).

A-Z
a-z


_

2.7.d
(
)
, -
, (
). 2.7.a
2.7.b.
2.7.b -
- -
- AAA 123_zzzzzzzzzzzzzzzz AA 1234_zzzzzzzzzzzzzzzz #$%


- - - - -

- - , ,
- -
-
-

, (,
) -
.
, ,
, .
{55} -
, -
{230}

-{229}.
, SOURCE_DIR, -
( ):
( ).
.
.
.
( ).
( ).
.



, .
, .
2.7.b:
?
, .
-351 ( 2.7.e).

SOURCE_DIR

2.7.e -

351 Concept map, Wikipedia [http://en.wikipedia.org/wiki/Concept_map]



-
, 2.7.e
(SOURCE_DIR) , -
.
2.7.e .
: SOURCE_DIR Windows
UTF16 .
Windows 256 :
[][:\][][null] = 1 + 2 + 256 + 1 = 260. 1
( ). ,
2.7.f.

1 256

0 257

2.7.f
352, , -
32767 , 260
.. . , , -
, 200
200 ,
400 ( 260).
,
,
, (,
SOURCE_DIR 100 ,
SOURCE_DIR 160 , -
260 ).
, -
2.7.f -
, ( ),
200 + 200 , .
2.7.c - -
2.7.e
- -
- . () C:\256 C:\257

-
-
- ,

, 2.7.e .
, , ,
-
.

352 Naming Files, Paths, and Namespaces, MSDN [https://msdn.microsoft.com/en-us/library/aa365247.aspx#maxpath]



2.7.3.
{90} :
(domain testing, domain analysis353) -
- , -
-
.

,
{232} . -
.
2.7.e , -
, :

o Windows
o Linux

o
o

o
o

o
o
, . -
(. 2.7.g). -
( -
),
( 2.7.g ).
,
( -
,
).
-
, .

. 2.7.d.
2.7.d
Windows Linux
+ +
+ +

-
( , +) -
, , , ..

353 Domain analysis is a technique that can be used to identify efficient and effective test cases when multiple variables can or
should be tested together. [Lee Copeland, A practitioner's guide to software test design]



- -

- -
Windows

SOURCE_DIR

- -
Linux

- -

2.7.g



( )
2.7.e.
2.7.e
Windows Linux
- + +
- -
+ +

+ +

( ) -
2.7.f. ,
,
. , .
2.7.f

Windows Linux Windows Linux

- - + +
-

- - - -


+ + + +


+ + + +

-
, -
. + -
(
),
.
, ,
, 2.7.f -
. ,
-
.
, -
.



2.7.4.
{90} :
(pairwise testing354) ,
-
-
.
. -
?
:
355, 359;
356;
IPO (in parameter order) 357;
358;
359.
-
359. -
360 355, -
361.

, :
- , -
-, .
2.7.e -
, ,
2.7.g. :
, : , : Windows
Linux ..
. -
-
, ..
.

354 The answer is not to attempt to test all the combinations for all the values for all the variables but to test all pairs of variables. [Lee
Copeland, A practitioner's guide to software test design]
355 Pairwise Testing, Michael Bolton [http://www.developsense.com/pairwiseTesting.html]
356 An Improved Test Generation Algorithm for Pair-Wise Testing, Soumen Maity and oth. [http://www.iiser-
pune.ac.in/~soumen/115-FA-2003.pdf]
357 A Test Generation Strategy for Pairwise Testing, Kuo-Chung Tai, Yu Lei
[http://www.cs.umd.edu/class/spring2003/cmsc838p/VandV/pairwise.pdf]
358 Evolutionary Algorithm for Prioritized Pairwise Test Data Generation, Javier Ferrer and oth.
[http://neo.lcc.uma.es/staff/javi/files/gecco12.pdf]
359 On the Construction of Orthogonal Arrays and Covering Arrays Using Permutation Groups, George Sherwood
[http://testcover.com/pub/background/cover.htm]
360 A Practitioner's Guide to Software Test Design, Lee Copeland.
361 Pairwise Testing: A Best Practice That Isnt, James Bach [http://citeseerx.ist.psu.edu/viewdoc/down-
load?doi=10.1.1.105.3811&rep=rep1&type=pdf].



2.7.g ,
-
-

2 25 32
2 2 2
2 3 155
2 4 28
2 7 23
2 3 16
2 4 4096
2 4 82
- 256 201600 34331384872960

, (
), 256 -
. ,
200 -.
, 11 .
-
-,
. . ,
, -
, , , .
, 0, com1
. ,
.
2.7.h

/ / - 1. X:\
/ - 2. X:\dir
/ 3. X:\
4. .\dir
5. ..\dir
6. \\host\dir
7. [256 Windows]
+ 2-6 \ .

8. /
9. /dir
10. /
11. host:/dir
12. smb://host/dir
13. ./dir
14. ../dir
15. [4096 Linux]
+ 9-14 / .

.
16. [0 ]
17. [4097 Linux]
18. [257 Windows]
19. "
20. //
21. \\



22. ..
23. com1-com9
24. lpt1-lpt9
25. con
26. nul
27. prn
1.
2.
1.
2.
3. ,
1. Windows 32 bit
2. Windows 64 bit
3. Linux 32 bit
4. Linux 64 bit
1. UTF8
2. UTF16
3. OEM

- 2736 (38*2*3*4*3),
200 , .
-
362 (, PICT)
. -
2.7.i. 152 -
, .. 1326 (201600 / 152) 18
(2736 / 152) .
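The input for such tools is typically a plain-text model listing each parameter with its values. A hypothetical fragment of a PICT model for the factors of table 2.7.h (parameter names are mine; only a few of the 38 path values are shown):

# fragment of a PICT model for the converter
Path:     X:\, X:\dir, .\dir, ..\dir, \\host\dir, /dir, ./dir, ../dir, smb://host/dir
OS:       Windows 32 bit, Windows 64 bit, Linux 32 bit, Linux 64 bit
Encoding: UTF8, UTF16, OEM

Feeding such a model to the tool (e.g. pict model.txt) yields a table of combinations in which every pair of values occurs at least once, analogous to table 2.7.i.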
2.7.i ,
/ / - -
/
/ -
-

1 X:\ - Windows 64 UTF8
bit
2 smb://host/dir/ , Windows 64 UTF16
bit
3 / Windows 32 OEM
bit
4 [0 ] Linux 32 bit UTF8
5 smb://host/dir - Linux 32 bit UTF16

6 ../dir , Linux 64 bit OEM

7 [257 Windows 64 OEM
Windows] bit
8 [4096 , Windows 32 UTF8
Linux] bit
9 [256 , Linux 32 bit OEM
Windows]
10 /dir/ Windows 32 UTF16
bit

,
, (, -
Windows Linux .

362 Pairwise Testing, Available Tools [http://www.pairwise.org/tools.asp]



4 8). , 124 . -
, -
{288} -
( , , Linux
, Windows). 85 -,
256 -,
.
2.7.c: -
{288} -
. , ?
,
.
.
, -
,
-. , -
.



2.7.5.
{80} {80} -
. ,
,
, -{79}. -
.
363 , -


, ,
. , -
, .
-, -
,
,
,
.
2.7.h.

2.7.h , -
-363
,
.
:
.
.
-.
, -
.
( ,
).

363 A Tutorial in Exploratory Testing, Cem Kaner [http://www.kaner.com/pdfs/QAIExploring.pdf]



363 -
, , -
, , , -
, , -
, ..
{55}. -
: , - (
, ) , -
. , :
: -1, -2, -3, -1.1, -1.2, -2.1, -
3.1, -3.2, -1.1, -1.2, -1.1, -2.1, -2.2, -2.3, -2.4, -3.1, -3.2
( ), -4.1, -4.2, -4.3.
, -
, . .

2.7.j
, , -
.
2.7.j

-1 , .. -
.
-2 , .
-3 Windows Linux.
-1.1 -
-2.1 . ,
-2.2 ( , ).
-2.3 . , 1.
-2.4
-1.2 . , 2.
-2.1 , .
-3.1 -
-3.2 , .. . . ,
-4.1 4.
-4.2
-4.3
-1.1 . , 3.
-1.2
-1.1 PHP 5.5.
-3.1 1-2 (.
-3.2 .



2.7.j, -
, (, -
-).



:
1. :
a. .
b. , , .
c. , , , , ,
.
2. Ctrl+C.
3. :
a. - - .
b. - -.
c. -, -.
4. .
5. ,
.
2.7.d: -
{147}, {229}, {232}, {237}, {240} -
, ?
, . , ,
2 ( ) 1 3).


, , -
,
, -
.
? -
, .. :
.
- , ( -
, , ,
, ).

(. 2.7.k).
2.7.k
/
0 )
, ( -
).
) , - , , -
_ _ _, ..

?
1 php converter.php Error: Too few command line parameters.
USAGE: php converter.php SOURCE_DIR .
DESTINATION_DIR [LOG_FILE_NAME]
Please note that DESTINATION_DIR may
NOT be inside SOURCE_DIR.
2 php converter.php zzz:/ c:/ Error: SOURCE_DIR name [zzz:] is not a , zzz:/ -
valid directory. zzz:.



3 php converter.php "c:/non/exist- Error: SOURCE_DIR name [c:\non\exist- -


ing/directory/" c:/ ing\directory] is not a valid directory. , -
: ?
, ,
.
4 php converter.php c:/ d:/ 2015.06.12 13:37:56 Started with parame-
ters: SOURCE_DIR=[C:\], DESTINA- , -
TION_DIR=[D:\], -. -
LOG_FILE_NAME=[.\converter.log] - -
?
5 php converter.php c:/ c:/ Error: DESTINATION_DIR [C:\] and .
SOURCE_DIR [C:\] mat NOT be the same must may.
dir.
6 php converter.php "c:/ Error: SOURCE_DIR name [c:\ : -
/" c:/ ] is not a valid directory. .
7 php converter.php / c:/Win- Error: SOURCE_DIR name [] is not a valid Linux: -
dows/Temp directory. , , -
/ - -
, / -
, -
Windows,
Linux.
8 : e: -- DVD- file_put_con- : PHP
. tents(e:f41c7142310c5910e2cfb57993b4d .
php converter.php c:/ e:/ 004620aa3b8): failed to open stream: Per-
mission denied in \classes\CLPAna-
lyser.class.php at line 70 Error: DESTINA-
TION_DIR [e] is not writeable.
9 php converter.php /var/www Error: SOURCE_DIR name [var/www] is : Linux
/var/www/1 not a valid directory. / -
, .. -
, Linux -

( -
, -
. ..).

(, , ):

, -
. , ,
, .
Windows , -
.
Linux / ,
.
, ,
Windows Linux.
, , -
, ( 2.7.k)
, -
-.
2.7.e: , 2.7.k
.



2.7.k -
. ? , -
? -
? ,
.

2.7.6.
{163},
, , -
. -
,
, (-
) .
(root cause analysis364)
, -
, , , ,
.
, -
, -. ,
, 2.7.i.

N-1

...

2.7.i
( ),
. ,
.
. -

364 Root cause analysis (RCA) is a process designed for use in investigating and categorizing the root causes of events with
safety, health, environmental, quality, reliability and production impacts. [James Rooney and Lee Vanden Heuvel, Root Cause
Analysis for Beginners, https://www.env.nm.gov/aqb/Proposed_Regs/Part_7_Excess_Emissions/NMED_Exhibit_18-
Root_Cause_Analysis_for_Beginners.pdf]



, -
( -
, ).
. -
:
.
( ).
.
. 2.7.k -
9{247} Linux
, ,
/, Linux .
, 2.7.i, -
2.7.l:
2.7.l

php ,
- converter.php /var/www /var/www/1 -
: Error:
SOURCE_DIR name [var/www] is not a valid
directory. , - /. -
. -
-

/
.


,
.
: ) ; )
.
2.7.l []

N : -
/ - :
. -
(php converter.php . .) , .
Windows (php converter.php c:\ d:\) : -
, - / (
. , \).
N-1 php converter.php \\\\\c:\\\\\ :
\\\\\d:\\\\\ php converter.php /////c://///
/////d:///// , / \,
Windows , -
: Started with .
parameters: SOURCE_DIR=[C:\],
DESTINATION_DIR=[D:\]

, -
/ \ -
Linux . ?



2.7.l []

N-2 : -
, - , -
-
. . -
. ,
, -
. ,
, ,
: -
.
private function getCanonicalName($name)
{
    // unify separators: turn every backslash into a forward slash
    $name = str_replace('\\', '/', $name);
    // split the path into its components...
    $arr = explode('/', $name);
    // ...reassemble with the platform separator, trimming leading
    // and trailing separators
    $name = trim(implode(DIRECTORY_SEPARATOR, $arr), DIRECTORY_SEPARATOR);
    return $name;
}

2.7.f: ,
/ \ (..
, ). ?

-
(. 2.7.j):

o ?
o ?

o ?
o ?
o ?

o ?
o -
?
o -
?

o .
o , .
, ,

o
.
o ,
.
.
: -
, , (
).



2.7.j
, -
.
, -
.



3:

3:

3.1.
3.1.1.
, {64},
, -
{71}: , , -
. -
2.3.b{71} -
, .
-
. ,
,
:
. 36 ,
{279},
.
-
(, ..) -
: , , (-
!) 100 ?
10? 20? ?
, .
.
-, -
, .
:
,
,
.
- {86}, -
,
. -
, ,
, ? . .
, , , -
-
.
, -
, .
, -
.
, -
,
, 23 , -
, , -
, -
.

, , ..



,
, -
. -
. -
, -
, -
(, -
) .
, -
,

?
,
{210} :
-, ;
- ;
-.

? , .
3.1.:

-


...

t
-

3.1. -

,
, , -
. , -
:

, ( -
, , ..).
, , -
, , , ,
.
-,
. -
, (



) -
: -
, , ( -
, ) -
.
-
, ..
(. ).
,

. -
:
, ,
.
,
, -
, -
, -
.
,
, 365, 366, 367, 368
.
, :
-
-, . , -
.
-.
, .
,
-. , -
, -
.
.
, -
.

365 Implementing Automated Software Testing Continuously Track Progress and Adjust Accordingly, Thom Garrett
[http://www.methodsandtools.com/archive/archive.php?id=94]
366 The Return on Investment (ROI) of Test Automation, Stefan Münch and others. [https://www.ispe.org/pe-ja/roi-of-test-automa-
tion.pdf]
367 The ROI of Test Automation, Michael Kelly [http://www.sqetraining.com/sites/default/files/articles/XDD8502filelistfile-
name1_0.pdf]
368 Cost Benefits Analysis of Test Automation, Douglas Hoffman [http://www.softwarequalitymethods.com/pa-
pers/star99%20model%20paper.pdf]



-
369:


= ,
+
,
;

, -
;


;

-
.

, ,

( 3.1.b). , :

= 30 ;

= 5 ;

= 300 .

3.1.b -

3.1.b, 12- -
13- . ,
.
.
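Under the reading that a manual run costs 30 minutes, an automated run 5 minutes, and developing the automated suite 300 minutes, the break-even point of figure 3.1.b can be verified directly:

$30n = 300 + 5n \Rightarrow 25n = 300 \Rightarrow n = 12$

so after the 12th run the cumulative costs are equal, and from the 13th run onward automation is cheaper.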

369 Introduction to automation, Vitaliy Zhyrytskyy.



3.1.2.
,
:
-, .
.
.
.
.
-
.

(. 3.1.a).
3.1.a
/
- ,
{82}. ,
, -
.
-
{81} , ,
. , ..
-
.
- -
{84} - , .
{84}. : , ,
: 100100 -
.
- -
- -
{102} ( .. .
{90}, {237}).
- -
{72}. -
,
.
- ,
{72}. , .. -

, .
- , , -
{84}. , .., ..
, -
- , .
- , -
{86}. . -
.
.
{74} - -
. -.
( -
) - ( ). -
. , -
(-, , ..)



, , , , -
- , -
/ - , -
- .. , -
. .
(, -
- 30000+ , -
(, - ).
..) .
,
-
. .
. , -
, , , -
..

, , ,
, . ,
,
.
( 3.1.b):
3.1.b
/
{203}. .
-{115}.
{165}.
-
{203}.
, (- .
) .
-,
(
).
,
. , -
-.
- - -
- , -
.
.
. ,
,
.
- , -
. -, -
.
- -
. ,
.
, .
-
. -
( ). -
:
-, -
.



, , ,
( - ,
{83}, - . -
{83} ..) , , .

: ,
. , -
, -
.



3.2.
3.2.1.
, -
, 3.2.a

( ).

- -

- - - -
- -

3.2.a

, -
, -
. : -
, , , , -
.. .
-
, . -

, -
. ,
.
:
( , -
) .

.


, , -
, , .
, : .

{264}.



-

3.2.2. -
( )
-, (, ,
) .. -, -
{115}.
, -
( ) -, -
.
, , -
- -
,
, - -
, .
- -
:
- -
.
:


7. - 7. : title =
. Search page, -
input type="text", input
type="submit" value="Go!",
logo.jpg -
(img).

- -
, , -
. :

1. Search. 1. Search.
2. clickAndWait - 2. -
. .

: -
,
-,
. :


8. 8.
WM_CLICK . (
).
9.
.

-
:
, , -
, -
.
- . :



-


1. Expand data. 1. Expand data.
2. - 2.
Unknown. Extended data (select id="ex-
tended_data"):
enabled.
3. Extended data -
Unknown.

-
(.. ).
/ , .
:

1. http://application/. 1. .


.
- , ,
. :


8. Search - 8.
WM_KEY_DOWN, {}, WM_KEY_UP, Search (
- -
. !)

- .
, -
, , -
-. :

1. , - 1. - Use stream buffer
file checked.
2.
( Start).
3.

, - ,
-
.. , . -
:

A hard-coded "today" works only on one date, while a computed one stays valid on any day:

if ($date_value == '2015.06.18') { ... }   // fails on any other day
if ($date_value == date('Y.m.d')) { ... }  // always compares with today

An accidental assignment (= instead of ==) makes the condition always true; a "Yoda condition" (constant first) turns the same typo into a parse error:

if ($status = 42) { ... }          // bug: assigns, never compares
if (POWER_USER == $status) { ... } // safe: "42 = $status" would not even parse



-

-
, , - -
- , ..
, .
- -
, : -
, -
, .. ,
, .
-
Selenium IDE370, -
. , -
, - id=cb (checked). -
, ,
- (enabled, editable),
.
What the tool recorded (checks the wrong property):
verifyEditable id=cb

What actually needs to be checked:
verifyChecked id=cb

.

, -

. ,
. - -
. /, .. -
, .. -
.
, , -,
,
, -
, .
- -
. -
- , .. -
3510
. - 50100
500 , .
:
: , ,
..
() :
,
(, 10 100
, ),
(, 10 , 100 , 1000 ..).

370 Selenium IDE. [http://docs.seleniumhq.org/projects/ide/]



-

: -
, - ..
, .. , -
(, -
,
doc(x)- ,
).
, --
. , ( 50100)
e-mail- ,
.

.
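Such data sets are easy to produce programmatically; a minimal sketch (the file counts and sizes here are illustrative):

<?php
// Generate sets of test files of increasing count (10, 100, 1000 files),
// each file holding roughly 10 KB of pseudo-random printable text:
foreach ([10, 100, 1000] as $count) {
    $dir = "generated_$count";
    if (!is_dir($dir)) {
        mkdir($dir);
    }
    for ($i = 0; $i < $count; $i++) {
        $payload = '';
        for ($line = 0; $line < 100; $line++) {
            $payload .= str_repeat(chr(mt_rand(32, 126)), 100) . "\n";
        }
        file_put_contents("$dir/file_$i.txt", $payload);
    }
}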



3.2.3.
-
, , ,
( , -
, ..)
,
, - ( -
, ).
3.2.a -

1 . , . ,
-
.
.
-
.

2 - - - -
- -
{88} (DDT). -
. . ,
-
-
.

3 - - -
- -
{88} . . .
(KDT). -



-.
4 , - -
. - . (
-
. ).

5 - - , -
(Record & , , -
Playback). - -.
- -. -
, -
-
. .



6 - -
- - -
{88} (BDT). - -
. . -
- ,
-
-
-
. -
-.

3.2.a
(. 3.2.b):











3.2.b
,
,
, -
.

(functional decomposition371)

.
-
, -

.
( 3.2.c): , -
( , )
(, ,
..).

371 Functional decomposition. The process of defining lower-level functions and sequencing relationships. [System Engineering
Fundamentals, Defense Acquisition University Press]



3.2.c -


, -
, .. , ,
.
.


( ,
) ,
, -
(Java, C#, PHP
..)
.

(
Windows Linux {279}).
:
, (, -
).
(,
-).
/ .
, ,
, .
, .
, , ,
.
-
3.2.b.



3.2.b


. , -
- , -
, (
. ).
- -
. ( , -
- , , ).
, - , -
, -
, - ( , -
- , -
. , ).
-
- ,
. .

(DDT)
, {279}
, ( -
, ). - -
. -
.
For example, the following PHP function reads a CSV file in which each line holds a pair of paths (etalon file, processed file) and compares each pair:
function compare_list_of_files($file_with_csv_data)
{
    // FILE_IGNORE_NEW_LINES keeps the trailing "\n" out of the last CSV field:
    $data = file($file_with_csv_data, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($data as $line) {
        // Each line: etalon_file,processed_file
        $parsed = str_getcsv($line);
        if (md5_file($parsed[0]) === md5_file($parsed[1])) {
            file_put_contents('smoke_test.log',
                "OK! File '".$parsed[0]."' was processed correctly!\n", FILE_APPEND);
        } else {
            file_put_contents('smoke_test.log',
                "ERROR! File '".$parsed[0]."' was NOT processed correctly!\n", FILE_APPEND);
        }
    }
}

The CSV data file (a fragment):
Test_ETALON/ WIN1251.txt,OUT/ WIN1251.txt
Test_ETALON/ CP866.txt,OUT/ CP866.txt
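With the CSV file saved, say, as smoke_data.csv (the name is illustrative), running the whole comparison is a single call:

compare_list_of_files('smoke_data.csv'); // results are appended to smoke_test.log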

CSV-
, - .

:
-
.



-
.
- , -
( -
{288}).
- -
, -
- (. -
{262}).
-
3.2.c.
3.2.c -


-. -
- .
. -
-
, - .
.
.
-
-. .
- ,
. - -
, -
- ,
, - -.
. -
.
- -
, -
.


Building on the previous example, the scenario now lives in a CSV file in which each line starts with a keyword:

moved: the file must have been moved from the source directory to the destination directory;

intact: the file must have remained untouched in the source directory;

equals: the processed file must be byte-identical to the etalon file.
function execute_test($scenario)
{
    // FILE_IGNORE_NEW_LINES keeps the trailing "\n" out of the last CSV field:
    $data = file($scenario, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($data as $line) {
        // Each line: keyword,source_path,destination_path
        $parsed = str_getcsv($line);
        switch ($parsed[0]) {
            // The file must have left the source dir and appeared in the destination dir:
            case 'moved':
                if (!is_file($parsed[1]) && is_file($parsed[2])) {
                    file_put_contents('smoke_test.log',
                        "OK! '".$parsed[1]."' file was moved!\n", FILE_APPEND);
                } else {
                    file_put_contents('smoke_test.log',
                        "ERROR! '".$parsed[1]."' file was NOT moved!\n", FILE_APPEND);
                }
                break;
            // The file must still be in the source dir and absent from the destination dir:
            case 'intact':
                if (is_file($parsed[1]) && !is_file($parsed[2])) {
                    file_put_contents('smoke_test.log',
                        "OK! '".$parsed[1]."' file is intact!\n", FILE_APPEND);
                } else {
                    file_put_contents('smoke_test.log',
                        "ERROR! '".$parsed[1]."' file is NOT intact!\n", FILE_APPEND);
                }
                break;
            // The processed file must be identical to the etalon file:
            case 'equals':
                if (md5_file($parsed[1]) === md5_file($parsed[2])) {
                    file_put_contents('smoke_test.log',
                        "OK! File '".$parsed[1]."' was processed correctly!\n", FILE_APPEND);
                } else {
                    file_put_contents('smoke_test.log',
                        "ERROR! File '".$parsed[1]."' was NOT processed correctly!\n", FILE_APPEND);
                }
                break;
        }
    }
}

The CSV scenario file (a fragment):
moved,IN/ WIN1251.txt,OUT/ WIN1251.txt
moved,IN/ CP866.txt,OUT/ CP866.txt
intact,IN/.jpg,OUT/.jpg
equals,Test_ETALON/ WIN1251.txt,OUT/ WIN1251.txt
equals,Test_ETALON/ CP866.txt,OUT/ CP866.txt
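Assuming the scenario above is saved as scenario.csv (the name is illustrative), the driver is again a single call:

execute_test('scenario.csv'); // each keyword line is interpreted and logged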

-
, -
, Selenium IDE370, -
:
action (keyword) | parameter 1 | parameter 2


, -
. ,
, -
CSV- ( ) XLSX-
.



-
(
)
. , ,
CSV- PHP, C#,
Java, Python -
.
-
3.2.d.
3.2.d -


(, , -
-. ) .
--
, . -.
- ( ) -
- ,
. - -
- .
-.
(
- , - -
. - -
).
.
( -
- .
). -
, -.


,
,
,
.
, ,
:
( -
) -
;
;
( ).
, -
, , . -
(, xUnit), -
(, Selenium), , -
..
, -

, - ..



,
. -
, Selenium WebDriver372.
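A minimal sketch of this library approach, assuming PHPUnit as the xUnit-family framework (the library class and the file names are illustrative):

<?php
use PHPUnit\Framework\TestCase;

// Hypothetical reusable library shared by many tests:
class FileToolsLibrary
{
    public static function filesAreIdentical($etalon, $candidate)
    {
        return md5_file($etalon) === md5_file($candidate);
    }
}

// The test itself only combines high-level library calls:
class ConverterSmokeTest extends TestCase
{
    public function testTxtFileIsProcessedCorrectly()
    {
        $this->assertTrue(
            FileToolsLibrary::filesAreIdentical('Test_ETALON/sample.txt', 'OUT/sample.txt')
        );
    }
}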
-
3.2.e.
3.2.e -


. .
-
. -
- .
, -
. .
. -
-

.
.
.

(Record & Playback)


(Record & Playback)
, -
. -
, , -
:
1. -, -
.
2.
( ).
3. .
4. -
.
, -
. , -
.

,
. -
3.2.f.

372 Selenium WebDriver Documentation [http://docs.seleniumhq.org/docs/03_webdriver.jsp]



3.2.f

( -:
, , ,
). -
- .
- ( -
. -
, (
( ) -
, , ).
..) , ..
(- - ,
, , ..) -
- -
- .
. ,
-,
, -
-
.
-
,
..

, -

-,
.


-
-
: -
( ).
, -
, -
.
,
Scenarios are written in a natural-language, business-readable form, typically following the given-when-then pattern:

Given describes the precondition: the state the system must be in before the action.

When describes the action being performed.

Then describes the expected, verifiable outcome.
An example:

Given the application is started with valid parameters, and a text file in a supported encoding is placed into the source directory.

When the application processes this file.

Then the destination directory contains a copy of the file converted to UTF-8.
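A minimal sketch of how such a scenario maps onto executable step definitions, assuming Behat (the step wording and the class name are illustrative):

<?php
use Behat\Behat\Context\Context;

// Hypothetical step definitions bound to the given-when-then phrases:
class ConverterContext implements Context
{
    /** @Given the application is started with valid parameters */
    public function applicationIsStarted()
    {
        // start converter.php with prepared source/destination directories...
    }

    /** @When the application processes a text file from the source directory */
    public function applicationProcessesFile()
    {
        copy('Test_IN/sample.txt', 'IN/sample.txt');
        sleep(10); // give the application time to react
    }

    /** @Then the destination directory contains the file converted to UTF-8 */
    public function destinationContainsConvertedFile()
    {
        assert(is_file('OUT/sample.txt'));
    }
}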



,
, -
-,
-, .. ,
-
-. (, Behat,
JBehave, NBehave, Cucumber), -
.
-
3.2.g.
3.2.g -


-
. ,
-
. -
-
- (, , .
, -
). -, -
-
.


(Test-driven Develop-
ment, TDD) , , (Red-Green-
Refactor), (Behavior-driven Devel-
opment), (Unit Testing) ..
,
.



3.3.
, -
-.
, -
, ,
, .
.
,
, -
. :

-
. -
-
, , -
.

, ,
. , , -
.
.
-
, ,
,
.
: ,
, .. .. -
( !) , .
. -
.
,
, -
.
-
. -
, -
. -


.
, --
, , -
.



4:


4.1. Testing career path373

373 [http://svyatoslav.biz/wp-pics/testing_career_path.png]



4.2. Answers and comments to the tasks

1.1.a{6} The answer: about 2e+42 years, i.e., roughly 1.66e+32 times the age of the Universe.
The calculation: (2^64 * 2^64 * 2^64) / (100000000 per second * 31536000 seconds in a year) =
1.9904559029003934436313386045179e+42 years.
1.1.b{8}
ISTQB Glossary:
http://www.istqb.org/downloads/glossary.html
1.2.a{10}
-
-, :
http://www.cambridgeenglish.org/test-your-english/
1.3.a{12} , . . -
, - -
(!!!) () () .
2.1.a{26} , , -

http://istqbexamcertification.com/what-are-the-software-development-
models - -
.
2.2.a{31}
, ? -
, , -
. , -
,
?
2.2.b{49}
: ) , -
. ) ,
.
-
, -
, - ,
.
2.2.c{52} , -
-
{58}. -
, .
2.2.d{57} :
- ? , .
2.2.e{57} -
. , -
, ..
, -
.



2.3.a{72} , , -
: ( , -
) -
, -
.
2.4.a{111} , -
, -, ,
-, (..
).
2.4.b{114} -? -
-
?
2.4.c{130} , - -
, -
. , -
/ -
.
2.4.d{134} -? -
, -
? (: -
, .. ,
UTF8,
, . -
- ,
- UTF8 .)
2.4.e{149} -
,
?
2.4.f{151} - ,
,
.
2.4.g{154} -
?
? ?
2.5.a{169} : -2.1,
, -,
. , ,
,
, -
,
. ! -
, ,
- -
, .. -

,
( ,
, .. -
, - - -
).



2.5.b{187} , - -
, -
. , -
/ -
.
2.6.a{215} :
http://www.softwaretestingclass.com/wp-
content/uploads/2013/01/Sample-Test-Plan-Template.pdf
2.6.b{222} :
https://www.assess.com/xcart/skin1/downloads/Sample_I4_output.pdf
2.6.c{228} ?
( --
, - )? ,
,
?
2.7.a{231} : , -
. ,
.
2.7.b{235}
.
- , -
,
.
-
, .
2.7.c{243} , ..
, -
. , , , 18
22 / ,
Linux, Windows ( -
). , -
.
2.7.d{246} , , -
?
?
2.7.e{247} 2.5.e{169} -
.
2.7.f{250} : , ..
/////C:/dir/name/. , -
,
. /
.




4.3. Smoke test scripts for Windows and Linux



CMD script for Windows

rem Switch the console to the UTF-8 code page (to handle file names correctly):
chcp 65001

rem Remove the log of the previous run:
del smoke_test.log /Q

rem Clear the input directory:
del IN\*.* /Q

rem Start the application under test:
start php converter.php IN OUT converter.log

rem Copy the prepared test files into the input directory:
copy Test_IN\*.* IN > nul

rem Wait 10 seconds so the application has time to process the files:
timeout 10

rem Stop the application under test:
taskkill /IM php.exe

rem =========================================================================
rem The first test: check that the files that must be processed
rem have appeared in the output directory,
rem and the files that must NOT be processed have not:
echo Processing test: >> smoke_test.log

IF EXIST "OUT\ WIN1251.txt" (


echo OK! ' WIN1251.txt' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' WIN1251.txt' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ CP866.txt" (


echo OK! ' CP866.txt' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' CP866.txt' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ KOI8R.txt" (


echo OK! ' KOI8R.txt' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' KOI8R.txt' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ win-1251.html" (


echo OK! ' win-1251.html' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' win-1251.html' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ cp-866.html" (


echo OK! ' cp-866.html' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' cp-866.html' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ koi8-r.html" (


echo OK! ' koi8-r.html' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' koi8-r.html' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ WIN_1251.md" (


echo OK! ' WIN_1251.md' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' WIN_1251.md' file was NOT processed! >> smoke_test.log
)




IF EXIST "OUT\ CP_866.md" (


echo OK! ' CP_866.md' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' CP_866.md' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ KOI8_R.md" (


echo OK! ' KOI8_R.md' file was processed! >> smoke_test.log
) ELSE (
echo ERROR! ' KOI8_R.md' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ .txt" (


echo ERROR! 'Too big' file was processed! >> smoke_test.log
) ELSE (
echo OK! 'Too big' file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\.jpg" (
echo ERROR! Picture file was processed! >> smoke_test.log
) ELSE (
echo OK! Picture file was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ TXT.txt" (


echo OK! Picture file with TXT extension was processed! >> smoke_test.log
) ELSE (
echo ERROR! Picture file with TXT extension was NOT processed! >> smoke_test.log
)

IF EXIST "OUT\ .md" (


echo OK! Empty was processed! >> smoke_test.log
) ELSE (
echo ERROR! Empty file was NOT processed! >> smoke_test.log
)
rem =========================================================================

rem =========================================================================
rem The second test: check that the files that must be moved
rem have disappeared from the input directory,
rem and the files that must NOT be moved are still there:
echo. >> smoke_test.log
echo Moving test: >> smoke_test.log

IF NOT EXIST "IN\ WIN1251.txt" (


echo OK! ' WIN1251.txt' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' WIN1251.txt' file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\ CP866.txt" (


echo OK! ' CP866.txt' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' CP866.txt' file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\ KOI8R.txt" (


echo OK! ' KOI8R.txt' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' KOI8R.txt' file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\ win-1251.html" (


echo OK! ' win-1251.html' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' win-1251.html' file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\ cp-866.html" (


echo OK! ' cp-866.html' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' cp-866.html' file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\ koi8-r.html" (


echo OK! ' koi8-r.html' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' koi8-r.html' file was NOT moved! >> smoke_test.log
)




IF NOT EXIST "IN\ WIN_1251.md" (


echo OK! ' WIN_1251.md' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' WIN_1251.md' file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\ CP_866.md" (


echo OK! ' CP_866.md' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' CP_866.md' file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\ KOI8_R.md" (


echo OK! ' KOI8_R.md' file was moved! >> smoke_test.log
) ELSE (
echo ERROR! ' KOI8_R.md' file was NOT moved! >> smoke_test.log
)
IF NOT EXIST "IN\ .txt" (
echo ERROR! 'Too big' file was moved! >> smoke_test.log
) ELSE (
echo OK! 'Too big' file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\.jpg" (


echo ERROR! Picture file was moved! >> smoke_test.log
) ELSE (
echo OK! Picture file was NOT moved! >> smoke_test.log
)

IF NOT EXIST "IN\ TXT.txt" (


echo OK! Picture file with TXT extension was moved! >> smoke_test.log
) ELSE (
echo ERROR! Picture file with TXT extension was NOT moved! >> smoke_test.log
)
rem =========================================================================

cls

rem =========================================================================
rem The third test: compare the processed files
rem with the etalon files:
echo. >> smoke_test.log
echo Comparing test: >> smoke_test.log

:st1
fc "Test_ETALON\ WIN1251.txt" "OUT\ WIN1251.txt" /B > nul
IF ERRORLEVEL 1 GOTO st1_fail
echo OK! File ' WIN1251.txt' was processed correctly! >> smoke_test.log
GOTO st2
:st1_fail
echo ERROR! File ' WIN1251.txt' was NOT processed correctly! >> smoke_test.log

:st2
fc "Test_ETALON\ CP866.txt" "OUT\ CP866.txt" /B > nul
IF ERRORLEVEL 1 GOTO st2_fail
echo OK! File ' CP866.txt' was processed correctly! >> smoke_test.log
GOTO st3
:st2_fail
echo ERROR! File ' CP866.txt' was NOT processed correctly! >> smoke_test.log

:st3
fc "Test_ETALON\ KOI8R.txt" "OUT\ KOI8R.txt" /B > nul
IF ERRORLEVEL 1 GOTO st3_fail
echo OK! File ' KOI8R.txt' was processed correctly! >> smoke_test.log
GOTO st4
:st3_fail
echo ERROR! File ' KOI8R.txt' was NOT processed correctly! >> smoke_test.log

:st4
fc "Test_ETALON\ win-1251.html" "OUT\ win-1251.html" /B > nul
IF ERRORLEVEL 1 GOTO st4_fail
echo OK! File ' win-1251.html' was processed correctly! >> smoke_test.log
GOTO st5
:st4_fail
echo ERROR! File ' win-1251.html' was NOT processed correctly! >> smoke_test.log




:st5
fc "Test_ETALON\ cp-866.html" "OUT\ cp-866.html" /B > nul
IF ERRORLEVEL 1 GOTO st5_fail
echo OK! File ' cp-866.html' was processed correctly! >> smoke_test.log
GOTO st6
:st5_fail
echo ERROR! File ' cp-866.html' was NOT processed correctly! >> smoke_test.log

:st6
fc "Test_ETALON\ koi8-r.html" "OUT\ koi8-r.html" /B > nul
IF ERRORLEVEL 1 GOTO st6_fail
echo OK! File ' koi8-r.html' was processed correctly! >> smoke_test.log
GOTO st7
:st6_fail
echo ERROR! File ' koi8-r.html' was NOT processed correctly! >> smoke_test.log

:st7
fc "Test_ETALON\ WIN_1251.md" "OUT\ WIN_1251.md" /B > nul
IF ERRORLEVEL 1 GOTO st7_fail
echo OK! File ' WIN_1251.md' was processed correctly! >> smoke_test.log
GOTO st8
:st7_fail
echo ERROR! File ' WIN_1251.md' was NOT processed correctly! >> smoke_test.log

:st8
fc "Test_ETALON\ CP_866.md" "OUT\ CP_866.md" /B > nul
IF ERRORLEVEL 1 GOTO st8_fail
echo OK! File ' CP_866.md' was processed correctly! >> smoke_test.log
GOTO st9
:st8_fail
echo ERROR! File ' CP_866.md' was NOT processed correctly! >> smoke_test.log

:st9
fc "Test_ETALON\ KOI8_R.md" "OUT\ KOI8_R.md" /B > nul
IF ERRORLEVEL 1 GOTO st9_fail
echo OK! File ' KOI8_R.md' was processed correctly! >> smoke_test.log
GOTO st10
:st9_fail
echo ERROR! File ' KOI8_R.md' was NOT processed correctly! >> smoke_test.log

:st10
fc "Test_ETALON\ .md" "OUT\ .md" /B > nul
IF ERRORLEVEL 1 GOTO st10_fail
echo OK! File ' .md' was processed correctly! >> smoke_test.log
GOTO end
:st10_fail
echo ERROR! File ' .md' was NOT processed correctly! >> smoke_test.log

:end
echo WARNING! File ' TXT.txt' has NO etalon decision, and it's OK for this file to be corrupted. >> smoke_test.log
rem =========================================================================

Bash script for Linux
#!/bin/bash

# Remove the log of the previous run:
rm -f smoke_test.log

# Clear the input directory:
rm -r -f IN/*

# Start the application under test:
php converter.php IN OUT converter.log &

# Copy the prepared test files into the input directory:
cp Test_IN/* IN/

# Wait 10 seconds so the application has time to process the files:
sleep 10

# Stop the application under test:
killall php




# ===========================================================================
# The first test: check that the files that must be processed have appeared
# in the output directory, and the files that must NOT be processed have not:
echo "Processing test:" >> smoke_test.log

if [ -f "OUT/ WIN1251.txt" ]
then
echo "OK! ' WIN1251.txt' file was processed!" >> smoke_test.log
else
echo "ERROR! ' WIN1251.txt' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ CP866.txt" ]
then
echo "OK! ' CP866.txt' file was processed!" >> smoke_test.log
else
echo "ERROR! ' CP866.txt' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ KOI8R.txt" ]
then
echo "OK! ' KOI8R.txt' file was processed!" >> smoke_test.log
else
echo "ERROR! ' KOI8R.txt' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ win-1251.html" ]
then
echo "OK! ' win-1251.html' file was processed!" >> smoke_test.log
else
echo "ERROR! ' win-1251.html' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ cp-866.html" ]
then
echo "OK! ' cp-866.html' file was processed!" >> smoke_test.log
else
echo "ERROR! ' cp-866.html' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ koi8-r.html" ]
then
echo "OK! ' koi8-r.html' file was processed!" >> smoke_test.log
else
echo "ERROR! ' koi8-r.html' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ WIN_1251.md" ]
then
echo "OK! ' WIN_1251.md' file was processed!" >> smoke_test.log
else
echo "ERROR! ' WIN_1251.md' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ CP_866.md" ]
then
echo "OK! ' CP_866.md' file was processed!" >> smoke_test.log
else
echo "ERROR! ' CP_866.md' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ KOI8_R.md" ]
then
echo "OK! ' KOI8_R.md' file was processed!" >> smoke_test.log
else
echo "ERROR! ' KOI8_R.md' file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ .txt" ]
then
echo "ERROR! 'Too big' file was processed!" >> smoke_test.log
else
echo "OK! 'Too big' file was NOT processed!" >> smoke_test.log
fi




if [ -f "OUT/.jpg" ]
then
echo "ERROR! Picture file was processed!" >> smoke_test.log
else
echo "OK! Picture file was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ TXT.txt" ]
then
echo "OK! Picture file with TXT extension was processed!" >> smoke_test.log
else
echo "ERROR! Picture file with TXT extension was NOT processed!" >> smoke_test.log
fi

if [ -f "OUT/ .md" ]
then
echo "OK! Empty file was processed!" >> smoke_test.log
else
echo "ERROR! Empty file was NOT processed!" >> smoke_test.log
fi
# ===========================================================================

# ===========================================================================
# The second test: check that the files that must be moved have disappeared
# from the input directory, and the files that must NOT be moved are still there:
echo "" >> smoke_test.log
echo "Moving test:" >> smoke_test.log

if [ ! -f "IN/ WIN1251.txt" ]
then
echo "OK! ' WIN1251.txt' file was moved!" >> smoke_test.log
else
echo "ERROR! ' WIN1251.txt' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ CP866.txt" ]
then
echo "OK! ' CP866.txt' file was moved!" >> smoke_test.log
else
echo "ERROR! ' CP866.txt' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ KOI8R.txt" ]
then
echo "OK! ' KOI8R.txt' file was moved!" >> smoke_test.log
else
echo "ERROR! ' KOI8R.txt' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ win-1251.html" ]
then
echo "OK! ' win-1251.html' file was moved!" >> smoke_test.log
else
echo "ERROR! ' win-1251.html' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ cp-866.html" ]
then
echo "OK! ' cp-866.html' file was moved!" >> smoke_test.log
else
echo "ERROR! ' cp-866.html' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ koi8-r.html" ]
then
echo "OK! ' koi8-r.html' file was moved!" >> smoke_test.log
else
echo "ERROR! ' koi8-r.html' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ WIN_1251.md" ]
then
echo "OK! ' WIN_1251.md' file was moved!" >> smoke_test.log
else
echo "ERROR! ' WIN_1251.md' file was NOT moved!" >> smoke_test.log
fi




if [ ! -f "IN/ CP_866.md" ]
then
echo "OK! ' CP_866.md' file was moved!" >> smoke_test.log
else
echo "ERROR! ' CP_866.md' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ KOI8_R.md" ]
then
echo "OK! ' KOI8_R.md' file was moved!" >> smoke_test.log
else
echo "ERROR! ' KOI8_R.md' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ .txt" ]
then
echo "ERROR! 'Too big' file was moved!" >> smoke_test.log
else
echo "OK! 'Too big' file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/.jpg" ]
then
echo "ERROR! Picture file was moved!" >> smoke_test.log
else
echo "OK! Picture file was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ TXT.txt" ]
then
echo "OK! Picture file with TXT extension was moved!" >> smoke_test.log
else
echo "ERROR! Picture file with TXT extension was NOT moved!" >> smoke_test.log
fi

if [ ! -f "IN/ .md" ]
then
echo "OK! Empty file was moved!" >> smoke_test.log
else
echo "ERROR! Empty file was NOT moved!" >> smoke_test.log
fi

# ===========================================================================

clear

# ===========================================================================
# The third test: compare the processed files with the etalon files:
echo "" >> smoke_test.log
echo "Comparing test:" >> smoke_test.log

if cmp -s "Test_ETALON/ WIN1251.txt" "OUT/ WIN1251.txt"


then
echo "OK! File ' WIN1251.txt' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' WIN1251.txt' was NOT processed correctly!" >> smoke_test.log
fi

if cmp -s "Test_ETALON/ CP866.txt" "OUT/ CP866.txt"


then
echo "OK! File ' CP866.txt' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' CP866.txt' was NOT processed correctly!" >> smoke_test.log
fi

if cmp -s "Test_ETALON/ KOI8R.txt" "OUT/ KOI8R.txt"


then
echo "OK! File ' KOI8R.txt' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' KOI8R.txt' was NOT processed correctly!" >> smoke_test.log
fi




if cmp -s "Test_ETALON/ win-1251.html" "OUT/ win-1251.html"


then
echo "OK! File ' win-1251.html' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' win-1251.html' was NOT processed correctly!" >> smoke_test.log
fi

if cmp -s "Test_ETALON/ cp-866.html" "OUT/ cp-866.html"


then
echo "OK! File ' cp-866.html' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' cp-866.html' was NOT processed correctly!" >> smoke_test.log
fi

if cmp -s "Test_ETALON/ koi8-r.html" "OUT/ koi8-r.html"


then
echo "OK! File ' koi8-r.html' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' koi8-r.html' was NOT processed correctly!" >> smoke_test.log
fi

if cmp -s "Test_ETALON/ WIN_1251.md" "OUT/ WIN_1251.md"


then
echo "OK! File ' WIN_1251.md' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' WIN_1251.md' was NOT processed correctly!" >> smoke_test.log
fi

if cmp -s "Test_ETALON/ CP_866.md" "OUT/ CP_866.md"


then
echo "OK! File ' CP_866.md' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' CP_866.md' was NOT processed correctly!" >> smoke_test.log
fi

if cmp -s "Test_ETALON/ KOI8_R.md" "OUT/ KOI8_R.md"


then
echo "OK! File ' KOI8_R.md' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' KOI8_R.md' was NOT processed correctly!" >> smoke_test.log
fi

if cmp -s "Test_ETALON/ .md" "OUT/ .md"


then
echo "OK! File ' .md' was processed correctly!" >> smoke_test.log
else
echo "ERROR! File ' .md' was NOT processed correctly!" >> smoke_test.log
fi

echo "WARNING! File ' TXT.txt' has NO etalon decision, and it's OK for this file to
be corrupted." >> smoke_test.log
# ===========================================================================

A sample smoke_test.log produced by these scripts (the ERROR lines indicate real deviations of the application from the expected behavior):
Processing test:
OK! ' WIN1251.txt' file was processed!
OK! ' CP866.txt' file was processed!
OK! ' KOI8R.txt' file was processed!
OK! ' win-1251.html' file was processed!
OK! ' cp-866.html' file was processed!
OK! ' koi8-r.html' file was processed!
OK! ' WIN_1251.md' file was processed!
OK! ' CP_866.md' file was processed!
OK! ' KOI8_R.md' file was processed!
OK! 'Too big' file was NOT processed!
OK! Picture file was NOT processed!
OK! Picture file with TXT extension was processed!

Moving test:
ERROR! ' WIN1251.txt' file was NOT moved!
ERROR! ' CP866.txt' file was NOT moved!
ERROR! ' KOI8R.txt' file was NOT moved!
ERROR! ' win-1251.html' file was NOT moved!
ERROR! ' cp-866.html' file was NOT moved!
ERROR! ' koi8-r.html' file was NOT moved!
ERROR! ' WIN_1251.md' file was NOT moved!




ERROR! ' CP_866.md' file was NOT moved!


ERROR! ' KOI8_R.md' file was NOT moved!
OK! 'Too big' file was NOT moved!
OK! Picture file was NOT moved!
ERROR! Picture file with TXT extension was NOT moved!

Comparing test:
ERROR! File ' WIN1251.txt' was NOT processed correctly!
ERROR! File ' CP866.txt' was NOT processed correctly!
ERROR! File ' KOI8R.txt' was NOT processed correctly!
ERROR! File ' win-1251.html' was NOT processed correctly!
ERROR! File ' cp-866.html' was NOT processed correctly!
ERROR! File ' koi8-r.html' was NOT processed correctly!
ERROR! File ' WIN_1251.md' was NOT processed correctly!
ERROR! File ' CP_866.md' was NOT processed correctly!
ERROR! File ' KOI8_R.md' was NOT processed correctly!
OK! File ' .md' was processed correctly!
WARNING! File ' TXT.txt' has NO etalon decision, and it's OK for this file to be corrupted.



4.4.
/ - -
/
/
/ -


1. X:\ - Windows 64 UTF8
bit
2. smb://host/dir Linux 32 bit UTF16
3. ../dir , - Linux 64 bit OEM

4. [257 Windows 64 OEM
bit
Windows]
5. smb://host/dir/ - Linux 64 bit UTF8

6. nul , - Windows 64 OEM
bit
7. \\ Linux 64 bit UTF16
8. /dir , - Linux 32 bit OEM

9. ./dir/ Linux 32 bit OEM
10. ./dir - Linux 64 bit UTF8

11. smb://host/dir Linux 64 bit UTF8
12. \\host\dir\ - Linux 32 bit UTF8

13. host:/dir Windows 32 UTF8
bit
14. .\dir\ Windows 64 UTF8
bit
15. [0 ] Windows 32 UTF16
bit
16. [4097 Linux 32 bit UTF16
Linux]
17. ..\dir\ Windows 32 UTF16
bit
18. "/ - - Windows 32 OEM
/" bit
19. smb://host/dir/ Linux 32 bit OEM
20. nul Windows 32 UTF8
bit
21. "/ - Linux 32 bit OEM
"
22. host:/dir/ Windows 64 UTF8
bit
23. ../dir Windows 64 UTF16
bit
24. ./dir/ Linux 64 bit UTF16
25. [257 Windows 32 UTF16
bit
Windows]
26. "/ - Linux 64 bit UTF8
/"
27. .. Windows 32 UTF8
bit
28. host:/dir/ Linux 64 bit OEM



29. X:\dir\ - Windows 64 UTF8


bit
30. \\ , - Windows 64 UTF8
bit
31. // Windows 64 UTF8
bit
32. ..\dir\ , - Windows 64 OEM
bit
33. X:\dir Windows 64 OEM
bit
34. "X:\ - Windows 64 UTF16
\" bit
35. \\host\dir\ Windows 32 UTF16
bit
36. [256 - Windows 32 UTF8
bit
Windows]
37. [4096 Linux 64 bit UTF16
Linux]
38. /dir/ - Linux 64 bit UTF8

39. [256 - Windows 64 OEM
bit
Windows]
40. .\dir - Windows 32 UTF16
bit
41. // , - Windows 32 OEM
bit
42. prn , - Windows 64 UTF16
bit
43. ..\dir , - Windows 64 UTF16
bit
44. \\host\dir\ Windows 64 UTF16
bit
45. ../dir/ , - Linux 64 bit UTF8

46. .. Linux 32 bit OEM
47. ..\dir Windows 32 UTF8
bit
48. /dir Linux 64 bit UTF8
49. " Windows 32 UTF8
bit
50. ../dir/ - Linux 32 bit UTF16

51. .\dir Windows 64 OEM
bit
52. host:/dir/ , - Linux 32 bit UTF16

53. "/ - - Linux 64 bit UTF16
"
54. com1-com9 , - Windows 64 UTF16
bit
55. lpt1-lpt9 Windows 32 UTF8
bit
56. [0 ] Linux 64 bit UTF16
57. \\host\dir , - Windows 32 UTF16
bit
58. "X:\ - Windows 64 UTF16
" bit
59. \\host\dir Linux 64 bit UTF8



60. lpt1-lpt9 Windows 64 UTF8


bit
61. "X:\ - - Windows 32 OEM
" bit
62. host:/dir - Linux 32 bit OEM

63. X:\ Windows 32 OEM
bit
64. \\ Windows 32 OEM
bit
65. [4096 - Linux 32 bit UTF8
Linux]
66. \\host\dir - Windows 64 OEM
bit
67. " , - Linux 32 bit OEM

68. con - Windows 32 UTF16
bit
69. ../dir Linux 32 bit UTF16
70. X:\dir - Windows 32 OEM
bit
71. ./dir - Linux 32 bit UTF16

72. // - Linux 32 bit UTF16

73. host:/dir , - Linux 64 bit UTF8

74. / - Linux 64 bit UTF8

75. "X:\ - , - Windows 32 OEM
\" bit
76. .\dir\ , - Windows 32 OEM
bit
77. // Linux 64 bit OEM
78. X:\dir\ Windows 32 UTF8
bit
79. " , - Linux 64 bit UTF16

80. / - Linux 32 bit UTF16

81. .. - Windows 64 UTF16
bit
82. com1-com9 , - Windows 32 OEM
bit
83. .. , - Linux 64 bit OEM

84. /dir/ - Linux 32 bit UTF16

85. [4097 - Linux 64 bit UTF16
Linux]



4.5. Glossary
-
, . -
,
, .

English term and the page where it is discussed:

Automated testing {71}
Alpha testing {79}
Root cause analysis {248}
Beta testing {79}
Border condition, boundary condition {232}
Defect, anomaly {164}
Dynamic testing {68}
Smoke test {74}
Code review, code inspection {92}
Integration testing {72}
Equivalence class {232}
White box testing {68}
Gray box testing {69}
Black box testing {69}
Metric {208}
Software Development Model {18}
Unit testing, component testing {72}
Test case suite, test suite, test set {141}
Negative testing {77}
Non-functional testing {81}
Non-functional requirements {38}
Defect report {165}
Test progress report, test summary report {215}
Reporting {204}
Planning {204}
Positive testing {77}
Coverage {210}
Acceptance testing {82}
Extended test {76}
Regression testing {82}
Manual testing {70}
System testing {73}
Static testing {68}
Work breakdown structure, WBS {225}
Test {115}
Critical path test {75}
Data-driven testing {88}
Keyword-driven testing {88}
Behavior-driven testing {88}
Software testing {6}
Performance testing {86}
Test case {115}
Test plan {206}
Requirement {29}
Man-hours {223}
Functional decomposition {265}
Functional testing {80}
Functional requirements {38}
Checklist {110}



5:


Creative Commons Attribu-


tion-NonCommercial-ShareAlike 4.0 International374.
.
, , -
, : http://svyatoslav.biz/software_testing_book/.
, -
stb@svyatoslav.biz.

***

, ,
: MySQL, MS SQL Server
Oracle .
: 3 , 50+ , 130+ , 500+ -
. SELECT *
; ,
, . , : -
SQL, ;
SQL, ; -
SQL-.
: http://svyatoslav.biz/database_book/

374 Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International. [https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode]
