
ISSN 2347-1921 · Volume 12 Number 05 · Journal of Advances in Mathematics
## The asymptotic behaviour of threshold-based classification rules in case of three prescribed classes
Oksana Kubaychuk
Department of Computer Science, National University of Food Technologies, Kyiv, Ukraine

kubaychuk@gmail.com
ABSTRACT
We consider the problem of classifying an object from an observation of its numerical characteristic in the case of three prescribed classes. We also study the construction and the asymptotic behaviour of threshold-based classification rules built from a sample from a mixture with varying concentrations.

Keywords
classification rule; mixture with varying concentrations; estimator; threshold.

## Academic Discipline And Sub-Disciplines

Mathematics, Probability and Statistics

SUBJECT CLASSIFICATION
62G05, 62G20, 60G15

1. INTRODUCTION
Classifying an object by its numerical characteristic is an important theoretical problem with practical significance: for example, a person is declared not healthy if their body temperature exceeds 37°C. To solve this problem we consider the threshold-based rule. According to this rule, an object is classified as belonging to the first class if its characteristic does not exceed the threshold 37°C; otherwise, the object is classified as belonging to the second class. The empirical Bayes classification (EBC) (Devroye and Györfi, 1985; Ivanko and Maiboroda, 2002) and the minimization of the empirical risk (ERM) (Vapnik, 1989; Vapnik, 1996) are widely used methods to estimate the best threshold. The case when the learning sample is obtained from a mixture with varying concentrations is considered in (Ivanko and Maiboroda, 2006).

However, it is often necessary to classify an object using more than one threshold: for example, a person is declared not healthy if their body temperature exceeds 37°C or is below 36°C. Another example: a person is sick if their haemoglobin level exceeds 84 units or is below 72 units. This problem is discussed, in particular, in (Kubaychuk, 2008; Kubaychuk, 2010).

In all the previous examples there are only two prescribed classes. The case of two thresholds and three prescribed classes deserves special attention. An example is the classification of disease stages. During the diagnosis of breast cancer, the tumor marker CA 15-3 is used. If its value is less than 22 IU/ml, the person is healthy; if the level lies in the range from 22 to 30 IU/ml, a precancerous condition can be diagnosed; if the index is above 30 IU/ml, the patient has cancer. When solving some technical problems one also has to consider a substance in its various aggregate states: gaseous, liquid, and solid. The transition from state to state occurs at a specific temperature; accordingly, a boiling point and a melting point are used.
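The CA 15-3 example above is already a two-threshold, three-class rule; a minimal sketch (the cutoffs 22 and 30 IU/ml are taken from the text, the function name is ours):

```python
def ca15_3_stage(level_iu_ml: float) -> str:
    """Three-class rule with two thresholds, as in the CA 15-3 example."""
    if level_iu_ml < 22:
        return "healthy"
    if level_iu_ml <= 30:
        return "precancerous"
    return "cancer"
```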

## 2. THE SETTING OF THE PROBLEM

The problem of classifying an object $O$ from an observation of its numerical characteristic $\xi(O)$ is studied. We assume that the object may belong to one of three prescribed classes. The (unknown) number of the class containing $O$ is denoted by $\mathrm{ind}(O)$. A classification rule (briefly, a classifier) is a function $g : \mathbb R \to \{1,2,3\}$ that assigns a value to $\mathrm{ind}(O)$ by using the characteristic $\xi$. In general, a classification rule may be any measurable function, but in this paper we restrict consideration to the so-called threshold-based classification rules of the six forms

$$g^1_{t_1,t_2}(x) = \begin{cases}1, & x \le t_1,\\ 2, & t_1 < x \le t_2,\\ 3, & x > t_2,\end{cases}\qquad
g^2_{t_1,t_2}(x) = \begin{cases}2, & x \le t_1,\\ 1, & t_1 < x \le t_2,\\ 3, & x > t_2,\end{cases}\qquad
g^3_{t_1,t_2}(x) = \begin{cases}1, & x \le t_1,\\ 3, & t_1 < x \le t_2,\\ 2, & x > t_2,\end{cases}$$

6261 | Page · June 2016 · Council for Innovative Research · www.cirworld.com

$$g^4_{t_1,t_2}(x) = \begin{cases}3, & x \le t_1,\\ 2, & t_1 < x \le t_2,\\ 1, & x > t_2,\end{cases}\qquad
g^5_{t_1,t_2}(x) = \begin{cases}3, & x \le t_1,\\ 1, & t_1 < x \le t_2,\\ 2, & x > t_2,\end{cases}\qquad
g^6_{t_1,t_2}(x) = \begin{cases}2, & x \le t_1,\\ 3, & t_1 < x \le t_2,\\ 1, & x > t_2.\end{cases}$$
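The six rules are simply the six ways to assign the three class labels to the intervals $(-\infty,t_1]$, $(t_1,t_2]$, $(t_2,\infty)$; a minimal sketch:

```python
# Label patterns of the six threshold rules g^k: (left, middle, right interval).
RULES = {1: (1, 2, 3), 2: (2, 1, 3), 3: (1, 3, 2),
         4: (3, 2, 1), 5: (3, 1, 2), 6: (2, 3, 1)}

def g(k: int, t1: float, t2: float, x: float) -> int:
    """Apply rule g^k with thresholds t1 <= t2 to a characteristic x."""
    assert t1 <= t2
    left, middle, right = RULES[k]
    if x <= t1:
        return left
    if x <= t2:
        return middle
    return right
```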

The characteristic $\xi(O)$ is assumed to be random, and its distribution depends on $\mathrm{ind}(O)$: if $\mathrm{ind}(O) = i$, then $\xi(O)$ has the distribution function $H_i$ with density $h_i$, $i = 1,2,3$; $p_i$ denotes the probability that the object belongs to the $i$-th class. The quality of a rule $g_t$ from the class $G = \{g_t : t = (t_1, t_2) \in \mathbb R^2,\ t_1 \le t_2\}$ is measured by its risk $L(g_t)$. For the rule $g^1$,

$$L(g^1_t) = (p_2-p_3)H_1(t_1) - (p_1-p_3)H_2(t_1) + (p_3-p_1)H_2(t_2) - (p_2-p_1)H_3(t_2) + p_1 + p_2.$$

Analogically,

$$L(g^4_t) = (p_1-p_3)H_2(t_1) - (p_1-p_2)H_3(t_1) + (p_1-p_3)H_2(t_2) - (p_2-p_3)H_1(t_2) + p_2 + p_3.$$

Furthermore,

$$L^1(t_2,t_1) + L^4(t_1,t_2) = 2p_2 + p_3 + p_1.$$

Further, similarly,

$$L(g^2_t) = (p_1-p_3)H_2(t_1) - (p_2-p_3)H_1(t_1) + (p_3-p_2)H_1(t_2) - (p_1-p_2)H_3(t_2) + p_2 + p_1,$$
$$L(g^5_t) = (p_1-p_2)H_3(t_1) - (p_3-p_2)H_1(t_1) + (p_3-p_1)H_2(t_2) + (p_2-p_3)H_1(t_2) + p_3 + p_1,$$
$$L^2(t_2,t_1) + L^5(t_1,t_2) = 2p_1 + p_3 + p_2,$$
$$L(g^3_t) = (p_3-p_2)H_1(t_1) - (p_3-p_1)H_2(t_2) + (p_2-p_1)\big(H_3(t_2) + H_3(t_1)\big) + p_3 + p_1,$$
$$L(g^6_t) = (p_3-p_1)H_2(t_1) - (p_3-p_2)H_1(t_2) - (p_2-p_1)\big(H_3(t_2) + H_3(t_1)\big) + p_2 + p_3,$$
$$L^3(t_2,t_1) + L^6(t_1,t_2) = 2p_3 + p_1 + p_2.$$
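As a numerical sanity check of the risk expressions, the sketch below evaluates $L(g^1)$ and $L(g^4)$ for illustrative Gaussian class distributions and verifies the swap relation $L^1(t_2,t_1) + L^4(t_1,t_2) = 2p_2 + p_3 + p_1$ stated above (the concrete $p_i$ and $H_i$ are our assumptions; the relation itself holds for any distributions):

```python
from math import erf, sqrt

def Phi(x, mu):  # Gaussian cdf with unit variance, used as an illustrative H_i
    return 0.5 * (1.0 + erf((x - mu) / sqrt(2.0)))

p1, p2, p3 = 0.2, 0.3, 0.5
H1 = lambda x: Phi(x, 0.0)
H2 = lambda x: Phi(x, 2.0)
H3 = lambda x: Phi(x, 4.0)

def L1(t1, t2):  # risk of g^1
    return ((p2 - p3) * H1(t1) - (p1 - p3) * H2(t1)
            + (p3 - p1) * H2(t2) - (p2 - p1) * H3(t2) + p1 + p2)

def L4(t1, t2):  # risk of g^4
    return ((p1 - p3) * H2(t1) - (p1 - p2) * H3(t1)
            + (p1 - p3) * H2(t2) - (p2 - p3) * H1(t2) + p2 + p3)

# The swap relation holds exactly, term by term.
for (t1, t2) in [(-1.0, 1.0), (0.5, 3.0), (1.0, 1.0)]:
    assert abs(L1(t2, t1) + L4(t1, t2) - (2 * p2 + p3 + p1)) < 1e-12
```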

A classification rule $g^B \in G$ is called a Bayes classification rule in the class $G$ if $L(g)$ attains its minimum at $g^B$ ($g^B = \arg\min_{g\in G} L(g_t)$). The threshold $t^B$ of a Bayes classification rule is called the Bayes threshold:

$$t^B = \arg\min_{t\in\mathbb R^2} L(t). \qquad (1)$$

For $L^i(t)$, $i = \overline{1,6}$, we have $t^B_i = \arg\min_{t_1,t_2} L^i(t_1,t_2) = \big(\arg\min_{t_1} L^i_1(t_1),\ \arg\min_{t_2} L^i_2(t_2)\big)$, and

$$L^1_1(t_1) = (p_2-p_3)H_1(t_1) - (p_1-p_3)H_2(t_1),$$
$$L^1_2(t_2) = (p_3-p_1)H_2(t_2) - (p_2-p_1)H_3(t_2) + p_1 + p_2,$$
$$L^2_1(t_1) = (p_1-p_3)H_2(t_1) - (p_2-p_3)H_1(t_1),$$
$$L^2_2(t_2) = (p_3-p_2)H_1(t_2) - (p_1-p_2)H_3(t_2) + p_1 + p_2,$$
$$L^3_1(t_1) = (p_3-p_2)H_1(t_1) - (p_1-p_2)H_3(t_1),$$
$$L^3_2(t_2) = (p_2-p_1)H_3(t_2) - (p_3-p_1)H_2(t_2) + p_1 + p_3,$$
$$L^4_1(t_1) = (p_1-p_3)H_2(t_1) - (p_1-p_2)H_3(t_1),$$
$$L^4_2(t_2) = (p_1-p_3)H_2(t_2) - (p_2-p_3)H_1(t_2) + p_3 + p_2,$$
$$L^5_1(t_1) = (p_1-p_2)H_3(t_1) - (p_3-p_2)H_1(t_1),$$
$$L^5_2(t_2) = (p_3-p_1)H_2(t_2) + (p_2-p_3)H_1(t_2) + p_1 + p_3,$$
$$L^6_1(t_1) = (p_3-p_1)H_2(t_1) - (p_2-p_1)H_3(t_1),$$
$$L^6_2(t_2) = (p_1-p_2)H_3(t_2) - (p_3-p_2)H_1(t_2) + p_3 + p_2.$$
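A quick internal-consistency check of the decomposition: differentiating $L^1_1$ and $L^1_2$ term by term gives $(L^1_1)'(t) = (p_2-p_3)h_1(t) - (p_1-p_3)h_2(t)$ and $(L^1_2)'(t) = (p_3-p_1)h_2(t) - (p_2-p_1)h_3(t)$, whose zeros are the candidate Bayes thresholds (up to sign these are the functions used below to build $T^1_N$ and $T^2_N$). The sketch verifies this with finite differences for illustrative Gaussian $H_i$ (the $p_i$ and the means are our assumptions):

```python
from math import erf, exp, pi, sqrt

p1, p2, p3 = 0.2, 0.3, 0.5
mus = (0.0, 2.0, 4.0)

def H(i, x):   # Gaussian cdf of class i (illustrative H_i, unit variance)
    return 0.5 * (1.0 + erf((x - mus[i - 1]) / sqrt(2.0)))

def h(i, x):   # Gaussian density of class i (illustrative h_i)
    return exp(-(x - mus[i - 1]) ** 2 / 2.0) / sqrt(2.0 * pi)

def L11(t):    # L^1_1
    return (p2 - p3) * H(1, t) - (p1 - p3) * H(2, t)

def L12(t):    # L^1_2
    return (p3 - p1) * H(2, t) - (p2 - p1) * H(3, t) + p1 + p2

def dL11(t):   # claimed derivative of L^1_1
    return (p2 - p3) * h(1, t) - (p1 - p3) * h(2, t)

def dL12(t):   # claimed derivative of L^1_2
    return (p3 - p1) * h(2, t) - (p2 - p1) * h(3, t)

eps = 1e-6
for t in (-1.0, 0.8, 2.0, 3.5):
    assert abs((L11(t + eps) - L11(t - eps)) / (2 * eps) - dL11(t)) < 1e-6
    assert abs((L12(t + eps) - L12(t - eps)) / (2 * eps) - dL12(t)) < 1e-6
```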
Let us construct the threshold rule from the data. We assume that the observations $\xi_{1:N},\dots,\xi_{N:N}$ are independent and that

$$P\{\xi_{j:N} \le x\} = w^1_{j:N}H_1(x) + w^2_{j:N}H_2(x) + w^3_{j:N}H_3(x).$$

Here $w^i_{j:N}$, $i = \overline{1,3}$, is the known concentration of objects of the $i$-th class in the mixture at the moment when the observation $j$ is made (Maiboroda, 2003); the sample size $N$ is fixed and $\sum_{i=1}^{3} w^i_{j:N} = 1$.

To estimate $H_i$, the weighted empirical distribution function

$$\hat H^N_i(x) = \frac 1N \sum_{j=1}^{N} a^i_{j:N}\,\mathbb 1\{\xi_{j:N} \le x\}$$

is used, where $\mathbb 1\{A\}$ denotes the indicator of an event $A$ and the weights are

$$a^k_{j:N} = \frac{1}{\det\Gamma_N}\sum_{i=1}^{3}\gamma_{ki}\,w^i_{j:N},$$

defined if $\det\Gamma_N \ne 0$, where $\gamma_{ki}$ is the $(k,i)$ cofactor of the matrix $\Gamma_N = \big(\langle w^k, w^l\rangle\big)_{k,l=1}^{3}$ of the inner products

$$\langle w^k, w^l\rangle = \frac 1N \sum_{j=1}^{N} w^k_{j:N}\, w^l_{j:N}.$$

To estimate the densities $h_i$, the kernel estimator

$$\hat h^N_i(x) = \frac{1}{Nk_N}\sum_{j=1}^{N} a^i_{j:N}\,K\!\left(\frac{x - \xi_{j:N}}{k_N}\right)$$

is used, where $K$ is a kernel and $k_N > 0$ is a bandwidth (Sugakova, 1998; Ivanko, 2003).
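The construction above can be sketched in code; the concentration design and the Gaussian mixture components are illustrative assumptions, and the weights are computed through the inverse of the Gram matrix $\Gamma_N$, which is equivalent to the cofactor formula:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 3000, 3  # sample size and number of components (classes)

# Known concentrations w^i_{j:N}; each row sums to one (illustrative design).
raw = rng.uniform(0.1, 1.0, size=(N, M))
w = raw / raw.sum(axis=1, keepdims=True)

# Gram matrix of inner products <w^k, w^l> = N^{-1} sum_j w^k_j w^l_j, and the
# weights a^k_{j:N} = sum_i (Gamma_N^{-1})_{ki} w^i_{j:N}.
Gamma = w.T @ w / N
a = w @ np.linalg.inv(Gamma)            # a[j, k] = a^k_{j:N}; Gamma is symmetric

# Key property behind unbiasedness (Remark 2 below): <a^k, w^m> = 1{k = m}.
assert np.allclose(a.T @ w / N, np.eye(M), atol=1e-10)

# Sample from the mixture with varying concentrations (Gaussian components
# with unit variance and means 0, 2, 4 are an illustrative assumption).
mus = np.array([0.0, 2.0, 4.0])
comp = (rng.uniform(size=(N, 1)) > np.cumsum(w, axis=1)).sum(axis=1)
xi = rng.normal(mus[comp], 1.0)

def H_hat(i: int, x: float) -> float:
    """Weighted empirical distribution function for H_{i+1} (0-based i)."""
    return float(np.mean(a[:, i] * (xi <= x)))

# Because sum_m <a^k, w^m> = 1, the estimator equals exactly 1 at x -> infinity.
assert abs(H_hat(0, 1e9) - 1.0) < 1e-9
```

Note that individual values of $\hat H^N_i$ may fall outside $[0,1]$, since the weights $a^i_{j:N}$ can be negative (see Remark 3 below).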

Let us construct the threshold estimator by the EBC method (Kubaychuk, 2008). The empirical Bayes estimator is constructed as follows. First, one determines the sets $T^1_N$ and $T^2_N$ of solutions of the equations

$$(p_2-p_3)\hat h^N_1(t) - (p_1-p_3)\hat h^N_2(t) = 0 \quad\text{and}\quad (p_1-p_3)\hat h^N_2(t) - (p_1-p_2)\hat h^N_3(t) = 0,$$

respectively. Second, one chooses

$$t^{EBC} = \arg\min_{t_1\in T^1_N,\; t_2\in T^2_N,\; t_1\le t_2} \hat L^1_N(t_1,t_2),$$

as an estimator for $t^B$, where

$$\hat L^1_N(t_1,t_2) = (p_2-p_3)\hat H^N_1(t_1) - (p_1-p_3)\hat H^N_2(t_1) + (p_3-p_1)\hat H^N_2(t_2) - (p_2-p_1)\hat H^N_3(t_2) + p_1 + p_2$$

is the estimator for $L^1(t_1,t_2)$:

$$\hat L^1_{N1}(t_1) = (p_2-p_3)\hat H^N_1(t_1) - (p_1-p_3)\hat H^N_2(t_1),$$
$$\hat L^1_{N2}(t_2) = (p_3-p_1)\hat H^N_2(t_2) - (p_2-p_1)\hat H^N_3(t_2) + p_1 + p_2,$$
$$t^{EBC}_{N1} = \arg\min_{t_1\in T^1_N}\hat L^1_{N1}(t_1), \qquad t^{EBC}_{N2} = \arg\min_{t_2\in T^2_N}\hat L^1_{N2}(t_2).$$

We assume that the sets $T^1_N$ and $T^2_N$ are nonempty and contain points with $t_1 \le t_2$, and that the densities $h_i$ are sufficiently many times continuously differentiable in some neighborhood of the points $t^B_1$, $t^B_2$.
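The two-step EBC construction can be sketched end to end; everything below (mixture design, Gaussian components and kernel, the values of $p_i$) is an illustrative assumption, and the risk estimator uses the sign conventions of $\hat L^1_N$ as reconstructed above:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10000
p1, p2, p3 = 0.2, 0.3, 0.5            # illustrative p_i
mus = np.array([0.0, 2.0, 4.0])        # illustrative class means, unit variances

# Sample from a mixture with varying concentrations.
raw = rng.uniform(0.1, 1.0, size=(N, 3))
w = raw / raw.sum(axis=1, keepdims=True)
comp = (rng.uniform(size=(N, 1)) > np.cumsum(w, axis=1)).sum(axis=1)
xi = rng.normal(mus[comp], 1.0)

Gamma = w.T @ w / N
a = w @ np.linalg.inv(Gamma)           # minimax weights a^k_{j:N}

kN = N ** -0.2                         # bandwidth k_N = c N^{-1/5}, c = 1
grid = np.linspace(-2.0, 6.0, 201)

def h_hat(i):                          # kernel density estimator, Gaussian kernel
    z = (grid[:, None] - xi[None, :]) / kN
    K = np.exp(-z ** 2 / 2.0) / np.sqrt(2.0 * np.pi)
    return (K * a[:, i]).sum(axis=1) / (N * kN)

h1, h2, h3 = h_hat(0), h_hat(1), h_hat(2)
u1 = (p2 - p3) * h1 - (p1 - p3) * h2   # its zeros give T^1_N
u2 = (p1 - p3) * h2 - (p1 - p2) * h3   # its zeros give T^2_N

# Step 1: candidate thresholds = grid points where u^1_N, u^2_N change sign.
T1 = grid[1:][np.diff(np.sign(u1)) != 0]
T2 = grid[1:][np.diff(np.sign(u2)) != 0]

def H_hat(i, x):
    return float(np.mean(a[:, i] * (xi <= x)))

def L1N(t1, t2):                       # estimator of the risk L^1(t1, t2)
    return ((p2 - p3) * H_hat(0, t1) - (p1 - p3) * H_hat(1, t1)
            + (p3 - p1) * H_hat(1, t2) - (p2 - p1) * H_hat(2, t2) + p1 + p2)

# Step 2: minimize the empirical risk over admissible pairs t1 <= t2.
pairs = [(s, t) for s in T1 for t in T2 if s <= t]
t_ebc = min(pairs, key=lambda st: L1N(*st))   # estimates the Bayes threshold t^B
```

With this configuration the population equations $u_1 = 0$ and $u_2 = 0$ have roots between the class means, so both candidate sets are nonempty for moderate $N$.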

Put

$$f^1_s(t) = (-1)^s\left[(p_2-p_3)\frac{d^sh_1(t)}{dt^s} - (p_1-p_3)\frac{d^sh_2(t)}{dt^s}\right],$$
$$f^2_s(t) = (-1)^s\left[(p_1-p_3)\frac{d^sh_2(t)}{dt^s} - (p_1-p_2)\frac{d^sh_3(t)}{dt^s}\right].$$

Put also

$$r_{Ni} = \left(\frac 1N\sum_{j=1}^N (b^i_{j:N})^2\big[w^1_{j:N}h_1(t^B_i) + w^2_{j:N}h_2(t^B_i) + w^3_{j:N}h_3(t^B_i)\big]\right)^{1/2}, \quad i = 1,2,$$

where $b^1_{j:N} = (p_2-p_3)a^1_{j:N} - (p_1-p_3)a^2_{j:N}$ and $b^2_{j:N} = (p_3-p_1)a^2_{j:N} - (p_2-p_1)a^3_{j:N}$, and assume that the limits $r_i = \lim_{N\to\infty} r_{Ni}$, $i = 1,2$, exist.

Let us denote

$$W^1_{Ni}(\tau) = N^{2/3}\big(\hat L^1_{Ni}(t^B_i + N^{-1/3}\tau) - \hat L^1_{Ni}(t^B_i)\big) - N^{2/3}\big(L^1_i(t^B_i + N^{-1/3}\tau) - L^1_i(t^B_i)\big), \quad i = 1,2.$$

## 3. MAIN RESULTS

In what follows we assume that:

(A) the threshold $t^B$ defined by (1) exists and is the unique point of the global minimum for $L^1(t)$ (that is, $t^B_1$ is the unique global minimum point for $L^1_1(t_1)$ and $t^B_2$ is the unique global minimum point for $L^1_2(t_2)$);

(B) the limits $\lim_{N\to\infty}\langle (a^k)^2, w^r\rangle$, $1 \le k, r \le M$, $M = 3$, and $r_i = \lim_{N\to\infty} r_{Ni}$, $i = 1,2$, exist.

Remark 1. Condition (B) concerns only the design of the concentrations and the values $h_r(t^B_i)$, $1 \le r \le M$, $M = 3$; it guarantees the existence of the limits entering the theorems below.

Theorem 1. Let conditions (A) and (B) hold. Assume that $k_N \to 0$ and $Nk_N \to \infty$ as $N \to \infty$, that the kernel $K$ is a continuous function, and that $d^2 \stackrel{def}{=} \int K^2(t)\,dt < \infty$. Then

$$t^{EBC}_N \to t^B \quad \big(t^{EBC}_{N1} \to t^B_1,\ t^{EBC}_{N2} \to t^B_2\big)$$

in probability.

Proof. According to Theorem 1 of (Sugakova, 1998), the assumptions of the theorem imply that $\hat h^N_i(x) \to h_i(x)$ in probability. Therefore

$$u^1_N(x) = (p_2-p_3)\hat h^N_1(x) - (p_1-p_3)\hat h^N_2(x) \to u_1(x) = (p_2-p_3)h_1(x) - (p_1-p_3)h_2(x),$$
$$u^2_N(x) = (p_1-p_3)\hat h^N_2(x) - (p_1-p_2)\hat h^N_3(x) \to u_2(x) = (p_1-p_3)h_2(x) - (p_1-p_2)h_3(x)$$

in probability.
Fix arbitrary $\varepsilon_i > 0$, $i = 1,2$. We first show that for every $\delta_i > 0$

$$P(A_N(\delta_i)) \to 1, \quad N \to \infty, \qquad (2)$$

where $A_N(\delta_i) = \{\exists\, t_i \in T^i_N : |t_i - t^B_i| \le \delta_i\}$. Since $t^B_1$ (respectively $t^B_2$) is the point of minimum of $L^1_1(t)$ (respectively $L^1_2(t)$), and $(L^1_1(t))'$, $(L^1_2(t))'$ coincide, up to sign, with $u_1(t)$, $u_2(t)$, there are points $t'_i$, $t''_i$ with $t^B_i - \delta_i \le t'_i < t^B_i < t''_i \le t^B_i + \delta_i$ such that $u_i(t'_i)\,u_i(t''_i) < 0$. Since the $u^i_N(t)$ are continuous functions and $u^i_N(t) \to u_i(t)$ in probability, we have $\{u^i_N(t'_i)\,u^i_N(t''_i) < 0\} \subseteq A_N(\delta_i)$ and $P\{u^i_N(t'_i)\,u^i_N(t''_i) < 0\} \to 1$. Therefore (2) is proved.

Further, $L^1_1(-\infty) = 0$, $L^1_1(+\infty) = p_2 - p_1$, $L^1_2(-\infty) = p_2 + p_1$, $L^1_2(+\infty) = p_3 + p_1$, and condition (A) is satisfied; hence for every $\varepsilon_i > 0$ there is $\delta_i > 0$ such that

$$\forall\, t \notin [t^B_i - \delta_i,\, t^B_i + \delta_i]: \quad L^1_i(t) > L^1_i(t^B_i) + \varepsilon_i/4,$$

and, by the continuity of $L^1_i$, also $\sup_{|t - t^B_i|\le\delta_i} L^1_i(t) < L^1_i(t^B_i) + \varepsilon_i/4$. Put

$$B_{Ni} = \Big\{\inf_{t\notin[t^B_i-\delta_i,\,t^B_i+\delta_i]}\hat L^1_{Ni}(t) > L^1_i(t^B_i) + \varepsilon_i/2\Big\} \cap \Big\{\sup_{|t-t^B_i|\le\delta_i}\big|\hat L^1_{Ni}(t) - L^1_i(t)\big| < \varepsilon_i/4\Big\}.$$

By the uniform convergence of $\hat L^1_{Ni}$ to $L^1_i$, $P(B_{Ni}) \ge 1 - \varepsilon_i/2$ for all sufficiently large $N$; by (2), $P(A_N(\delta_i)) \ge 1 - \varepsilon_i/2$ as well. If the event $A_N(\delta_i)\cap B_{Ni}$ occurs, then there exists $t_i \in T^i_N \cap [t^B_i - \delta_i, t^B_i + \delta_i]$ with $\hat L^1_{Ni}(t_i) < L^1_i(t^B_i) + \varepsilon_i/2$, while $\hat L^1_{Ni}(t) > L^1_i(t^B_i) + \varepsilon_i/2$ for all $t \notin [t^B_i - \delta_i, t^B_i + \delta_i]$; hence $|t^{EBC}_{Ni} - t^B_i| \le \delta_i$. Therefore

$$P\{|t^{EBC}_{Ni} - t^B_i| \le \delta_i\} \ge P(A_N(\delta_i)\cap B_{Ni}) \ge P(A_N(\delta_i)) + P(B_{Ni}) - 1 \ge 1 - \varepsilon_i$$

for all sufficiently large $N$. Since $\varepsilon_i > 0$ and $\delta_i > 0$ are arbitrary, this completes the proof.

Remark 2. The estimator $\hat H^N_k$ is unbiased whatever the distribution $H_k$ is. Indeed, it is easy to see that $\langle a^k, w^m\rangle = \mathbb 1\{m = k\}$ for all $m = 1,\dots,M$, $N \ge M$, whence

$$\mathsf E\,\hat H^N_k(x) = \sum_{m=1}^M \langle a^k, w^m\rangle H_m(x) = H_k(x).$$

Remark 3. Often $\hat H^N_k$ is not a probability distribution function, but this is not essential: to estimate $H_k$ one can use, if necessary, the corrected weighted empirical distribution function (Kubaychuk, 2003; Maiboroda and Kubaichuk, 2003; Maiboroda and Kubaichuk, 2004).

For the proof of the next theorem we need an auxiliary result on the asymptotic behavior of the processes $W^1_{Ni}$, $i = 1,2$.

Lemma 1. Let condition (A) hold and let $\tau_1 < \tau_2$. Put $A^i_N = A^i_N(\tau_1,\tau_2) = \big[t^B_i + N^{-1/3}\tau_1,\; t^B_i + N^{-1/3}\tau_2\big]$. Then

$$W^1_{N1}(\tau_2) - W^1_{N1}(\tau_1) = N^{-1/3}\sum_{j=1}^N b^1_{j:N}\big(\mathbb 1\{\xi_{j:N}\in A^1_N\} - P\{\xi_{j:N}\in A^1_N\}\big),$$
$$W^1_{N2}(\tau_2) - W^1_{N2}(\tau_1) = N^{-1/3}\sum_{j=1}^N b^2_{j:N}\big(\mathbb 1\{\xi_{j:N}\in A^2_N\} - P\{\xi_{j:N}\in A^2_N\}\big).$$

Proof.

$$W^1_{N1}(\tau_2) - W^1_{N1}(\tau_1) = N^{2/3}\big[\hat L^1_{N1}(t^B_1 + N^{-1/3}\tau_2) - L^1_1(t^B_1 + N^{-1/3}\tau_2) - \hat L^1_{N1}(t^B_1 + N^{-1/3}\tau_1) + L^1_1(t^B_1 + N^{-1/3}\tau_1)\big]$$
$$= N^{2/3}\big[(p_2-p_3)\big(\hat H^N_1 - H_1\big)(A^1_N) - (p_1-p_3)\big(\hat H^N_2 - H_2\big)(A^1_N)\big],$$

where $(\hat H^N_i - H_i)(A)$ denotes the increment of $\hat H^N_i - H_i$ over the interval $A$. Since $\hat H^N_i(A) = N^{-1}\sum_j a^i_{j:N}\mathbb 1\{\xi_{j:N}\in A\}$ and, by the property $\langle a^k, w^m\rangle = \mathbb 1\{m = k\}$ (Remark 2), $H_i(A) = N^{-1}\sum_j a^i_{j:N}P\{\xi_{j:N}\in A\}$, we get

$$W^1_{N1}(\tau_2) - W^1_{N1}(\tau_1) = N^{-1/3}\sum_{j=1}^N \big[(p_2-p_3)a^1_{j:N} - (p_1-p_3)a^2_{j:N}\big]\big(\mathbb 1\{\xi_{j:N}\in A^1_N\} - P\{\xi_{j:N}\in A^1_N\}\big) = N^{-1/3}\sum_{j=1}^N b^1_{j:N}\big(\mathbb 1\{\xi_{j:N}\in A^1_N\} - P\{\xi_{j:N}\in A^1_N\}\big).$$

Analogously,

$$W^1_{N2}(\tau_2) - W^1_{N2}(\tau_1) = N^{2/3}\big[\hat L^1_{N2}(t^B_2 + N^{-1/3}\tau_2) - L^1_2(t^B_2 + N^{-1/3}\tau_2) - \hat L^1_{N2}(t^B_2 + N^{-1/3}\tau_1) + L^1_2(t^B_2 + N^{-1/3}\tau_1)\big]$$
$$= N^{-1/3}\sum_{j=1}^N \big[(p_3-p_1)a^2_{j:N} - (p_2-p_1)a^3_{j:N}\big]\big(\mathbb 1\{\xi_{j:N}\in A^2_N\} - P\{\xi_{j:N}\in A^2_N\}\big).$$

This completes the proof of the lemma.
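The increment representation of Lemma 1 is an exact algebraic identity and can be checked numerically; the sketch below builds both sides for $W^1_{N1}$ from one simulated sample (the Gaussian components, the concentration design, and the centering point standing in for $t^B_1$ are illustrative assumptions — the identity holds regardless of their values):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)
N = 2000
p1, p2, p3 = 0.2, 0.3, 0.5
mus = [0.0, 2.0, 4.0]

def Hcdf(m, x):  # Gaussian class cdfs (illustrative H_m, 0-based m)
    return 0.5 * (1.0 + erf((x - mus[m]) / sqrt(2.0)))

raw = rng.uniform(0.1, 1.0, size=(N, 3))
w = raw / raw.sum(axis=1, keepdims=True)
comp = (rng.uniform(size=(N, 1)) > np.cumsum(w, axis=1)).sum(axis=1)
xi = rng.normal(np.array(mus)[comp], 1.0)

Gamma = w.T @ w / N
a = w @ np.linalg.inv(Gamma)
b1 = (p2 - p3) * a[:, 0] - (p1 - p3) * a[:, 1]     # b^1_{j:N}

tB1 = 0.8                                          # plays the role of t^B_1
tau1, tau2 = -1.0, 1.5
lo, hi = tB1 + N ** (-1/3) * tau1, tB1 + N ** (-1/3) * tau2
in_A = (xi > lo) & (xi <= hi)                      # indicator of A_N
P_A = sum(w[:, m] * (Hcdf(m, hi) - Hcdf(m, lo)) for m in range(3))  # P{xi_j in A_N}

def Hhat_inc(i):  # increment of the weighted empirical df over A_N
    return float(np.mean(a[:, i] * in_A))

# Left side: N^{2/3} increment of (p2-p3)(H^_1 - H_1) - (p1-p3)(H^_2 - H_2).
lhs = N ** (2/3) * ((p2 - p3) * (Hhat_inc(0) - (Hcdf(0, hi) - Hcdf(0, lo)))
                    - (p1 - p3) * (Hhat_inc(1) - (Hcdf(1, hi) - Hcdf(1, lo))))
# Right side: the centered-indicator sum from Lemma 1.
rhs = N ** (-1/3) * float(np.sum(b1 * (in_A - P_A)))
assert abs(lhs - rhs) < 1e-8
```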

In what follows, the symbol $\Rightarrow$ stands for weak convergence.

Theorem 2. Let $D(u_i)$ be the space of functions without discontinuities of the second kind on $u_i$, equipped with the uniform metric, let $W$ be the two-sided standard Wiener process, and let condition (B) hold. Then the stochastic processes $W^1_{Ni}$, $i = 1,2$, weakly converge, as $N \to \infty$, to the processes $r_iW$ in the space $D(u_i)$ on an arbitrary finite interval $u_i = [-\tau_i, \tau_i]$.

Proof. The trajectories of $W^1_{Ni}$, $i = 1,2$, are continuous. It is enough to prove that the finite-dimensional distributions of $W^1_{Ni}$, $i = 1,2$, are asymptotically Gaussian, that the second moments of the increments converge, and that the distributions of $W^1_{Ni}$, $i = 1,2$, are tight in $D(u_i)$. By Lemma 1,

$$\mathsf E\big(W^1_{N1}(\tau_2) - W^1_{N1}(\tau_1)\big)^2 = N^{-2/3}\sum_{j=1}^N (b^1_{j:N})^2\Big[\big(w^1_{j:N}H_1(A_N) + w^2_{j:N}H_2(A_N) + w^3_{j:N}H_3(A_N)\big) - \big(w^1_{j:N}H_1(A_N) + w^2_{j:N}H_2(A_N) + w^3_{j:N}H_3(A_N)\big)^2\Big],$$

where $H_i(A_N)$ denotes the $H_i$-measure of the interval $A_N$. Taking into account that $H_i(A_N) \sim h_i(t^B_1)N^{-1/3}(\tau_2 - \tau_1)$, $i = 1,2,3$, we obtain

$$\mathsf E\big(W^1_{N1}(\tau_2) - W^1_{N1}(\tau_1)\big)^2 = N^{-2/3}\sum_{j=1}^N N^{-1/3}(b^1_{j:N})^2\big[w^1_{j:N}h_1(t^B_1) + w^2_{j:N}h_2(t^B_1) + w^3_{j:N}h_3(t^B_1)\big](\tau_2 - \tau_1)\big[1 - N^{-1/3}\big(w^1_{j:N}h_1(t^B_1) + w^2_{j:N}h_2(t^B_1) + w^3_{j:N}h_3(t^B_1)\big)(\tau_2 - \tau_1)\big]$$
$$\to r_1^2(\tau_2 - \tau_1) = \mathsf E\big(r_1W(\tau_2) - r_1W(\tau_1)\big)^2, \quad N \to \infty,$$

where

$$r_1 = \lim_{N\to\infty} r_{N1}, \qquad r_{N1} = \Big[\frac 1N\sum_{j=1}^N (b^1_{j:N})^2\big(w^1_{j:N}h_1(t^B_1) + w^2_{j:N}h_2(t^B_1) + w^3_{j:N}h_3(t^B_1)\big)\Big]^{1/2}.$$

Similarly, taking into account that $H_i(A_N) \sim h_i(t^B_2)N^{-1/3}(\tau_2 - \tau_1)$, $i = 1,2,3$, we obtain

$$\mathsf E\big(W^1_{N2}(\tau_2) - W^1_{N2}(\tau_1)\big)^2 \to r_2^2(\tau_2 - \tau_1) = \mathsf E\big(r_2W(\tau_2) - r_2W(\tau_1)\big)^2, \quad N \to \infty,$$

where

$$r_2 = \lim_{N\to\infty} r_{N2}, \qquad r_{N2} = \Big[\frac 1N\sum_{j=1}^N (b^2_{j:N})^2\big(w^1_{j:N}h_1(t^B_2) + w^2_{j:N}h_2(t^B_2) + w^3_{j:N}h_3(t^B_2)\big)\Big]^{1/2}.$$

Since condition (B) holds, all the terms in the sum from Lemma 1 are uniformly bounded; therefore the finite-dimensional distributions of the processes $W^1_{Ni}$, $i = 1,2$, are asymptotically Gaussian. The tightness of $W^1_{Ni}$, $i = 1,2$, is proved analogously to (Ivanko and Maiboroda, 2006). This completes the proof.

Theorem 3. Let conditions (A) and (B) hold. Assume that:

(i) the derivatives $h''_k(t) = d^2h_k(t)/dt^2$ exist and are bounded in a neighborhood of $t^B_1$, $t^B_2$, and $f^i_1(t^B_i) \ne 0$, $i = 1,2$;

(ii) $\int zK(z)\,dz = 0$ and $D^2 \stackrel{def}{=} \int z^2K(z)\,dz$;

(iii) $k_N = cN^{-1/5}$ and $d \ne 0$.

Then $N^{2/5}(t^{EBC}_i - t^B_i) \Rightarrow A_i + B_i\eta_i$, where

$$A_i = \frac{D^2c^2 f^i_2(t^B_i)}{2f^i_1(t^B_i)}, \qquad B_i = \frac{d\,r_i}{\sqrt c\; f^i_1(t^B_i)},$$

the $\eta_i$ are standard Gaussian random variables, and $i = 1,2$.

Proof. Let

$$u^1_N(t) = (p_2-p_3)\hat h^N_1(t) - (p_1-p_3)\hat h^N_2(t), \qquad u^2_N(t) = (p_1-p_3)\hat h^N_2(t) - (p_1-p_2)\hat h^N_3(t).$$

By the definition of $t^{EBC}_{Ni}$ we have $u^i_N(t^{EBC}_{Ni}) = 0$. Put $\Delta_{Ni} = t^{EBC}_{Ni} - t^B_i$, $i = 1,2$. Theorem 1 implies that $\Delta_{Ni} \to 0$ in probability. Hence, expanding $u^i_N$ at $t^B_i$ and using $u_i(t^B_i) = 0$,

$$\Delta_{N1} = \frac{(p_2-p_3)\big(\hat h^N_1(t^B_1) - h_1(t^B_1)\big) - (p_1-p_3)\big(\hat h^N_2(t^B_1) - h_2(t^B_1)\big)}{f^1_1(t^B_1)}\,(1 + o_P(1)),$$
$$\Delta_{N2} = \frac{(p_1-p_3)\big(\hat h^N_2(t^B_2) - h_2(t^B_2)\big) - (p_1-p_2)\big(\hat h^N_3(t^B_2) - h_3(t^B_2)\big)}{f^2_1(t^B_2)}\,(1 + o_P(1)).$$

For $k_N = cN^{-1/5}$,

$$N^{2/5}\big[(p_2-p_3)(\hat h^N_1(t^B_1) - h_1(t^B_1)) - (p_1-p_3)(\hat h^N_2(t^B_1) - h_2(t^B_1))\big] \Rightarrow \frac{D^2c^2 f^1_2(t^B_1)}{2} + d\,r_1 c^{-1/2}\eta_1,$$
$$N^{2/5}\big[(p_1-p_3)(\hat h^N_2(t^B_2) - h_2(t^B_2)) - (p_1-p_2)(\hat h^N_3(t^B_2) - h_3(t^B_2))\big] \Rightarrow \frac{D^2c^2 f^2_2(t^B_2)}{2} + d\,r_2 c^{-1/2}\eta_2,$$

where $\eta_i$, $i = 1,2$, are standard Gaussian random variables. This completes the proof.

## SUMMARY AND CONCLUSIONS

The results obtained in this paper describe the asymptotic behaviour of threshold-based classification rules constructed from a sample from a mixture with varying concentrations in the case of three prescribed classes. This is another important step toward solving the problem of classifying an object from an observation of its numerical characteristic. Future research will be devoted to the situation with an arbitrary number of classes.

ACKNOWLEDGEMENTS
The author would like to thank the referees for their valuable comments.

REFERENCES

- Billingsley, P. (1968): Convergence of Probability Measures. New York: John Wiley & Sons, Inc.
- Devroye, L. and Györfi, L. (1985): Nonparametric Density Estimation: The L1 View. New York: John Wiley & Sons, Inc.
- Ivanko, Yu. O. (2003): The asymptotic behavior of kernel estimators and their derivatives constructed from observations from a mixture with varying concentrations. Visnyk KNU, Ser. Matematika. Mekhanika, 9, 29–35. (Ukrainian)
- Ivanko, Yu. O. and Maiboroda, R. E. (2002): Exponential estimates for the empirical Bayes risk in the classification of a mixture with varying concentrations. Ukrain. Mat. Zh., 54, no. 10, 1421–1428; English transl. in Ukrainian Math. J., 54, no. 10, 1722–1731.
- Ivanko, Yu. O. and Maiboroda, R. E. (2006): The asymptotic behaviour of threshold-based classification rules constructed from a sample from a mixture with varying concentrations. Teor. Imovirnost. Matem. Statist., 74, 34–43; English transl. in Theor. Probability Math. Statist., 74, 37–47.
- Kubaychuk, O. O. (2003): Estimation of moments from mixtures using the corrected weighted empirical distribution functions. Visnyk KNU, Ser. Matematika. Mekhanika, 9, 48–52. (Ukrainian)
- Kubaychuk, O. O. (2008): The asymptotic behavior of the estimator of the Bayes threshold. Visnyk KNU, Ser. Matematika. Mekhanika, 19, 47–50. (Ukrainian)
- Kubaychuk, O. O. (2010): The asymptotic behavior of the empirical risk minimization estimator of the Bayes border. Research Bulletin NTU KPI, Ser. Physics and Mathematics, 4, 78–85. (Ukrainian)
- Maiboroda, R. and Kubaichuk, O. (2003): Asymptotic normality of improved weighted empirical distribution functions. Teor. Imovirnost. Matem. Statist., 69, 89–95; English transl. in Theor. Probability Math. Statist., 69, 95–102.
- Maiboroda, R. and Kubaichuk, O. (2004): Improved estimators for moments constructed from observations of a mixture. Teor. Imovirnost. Matem. Statist., 70, 74–81; English transl. in Theor. Probability Math. Statist., 70, 83–92.
- Sugakova, O. V. (1998): Asymptotics of a kernel estimate for distribution density constructed from observations of a mixture with varying concentrations. Teor. Imovirnost. Matem. Statist., 59, 156–166; English transl. in Theor. Probability Math. Statist., 59, 161–171.
- Vapnik, V. N. (1996): The Nature of Statistical Learning Theory. New York: Springer.
- Vapnik, V. N. (1989): Inductive principles for the search for empirical laws. Pattern Recognition. Classification. Prediction, 1, 17–81. (Russian)
