
Last updated: 14 April 2010

Introduction to Artificial Neural Network
- theory, application and practice using WEKA

Anto Satriyo Nugroho, Dr.Eng


Center for Information & Communication Technology,
Agency for the Assessment & Application of Technology (PTIK-BPPT)
Email: asnugroho@gmail.com URL: http://asnugroho.net

Example of BP calculation
Sepal Length  Sepal Width  Petal Length  Petal Width  Class
5.1           3.5          1.4           0.2          Iris-setosa
7.0           3.2          4.7           1.4          Iris-versicolor

2-class problem, 4 attributes
ANN Architecture

[Network diagram: 4 input neurons (sepal-length, sepal-width, petal-length, petal-width) → hidden layer → 2 output neurons (Iris-setosa, Iris-versicolor)]

η = 0.1 (learning rate)
Suppose the number of hidden neurons is 3.
Initialization

Input → Hidden weights w_{j,i} and biases θ_j:
  Hidden 1: θ_1 = 0.1; w_{1,1} = 0.1, w_{1,2} = 0.4, w_{1,3} = 0.7, w_{1,4} = 1.0
  Hidden 2: θ_2 = 0.2; w_{2,1} = 0.2, w_{2,2} = 0.5, w_{2,3} = 0.8, w_{2,4} = 1.1
  Hidden 3: θ_3 = 0.3; w_{3,1} = 0.3, w_{3,2} = 0.6, w_{3,3} = 0.9, w_{3,4} = 1.2

Hidden → Output weights w_{k,j} and biases θ_k:
  Output 1 (Iris-setosa):     θ_1 = 0.1; w_{1,1} = 0.1, w_{1,2} = 0.3, w_{1,3} = 0.5
  Output 2 (Iris-versicolor): θ_2 = 0.2; w_{2,1} = 0.2, w_{2,2} = 0.4, w_{2,3} = 0.6

(Subscript i indexes the input layer, j the hidden layer, k the output layer.)
Forward Pass
Calculating the output of Input Layer

I_1 = x_1 = 5.1
I_2 = x_2 = 3.5
I_3 = x_3 = 1.4
I_4 = x_4 = 0.2
Forward Pass
Calculating the output of Hidden Layer

net_1 = θ_1 + Σ_i w_{1,i} I_i
      = 0.1 + (0.1 × 5.1) + (0.4 × 3.5) + (0.7 × 1.4) + (1.0 × 0.2)
      = 3.19

H_1 = f(net_1) = 1 / (1 + e^(−3.19)) = 0.96
Forward Pass

net_2 = θ_2 + Σ_i w_{2,i} I_i
      = 0.2 + (0.2 × 5.1) + (0.5 × 3.5) + (0.8 × 1.4) + (1.1 × 0.2)
      = 4.31

H_2 = f(net_2) = 1 / (1 + e^(−4.31)) = 0.99
Forward Pass

net_3 = θ_3 + Σ_i w_{3,i} I_i
      = 0.3 + (0.3 × 5.1) + (0.6 × 3.5) + (0.9 × 1.4) + (1.2 × 0.2)
      = 5.43

H_3 = f(net_3) = 1 / (1 + e^(−5.43)) = 1.00
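The hidden-layer pass above can be reproduced with a minimal Python sketch (plain standard library, not WEKA's implementation; the variable names are mine):

```python
import math

def sigmoid(x):
    # Logistic activation f(net) = 1 / (1 + e^(-net))
    return 1.0 / (1.0 + math.exp(-x))

x = [5.1, 3.5, 1.4, 0.2]           # the Iris-setosa sample (I_1..I_4)
theta = [0.1, 0.2, 0.3]            # hidden biases theta_j
w = [[0.1, 0.4, 0.7, 1.0],         # w_{1,i}: weights into hidden neuron 1
     [0.2, 0.5, 0.8, 1.1],         # w_{2,i}
     [0.3, 0.6, 0.9, 1.2]]         # w_{3,i}

# net_j = theta_j + sum_i w_{j,i} * I_i ; H_j = f(net_j)
net = [theta[j] + sum(w[j][i] * x[i] for i in range(4)) for j in range(3)]
H = [sigmoid(n) for n in net]
print([round(n, 2) for n in net])  # [3.19, 4.31, 5.43]
print([round(h, 2) for h in H])    # [0.96, 0.99, 1.0]
```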
Forward Pass
Calculating the output of Hidden Layer (summary)

[Network diagram: inputs 5.1, 3.5, 1.4, 0.2 feeding the hidden layer]

H_1 = 0.96
H_2 = 0.99
H_3 = 1.0
Initialization (recap): the weights are unchanged so far; in particular the Hidden → Output weights are w_{1,j} = (0.1, 0.3, 0.5) with θ_1 = 0.1 and w_{2,j} = (0.2, 0.4, 0.6) with θ_2 = 0.2.
Forward Pass
Calculating the output of Output Layer

net_1 = θ_1 + Σ_j w_{1,j} H_j
      = 0.1 + (0.1 × 0.96) + (0.3 × 0.99) + (0.5 × 1.0)
      = 0.99

O_1 = f(net_1) = 1 / (1 + e^(−0.99)) = 0.73

Err_1 = t_1 − O_1 = 1 − 0.73 = 0.27
Forward Pass
Calculating the output of Output Layer

net_2 = θ_2 + Σ_j w_{2,j} H_j
      = 0.2 + (0.2 × 0.96) + (0.4 × 0.99) + (0.6 × 1.0)
      = 1.39

O_2 = f(net_2) = 1 / (1 + e^(−1.39)) = 0.80

Err_2 = t_2 − O_2 = 0 − 0.80 = −0.80
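The output-layer pass works the same way; here is a sketch using the rounded hidden outputs shown on the slides (again plain Python, names are mine):

```python
import math

H = [0.96, 0.99, 1.0]              # hidden outputs from the previous slide
theta = [0.1, 0.2]                 # output biases theta_k
w = [[0.1, 0.3, 0.5],              # w_{1,j}: into output 1 (Iris-setosa)
     [0.2, 0.4, 0.6]]              # w_{2,j}: into output 2 (Iris-versicolor)
t = [1, 0]                         # targets for the Iris-setosa sample

net = [theta[k] + sum(w[k][j] * H[j] for j in range(3)) for k in range(2)]
O = [1.0 / (1.0 + math.exp(-n)) for n in net]
err = [t[k] - O[k] for k in range(2)]
print([round(n, 2) for n in net])  # [0.99, 1.39]
print([round(o, 2) for o in O])    # [0.73, 0.8]
print([round(e, 2) for e in err])  # [0.27, -0.8]
```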
Forward Pass
Calculating the output of Output Layer (summary)

[Network diagram]
I_1 = 5.1, I_2 = 3.5, I_3 = 1.4, I_4 = 0.2
H_1 = 0.96, H_2 = 0.99, H_3 = 1.0
O_1 = 0.73, target t_1 = 1 (Iris-setosa)
O_2 = 0.80, target t_2 = 0 (Iris-versicolor)
Training Process: Backward Pass

1. Calculate the δ_k of the Output Layer:
   δ_k = O_k (1 − O_k)(t_k − O_k)

2. Calculate the δ_j of the Hidden Layer:
   δ_j = H_j (1 − H_j) Σ_k w_{k,j} δ_k

3. Update the weights between Hidden & Output Layer:
   Δw_{k,j} = η δ_k H_j
   w_{k,j}(new) = w_{k,j}(old) + Δw_{k,j}

4. Update the weights between Input & Hidden Layer:
   Δw_{j,i} = η δ_j I_i
   w_{j,i}(new) = w_{j,i}(old) + Δw_{j,i}
Backward Pass
Update the weight between Hidden & Output Layer

[Diagram: H_1 = 0.96, H_2 = 0.99, H_3 = 1.0; weights w_{1,j} = (0.1, 0.3, 0.5), w_{2,j} = (0.2, 0.4, 0.6); O_1 = 0.73 (t_1 = 1), O_2 = 0.80 (t_2 = 0)]
Backward Pass
δ_1 = (t_1 − O_1) O_1 (1 − O_1) = (1 − 0.73) × 0.73 × (1 − 0.73) = 0.053
δ_2 = (t_2 − O_2) O_2 (1 − O_2) = (0 − 0.80) × 0.80 × (1 − 0.80) = −0.128

[Diagram: δ_1 = 0.053 at output 1, δ_2 = −0.128 at output 2]
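Step 1 of the backward pass as code, using the rounded forward-pass outputs (a sketch, not WEKA's internals):

```python
O = [0.73, 0.80]       # output activations from the forward pass
t = [1, 0]             # targets
# delta_k = O_k * (1 - O_k) * (t_k - O_k)
delta_out = [O[k] * (1 - O[k]) * (t[k] - O[k]) for k in range(2)]
print([round(d, 3) for d in delta_out])  # [0.053, -0.128]
```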
Backward Pass

δ_1 = H_1 (1 − H_1)(δ_1 w_{1,1} + δ_2 w_{2,1}) = 0.96 × (1 − 0.96) × (0.053 × 0.1 + (−0.128) × 0.2) = −0.00078
δ_2 = H_2 (1 − H_2)(δ_1 w_{1,2} + δ_2 w_{2,2}) = 0.99 × (1 − 0.99) × (0.053 × 0.3 + (−0.128) × 0.4) = −0.00035
δ_3 = H_3 (1 − H_3)(δ_1 w_{1,3} + δ_2 w_{2,3}) = 1 × (1 − 1) × (0.053 × 0.5 + (−0.128) × 0.6) = 0

(The δ on the left-hand sides are the hidden-layer δ_j; the δ_1, δ_2 inside the parentheses are the output-layer deltas computed in the previous step.)

[Diagram: hidden deltas δ_1 = −0.00078, δ_2 = −0.00035, δ_3 = 0]
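Step 2 as code (sketch; `delta_out` holds the output-layer deltas from step 1, hidden outputs rounded as on the slides):

```python
H = [0.96, 0.99, 1.0]
delta_out = [0.053, -0.128]
# Hidden->output weights w_{k,j}: still the OLD values, because all deltas
# are computed before any weight is updated.
w = [[0.1, 0.3, 0.5],
     [0.2, 0.4, 0.6]]
# delta_j = H_j * (1 - H_j) * sum_k w_{k,j} * delta_k
delta_hid = [H[j] * (1 - H[j]) * sum(w[k][j] * delta_out[k] for k in range(2))
             for j in range(3)]
print([round(d, 5) for d in delta_hid])
# [-0.00078, -0.00035, -0.0]  (the -0.0 is a float sign artifact; delta_3 = 0)
```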
Backward Pass
Update the weight between Hidden & Output Layer

Δw_{1,1} = η δ_1 H_1 = 0.1 × 0.053 × 0.96 = 0.0051
w_{1,1} = w_{1,1} + Δw_{1,1} = 0.1 + 0.0051 = 0.1051

Δw_{1,2} = η δ_1 H_2 = 0.1 × 0.053 × 0.99 = 0.0052
w_{1,2} = w_{1,2} + Δw_{1,2} = 0.3 + 0.0052 = 0.3052

Δw_{1,3} = η δ_1 H_3 = 0.1 × 0.053 × 1.00 = 0.0053
w_{1,3} = w_{1,3} + Δw_{1,3} = 0.5 + 0.0053 = 0.5053
Backward Pass
Update the weight between Hidden & Output Layer

Δw_{2,1} = η δ_2 H_1 = 0.1 × (−0.128) × 0.96 = −0.0123
w_{2,1} = w_{2,1} + Δw_{2,1} = 0.2 − 0.0123 = 0.188

Δw_{2,2} = η δ_2 H_2 = 0.1 × (−0.128) × 0.99 = −0.0127
w_{2,2} = w_{2,2} + Δw_{2,2} = 0.4 − 0.0127 = 0.387

Δw_{2,3} = η δ_2 H_3 = 0.1 × (−0.128) × 1.00 = −0.0128
w_{2,3} = w_{2,3} + Δw_{2,3} = 0.6 − 0.0128 = 0.587
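Step 3 as code, updating all six hidden→output weights at once (sketch, same conventions as the earlier blocks):

```python
eta = 0.1
H = [0.96, 0.99, 1.0]
delta_out = [0.053, -0.128]
w = [[0.1, 0.3, 0.5],       # old w_{k,j}
     [0.2, 0.4, 0.6]]
# Delta w_{k,j} = eta * delta_k * H_j ;  w(new) = w(old) + Delta w
w_new = [[w[k][j] + eta * delta_out[k] * H[j] for j in range(3)]
         for k in range(2)]
print([[round(v, 4) for v in row] for row in w_new])
# [[0.1051, 0.3052, 0.5053], [0.1877, 0.3873, 0.5872]]
```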
Backward Pass
Update the weight between Input & Hidden Layer

Δw_{j,i} = −η ∂E/∂w_{j,i} = η δ_j I_i
w_{j,i}(new) = w_{j,i}(old) + Δw_{j,i}

 i   j   w_{j,i}(old)   I_i   Δw_{j,i}     w_{j,i}(new)
 1   1   0.1            5.1   −0.000398    0.099602
 2   1   0.4            3.5   −0.000273    0.399727
 3   1   0.7            1.4   −0.000109    0.699891
 4   1   1.0            0.2   −0.000016    0.999984
 1   2   0.2            5.1   −0.000179    0.199822
 2   2   0.5            3.5   −0.000123    0.499878
 3   2   0.8            1.4   −0.000049    0.799951
 4   2   1.1            0.2   −0.000007    1.099993
 1   3   0.3            5.1    0           0.3
 2   3   0.6            3.5    0           0.6
 3   3   0.9            1.4    0           0.9
 4   3   1.2            0.2    0           1.2
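Step 4 as code, reproducing the input→hidden update table (sketch; `delta_hid` holds the rounded hidden-layer deltas from step 2, so each entry matches the table to within about 1e-6):

```python
eta = 0.1
x = [5.1, 3.5, 1.4, 0.2]            # input vector I_i
delta_hid = [-0.00078, -0.00035, 0.0]
w = [[0.1, 0.4, 0.7, 1.0],          # old w_{j,i}
     [0.2, 0.5, 0.8, 1.1],
     [0.3, 0.6, 0.9, 1.2]]
# Delta w_{j,i} = eta * delta_j * I_i ;  w(new) = w(old) + Delta w
w_new = [[w[j][i] + eta * delta_hid[j] * x[i] for i in range(4)]
         for j in range(3)]
for row in w_new:
    # each row matches the corresponding table column to within ~1e-6
    print([round(v, 6) for v in row])
```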
