
Chapter 6: Neural Network I (Adaline)

Chapter 6 Neural Network I


Adaline
Notation
scalars - small italic letters: a, b, c
vectors - small bold non-italic letters: a, b, c
matrices - capital BOLD non-italic letters: A, B, C
Single-Input Neuron
scalar input p, scalar weight w
bias b
it's like a weight except that it has a constant input of 1
a neuron may or may not have a bias
n - summer output (net input)
f - transfer function (activation function)
a - neuron output: a = f(wp + b)
Example
Consider the parameters: w = 3, p = 2, b = -1.5. What is the neuron output?
a = f(3*2 - 1.5) = f(4.5), where f is chosen by the designer
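This computation can be sketched in a few lines of Python (an illustrative sketch, not part of the course's MATLAB code; the identity function stands in for the designer-chosen f):

```python
# Single-input neuron: net input n = w*p + b, output a = f(n).
# f defaults to the identity here; the designer would pick hardlim, tansig, etc.
def neuron(p, w, b, f=lambda n: n):
    return f(w * p + b)

a = neuron(p=2, w=3, b=-1.5)   # n = 3*2 - 1.5 = 4.5
print(a)
```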
Transfer Functions
The transfer function may be a linear or a nonlinear function of n. A particular transfer function is chosen to satisfy some specification of the problem that the neuron is attempting to solve. The most commonly used functions are discussed below.
1. Hard limit (step) transfer function - used in perceptrons
Dr. Essam Al Daoud
a = f(n) = { 0 if n < 0; 1 if n >= 0 }

Symmetrical Hard Limit
a = f(n) = { -1 if n < 0; 1 if n >= 0 }
2. Linear transfer function - used in ADALINE networks
a = n
3. Log-sigmoid and tan-sigmoid transfer functions - used in multilayer networks trained with backpropagation
Multiple-Input Neuron
The following is a neuron with R inputs.
In matrix form:
n = Wp + b
a = f(Wp + b)
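In code, the matrix form is a single matrix-vector product. A Python/NumPy sketch with made-up numbers (the notes themselves use MATLAB):

```python
import numpy as np

W = np.array([[1.0, 2.0, -1.0]])   # 1 x R weight matrix (R = 3 inputs, made up)
p = np.array([2.0, 1.0, 3.0])      # R-element input vector
b = np.array([0.5])                # bias

n = W @ p + b                      # net input: n = Wp + b
a = (n >= 0).astype(float)         # hardlim transfer function, elementwise
print(n, a)
```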
A single-layer network of S neurons
Abbreviated Notation
How to Pick an Architecture
Problem specifications help define the network in the following ways:
Number of network inputs = number of problem inputs
Number of neurons in output layer = number of problem outputs
Output layer transfer function choice at least partly determined by problem specification of the outputs
Example
Given a two-input neuron with the following parameters: b = 1.2, W = [3 2] and p = [-5 6]T, calculate the neuron output for the following transfer functions:
A symmetrical hard limit transfer function
A tangent sigmoid (tansig) transfer function
Solution
a = hardlims(-1.8) = -1
a = tansig(-1.8) = -0.9468
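A quick Python check of these two answers (tansig is the hyperbolic tangent):

```python
import math

n = 3 * (-5) + 2 * 6 + 1.2        # net input Wp + b = -1.8
hardlims = 1 if n >= 0 else -1    # symmetrical hard limit
tansig = math.tanh(n)             # tangent sigmoid
print(hardlims, round(tansig, 4))
```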
Example
A single-layer neural network is to have six inputs and two outputs. The outputs are to be limited to and continuous over the range 0 to 1. What can you tell about the network architecture?
Specifically:
How many neurons are required?
What are the dimensions of the weight matrix?
What kind of transfer functions could be used?
Is a bias required?
Solution
The problem specifications allow you to say the following about the network.
Two neurons, one for each output, are required.
The weight matrix has two rows corresponding to the two neurons and six columns corresponding to the six inputs. (The product Wp is a two-element vector.)
The logsig transfer function would be most appropriate.
Not enough information is given to determine if a bias is required.
Example
A single-layer neural network has two inputs and two outputs, with the following parameters:
b = [1.2 1]T, W = [3 2; 1 3] and p = [-5 6]T
Calculate the neuron outputs for the symmetrical hard limit (step) transfer function.
Solution
n = Wp + b = [3 2; 1 3][-5; 6] + [1.2; 1] = [-15 + 12 + 1.2; -5 + 18 + 1] = [-1.8; 14]
a = hardlims([-1.8; 14]) = [-1; 1]
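The same two-output calculation, checked with NumPy:

```python
import numpy as np

W = np.array([[3.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.2, 1.0])
p = np.array([-5.0, 6.0])

n = W @ p + b                      # net input: [-1.8, 14.0]
a = np.where(n >= 0, 1.0, -1.0)    # hardlims elementwise: [-1.0, 1.0]
print(n, a)
```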
Perceptron
1958 - Frank Rosenblatt developed the perceptron
Perceptron = Neuron + Learning algorithm
Perceptron Learning Algorithm
Input: a set of training examples: {p1, t1}, {p2, t2}, ..., {pn, tn}
Goal: classify all examples correctly
Steps
1- Randomly initialize the weights W and the biases b
2- Choose a random input-output pair {p, t} from the training set
3- Let the network operate on the input to generate output a
4- Compute the output error e = t - a
5- Update weights:
Add a matrix ΔW to the weight matrix W, proportional to the product epT of the error vector and the input: Wnew = Wold + e pT
Add a vector Δb to the bias vector b, proportional to the error vector: bnew = bold + e
6- Choose another random pair and do the correction again
7- Continue until the stopping criteria is satisfied: all examples are correctly classified or a maximum number of epochs is reached
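The steps above can be sketched as a short Python loop (an illustration only: the toy dataset and random seed are made up, and the course code later in this chapter uses MATLAB):

```python
import numpy as np

def train_perceptron(P, T, max_epochs=100):
    """Perceptron rule: W <- W + e p^T, b <- b + e, with e = t - a."""
    rng = np.random.default_rng(0)
    W = rng.random((T.shape[0], P.shape[0]))          # 1- random init
    b = rng.random(T.shape[0])
    for _ in range(max_epochs):                       # 7- epoch loop
        updates = 0
        for i in range(P.shape[1]):                   # 2- visit each pair
            a = (W @ P[:, i] + b >= 0).astype(float)  # 3- hardlim output
            e = T[:, i] - a                           # 4- output error
            if e.any():
                W += np.outer(e, P[:, i])             # 5- dW = e p^T
                b += e                                #    db = e
                updates += 1
        if updates == 0:                              # all correctly classified
            return W, b
    return W, b

# Toy separable set: target is 1 iff the first input is positive
P = np.array([[1.0, -1.0],
              [1.0,  1.0]])
T = np.array([[1.0, 0.0]])
W, b = train_perceptron(P, T)
```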
Example - Apple/Banana Sorter
A produce dealer has a warehouse that stores a variety of fruits. He wants a machine that will sort the fruit according to type. There is a conveyor belt on which the fruit is loaded; it is then passed through a set of sensors, which measure 3 properties of the fruit: shape, texture and weight.
Shape sensor: 1 if the fruit is round, -1 if it is more elliptical
Texture sensor: 1 if the surface is smooth, -1 if it is rough
Weight sensor: 1 if the fruit weighs more than 500 g, -1 if less
The sensor outputs will then be input to a NN. The purpose is to recognize and correctly sort the fruit. For simplicity - only 2 kinds of fruit (bananas and apples).
Solution
First construct the training set:
Banana: p1 = [-1 1 -1]T, t1 = 1
Apple: p2 = [1 1 -1]T, t2 = 0
Initial weights (random):
W = [0.5 -1 -0.5], b = 0.5
Applying p1:
a = hardlim(Wp1 + b) = hardlim([0.5 -1 -0.5][-1; 1; -1] + 0.5) = hardlim(-0.5) = 0
Find e = t1 - a = 1 - 0 = 1
Updating the weights:
Wnew = Wold + e p1T = [0.5 -1 -0.5] + (1)[-1 1 -1] = [-0.5 0 -1.5]
bnew = bold + e = 0.5 + 1 = 1.5
Applying p2:
a = hardlim(Wp2 + b) = hardlim([-0.5 0 -1.5][1; 1; -1] + 1.5) = hardlim(2.5) = 1
Find e = t2 - a = 0 - 1 = -1
Updating the weights:
Wnew = [-0.5 0 -1.5] + (-1)[1 1 -1] = [-1.5 -1 -0.5]
bnew = 1.5 - 1 = 0.5
End of epoch; check if the stopping criteria is satisfied:
hardlim(Wp1 + b) = hardlim(1.5) = 1 = t1 and hardlim(Wp2 + b) = hardlim(-1.5) = 0 = t2
The stopping criteria is satisfied. Stop.
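The hand calculation above can be replayed in a few lines of Python (same numbers as in the notes):

```python
import numpy as np

W = np.array([0.5, -1.0, -0.5])                    # initial weights
b = 0.5                                            # initial bias
hardlim = lambda n: 1.0 if n >= 0 else 0.0

for p, t in [(np.array([-1.0, 1.0, -1.0]), 1.0),   # banana, t = 1
             (np.array([ 1.0, 1.0, -1.0]), 0.0)]:  # apple,  t = 0
    e = t - hardlim(W @ p + b)                     # output error
    W = W + e * p                                  # dW = e p^T
    b = b + e                                      # db = e

print(W, b)   # [-1.5, -1.0, -0.5] and 0.5 after one epoch
```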
Example
a) Train by hand a perceptron with bias on this training set. Assume that all initial weights (including the bias of the neuron) are 0. Show the set of weights (including the bias) at the end of each iteration. Apply the examples in the given order. Use the hardlim step function. Stopping criteria: patterns are correctly classified.
b) How many epochs were needed? Is the training set linearly separable?
Solution
Check if the stopping criteria is satisfied - each training example is applied to check if it is correctly classified.
1 epoch was needed to train the perceptron. The training set is linearly separable, as the perceptron was able to learn to separate it.
Adaline
ADALINEs (ADAptive LInear NEurons) use the Widrow-Hoff algorithm, or Least Mean Square (LMS) algorithm, to adjust the weights of the linear network in order to minimize the mean square error.
Error - difference between the target and actual network output
Mean square error: mse = (e1^2 + e2^2 + ... + en^2)/n, the average of the squared errors over the training examples
Adaline = Neuron + Learning algorithm
Adaline Learning Algorithm
Input: a set of training examples: {p1, t1}, {p2, t2}, ..., {pn, tn}
Goal: error is small enough
Steps
1- Randomly initialize the weights W and the biases b
2- Choose a random input-output pair {p, t} from the training set
3- Let the network operate on the input to generate output a
4- Compute the output error e = t - a
5- Update weights:
Add a matrix ΔW to the weight matrix W, proportional to the product epT of the error vector and the input: Wnew = Wold + α e pT
Add a vector Δb to the bias vector b, proportional to the error vector e: bnew = bold + α e
6- Choose another random pair and do the correction again
7- Continue until the stopping criteria is satisfied: the performance measure (error or accuracy) is small enough (below a threshold) or a maximum number of epochs is reached
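A minimal Python sketch of the LMS procedure above (the helper name is mine; zero initialization and no bias are used to mirror the worked example that follows):

```python
import numpy as np

def train_adaline(P, T, alpha=0.4, mse_goal=0.1, max_epochs=100):
    """LMS: linear output a = Wp, update W <- W + alpha * e * p^T."""
    W = np.zeros((T.shape[0], P.shape[0]))     # zero init, no bias
    for _ in range(max_epochs):
        for i in range(P.shape[1]):
            e = T[:, i] - W @ P[:, i]          # error of the linear unit
            W += alpha * np.outer(e, P[:, i])  # dW proportional to e p^T
        E = T - W @ P                          # errors over the whole set
        if np.mean(E**2) < mse_goal:           # stop when mse is small enough
            return W
    return W

# Banana/apple data from the sorter example
P = np.array([[-1.0, 1.0], [1.0, 1.0], [-1.0, -1.0]])
T = np.array([[-1.0, 1.0]])
W = train_adaline(P, T)
```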
Example - Apple/Banana Sorter
Learning rate: α = 0.4
Stopping criteria: mse < 0.1
Solution
First construct the training set:
Banana: p1 = [-1 1 -1]T, t1 = -1
Apple: p2 = [1 1 -1]T, t2 = 1
Initial weights (random):
W = [0 0 0], no bias
Epoch 1:
Applying p1: a = Wp1 = 0, e1 = t1 - a = -1 - 0 = -1
Wnew = W + α e1 p1T = [0 0 0] + 0.4(-1)[-1 1 -1] = [0.4 -0.4 0.4]
Applying p2: a = Wp2 = -0.4, e2 = t2 - a = 1 - (-0.4) = 1.4
Wnew = [0.4 -0.4 0.4] + 0.4(1.4)[1 1 -1] = [0.96 0.16 -0.16]
End of epoch 1; recompute the errors over the training set:
e1 = t1 - Wp1 = -1 - [0.96 0.16 -0.16][-1; 1; -1] = -1 + 0.64 = -0.36
e2 = t2 - Wp2 = 1 - [0.96 0.16 -0.16][1; 1; -1] = 1 - 1.28 = -0.28
mse = ((-0.36)^2 + (-0.28)^2)/2 = 0.104 > 0.1
Stopping criteria is not satisfied => continue with epoch 2
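Checking epoch 1 of this calculation numerically (a Python sketch with the same α and data):

```python
import numpy as np

alpha = 0.4
W = np.zeros(3)
pairs = [(np.array([-1.0, 1.0, -1.0]), -1.0),   # banana, t = -1
         (np.array([ 1.0, 1.0, -1.0]),  1.0)]   # apple,  t = 1

for p, t in pairs:                 # one epoch of LMS updates
    e = t - W @ p
    W = W + alpha * e * p

errors = [t - W @ p for p, t in pairs]
mse = sum(e**2 for e in errors) / len(errors)
print(W, round(mse, 3))            # W ~ [0.96, 0.16, -0.16], mse ~ 0.104
```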
Learning Rate
Too big: the system will oscillate, as the correction will be too large and will overshoot the target
Too small: the system will take a long time to converge
A constant: may never converge to an unchanging value but will oscillate around it
The best solution: gradually drop toward zero. Typically used:
α = Constant/n
n - number of learning trials
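For example, the decaying schedule α = Constant/n over the first few trials (the constant 1.0 is an arbitrary illustrative choice, not from the notes):

```python
constant = 1.0                                 # illustrative value
alphas = [constant / n for n in range(1, 6)]   # n = learning-trial number
print(alphas)                                  # 1.0, 0.5, 0.333..., 0.25, 0.2
```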
Capability and Limitations
Both ADALINE and the perceptron suffer from the same inherent limitation: they can only solve linearly separable problems; for example, neither can be trained to represent the XOR function. LMS, however, is more powerful than the perceptron's learning rule.
The perceptron's rule is guaranteed to converge to a solution that correctly categorizes the training patterns, but the resulting network can be sensitive to noise, as patterns often lie close to the decision boundary.
Matlab
Example: Write Matlab code to implement the Adaline, and use it to train on the following data.
Input      Output (Target)
1, 1, 1    1, 1
8, 7, 5    -1, -1
4, 5, 5    -1, -1
1, 1, 1    1, 1
What is the output if the input is [1 1 2]'?
---------------------------------------------------------------------------------
clear
% Training Rate
r = 0.01;

% Input Data (input x samples)
p = [1 8 4 1; 1 7 5 1; 1 5 5 1];
% Target Data (output x samples)
t = [1 -1 -1 1; 1 -1 -1 1];

epoch = 2000;

[r_in, c_in] = size(p);
[r_tar, c_tar] = size(t);
w{1} = rand(r_tar, r_in);
b{1} = rand(r_tar, 1);
k = 1;

for j = 1:epoch
    for i = 1:c_in
        a = w{k}*p(:,i) + b{k};
        e = t(:,i) - a;
        if e == 0
            continue;
        end
        k = k + 1;
        w{k} = w{k-1} + r*e*p(:,i)';
        b{k} = b{k-1} + r*e;
    end
    error = zeros(2,4);
    for i = 1:c_in
        error(:,i) = t(:,i) - (w{k}*p(:,i) + b{k});
    end
    x(j) = mse(error);
    figure(1);
    plot(x);
end
title('AdaLine','Fontsize',14,'color',[0 0 1])
---------------------------------------------------------------------------------
The output for [1 1 2]' is
(w{k}*[1 1 2]'+b{k})
ans =
    0.5668
    0.6274