
-Artificial Neural Network-

Chapter 4 Adaline & Madaline



Outline
ADALINE
MADALINE
Least-Square Learning Rule
The Proof of the Least-Square Learning Rule

ADALINE (1/3)
ADALINE (Adaptive Linear Neuron) was proposed in 1959 by Bernard Widrow.
It consists of a single processing element (PE).

ADALINE (2/3)
Method: the value of each unit must be +1 or -1 (the perceptron uses 1 and 0).

net = \sum_i W_i X_i = W_0 X_0 + W_1 X_1 + W_2 X_2 + \dots + W_n X_n

Y = \begin{cases} +1 & \text{if } net \ge 0 \\ -1 & \text{if } net < 0 \end{cases}

This is different from the perceptron's transfer function.
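As a minimal sketch of the transfer function above (the function name and the sample weights are illustrative choices, not given on this slide):

```python
# Sketch of the ADALINE forward pass: a weighted sum followed by
# the bipolar threshold described above.

def adaline_output(weights, inputs):
    """Compute net = sum(W_i * X_i) and apply the bipolar threshold."""
    net = sum(w * x for w, x in zip(weights, inputs))
    return 1 if net >= 0 else -1

# Example: weights (W_0, W_1, W_2) = (3, -2, -2), input pattern (1, 1, 0)
print(adaline_output([3, -2, -2], [1, 1, 0]))  # -> 1  (net = 3 - 2 = 1 >= 0)
```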



ADALINE (3/3)
Weight update: \Delta W_i = \eta (T - Y) X_i and W_i \leftarrow W_i + \Delta W_i,
where T is the expected output and \eta is the learning rate.
ADALINE can solve only linearly separable problems (its limitation).
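A minimal training sketch using the (T - Y) X_i update; the learning rate eta, the epoch count, and the zero initialization are assumed values, not taken from the slides:

```python
# Sketch of ADALINE training with the update W_i <- W_i + eta * (T - Y) * X_i.
# eta=0.1 and epochs=50 are illustrative assumptions.

def train_adaline(patterns, targets, eta=0.1, epochs=50):
    w = [0.0] * len(patterns[0])
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            net = sum(wi * xi for wi, xi in zip(w, x))
            y = 1 if net >= 0 else -1
            for i, xi in enumerate(x):
                w[i] += eta * (t - y) * xi   # Delta W_i = eta * (T - Y) * X_i
    return w

# The chapter's worked example (bias input x_0 = 1 in each pattern):
w = train_adaline([(1, 1, 0), (1, 0, 1), (1, 1, 1)], [1, 1, -1])
```

After training, all three patterns are classified with the correct sign.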


MADALINE
MADALINE (Multilayer Adaline) is composed of many ADALINEs.
[Figure: a MADALINE network with inputs x_1, ..., x_n, weights W_ij, hidden units net_j, and output Y; the vote layer has no weights W.]
After the second layer, a majority vote is used:
if more than half of the net_j \ge 0, then the output is 1; otherwise, the output is -1.
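The majority vote can be sketched as follows (the function name and sample net values are illustrative):

```python
# Sketch of the MADALINE vote unit: several ADALINEs feed their net_j
# values to a majority vote, as described above.

def madaline_output(adaline_nets):
    """Output 1 if more than half of the net_j values are >= 0, else -1."""
    positives = sum(1 for net in adaline_nets if net >= 0)
    return 1 if positives > len(adaline_nets) / 2 else -1

print(madaline_output([0.5, -0.2, 0.9]))  # two of three nets >= 0 -> 1
```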

Least-Square Learning Rule (1/6)
Notation:

X_j = (x_0, x_1, \dots, x_n)^t (i.e., x_0 = 1), \quad 1 \le j \le L
W = (w_0, w_1, \dots, w_n)^t

j : the j-th input pattern
t : transpose
L : the number of input patterns

Net_j = W^t X_j = \sum_{i=0}^{n} w_i x_i = w_0 x_0 + w_1 x_1 + \dots + w_n x_n

Least-Square Learning Rule (2/6)
By applying the least-square learning rule, the optimal weights W* satisfy

R W^* = P, \quad \text{i.e.,} \quad W^* = R^{-1} P

where R is the correlation matrix:

R = \frac{1}{L} (R'_1 + R'_2 + \dots + R'_L) = \frac{1}{L} \sum_{j=1}^{L} X_j X_j^t, \quad R'_j = X_j X_j^t

P = \frac{1}{L} \sum_{j=1}^{L} T_j X_j
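The matrix form above can be sketched with numpy (an implementation choice; the function name is an assumption):

```python
# Least-square rule in matrix form: build R and P, then solve R W* = P.
import numpy as np

def least_square_weights(X, T):
    """X: L x (n+1) array of input patterns (first column is the bias x_0 = 1).
    T: length-L array of expected outputs.  Returns W* with R W* = P."""
    L = len(T)
    R = X.T @ X / L              # correlation matrix R = (1/L) sum X_j X_j^t
    P = X.T @ T / L              # P = (1/L) sum T_j X_j
    return np.linalg.solve(R, P) # solve R W* = P without forming R^{-1}
```

np.linalg.solve is used instead of an explicit inverse for numerical stability; it raises LinAlgError when R is singular.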

Least-Square Learning Rule (3/6)
Example: three input patterns X_j with expected outputs T_j (the first component is the bias x_0 = 1):

X_1 = (1, 1, 0)^t, \quad T_1 = 1
X_2 = (1, 0, 1)^t, \quad T_2 = 1
X_3 = (1, 1, 1)^t, \quad T_3 = -1

Least-Square Learning Rule (4/6)
Sol.

R'_1 = X_1 X_1^t = \begin{pmatrix} 1 & 1 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \quad
R'_2 = X_2 X_2^t = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 1 \end{pmatrix}, \quad
R'_3 = X_3 X_3^t = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}

R = \frac{1}{3} (R'_1 + R'_2 + R'_3) = \frac{1}{3} \begin{pmatrix} 3 & 2 & 2 \\ 2 & 2 & 1 \\ 2 & 1 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 2/3 & 2/3 \\ 2/3 & 2/3 & 1/3 \\ 2/3 & 1/3 & 2/3 \end{pmatrix}

Least-Square Learning Rule (5/6)
P = \frac{1}{3} (T_1 X_1 + T_2 X_2 + T_3 X_3) = \frac{1}{3} [1 \cdot (1,1,0)^t + 1 \cdot (1,0,1)^t + (-1) \cdot (1,1,1)^t] = \frac{1}{3} (1, 0, 0)^t

Solving R W^* = P (multiplying both sides by 3):

3 W_1 + 2 W_2 + 2 W_3 = 1
2 W_1 + 2 W_2 + 1 W_3 = 0
2 W_1 + 1 W_2 + 2 W_3 = 0

\Rightarrow W_1 = 3, \quad W_2 = -2, \quad W_3 = -2
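The worked example can be reproduced numerically; numpy is an implementation choice here:

```python
# Reproduce the example: form R and P from the three patterns,
# then solve R W* = P.
import numpy as np

X = np.array([[1, 1, 0],    # X_1
              [1, 0, 1],    # X_2
              [1, 1, 1]],   # X_3
             dtype=float)
T = np.array([1.0, 1.0, -1.0])

R = X.T @ X / 3             # (1/3) * [[3,2,2],[2,2,1],[2,1,2]]
P = X.T @ T / 3             # (1/3) * (1, 0, 0)
W = np.linalg.solve(R, P)   # W is (3, -2, -2), matching the derivation above
```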

Least-Square Learning Rule (6/6)
Verify the net for each pattern, with net = 3 X_1 - 2 X_2 - 2 X_3:

(1, 1, 0): net = 3 - 2 = 1, so Y = 1, ok
(1, 0, 1): net = 3 - 2 = 1, so Y = 1, ok
(1, 1, 1): net = 3 - 2 - 2 = -1, so Y = -1, ok

[Figure: an ADALINE with inputs X_1, X_2, X_3, weights 3, -2, -2, and output Y.]

Since <\varepsilon_k^2> = 0, this is the best (minimum-error) solution. (*)

Proof of Least Square Learning Rule (1/3)
We use the least mean square error to ensure the minimum total error. As the total error approaches zero, the best solution is approached. Therefore, we look for the minimum of <\varepsilon_k^2>.
Proof:
Let <\cdot> represent the mean over the L patterns, and let \varepsilon_k = T_k - Y_k.

<\varepsilon_k^2> = \frac{1}{L} \sum_{k=1}^{L} (T_k - Y_k)^2
= \frac{1}{L} \sum_{k=1}^{L} (T_k^2 - 2 T_k Y_k + Y_k^2)
= \frac{1}{L} \sum_{k=1}^{L} T_k^2 - \frac{2}{L} \sum_{k=1}^{L} T_k Y_k + \frac{1}{L} \sum_{k=1}^{L} Y_k^2
= <T_k^2> - \frac{2}{L} \sum_{k=1}^{L} T_k Y_k + \frac{1}{L} \sum_{k=1}^{L} Y_k^2

where Y_k = W^t X_k = X_k^t W.

Proof of Least Square Learning Rule (2/3)
ps: Y_k = \sum_{i=0}^{n} w_i x_{ik} = W^t X_k = X_k^t W, so

Y_k^2 = (W^t X_k)(X_k^t W) = W^t (X_k X_k^t) W

\frac{1}{L} \sum_{k=1}^{L} Y_k^2 = W^t \left[ \frac{1}{L} \sum_{k=1}^{L} X_k X_k^t \right] W = W^t R W

Substituting back:

<\varepsilon_k^2> = <T_k^2> - 2 \left[ \frac{1}{L} \sum_{k=1}^{L} T_k X_k^t \right] W + W^t R W
= <T_k^2> - 2 P^t W + W^t R W

Proof of Least Square Learning Rule (3/3)
Let R'_k = X_k X_k^t, an (n+1) x (n+1) matrix, and R' = R'_1 + R'_2 + \dots + R'_k + \dots + R'_L.
Let R = R'/L, i.e., the mean of the X_k X_k^t; R is also called the correlation matrix.
Let P = \frac{1}{L} \sum_{k=1}^{L} T_k X_k.
Find W^* such that <\varepsilon_k^2> is minimal:

\frac{\partial <\varepsilon_k^2>}{\partial W} = \frac{\partial}{\partial W} \left( <T_k^2> - 2 P^t W + W^t R W \right) = -2P + 2RW = 0

\Rightarrow R W^* = P \Rightarrow W^* = R^{-1} P
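The conclusion can be checked numerically on the chapter's worked example; numpy is an implementation choice:

```python
# Check: at W* = R^{-1} P the gradient -2P + 2RW vanishes, and since
# R is positive semidefinite, <e_k^2> is minimal there.
import numpy as np

X = np.array([[1, 1, 0], [1, 0, 1], [1, 1, 1]], dtype=float)
T = np.array([1.0, 1.0, -1.0])
L = len(T)
R, P = X.T @ X / L, X.T @ T / L

def mse(W):
    """<e_k^2> = <T_k^2> - 2 P^t W + W^t R W."""
    return np.mean(T ** 2) - 2 * P @ W + W @ R @ W

W_star = np.linalg.solve(R, P)
grad = -2 * P + 2 * R @ W_star       # gradient at the optimum
assert np.allclose(grad, 0)

# Any perturbation of W* can only raise the error:
rng = np.random.default_rng(0)
deltas = 0.1 * rng.standard_normal((100, 3))
assert all(mse(W_star + d) >= mse(W_star) for d in deltas)
```

For this example the minimum error is <\varepsilon_k^2> = 0, consistent with slide (6/6).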
