
ARAB ACADEMY FOR SCIENCE, TECHNOLOGY & MARITIME TRANSPORT
LATTAKIA BRANCH
Department: Department of Computer Engineering

Course: Neural Networks
Code:
Date:
Start time:
Exam Time: 2 Hours
Lecturer Name: Dr. Khalid Eskaf


Final Exam
Student Name:

Q1:        Q2:        Q3:        Q4:        Q5:        Total:

For each question, please select a maximum of ONE of the given answers (either A, B, C, D or E). You should select the one answer that represents the BEST possible reply to the question (in some cases, there may be no obvious "wrong" answers, but one answer should always be better than the others).
Answer

1.1  ___    1.6  ___    1.11 ___    1.16 ___
1.2  ___    1.7  ___    1.12 ___    1.17 ___
1.3  ___    1.8  ___    1.13 ___    1.18 ___
1.4  ___    1.9  ___    1.14 ___    1.19 ___
1.5  ___    1.10 ___    1.15 ___    1.20 ___


(1.1) What are hidden layers?
A. Layers of units that have no direct connections to any other units.
B. Layers of units that have no direct connection to the input or the output.
C. Layers of units that do not contribute towards the output.
D. None of the above answers.
(1.2) Each of the inputs to the perceptron is multiplied by a number to give it a weight. These weights allow the strength of the different connections to be changed so that the perceptron can learn.
A. TRUE.
B. FALSE.
(1.3) A perceptron adds up all the weighted inputs it receives. If the sum exceeds a certain value, then the perceptron outputs a 1, otherwise it just outputs a 0.
A. TRUE.
B. FALSE.
C. Sometimes - it can also output continuous values in between 0 and 1.
(1.4) The name for the function in question (1.3) is
A. Unipolar step function.
B. Bipolar step function.
C. Sigmoid function.
D. Logistic function.
E. Perceptron function.
(1.5) Perceptrons can be used for many different tasks, e.g., to recognize letters of the alphabet. Can a perceptron find a good solution in any pattern recognition task?
A. Yes.
B. No.
(1.6) What is back-propagation?
A. It is the transfer of error back through the network to adjust the inputs.
B. It is the transfer of error back through the network to allow the weights to be adjusted.
C. It is the transfer of error back through the network using a set of recurrent connections.
D. It is the transfer of outputs back from the hidden layer to the input layer using a set of recurrent connections.
(1.7) A multi-layer feedforward network with "logsig" activation functions can solve the XOR problem satisfactorily: this is because each unit can linearly separate one part of the space, and they can then combine their results.
A. True - these networks can do this, but they are unable to learn to do it - they have to be coded by hand.
B. True - this usually works and these networks can learn to classify even complex problems.
C. False - only a network with sigmoid activation functions can learn to solve the XOR problem.
D. False - just having a single layer network is enough.
(1.8) A multi-layer network should have the same number of units in the input layer and the output layer.
A. TRUE.
B. FALSE.

(1.9) What does the following MATLAB function do?

>> net = newff(minmax(p), [4 2], {'tansig','logsig'});

A. Initialize a single-layer network with 4 input units, 2 output units and linear activation functions.
B. Initialize a multi-layer network with 4 hidden units, 2 output units and sigmoid activation functions.
C. Initialize a multi-layer network with nonlinear activation functions and two hidden layers - the first hidden layer has 4 units and the second one has 2 units.
D. Initialize a multi-layer network with sigmoid activation functions, 4 hidden units and 2 recurrent connections back to the input layer.
(1.10) A neuron with 4 inputs has the weight vector w = [1, 2, 3, 4] and a bias = 0 (zero). The activation function is linear, where the constant of proportionality equals 2; that is, the activation function is given by f(net) = 2 x net. If the input vector is x = [4, 8, 5, 6], then the output of the neuron will be
A. 1.
B. 56.
C. 59.
D. 112.
E. 118.
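The arithmetic in (1.10) can be checked directly; a minimal Python sketch:

```python
# Check of question (1.10): linear neuron with f(net) = 2 * net.
w = [1, 2, 3, 4]   # weight vector
x = [4, 8, 5, 6]   # input vector
bias = 0
net = sum(wi * xi for wi, xi in zip(w, x)) + bias  # 1*4 + 2*8 + 3*5 + 4*6 = 59
output = 2 * net                                   # linear activation with slope 2
print(net, output)  # 59 118
```

The weighted sum 59 matches option C, but the neuron's output after the activation function is 118, option E.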
(1.11) A perceptron with a unipolar step function has two inputs with weights w1 = 0.5 and w2 = -0.2, and a threshold θ = 0.3 (θ can therefore be considered as a weight for an extra input which is always set to -1). For a given training example x = [0, 1]', the desired output is 0 (zero). Does the perceptron give the correct answer (that is, is the actual output the same as the desired output)?
A. Yes.
B. No.
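The check for (1.11) is one weighted sum and one comparison; a Python sketch:

```python
# Check of question (1.11): unipolar step perceptron.
w = [0.5, -0.2]
theta = 0.3        # threshold; equivalently a weight on an extra input fixed at -1
x = [0, 1]
net = sum(wi * xi for wi, xi in zip(w, x))   # 0*0.5 + 1*(-0.2) = -0.2
output = 1 if net >= theta else 0            # unipolar step: fire only when net >= theta
desired = 0
print(output, output == desired)  # 0 True
```

Since -0.2 is below the threshold 0.3, the unit outputs 0, which matches the desired output.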
(1.12) The network of Figure 1 is:
(a) a single layer feed-forward neural network
(b) an autoassociative neural network
(c) a multiple layer neural network

[Figure 1: network diagram]


(1.13) The network shown in Figure 1 is trained to recognize the characters H and T as shown below:

[Figure: two training pairs, each showing an INPUT pixel pattern and its corresponding OUTPUT pattern]

If the following pattern was given:

[Figure: a corrupted INPUT pattern]

What would be the output of the network?
(The answer options are output patterns shown in the original figure, e.g., one trained pattern, or the OR of two or three patterns.)
(1.14) The following network is a multi-layer perceptron, where all of the units have binary inputs (0 or 1) and binary outputs (0 or 1).

[Figure: network with input units 1 and 2, hidden unit 3, and output unit 4]

The weights for this network are w31 = 1, w32 = 1, w41 = -1, w42 = -1 and w43 = 3. The threshold of the hidden unit (3) is 1.5 and the threshold of the output unit (4) is -0.5. The threshold of both input units (1 and 2) is 0.5, so the output of these units is exactly the same as the input. Which of the following Boolean functions can be computed by this network?
A. AND.
B. OR.
C. XOR.
D. All of the above answers.
E. None of the above answers.
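The truth table for (1.14) can be enumerated directly. This Python sketch assumes each unit outputs 1 when its net input meets or exceeds its threshold, and reads w_ij as the weight from unit j to unit i, as in the question:

```python
# Truth table of the network in (1.14).
def step(net, theta):
    # Unit fires when its net input reaches the threshold (assumed convention).
    return 1 if net >= theta else 0

def network(x1, x2):
    h = step(1 * x1 + 1 * x2, 1.5)               # hidden unit 3: w31 = w32 = 1
    return step(-1 * x1 - 1 * x2 + 3 * h, -0.5)  # output unit 4: w41 = w42 = -1, w43 = 3

table = [network(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]]
print(table)  # [1, 0, 0, 1]
```

Under this convention the outputs are the complement of XOR, so none of AND, OR or XOR matches exactly; a different firing convention for the thresholds could change this, so treat the check as a sketch rather than a definitive answer key.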


(1.15) Neural network builders say they:
a. do not model human intelligence.
b. do not program solutions.
c. do not aim to solve specific problems per se.
d. seek to give the hardware a generalized capability to learn.
e. All of the above.

(1.16) Hardware and software that attempts to emulate the processing patterns of the biological brain best describes:
a. neural network.
b. expert system.
c. case-based reasoning.
d. fuzzy logic.
e. knowledge work system.
(1.17) What is an important limitation of the learning capability of perceptron networks?
A. The training examples need to be presented in a specific order, otherwise the network may get confused.
B. They can only learn Boolean functions.
C. They are incapable of learning non-linearly separable functions.
D. They are very sensitive to noise.
(1.18) In the following diagram (ANN) of an artificial neuron, associate the labels A, B and C with the following:

[Figure: diagram of an artificial neuron with parts labeled A, B, C and the body of the neuron marked]

1- A: A dendrite, B: An axon, C: A synaptic junction.
2- B: A dendrite, A: An axon, C: A synaptic junction.
3- C: A dendrite, B: An axon, A: A synaptic junction.
(1.19) Neurons are connected:
1- In series
2- In a complex spatial arrangement
3- In parallel


(1.20) The following figure represents a:

[Figure: neuron diagram]

a. Single input neuron.
b. Multiple input neuron.
c. None of the above.

2] State whether each of the following statements is TRUE or FALSE:

1- The basic units of a neuron are input(s), weight(s) and output(s).
2- The neuron is the basic unit of the brain.
3- A neuron cannot accept many inputs.
4- If active inputs are received at once by the neuron, then the neuron will be activated and fire.
5- When the prototype input patterns are orthogonal, the Hebb rule produces some errors.
6- A single layer perceptron (multi neuron) cannot solve the XOR problem.
7- The backpropagation algorithm is to propagate the input forward through the network; the next step is to propagate the sensitivities backward through the network.
8- Humans train a neural network by feeding it a set of training data for which the inputs produce a known set of outputs or conclusions.
9- Neural networks may not perform well if their training covers too much data.
10- Each of the inputs to the perceptron is multiplied by a number to give it a weight. These weights allow the strength of the different connections to be changed so that the perceptron can learn.
11- The output of a neural network depends on the particular transfer function that is chosen.
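Statement 7 summarizes the two passes of backpropagation: a forward pass of the input, then a backward pass of the sensitivities. A minimal numeric Python sketch (the 1-2-1 architecture, weights and training pair are made-up illustrative values, not from the exam), showing that one update reduces the error:

```python
import math

# Tiny 1-2-1 network: tansig hidden layer, linear output.
p, t = 1.0, 0.5                    # one training pair (illustrative)
w1, b1 = [0.3, -0.2], [0.1, 0.4]   # hidden layer weights and biases
w2, b2 = [0.5, -0.4], 0.2          # output layer weights and bias
lr = 0.1                           # learning rate

def forward():
    a1 = [math.tanh(w * p + b) for w, b in zip(w1, b1)]   # forward pass, hidden
    a2 = sum(w * a for w, a in zip(w2, a1)) + b2          # forward pass, output
    return a1, a2

a1, a2 = forward()
e = t - a2
# Backward pass: output sensitivity, then hidden sensitivities through tanh'.
s2 = -2 * e
s1 = [(1 - a * a) * w * s2 for a, w in zip(a1, w2)]
# Gradient-descent weight updates.
w2 = [w - lr * s2 * a for w, a in zip(w2, a1)]
b2 -= lr * s2
w1 = [w - lr * s * p for w, s in zip(w1, s1)]
b1 = [b - lr * s for b, s in zip(b1, s1)]

_, a2_new = forward()
print(abs(t - a2_new) < abs(e))  # True: the error shrinks after one step
```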


3] Answer the following questions.

[3.1] Consider this set of data:

p = [3 3 7 1; 5 5 9 9]
t = [1 1 -1 -1]

where p is the input vector and t is the target.

net = newff(minmax(p), [3 1], {'tansig','purelin'}, 'traingdm');

We then want to train the network net with the following commands and parameters:

>> net.trainParam.epochs = 30;   % (number of epochs)
>> net.trainParam.lr = 0.3;      % (learning rate)
>> net.trainParam.mc = 0.6;      % (momentum)
>> net = train(net, p, t);

The following is a plot of training error vs. epochs resulting after we run the above commands:

[Figure: training error vs. epochs]

The output results for the following command will be:
>> a = sim(net, p);
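'traingdm' in [3.1] is gradient descent with momentum: each update adds a fraction mc of the previous update to the plain gradient step. A Python sketch of that update rule on a made-up one-weight quadratic loss, reusing the lr, mc and epoch count from [3.1]:

```python
# Gradient descent with momentum (the rule behind MATLAB's traingdm),
# on the illustrative loss L(w) = (w - 2)^2 with minimum at w = 2.
lr, mc = 0.3, 0.6       # learning rate and momentum, as in [3.1]
w, dw = 0.0, 0.0        # weight and previous update
for epoch in range(30): # 30 epochs, as in [3.1]
    grad = 2 * (w - 2)          # dL/dw
    dw = mc * dw - lr * grad    # momentum term plus gradient step
    w += dw
print(w)  # approaches the minimum at w = 2
```

The momentum term smooths the trajectory and lets the weight keep moving through flat regions; with these settings the weight oscillates toward and settles near the minimum.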

[3.2] Select the best match from group A with group B for the following figures:

[Figures: network diagrams in group A (e.g., "Three Layer" network) and descriptions in group B]

Answer:

Group A    Group B
1
2
3

4] Answer the following questions.

[4.1] For the following Matlab code answer the following questions:

p = [-1 -1 2 2; 0 5 0 5];
t = [-1 -1 1 1];
net = newff(minmax(p), [3,1], {'tansig','purelin'}, 'traingd');
net.trainParam.show = 50;
net.trainParam.lr = 0.05;
net.trainParam.epochs = 300;
net.trainParam.goal = 1e-5;
1- What is the type of Artificial Neural Network used?

2- How many layers are used, and how many neurons in each layer?

3- What are the transfer functions used in each layer?

4- What is the input vector?

5- What is the target vector?

6- How many maximum iterations will be used to train the neural network?

7- Write the train function code:
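The newff call in [4.1] creates a 2-input network with one hidden layer of 3 tansig neurons and a single purelin output. As a sketch of that architecture, here is a plain-Python forward pass; the random weights are placeholders, since MATLAB performs its own initialization:

```python
import math, random

random.seed(0)
# Architecture from [4.1]: 2 inputs -> 3 tansig hidden neurons -> 1 purelin output.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]  # hidden weights
b1 = [random.uniform(-1, 1) for _ in range(3)]                      # hidden biases
W2 = [random.uniform(-1, 1) for _ in range(3)]                      # output weights
b2 = random.uniform(-1, 1)                                          # output bias

def forward(x):
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]                  # tansig layer
    return sum(w * h for w, h in zip(W2, hidden)) + b2    # purelin layer

# The four input columns of p = [-1 -1 2 2; 0 5 0 5]:
outputs = [forward(x) for x in [(-1, 0), (-1, 5), (2, 0), (2, 5)]]
print(len(outputs))  # 4 scalar outputs, one per input column
```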
problemwith the preceptronru1e.Apply eachinput
r t4.21Solvethe following classification
b n".to, in order,for asmanyrepetitionsasit takesto insurethatthe problemis solved'

=h'1,*='}
=l}r1,"=
r}{r:=[-]l'"=o]{r+
{n=lll,n=o}{rz
Usethe following weightsandbias:

w(0) = [0 0]b(0)= 0

Answer:

Iteration number:

W =            b =

Steps in details:

Neuron Model and Network Architectures

Table 2.1 Transfer Functions

Name                          Input/Output Relation                                MATLAB Function
Hard Limit                    a = 0, n < 0;   a = 1, n >= 0                        hardlim
Symmetrical Hard Limit        a = -1, n < 0;  a = +1, n >= 0                       hardlims
Linear                        a = n                                                purelin
Saturating Linear             a = 0, n < 0;   a = n, 0 <= n <= 1;   a = 1, n > 1   satlin
Symmetric Saturating Linear   a = -1, n < -1; a = n, -1 <= n <= 1;  a = 1, n > 1   satlins
Log-Sigmoid                   a = 1 / (1 + e^-n)                                   logsig
Hyperbolic Tangent Sigmoid    a = (e^n - e^-n) / (e^n + e^-n)                      tansig
Positive Linear               a = 0, n < 0;   a = n, n >= 0                        poslin
Competitive                   a = 1, neuron with max n;  a = 0, all other neurons  compet
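The Input/Output relations in Table 2.1 translate directly into code; a Python sketch of most of the listed functions (names follow the MATLAB column):

```python
import math

# Scalar versions of the transfer functions in Table 2.1.
def hardlim(n):  return 1 if n >= 0 else 0
def hardlims(n): return 1 if n >= 0 else -1
def purelin(n):  return n
def satlin(n):   return 0 if n < 0 else (n if n <= 1 else 1)
def satlins(n):  return -1 if n < -1 else (n if n <= 1 else 1)
def logsig(n):   return 1 / (1 + math.exp(-n))
def tansig(n):   return (math.exp(n) - math.exp(-n)) / (math.exp(n) + math.exp(-n))
def poslin(n):   return n if n >= 0 else 0

print(hardlim(-0.5), logsig(0), tansig(0), satlin(2))  # 0 0.5 0.0 1
```

compet is omitted because it operates on a whole layer (1 for the neuron with maximum n, 0 for all others) rather than on a single scalar.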

Solved example (perceptron learning rule):

{p1 = [2; 2], t1 = 0}  {p2 = [1; -2], t2 = 1}  {p3 = [-2; 2], t3 = 0}  {p4 = [-1; 1], t4 = 1}

Use the initial weights and bias:

W(0) = [0 0]   b(0) = 0

We start by calculating the output a for the first input vector p1, using the initial weights and bias:

a = hardlim(W(0)p1 + b(0)) = hardlim([0 0][2; 2] + 0) = hardlim(0) = 1

The output a does not equal the target value t1, so we use the perceptron rule to find new weights and biases based on the error:

e = t1 - a = 0 - 1 = -1
W(1) = W(0) + e*p1' = [0 0] + (-1)[2 2] = [-2 -2]
b(1) = b(0) + e = 0 + (-1) = -1

We now apply the second input vector p2, using the updated weights and bias:

a = hardlim(W(1)p2 + b(1)) = hardlim([-2 -2][1; -2] + (-1)) = hardlim(1) = 1

This time the output a is equal to the target t2. Application of the perceptron rule will not result in any changes:

W(2) = W(1)
b(2) = b(1)

We now apply the third input vector:

a = hardlim(W(2)p3 + b(2)) = hardlim([-2 -2][-2; 2] + (-1)) = hardlim(-1) = 0

The output in response to input vector p3 is equal to the target t3, so there will be no changes:

W(3) = W(2)
b(3) = b(2)

We now move on to the last input vector p4:

a = hardlim(W(3)p4 + b(3)) = hardlim([-2 -2][-1; 1] + (-1)) = hardlim(-1) = 0

This time the output a does not equal the appropriate target t4. The perceptron rule will result in a new set of values for W and b:

e = t4 - a = 1 - 0 = 1
W(4) = W(3) + e*p4' = [-2 -2] + (1)[-1 1] = [-3 -1]
b(4) = b(3) + e = -1 + 1 = 0

We now must check the first vector p1 again. This time the output a is equal to the associated target t1:

a = hardlim(W(4)p1 + b(4)) = hardlim([-3 -1][2; 2] + 0) = hardlim(-8) = 0

Therefore there are no changes:

W(5) = W(4)
b(5) = b(4)

The second presentation of p2 results in an error and therefore a new set of weight and bias values.
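The hand iteration above stops where the second presentation of p2 produces an error; a short Python sketch of the same perceptron rule carries the cycling on until a complete error-free pass:

```python
# Perceptron learning rule on the training set of [4.2], run to convergence.
patterns = [([2, 2], 0), ([1, -2], 1), ([-2, 2], 0), ([-1, 1], 1)]
W, b = [0.0, 0.0], 0.0

def hardlim(n):
    return 1 if n >= 0 else 0

converged = False
while not converged:
    converged = True
    for p, t in patterns:
        a = hardlim(W[0] * p[0] + W[1] * p[1] + b)
        e = t - a
        if e != 0:  # W_new = W_old + e * p', b_new = b_old + e
            W = [W[0] + e * p[0], W[1] + e * p[1]]
            b += e
            converged = False

print(W, b)  # [-2.0, -3.0] 1.0
```

The run converges to W = [-2 -3] and b = 1, which classifies all four training vectors correctly.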

