A. 3 B. 9 C. 4.5 D. 2.12
Question 3: Perceptrons
Assume a perceptron:
• with 3 inputs plus 1 for bias
• where net = x0*w0 + x1*w1 + x2*w2 + x3*w3
• that outputs 1 if net > 0, else 0

Output rule: Yj = 1 if W * X = Sum(wij * xij) > 0 (i = 1..n), otherwise Yj = 0.
Delta weight rule: Wi,j+1 = Wi,j + a * (Tj - Yj) * Xi,j

Training set:
0 0 1 -> 0
1 1 1 -> 1
1 0 1 -> 1
0 1 1 -> 0

What will the final weight vector look like when the fourth data item is processed?
A. 1 1 1 1  B. -1 0 1 0  C. 0 0 0 0  D. 1 0 0 0  E. None of the above
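A minimal Python sketch of one pass of the delta weight rule over this training set. Two details are assumptions, not stated on the slide: the initial weights are all 0, and the bias is appended to each pattern as a fourth input fixed at 1.

```python
# Perceptron delta rule: W <- W + a * (T - Y) * X, with Y = 1 if W.X > 0 else 0.
# Assumptions (not explicit in Question 3): initial weights all 0, learning
# rate a = 1, bias appended as a fourth input fixed at 1.

def perceptron_output(w, x):
    net = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if net > 0 else 0

def train_epoch(w, data, a=1):
    # Process the patterns once, in order, updating after each one.
    for x, t in data:
        y = perceptron_output(w, x)
        w = [wi + a * (t - y) * xi for wi, xi in zip(w, x)]
    return w

# Training set from the slide; each pattern gets the bias input 1 appended.
data = [([0, 0, 1, 1], 0),
        ([1, 1, 1, 1], 1),
        ([1, 0, 1, 1], 1),
        ([0, 1, 1, 1], 0)]

print(train_epoch([0, 0, 0, 0], data))
```

Under these assumptions only the second and fourth patterns trigger an update (the others are already classified correctly when reached).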
Question 4: Single Neuron in a Feedforward Network
Assume a single neuron:
• with 3 inputs plus 1 for bias
• with a learning rate a = 1
• with initial weights all 0

If Sum(wij * ai) > 0, the output of unit j is calculated as Y = f(Sum(wij * ai)); otherwise Y = 0.
The activation function f is ReLU.

Weight update: Wji = Wji + a * (Tj - Yj) * g'(SUMj) * Xi
Wji = Wji + a * (Tj - Yj) * Xij if f is linear.

Training set:
0 0 1 -> 0
1 1 1 -> 1
1 0 1 -> 1
0 1 1 -> 0

What will the final weight vector look like when the fourth data item is processed?
A. -1 1 -1 -1  B. -1 0 -1 0  C. 0 0 -1 0  D. 0 1 0 -1  E. None of the above
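A sketch of the same pass with the derivative term g'(SUMj) included. One point the slide leaves open is the value of the ReLU derivative at exactly 0; with all-zero initial weights the first net inputs are 0, so this convention decides whether any update fires at all. The parameter g0 below makes that assumption explicit; the appended bias input of 1 is likewise an assumption.

```python
# Delta rule with activation derivative: W <- W + a * (T - Y) * g'(SUM) * X.
# Assumptions: initial weights all 0, a = 1, bias appended as a fourth input
# fixed at 1. For ReLU, g'(s) = 1 for s > 0 and 0 for s < 0; the value at
# s = 0 is a convention, exposed as the parameter g0.

def relu(s):
    return s if s > 0 else 0

def train_epoch(w, data, a=1, g0=0):
    for x, t in data:
        s = sum(wi * xi for wi, xi in zip(w, x))
        y = relu(s)                      # the slide's rule: Y = 0 when SUM <= 0
        gprime = 1 if s > 0 else (g0 if s == 0 else 0)
        w = [wi + a * (t - y) * gprime * xi for wi, xi in zip(w, x)]
    return w

data = [([0, 0, 1, 1], 0),
        ([1, 1, 1, 1], 1),
        ([1, 0, 1, 1], 1),
        ([0, 1, 1, 1], 0)]

print(train_epoch([0, 0, 0, 0], data, g0=0))  # g'(0) = 0: no update ever fires
print(train_epoch([0, 0, 0, 0], data, g0=1))  # g'(0) = 1: updates proceed
```

With g'(0) = 0 the weights never leave zero on this data; with g'(0) = 1 the second and third patterns each trigger an update.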
Question 5: Backpropagation
[Figure: a feedforward network of six neurons. Inputs X1 = 1 and X2 = 2 feed neurons 1-3; neurons 1-3 feed neurons 4 and 5; neurons 4 and 5 feed output neuron 6. All initial weights are 0.5, Bias = 0, Learning rate = 1, Activation function = ReLU. Forward pass: y1 = y2 = y3 = 1.5, y4 = y5 = 2.25, output Y = 2.25; target Z = 3. Backpropagated deltas: D = 0.75 at neuron 6, D1 = D2 = D3 = D4 = D5 = 0.375. Updated weights shown in the figure: W' = 0.5 + 1*0.375 = 0.875 for the edges from X1, W' = 0.5 + 2*0.375 = 1.25 for the edges from X2, W' = 0.5 + 1.5*0.375 for the edges from neurons 1-3; the weights W' for the edges from neurons 4 and 5 to neuron 6 are left as "?".]

What is the adapted weight for the edges between neurons 4, 5 and 6?
A. 1.055  B. 2.555  C. 1.755  D. 2.1875  E. 1.25
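The per-edge update used throughout the figure is W' = W + eta * delta_post * y_pre, where delta_post is the error term at the downstream neuron and y_pre the output of the upstream neuron. A sketch with the values read off the figure above:

```python
# Backprop weight update for a single edge: W' = W + eta * delta_post * y_pre.
# All values below are taken from the Question 5 figure: eta = 1, every
# initial weight is 0.5.

def updated_weight(w, eta, delta_post, y_pre):
    return w + eta * delta_post * y_pre

eta = 1.0
w = 0.5

# Edges from the input layer into neurons 1-3 (deltas 0.375):
print(updated_weight(w, eta, 0.375, 1))     # from X1 = 1
print(updated_weight(w, eta, 0.375, 2))     # from X2 = 2

# Edges from neurons 4 and 5 into output neuron 6:
# upstream outputs y4 = y5 = 2.25, output delta = 0.75
print(updated_weight(w, eta, 0.75, 2.25))
```

The first two lines reproduce the 0.875 and 1.25 shown in the figure, confirming the rule before it is applied to the edges the question asks about.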
Question 6: Recurrent Neural Network
Which network structure would best fit an analysis task where a sequence of words is mapped onto a sentiment or opinion?
A.-E. [Answer options are network-structure diagrams, not reproduced here.]
Question 7: Recurrent Neural Network
Which RNN architecture is well known for handling the vanishing gradient problem well?
[Figure from a later question: a network with weights w32 = 3 and w42 = 1, inputs x32 = 0 and x42 = 0, output Y2 = 1; the weights W13, W23, W33, W43 are asked for.]
What will the final weight vector look like when the third data item (1 1 0 0) is processed?
A. 3 1 3 1  B. 2 1 3 2  C. 4 2 3 1  D. 2 2 2 2  E. None of the above
Question 10: Associative Memory
Procedure for establishing a weight matrix and for matching a test case using a matrix approach:
1. Define X in terms of the three training vectors
2. Define the weight matrix W = XTX
3. Normalise W by dividing by p (the number of training patterns)
4. Evaluate a training pattern through matrix multiplication
5. Apply a threshold function with threshold = 0
        1  1  1          1  1 -1               3  1 -1                       1   1/3  -1/3
X =     1  1 -1    XT =  1  1  1    W = XTX =  1  3  1    ->  (1/3) W =    1/3    1   1/3
       -1  1  1          1 -1  1              -1  1  3                    -1/3  1/3     1
[Answer options A-D: plots of candidate threshold functions over the range -2 to 2; options C and D step to 1/2. The diagrams are not reproduced here.]
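Steps 1-5 of the Question 10 procedure can be sketched directly in matrix form. The only added convention is that the threshold step maps non-positive activations to -1, matching the bipolar training patterns:

```python
import numpy as np

# Steps 1-5 from the slide: build W = XTX from the three training vectors,
# normalise by p = 3, recall a pattern by matrix multiplication, threshold at 0.

X = np.array([[ 1, 1,  1],
              [ 1, 1, -1],
              [-1, 1,  1]])
p = X.shape[0]                 # number of training patterns

W = X.T @ X / p                # normalised weight matrix

def recall(x):
    # Evaluate the pattern and apply the threshold function (threshold = 0),
    # mapping non-positive values to -1 (bipolar convention).
    return np.where(W @ x > 0, 1, -1)

for x in X:
    print(x, '->', recall(x))
```

Each stored pattern is a fixed point of recall here, which is what the "evaluate training pattern" step is meant to verify.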
Question 13: Convolutional Networks
Which technique is NOT a key ingredient in the methodology of Convolutional Neural Networks?

If the input array A looks like:

1 2 3 4
4 3 2 1
1 2 3 4
4 3 2 1

the filter F is:

1 0
0 1

and the stride is 1, what does the feature map look like after the convolution of input array A with the filter F?

A. 4 4 4    B. 5 5 5    C. 4 4 4    D. 3 3 3
   5 5 5       6 6 6       6 6 6       6 6 6
   4 4 4       5 5 5       4 4 4       4 4 4
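A sketch of the convolution as defined here (the sliding elementwise product-and-sum used in CNNs, i.e. cross-correlation), applied to the A and F above with stride 1:

```python
# Slide a filter F over input A with the given stride, summing the elementwise
# products at each position; output size is (4 - 2)/1 + 1 = 3 per dimension.

A = [[1, 2, 3, 4],
     [4, 3, 2, 1],
     [1, 2, 3, 4],
     [4, 3, 2, 1]]
F = [[1, 0],
     [0, 1]]

def convolve(a, f, stride=1):
    fh, fw = len(f), len(f[0])
    out = []
    for i in range(0, len(a) - fh + 1, stride):
        row = []
        for j in range(0, len(a[0]) - fw + 1, stride):
            row.append(sum(f[u][v] * a[i + u][j + v]
                           for u in range(fh) for v in range(fw)))
        out.append(row)
    return out

for row in convolve(A, F):
    print(row)
```

With this diagonal filter, each output cell is simply A[i][j] + A[i+1][j+1].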
Question 16: Convolutional Neural Networks
If the array C output from the convolution looks like:

1 2 3 4
5 6 7 8
1 3 2 4
5 7 6 8

the filter F is 2x2, and the stride is 2, what does the feature map look like after subsampling (pooling) array C with filter F using average pooling?

A. 5 4  B. 5 3.5  C. 4 5.5  D. 4 5
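Average pooling with a 2x2 window at stride 2 partitions the 4x4 array into four non-overlapping 2x2 blocks and replaces each by its mean, yielding a 2x2 feature map. A sketch over the C above:

```python
# Average pooling: slide a size x size window over the array at the given
# stride and emit the mean of each window; 4x4 input -> 2x2 output here.

C = [[1, 2, 3, 4],
     [5, 6, 7, 8],
     [1, 3, 2, 4],
     [5, 7, 6, 8]]

def avg_pool(a, size=2, stride=2):
    out = []
    for i in range(0, len(a) - size + 1, stride):
        row = []
        for j in range(0, len(a[0]) - size + 1, stride):
            window = [a[i + u][j + v] for u in range(size) for v in range(size)]
            row.append(sum(window) / len(window))
        out.append(row)
    return out

for row in avg_pool(C):
    print(row)
```

Swapping the mean for max() would turn this into the max-pooling variant that other questions of this type often ask about.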
Recommendations for further reading
Warren McCulloch and Walter Pitts, 1943.
"A Logical Calculus of the Ideas Immanent in Nervous Activity", Bulletin of Mathematical Biophysics, Vol. 5.
http://www.cse.chalmers.se/~coquand/AUTOMATA/mcp.pdf
D. O. Hebb, 1949.
The Organization of Behavior: A Neuropsychological Theory, John Wiley.
http://s-f-walker.org.uk/pubsebooks/pdfs/The_Organization_of_Behavior-Donald_O._Hebb.pdf
Frank Rosenblatt, 1957.
"The Perceptron, a Perceiving and Recognizing Automaton", Project Para, Cornell Aeronautical Laboratory.
https://blogs.umass.edu/brain-wars/files/2016/03/rosenblatt-1957.pdf
Weibo Liu, Zidong Wang, Xiaohu Liu, Nianyin Zeng, Yurong Liu and Fuad E. Alsaadi, 2017.
"A Survey of Deep Neural Network Architectures and Their Applications", Neurocomputing, Volume 234.
https://bura.brunel.ac.uk/bitstream/2438/14221/1/FullText.pdf
NPTEL