Outline
• Introduction
• The Convolution Operation
• Main Properties of CNNs (due to Convolution)
• Pooling
• Variants of the Basic Convolution Function
• Data Types
• Training Convolutional Networks
Applications
• CNNs have applications in image and video
recognition, recommender systems and natural
language processing.
• CNNs have played a
fundamental role in the
history of deep learning.
• They are still in use and
an important topic of
research.
Inspiration
• Convolutional Neural Networks (CNNs; LeCun, 1989) were inspired by the animal visual cortex.
Inspiration
• Individual cortical neurons respond to stimuli only
in a restricted region of the visual field known as
the receptive field.
• The receptive fields of different neurons partially
overlap such that they cover the entire visual field.
Topology of Input Types
Main Operations of CNNs
An Architecture
The Convolution Operation
The Convolution Operation
Now suppose that our laser sensor is noisy and we would like to average several measurements:

s(t) = ∫ x(a) w(t − a) da, i.e. s(t) = (x ∗ w)(t)

In the discrete case, when t takes only integer values:

s(t) = (x ∗ w)(t) = Σ_{a=−∞}^{+∞} x(a) w(t − a)
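The discrete sum can be checked with a minimal NumPy sketch; the measurement values and averaging weights below are illustrative, and np.convolve implements exactly the flipped-kernel sum x(a) w(t − a):

```python
import numpy as np

# Illustrative noisy measurements x(t) and averaging weights w(a).
x = np.array([1.0, 2.0, 4.0, 3.0, 5.0])
w = np.array([0.25, 0.5, 0.25])   # weights sum to 1, so s is a weighted average

# np.convolve computes s(t) = sum_a x(a) * w(t - a); mode="same"
# keeps the output the same length as the input.
s = np.convolve(x, w, mode="same")
print(s)  # smoothed signal, e.g. s[2] = 0.25*2 + 0.5*4 + 0.25*3 = 3.25
```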
Discretization
• In machine learning applications, the input is
usually a multidimensional array of data and the
kernel a multidimensional array of parameters that
are adapted by the learning algorithm.
• Such multidimensional arrays are called tensors.
• These tensors are zero everywhere but at a finite set of points, so in practice the infinite summation can be implemented as a sum over a finite number of array elements.
• Convolution is commutative:
S(i, j) = (K ∗ I)(i, j) = Σ_m Σ_n I(i − m, j − n) K(m, n)
• Cross-correlation:
S(i, j) = (I ∗ K)(i, j) = Σ_m Σ_n I(i + m, j + n) K(m, n)
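The two formulas differ only in whether the kernel is flipped. A small illustrative implementation (the function names and array values are mine, not from the slides) makes the relationship explicit:

```python
import numpy as np

def conv2d(I, K):
    # Valid 2D convolution: S(i,j) = sum_m sum_n I(i-m, j-n) K(m,n),
    # implemented as a sliding dot product with the flipped kernel.
    Kf = K[::-1, ::-1]
    h = I.shape[0] - K.shape[0] + 1
    w = I.shape[1] - K.shape[1] + 1
    S = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            S[i, j] = np.sum(I[i:i + K.shape[0], j:j + K.shape[1]] * Kf)
    return S

def xcorr2d(I, K):
    # Valid 2D cross-correlation: S(i,j) = sum_m sum_n I(i+m, j+n) K(m,n),
    # i.e. convolution with the kernel flipped back.
    return conv2d(I, K[::-1, ::-1])

I = np.arange(16.0).reshape(4, 4)
K = np.array([[1.0, 2.0], [3.0, 4.0]])

# Convolving with K equals cross-correlating with the flipped K:
print(np.allclose(conv2d(I, K), xcorr2d(I, K[::-1, ::-1])))  # True
```

Many deep learning libraries actually implement cross-correlation and call it convolution; since the kernel is learned, the flip makes no practical difference.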
Main Properties of CNNs
Sparse Interactions
• In traditional neural
networks every output unit
interacts with every input
unit.
• Convolutional networks,
instead, thanks to kernels
smaller than the input,
achieve sparse interactions
(sparse connectivity).
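A rough back-of-the-envelope comparison (the image size and kernel size below are illustrative assumptions) shows why sparse connectivity matters:

```python
# Illustrative sizes: a 320 x 280 input and output, and a 3 x 3 kernel.
n_in = 320 * 280
n_out = 320 * 280
kernel = 3 * 3

dense_weights = n_in * n_out   # fully connected: every output sees every input
conv_weights = kernel          # convolutional: each output sees only a 3x3
                               # patch, and the same 9 weights are reused

print(dense_weights)  # 8028160000  (~8 * 10^9)
print(conv_weights)   # 9
```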
Example:
efficiency of edge detection
(convolution is about 60,000 times more efficient than dense matrix multiplication)
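As a sketch of that example, detecting vertical edges only requires differencing horizontally adjacent pixels, i.e. a two-element kernel (the image below is a made-up dark/bright boundary):

```python
import numpy as np

# A made-up image: dark on the left, bright on the right.
img = np.zeros((5, 6))
img[:, 3:] = 1.0

# Differencing adjacent columns is cross-correlation of each row with [-1, 1]:
# each output pixel costs 2 multiply-adds, versus one full dot product per
# output pixel for a dense matrix multiplication.
edges = img[:, 1:] - img[:, :-1]
print(edges[0])  # nonzero only at the boundary column
```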
Example
Parameter Sharing (Tied Weights)
Equivariance to Translation
• Shifting the input and then convolving gives the same result as convolving and then shifting: if I’(x,y) = I(x-1,y), the convolution of I’ equals the convolution of I shifted by one pixel.
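A quick numerical check of this property, in 1D with an illustrative signal and kernel: the output of the shifted input equals the shifted output of the original input.

```python
import numpy as np

def slide(x, w):
    # Sliding dot product over all valid positions (cross-correlation style).
    n = len(x) - len(w) + 1
    return np.array([np.dot(x[i:i + len(w)], w) for i in range(n)])

x = np.array([0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0, 0.0])
w = np.array([1.0, -1.0])

shifted_x = np.roll(x, 1)      # I'(t) = I(t - 1): input shifted right by one
out = slide(x, w)
out_shifted = slide(shifted_x, w)

# Output of the shifted input == shifted output of the original input:
print(np.allclose(out_shifted[1:], out[:-1]))  # True
```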
Pooling
• A typical layer of a
convolutional network
consists of three stages:
1. first stage: several convolutions;
2. second stage: several nonlinear activations (e.g., rectified linear);
3. third stage: a pooling function.
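The three stages can be strung together in a small 1D sketch (the kernel values and pooling width are illustrative assumptions, not from the slides):

```python
import numpy as np

def conv_layer(x, kernels, pool=2):
    # Stage 1: several convolutions, one output row per kernel
    # (sliding dot products over the valid positions).
    n = len(x) - kernels.shape[1] + 1
    z = np.array([[np.dot(x[i:i + k.size], k) for i in range(n)] for k in kernels])
    # Stage 2: nonlinear activation, here rectified linear: max(0, z).
    a = np.maximum(z, 0.0)
    # Stage 3: max pooling over non-overlapping windows of width `pool`.
    m = a.shape[1] // pool * pool
    return a[:, :m].reshape(a.shape[0], -1, pool).max(axis=2)

x = np.array([1.0, -1.0, 2.0, 0.5, -0.5, 3.0, 1.0, 0.0])
kernels = np.array([[1.0, -1.0], [0.5, 0.5]])
out = conv_layer(x, kernels)
print(out.shape)  # (2, 3): two feature maps, each pooled from 7 to 3 values
```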
Pooling
• A pooling function replaces the output of the net at
a certain location with a summary statistic of the
nearby outputs.
• For example, the max pooling operation reports the
maximum output within a rectangular
neighborhood.
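For example, 2 × 2 max pooling on a small made-up feature map (a minimal sketch):

```python
import numpy as np

def max_pool2d(x, size=2):
    # Replace each non-overlapping size x size block with its maximum.
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

x = np.array([[1., 3., 2., 0.],
              [4., 2., 1., 1.],
              [0., 0., 5., 6.],
              [1., 2., 7., 8.]])
print(max_pool2d(x))
# [[4. 2.]
#  [2. 8.]]
```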
ReLU (Rectified Linear Unit)
Strided Convolution
Unshared Convolution
Tiled Convolution
Data Types
Training Convolutional Networks
Thank you for your kind attention!
Any questions?