More Convolutional Neural Networks
CS 519 Deep Learning, Winter 2016
Fuxin Li
With materials from Zsolt Kira, Roger Grosse, Nitish Srivastava
Action Items
Project proposals and teams due on 2/9 by end of day (11:59 PM)
Reminder: no more than 3 people per team
We will then arrange the presentation order before 2/11
Reminder: the class on 2/11 will last until 6 PM!
[Figure: worked 2-D convolution examples: a small filter slides over the input, and each output entry is the sum of the elementwise products between the filter and the input patch under it. A "What if:" follow-up shows that filter responses can be negative, leading into the rectifier on the next slide.]
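The sliding-window computation in the figure can be sketched in a few lines of numpy. This is a minimal "valid" convolution (computed as cross-correlation, without the kernel flip, which is what deep learning libraries actually do); the array values are made up for illustration.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution as deep learning libraries compute it
    (cross-correlation, no kernel flip): slide the kernel over the
    image and sum the elementwise products at every position."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy example: note the output can contain negative values.
image = np.array([[1., 2., 0.],
                  [0., 1., 3.],
                  [2., 0., 1.]])
kernel = np.array([[1., 0.],
                   [0., -1.]])
print(conv2d_valid(image, kernel))  # [[ 0. -1.] [ 0.  0.]]
```

A 3x3 input convolved with a 2x2 filter gives a 2x2 output: in general, (H - k + 1) x (W - k + 1) positions for a k x k filter.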
Rectifier (ReLU): f(x) = max(0, x)
We need nonlinearity
ReLU makes the gradient sparser and simpler to compute
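The two bullets above can be made concrete: ReLU is elementwise max(0, x), and its (sub)gradient is just an indicator, so it is cheap to compute and zero wherever the input was negative. A minimal numpy sketch:

```python
import numpy as np

def relu(x):
    """Rectifier: f(x) = max(0, x), applied elementwise."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """(Sub)gradient of ReLU: 1 where x > 0, else 0.
    Sparse (many exact zeros) and trivial to compute."""
    return (x > 0).astype(float)

z = np.array([-2.0, -0.5, 0.0, 0.5, 3.0])
print(relu(z))       # [0.  0.  0.  0.5 3. ]
print(relu_grad(z))  # [0. 0. 0. 1. 1.]
```

Compare with sigmoid or tanh, whose gradients require evaluating the nonlinearity itself and are nonzero almost everywhere.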
Convolution/ReLU/Pooling
MNIST again
LeNet
Convolutional nets were invented by Yann LeCun et al. (1989)
for handwritten digit classification
The 82 errors made by LeNet-5
The human error rate is probably about 0.2%-0.3% (quite clean)
ReLU rectifier
Max-pooling
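Max-pooling can likewise be sketched in numpy. This is a non-overlapping 2x2 pooling (stride equal to the window size, the common default); the feature-map values are made up for illustration.

```python
import numpy as np

def max_pool(fmap, size=2):
    """Non-overlapping max-pooling: keep only the largest value in each
    size x size window (assumes the dimensions divide evenly)."""
    h, w = fmap.shape
    # Split rows and columns into blocks, then take the max per block.
    return fmap.reshape(h // size, size, w // size, size).max(axis=(1, 3))

fmap = np.array([[1., 5., 2., 0.],
                 [3., 2., 1., 4.],
                 [0., 1., 8., 2.],
                 [6., 0., 3., 3.]])
print(max_pool(fmap))  # [[5. 4.] [6. 8.]]
```

Pooling halves each spatial dimension here, giving a small amount of translation invariance: shifting a strong activation within its 2x2 window does not change the output.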
[Slide equations (garbled in extraction): a first-layer mapping of the form y = f(x; w) (1st layer), with two quantities posed as "= ?".]
Visualization of layer 5
Convolution/ReLU/Pooling
arXiv!
Because of how fast the field evolves, most deep learning papers appear on arXiv first
http://arxiv.org/list/cs.CV/recent
http://arxiv.org/list/cs.CL/recent
Check there for the newest papers/ideas!
Deconvolutional Network
Instead of mapping pixels to features, map the other way around
Can be used to learn unsupervised features
Here, attached to a trained convnet
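The "map the other way around" idea can be sketched as a transposed convolution: each feature-map activation stamps a scaled copy of the kernel back into pixel space. A minimal numpy sketch, assuming stride 1 and no padding (the function name and values here are illustrative, not the paper's implementation):

```python
import numpy as np

def deconv2d(fmap, kernel):
    """Transposed ('deconvolutional') pass: every activation in the
    feature map projects the kernel back into the larger pixel grid,
    scaled by the activation; overlapping contributions are summed."""
    kh, kw = kernel.shape
    h, w = fmap.shape
    out = np.zeros((h + kh - 1, w + kw - 1))
    for i in range(h):
        for j in range(w):
            out[i:i + kh, j:j + kw] += fmap[i, j] * kernel
    return out

fmap = np.array([[1., 0.],
                 [0., 2.]])
kernel = np.array([[1., -1.],
                   [0., 1.]])
print(deconv2d(fmap, kernel))
```

Note the shapes run in reverse of the forward convolution: a 2x2 feature map and a 2x2 kernel reconstruct a 3x3 pixel patch, which is why this operation is used to visualize what a unit in a deep layer responds to.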
Visualization of layer 5