Notes on Convolutional Neural Networks
Jake Bouvrie
Center for Biological and Computational Learning
Department of Brain and Cognitive Sciences
Massachusetts Institute of Technology
Cambridge, MA 02139
jvb@mit.edu
November 22, 2006
1 Introduction
This document discusses the derivation and implementation of convolutional neural networks (CNNs) [3, 4], followed by a few straightforward extensions. Convolutional neural networks involve many more connections than weights; the architecture itself realizes a form of regularization. In addition, a convolutional network automatically provides some degree of translation invariance. This particular kind of neural network assumes that we wish to learn filters, in a data-driven fashion, as a means to extract features describing the inputs. The derivation we present is specific to two-dimensional data and convolutions, but can be extended without much additional effort to an arbitrary number of dimensions.
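To fix ideas before the derivation, the following sketch shows the basic 2D "valid" convolution operation around which such a network is built; the function name and the toy edge-detecting filter are illustrative choices, not part of the original text.

```python
import numpy as np

def conv2d_valid(x, k):
    """'Valid' 2D convolution: flip the kernel, then slide it over the input.

    The output is smaller than the input by (kernel size - 1) in each
    dimension, matching the 'valid' convolution used in CNN feature maps.
    """
    kh, kw = k.shape
    kf = k[::-1, ::-1]  # true convolution flips the kernel in both axes
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Correlate the flipped kernel with each input patch
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kf)
    return out

# Toy example: a 1x2 differencing filter responds to a vertical step edge
img = np.zeros((5, 5))
img[:, 2:] = 1.0                      # edge between columns 1 and 2
kern = np.array([[1.0, -1.0]])        # hypothetical hand-set filter
resp = conv2d_valid(img, kern)        # each row: [0, 1, 0, 0]
```

Shifting the edge in the input shifts the filter response by the same amount, which is the translation behavior the text refers to; in a CNN the filter entries themselves are learned rather than hand-set.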
We begin with a description of classical backpropagation in fully connected networks, followed by a derivation of the backpropagation updates for the filtering and subsampling layers of a 2D convolutional neural network.