Project author: nmichlo

Project description:
Variance-normalising pre-training of neural networks.
Primary language: Jupyter Notebook
Repository: git://github.com/nmichlo/experiment-normalized-activations.git
Created: 2021-06-18T13:30:55Z
Project homepage: https://github.com/nmichlo/experiment-normalized-activations

License: MIT License



Weight Initialisation & Normalised Activations

NOTE: it turns out similar ideas have already recently been investigated:

NB: After looking at the work above and experimenting with its
initialisation methods, its results usually beat the experiments below.

Old Experiments

Experiments based on normalising neural network activations (or weights)
in a pretraining step, using random input noise.

The network is trained to achieve a target mean and standard deviation after each layer,
based on this normally distributed noise.
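As a minimal sketch of the idea, the weight-normalising variant can be done in closed form for ReLU layers: feed a batch of standard normal noise through the network and rescale each layer's weights so its activations hit the target standard deviation. This is an illustrative simplification, not the repo's actual method (which trains towards the targets, and also matches a target mean); the layer sizes, batch size, and targets below are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical layer sizes and batch size; the repo's real architecture is unknown
sizes = [64, 128, 128, 10]
weights = [rng.normal(size=(a, b)) for a, b in zip(sizes[:-1], sizes[1:])]

def relu(z):
    return np.maximum(z, 0.0)

# pre-training pass on random input noise: rescale each layer's weight
# matrix so that layer's activations reach the target standard deviation.
# ReLU is positively homogeneous, so one rescaling per layer is exact;
# matching a target mean as well (e.g. via biases) is omitted for brevity.
noise = rng.normal(size=(256, sizes[0]))
target_std = 1.0

x = noise
for i in range(len(weights)):
    h = relu(x @ weights[i])
    weights[i] *= target_std / h.std()  # normalise this layer's output scale
    x = relu(x @ weights[i])            # activations now have ~target std

# check: every layer's activation std on the noise batch is close to 1.0
x = noise
stds = []
for W in weights:
    x = relu(x @ W)
    stds.append(float(x.std()))
print([round(s, 3) for s in stds])
```

Normalising layer by layer in a single forward sweep works because each layer only sees the already-normalised output of the previous one; with non-homogeneous activations (e.g. tanh) the rescaling would instead need a few iterations per layer, or a gradient-based loss on the mean/std targets as in the experiments here.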

Results show that this pre-training step can improve training speed and
final performance with the same number of weights.