Project author: oppenheimj

Project description:
Custom multilayer perceptron (MLP)
Language: Python
Repository: git://github.com/oppenheimj/neural-network.git
Created: 2018-02-11T00:42:37Z
Project page: https://github.com/oppenheimj/neural-network

neural-network

This was my final project for Introduction to Machine Learning at the University of Vermont. I wanted to prove to myself that I thoroughly understood the calculus and linear algebra underlying neural networks (and especially backpropagation), so I implemented this from scratch. This may be a good resource if you are trying to do the same.

The project is tailored to the MNIST dataset of handwritten digits. The logic is separated into three classes, defined in Data.py, NeuralNetwork.py, and Trainer.py. Main.py coordinates the interaction among these three classes.
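To make that structure concrete, here is a minimal sketch of what the wiring in Main.py might look like. The constructor arguments and method names are assumptions for illustration only; consult the source files for the actual API.

  # Hypothetical sketch of how Main.py might wire the three classes together.
  # Constructor and method names are assumptions, not the project's actual API.
  from Data import Data
  from NeuralNetwork import NeuralNetwork
  from Trainer import Trainer

  data = Data()                      # load, normalize, and one-hot encode MNIST
  network = NeuralNetwork(
      input_size=784,                # 28x28 pixel images, flattened
      hidden_size=128,
      output_size=10,                # one output unit per digit class
      hidden_layers=2)
  trainer = Trainer(network, data)
  trainer.train(epochs=12, batch_size=64, learning_rate=0.02)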

Sample output:

  INITIALIZING DATA
  Importing training data...
  Importing testing data...
  Normalizing data
  Successfully loaded normalizedData.npz
  Converting target data to one-hot encoding...
  DATA INITIALIZATION COMPLETE
  *******************************
  * INITIALIZING NEURAL NETWORK *
  *******************************
  Input layer size: 784
  Hidden layer size: 128
  Output layer size: 10
  Number of hidden layers: 2
  Total number of synapses: 101760
  ***************************
  * TRAINING NEURAL NETWORK *
  ***************************
  Epochs: 12
  Iterations per epoch: 937
  Batch size: 64
  Learning rate: 0.02
  epoch  training  testing
  0      15.23%    16.26%
  1      74.31%    74.51%
  2      89.61%    89.5%
  3      92.39%    92.1%
  4      94.06%    93.54%
  5      95.15%    94.51%
  6      95.83%    95.17%
  7      96.41%    95.41%
  8      96.85%    95.69%
  9      97.29%    95.92%
  10     97.56%    96.13%
  11     97.88%    96.25%
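
The heart of the project is backpropagation written by hand. As a point of reference, here is a minimal, self-contained NumPy sketch of the kind of forward and backward pass such a network performs, with the layer sizes and learning rate taken from the sample output above. This is not the repository's code; the activation functions, weight initialization, and variable names are assumptions.

  # Minimal from-scratch sketch of a forward/backward pass for an MLP like the
  # one above. NOT the repository's code; everything beyond the layer sizes and
  # learning rate shown in the sample output is an assumption.
  import numpy as np

  rng = np.random.default_rng(0)

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  def softmax(z):
      z = z - z.max(axis=1, keepdims=True)   # subtract max for numerical stability
      e = np.exp(z)
      return e / e.sum(axis=1, keepdims=True)

  # Layer sizes from the sample output: 784 -> 128 -> 128 -> 10
  sizes = [784, 128, 128, 10]
  weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
  biases = [np.zeros(n) for n in sizes[1:]]

  def train_batch(x, y_onehot, lr=0.02):
      """One mini-batch gradient step (x: (B, 784), y_onehot: (B, 10))."""
      # Forward pass, keeping every layer's activations for backpropagation.
      activations = [x]
      for i, (w, b) in enumerate(zip(weights, biases)):
          z = activations[-1] @ w + b
          a = softmax(z) if i == len(weights) - 1 else sigmoid(z)
          activations.append(a)

      # Backward pass. With softmax outputs and cross-entropy loss, the
      # output-layer error simplifies to (prediction - target).
      delta = (activations[-1] - y_onehot) / x.shape[0]
      for i in reversed(range(len(weights))):
          grad_w = activations[i].T @ delta
          grad_b = delta.sum(axis=0)
          if i > 0:
              # Propagate the error through the previous layer's sigmoid.
              delta = (delta @ weights[i].T) * activations[i] * (1 - activations[i])
          weights[i] -= lr * grad_w
          biases[i] -= lr * grad_b

      return activations[-1]

A training loop would repeatedly draw mini-batches of 64 images and call train_batch; 937 iterations per epoch at batch size 64 covers roughly the 60,000 MNIST training examples once per epoch, which matches the configuration printed above.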