Project author: haidousm

Project description: an artificial neural network framework built from scratch using just Python and NumPy

Language: Python
Repository: git://github.com/haidousm/fine.git
Created: 2021-01-07T14:40:26Z
Project community: https://github.com/haidousm/fine

Fine

A Keras-like neural network framework built purely with Python and NumPy that's just that: fine.

Table of Contents

1- How to use
2- Demo
3- Technical Specifications

How to use

```shell
git clone git@github.com:haidousm/fine.git
cd fine
python3 -m pip install -r requirements.txt
```

Demo

The demo uses a JavaScript frontend and a Flask server that serves predictions from the model.

Demo model creation & training:

```python
from datasets import load_mnist
from models import Sequential
from layers import Conv2D
from layers import MaxPool2D
from layers import Flatten
from layers import Dense
from activations import ReLU
from activations import Softmax
from loss import CategoricalCrossEntropy
from models.model_utils import Categorical
from optimizers import Adam

X_train, y_train, X_test, y_test = load_mnist()

model = Sequential(
    layers=[
        Conv2D(16, (1, 3, 3)),
        ReLU(),
        Conv2D(16, (16, 3, 3)),
        ReLU(),
        MaxPool2D((2, 2)),
        Conv2D(32, (16, 3, 3)),
        ReLU(),
        Conv2D(32, (32, 3, 3)),
        ReLU(),
        MaxPool2D((2, 2)),
        Flatten(),
        Dense(1568, 64),
        ReLU(),
        Dense(64, 64),
        ReLU(),
        Dense(64, 10),
        Softmax()
    ],
    loss=CategoricalCrossEntropy(),
    optimizer=Adam(decay=1e-3),
    accuracy=Categorical()
)

model.train(X_train, y_train, epochs=5, batch_size=120, print_every=100)
model.evaluate(X_test, y_test, batch_size=120)
```
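As a sanity check on the architecture: assuming the `Conv2D` layers preserve spatial size (same-style padding), the two `MaxPool2D((2, 2))` layers reduce the 28×28 MNIST images to 7×7, and the last conv stage has 32 channels, so `Flatten()` produces 32 · 7 · 7 = 1568 features, which is exactly the input size of `Dense(1568, 64)`:

```python
# Trace feature-map shapes through the demo network, assuming the
# Conv2D layers keep spatial dimensions unchanged (same-style padding).
h = w = 28              # MNIST input: 1 x 28 x 28
h, w = h // 2, w // 2   # first MaxPool2D((2, 2))  -> 14 x 14
h, w = h // 2, w // 2   # second MaxPool2D((2, 2)) -> 7 x 7
channels = 32           # channels after the final Conv2D
flattened = channels * h * w
print(flattened)        # 1568, matching Dense(1568, 64)
```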

Technical Specifications

Layers

- [x] Dense Layer
- [x] Dropout Layer
- [x] Flatten Layer
- [x] 2D Convolutional Layer
- [x] Max Pool Layer
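To illustrate what a dense layer does under the hood, here is a minimal NumPy sketch of its forward and backward pass. The class and attribute names are hypothetical, not Fine's actual internals:

```python
import numpy as np

class DenseSketch:
    """Illustrative fully-connected layer: y = x @ W + b."""

    def __init__(self, n_inputs, n_neurons):
        # Small random weights, zero biases
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        self.inputs = inputs
        return inputs @ self.weights + self.biases

    def backward(self, dvalues):
        # Gradients w.r.t. parameters, then w.r.t. the layer input
        self.dweights = self.inputs.T @ dvalues
        self.dbiases = dvalues.sum(axis=0, keepdims=True)
        return dvalues @ self.weights.T

layer = DenseSketch(4, 3)
out = layer.forward(np.ones((2, 4)))
print(out.shape)  # (2, 3): batch of 2, 3 neurons
```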

Activation Functions

- [x] Rectified Linear (ReLU)
- [x] Sigmoid
- [x] Softmax
- [x] Linear
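Two of the listed activations can be written in a few lines of NumPy. This is an illustrative sketch, not Fine's actual source; note the max-subtraction trick in softmax for numerical stability:

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x)
    return np.maximum(0, x)

def softmax(x):
    # Subtract each row's max before exponentiating so np.exp
    # never overflows; the result is unchanged mathematically.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

print(relu(np.array([-1.0, 3.0])))               # [0. 3.]
print(softmax(np.array([[2.0, 1.0, -1.0]])))     # rows sum to 1
```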

Loss Functions

- [x] Categorical Cross Entropy
- [x] Binary Cross Entropy
- [x] Mean Squared Error
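Categorical cross entropy, the loss the demo above trains with, reduces to the negative log probability assigned to the correct class, averaged over the batch. A minimal NumPy sketch with integer class labels (illustrative, not Fine's implementation):

```python
import numpy as np

def categorical_cross_entropy(y_pred, y_true):
    # Clip so log(0) can never occur
    p = np.clip(y_pred, 1e-7, 1 - 1e-7)
    # Pick each sample's predicted probability for its true class
    correct = p[np.arange(len(y_true)), y_true]
    return -np.log(correct).mean()

y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]])
y_true = np.array([0, 1])
print(categorical_cross_entropy(y_pred, y_true))  # ~0.29
```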

Optimizers

- [x] Stochastic Gradient Descent (SGD) with learning-rate decay and momentum
- [x] Adaptive Moment Estimation (Adam)
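The SGD variant listed above combines two ideas: the learning rate shrinks over time, and each update blends in a fraction of the previous one (momentum). A sketch of one parameter update under those assumptions; the function and its signature are illustrative, not Fine's API:

```python
import numpy as np

def sgd_momentum_step(param, grad, velocity,
                      lr=1.0, decay=1e-3, momentum=0.9, iteration=0):
    # Learning-rate decay: the effective rate shrinks as training runs
    current_lr = lr / (1 + decay * iteration)
    # Momentum: keep a fraction of the previous velocity
    velocity = momentum * velocity - current_lr * grad
    return param + velocity, velocity

w = np.array([1.0, -2.0])
v = np.zeros_like(w)
w, v = sgd_momentum_step(w, np.array([0.1, -0.1]), v)
print(w)  # weights moved opposite the gradient: [0.9, -1.9]
```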