Project author: huseinzol05

Project description: implement Artificial Neural Network in different languages
Primary language: PHP
Repository: git://github.com/huseinzol05/Neural-Network-Multilanguages.git
Created: 2018-02-26T06:39:33Z
Project page: https://github.com/huseinzol05/Neural-Network-Multilanguages

License: MIT License


Neural-Network-Multilanguages

Implement gradient-descent feed-forward and recurrent neural networks in different languages, using only a vector / linear algebra library.

Artificial neural networks are relatively easy if you really understand them!

Support

Ruby

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Python

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Javascript

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Go

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

C++

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Julia

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

PHP

  • feed-forward iris
  • recurrent generator
  • recurrent forecasting

Instructions

  1. Go to any language folder.
  2. Run install.sh.
  3. Run the program.

Neural Network Architectures

  1. Feed-forward Neural Network to predict the Iris dataset (a minimal sketch follows this list).

    • 3 layers, including the input and output layers
    • first 2 layers squashed with the sigmoid function
    • last layer squashed with the softmax function
    • loss function is cross-entropy
  2. Vanilla Recurrent Neural Network to generate text.

    • 1 hidden layer
    • tanh as activation function
    • softmax and cross entropy combination for derivative
    • sequence length = 15
  3. Vanilla Recurrent Neural Network to predict the TESLA stock market.

    • 1 hidden layer
    • tanh as activation function
    • mean square error for derivative
    • sequence length = 5
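
The training loop below is a minimal sketch of the feed-forward architecture in item 1 above, not the repository's actual code: it assumes NumPy as the only vector library, the hidden sizes 16 and 8 are placeholders (the real layer sizes may differ), and random data stands in for the Iris dataset.

```python
# Minimal sketch of item 1 (assumptions noted above, not the repository's code).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y):
    # y holds integer class labels; return the average negative log-likelihood
    return -np.mean(np.log(probs[np.arange(len(y)), y] + 1e-12))

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 4))             # placeholder for the 4 Iris features
y = rng.integers(0, 3, size=150)          # placeholder for the 3 Iris classes
W1 = rng.normal(scale=0.1, size=(4, 16))  # assumed layer sizes
W2 = rng.normal(scale=0.1, size=(16, 8))
W3 = rng.normal(scale=0.1, size=(8, 3))
lr = 0.1

for epoch in range(100):
    # forward: first 2 layers squashed with sigmoid, last layer with softmax
    h1 = sigmoid(X @ W1)
    h2 = sigmoid(h1 @ W2)
    probs = softmax(h2 @ W3)
    loss = cross_entropy(probs, y)

    # backward: softmax + cross-entropy together give (probs - one_hot)
    d3 = probs.copy()
    d3[np.arange(len(y)), y] -= 1.0
    d3 /= len(y)
    dW3 = h2.T @ d3
    d2 = d3 @ W3.T * h2 * (1.0 - h2)      # sigmoid derivative
    dW2 = h1.T @ d2
    d1 = d2 @ W2.T * h1 * (1.0 - h1)
    dW1 = X.T @ d1

    # plain gradient descent update
    W1 -= lr * dW1
    W2 -= lr * dW2
    W3 -= lr * dW3
```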

All implementations such as max(), mean(), softmax(), cross_entropy(), and sigmoid() are hand-coded; no other libraries are used.
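
For illustration, here is a rough pure-Python sketch of what "hand-coded" means for a few of those helpers; the function names follow the sentence above, but the exact signatures in each language folder may differ.

```python
import math

def sigmoid(x):
    # element-wise logistic function over a plain Python list
    return [1.0 / (1.0 + math.exp(-v)) for v in x]

def softmax(x):
    # subtract the maximum before exponentiating for numerical stability
    m = max(x)
    e = [math.exp(v - m) for v in x]
    s = sum(e)
    return [v / s for v in e]

def mean(x):
    return sum(x) / len(x)

def cross_entropy(probs, label):
    # negative log-likelihood of the true class, with a small epsilon
    return -math.log(probs[label] + 1e-12)
```

For example, softmax([1.0, 2.0, 3.0]) returns probabilities that sum to 1, and cross_entropy(softmax([1.0, 2.0, 3.0]), 2) is the loss when class 2 is the true label.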

Status

Will be updated over time.

Warning

You may not see high accuracy in languages that do not natively use float64: during backpropagation the weight updates are very small, and float32 rounds them away.
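
As a quick illustration of that precision issue (a NumPy sketch, not from the repository): an update of around 1e-9 survives in float64 but disappears entirely in float32.

```python
import numpy as np

w64 = np.float64(0.5)
w32 = np.float32(0.5)
update = 1e-9                             # a typical tiny backprop update

print(w64 - update == w64)                # False: float64 keeps the change
print(w32 - np.float32(update) == w32)    # True: float32 rounds it away
```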

Authors