A library that demonstrates training a neural network with the stochastic gradient descent method
The miniflow library is part of Udacity's Nanodegree Program and was prepared while pursuing it
miniflow.py
This file contains the classes and functions used to perform basic backpropagation in a neural network
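As a rough illustration of how such backpropagation code is commonly organized, the sketch below shows a generic graph node with a forward and a backward pass. The class name Node and its attributes are illustrative assumptions and may differ from the classes actually defined in miniflow.py.

```python
# A minimal sketch of a backpropagation-style graph node.
# The names used here are assumptions, not necessarily miniflow.py's API.

class Node:
    def __init__(self, inbound_nodes=None):
        # Nodes this node receives values from.
        self.inbound_nodes = inbound_nodes or []
        # Nodes that consume this node's output (filled in by consumers).
        self.outbound_nodes = []
        # Value computed during the forward pass.
        self.value = None
        # Gradients of the cost with respect to each inbound node,
        # computed during the backward pass.
        self.gradients = {}
        for n in self.inbound_nodes:
            n.outbound_nodes.append(self)

    def forward(self):
        # Subclasses compute self.value from the inbound nodes' values.
        raise NotImplementedError

    def backward(self):
        # Subclasses compute self.gradients from the outbound nodes' gradients.
        raise NotImplementedError
```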
nn.py
This file consists of a sample neural network
Include miniflow.py in the root of your project and use the following classes and functions (a usage sketch follows this list):
Input()
Use it to declare the input nodes of the neural network
Linear()
Use it to declare a node that performs a linear transform of the form Y = XW + b
Sigmoid()
Use it to declare a node that performs the sigmoid activation
MSE
Use this node to calculate the Mean Squared Error (MSE)
sgd_update
Use this function to apply a stochastic gradient descent update to the trainable parameters
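As a rough end-to-end sketch, the snippet below builds a tiny network from these nodes and trains it with SGD. It assumes the usual miniflow-style workflow; the constructor signatures shown, and the helpers topological_sort and forward_and_backward, are illustrative assumptions and may not match exactly what miniflow.py exposes.

```python
import numpy as np
# Input, Linear, Sigmoid, MSE and sgd_update are the names listed above;
# topological_sort and forward_and_backward are assumed helper names.
from miniflow import Input, Linear, Sigmoid, MSE, sgd_update
from miniflow import topological_sort, forward_and_backward

# Declare the graph: data, targets, weights and bias are all Input nodes.
X, y = Input(), Input()
W, b = Input(), Input()

l = Linear(X, W, b)   # Y = XW + b
s = Sigmoid(l)        # element-wise sigmoid activation
cost = MSE(y, s)      # mean squared error against the targets

# Toy data and initial parameter values.
X_val = np.random.randn(4, 3)
y_val = np.random.randn(4, 1)
W_val = np.random.randn(3, 1)
b_val = np.zeros(1)

feed_dict = {X: X_val, y: y_val, W: W_val, b: b_val}
graph = topological_sort(feed_dict)   # order the nodes for execution
trainables = [W, b]                   # parameters updated by SGD

for epoch in range(100):
    forward_and_backward(graph)       # forward pass, then backpropagation
    sgd_update(trainables, 1e-2)      # SGD step on the weights and bias

print("final cost:", cost.value)
```

Here feed_dict maps each Input node to its initial value, and only W and b are treated as trainable; the cost should decrease over the epochs as sgd_update adjusts them.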
miniflow currently consists of only Linear and Sigmoid activation nodes. Contributors are encouraged to add more depth to miniflow by including additional nodes and functions for basic neural network tasks
The MIT license is included in the repo