Gradient Descent with normalization, suitable for large-scale linear regression.
This implementation of linear regression uses gradient descent to minimize the cost function. Optionally, polynomial features can be added together with a regularized cost function.
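The following is a minimal sketch of this approach, assuming NumPy arrays; the function names, parameters, and defaults are illustrative and are not taken from the repository's actual API.

```python
import numpy as np

def normalize(X):
    # Scale every feature column to zero mean and unit variance,
    # which keeps gradient descent stable when features differ in scale.
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma = np.where(sigma == 0, 1.0, sigma)  # guard against constant columns
    return (X - mu) / sigma, mu, sigma

def add_polynomial_features(X, degree=2):
    # Optional: append element-wise powers of each feature up to `degree`
    # (no cross terms), typically combined with the regularization below.
    return np.hstack([X ** d for d in range(1, degree + 1)])

def gradient_descent(X, y, alpha=0.01, lam=0.0, iterations=1000):
    # Batch gradient descent on the regularized squared-error cost:
    #   J(theta) = 1/(2m) * sum((X_b @ theta - y)**2) + lam/(2m) * sum(theta[1:]**2)
    m, n = X.shape
    X_b = np.hstack([np.ones((m, 1)), X])   # prepend a bias column
    theta = np.zeros(n + 1)
    for _ in range(iterations):
        error = X_b @ theta - y             # shape (m,)
        grad = (X_b.T @ error) / m          # gradient of the unregularized cost
        grad[1:] += (lam / m) * theta[1:]   # the bias term is not regularized
        theta -= alpha * grad
    return theta
```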
The datasets included in the examples are open datasets taken from Kaggle.
To use them in linear regression, some columns of the datasets are converted from strings to corresponding numeric values (see the sketch below).
The target is to predict the weight of a fish from its known body measurements and species. Published under the
GPL 2 license.
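As an illustration of how loading such a dataset, converting the species strings to numbers, and fitting the model could fit together, here is a hedged end-to-end sketch. The file name Fish.csv, the column names Species and Weight, and the use of pandas are assumptions rather than the repository's actual layout; it reuses the normalize and gradient_descent helpers sketched above.

```python
import pandas as pd

# Assumed file and column names for a Kaggle fish-measurement dataset.
df = pd.read_csv("Fish.csv")

# Convert the string-valued species column into numeric codes so it can be
# used as an ordinary feature in the regression (assumed preprocessing).
df["Species"] = df["Species"].astype("category").cat.codes

y = df["Weight"].to_numpy(dtype=float)
X = df.drop(columns=["Weight"]).to_numpy(dtype=float)

# Normalize, fit, and predict using the helpers from the sketch above.
X_norm, mu, sigma = normalize(X)
theta = gradient_descent(X_norm, y, alpha=0.1, lam=1.0, iterations=5000)

x_new = (X[0] - mu) / sigma                  # first fish, normalized the same way
predicted_weight = theta[0] + x_new @ theta[1:]
print(f"predicted weight: {predicted_weight:.1f}")
```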
MIT License
Copyright (c) 2020 Philipp Biedenkopf