# Gradient Descent algorithm

This code implements Gradient Descent (GD) from scratch to minimize a loss function, the mean squared error (MSE), on a linear regression problem. Gradient Descent, however, can be employed in many other Machine Learning (ML) algorithms and scenarios.
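For reference, for a linear model $\hat{y} = Xw + b$ the MSE loss and the GD update rule (with learning rate $\eta$) take the standard form below; the notation is generic and does not necessarily match the variable names used in the code of this repository:

$$
\mathrm{MSE}(w, b) = \frac{1}{n} \sum_{i=1}^{n} \bigl(y_i - (x_i^\top w + b)\bigr)^2,
\qquad
w \leftarrow w - \eta \,\frac{\partial\,\mathrm{MSE}}{\partial w},
\qquad
b \leftarrow b - \eta \,\frac{\partial\,\mathrm{MSE}}{\partial b}
$$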

In essence, the optimizer is used during the ML training step to find, through an iterative optimization process, a set of weights and biases that yields a low loss, on average, across the entire training dataset.
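As a minimal, self-contained sketch of what such a from-scratch batch GD loop typically looks like (an illustrative reimplementation, not the repository's actual code; function and variable names are my own):

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, n_iters=1000):
    """Fit a linear model y_hat = X @ w + b by minimizing MSE with batch gradient descent."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(n_iters):
        y_hat = X @ w + b
        error = y_hat - y
        # Gradients of MSE = mean((y_hat - y)^2) with respect to w and b
        grad_w = (2.0 / n_samples) * (X.T @ error)
        grad_b = (2.0 / n_samples) * error.sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Quick check on synthetic univariate data (true slope 3, intercept 5):
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X[:, 0] + 5.0 + rng.normal(scale=0.5, size=200)
w, b = gradient_descent(X, y, lr=0.01, n_iters=5000)
```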

To demonstrate the usage of GD and its applicability to univariate and multivariate linear regression problems, we explore the Marketing and Sales dataset from Kaggle, which relates TV, radio, social media, and influencer advertising budgets to sales.

Dataset from Kaggle: https://www.kaggle.com/harrimansaragih/dummy-advertising-and-sales-data
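As a rough illustration of how the dataset could be fed to the optimizer sketched above, the snippet below assumes the CSV has been downloaded locally and that the numeric budget columns are named `TV`, `Radio`, and `Social Media` with a `Sales` target column; the file name and column names are assumptions about the Kaggle file, not guaranteed:

```python
import pandas as pd

# Assumed local path and column names for the Kaggle advertising-and-sales CSV.
df = pd.read_csv("dummy_advertising_and_sales_data.csv").dropna()

feature_cols = ["TV", "Radio", "Social Media"]   # assumed numeric budget columns
X = df[feature_cols].to_numpy(dtype=float)
y = df["Sales"].to_numpy(dtype=float)

# Standardize features so a single learning rate behaves well across all of them.
X = (X - X.mean(axis=0)) / X.std(axis=0)

w, b = gradient_descent(X, y, lr=0.01, n_iters=5000)  # reuses the sketch above
print("weights:", w, "bias:", b)
```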