gradient-descent-visualizer

Gradient descent is a core optimization algorithm in machine learning. It iteratively adjusts a model's parameters (weights and biases) to minimize a cost function, in effect finding the lowest point of a valley in the error landscape. At each step it moves the parameters a small amount in the direction opposite the function's gradient (the direction of steepest descent), repeating until the model's predictions are as accurate as possible.
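
The loop above can be sketched in a few lines. This is an illustrative example (not code from this repo), using a simple bowl-shaped loss whose gradient is known analytically:

```python
# Minimal sketch of vanilla gradient descent on f(w1, w2) = w1^2 + 2*w2^2.
def loss(w1, w2):
    return w1 ** 2 + 2 * w2 ** 2

def grad(w1, w2):
    # Analytic partial derivatives of the loss above.
    return 2 * w1, 4 * w2

w1, w2 = 3.0, 2.0   # starting point on the landscape
lr = 0.1            # learning rate (step size)

for _ in range(100):
    g1, g2 = grad(w1, w2)
    w1 -= lr * g1   # step opposite the gradient direction
    w2 -= lr * g2

print(w1, w2, loss(w1, w2))  # the parameters approach the minimum at (0, 0)
```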

This is a small tool that helps build an intuitive understanding of gradient descent by visualizing the process over a three-dimensional loss landscape.

Preview

Supported Optimizers

  • Vanilla Gradient Descent
  • Stochastic Gradient Descent (SGD)
  • SGD with Momentum (Classical)
  • SGD with Momentum (Nesterov)
  • Adagrad
  • RMSProp
  • Adam
