Gradient Maker

Gradient Maker is a lightweight implementation demonstrating the core principles of gradient descent. This repository is designed to help users understand how gradient descent works under the hood without relying on heavy frameworks.


Overview

Gradient descent is the backbone of machine learning training. This project demonstrates the concepts behind it with a simple Python implementation that builds a computational graph, storing and propagating gradients through the graph's nodes.
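To make the idea concrete, here is a minimal sketch of a graph-based gradient engine. This is illustrative only: the class and method names below are assumptions, and the actual implementation in gradient_maker.py may differ. Each value records its inputs and a local-gradient rule, forming a computational graph; calling backward() walks the graph in reverse and accumulates gradients via the chain rule.

```python
class Value:
    """A node in a computational graph (illustrative sketch)."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = 1 and d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # product rule: each input's gradient is scaled by the other input
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order so each node's rule runs after every node that uses it
        order, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x = Value(3.0)
y = Value(2.0)
z = x * y + x          # z = x*y + x, so dz/dx = y + 1 = 3 and dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 3.0 3.0
```

The key design point is that each operation stores its own backward rule as a closure, so adding a new operation never requires touching the generic backward() traversal.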


Features

  • Minimal Dependencies: Written in pure Python with no heavy frameworks; only basic Python and high-school-level calculus are needed to follow along.

Repository Structure

  • nn.py: Contains an example of a Multi-Layer Perceptron (MLP) that trains on a simple dataset.
  • gradient_maker.py: Defines classes and functions to implement gradient descent using a computational graph.
  • README.md: Provides an overview of the project, its purpose, and usage instructions.
  • .gitignore: Lists files and directories to be ignored by Git.

How to Use

Running the MLP Example

  1. Clone the Repository:
    git clone https://github.com/KrishnaAgarwal1308/gradinet_maker.git
    cd gradinet_maker
  2. Run the Example:
    python nn.py

Customizing Gradient Descent

Feel free to modify the code in gradient_maker.py to experiment with different functions, gradient computations, or more complex models.
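As a starting point for experimenting, here is a hedged sketch of plain gradient descent on a custom scalar function. It is independent of gradient_maker.py, and the function and names are illustrative: it minimizes f(w) = (w - 4)^2 using the hand-computed derivative f'(w) = 2(w - 4).

```python
def f_prime(w):
    # derivative of f(w) = (w - 4)**2
    return 2.0 * (w - 4.0)

w = 0.0    # initial guess
lr = 0.1   # learning rate
for _ in range(100):
    w -= lr * f_prime(w)  # step against the gradient

print(w)  # converges toward the minimum at w = 4
```

Swapping in a different f_prime (or a gradient computed by the computational graph) is all it takes to point the same loop at a different objective.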

Prerequisites

Python 3.x: Ensure you have Python installed on your machine.

Basic Python Knowledge: Understanding object-oriented programming in Python will help.

High School Level Calculus: Familiarity with basic calculus concepts such as derivatives is recommended.
