Gradient Maker is a lightweight implementation demonstrating the core principles of gradient descent. This repository is designed to help users understand how gradient descent works under the hood without relying on heavy frameworks.
Gradient descent is the backbone of machine learning training. This project demonstrates the concepts behind gradient descent with a simple Python implementation that uses a computational graph to store and propagate gradients through its nodes.
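To make the graph-based idea concrete, here is a minimal sketch of how a scalar node might store its gradient and propagate it backward through the graph with the chain rule. The `Value` class and its method names are illustrative assumptions, not the repo's actual API:

```python
class Value:
    """A scalar node in a computational graph that stores its gradient.

    Illustrative sketch only; gradient_maker.py may structure this differently.
    """

    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # upstream nodes in the graph
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically order the graph so each node's gradient is fully
        # accumulated before it is propagated to its parents.
        order, seen = [], set()

        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)

        visit(self)
        self.grad = 1.0  # seed: d(output)/d(output) = 1
        for node in reversed(order):
            for parent, local in zip(node._parents, node._local_grads):
                parent.grad += local * node.grad  # chain rule

x = Value(3.0)
y = Value(4.0)
z = x * y + x          # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 5.0 4.0
```

Each operation records its inputs and local derivatives; `backward()` then walks the graph in reverse, accumulating gradients exactly as described above.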
- Minimal Dependencies: Requires only basic Python knowledge and high school level calculus.
- `nn.py`: Contains an example of a Multi-Layer Perceptron (MLP) that trains on a simple dataset.
- `gradient_maker.py`: Defines classes and functions to implement gradient descent using a computational graph.
- `README.md`: Provides an overview of the project, its purpose, and usage instructions.
- `.gitignore`: Lists files and directories to be ignored by Git.
- Clone the repository and run the example:

```bash
git clone https://github.com/KrishnaAgarwal1308/gradinet_maker.git
cd gradinet_maker
python nn.py
```
Feel free to modify the code in gradient_maker.py to experiment with different functions and gradient computations, or to integrate more complex models.
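As a starting point for experimentation, a plain gradient descent loop is easy to write and compare against the graph-based version. The function, learning rate, and step count below are illustrative choices, not values taken from this repo:

```python
def descend(grad_fn, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient: x <- x - lr * f'(x)."""
    x = x0
    for _ in range(steps):
        x -= lr * grad_fn(x)
    return x

# Minimize f(x) = (x - 3)^2, whose derivative is 2 * (x - 3).
minimum = descend(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(round(minimum, 4))  # converges toward 3.0
```

Swapping in a gradient computed by the computational graph (instead of the hand-written derivative) is a good first experiment.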
- Python 3.x: Ensure you have Python installed on your machine.
- Basic Python Knowledge: Understanding object-oriented programming in Python will help.
- High School Level Calculus: Familiarity with basic calculus concepts such as derivatives is recommended.