# CipherSingularity/NeuroMath

License: MIT


Artificial Neural Networks (ANNs) are powerful computational models inspired by the human brain, and their strength lies in the mathematics that governs how they learn and make predictions. At their core, ANNs use linear algebra to represent data as vectors and matrices and apply weighted transformations to propagate information through layers. Non-linear activation functions introduce complexity, enabling networks to learn relationships beyond simple linear patterns.

Learning in ANNs is driven by calculus, specifically gradient-based optimization. Using backpropagation, networks compute partial derivatives of the loss function with respect to each weight and adjust parameters to minimize error. Probability and statistics further support ANN behavior by defining loss functions, modeling uncertainty, and improving generalization. Together, these mathematical foundations make it possible to train stable, accurate, and scalable deep learning models.
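As a concrete illustration of these ideas, here is a minimal NumPy sketch of one forward and backward pass through a one-hidden-layer network. The toy data, layer sizes, and learning rate are illustrative assumptions, not code from any notebook in this repository:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, scalar regression targets (illustrative)
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Linear algebra: weights as matrices, data as row vectors
W1 = rng.normal(size=(3, 5)) * 0.1
b1 = np.zeros((1, 5))
W2 = rng.normal(size=(5, 1)) * 0.1
b2 = np.zeros((1, 1))

# Forward pass: affine transform -> non-linearity -> affine transform
z1 = X @ W1 + b1          # weighted transformation
a1 = np.tanh(z1)          # non-linear activation
y_hat = a1 @ W2 + b2      # output layer (linear, for regression)

loss = np.mean((y_hat - y) ** 2)  # mean squared error

# Backward pass: the chain rule gives dLoss/dW for every weight
dy  = 2 * (y_hat - y) / len(X)
dW2 = a1.T @ dy
db2 = dy.sum(axis=0, keepdims=True)
da1 = dy @ W2.T
dz1 = da1 * (1 - a1 ** 2)         # tanh'(z) = 1 - tanh(z)^2
dW1 = X.T @ dz1
db1 = dz1.sum(axis=0, keepdims=True)

# Gradient descent step: move each parameter against its gradient
lr = 0.1
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

Repeating the forward/backward/update cycle over many epochs is exactly the training loop that the projects below build up piece by piece.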


## 🛠️ Tech Stack & Tools

| Category | Tool | Purpose |
|----------|------|---------|
| Language | Python 3.9+ | Primary programming language |
| Core Library | NumPy | Numerical computing & matrix operations |
| Visualization | Matplotlib | Data visualization & plotting |
| ML Utilities | Scikit-Learn | Dataset loading & preprocessing |
| Notebook | Jupyter | Interactive development environment |
| Editor | VS Code | Code editor & IDE |
| Version Control | Git | Source code management |
| Repository | GitHub | Code hosting & collaboration |
| Framework | TensorFlow | Benchmarking & comparison only |

## Project Structure

| # | Project | Folder | Key Concepts |
|---|---------|--------|--------------|
| 01 | Perceptron Learning Rule | `01_perceptron_learning` | Linear separability, weight updates (see the sketch below) |
| 02 | XOR with MLP | `02_xor_mlp` | Non-linearity, backpropagation |
| 03 | MNIST Digit Recognition | `03_mnist_digit_recognition` | Multi-class classification, softmax |
| 04 | Neural Network Visualizer | `04_nn_visualizer` | Training dynamics, weight evolution |
| 05 | Custom Dataset ANN | `05_custom_dataset_ann` | Tabular data, label encoding |
| 06 | Loss Surface Visualization | `06_loss_landscape` | Loss contours, optimization geometry |
| 07 | Backpropagation Simulator | `07_backprop_simulator` | Chain rule, matrix calculus |
| 08 | Activation Function Analysis | `08_activation_function_analysis` | ReLU vs. Sigmoid vs. Tanh |
| 09 | Dropout Regularization | `09_dropout_regularization` | Overfitting prevention |
| 10 | Time Series Forecasting | `10_time_series_ann` | Sliding window, ANN regression |
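For a flavor of project 01, here is a minimal sketch of the perceptron learning rule in NumPy. The AND-gate data, learning rate, and epoch count are illustrative choices, not taken from the notebook itself:

```python
import numpy as np

# Illustrative linearly separable data: the AND gate
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)   # step activation
        error = target - pred        # 0 if correct, ±1 if wrong
        w += lr * error * xi         # perceptron update rule
        b += lr * error

print(w, b)  # learned weights and bias that separate the AND data
```

Because the AND data is linearly separable, the perceptron convergence theorem guarantees this loop stops making updates after finitely many mistakes; project 02 shows why the same rule fails on XOR.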

## Architecture Overview

```text
                           ┌────────────────────────┐
                           │ Mathematics of ANN     │
                           └─────────────┬──────────┘
                                         │
        ┌────────────────────────────────┼─────────────────────────────────┐
        │                                │                                 │
        ▼                                ▼                                 ▼
┌────────────────┐             ┌────────────────────┐             ┌────────────────────┐
│  Core Building │             │ Training &         │             │ Advanced Concepts  │
│  Blocks        │             │ Optimization       │             │ & Experiments      │
└───────┬────────┘             └─────────┬──────────┘             └─────────┬──────────┘
        │                                │                                  │
        ▼                                ▼                                  ▼
┌──────────────────┐          ┌──────────────────────┐          ┌──────────────────────────┐
│ • Perceptron     │          │ • Gradient Descent   │          │ • Backprop Simulator     │
│ • Activation     │          │ • Loss Functions     │          │ • Dropout Regularization │
│   Functions      │          │ • Loss Landscape Viz │          │ • Time Series Forecast   │
└──────────────────┘          └──────────────────────┘          │ • MNIST Recognition      │
                                                                └──────────────────────────┘
```

## Quick Start

### Prerequisites


**Recommended Knowledge:**

- Basic linear algebra and calculus
- Python programming (NumPy basics)
- Understanding of gradient descent (see the refresher sketch below)
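If gradient descent is unfamiliar, this short refresher shows the idea on a one-dimensional quadratic; the function and step size are illustrative assumptions:

```python
import numpy as np

# Minimize f(x) = (x - 3)^2 by repeatedly stepping against the gradient
f = lambda x: (x - 3) ** 2
grad = lambda x: 2 * (x - 3)   # df/dx

x = 0.0      # initial guess
lr = 0.1     # learning rate (step size)
for step in range(50):
    x -= lr * grad(x)          # gradient descent update

print(x)  # converges toward the minimum at x = 3
```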

### Installation

```bash
# Clone the repository
git clone https://github.com/arun-techverse/neural-networks-from-scratch-math-projects.git
cd neural-networks-from-scratch-math-projects

# Create virtual environment (recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Launch Jupyter Notebook
jupyter notebook
```

### Dependencies

```text
numpy>=1.21.0
matplotlib>=3.4.0
scikit-learn>=1.0.0
tensorflow>=2.8.0  # For comparison only
jupyter>=1.0.0
```

## Activation Functions

Activation functions introduce non-linearity into neural networks, enabling them to learn complex patterns. Without them, a stack of layers collapses into a single linear transformation, so the network behaves like linear regression regardless of depth.

### Common Activation Functions

| Function | Formula | Range | Use Case |
|----------|---------|-------|----------|
| Sigmoid | $f(x) = \frac{1}{1 + e^{-x}}$ | (0, 1) | Binary classification output |
| Tanh | $f(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}$ | (-1, 1) | Hidden layers (zero-centered) |
| ReLU | $f(x) = \max(0, x)$ | [0, ∞) | Hidden layers (most popular) |
| Softmax | $P(y=i) = \frac{e^{z_i}}{\sum_{j} e^{z_j}}$ | (0, 1) | Multi-class classification |
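All four are one-liners in NumPy. A minimal sketch follows; the max-subtraction in softmax is a standard numerical-stability trick, not something specific to this repository:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def softmax(z):
    # Subtract the max before exponentiating; the result is unchanged
    # because softmax is invariant to shifting all logits equally,
    # but large z values no longer overflow.
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([2.0, 1.0, 0.1])
print(softmax(z))  # probabilities summing to 1, ≈ [0.659 0.242 0.099]
```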
