Releases: jaywyawhare/C-ML
v0.0.2
Changelog
All notable changes to C-ML will be documented in this file.
The format is based on Keep a Changelog,
and this project adheres to Semantic Versioning.
[0.0.2] - Pre-Alpha - 2025-11-08
Added
- Complete Automatic Differentiation System: Implemented dynamic computation graphs that are built on the fly during the forward pass, with automatic gradient computation during the backward pass. Features include topological sorting for correct gradient flow, chain rule application, and gradient accumulation across multiple backward passes.
- Forward and Backward Operations: All tensor operations now support autograd with proper forward/backward implementations for binary operations (add, sub, mul, div, pow), unary operations (exp, log, sqrt, trigonometric functions), activation functions (ReLU, Sigmoid, LeakyReLU), and reduction operations (sum, mean).
- Loss Functions with Autograd: Implemented comprehensive loss functions including MSE, MAE, Binary Cross-Entropy, Cross-Entropy, Huber, and KL Divergence, all with full autograd support for seamless integration into training loops.
- Gradient Checkpointing: Added a memory-efficient backward pass implementation that recomputes activations instead of storing them, trading computation for memory. Configurable via the `autograd_set_checkpointing()` API.
- Real-Time Visualization UI: Built a comprehensive React-based dashboard with Server-Sent Events (SSE) for live updates during training. Features include:
- Real-time training metrics visualization (loss curves, accuracy, learning rate, gradient health)
- Interactive computational graph visualization using Cytoscape
- Model architecture view with interactive exploration
- Automatic launch capability via the `VIZ=1` environment variable
- FastAPI backend with SSE streaming for real-time data updates
- Documentation and Examples: Added a comprehensive user guide (`docs/AUTOGRAD.md`), detailed implementation notes (`docs/AUTOGRAD_IMPLEMENTATION.md`), example code (`examples/autograd_example.c`), and a complete test suite covering all operations.
Changed
- Updated build system to include autograd sources in Makefile and CMakeLists.txt
- Enhanced tensor operations to integrate with autograd system
Technical Details
This release adds a complete autograd system with real-time visualization to C-ML, enabling gradient-based training of neural networks with automatic differentiation and live monitoring capabilities.
Status: Pre-Alpha
Branch: autograd-integration
[0.0.1] - Initial Release
Added
- Basic tensor operations
- Neural network layers (Linear, Conv2d, BatchNorm2d, Pooling, Activations, Dropout)
- Optimizers (SGD, Adam)
- Basic training utilities
- Memory management system
v0.0.1
C-ML v0.0.1-pre Release
We're excited to announce the first pre-release of C-ML, version 0.0.1-pre! This release marks a significant milestone in the development of our lightweight machine learning library in C. While still in its early stages, this version provides a foundation for building and experimenting with neural network components.
What's Included
This pre-release includes the following core modules:
- Layers:
- Dense
- Dropout
- Activations:
- ReLU
- Sigmoid
- Tanh
- Softmax
- ELU
- Leaky ReLU
- Linear
- Loss Functions:
- Mean Squared Error
- Binary Cross-Entropy
- Focal Loss
- Mean Absolute Error
- Mean Absolute Percentage Error
- Root Mean Squared Error
- Reduce Mean
- Optimizers:
- SGD
- Adam
- RMSprop
- Preprocessing:
- Label Encoding
- One-Hot Encoding
- Standard Scaler
- Min-Max Scaler
- Regularizers:
- L1
- L2
- Combined L1-L2
Important Notes
- This is a pre-release, so expect potential bugs and incomplete features.
- Your feedback is highly appreciated to guide further development!
Changes Since Initial Commit
- Workflow Added: CI workflow added for automated testing and deployment.
- MkDocs Support: Integrated MkDocs for documentation generation and deployment.
- Documentation: Added comprehensive documentation for all modules.
- Testing: Added unit tests for all modules to ensure code correctness and robustness.
- Makefile: Streamlined the build process with automated source directory inclusion.
- Licensing: Added the "Don't Be a Jerk" Non-Commercial Care-Free License (DBaJ-NC-CFL).
- Example Usage: Provided a basic example in `main.c` to demonstrate library usage.
- Code Refactoring: Improved code structure and readability.
- Bug Fixes: Patched minor bugs and memory allocation issues.
- Spelling and Comment Updates: Fixed spelling mistakes and removed unnecessary comments.
- Logo Added: Added a logo to the project.
- Header File Migration: Migrated from a single header file to separate header files for better organization.