v0.2.0 (2025-12-29)
Chores

- Add comprehensive benchmark suite for Tether, snnTorch, and SpikingJelly frameworks (f401fda)
- Add comprehensive benchmark suite for Tether, snnTorch, and SpikingJelly frameworks (70eb398)
- Add SpikingCIFARModel and TetherLM for CIFAR-10 training, include dataset download functionality, and update dependencies in pyproject.toml (8d3144c)
- Refactor PLIF and LIF modules for improved readability and performance (26846fc)
  - Enhanced the PLIF and LIF classes by restructuring the initialization parameters for better clarity.
  - Updated the forward methods in both classes to improve code readability and maintainability.
  - Added support for vectorized decay and threshold parameters in PLIF.
  - Improved the handling of surrogate gradients in both LIF and PLIF.
  - Refactored the attention mechanism in SpikingSelfAttention to streamline operations.
  - Updated the Monitor utility to enhance voltage trace monitoring capabilities.
  - Added comprehensive tests for new features and ensured backward compatibility.
  - Cleaned up code formatting across multiple files for consistency.
- Remove obsolete documentation files for tether modules (ae7b023)
- Simplify spike handling in SNNTorch models and reset hidden states in MNIST model (70f4a42)
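The refactored LIF/PLIF entries above mention vectorized decay and threshold parameters plus surrogate gradients. As a minimal sketch of what those mean in practice (class and parameter names here are hypothetical, not Tether's actual API), a leaky integrate-and-fire layer can carry one learnable decay and threshold per neuron and use a fast-sigmoid surrogate so the spike nonlinearity remains trainable:

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike forward, fast-sigmoid surrogate gradient backward."""

    @staticmethod
    def forward(ctx, v_minus_thresh, slope=25.0):
        ctx.save_for_backward(v_minus_thresh)
        ctx.slope = slope
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Surrogate derivative: 1 / (slope * |x| + 1)^2, smooth stand-in
        # for the Heaviside step's zero-almost-everywhere gradient.
        sg = 1.0 / (ctx.slope * x.abs() + 1.0) ** 2
        return grad_output * sg, None

class LIF(torch.nn.Module):
    """LIF layer with per-neuron (vectorized) decay and threshold.
    Illustrative sketch only; names do not come from the Tether codebase."""

    def __init__(self, num_neurons, beta=0.9, threshold=1.0):
        super().__init__()
        # Storing beta/threshold as length-num_neurons vectors lets each
        # neuron learn its own dynamics instead of sharing two scalars.
        self.beta = torch.nn.Parameter(torch.full((num_neurons,), float(beta)))
        self.threshold = torch.nn.Parameter(torch.full((num_neurons,), float(threshold)))

    def forward(self, x, v):
        v = self.beta * v + x                  # leaky integration
        spk = SurrogateSpike.apply(v - self.threshold)
        v = v - spk * self.threshold           # soft reset by subtraction
        return spk, v
```

The surrogate function is what makes backpropagation through time possible here: the forward pass emits hard 0/1 spikes, while the backward pass substitutes a smooth pseudo-derivative.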
Features
- Implement Triton kernels for causal linear attention and rate encoding, optimize LinearLIF layer (669912f)
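The feature above fuses rate encoding into a Triton kernel; the underlying operation is simple enough to state as a pure-PyTorch reference (this sketch is an assumption about the encoding scheme, not the kernel's actual code): each input intensity in [0, 1] becomes the per-timestep probability of an independent Bernoulli spike.

```python
import torch

def rate_encode(x, num_steps, generator=None):
    """Bernoulli rate encoding: intensity -> spike probability per timestep.

    Pure-PyTorch reference implementation (illustrative; the changelog's
    Triton kernel presumably performs this sampling fused on-GPU).
    Returns a tensor of shape (num_steps, *x.shape) of 0/1 spikes.
    """
    x = x.clamp(0.0, 1.0)
    # Broadcast the probabilities across the time dimension, then draw an
    # independent Bernoulli sample per element and timestep.
    probs = x.unsqueeze(0).expand(num_steps, *x.shape)
    return torch.bernoulli(probs, generator=generator)
```

Averaged over many timesteps, the spike train's firing rate converges to the input intensity, which is what lets downstream spiking layers recover the analog signal.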
Detailed Changes: v0.1.1...v0.2.0