Central Thesis: Structure is Learning. Artificial Neural Networks typically start dense and learn weights. Biological brains start sparse and learn structure (topology). This project demonstrates that applying biological structural priors (derived from neuroplasticity simulations) to AI models leads to efficient, self-organizing sparsity without sacrificing accuracy.
This project bridges Computational Neuroscience and Deep Learning. It consists of three components:
- Biological Simulation: A ground-up simulation of Izhikevich neurons, Spike-Timing-Dependent Plasticity (STDP), and Homeostatic Scaling to model how "engrams" (memories) form in biological tissue.
- Meta-Learning Optimization: A framework to scientifically derive the "Optimal Neuroplasticity Constant" (connectivity sigma, learning rates) from the biological simulation.
- Dual-Stack AI Implementation: A validation suite that enforces these derived biological rules in standard Deep Learning models using both PyTorch and TensorFlow.
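The biological simulation rests on two update rules: Izhikevich spiking dynamics for individual neurons, and pair-based STDP for synapses. A minimal numpy sketch follows; the parameter values (`A_plus`, `A_minus`, `tau`, the regular-spiking constants) are textbook defaults chosen for illustration, not the project's tuned constants:

```python
import numpy as np

def izhikevich_step(v, u, I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=1.0):
    """One Euler step of the Izhikevich model (regular-spiking defaults)."""
    v = v + dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u = u + dt * a * (b * v - u)
    if v >= 30.0:            # spike: reset membrane, bump recovery variable
        return c, u + d, True
    return v, u, False

def stdp_update(w, dt_spike, A_plus=0.01, A_minus=0.012, tau=20.0):
    """Pair-based STDP: potentiate when pre fires before post (dt_spike > 0),
    depress otherwise; weight is clipped to [0, 1]."""
    if dt_spike > 0:
        w = w + A_plus * np.exp(-dt_spike / tau)
    else:
        w = w - A_minus * np.exp(dt_spike / tau)
    return float(np.clip(w, 0.0, 1.0))

# Drive one neuron with constant current and record its spike times.
v, u = -65.0, -65.0 * 0.2
spike_times = []
for t in range(200):
    v, u, fired = izhikevich_step(v, u, I=10.0)
    if fired:
        spike_times.append(t)
print(len(spike_times))  # the neuron fires tonically under sustained input
```

In the full simulation these two rules interact: spike timing produced by the neuron model drives the STDP weight changes, and homeostatic scaling keeps total input in a stable range.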
- Derivation: Successfully derived optimal `sigma` and `A_plus` parameters that maximize modularity and small-worldness in biological networks.
- Validation: Implemented these rules in a custom `PlasticDense` layer for TensorFlow and a `StructuralPlasticityOptimizer` for PyTorch.
- Performance: The bio-regularized AI models achieved ~52% sparsity while maintaining ~98% accuracy on MNIST, showing that the network can self-organize into an efficient topology.
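To make the sparsity claim concrete: one simple way a network reaches a target sparsity is by zeroing its smallest-magnitude weights, a crude stand-in for activity-dependent synapse removal. This sketch is an illustration of that idea only, not the actual `StructuralPlasticityOptimizer` or `PlasticDense` logic:

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_small_weights(W, target_sparsity=0.5):
    """Zero out the smallest-magnitude entries of W until the requested
    fraction of weights is exactly zero; returns (pruned W, keep mask)."""
    k = int(W.size * target_sparsity)
    threshold = np.sort(np.abs(W), axis=None)[k]
    mask = np.abs(W) >= threshold
    return W * mask, mask

W = rng.normal(size=(64, 64))
W_sparse, mask = prune_small_weights(W, target_sparsity=0.52)
sparsity = 1.0 - mask.mean()
print(round(sparsity, 2))  # → 0.52
```

In the project itself the mask is not imposed once at this fixed level; the bio-derived regularizer lets the network settle into roughly this sparsity on its own during training.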
- `simulation/`: Pure Python implementation of the biological brain model (Neurons, STDP, Network).
- `analysis/`: Tools for Topological Data Analysis (Betti numbers, Persistence) to measure network structure.
- `metalearning/`:
  - `layers_tf.py`: TensorFlow custom layer (`PlasticDense`) implementing self-contained structural plasticity.
  - `model.py` / `regularizer.py`: PyTorch implementation.
- `train_benchmark_tf.py`: TensorFlow benchmark script.
- `train_benchmark.py`: PyTorch benchmark script.
- `experiment_optimization.py`: The meta-learning loop to find optimal biological constants.
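The meta-learning loop in `experiment_optimization.py` sweeps plasticity constants and scores the resulting network topology. A toy version of such a sweep is sketched below; the ring-lattice wiring, the candidate `sigma` values, and the use of the clustering coefficient as a small-worldness proxy are all simplifying assumptions, not the project's actual objective:

```python
import numpy as np

rng = np.random.default_rng(1)

def gaussian_connectivity(n, sigma):
    """Distance-dependent wiring: neurons on a ring connect with
    probability exp(-d^2 / (2 sigma^2)) of their ring distance d."""
    idx = np.arange(n)
    d = np.abs(idx[:, None] - idx[None, :])
    d = np.minimum(d, n - d)                  # wrap-around ring distance
    p = np.exp(-d**2 / (2 * sigma**2))
    A = (rng.random((n, n)) < p).astype(float)
    np.fill_diagonal(A, 0)
    return np.maximum(A, A.T)                 # make the graph undirected

def clustering(A):
    """Mean clustering coefficient, used here as the topology score."""
    deg = A.sum(1)
    triangles = np.diag(A @ A @ A) / 2        # closed 3-cycles per node
    possible = deg * (deg - 1) / 2
    ok = possible > 0
    return float(np.mean(triangles[ok] / possible[ok]))

# Sweep sigma and keep the value with the highest topology score.
scores = {s: clustering(gaussian_connectivity(100, s)) for s in (1.0, 2.0, 4.0, 8.0)}
best_sigma = max(scores, key=scores.get)
print(best_sigma, round(scores[best_sigma], 3))
```

The real loop presumably optimizes over both `sigma` and `A_plus` and scores modularity and small-worldness together, but the structure — generate a network per candidate, measure its topology, keep the argmax — is the same.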
Install dependencies:

```bash
pip install -r requirements.txt
```

Requires Python 3.8+. Dependencies include `tensorflow`, `torch`, `numpy`, `matplotlib`, `brian2` (optional), and `ripser` (for topology).
Explore how plasticity parameters affect network topology:

```bash
python experiment_optimization.py
```

Train a standard MLP vs. a Bio-Regularized MLP to see structural learning in action:
```bash
python train_benchmark_tf.py
```

Check `viz/benchmark_result_tf.png` for results.
```bash
python train_benchmark.py
```

If you use this code for research in Neuro-AI or Structural Plasticity, please cite:
Suryadevara, R. (2025). Learning Compositional Generalization from Biological Plasticity Dynamics.