Commit 843d937

Author: Alexander Ororbia (committed)
Message: minor revision to double-exp syn pointing, mods to modeling docs
1 parent ff2a25a

6 files changed: +72 −21 lines changed

docs/modeling/neurons.md

Lines changed: 49 additions & 1 deletion
@@ -86,6 +86,22 @@ and `dmu` is the first derivative with respect to the mean parameter.
     :noindex:
 ```
 
+#### Bernoulli Error Cell
+
+This cell is (currently) fixed to be a (factorized) multivariate Bernoulli cell.
+Concretely, this cell implements compartments/mechanics to facilitate Bernoulli
+log likelihood error calculations.
+
+```{eval-rst}
+.. autoclass:: ngclearn.components.BernoulliErrorCell
+  :noindex:
+
+  .. automethod:: advance_state
+    :noindex:
+  .. automethod:: reset
+    :noindex:
+```
+
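To ground what "Bernoulli log likelihood error calculations" involve, here is a minimal NumPy sketch; the function names and the clipping epsilon are illustrative assumptions, not the internals of `BernoulliErrorCell`:

```python
import numpy as np

def bernoulli_log_likelihood(x, p, eps=1e-7):
    """Elementwise Bernoulli log-likelihood of binary targets x under means p."""
    p = np.clip(p, eps, 1.0 - eps)  # guard against log(0)
    return x * np.log(p) + (1.0 - x) * np.log(1.0 - p)

def bernoulli_error(x, p, eps=1e-7):
    """Derivative of the log-likelihood w.r.t. the mean parameter p."""
    p = np.clip(p, eps, 1.0 - eps)
    return x / p - (1.0 - x) / (1.0 - p)
```

The error signal is positive where the prediction undershoots a target of 1 and negative where it overshoots a target of 0, which is the directional pressure an error cell is meant to convey.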
 ## Spiking Neurons
 
 These neuronal cells exhibit dynamics that involve emission of discrete action
@@ -117,10 +133,42 @@ negative pressure on the membrane potential values at `t`).
     :noindex:
 ```
 
+### The IF (Integrate-and-Fire) Cell
+
+This cell (the simple "integrator") models dynamics over the voltage `v`. Note that `thr` is used as the membrane potential threshold and no adaptive threshold mechanics are implemented for this cell model.
+(This cell is primarily a faster, convenience formulation that omits the leak element of the LIF.)
+
+```{eval-rst}
+.. autoclass:: ngclearn.components.IFCell
+  :noindex:
+
+  .. automethod:: advance_state
+    :noindex:
+  .. automethod:: reset
+    :noindex:
+```
+
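As a rough sketch of the leak-free integrate-and-fire dynamics described above (parameter names such as `tau_m` and `v_reset` are illustrative assumptions, not the `IFCell` API):

```python
import numpy as np

def if_step(v, j, dt=1.0, tau_m=10.0, thr=1.0, v_reset=0.0):
    """One Euler step of leak-free integrate-and-fire over a vector of units."""
    v = v + (dt / tau_m) * j          # pure integration: no leak term
    s = (v > thr).astype(float)       # emit a spike where threshold is crossed
    v = v * (1.0 - s) + v_reset * s   # hard reset for units that spiked
    return v, s
```

With a constant drive, the voltage climbs linearly until it crosses `thr`, spikes, and resets, which is the behavior the leaky variant below modifies with a decay term.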
+### The Winner-Take-All (WTAS) Cell
+
+This cell models dynamics over the voltage `v` as a simple instantaneous
+softmax function of the electrical current input, where only the single
+neuronal unit that wins the competition across the group of units within
+this component emits a pulse/spike.
+
+```{eval-rst}
+.. autoclass:: ngclearn.components.WTASCell
+  :noindex:
+
+  .. automethod:: advance_state
+    :noindex:
+  .. automethod:: reset
+    :noindex:
+```
+
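A minimal sketch of instantaneous softmax winner-take-all dynamics follows; this is a hypothetical illustration, not the `WTASCell` implementation:

```python
import numpy as np

def wtas_step(j):
    """Instantaneous softmax 'voltage' over input currents; the winner spikes."""
    e = np.exp(j - np.max(j))   # numerically stable softmax
    v = e / e.sum()
    s = np.zeros_like(v)
    s[np.argmax(v)] = 1.0       # only the winning unit emits a pulse
    return v, s
```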
 ### The LIF (Leaky Integrate-and-Fire) Cell
 
 This cell (the "leaky integrator") models dynamics over the voltage `v`
-and threshold shift `thrTheta` (a homeostatic variable). Note that `thr`
+and threshold shift `thr_theta` (a homeostatic variable). Note that `thr`
 is used as a baseline level for the membrane potential threshold while
 `thrTheta` is treated as a form of short-term plasticity (full
 threshold is: `thr + thrTheta(t)`).
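The leaky integration and homeostatic threshold shift described above can be sketched as follows (Euler integration; the parameter names and time constants are illustrative assumptions, not the LIF cell's API):

```python
import numpy as np

def lif_step(v, thr_theta, j, dt=1.0, tau_m=10.0, tau_theta=100.0,
             thr=1.0, v_rest=0.0, v_reset=0.0, theta_plus=0.05):
    """One Euler step of LIF dynamics with a homeostatic threshold shift."""
    v = v + (dt / tau_m) * ((v_rest - v) + j)   # leaky integration toward rest
    s = (v > (thr + thr_theta)).astype(float)   # full threshold = thr + thr_theta
    v = v * (1.0 - s) + v_reset * s             # reset units that spiked
    # the threshold shift decays slowly and is bumped upward on each spike
    thr_theta = thr_theta - (dt / tau_theta) * thr_theta + theta_plus * s
    return v, thr_theta, s
```

Because `thr_theta` rises with each spike and decays slowly, frequently spiking units become harder to excite, which is the short-term homeostatic effect the text describes.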

docs/modeling/synapses.md

Lines changed: 16 additions & 12 deletions
@@ -1,17 +1,7 @@
 # Synapses
 
-The synapse is a key building block for connecting/wiring together the various
-component cells that one would use for characterizing a biomimetic neural system.
-These particular objects are meant to perform, per simulated time step, a
-specific type of transformation -- such as a linear transform or a
-convolution -- utilizing their underlying synaptic parameters.
-Most times, a synaptic cable will be represented by a set of matrices (or filters)
-that are used to conduct a projection of an input signal (a value presented to a
-pre-synaptic/input compartment) resulting in an output signal (a value that
-appears within one of its post-synaptic compartments). Notably, a synapse component is
-typically associated with a local plasticity rule, e.g., a Hebbian-type
-update, that either is triggered online, i.e., at some or all simulation time
-steps, or by integrating a differential equation, e.g., via eligibility traces.
+The synapse is a key building block for connecting/wiring together the various component cells that one would use for characterizing a biomimetic neural system. These particular objects are meant to perform, per simulated time step, a specific type of transformation -- such as a linear transform or a convolution -- utilizing their underlying synaptic parameters. Most times, a synaptic cable will be represented by a set of matrices (or filters) that are used to conduct a projection of an input signal (a value presented to a pre-synaptic/input compartment) resulting in an output signal (a value that appears within one of its post-synaptic compartments). There are three general groupings of synaptic components in ngc-learn: 1) non-plastic static synapses (which only perform fixed transformations of input signals); 2) non-plastic dynamic synapses (which perform time-varying, input-dependent transformations of input signals); and 3) plastic synapses that carry out long-term evolution.
+Notably, plastic synapse components are typically associated with a local plasticity rule, e.g., a Hebbian-type update, that is either triggered online, i.e., at some or all simulation time steps, or realized by integrating a differential equation, e.g., via eligibility traces.
 
 ## Non-Plastic Synapse Types
 
@@ -74,6 +64,20 @@ This (chemical) synapse performs a linear transform of its input signals. Note t
     :noindex:
 ```
 
+### Double-Exponential Synapse
+
+This (chemical) synapse performs a linear transform of its input signals. Note that this synapse is "dynamic" in the sense that its efficacies are a function of their pre-synaptic inputs; there is no inherent form of long-term plasticity in this base implementation. Synaptic strength values can be viewed as being filtered/smoothed through a double-exponential (difference of two exponentials) kernel.
+
+```{eval-rst}
+.. autoclass:: ngclearn.components.DoubleExpSynapse
+  :noindex:
+
+  .. automethod:: advance_state
+    :noindex:
+  .. automethod:: reset
+    :noindex:
+```
+
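The difference-of-two-exponentials kernel that shapes this synapse's conductance can be sketched as below; the function name and default time constants are illustrative assumptions, not the `DoubleExpSynapse` internals:

```python
import numpy as np

def double_exp_kernel(t, tau_rise=1.0, tau_decay=5.0):
    """Difference-of-two-exponentials conductance kernel (zero for t < 0)."""
    t = np.asarray(t, dtype=float)
    g = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)
    return np.where(t >= 0.0, g, 0.0)
```

The kernel starts at zero at the moment of a spike, rises on the `tau_rise` timescale, peaks, and then decays on the slower `tau_decay` timescale, giving a finite rise time that a single-exponential synapse lacks.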
 ### Alpha Synapse
 
 This (chemical) synapse performs a linear transform of its input signals. Note that this synapse is "dynamic" in the sense that its efficacies are a function of their pre-synaptic inputs; there is no inherent form of long-term plasticity in this base implementation. Synaptic strength values can be viewed as being filtered/smoothed through a kernel that models more realistic rise and fall times of synaptic conductance.

docs/ngclearn_papers.md

Lines changed: 4 additions & 2 deletions
@@ -17,12 +17,14 @@ from data streams." arXiv preprint arXiv:1908.08655 (2019).
 a hyperdimensional predictive processing cognitive architecture."
 Proceedings of the Annual Meeting of the Cognitive Science Society (CogSci), Volume 44 (2022).
 
-5. Ororbia, A., and Kelly, M. Alex. "Learning using a Hyperdimensional Predictive Processing Cognitive
-Architecture." 15th International Conference on Artificial General Intelligence (AGI) (2022).
+5. Ororbia, A., and Kelly, M. Alex. "Learning using a hyperdimensional predictive processing cognitive
+architecture." 15th International Conference on Artificial General Intelligence (AGI) (2022).
 
 6. Ororbia, A., Mali, A., Kifer, D., & Giles, C. L. "Lifelong neural predictive coding: Learning cumulatively online without
 forgetting." Thirty-sixth Conference on Neural Information Processing Systems (NeurIPS) (2022).
 
+7. Ororbia, A., Friston, K., Rao, Rajesh P. N. "Meta-representational predictive coding: Biomimetic self-supervised learning." arXiv preprint arXiv:2503.21796 (2025).
+
 <b>Note:</b> Please let us know if your work uses ngc-learn so we can update this page to accurately track
 ngc-learn's use and include your work in the accumulating body of work in predictive processing
 and/or brain-inspired computational modeling.

ngclearn/components/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -39,7 +39,7 @@
 from .synapses.hebbian.BCMSynapse import BCMSynapse
 from .synapses.STPDenseSynapse import STPDenseSynapse
 from .synapses.exponentialSynapse import ExponentialSynapse
-from .synapses.doubleExpSynapse import DoupleExpSynapse
+from .synapses.doubleExpSynapse import DoubleExpSynapse
 from .synapses.alphaSynapse import AlphaSynapse
 
 ## point to convolutional component types

ngclearn/components/synapses/__init__.py

Lines changed: 1 addition & 4 deletions
@@ -1,11 +1,10 @@
 from .denseSynapse import DenseSynapse
 from .staticSynapse import StaticSynapse
 
-
 ## short-term plasticity components
 from .STPDenseSynapse import STPDenseSynapse
 from .exponentialSynapse import ExponentialSynapse
-from .doubleExpSynapse import DoupleExpSynapse
+from .doubleExpSynapse import DoubleExpSynapse
 from .alphaSynapse import AlphaSynapse
 
 ## dense synaptic components
@@ -15,7 +14,6 @@
 from .hebbian.eventSTDPSynapse import EventSTDPSynapse
 from .hebbian.BCMSynapse import BCMSynapse
 
-
 ## conv/deconv synaptic components
 from .convolution.convSynapse import ConvSynapse
 from .convolution.staticConvSynapse import StaticConvSynapse
@@ -26,7 +24,6 @@
 from .convolution.hebbianDeconvSynapse import HebbianDeconvSynapse
 from .convolution.traceSTDPDeconvSynapse import TraceSTDPDeconvSynapse
 
-
 ## modulated synaptic components
 from .modulated.MSTDPETSynapse import MSTDPETSynapse
 # from .modulated.REINFORCESynapse import REINFORCESynapse

ngclearn/components/synapses/doubleExpSynapse.py

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
 from ngcsimlib.compartment import Compartment
 from ngcsimlib.parser import compilable
 
-class DoupleExpSynapse(DenseSynapse): ## dynamic double-exponential synapse cable
+class DoubleExpSynapse(DenseSynapse): ## dynamic double-exponential synapse cable
     """
     A dynamic double-exponential synaptic cable; this synapse evolves according to difference of two exponentials
     synaptic conductance dynamics.
