Description
Currently, we implement synapses without their own dynamics. Inputs to a layer are computed by left-multiplying a vector of pre-synaptic spikes by a synapse weight matrix. However, synaptic currents often follow temporal dynamics, such as exponential decay.
We can think of the current implementation as using temporal decay with positive infinity as the time constant (tau = np.inf). We can generalize the implementation such that this is kept as the default, but if the user passes in a finite time constant, e.g., 100ms, the synaptic current is computed as the sum of the existing current, decayed exponentially, and the product of the pre-synaptic spike vector with the synapse weight matrix.
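A minimal sketch of the proposed update rule (the names `synapse_step`, `dt`, and `tau` are illustrative, not the library's API; `np.inf` is treated as a sentinel for the existing stateless behavior):

```python
import numpy as np

def synapse_step(current, weights, spikes, dt=1.0, tau=np.inf):
    """One timestep of synaptic current with optional exponential decay.

    tau = np.inf reproduces the existing behaviour: the input is just
    the weighted spike vector, with no carried-over state.
    """
    if np.isinf(tau):
        return weights @ spikes
    # Decay the existing current, then add the newly weighted spikes.
    return np.exp(-dt / tau) * current + weights @ spikes
```

With `tau = 100.0` and `dt = 1.0` (both in ms), the carried current decays by a factor of `exp(-1/100)`, i.e. roughly 1% per step.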
Having synapse parameters with their own dynamics in SNNs is not uncommon. This will be our first foray into it, however. My biggest concern is with minibatch processing: having stateful synapses means that we must duplicate their state across the minibatch dimension. Duplicating weight matrices will cause us to run out of memory quickly!
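To make the memory concern concrete, here is a sketch of one batched variant, under the assumption that the per-sample state is the synaptic current vector while the weight matrix stays shared across the batch (all shapes and names here are illustrative):

```python
import numpy as np

batch, n_pre, n_post = 32, 1000, 1000
weights = np.random.rand(n_pre, n_post)    # shared across the batch: n_pre x n_post
currents = np.zeros((batch, n_post))       # per-sample state: batch x n_post
spikes = (np.random.rand(batch, n_pre) < 0.05).astype(float)

dt, tau = 1.0, 100.0
# Batched update: decay each sample's current, add its weighted spikes.
currents = np.exp(-dt / tau) * currents + spikes @ weights
```

Under this assumption, the extra per-batch memory is `batch * n_post` floats per connection; duplicating the full weight matrix across the batch would instead cost `batch * n_pre * n_post`, which is where memory runs out quickly.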