
Commit f21aff0

update changes
1 parent c96ddb7 commit f21aff0

File tree: 1 file changed (+140, -15 lines)

changes.md: 140 additions & 15 deletions

# Change from Version 2.3.4 to Version 2.3.5
This release (under the branch of ``brainpy=2.3.x``) continues to add support for brain-inspired computation.

## New Features

### 1. ``brainpy.share`` for sharing data across submodules
In this release, we abstract the shared data as a ``brainpy.share`` object.
This object, together with the ``brainpy.Delay`` introduced below, provides the support needed to define SNN models in the same way as ANN ones.
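For example, the shared arguments of a simulation step (such as the current time ``t`` and the step size ``dt``) can be written into and read from this object by any submodule. The snippet below is a minimal sketch assuming the ``save``/``load`` interface of ``brainpy.share``; the concrete values are illustrative.

```python
import brainpy as bp

# Write shared entries (a runner normally does this at every time step);
# the values below are placeholders.
bp.share.save(t=0., dt=0.1)

# Any submodule can read the shared entries by name.
t = bp.share.load('t')
dt = bp.share.load('dt')
print(t, dt)
```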
### 2. ``brainpy.Delay`` for delay processing
``Delay`` is abstracted as a dynamical system, which can be updated and retrieved by users.
```python
import brainpy as bp


class EINet(bp.DynamicalSystemNS):
  def __init__(self, scale=1.0, e_input=20., i_input=20., delay=None):
    super().__init__()

    self.bg_exc = e_input
    self.bg_inh = i_input

    # network size
    num_exc = int(3200 * scale)
    num_inh = int(800 * scale)

    # neurons
    pars = dict(V_rest=-60., V_th=-50., V_reset=-60., tau=20., tau_ref=5.,
                V_initializer=bp.init.Normal(-55., 2.), input_var=False)
    self.E = bp.neurons.LIF(num_exc, **pars)
    self.I = bp.neurons.LIF(num_inh, **pars)

    # synapses
    we = 0.6 / scale  # excitatory synaptic weight (voltage)
    wi = 6.7 / scale  # inhibitory synaptic weight
    self.E2E = bp.experimental.Exponential(
      bp.conn.FixedProb(0.02, pre=self.E.size, post=self.E.size),
      g_max=we, tau=5., out=bp.experimental.COBA(E=0.)
    )
    self.E2I = bp.experimental.Exponential(
      bp.conn.FixedProb(0.02, pre=self.E.size, post=self.I.size),
      g_max=we, tau=5., out=bp.experimental.COBA(E=0.)
    )
    self.I2E = bp.experimental.Exponential(
      bp.conn.FixedProb(0.02, pre=self.I.size, post=self.E.size),
      g_max=wi, tau=10., out=bp.experimental.COBA(E=-80.)
    )
    self.I2I = bp.experimental.Exponential(
      bp.conn.FixedProb(0.02, pre=self.I.size, post=self.I.size),
      g_max=wi, tau=10., out=bp.experimental.COBA(E=-80.)
    )
    self.delayE = bp.Delay(self.E.spike, entries={'E': delay})
    self.delayI = bp.Delay(self.I.spike, entries={'I': delay})

  def update(self):
    e_spike = self.delayE.at('E')
    i_spike = self.delayI.at('I')
    e_inp = self.E2E(e_spike, self.E.V) + self.I2E(i_spike, self.E.V) + self.bg_exc
    i_inp = self.I2I(i_spike, self.I.V) + self.E2I(e_spike, self.I.V) + self.bg_inh
    self.delayE(self.E(e_inp))
    self.delayI(self.I(i_inp))
```
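A short usage sketch of the model above (the delay value, monitor name, and simulation length are illustrative, and ``bp.DSRunner`` is used here only as one possible driver):

```python
# Build the network with a 2 ms delay on the spike variables.
net = EINet(scale=1., delay=2.)

# DSRunner calls net.update() at every step and fills in the shared
# time/step information through brainpy.share.
runner = bp.DSRunner(net, monitors={'E.spike': net.E.spike})
runner.run(100.)  # simulate 100 ms
```
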
### 3. ``brainpy.checkpoints.save_pytree`` and ``brainpy.checkpoints.load_pytree`` for saving/loading a target to/from a given file

Now we can directly use ``brainpy.checkpoints.save_pytree`` to save a network state into the file path we specify.

Similarly, we can use ``brainpy.checkpoints.load_pytree`` to load states from the given file path.
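A minimal sketch, assuming the ``save_pytree(filename, target)`` / ``load_pytree(filename)`` call order; the file name and the pytree contents are illustrative, not part of the release notes:

```python
import brainpy as bp
import brainpy.math as bm

# Any pytree (e.g., a nested dict of arrays) can stand in for a network state.
state = {'weights': {'w1': bm.ones((3, 3)), 'b1': bm.zeros(3)},
         'step': bm.asarray(0)}

bp.checkpoints.save_pytree('model.bp', state)      # write the pytree to disk
restored = bp.checkpoints.load_pytree('model.bp')  # read it back with the same structure
```
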
### 4. More ANN layers

- brainpy.layers.ConvTranspose1d
- brainpy.layers.ConvTranspose2d
- brainpy.layers.ConvTranspose3d
- brainpy.layers.Conv1dLSTMCell
- brainpy.layers.Conv2dLSTMCell
- brainpy.layers.Conv3dLSTMCell
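As a sketch of the new layers, assuming ``ConvTranspose2d`` follows the same ``(in_channels, out_channels, kernel_size, ...)`` constructor and channel-last input layout as the existing ``brainpy.layers.Conv2d``:

```python
import brainpy as bp
import brainpy.math as bm

# Assumed Conv2d-style constructor; the argument values are illustrative.
deconv = bp.layers.ConvTranspose2d(in_channels=16, out_channels=8,
                                   kernel_size=(3, 3), stride=(2, 2))

x = bm.ones((4, 14, 14, 16))  # (batch, height, width, channels) layout assumed
y = deconv(x)                 # upsampled feature map
print(y.shape)
```
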
### 5. More compatible dense operators

PyTorch operators:

- brainpy.math.Tensor
- brainpy.math.flatten
- brainpy.math.cat
- brainpy.math.abs
- brainpy.math.absolute
- brainpy.math.acos
- brainpy.math.arccos
- brainpy.math.acosh
- brainpy.math.arccosh
- brainpy.math.add
- brainpy.math.addcdiv
- brainpy.math.addcmul
- brainpy.math.angle
- brainpy.math.asin
- brainpy.math.arcsin
- brainpy.math.asinh
- brainpy.math.arcsinh
- brainpy.math.atan
- brainpy.math.arctan
- brainpy.math.atan2
- brainpy.math.atanh
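A few of the PyTorch-style aliases in use, assuming they mirror the basic behaviour of their ``torch`` counterparts (the arrays below are arbitrary examples):

```python
import brainpy.math as bm

a = bm.arange(6.).reshape(2, 3)
b = -a

print(bm.cat([a, b]))       # concatenate along the first axis, like torch.cat
print(bm.abs(b))            # element-wise absolute value
print(bm.flatten(a))        # flatten to a 1-D array
print(bm.atan2(a, a + 1.))  # element-wise arctangent of a / (a + 1)
```
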


TensorFlow operators:

- brainpy.math.concat
- brainpy.math.reduce_sum
- brainpy.math.reduce_max
- brainpy.math.reduce_min
- brainpy.math.reduce_mean
- brainpy.math.reduce_all
- brainpy.math.reduce_any
- brainpy.math.reduce_logsumexp
- brainpy.math.reduce_prod
- brainpy.math.reduce_std
- brainpy.math.reduce_variance
- brainpy.math.reduce_euclidean_norm
- brainpy.math.unsorted_segment_sqrt_n
- brainpy.math.segment_mean
- brainpy.math.unsorted_segment_sum
- brainpy.math.unsorted_segment_prod
- brainpy.math.unsorted_segment_max
- brainpy.math.unsorted_segment_min
- brainpy.math.unsorted_segment_mean
- brainpy.math.segment_sum
- brainpy.math.segment_prod
- brainpy.math.segment_max
- brainpy.math.segment_min
- brainpy.math.clip_by_value
- brainpy.math.cast
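Similarly, a sketch of the TensorFlow-style reductions and segment operations, assuming the keyword names follow the TensorFlow originals (the data is illustrative):

```python
import brainpy.math as bm

x = bm.asarray([[1., 2., 3.],
                [4., 5., 6.]])

print(bm.reduce_sum(x))          # sum over all elements -> 21.0
print(bm.reduce_max(x, axis=0))  # column-wise maxima    -> [4., 5., 6.]

data = bm.asarray([1., 2., 3., 4.])
seg = bm.asarray([0, 0, 1, 1])   # segment id of each element
print(bm.segment_sum(data, seg)) # per-segment sums      -> [3., 7.]
```
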


### Others

- Remove the hard requirements of ``brainpylib`` and ``numba``.

