1 file changed: +6 −5 lines changed

@@ -231,11 +231,12 @@ timestep_fraction: int, optional
 ------------
 
 softmax_to_relu: bool, optional
-    If ``True``, replace softmax by ReLU activation function. This is
-    recommended (default), because the spiking softmax implementation tends to
-    reduce accuracy, especially top-5. It is safe to do this replacement as long
-    as the input to the activation function is not all negative. In that case,
-    the ReLU would not be able to determine the winner.
+    If ``True``, replace softmax by ReLU activation function, which is
+    the default for most simulator backends. The ``INI`` simulator by default
+    approximates the softmax in a spiking implementation. This approximation may
+    reduce accuracy. It is safe to replace softmax by ReLU as long as the inputs
+    to the activation function are not all negative for a sample. In that case,
+    the spiking ReLU neurons would not be able to determine a winner.
 
 maxpool_type: str, optional
     Implementation variants of spiking MaxPooling layers, based on
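Options like ``softmax_to_relu`` and ``maxpool_type`` are toolbox configuration settings, which are typically supplied through an INI-style file. A minimal sketch of reading such a setting with Python's standard ``configparser`` (the ``[conversion]`` section name and the ``fir_max`` value are assumptions for illustration, not confirmed by this diff):

```python
import configparser

# Hypothetical INI-style config fragment; section and value names are
# illustrative, not taken from the toolbox's actual schema.
config_text = """
[conversion]
softmax_to_relu = True
maxpool_type = fir_max
"""

config = configparser.ConfigParser()
config.read_string(config_text)

# getboolean parses the string "True" into a Python boolean.
replace_softmax = config.getboolean("conversion", "softmax_to_relu")
print(replace_softmax)  # True
```

Parsing booleans with ``getboolean`` (rather than comparing raw strings) accepts the usual INI spellings such as ``True``, ``yes``, and ``1``.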