In {NEURON}.update_output, the input sum is often huge when there are many neurons in the previous layer. This saturates the sigmoid: two very different large sums both produce outputs indistinguishable from 1.0 (e.g. 0.99999999 vs. 0.999999999), so the neuron cannot learn from the difference between them.
Therefore, we must implement softmax, which normalizes the raw sums into the range (0, 1) across the whole layer instead of saturating each one independently.
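A minimal sketch of what that could look like, assuming a NumPy-based implementation (the standalone `softmax` function and example values here are illustrative, not existing code in this repo):

    import numpy as np

    def softmax(z):
        # Subtract the max sum before exponentiating; softmax is
        # invariant to this shift, and it prevents overflow when
        # the weighted input sums are very large.
        shifted = z - np.max(z)
        exps = np.exp(shifted)
        return exps / np.sum(exps)

    # Three huge pre-activation sums that a sigmoid would clamp
    # indistinguishably to ~1.0 still yield distinct outputs:
    print(softmax(np.array([1000.0, 1001.0, 1002.0])))
    # -> [0.09003057 0.24472847 0.66524096]

Note that unlike the sigmoid, which is applied per neuron, softmax needs the whole layer's sums at once, so it would have to be applied at the layer level rather than inside each neuron's update_output.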