
Implementing softmax on the neuron's input sum #7

@ZeLarpMaster

Description


In `{NEURON}.update_output`, the input sum is often huge when there are many neurons in the previous layer. This keeps the neuron from learning properly: the sigmoid clamps every huge input sum to (effectively) 1, so the neuron can't tell the difference between, say, 1.00000000 and 1.000000000.
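
A minimal sketch of the saturation, assuming a standard logistic sigmoid (the repo's actual activation code isn't shown in this issue; the function and values below are illustrative only):

```python
import math

def sigmoid(x):
    # Standard logistic function: maps any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

# With many inputs, the pre-activation sum grows large and the
# sigmoid saturates: different sums become indistinguishable.
print(sigmoid(30))  # 0.9999999999999065
print(sigmoid(40))  # 1.0 -- indistinguishable at double precision
```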

Therefore, we must implement softmax so the input sums are properly normalized into the 0-to-1 range.
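
A minimal sketch of a numerically stable softmax, assuming the layer's raw input sums are available as a plain list; the function name `softmax` and its integration point are assumptions, not code from this repo:

```python
import math

def softmax(input_sums):
    # Subtracting the max before exponentiating avoids overflow on
    # huge input sums without changing the result (softmax is
    # shift-invariant).
    m = max(input_sums)
    exps = [math.exp(s - m) for s in input_sums]
    total = sum(exps)
    # Each output lands in (0, 1] and the outputs sum to 1, so even
    # very large sums stay distinguishable relative to each other.
    return [e / total for e in exps]

print(softmax([1000.0, 1010.0, 1005.0]))
# ~ [4.5e-05, 0.993, 0.0067] -- large sums, but distinct outputs
```

One design note: unlike the sigmoid, which is applied per neuron, softmax is a layer-wide operation over all the input sums at once, so it would need to live at the layer level rather than inside a single neuron's `update_output`.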
