This repository was archived by the owner on Feb 6, 2025. It is now read-only.

[🔷 Feature request]: Derivative of Softmax #27

@matiasvlevi

Description

Feature

Softmax activation function.

Type

  • Dann
  • Matrix
  • Layer
  • Activation functions
  • Loss functions
  • Pool functions
  • Datasets
  • Documentation
  • Tests & examples
  • Other

Description

Here is the softmax function I wrote not so long ago:

/**
* Softmax function
* @method softmax
* @param z An array of numbers (vector)
* @return An array of numbers (vector)
*/
function softmax(z) {
  let ans = [];
  let denom = 0;
  // Subtract the max value for numerical stability;
  // Math.exp overflows to Infinity for large inputs otherwise.
  let max = Math.max(...z);
  for (let j = 0; j < z.length; j++) {
    denom += Math.exp(z[j] - max);
  }
  for (let i = 0; i < z.length; i++) {
    let top = Math.exp(z[i] - max);
    ans.push(top / denom);
  }
  return ans;
}

This function is not implemented in the repository yet.

For this function to work in a neural network, we would also need to write its derivative. This might be a difficult task, since the function takes in and outputs vectors (represented as arrays), so its derivative is a full Jacobian matrix rather than a single number.
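For reference, the softmax derivative is the Jacobian ∂s_i/∂z_j = s_i(δ_ij − s_j). Here is a minimal sketch of what that could look like; the `softmaxJacobian` name is my own and not something that exists in the repo:

```javascript
// Numerically stable softmax (same idea as the function above).
function softmax(z) {
  const max = Math.max(...z);
  const exps = z.map((v) => Math.exp(v - max));
  const denom = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / denom);
}

// Jacobian of softmax: J[i][j] = s_i * (kroneckerDelta(i, j) - s_j).
// Takes a vector z, returns a 2D array (matrix).
function softmaxJacobian(z) {
  const s = softmax(z);
  return s.map((si, i) =>
    s.map((sj, j) => si * ((i === j ? 1 : 0) - sj))
  );
}
```

Two sanity checks on this matrix: each row sums to 0 (softmax outputs always sum to 1, so the total cannot change), and it is symmetric.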

These two functions would need to be implemented in src/core/functions/actfuncs.js.

For this function to work with a Dann model, we would need to change how activations are handled, since softmax expects a vector instead of a single number value. I could work on that once the derivative is implemented.
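One note that might simplify the integration: when softmax is paired with the cross-entropy loss, the Jacobian-times-gradient product in the backward pass collapses to `s - y`, so the full matrix is never needed in that common case. A hedged sketch of that shortcut (the function name is mine, not part of Dann's API):

```javascript
// Numerically stable softmax.
function softmax(z) {
  const max = Math.max(...z);
  const exps = z.map((v) => Math.exp(v - max));
  const denom = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / denom);
}

// Gradient of cross-entropy loss w.r.t. the pre-softmax values z,
// for a one-hot (or probability) target vector y: dL/dz = softmax(z) - y.
function softmaxCrossEntropyGradient(z, y) {
  const s = softmax(z);
  return s.map((si, i) => si - y[i]);
}
```

This shortcut only applies to the softmax + cross-entropy pairing; with any other loss, the backward pass would need the full Jacobian from the derivative discussed above.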
