This repository was archived by the owner on Feb 6, 2025. It is now read-only.
[🔷 Feature request ]: Derivative of Softmax #27
Feature
Softmax activation function.
Type
- Dann
- Matrix
- Layer
- Activation functions
- Loss functions
- Pool functions
- Datasets
- Documentation
- Tests & examples
- Other
Description
Here is the softmax function I wrote not so long ago:
```javascript
/**
 * Softmax function
 * @method softmax
 * @param z An array of numbers (vector)
 * @return An array of numbers (vector)
 */
function softmax(z) {
  let ans = [];
  let denom = 0;
  for (let j = 0; j < z.length; j++) {
    denom += Math.exp(z[j]);
  }
  for (let i = 0; i < z.length; i++) {
    let top = Math.exp(z[i]);
    ans.push(top / denom);
  }
  return ans;
}
```

This function is not implemented in the repository yet.
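As a side note, the version above can overflow for large inputs (`Math.exp(1000)` is `Infinity`, so every output becomes `NaN`). A common fix is to subtract the maximum entry before exponentiating; softmax is shift-invariant, so the result is unchanged. A minimal sketch (the name `softmaxStable` is hypothetical, not existing repository code):

```javascript
// Numerically stable softmax: subtracting max(z) before exponentiating
// avoids overflow, and softmax is shift-invariant so the output is identical.
// NOTE: softmaxStable is a hypothetical name, not code from the repository.
function softmaxStable(z) {
  const max = Math.max(...z);
  let denom = 0;
  const exps = z.map((v) => {
    const e = Math.exp(v - max);
    denom += e;
    return e;
  });
  return exps.map((e) => e / denom);
}
```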
For this function to work in a neural network, we would need to write its derivative. This might be a difficult task, since the function takes in and outputs vectors, which are represented as arrays.
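Because softmax maps a vector to a vector, its derivative is a Jacobian matrix rather than a single number: `J[i][j] = ∂softmax(z)[i] / ∂z[j] = s[i] * (δ(i,j) - s[j])`, where `s = softmax(z)`. A sketch of what that could look like (the name `softmaxJacobian` and the array-of-arrays return type are assumptions for illustration):

```javascript
// Softmax, condensed from the snippet above.
function softmax(z) {
  const denom = z.reduce((acc, v) => acc + Math.exp(v), 0);
  return z.map((v) => Math.exp(v) / denom);
}

// Jacobian of softmax: J[i][j] = s[i] * (kronecker(i, j) - s[j]).
// NOTE: softmaxJacobian is a hypothetical name, not existing repository code.
function softmaxJacobian(z) {
  const s = softmax(z);
  return s.map((si, i) => s.map((sj, j) => si * ((i === j ? 1 : 0) - sj)));
}
```

Each row of this matrix sums to zero, since the softmax outputs always sum to one no matter how `z` changes; that makes a handy sanity check for tests.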
These two functions would need to be implemented in src/core/functions/actfuncs.js.
For this function to work with a Dann model, we would need to change how activations are handled, since it expects a vector instead of a single number. I could work on that once the derivative is implemented.
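One way the vector-valued backward pass could work (a sketch under the assumption that the layer receives the upstream gradient `dL/ds` as an array): the full Jacobian never needs to be materialized, because `dL/dz[i] = s[i] * (dL/ds[i] - Σ_j s[j] * dL/ds[j])`. The function name `softmaxBackward` is hypothetical, not existing Dann code:

```javascript
// Softmax, condensed from the snippet above.
function softmax(z) {
  const denom = z.reduce((acc, v) => acc + Math.exp(v), 0);
  return z.map((v) => Math.exp(v) / denom);
}

// Backward pass through softmax without building the full Jacobian.
// s is the softmax output, grad is dL/ds; returns dL/dz.
// NOTE: softmaxBackward is a hypothetical name, not existing Dann code.
function softmaxBackward(s, grad) {
  const dot = s.reduce((acc, sj, j) => acc + sj * grad[j], 0);
  return s.map((si, i) => si * (grad[i] - dot));
}
```

With cross-entropy loss and a one-hot target `t`, this reduces to the well-known `dL/dz = s - t`, which makes a good test case.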