Commit 8992e18

Merge pull request #379 from apphp/Sam-12-softmax-function
Sam 12 softmax and softplus functions
2 parents 5a1d2c4 + 99a4ccd commit 8992e18

File tree

20 files changed: +511 additions, −49 deletions

docs/neural-network/activation-functions/elu.md

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ ELU is a simple function and is well-suited for deployment on resource-constrain

## Example
```php
-use Rubix\ML\NeuralNet\ActivationFunctions\ELU;
+use Rubix\ML\NeuralNet\ActivationFunctions\ELU\ELU;

$activationFunction = new ELU(2.5);
```

docs/neural-network/activation-functions/gelu.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ GELU is computationally more expensive than simpler activation functions like Re

## Example
```php
-use Rubix\ML\NeuralNet\ActivationFunctions\GELU;
+use Rubix\ML\NeuralNet\ActivationFunctions\GELU\GELU;

$activationFunction = new GELU();
```

docs/neural-network/activation-functions/hyperbolic-tangent.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ Hyperbolic Tangent requires more computational resources compared to simpler act

## Example
```php
-use Rubix\ML\NeuralNet\ActivationFunctions\HyperbolicTangent;
+use Rubix\ML\NeuralNet\ActivationFunctions\HyperbolicTangent\HyperbolicTangent;

$activationFunction = new HyperbolicTangent();
```

docs/neural-network/activation-functions/leaky-relu.md

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ Leaky ReLU is computationally efficient, requiring only simple comparison operat

## Example
```php
-use Rubix\ML\NeuralNet\ActivationFunctions\LeakyReLU;
+use Rubix\ML\NeuralNet\ActivationFunctions\LeakyReLU\LeakyReLU;

$activationFunction = new LeakyReLU(0.3);
```

docs/neural-network/activation-functions/relu.md

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ ReLU is one of the most computationally efficient activation functions, requirin

## Example
```php
-use Rubix\ML\NeuralNet\ActivationFunctions\ReLU;
+use Rubix\ML\NeuralNet\ActivationFunctions\ReLU\ReLU;

$activationFunction = new ReLU(0.1);
```

docs/neural-network/activation-functions/relu6.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ ReLU6 maintains the computational efficiency of standard ReLU while adding an up

## Example
```php
-use Rubix\ML\NeuralNet\ActivationFunctions\ReLU6;
+use Rubix\ML\NeuralNet\ActivationFunctions\ReLU6\ReLU6;

$activationFunction = new ReLU6();
```

docs/neural-network/activation-functions/selu.md

Lines changed: 1 addition & 1 deletion
@@ -28,7 +28,7 @@ SELU is computationally more expensive than simpler activation functions like Re

## Example
```php
-use Rubix\ML\NeuralNet\ActivationFunctions\SELU;
+use Rubix\ML\NeuralNet\ActivationFunctions\SELU\SELU;

$activationFunction = new SELU();
```

docs/neural-network/activation-functions/sigmoid.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ Sigmoid is computationally more expensive than simpler activation functions like

## Example
```php
-use Rubix\ML\NeuralNet\ActivationFunctions\Sigmoid;
+use Rubix\ML\NeuralNet\ActivationFunctions\Sigmoid\Sigmoid;

$activationFunction = new Sigmoid();
```
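The visible hunks all apply the same change: each activation function class now lives in its own sub-namespace (for example, `ActivationFunctions\ELU\ELU` instead of `ActivationFunctions\ELU`). The commit title indicates that Softmax and Softplus activation functions were also added, though their files are not visible in this excerpt. A minimal usage sketch, assuming those new classes follow the same `ClassName\ClassName` convention shown above:

```php
// Hypothetical usage: the Softmax and Softplus namespaces below are assumed
// to follow the ClassName\ClassName pattern applied throughout this commit;
// they are not confirmed by the hunks shown in this excerpt.
use Rubix\ML\NeuralNet\ActivationFunctions\Softplus\Softplus;
use Rubix\ML\NeuralNet\ActivationFunctions\Softmax\Softmax;

// Softplus as a smooth, ReLU-like activation for hidden layers.
$hiddenActivation = new Softplus();

// Softmax to turn output-layer activations into a probability distribution.
$outputActivation = new Softmax();
```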

0 commit comments