
Commit 59dffb8

Merge pull request #320 from Haleshot/haleshot/softplus
Add New Problem 99: Softplus
2 parents 64cbaa3 + f374460 commit 59dffb8


2 files changed: +82 -0 lines changed


Problems/99_Softplus/learn.md

Lines changed: 42 additions & 0 deletions

### Understanding the Softplus Activation Function

The Softplus activation function is a smooth approximation of the ReLU function. It is used in neural networks where a smoother transition around zero is desired. Unlike ReLU, which has a sharp transition at $x = 0$, Softplus provides a more gradual change.

### Mathematical Definition

The Softplus function is mathematically defined as:

$$
Softplus(x) = \log(1 + e^x)
$$

Where:

- $x$ is the input to the function
- $e$ is Euler's number (approximately 2.71828)
- $\log$ is the natural logarithm
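
For example, at $x = 0$:

$$
Softplus(0) = \log(1 + e^0) = \log 2 \approx 0.693
$$

and for a large input such as $x = 10$, the output is already very close to $x$: $\log(1 + e^{10}) \approx 10.0000454$.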

### Characteristics

1. **Output Range**:
   - The output is always positive: $(0, \infty)$
   - Unlike ReLU, Softplus never outputs exactly zero

2. **Smoothness**:
   - Softplus is continuously differentiable
   - The transition around $x = 0$ is smooth, unlike ReLU's sharp "elbow"

3. **Relationship to ReLU**:
   - Softplus can be seen as a smooth approximation of ReLU
   - As $x$ becomes very negative, Softplus approaches 0
   - As $x$ becomes very positive, Softplus approaches $x$

4. **Derivative**:
   - The derivative of Softplus is the logistic sigmoid function:

   $$
   \frac{d}{dx}Softplus(x) = \frac{1}{1 + e^{-x}}
   $$
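
This sigmoid relationship can be checked numerically. Below is a minimal sketch (illustrative only) that compares a central finite difference of Softplus against the sigmoid at a few points:

```python
import math

def softplus(x: float) -> float:
    return math.log(1.0 + math.exp(x))

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

# The central finite difference of softplus should closely match the sigmoid.
h = 1e-5
for x in [-2.0, 0.0, 3.0]:
    numeric = (softplus(x + h) - softplus(x - h)) / (2 * h)
    assert abs(numeric - sigmoid(x)) < 1e-6
```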

### Use Cases

- When smooth gradients are important for optimization
- In neural networks where a continuous approximation of ReLU is needed
- In situations where strictly positive outputs with smooth transitions are required
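
As a small illustration of how Softplus tracks ReLU away from zero while smoothing the transition near it, here is a minimal comparison sketch:

```python
import math

def softplus(x: float) -> float:
    return math.log(1.0 + math.exp(x))

def relu(x: float) -> float:
    return max(0.0, x)

# Near zero the two functions differ visibly; far from zero they nearly coincide.
for x in [-5.0, -1.0, 0.0, 1.0, 5.0]:
    print(f"x = {x:+.1f}  relu = {relu(x):.4f}  softplus = {softplus(x):.4f}")
```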

Problems/99_Softplus/solution.py

Lines changed: 40 additions & 0 deletions

import math

def softplus(x: float) -> float:
    """
    Compute the softplus activation function.

    Args:
        x: Input value

    Returns:
        The softplus value: log(1 + e^x)
    """
    # To prevent overflow for large positive values
    if x > 100:
        return x
    # To prevent underflow for large negative values
    if x < -100:
        return 0.0

    return math.log(1.0 + math.exp(x))

def test_softplus():
    # Test case 1: x = 0
    assert abs(softplus(0) - math.log(2)) < 1e-6, "Test case 1 failed"

    # Test case 2: large positive number
    assert abs(softplus(100) - 100) < 1e-6, "Test case 2 failed"

    # Test case 3: large negative number
    assert abs(softplus(-100)) < 1e-6, "Test case 3 failed"

    # Test case 4: positive number
    assert abs(softplus(2) - 2.1269280110429727) < 1e-6, "Test case 4 failed"

    # Test case 5: negative number
    assert abs(softplus(-2) - 0.12692801104297272) < 1e-6, "Test case 5 failed"

if __name__ == "__main__":
    test_softplus()
    print("All Softplus tests passed.")
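
As an aside, the explicit thresholds above can also be avoided with the identity softplus(x) = max(x, 0) + log(1 + exp(-|x|)), which never exponentiates a large positive value. A minimal sketch of that variant (illustrative only, not part of the committed file):

import math

def softplus_stable(x: float) -> float:
    # Illustrative sketch of the stable identity, not the committed solution:
    # the exponent -|x| is never positive, so exp() cannot overflow and no
    # hand-picked cutoff such as 100 is needed.
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))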

0 commit comments
