Commit da2b5b2

Merge pull request #276 from Haleshot/haleshot/PReLU
New problem: PReLU
2 parents 498faa7 + d98fc20

File tree: 2 files changed, +74 -0 lines changed

Problems/98_PReLU/learn.md

Lines changed: 39 additions & 0 deletions
### Understanding the PReLU (Parametric ReLU) Activation Function

The PReLU (Parametric Rectified Linear Unit) is an advanced variant of the ReLU activation function that introduces a learnable parameter for negative inputs. This makes it more flexible than standard ReLU and helps prevent the "dying ReLU" problem.

#### Mathematical Definition

The PReLU function is defined as:

$$
\text{PReLU}(x) = \begin{cases}
x & \text{if } x > 0 \\
\alpha x & \text{otherwise}
\end{cases}
$$

Where:
- $x$ is the input value
- $\alpha$ is a learnable parameter (typically initialized to a small value like 0.25)
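For example, with the default $\alpha = 0.25$, a positive input passes through unchanged ($\text{PReLU}(2.0) = 2.0$), while a negative input is scaled down ($\text{PReLU}(-2.0) = 0.25 \times (-2.0) = -0.5$).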
#### Key Characteristics

1. **Adaptive Slope**: Unlike ReLU, which has a zero slope for negative inputs, PReLU learns the optimal negative slope parameter ($\alpha$) during training.

2. **Output Range**:
   - For $x > 0$: Output equals input ($y = x$)
   - For $x \leq 0$: Output is scaled by $\alpha$ ($y = \alpha x$)

3. **Advantages**:
   - Helps prevent the "dying ReLU" problem
   - More flexible than standard ReLU
   - Can improve model performance through the learned parameter
   - Maintains the computational efficiency of ReLU

4. **Special Cases**:
   - When $\alpha = 0$, PReLU reduces to ReLU
   - When $\alpha = 1$, PReLU becomes the identity (linear) function
   - When $\alpha$ is small and fixed (e.g., 0.01), PReLU behaves like Leaky ReLU

PReLU is particularly useful in deep neural networks where the optimal negative slope might vary across different layers or channels.
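
As a rough sketch of that per-channel idea (illustrative only and not part of this problem's files; the `prelu_channelwise` helper, the NumPy dependency, and the `(batch, channels)` shape convention are assumptions), a vectorized version with one $\alpha$ per channel could look like this:

```python
import numpy as np

def prelu_channelwise(x: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """PReLU applied elementwise, with one negative slope per channel.

    Assumes x has shape (batch, channels) and alpha has shape (channels,),
    so alpha broadcasts across the batch dimension.
    """
    return np.where(x > 0, x, alpha * x)

# Two samples, three channels, each channel with its own negative slope.
x = np.array([[ 1.0, -2.0,  0.5],
              [-1.0,  3.0, -0.5]])
alpha = np.array([0.25, 0.1, 0.5])
print(prelu_channelwise(x, alpha))
# [[ 1.   -0.2   0.5 ]
#  [-0.25  3.   -0.25]]
```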

Problems/98_PReLU/solution.py

Lines changed: 35 additions & 0 deletions
def prelu(x: float, alpha: float = 0.25) -> float:
    """
    Implements the PReLU (Parametric ReLU) activation function.

    Args:
        x: Input value
        alpha: Slope parameter for negative values (default: 0.25)

    Returns:
        float: PReLU activation value
    """
    return x if x > 0 else alpha * x


def test_prelu() -> None:
    # Test positive input (should behave like regular ReLU)
    assert prelu(2.0) == 2.0, "Test failed for positive input"

    # Test zero input
    assert prelu(0.0) == 0.0, "Test failed for zero input"

    # Test negative input with default alpha
    assert prelu(-2.0) == -0.5, "Test failed for negative input with default alpha"

    # Test negative input with custom alpha
    assert prelu(-2.0, alpha=0.1) == -0.2, "Test failed for negative input with custom alpha"

    # Test with alpha = 0 (behaves like ReLU)
    assert prelu(-2.0, alpha=0.0) == 0.0, "Test failed for ReLU behavior"

    # Test with alpha = 1 (behaves like the identity function)
    assert prelu(-2.0, alpha=1.0) == -2.0, "Test failed for linear behavior"


if __name__ == "__main__":
    test_prelu()
    print("All PReLU tests passed.")
