Logistic Regression is a fundamental algorithm for binary classification. Given input features and learned model parameters (weights and bias), your task is to implement the prediction function that computes class probabilities.
### Mathematical Background
The logistic regression model makes predictions using the sigmoid function:
$\sigma(z) = \frac{1}{1 + e^{-z}}$
where z is the linear combination of features and weights plus bias:
$z = \mathbf{w}^T\mathbf{x} + b = \sum_{i=1}^{n} w_ix_i + b$
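For a batch of samples, this linear combination can be computed for all rows at once with a single matrix-vector product. A small sketch with illustrative values (the specific numbers here are only for demonstration):

```python
import numpy as np

# Toy batch: N = 3 samples, D = 2 features (illustrative values only)
X = np.array([[1.0, 2.0],
              [0.5, -1.0],
              [3.0, 0.0]])
w = np.array([0.4, -0.2])  # weights, shape (D,)
b = 0.1                    # scalar bias

# z = Xw + b gives one linear score per sample, shape (N,)
z = X @ w + b
print(z)  # [0.1, 0.5, 1.3]
```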
### Implementation Requirements
Your task is to implement a function that:
- Takes a batch of samples $\mathbf{X}$ (shape: N × D), weights $\mathbf{w}$ (shape: D), and bias b
- Computes $z = \mathbf{X}\mathbf{w} + b$ for all samples
- Applies the sigmoid function to get probabilities
- Returns binary predictions, i.e., 0 or 1, using a threshold of 0.5
### Important Considerations
- Handle numerical stability in sigmoid computation
- Ensure efficient vectorized operations using numpy
- Return binary predictions, i.e., zeros and ones
### Hint
To prevent overflow in the exponential inside the sigmoid function, use `np.clip` to limit the z values:
```python
z = np.clip(z, -500, 500)
```
This ensures numerical stability when dealing with large input values.
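Putting the pieces together, one possible sketch of the required function (the name `predict` and its exact signature are assumptions, not mandated by the task):

```python
import numpy as np

def predict(X, w, b):
    """Return binary predictions (0 or 1) for a batch of samples.

    X: array of shape (N, D); w: weight vector of shape (D,); b: scalar bias.
    """
    z = X @ w + b                      # linear scores, shape (N,)
    z = np.clip(z, -500, 500)          # guard against overflow in np.exp
    probs = 1.0 / (1.0 + np.exp(-z))   # sigmoid probabilities in (0, 1)
    return (probs >= 0.5).astype(int)  # threshold at 0.5
```

Everything is vectorized, so the function handles the whole batch without an explicit Python loop.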