Commit d6155c5

Fix weights gradient formula for problem 50.
1 parent f82cafe commit d6155c5

File tree

1 file changed: +1 −1 lines changed

  • Problems/50_lasso_regression_gradient_descent


Problems/50_lasso_regression_gradient_descent/learn.html

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ <h3>2. Make Predictions at each step using the formula \[ \hat{y}_i = \sum_{j=1}
 
 <h3>3. Find the residuals (Difference between the actual y values and the predicted ones) </h3>
 <h3>4. Update the weights and bias using the formula </p>
-<p> First, find the gradient with respect to weights $w$ using the formula \[ \frac{\partial J}{\partial w_j} = \frac{1}{n} \sum_{i=1}^nX_{ij}(y_i - \hat{y}_i) + \alpha \cdot sign(w_j) \] </h3>
+<p> First, find the gradient with respect to weights $w$ using the formula \[ \frac{\partial J}{\partial w_j} = \frac{1}{n} \sum_{i=1}^nX_{ij}(\hat{y}_i - y_i) + \alpha \cdot sign(w_j) \] </h3>
 <p> Then, we need to find the gradient with respect to the bias $b$. Since the bias term $b$ does not have a regularization component (since Lasso regularization is applied only to the weights $w_j$), the gradient with respect to $b$ is just the partial derivative of the Mean Squared Error (MSE) loss function with respect to $b$ \[ \frac{\partial J(w,b)}{\partial b} = \frac{1}{n} \sum_{i = 1}^{n}(\hat{y}_i-y_i)\]</p>
 <p> Next, we update the weights and bias using the formula \[ w_j = w_j - \eta \cdot \frac{\partial J}{\partial w_j} \] \[ b = b - \eta \cdot \frac{\partial J}{\partial b} \] Where eta is the learning rate defined in the beginning of the function</p>
 <h3> 5. Repeat steps 2-4 until the weights converge. This is determined by evaulating the L1 Norm of the weight gradients </h3>
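For context on the fix: the corrected weight gradient uses the error term \((\hat{y}_i - y_i)\), which makes it consistent with the bias gradient in the following paragraph, so a positive residual decreases the weight during the update. Below is a minimal Python sketch of the full loop described in steps 2-5 using the corrected formula; the function name, parameter choices (alpha, eta, max_iter, tol), and the NumPy implementation are illustrative assumptions, not the repository's actual solution code.

```python
import numpy as np

def lasso_gradient_descent(X, y, alpha=0.1, eta=0.01, max_iter=1000, tol=1e-4):
    """Sketch of lasso regression via gradient descent (hypothetical helper,
    not the repo's solution). Uses the corrected gradient (y_hat - y)."""
    n, p = X.shape
    w = np.zeros(p)
    b = 0.0
    for _ in range(max_iter):
        y_hat = X @ w + b                                # step 2: predictions
        error = y_hat - y                                # step 3: residuals (y_hat - y)
        grad_w = X.T @ error / n + alpha * np.sign(w)    # corrected weight gradient
        grad_b = error.mean()                            # bias gradient (no L1 term)
        w -= eta * grad_w                                # step 4: update weights
        b -= eta * grad_b                                # step 4: update bias
        if np.linalg.norm(grad_w, ord=1) < tol:          # step 5: L1-norm convergence check
            break
    return w, b
```

With the old sign, each update would move the weights in the direction of steepest ascent of the MSE term and the loss would diverge; flipping the error term restores ordinary gradient descent.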
