
Conversation

@reddyrohith49471

This PR fixes the gradient update in LogisticRegression. The original code was missing the division by the number of samples (1/n), causing incorrect gradient scaling. This update makes the gradient consistent with standard logistic regression and the implementation in Regression.py.
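For context, here is a minimal sketch of the corrected update step. The names (`gradient_step`, `w`, `X`, `y`, `learning_rate`) are illustrative assumptions and may not match the repository's actual classes or method signatures; the point is only the `1/n` factor on the gradient.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(w, X, y, learning_rate=0.1):
    # X: (n_samples, n_features), y: labels in {0, 1}, w: parameter vector.
    # Names and signature are hypothetical, for illustration only.
    n_samples = X.shape[0]
    y_pred = sigmoid(X.dot(w))
    # Gradient of the *average* cross-entropy loss.
    # The 1/n factor is what the original code omitted.
    grad = (1.0 / n_samples) * X.T.dot(y_pred - y)
    return w - learning_rate * grad
```

Without the `1/n` factor the gradient scales with the batch size, so the effective learning rate changes with the number of samples; averaging keeps the step size consistent with the batch implementation in Regression.py.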

