Add crps losses to regressor #711
Pull request overview
This pull request adds RPS (Ranked Probability Score) and RLS (Ranked Logarithmic Score) losses to the regressor fine-tuning functionality and changes the default loss configuration to use RPS instead of the previous MSE-based approach.
- Refactored `compute_regression_loss` to `_compute_regression_loss` with expanded loss options including CE, RPS, RLS, MSE, and MAE
- Implemented the `_ranked_probability_score_loss_from_bar_logits` function for computing ranked probability losses on CDF-based bar distributions (see the sketch below)
- Changed default loss weights: `ce_loss_weight=0.0`, `rps_loss_weight=1.0`, `mse_loss_weight=1.0` (previously MSE had weight 8.0 and was the only loss)
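The PR's actual implementation is not shown on this page, but the idea behind an RPS loss over a bar (histogram) distribution can be sketched as follows. This is a minimal illustration, not the code of `_ranked_probability_score_loss_from_bar_logits`: the function name, tensor shapes, and the step-function target CDF are assumptions.

```python
import torch
import torch.nn.functional as F


def ranked_probability_score_from_bar_logits(
    logits: torch.Tensor,       # (batch, n_bars) raw logits over distribution bars
    target_bins: torch.Tensor,  # (batch,) index of the bar containing each target
) -> torch.Tensor:
    """Sketch of a discrete RPS loss: sum_k (F_pred(k) - F_target(k))^2,
    where F is the CDF over bars and F_target steps to 1 at the target's bar."""
    probs = F.softmax(logits, dim=-1)
    pred_cdf = torch.cumsum(probs, dim=-1)  # predicted CDF per bar
    n_bars = logits.shape[-1]
    bar_idx = torch.arange(n_bars, device=logits.device)
    # Target CDF is 0 before the target's bar and 1 from it onwards.
    target_cdf = (bar_idx.unsqueeze(0) >= target_bins.unsqueeze(1)).float()
    return ((pred_cdf - target_cdf) ** 2).sum(dim=-1).mean()  # mean over the batch
```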
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| `src/tabpfn/finetuning/finetuned_regressor.py` | Extended loss computation with RPS/RLS support, refactored the function signature and default parameters, added comprehensive NaN handling (see the masking sketch below) |
| `tests/test_finetuning_regressor.py` | Added three new test functions to validate RPS/RLS loss computation, NaN masking behavior, and correct loss values; renamed existing test functions with the double-underscore convention |
| `src/tabpfn/finetuning/finetuned_classifier.py` | Renamed `compute_classification_loss` to `_compute_classification_loss` for consistency with the regressor changes |
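As a rough illustration of what NaN handling in a regression loss usually amounts to, the sketch below masks out samples whose targets are NaN before reducing. The helper name and the MSE choice are hypothetical, not taken from this PR.

```python
import torch


def masked_mse_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch: ignore NaN targets when reducing the loss."""
    mask = ~torch.isnan(target)  # keep only samples with a valid target
    if not mask.any():
        # All-NaN batch: return a zero loss that stays connected to the graph.
        return pred.sum() * 0.0
    return torch.mean((pred[mask] - target[mask]) ** 2)
```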
noahho left a comment
Great job with the implementation! Had some minor comments, after which LGTM!
Co-authored-by: Jonas Landsgesell <jonaslandsgesell@gmail.com>
Co-authored-by: Pascal Knoll <knollpascal00@gmail.com>
Co-authored-by: Pascal Knoll <knollpascal@gmail.com>
Great work Ben, thanks a lot for the integration in the current version. Really looking forward to it 👍
This adds Ranked Probability Score (RPS) and Ranked Logarithmic Score (RLS) losses to the regressor finetuning code.
Note that this PR builds upon the original work by Jonas Landsgesell and Pascal Knoll in #689. It adapts their implementation to be compatible with our latest finetuning refactor and adds RLS.
Adding the RPS loss in the finetune example results in significantly better performance.
Finding good defaults for the regression loss weights would require broader benchmarking.
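For illustration, the defaults described above would combine the individual loss terms roughly as in the following sketch. Only the weight names and default values come from this PR; the helper and its signature are hypothetical.

```python
import torch


def combined_regression_loss(
    ce: torch.Tensor,
    rps: torch.Tensor,
    mse: torch.Tensor,
    ce_loss_weight: float = 0.0,   # defaults mirror the new configuration above
    rps_loss_weight: float = 1.0,
    mse_loss_weight: float = 1.0,
) -> torch.Tensor:
    """Hypothetical sketch: a weighted sum of the individual loss terms."""
    return (
        ce_loss_weight * ce
        + rps_loss_weight * rps
        + mse_loss_weight * mse
    )
```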