
Conversation

@bejaeger (Contributor) commented on Jan 7, 2026

This adds Ranked Probability Score (RPS) and Ranked Logarithmic Score (RLS) to the regressor finetuning code.

Note that this PR builds upon the original work by Jonas Landsgesell and Pascal Knoll in #689. It adapts their implementation to be compatible with our latest finetuning refactor and adds RLS.

Adding the RPS loss to the finetune example results in significantly better performance.

Finding good defaults for the regression loss weights would require broader benchmarking.
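For context on what RPS measures here: on a discretized ("bar") output distribution, RPS is the sum of squared differences between the predicted CDF and the step-function CDF of the observed target, so it rewards probability mass placed close to the true bar rather than only on it. A minimal PyTorch sketch under that definition; the function name and shapes are illustrative, not the actual TabPFN API:

```python
import torch

def rps_from_bar_logits(
    bar_logits: torch.Tensor,   # (batch, num_bars) unnormalized logits per bar
    target_bins: torch.Tensor,  # (batch,) index of the bar containing the target
) -> torch.Tensor:
    probs = torch.softmax(bar_logits, dim=-1)  # per-bar probabilities
    pred_cdf = torch.cumsum(probs, dim=-1)     # predicted CDF over the bars
    bar_idx = torch.arange(bar_logits.shape[-1], device=bar_logits.device)
    # Observed CDF: a step function that is 0 before the target bar
    # and 1 from the target bar onward.
    obs_cdf = (bar_idx.unsqueeze(0) >= target_bins.unsqueeze(1)).float()
    # RPS: squared CDF differences summed over bars, averaged over the batch.
    return ((pred_cdf - obs_cdf) ** 2).sum(dim=-1).mean()
```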

@bejaeger requested a review from a team as a code owner on January 7, 2026 11:38
@bejaeger requested reviews from Copilot and simo-prior and removed the request for a team on January 7, 2026 11:38
@chatgpt-codex-connector commented

Codex usage limits have been reached for code reviews. Please check with the admins of this repo to increase the limits by adding credits. Credits must be used to enable repository-wide code reviews.

@bejaeger bejaeger requested review from noahho and removed request for simo-prior January 7, 2026 11:38
Copilot AI (Contributor) left a comment

Pull request overview

This pull request adds RPS (Ranked Probability Score) and RLS (Ranked Logarithmic Score) losses to the regressor fine-tuning functionality and changes the default loss configuration to use RPS instead of the previous MSE-based approach.

  • Refactored compute_regression_loss to _compute_regression_loss with expanded loss options including CE, RPS, RLS, MSE, and MAE
  • Implemented _ranked_probability_score_loss_from_bar_logits function for computing ranked probability losses on CDF-based bar distributions
  • Changed the default loss weights to ce_loss_weight=0.0, rps_loss_weight=1.0, mse_loss_weight=1.0 (previously MSE, with weight 8.0, was the only loss); see the sketch after this list
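For orientation, here is a hypothetical sketch of how these weighted terms and the NaN handling might fit together, reusing rps_from_bar_logits from the sketch above; the actual signature and internals of _compute_regression_loss may differ:

```python
import torch
import torch.nn.functional as F

def compute_weighted_regression_loss(
    bar_logits: torch.Tensor,    # (batch, num_bars) logits over the bars
    point_pred: torch.Tensor,    # (batch,) point prediction, e.g. the mean
    target: torch.Tensor,        # (batch,) regression targets, may contain NaNs
    target_bins: torch.Tensor,   # (batch,) bar index of each target
    ce_loss_weight: float = 0.0,
    rps_loss_weight: float = 1.0,
    mse_loss_weight: float = 1.0,
) -> torch.Tensor:
    # Mask out rows with NaN targets so they contribute to no loss term.
    valid = ~torch.isnan(target)
    bar_logits, point_pred = bar_logits[valid], point_pred[valid]
    target, target_bins = target[valid], target_bins[valid]

    ce = F.cross_entropy(bar_logits, target_bins)
    rps = rps_from_bar_logits(bar_logits, target_bins)
    mse = F.mse_loss(point_pred, target)
    return ce_loss_weight * ce + rps_loss_weight * rps + mse_loss_weight * mse
```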

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.

File overview:

  • src/tabpfn/finetuning/finetuned_regressor.py: Extended loss computation with RPS/RLS support, refactored the function signature and default parameters, and added comprehensive NaN handling
  • tests/test_finetuning_regressor.py: Added three new test functions validating RPS/RLS loss computation, NaN masking behavior, and correct loss values; renamed existing test functions to the double-underscore convention (a hypothetical example test is sketched after this list)
  • src/tabpfn/finetuning/finetuned_classifier.py: Renamed compute_classification_loss to _compute_classification_loss for consistency with the regressor changes
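To illustrate the kind of property such tests can check, here is a hypothetical example built on the rps_from_bar_logits sketch above; the actual tests in tests/test_finetuning_regressor.py may differ:

```python
import torch

def test_rps_near_zero_for_confident_correct_prediction():
    # A bar distribution that puts almost all mass on the target bar
    # should yield an RPS close to zero.
    bar_logits = torch.tensor([[100.0, 0.0, 0.0], [0.0, 100.0, 0.0]])
    target_bins = torch.tensor([0, 1])
    assert rps_from_bar_logits(bar_logits, target_bins).item() < 1e-6
```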


@noahho (Collaborator) left a comment

Great job with the implementation! Had some minor comments, after which LGTM!

bejaeger and others added 2 commits January 9, 2026 16:03
Co-authored-by: Jonas Landsgesell <jonaslandsgesell@gmail.com>
Co-authored-by: Pascal Knoll <knollpascal00@gmail.com>
@bejaeger changed the title from "Add rps losses to regressor" to "Add crps losses to regressor" on Jan 9, 2026
Co-authored-by: Pascal Knoll <knollpascal00@gmail.com>
@bejaeger added the "no changelog needed" label (PR does not require a changelog entry) on Jan 9, 2026
bejaeger and others added 2 commits January 9, 2026 16:16
Co-authored-by: Pascal Knoll <knollpascal@gmail.com>
Co-authored-by: Pascal Knoll <knollpascal00@gmail.com>
@jonaslandsgesell (Contributor) commented

Great work Ben, thanks a lot for the integration into the current version. Really looking forward to it 👍

@bejaeger merged commit b60cfca into main on Jan 9, 2026
13 checks passed