Releases · dholzmueller/probmetrics
v1.2.0
v1.2.0 by @elsacho: Added new proper loss functions:
- `ProperLpLoss(p=p)`: metrics to evaluate $E[ \Vert f(X) - E[Y \mid f(X)] \Vert_p ]$, where $f(X)$ are the predictions of the classifier and $p \ge 1$, including `p=float("inf")`.
- `TopClassLoss`: a wrapper to variationally evaluate top-class errors.
- `OverConfidenceLoss` & `UnderConfidenceLoss`: wrappers to variationally evaluate over-/under-confidence in binary predictors.
- `MetricsWithCalibration` can now handle arbitrary classifiers and Lp-type losses.
- New classifiers: added `WS_CatboostClassifier` and `WS_LGBMClassifier` for evaluating calibration errors.
- Removed the sklearn < 1.7 constraint.
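To make the $L_p$ loss concrete, here is a minimal NumPy sketch (not the probmetrics API) that estimates $E[ \Vert f(X) - E[Y \mid f(X)] \Vert_p ]$ on synthetic data. It constructs $f(X)$ as an injective transform of the true conditional probabilities, so $E[Y \mid f(X)]$ equals $P(Y \mid X)$ and no binning is needed; all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 10_000, 3

# True conditional class probabilities P(Y|X) and a sharpened
# (overconfident) classifier f(X).
logits = rng.normal(size=(n, k))
p_true = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
f_x = p_true ** 2
f_x /= f_x.sum(axis=1, keepdims=True)

# Squaring-and-renormalizing is injective on the simplex interior,
# so conditioning on f(X) recovers P(Y|X): E[Y | f(X)] = P(Y|X).
def lp_loss(pred, cond, p):
    """Sample-mean estimate of E[ || f(X) - E[Y|f(X)] ||_p ]."""
    return np.linalg.norm(pred - cond, ord=p, axis=1).mean()

print(lp_loss(f_x, p_true, 1))       # p = 1
print(lp_loss(f_x, p_true, np.inf))  # p = float("inf"): max over classes
```

Per sample, the $L_\infty$ norm over classes is bounded by the $L_1$ norm, so the second value is never larger than the first.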
v1.1.0
v1.1.0 by @eugeneberta: Improvements to the SVS and SMS calibrators:
- Logit pre-processing with `'ts-mix'` is now automatic, and the global scaling parameter $\alpha$ is fixed to 1. This yields:
  - improved performance on our tabular and computer vision benchmarks (see the arXiv v2 of the SMS paper, coming soon);
  - faster convergence;
  - the ability to compute the duality gap in closed form for stopping SAGA solvers, which we implement in this version.
- Improved L-BFGS solvers, much faster than in the previous version; now the default solver for SVS and SMS.
- The default binary calibrator in `LogisticCalibrator` is now quadratic scaling instead of affine scaling; this can be changed back with `LogisticCalibrator(binary_type='affine')`.
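To illustrate why a quadratic logit family can beat an affine one, here is a scikit-learn sketch (not the probmetrics implementation) that fits both as logistic regressions over transformed logit features. The sign-preserving quadratic term `sign(z) * z**2` is an assumed parametrization for illustration only; the library's exact quadratic form may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Toy binary logits z whose true label probability depends
# nonlinearly on z, so affine rescaling cannot fully calibrate them.
z = rng.normal(size=5_000)
p = sigmoid(0.4 * z + 0.3 * np.sign(z) * z**2)
y = (rng.uniform(size=z.shape) < p).astype(int)

# Affine scaling: calibrated logit = a*z + b (intercept gives b).
affine = LogisticRegression(C=1e6, max_iter=1000).fit(z[:, None], y)

# Quadratic scaling: adds an (assumed) sign-preserving quadratic term.
quad_feats = np.column_stack([z, np.sign(z) * z**2])
quad = LogisticRegression(C=1e6, max_iter=1000).fit(quad_feats, y)

affine_ll = log_loss(y, affine.predict_proba(z[:, None]))
quad_ll = log_loss(y, quad.predict_proba(quad_feats))
print("affine log loss:", affine_ll)
print("quad   log loss:", quad_ll)
```

Since the affine family is nested inside the quadratic one (set the quadratic coefficient to zero), the quadratic fit's log loss can only match or improve on the affine fit's.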
v1.0.0
What's Changed
- New post-hoc calibrators, including SMS, SVS, affine and quadratic scaling, by @eugeneberta in #1
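As background on the post-hoc calibration workflow these calibrators follow (fit on held-out logits, then rescale at test time), here is a NumPy sketch of temperature scaling, a one-parameter special case of affine scaling. This is a generic illustration, not the probmetrics API, and the toy model's overconfidence factor of 3 is an assumption of the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 4_000, 3

# Held-out validation set: true class probabilities, sampled labels,
# and logits from an overconfident model (scaled up by a factor of 3).
true_logits = rng.normal(size=(n, k))
p_true = np.exp(true_logits) / np.exp(true_logits).sum(axis=1, keepdims=True)
labels = np.array([rng.choice(k, p=p) for p in p_true])
logits = 3.0 * true_logits

def nll(logits, labels):
    """Mean negative log-likelihood under softmax(logits)."""
    ls = logits - logits.max(axis=1, keepdims=True)
    logp = ls - np.log(np.exp(ls).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(labels)), labels].mean()

# Fit a single temperature T on the validation set by scalar grid search;
# at test time the model's logits would be divided by best_t.
ts = np.linspace(0.1, 10.0, 200)
best_t = ts[np.argmin([nll(logits / t, labels) for t in ts])]
print("fitted temperature:", best_t)
```

The fitted temperature lands near 3, undoing the injected overconfidence, and the rescaled logits achieve a lower validation NLL than the raw ones.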
New Contributors
- @eugeneberta made their first contribution in #1
Full Changelog: v0.0.2...v1.0.0