Releases: ottenbreit-data-science/aplr

Minor bugfix

17 Oct 17:24

Fixed a bug affecting the combination of loss_function = "weibull" and validation_tuning_metric = "default". Previously, a response containing zeros was not handled correctly in this combination, causing the fitting procedure to terminate, and a workaround of adding a small constant to the response was required. That workaround is no longer necessary.
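
A minimal sketch of the previously problematic combination, assuming the Python APLRRegressor interface; the data are synthetic and only meant to include exact zeros in the response.

```python
import numpy as np
from aplr import APLRRegressor

# Illustrative data: a non-negative response containing exact zeros,
# which was previously the problematic case for the Weibull loss.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = np.maximum(rng.weibull(1.5, size=500) - 0.2, 0.0)  # some entries are exactly 0

# Adding a small constant to y is no longer needed as a workaround.
model = APLRRegressor(loss_function="weibull", validation_tuning_metric="default")
model.fit(X, y)
predictions = model.predict(X)
```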

Normalized gini

05 Sep 20:41

APLR now uses a normalized Gini coefficient when validation_tuning_metric is set to negative_gini.
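
For context, a brief sketch of selecting this metric, assuming the Python APLRRegressor interface; all other settings are left at their defaults.

```python
from aplr import APLRRegressor

# Selecting the (now normalized) Gini tuning metric; the parameter name and
# value are taken from the note above.
model = APLRRegressor(validation_tuning_metric="negative_gini")
```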

Changed the default value of a hyperparameter

25 Aug 10:19

Changed the default value of m from 20000 to 3000 to make the default hyperparameters less computationally demanding.

Removed an unnecessary parameter and updated the documentation

08 Aug 19:14

Removed an unnecessary C++ constructor parameter and updated the documentation.

Modified default tuning parameters. Added APLRTuner

03 Aug 12:14

Modified the default hyperparameters based on empirical tests on several OpenML and PMLB datasets. The learning rate (v) was increased from 0.1 to 0.5, min_observations_in_split was decreased from 20 to 4, ineligible_boosting_steps_added was increased from 10 to 15, and max_eligible_terms was increased from 5 to 7.

Also added the APLRTuner object, which simplifies tuning of APLR. See the example folder.
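
A hedged sketch that pins the updated defaults explicitly on APLRRegressor, assuming the Python interface; the values are taken from this note, and the APLRTuner usage itself is documented in the example folder rather than reproduced here.

```python
from aplr import APLRRegressor

# Pinning the new defaults explicitly; this is only needed if you want to
# override them or keep them fixed regardless of library version.
model = APLRRegressor(
    v=0.5,                               # learning rate, previously 0.1
    min_observations_in_split=4,         # previously 20
    ineligible_boosting_steps_added=15,  # previously 10
    max_eligible_terms=7,                # previously 5
)
```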

Bugfix

07 Jul 11:20

Fixed a bug that caused overly slow convergence when using the logit link. This also affected APLRClassifier, since it uses underlying logit models.

Improved sklearn compatibility, changed the default value for a hyperparameter

06 Jul 11:47

  • Improved sklearn compatibility for APLRClassifier by adding a classes_ field and a predict_proba method.
  • Changed the default value for the maximum number of boosting steps to try, m, from 3000 to 20000 to ensure convergence in most cases (if this becomes too slow, you can increase the learning rate v and reduce m; see the sketch after this list).
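
A minimal sketch combining both points, assuming the Python APLRClassifier interface; the data, labels, and parameter values are illustrative only.

```python
import numpy as np
from aplr import APLRClassifier

# Illustrative binary classification data; labels are passed as strings.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))
y = np.where(X[:, 0] + rng.normal(scale=0.5, size=300) > 0, "yes", "no")

# If the larger default m (20000 boosting steps) is too slow, increasing the
# learning rate v and reducing m is the suggested trade-off.
model = APLRClassifier(m=3000, v=0.5)
model.fit(X, y)

print(model.classes_)                    # class labels (sklearn compatibility)
probabilities = model.predict_proba(X)   # one column of probabilities per class
```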

Bugfix

24 Jun 10:58

APLR now also works with numpy 2.0.0.

Fixed a minor bug

22 Jun 15:39

Fixed a minor bug in the method get_unique_term_affiliation_shape: when invoked from one of the underlying logit APLRRegressor models in an APLRClassifier model, the method previously had no default value for the max_rows_before_sampling parameter. This has now been corrected.

Fixed minor bug related to verbosity

20 Jun 19:00

Previously, parts of the progress report during fitting could still be printed if verbosity was 0. This has now been corrected.