Commit 1955368

release: 1.4.0

1 parent 3db5244 commit 1955368

File tree

4 files changed: +18 -15 lines changed

DESCRIPTION

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 Package: mlr3tuning
 Title: Hyperparameter Optimization for 'mlr3'
-Version: 1.3.0.9000
+Version: 1.4.0
 Authors@R: c(
     person("Marc", "Becker", , "[email protected]", role = c("cre", "aut"),
            comment = c(ORCID = "0000-0002-8115-0400")),

NEWS.md

Lines changed: 3 additions & 2 deletions

@@ -1,10 +1,11 @@
-# mlr3tuning (development version)
+# mlr3tuning 1.4.0
 
 * feat: Resample stages from `CallbackResample` are now available in `CallbackBatchTuning` and `CallbackAsyncTuning`.
 * fix: The `$predict_type` was written to the model even when the `AutoTuner` was not trained.
 * feat: Internal tuned values are now visible in logs.
 * BREAKING CHANGE: Remove internal search space argument.
-
+* BREAKING CHANGE: The mlr3 ecosystem now has a base logger named `mlr3`.
+  The `mlr3/bbotk` logger is a child of the `mlr3` logger and is used for logging messages from the `bbotk` and `mlr3tuning` packages.
 
 # mlr3tuning 1.3.0
 
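The base logger introduced in the NEWS entry above can be configured through the lgr package that the mlr3 ecosystem uses for logging; a minimal sketch, assuming lgr and the mlr3 packages are installed:

```r
library(lgr)

# "mlr3/bbotk" is a child of the new "mlr3" base logger.
# Lower its threshold to silence tuning progress messages
# from bbotk and mlr3tuning only:
lgr::get_logger("mlr3/bbotk")$set_threshold("warn")

# Or adjust the whole mlr3 ecosystem at once via the parent logger:
lgr::get_logger("mlr3")$set_threshold("info")
```

Because loggers are hierarchical, settings on the `mlr3` parent propagate to `mlr3/bbotk` unless the child sets its own threshold.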

README.Rmd

Lines changed: 0 additions & 1 deletion

@@ -22,7 +22,6 @@ Package website: [release](https://mlr3tuning.mlr-org.com/) | [dev](https://mlr3
 <!-- badges: start -->
 [![r-cmd-check](https://github.com/mlr-org/mlr3tuning/actions/workflows/r-cmd-check.yml/badge.svg)](https://github.com/mlr-org/mlr3tuning/actions/workflows/r-cmd-check.yml)
 [![CRAN Status](https://www.r-pkg.org/badges/version-ago/mlr3tuning)](https://cran.r-project.org/package=mlr3tuning)
-[![StackOverflow](https://img.shields.io/badge/stackoverflow-mlr3-orange.svg)](https://stackoverflow.com/questions/tagged/mlr3)
 [![Mattermost](https://img.shields.io/badge/chat-mattermost-orange.svg)](https://lmmisld-lmu-stats-slds.srv.mwn.de/mlr_invite/)
 <!-- badges: end -->
 

README.md

Lines changed: 14 additions & 11 deletions

@@ -9,7 +9,6 @@ Package website: [release](https://mlr3tuning.mlr-org.com/) \|
 [![r-cmd-check](https://github.com/mlr-org/mlr3tuning/actions/workflows/r-cmd-check.yml/badge.svg)](https://github.com/mlr-org/mlr3tuning/actions/workflows/r-cmd-check.yml)
 [![CRAN
 Status](https://www.r-pkg.org/badges/version-ago/mlr3tuning)](https://cran.r-project.org/package=mlr3tuning)
-[![StackOverflow](https://img.shields.io/badge/stackoverflow-mlr3-orange.svg)](https://stackoverflow.com/questions/tagged/mlr3)
 [![Mattermost](https://img.shields.io/badge/chat-mattermost-orange.svg)](https://lmmisld-lmu-stats-slds.srv.mwn.de/mlr_invite/)
 <!-- badges: end -->
 
@@ -62,6 +61,8 @@ There are several sections about hyperparameter optimization in the
 - Simultaneously optimize hyperparameters and use [early
   stopping](https://mlr3book.mlr-org.com/chapters/chapter15/predsets_valid_inttune.html)
   with XGBoost.
+- [Automate](https://mlr3book.mlr-org.com/chapters/chapter4/hyperparameter_optimization.html#sec-autotuner)
+  the tuning.
 
 The [gallery](https://mlr-org.com/gallery-all-optimization.html)
 features a collection of case studies and demos about optimization.

@@ -128,14 +129,15 @@ instance = ti(
 instance
 ```
 
-## <TuningInstanceBatchSingleCrit>
-## * State: Not optimized
-## * Objective: <ObjectiveTuningBatch:classif.svm_on_sonar>
-## * Search Space:
+##
+## ── <TuningInstanceBatchSingleCrit> ─────────────────────────────────────────────────────────────────
+## • State: Not optimized
+## • Objective: <ObjectiveTuningBatch>
+## • Search Space:
 ##        id    class     lower    upper nlevels
 ##    1:  cost ParamDbl -11.51293 11.51293     Inf
 ##    2: gamma ParamDbl -11.51293 11.51293     Inf
-## * Terminator: <TerminatorNone>
+## Terminator: <TerminatorNone>
 
 We select a simple grid search as the optimization algorithm.
 
@@ -144,11 +146,12 @@ tuner = tnr("grid_search", resolution = 5)
 tuner
 ```
 
-## <TunerBatchGridSearch>: Grid Search
-## * Parameters: batch_size=1, resolution=5
-## * Parameter classes: ParamLgl, ParamInt, ParamDbl, ParamFct
-## * Properties: dependencies, single-crit, multi-crit
-## * Packages: mlr3tuning, bbotk
+##
+## ── <TunerBatchGridSearch>: Grid Search ─────────────────────────────────────────────────────────────
+## • Parameters: batch_size=1, resolution=5
+## • Parameter classes: <ParamLgl>, <ParamInt>, <ParamDbl>, and <ParamFct>
+## • Properties: dependencies, single-crit, and multi-crit
+## • Packages: mlr3tuning and bbotk
 
 To start the tuning, we simply pass the tuning instance to the tuner.
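The README sentence in the last context line above ("we simply pass the tuning instance to the tuner") corresponds to the tuner's documented `$optimize()` method; a minimal sketch, assuming the `instance` and `tuner` objects created in the preceding README snippets:

```r
# Run the grid search; the tuner evaluates configurations from the
# search space until the terminator stops it, and stores the best
# configuration in the instance.
tuner$optimize(instance)

# Inspect the best hyperparameter configuration and its performance:
instance$result
```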
