Commit 59635f2

add new documents
Signed-off-by: reiase <[email protected]>
1 parent fed5604 commit 59635f2

File tree

4 files changed: +194 -2 lines changed


docs/examples/optimization.md

Lines changed: 88 additions & 0 deletions
@@ -0,0 +1,88 @@
Hyper-Parameter Optimization
============================

This example is based on `optuna`'s [quick start example](https://optuna.org/#code_quickstart). [Optuna](https://optuna.org/) is an open-source hyperparameter optimization framework that is easy to use:

```python
import optuna

def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)

study.best_params  # E.g. {'x': 2.002108042}
```

The above example creates a `study` object to search for the best parameter `x` that minimizes the objective function `(x-2)^2`.

Parameter Searching with [`HyperParameter`](https://github.com/reiase/hyperparameter)
-----------------------------------------

Parameter searching can be much easier with [`HyperParameter`](https://github.com/reiase/hyperparameter):

```python
import optuna
from hyperparameter import param_scope, auto_param, lazy_dispatch

@auto_param
def objective(x=0.0):
    return (x - 2) ** 2

def wrapper(trial):
    trial = lazy_dispatch(trial)
    with param_scope(**{
        "objective.x": trial.suggest_float('objective.x', -10, 10)
    }):
        return objective()

study = optuna.create_study()
study.optimize(wrapper, n_trials=100)

study.best_params  # E.g. {'objective.x': 2.002108042}
```

We apply [the `auto_param` decorator](https://reiase.github.io/hyperparameter/quick_start/#auto_param) directly to the objective function so that it accepts parameters from [`param_scope`](https://reiase.github.io/hyperparameter/quick_start/#param_scope). We then define a wrapper function that adapts the `param_scope` API to `optuna`'s `trial` API, and start the parameter experiment as suggested in `optuna`'s example.
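
Outside of `optuna`, the same override mechanism can be exercised directly. Below is a minimal sketch, assuming the `auto_param`-decorated `objective` defined above is in scope; the values in the comments follow from `(x-2)^2`:

```python
# Minimal sketch: how a param_scope key overrides the auto_param default.
objective()  # no scope active: uses the default x = 0.0, returns 4.0

with param_scope(**{"objective.x": 3.0}):
    objective()  # "objective.x" overrides x, returns 1.0
```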

Put the Best Parameters into Production
---------------------------------------

To put the best parameters into production, we can pass them directly to `param_scope`. This is very convenient when deploying an ML model.

```python
with param_scope(**study.best_params):
    print(f"{study.best_params} => {objective()}")
```

Optimization on Nested Functions
--------------------------------

`param_scope` and `auto_param` also support complex problems with nested functions:

```python
@auto_param
def objective_x(x=0.0):
    return (x - 2) ** 2

@auto_param
def objective_y(y=0.0):
    return (y - 1) ** 3

def objective():
    return objective_x() * objective_y()

def wrapper(trial):
    trial = lazy_dispatch(trial)
    with param_scope(**{
        "objective_x.x": trial.suggest_float('objective_x.x', -10, 10),
        "objective_y.y": trial.suggest_float('objective_y.y', -10, 10)
    }):
        return objective()

study = optuna.create_study()
study.optimize(wrapper, n_trials=100)

study.best_params  # E.g. {'objective_x.x': ..., 'objective_y.y': ...}
```

docs/examples/optimization.zh.md

Lines changed: 92 additions & 0 deletions
@@ -0,0 +1,92 @@
Parameter Optimization
======================

The original example comes from the `optuna` project's [quick start example](https://optuna.org/#code_quickstart). [Optuna](https://optuna.org/) is an easy-to-use, open-source hyperparameter optimization framework:

```python
import optuna

def objective(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

study = optuna.create_study()
study.optimize(objective, n_trials=100)

study.best_params  # E.g. {'x': 2.002108042}
```

The code above creates a `study` object that searches for the value of the parameter `x` that minimizes the objective function `(x-2)^2`.

Parameter Searching with [`HyperParameter`](https://github.com/reiase/hyperparameter)
-----------------------------------------

With [`HyperParameter`](https://github.com/reiase/hyperparameter), the search above becomes much simpler:

```python
import optuna
from hyperparameter import param_scope, auto_param, lazy_dispatch

@auto_param
def objective(x=0.0):
    return (x - 2) ** 2

def wrapper(trial):
    trial = lazy_dispatch(trial)
    with param_scope(**{
        "objective.x": trial.suggest_float('objective.x', -10, 10)
    }):
        return objective()

study = optuna.create_study()
study.optimize(wrapper, n_trials=100)

study.best_params  # E.g. {'objective.x': 2.002108042}
```

With the [`auto_param`](https://reiase.github.io/hyperparameter/quick_start/#auto_param) decorator we parameterize the objective function so that it can read its parameters from [`param_scope`](https://reiase.github.io/hyperparameter/quick_start/#param_scope). We then define a wrapper function that adapts the `param_scope` API to `optuna`'s `trial` API and start the hyperparameter search.

The benefit of using `auto_param` and `param_scope` is that the code is no longer coupled to `optuna`, so it can be reused directly in production code.
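
As a small illustration of that decoupling, the tuned function can be driven without importing `optuna` at all. The JSON string below is a hypothetical stand-in for a config exported from the study; it is not part of the original docs:

```python
import json

# Hypothetical: the best parameters were exported earlier and stored as plain JSON.
best = json.loads('{"objective.x": 2.002108042}')

with param_scope(**best):  # objective and param_scope as defined/imported above
    objective()            # runs with the tuned value, no optuna dependency
```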

Deploying to Production
-----------------------

Passing the best parameters found by `study` directly to `param_scope` reproduces the experiment result and makes production deployment straightforward.

```python
with param_scope(**study.best_params):
    print(f"{study.best_params} => {objective()}")
```

Optimization over Nested Functions
----------------------------------

`param_scope` and `auto_param` can also handle more complex problems where the parameters live in nested functions, for example:

```python
@auto_param
def objective_x(x=0.0):
    return (x - 2) ** 2

@auto_param
def objective_y(y=0.0):
    return (y - 1) ** 3

def objective():
    return objective_x() * objective_y()

def wrapper(trial):
    trial = lazy_dispatch(trial)
    with param_scope(**{
        "objective_x.x": trial.suggest_float('objective_x.x', -10, 10),
        "objective_y.y": trial.suggest_float('objective_y.y', -10, 10)
    }):
        return objective()

study = optuna.create_study()
study.optimize(wrapper, n_trials=100)

study.best_params  # E.g. {'objective_x.x': ..., 'objective_y.y': ...}
```

Using `auto_param` avoids passing the `trial` object between nested functions, which keeps the code more natural and direct.
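
For contrast, a plain-`optuna` version of the same nested problem would have to thread `trial` through every level. This is a hypothetical sketch, not taken from the original docs:

```python
import optuna

# Hypothetical optuna-only variant: every nested function must accept and
# forward the trial object explicitly.
def objective_x(trial):
    x = trial.suggest_float('x', -10, 10)
    return (x - 2) ** 2

def objective_y(trial):
    y = trial.suggest_float('y', -10, 10)
    return (y - 1) ** 3

def objective(trial):
    return objective_x(trial) * objective_y(trial)

study = optuna.create_study()
study.optimize(objective, n_trials=100)  # trial is injected by optuna itself
```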

docs/index.zh.md

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
../README.zh.md

mkdocs.yml

Lines changed: 13 additions & 2 deletions
@@ -11,6 +11,13 @@ plugins:
         en: "English"
         zh: "中文"
       default_language: "en"
+      translate_nav:
+        zh:
+          home: 首页
+          quick: 快速开始
+        en:
+          home: Home
+          quick: Quick Start
   - mkdocstrings:
       handlers:
         python:
@@ -22,9 +29,13 @@ plugins:
           show_submodules: no

 nav:
-  - Home: index.md
-  - Quick Start: quick_start.md
+  - home: index.md
+  - home: index.zh.md
+  - quick: quick_start.md
   - Best Practice: structured_parameter.md
+  - Examples:
+      - Hyperparameter Optimization: examples/optimization.md
+      - 参数优化: examples/optimization.zh.md
   - Reference: reference.md

 watch:
