
Commit f20d95d

Merge pull request #5 from sktime/feature/ci
Feature/ci
2 parents: 3d59c7d + b20810e

97 files changed: +43 −31758 lines


README.md

Lines changed: 25 additions & 0 deletions
@@ -1,2 +1,27 @@
 # littlebooktimeseries
+
 Timeseries in 3 hours
+
+## Environment setup
+
+Choose one of the options below to get the workshop notebooks ready.
+
+### Poetry
+- Install Poetry: `curl -sSL https://install.python-poetry.org | python3 -`
+- Install dependencies: `poetry install`
+- Start the shell: `poetry shell`
+- Launch Jupyter: `jupyter lab`
+
+### pip
+- Create a virtual environment: `python3 -m venv .venv`
+- Activate it: `source .venv/bin/activate`
+- Install dependencies: `pip install -r requirements.txt`
+- Launch Jupyter: `jupyter lab`
+
+### uv
+- Install uv: `pip install uv`
+- Sync dependencies: `uv sync`
+- Activate the environment: `source .venv/bin/activate`
+- Launch Jupyter: `jupyter lab`
+
+> All options require Python 3.11 or 3.12. The notebooks live under `notebooks/`; open them in Jupyter Lab during the session.
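The pip option above can be sketched as a single shell sequence. This is illustrative, not part of the commit: it assumes `python3` (3.11 or 3.12) is on PATH and that you run it from the repository root, where the README implies a `requirements.txt` lives.

```shell
# Sketch of the pip route from the README (Linux/macOS).
python3 -m venv .venv                        # create the virtual environment
. .venv/bin/activate                         # activate it
python -c "import sys; print(sys.prefix)"    # confirms the venv interpreter is active
# Next steps, inside the repo:
#   pip install -r requirements.txt && jupyter lab
```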

book/content/pt/part2/hierarchical_forecasting.qmd

Lines changed: 2 additions & 2 deletions
@@ -105,7 +105,7 @@ from tsbook.forecasting.reduction import ReductionForecaster
 from lightgbm import LGBMRegressor

 forecaster = ReductionForecaster(
-    LGBMRegressor(n_estimators=50, verbose=-1, objective="tweedie"),
+    LGBMRegressor(n_estimators=100, verbose=-1, objective="tweedie", random_state=42),
     window_length=30,
     normalization_strategy="divide_mean",
 )
@@ -296,7 +296,7 @@ pd.DataFrame(
         "OptimalReconciler (ols)": metric(y_test, y_pred_optimal, y_train=y_train),
         "Mint Reconciler": metric(y_test, y_pred_mint, y_train=y_train),
     },
-    index=["Mean Absolute Scaled Error"],
+    index=["Mean Absolute Squared Error"],
 )
 ```
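The second hunk in this file compares an `OptimalReconciler (ols)` against a MinT reconciler. As a minimal illustration of the OLS reconciliation idea (not tsbook's implementation, which this diff does not show): given a summing matrix `S`, incoherent base forecasts are projected onto the space of coherent ones.

```python
import numpy as np

# Toy two-level hierarchy: rows are [total, series A, series B],
# columns are the bottom series A and B.
S = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# Incoherent base forecasts: the total (10) != A + B (4 + 5).
y_hat = np.array([10.0, 4.0, 5.0])

# OLS reconciliation: least-squares projection onto the column space of S.
P = np.linalg.inv(S.T @ S) @ S.T
y_tilde = S @ (P @ y_hat)   # coherent: y_tilde[0] == y_tilde[1] + y_tilde[2]
```

MinT follows the same projection but weights it by an estimate of the forecast-error covariance instead of treating all levels equally.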

book/content/pt/part2/ml_models.qmd

Lines changed: 3 additions & 3 deletions
@@ -109,10 +109,10 @@ First, let's import `ReductionForecaster`, which is the class that implements the

 ```{python}
 from tsbook.forecasting.reduction import ReductionForecaster
-from sklearn.ensemble import RandomForestRegressor
+from lightgbm import LGBMRegressor

 model = ReductionForecaster(
-    RandomForestRegressor(n_estimators=100, random_state=42),
+    LGBMRegressor(n_estimators=100, random_state=42),
     window_length=30,
 )

@@ -135,7 +135,7 @@ One solution is to use differencing to remove the trend from the series.
 ```{python}
 from sktime.transformations.series.difference import Differencer

-regressor = RandomForestRegressor(n_estimators=100, random_state=42)
+regressor = LGBMRegressor(n_estimators=100, random_state=42)

 model = Differencer() * ReductionForecaster(
     regressor,
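The forecaster being swapped here reduces forecasting to tabular regression: a sliding window of past values becomes the feature row for the next value. A minimal numpy sketch of that reduction idea, with a least-squares fit standing in for the LGBMRegressor (tsbook's actual `ReductionForecaster` internals are not shown in this diff):

```python
import numpy as np

def make_reduction_matrix(y, window_length):
    """Turn a 1-D series into (lag-window, next-value) training pairs."""
    X = np.array([y[i:i + window_length] for i in range(len(y) - window_length)])
    t = y[window_length:]
    return X, t

rng = np.random.default_rng(42)
y = np.cumsum(rng.normal(size=200))          # synthetic random-walk series

X, t = make_reduction_matrix(y, window_length=30)
coef, *_ = np.linalg.lstsq(X, t, rcond=None)  # linear stand-in for the regressor
y_next = y[-30:] @ coef                       # one-step-ahead forecast
```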

book/content/pt/part2/panel_data.qmd

Lines changed: 2 additions & 3 deletions
@@ -165,7 +165,7 @@ from tsbook.forecasting.reduction import ReductionForecaster
 from lightgbm import LGBMRegressor

 global_forecaster1 = ReductionForecaster(
-    LGBMRegressor(n_estimators=100, verbose=-1),
+    LGBMRegressor(n_estimators=100, verbose=-1, random_state=42),
     window_length=30,
 )

@@ -219,7 +219,6 @@ We know how to preprocess univariate time series to improve performance
 ```{python}
 from sktime.transformations.series.difference import Differencer

-
 global_forecaster2 = Differencer() * global_forecaster1
 global_forecaster2.fit(y_train, X_train)
 ```
@@ -249,7 +248,7 @@ errors["Local (2)"] = metric_local2
 errors
 ```

-Note how we have already beaten the initial global model, and the local model. This is to
+Note how we have already beaten the initial global model. This is to
 highlight that it is **essential** to perform good preprocessing and feature engineering for Machine Learning models to perform well on panel data.

 ### Window normalization
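The "Window normalization" section this hunk borders on, like the `normalization_strategy="divide_mean"` argument seen in the hierarchical-forecasting file, rescales each lag window so panel series with very different levels feed comparable inputs to one global regressor. A hypothetical sketch of per-window mean scaling (the function name and guard are illustrative, not tsbook's code):

```python
import numpy as np

def divide_mean(window):
    # Scale a lag window by its own mean so high- and low-volume series
    # with the same shape become identical inputs to the global model.
    m = np.mean(window)
    return window / m if m != 0 else window

w1 = np.array([10.0, 12.0, 8.0])   # high-volume series
w2 = np.array([1.0, 1.2, 0.8])     # low-volume series with the same shape
print(divide_mean(w1), divide_mean(w2))  # identical normalized windows
```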

book/latest/content/pt/extra/sktime_custom.html

Lines changed: 0 additions & 4422 deletions
This file was deleted.

book/latest/content/pt/part1/components_and_diff.html

Lines changed: 0 additions & 2761 deletions
This file was deleted.
