
Commit e6a5582

fix links in tutorials
1 parent e61319d commit e6a5582

11 files changed (+71, -71 lines)

nbs/docs/tutorials/AnomalyDetection.ipynb

Lines changed: 5 additions & 5 deletions
@@ -19,7 +19,7 @@
     "\n",
     "## Prerequisites\n",
     "\n",
-    "This tutorial assumes basic familiarity with StatsForecast. For a minimal example visit the [Quick Start](../getting-started/1_Getting_Started_short)\n",
+    "This tutorial assumes basic familiarity with StatsForecast. For a minimal example visit the [Quick Start](../getting-started/getting_started_short.html)\n",
     ":::"
     ]
 },
@@ -32,7 +32,7 @@
     "\n",
     "Anomaly detection is a crucial task in time series forecasting. It involves identifying unusual observations that don't follow the expected dataset patterns. Anomalies, also known as outliers, can be caused by a variety of factors, such as errors in the data collection process, sudden changes in the underlying patterns of the data, or unexpected events. They can pose problems for many forecasting models since they can distort trends, seasonal patterns, or autocorrelation estimates. As a result, anomalies can have a significant impact on the accuracy of the forecasts, and for this reason, it is essential to be able to identify them. Furthermore, anomaly detection has many applications across different industries, such as detecting fraud in financial data, monitoring the performance of online services, or identifying usual patterns in energy usage.\n",
     "\n",
-    "By the end of this tutorial, you'll have a good understanding of how to detect anomalies in time series data using [StatsForecast](../../index)'s probabilistic models. "
+    "By the end of this tutorial, you'll have a good understanding of how to detect anomalies in time series data using [StatsForecast](../../index.html)'s probabilistic models. "
     ]
 },
 {
@@ -91,7 +91,7 @@
     "id": "a4a91923-d351-4a41-aad9-987cb050c8bd",
     "metadata": {},
     "source": [
-    "We assume that you have StatsForecast already installed. If not, check this guide for instructions on [how to install StatsForecast](../getting-started/0_Installation)"
+    "We assume that you have StatsForecast already installed. If not, check this guide for instructions on [how to install StatsForecast](../getting-started/installation.html)"
     ]
 },
 {
@@ -316,7 +316,7 @@
     "id": "e3a7fde0-ecd2-4c05-8be3-f0a0d6dc4dc4",
     "metadata": {},
     "source": [
-    "To generate the forecast, we'll use the [MSTL](../models/MultipleSeasonalTrend) model, which is well-suited for low-frequency data like the one used here. We first need to import it from `statsforecast.models` and then we need to instantiate it. Since we're using hourly data, we have two seasonal periods: one every 24 hours (hourly) and one every 24\*7 hours (daily). Hence, we need to set `season_length = [24, 24*7]`. "
+    "To generate the forecast, we'll use the [MSTL](../../src/core/models.html#multipleseasonaltrend) model, which is well-suited for low-frequency data like the one used here. We first need to import it from `statsforecast.models` and then we need to instantiate it. Since we're using hourly data, we have two seasonal periods: one every 24 hours (hourly) and one every 24\*7 hours (daily). Hence, we need to set `season_length = [24, 24*7]`. "
     ]
 },
 {
@@ -803,7 +803,7 @@
     "id": "2154b305-9a21-4eb3-b1e7-38be5d78572c",
     "metadata": {},
     "source": [
-    "Here we identified the anomalies in the data using the MSTL model, but any [probabilistic model](../../models) from StatsForecast can be used. We also selected the 99% prediction interval of the insample forecasts, but other confidence levels can be used as well. "
+    "Here we identified the anomalies in the data using the MSTL model, but any [probabilistic model](../../src/core/models.html) from StatsForecast can be used. We also selected the 99% prediction interval of the insample forecasts, but other confidence levels can be used as well. "
     ]
 },
 {
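The anomaly rule this tutorial documents is simple to sketch: an observation is anomalous when it falls outside the insample 99% prediction band. A minimal pandas sketch of that flagging step, assuming the insample forecasts are already in a DataFrame whose band columns follow StatsForecast's `<model>-lo-<level>`/`<model>-hi-<level>` naming convention (the toy data here is illustrative):

```python
import pandas as pd

def flag_anomalies(insample: pd.DataFrame, model: str = "MSTL", level: int = 99) -> pd.DataFrame:
    """Mark rows whose observed value falls outside the prediction band."""
    lo, hi = f"{model}-lo-{level}", f"{model}-hi-{level}"
    out = insample.copy()
    out["anomaly"] = (out["y"] < out[lo]) | (out["y"] > out[hi])
    return out

# Toy insample frame: one clear outlier at the third row.
df = pd.DataFrame({
    "ds": pd.date_range("2024-01-01", periods=4, freq="h"),
    "y": [10.0, 11.0, 50.0, 9.5],
    "MSTL-lo-99": [8.0, 8.0, 8.0, 8.0],
    "MSTL-hi-99": [12.0, 12.0, 12.0, 12.0],
})
flagged = flag_anomalies(df)
print(flagged["anomaly"].tolist())  # [False, False, True, False]
```

In the tutorial itself the insample frame comes from fitting a probabilistic model with `level=[99]`; only the comparison step is shown here.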

nbs/docs/tutorials/ConformalPrediction.ipynb

Lines changed: 8 additions & 8 deletions
@@ -47,7 +47,7 @@
     "\n",
     "## Prerequisites\n",
     "\n",
-    "This tutorial assumes basic familiarity with StatsForecast. For a minimal example visit the [Quick Start](../getting-started/1_Getting_Started_short)"
+    "This tutorial assumes basic familiarity with StatsForecast. For a minimal example visit the [Quick Start](../getting-started/getting_started_short.html)"
     ]
 },
 {
@@ -90,7 +90,7 @@
     "source": [
     "## Models with Native Prediction Intervals\n",
     "\n",
-    "For models that already provide forecast distributions (like AutoARIMA, AutoETS), check [Prediction Intervals](./UncertaintyIntervals). Conformal prediction is particularly useful for models that only produce point forecasts, or when you want distribution-free intervals."
+    "For models that already provide forecast distributions (like AutoARIMA, AutoETS), check [Prediction Intervals](./uncertaintyintervals.html). Conformal prediction is particularly useful for models that only produce point forecasts, or when you want distribution-free intervals."
     ]
 },
 {
@@ -117,7 +117,7 @@
     "- **Financial forecasting**: Risk management with calibrated intervals\n",
     "- **Production models**: Any black-box forecasting model requiring uncertainty quantification\n",
     "\n",
-    "[StatsForecast](../../index) implements conformal prediction for all available models, making it easy to add calibrated prediction intervals to any forecasting pipeline."
+    "[StatsForecast](../../index.html) implements conformal prediction for all available models, making it easy to add calibrated prediction intervals to any forecasting pipeline."
     ]
 },
 {
@@ -126,7 +126,7 @@
     "source": [
     "## Install libraries \n",
     "\n",
-    "We assume that you have StatsForecast already installed. If not, check this guide for instructions on [how to install StatsForecast](../getting-started/0_Installation)"
+    "We assume that you have StatsForecast already installed. If not, check this guide for instructions on [how to install StatsForecast](../getting-started/installation.html)"
     ]
 },
 {
@@ -321,9 +321,9 @@
     "\n",
     "StatsForecast makes it simple to add conformal prediction to any forecasting model. We'll demonstrate with models that don't natively provide prediction intervals:\n",
     "\n",
-    "- **[SeasonalExponentialSmoothing](../models/SimpleExponentialSmoothing)**: A simple smoothing model\n",
-    "- **[ADIDA](../models/ADIDA)**: Aggregation method for intermittent demand\n",
-    "- **[ARIMA](../models/ARIMA)**: Traditional statistical model (to show distribution-free intervals)\n",
+    "- **[SeasonalExponentialSmoothing](../../src/core/models.html#SimpleExponentialSmoothing)**: A simple smoothing model\n",
+    "- **[ADIDA](../../src/core/models.html#adida)**: Aggregation method for intermittent demand\n",
+    "- **[ARIMA](../../src/core/models.html#ARIMA)**: Traditional statistical model (to show distribution-free intervals)\n",
     "\n",
     "### Setting Up Conformal Intervals\n",
     "\n",
@@ -858,7 +858,7 @@
     "- Try conformal prediction on your own forecasting problems\n",
     "- Experiment with different `n_windows` values for optimal calibration\n",
     "- Compare with native prediction intervals from statistical models\n",
-    "- Explore [advanced uncertainty quantification methods](./UncertaintyIntervals)\n",
+    "- Explore [advanced uncertainty quantification methods](./uncertaintyintervals.html)\n",
     "\n",
     "## Acknowledgements\n",
     "\n",

nbs/docs/tutorials/CrossValidation.ipynb

Lines changed: 6 additions & 6 deletions
@@ -17,7 +17,7 @@
     "\n",
     "## Prerequisites\n",
     "\n",
-    "This tutorial assumes basic familiarity with StatsForecast. For a minimal example visit the [Quick Start](../getting-started/1_Getting_Started_short)\n",
+    "This tutorial assumes basic familiarity with StatsForecast. For a minimal example visit the [Quick Start](../getting-started/getting_started_short.html)\n",
     ":::"
     ]
 },
@@ -36,7 +36,7 @@
     "cell_type": "markdown",
     "metadata": {},
     "source": [
-    "[Statsforecast](../../statsforecast) has an implementation of time series cross-validation that is fast and easy to use. This implementation makes cross-validation a distributed operation, which makes it less time-consuming. In this notebook, we'll use it on a subset of the [M4 Competition](https://www.sciencedirect.com/science/article/pii/S0169207019301128) hourly dataset. "
+    "[Statsforecast](../../src/core/core.html#statsforecast) has an implementation of time series cross-validation that is fast and easy to use. This implementation makes cross-validation a distributed operation, which makes it less time-consuming. In this notebook, we'll use it on a subset of the [M4 Competition](https://www.sciencedirect.com/science/article/pii/S0169207019301128) hourly dataset. "
     ]
 },
 {
@@ -72,7 +72,7 @@
     "cell_type": "markdown",
     "metadata": {},
     "source": [
-    "We assume that you have StatsForecast already installed. If not, check this guide for instructions on [how to install StatsForecast](../getting-started/0_Installation)"
+    "We assume that you have StatsForecast already installed. If not, check this guide for instructions on [how to install StatsForecast](../getting-started/installation.html)"
     ]
 },
 {
@@ -288,7 +288,7 @@
     "cell_type": "markdown",
     "metadata": {},
     "source": [
-    "For this example, we'll use StatsForecast [AutoETS](../models/AutoETS). We first need to import it from `statsforecast.models` and then we need to instantiate a new `StatsForecast` object. "
+    "For this example, we'll use StatsForecast [AutoETS](../../src/core/models.html#autoets). We first need to import it from `statsforecast.models` and then we need to instantiate a new `StatsForecast` object. "
     ]
 },
 {
@@ -297,7 +297,7 @@
     "source": [
     "The `StatsForecast` object has the following parameters: \n",
     "\n",
-    "- models: a list of models. Select the models you want from [models](../../models/) and import them.\n",
+    "- models: a list of models. Select the models you want from [models](../../src/core/models.html) and import them.\n",
     "- freq: a string indicating the frequency of the data. See [panda’s available frequencies.](https://pandas.pydata.org/pandas-docs/stable/user_guide/timeseries.html#offset-aliases)\n",
     "- n_jobs: int, number of jobs used in the parallel processing, use -1 for all cores.\n",
     "\n",
@@ -610,7 +610,7 @@
     "metadata": {},
     "source": [
     "::: {.callout-tip}\n",
-    "Cross validation is especially useful when comparing multiple models. Here's an [example](../getting-started/2_Getting_Started_complete) with multiple models and time series. \n",
+    "Cross validation is especially useful when comparing multiple models. Here's an [example](../getting-started/getting_started_complete.html) with multiple models and time series. \n",
     "::: "
     ]
 },
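The windowing that this tutorial's `cross_validation` call automates can be sketched directly: given the series length, horizon `h`, number of windows, and step size, each window trains on everything before a cutoff and evaluates on the next `h` points. A simplified sketch of the cutoff arithmetic only (StatsForecast's real implementation additionally refits per window and parallelizes across series):

```python
def cv_cutoffs(n_obs: int, h: int, n_windows: int, step_size: int):
    """Train-end indices chosen so every window has h points left to test."""
    last = n_obs - h                          # cutoff of the final window
    first = last - (n_windows - 1) * step_size
    return [first + i * step_size for i in range(n_windows)]

# 100 hourly points, forecast 24 ahead, 3 windows spaced one day apart.
print(cv_cutoffs(100, h=24, n_windows=3, step_size=24))  # [28, 52, 76]
```

Each cutoff `c` yields a train split `y[:c]` and a test split `y[c:c + h]`, which is what the merged cross-validation frame is scored against.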

nbs/docs/tutorials/ElectricityLoadForecasting.ipynb

Lines changed: 6 additions & 6 deletions
@@ -289,15 +289,15 @@
     "id": "b0fb7054-0695-450c-9b77-e637923ea481",
     "metadata": {},
     "source": [
-    "The [MSTL](../models/MultipleSeasonalTrend) (Multiple Seasonal-Trend decomposition using LOESS) model, originally developed by [Kasun Bandara, Rob J Hyndman and Christoph Bergmeir](https://arxiv.org/abs/2107.13462), decomposes the time series in multiple seasonalities using a Local Polynomial Regression (LOESS). Then it forecasts the trend using a custom non-seasonal model and each seasonality using a [SeasonalNaive](../../models#class-seasonalnaive) model."
+    "The [MSTL](../../src/core/models.html#multipleseasonaltrend) (Multiple Seasonal-Trend decomposition using LOESS) model, originally developed by [Kasun Bandara, Rob J Hyndman and Christoph Bergmeir](https://arxiv.org/abs/2107.13462), decomposes the time series in multiple seasonalities using a Local Polynomial Regression (LOESS). Then it forecasts the trend using a custom non-seasonal model and each seasonality using a [SeasonalNaive](../../src/core/models.html#seasonalnaive) model."
     ]
 },
 {
     "cell_type": "markdown",
     "id": "4df75d56-1435-4bab-bc74-20acbebfc2c7",
     "metadata": {},
     "source": [
-    "`StatsForecast` contains a fast implementation of the [MSTL](../models/MultipleSeasonalTrend) model. Also, the decomposition of the time series can be calculated."
+    "`StatsForecast` contains a fast implementation of the [MSTL](../../src/core/models.html#multipleseasonaltrend) model. Also, the decomposition of the time series can be calculated."
     ]
 },
 {
@@ -317,7 +317,7 @@
     "id": "4f8072ee-03bb-4a82-b1d7-300d48ad2512",
     "metadata": {},
     "source": [
-    "First we must define the model parameters. As mentioned before, the electricity load presents seasonalities every 24 hours (Hourly) and every 24 * 7 (Daily) hours. Therefore, we will use `[24, 24 * 7]` as the seasonalities that the [MSTL](../models/MultipleSeasonalTrend) model receives. We must also specify the manner in which the trend will be forecasted. In this case we will use the [AutoARIMA](../models/AutoARIMA) model."
+    "First we must define the model parameters. As mentioned before, the electricity load presents seasonalities every 24 hours (Hourly) and every 24 * 7 (Daily) hours. Therefore, we will use `[24, 24 * 7]` as the seasonalities that the [MSTL](../../src/core/models.html#multipleseasonaltrend) model receives. We must also specify the manner in which the trend will be forecasted. In this case we will use the [AutoARIMA](../../src/core/models.html#autoarima) model."
     ]
 },
 {
@@ -396,7 +396,7 @@
     "Once the model is fitted, we can access the decomposition using the `fitted_` attribute of `StatsForecast`. This attribute stores all relevant information of the fitted models for each of the time series. \n",
     "\n",
     "\n",
-    "In this case we are fitting a single model for a single time series, so by accessing the fitted_ location [0, 0] we will find the relevant information of our model. The [MSTL](../../models#class-mstl) class generates a `model_` attribute that contains the way the series was decomposed."
+    "In this case we are fitting a single model for a single time series, so by accessing the fitted_ location [0, 0] we will find the relevant information of our model. The [MSTL](../../src/core/models.html#mstl) class generates a `model_` attribute that contains the way the series was decomposed."
     ]
 },
 {
@@ -589,7 +589,7 @@
     "id": "489f379b-a420-434a-99cd-3e5627eaa0ea",
     "metadata": {},
     "source": [
-    "We observe that there is a clear trend towards the high (orange line). This component would be predicted with the [AutoARIMA](../models/AutoARIMA) model. We can also observe that every 24 hours and every `24 * 7` hours there is a very well defined pattern. These two components will be forecast separately using a [SeasonalNaive](../../models#class-seasonalnaive) model. "
+    "We observe that there is a clear trend towards the high (orange line). This component would be predicted with the [AutoARIMA](../../src/core/models.html#autoarima) model. We can also observe that every 24 hours and every `24 * 7` hours there is a very well defined pattern. These two components will be forecast separately using a [SeasonalNaive](../../src/core/models.html#seasonalnaive) model. "
     ]
 },
 {
@@ -826,7 +826,7 @@
     "id": "aeec9c71-3414-4430-9479-f4c0e64d5e67",
     "metadata": {},
     "source": [
-    "In addition to the `MSTL` model, we will include the [SeasonalNaive](../../models#class-seasonalnaive) model as a benchmark to validate the added value of the `MSTL` model. Including `StatsForecast` models is as simple as adding them to the list of models to be fitted."
+    "In addition to the `MSTL` model, we will include the [SeasonalNaive](../../src/core/models.html#seasonalnaive) model as a benchmark to validate the added value of the `MSTL` model. Including `StatsForecast` models is as simple as adding them to the list of models to be fitted."
     ]
 },
 {
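The SeasonalNaive rule that this tutorial repeatedly links — and that MSTL uses to forecast each extracted seasonal component — is easy to state: repeat the last observed full seasonal cycle. A minimal numpy sketch of that rule only (the trend component is handled separately, e.g. by AutoARIMA, and is not shown here):

```python
import numpy as np

def seasonal_naive(y, season_length: int, h: int):
    """Forecast h steps ahead by cycling the last observed season."""
    y = np.asarray(y, dtype=float)
    last_season = y[-season_length:]
    return np.array([last_season[i % season_length] for i in range(h)])

# Toy series with a 4-step season; the last cycle [5, 6, 7, 8] repeats.
print(seasonal_naive([1, 2, 3, 4, 5, 6, 7, 8], season_length=4, h=6))
# [5. 6. 7. 8. 5. 6.]
```

With `season_length=[24, 24 * 7]` as in the tutorial, MSTL applies this rule once per seasonal component and sums the pieces with the trend forecast.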

nbs/docs/tutorials/ElectricityPeakForecasting.ipynb

Lines changed: 6 additions & 6 deletions
@@ -61,7 +61,7 @@
     "cell_type": "markdown",
     "metadata": {},
     "source": [
-    "We assume you have StatsForecast already installed. Check this guide for instructions on [how to install StatsForecast](../getting-started/0_Installation).\n",
+    "We assume you have StatsForecast already installed. Check this guide for instructions on [how to install StatsForecast](../getting-started/installation.html).\n",
     "\n",
     "Install the necessary packages using `pip install statsforecast`"
     ]
@@ -173,7 +173,7 @@
     "metadata": {},
     "source": [
     ":::{.callout-tip}\n",
-    "Check our detailed explanation and tutorial on MSTL [here](../tutorials/MultipleSeasonalities)\n",
+    "Check our detailed explanation and tutorial on MSTL [here](./multipleseasonalities.html)\n",
     ":::"
     ]
 },
@@ -201,7 +201,7 @@
     "cell_type": "markdown",
     "metadata": {},
     "source": [
-    "First, instantiate the model and define the parameters. The electricity load presents seasonalities every 24 hours (Hourly) and every 24 * 7 (Daily) hours. Therefore, we will use `[24, 24 * 7]` as the seasonalities. See [this link](https://robjhyndman.com/hyndsight/seasonal-periods/) for a detailed explanation on how to set seasonal lengths. In this example we use the `SklearnModel` with a `LinearRegression` model for the trend component, however, any StatsForecast model can be used. The complete list of models is available [here](../../models)."
+    "First, instantiate the model and define the parameters. The electricity load presents seasonalities every 24 hours (Hourly) and every 24 * 7 (Daily) hours. Therefore, we will use `[24, 24 * 7]` as the seasonalities. See [this link](https://robjhyndman.com/hyndsight/seasonal-periods/) for a detailed explanation on how to set seasonal lengths. In this example we use the `SklearnModel` with a `LinearRegression` model for the trend component, however, any StatsForecast model can be used. The complete list of models is available [here](../../src/core/models.html)."
     ]
 },
 {
@@ -315,7 +315,7 @@
     "source": [
     "We fit the model by instantiating a `StatsForecast` object with the following required parameters:\n",
     "\n",
-    "* `models`: a list of models. Select the models you want from [models](../../models) and import them.\n",
+    "* `models`: a list of models. Select the models you want from [models](../../src/core/models.html) and import them.\n",
     "\n",
     "* `freq`: a string indicating the frequency of the data. (See [panda's available frequencies](https://pandas.pydata.org/pandas-docs/stable/user_guide/timeseries.html#offset-aliases).)"
     ]
@@ -351,7 +351,7 @@
     "cell_type": "markdown",
     "metadata": {},
     "source": [
-    "The `cross_validation` method allows the user to simulate multiple historic forecasts, greatly simplifying pipelines by replacing for loops with `fit` and `predict` methods. This method re-trains the model and forecast each window. See [this tutorial](../getting-started/2_Getting_Started_complete) for an animation of how the windows are defined. \n",
+    "The `cross_validation` method allows the user to simulate multiple historic forecasts, greatly simplifying pipelines by replacing for loops with `fit` and `predict` methods. This method re-trains the model and forecast each window. See [this tutorial](../getting-started/getting_started_complete.html) for an animation of how the windows are defined. \n",
     "\n",
     "Use the `cross_validation` method to produce all the daily forecasts for September. To produce daily forecasts set the forecasting horizon `h` as 24. In this example we are simulating deploying the pipeline during September, so set the number of windows as 30 (one for each day). Finally, set the step size between windows as 24, to only produce one forecast per day."
     ]
@@ -601,7 +601,7 @@
     "source": [
     "StatsForecast and MSTL in particular are good benchmarking models for peak detection. However, it might be useful to explore further and newer forecasting algorithms. We have seen particularly good results with the N-HiTS, a deep-learning model from Nixtla's NeuralForecast library.\n",
     "\n",
-    "Learn how to predict ERCOT demand peaks with our deep-learning N-HiTS model and the NeuralForecast library in [this tutorial](../../../neuralforecast/use-cases/electricitypeakforecasting)."
+    "Learn how to predict ERCOT demand peaks with our deep-learning N-HiTS model and the NeuralForecast library in [this tutorial](../../../neuralforecast/use-cases/electricitypeakforecasting.html)."
     ]
 },
 {
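Once the 30 daily forecast windows described in this tutorial are produced, peak detection reduces to finding the hour of maximum predicted load per day. A pandas sketch of that last step, assuming a cross-validation output with `ds` timestamps and a forecast column (the column name `MSTL` and the toy values are illustrative):

```python
import pandas as pd

def daily_peak_hours(cv_df: pd.DataFrame, col: str = "MSTL"):
    """Return the hour of maximum predicted load for each calendar day."""
    idx = cv_df.groupby(cv_df["ds"].dt.date)[col].idxmax()
    return cv_df.loc[idx, "ds"].dt.hour.tolist()

# Two toy days of four hourly forecasts each; peaks at hours 2 and 1.
fcst_df = pd.DataFrame({
    "ds": pd.to_datetime([
        "2024-09-01 00:00", "2024-09-01 01:00", "2024-09-01 02:00", "2024-09-01 03:00",
        "2024-09-02 00:00", "2024-09-02 01:00", "2024-09-02 02:00", "2024-09-02 03:00",
    ]),
    "MSTL": [1.0, 2.0, 5.0, 3.0, 2.0, 9.0, 4.0, 1.0],
})
print(daily_peak_hours(fcst_df))  # [2, 1]
```

On the real cross-validation frame each day would have 24 hourly rows rather than four, but the groupby-and-idxmax step is the same.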
