
Commit a788a8b

Fix broken links (#1433)
1 parent 10412f5 commit a788a8b

24 files changed: +44 additions, -43 deletions

docs/models.rnn.html.md

Lines changed: 1 addition & 1 deletion

@@ -23,7 +23,7 @@ and adapted into $\mathbf{\hat{y}}_{[t+1:t+H],[q]}$ through MLPs.
 **References**
 
 - [Jeffrey L. Elman (1990). “Finding Structure in
-Time”.](https://onlinelibrary.wiley.com/doiabs/10.1207/s15516709cog1402_1)
+Time”.](https://onlinelibrary.wiley.com/doi/abs/10.1207/s15516709cog1402_1)
 - [Cho, K., van Merrienboer, B., Gülcehre, C., Bougares, F., Schwenk, H.,
 & Bengio, Y. (2014). Learning phrase representations using RNN
 encoder-decoder for statistical machine

nbs/docs/capabilities/exogenous_variables.ipynb

Lines changed: 1 addition & 1 deletion

@@ -52,7 +52,7 @@
 "source": [
 "You can run these experiments using GPU with Google Colab.\n",
 "\n",
-"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Exogenous_Variables.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/exogenous_variables.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
 ]
 },
 {

nbs/docs/capabilities/hyperparameter_tuning.ipynb

Lines changed: 3 additions & 3 deletions

@@ -36,7 +36,7 @@
 "source": [
 "You can run these experiments using GPU with Google Colab.\n",
 "\n",
-"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Automatic_Hyperparameter_Tuning.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/hyperparameter_tuning.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
 ]
 },
 {
@@ -64,7 +64,7 @@
 "source": [
 "## 2. Load Data\n",
 "\n",
-"In this example we will use the `AirPasengers`, a popular dataset with monthly airline passengers in the US from 1949 to 1960. Load the data, available at our `utils` methods in the required format. See https://nixtla.github.io/neuralforecast/examples/data_format.html for more details on the data input format."
+"In this example we will use the `AirPasengers`, a popular dataset with monthly airline passengers in the US from 1949 to 1960. Load the data, available at our `utils` methods in the required format. See https://nixtlaverse.nixtla.io/neuralforecast/utils.html#example-data for more details on the data input format."
 ]
 },
 {
@@ -273,7 +273,7 @@
 "metadata": {},
 "source": [
 ":::{.callout-important}\n",
-"Configuration dictionaries are not interchangeable between models since they have different hyperparameters. Refer to https://nixtla.github.io/neuralforecast/models.html for a complete list of each model's hyperparameters.\n",
+"Configuration dictionaries are not interchangeable between models since they have different hyperparameters. Refer to https://nixtlaverse.nixtla.io/neuralforecast/models.html for a complete list of each model's hyperparameters.\n",
 ":::"
 ]
 },
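The last hunk above corrects a docs URL in a callout warning that configuration dictionaries are not interchangeable between models. A plain-Python illustration of why (the hyperparameter names and the `validate_config` helper below are hypothetical, chosen only to mimic the shape of per-model search spaces, not neuralforecast's actual signatures):

```python
# Hypothetical per-model configs: each model accepts a different set of
# hyperparameter names, so the dictionaries cannot be swapped between models.
nhits_config = {"input_size": 24, "n_freq_downsample": [2, 1, 1]}
lstm_config = {"input_size": 24, "encoder_hidden_size": 128}

def validate_config(config, allowed_keys):
    """Reject any hyperparameter the target model does not define."""
    unknown = set(config) - set(allowed_keys)
    if unknown:
        raise ValueError(f"unknown hyperparameters: {sorted(unknown)}")

# The NHITS-style config passes against its own allowed keys...
validate_config(nhits_config, {"input_size", "n_freq_downsample"})

# ...but the same dictionary is rejected by an LSTM-style model.
try:
    validate_config(nhits_config, {"input_size", "encoder_hidden_size"})
except ValueError as e:
    print(e)  # unknown hyperparameters: ['n_freq_downsample']
```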

nbs/docs/capabilities/predictInsample.ipynb

Lines changed: 1 addition & 1 deletion

@@ -31,7 +31,7 @@
 "source": [
 "You can run these experiments using GPU with Google Colab.\n",
 "\n",
-"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/PredictInsample.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/predictInsample.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
 ]
 },
 {

nbs/docs/capabilities/save_load_models.ipynb

Lines changed: 1 addition & 1 deletion

@@ -36,7 +36,7 @@
 "source": [
 "You can run these experiments using GPU with Google Colab.\n",
 "\n",
-"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Save_Load_models.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/save_load_models.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
 ]
 },
 {

nbs/docs/capabilities/time_series_scaling.ipynb

Lines changed: 3 additions & 3 deletions

@@ -32,7 +32,7 @@
 "source": [
 "You can run these experiments using GPU with Google Colab.\n",
 "\n",
-"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Time_Series_Scaling.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/time_series_scaling.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
 ]
 },
 {
@@ -497,7 +497,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Temporal normalization scales each instance of the batch separately at the window level. It is performed at each training iteration for each window of the batch, for both target variable and temporal exogenous covariates. For more details, see [Olivares et al. (2023)](https://arxiv.org/abs/2305.07089) and https://nixtla.github.io/neuralforecast/common.scalers.html."
+"Temporal normalization scales each instance of the batch separately at the window level. It is performed at each training iteration for each window of the batch, for both target variable and temporal exogenous covariates. For more details, see [Olivares et al. (2023)](https://arxiv.org/abs/2305.07089) and https://nixtlaverse.nixtla.io/neuralforecast/common.scalers.html."
 ]
 },
 {
@@ -513,7 +513,7 @@
 "source": [
 "Temporal normalization is specified by the `scaler_type` argument. Currently, it is only supported for Windows-based models (`NHITS`, `NBEATS`, `MLP`, `TimesNet`, and all Transformers). In this example, we use the `TimesNet` model and `robust` scaler, recently proposed by Wu, Haixu, et al. (2022). First instantiate the model with the desired parameters.\n",
 "\n",
-"Visit https://nixtla.github.io/neuralforecast/common.scalers.html for a complete list of supported scalers."
+"Visit https://nixtlaverse.nixtla.io/neuralforecast/common.scalers.html for a complete list of supported scalers."
 ]
 },
 {
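The hunks above touch a note explaining that temporal normalization scales each window of the batch separately, with a `robust` scaler among the options. A minimal pure-Python sketch of window-level robust scaling (median and mean absolute deviation), assuming that choice of statistics; the library's actual implementations live in `common.scalers`:

```python
def robust_scale_window(window):
    """Scale one window by its own median and mean absolute deviation (MAD).

    Sketch of window-level ("temporal") normalization: each window in a
    batch is scaled independently, using only its own statistics.
    """
    xs = sorted(window)
    n = len(xs)
    median = xs[n // 2] if n % 2 else (xs[n // 2 - 1] + xs[n // 2]) / 2
    mad = sum(abs(x - median) for x in window) / n
    mad = mad or 1.0  # guard against constant windows
    return [(x - median) / mad for x in window]

# Two windows at very different levels produce the same scaled values,
# because each is normalized with its own median and MAD.
w1 = robust_scale_window([10.0, 12.0, 11.0, 13.0])
w2 = robust_scale_window([100.0, 120.0, 110.0, 130.0])
print(w1)  # [-1.5, 0.5, -0.5, 1.5]
```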

nbs/docs/getting-started/datarequirements.ipynb

Lines changed: 1 addition & 1 deletion

@@ -25,7 +25,7 @@
 "source": [
 "You can run these experiments using GPU with Google Colab.\n",
 "\n",
-"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Data_Format.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/getting-started/datarequirements.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
 ]
 },
 {

nbs/docs/getting-started/quickstart.ipynb

Lines changed: 1 addition & 1 deletion

@@ -24,7 +24,7 @@
 "source": [
 "You can run these experiments using GPU with Google Colab.\n",
 "\n",
-"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Getting_Started.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/getting-started/quickstart.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
 ]
 },
 {

nbs/docs/tutorials/comparing_methods.ipynb

Lines changed: 1 addition & 1 deletion

@@ -861,7 +861,7 @@
 "source": [
 "This machine doesn't have GPU, but Google Colabs offers some for free. \n",
 "\n",
-"Using [Colab's GPU to train NeuralForecast](https://nixtla.github.io/neuralforecast/examples/intermittentdata.html).\n"
+"Using [Colab's GPU to train NeuralForecast](https://nixtlaverse.nixtla.io/neuralforecast/docs/tutorials/intermittent_data.html).\n"
 ]
 },
 {

nbs/docs/tutorials/forecasting_tft.ipynb

Lines changed: 4 additions & 4 deletions

@@ -13,7 +13,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Temporal Fusion Transformer (TFT) proposed by Lim et al. [1] is one of the most popular transformer-based model for time-series forecasting. In summary, TFT combines gating layers, an LSTM recurrent encoder, with multi-head attention layers for a multi-step forecasting strategy decoder. For more details on the Nixtla's TFT implementation visit [this link](https://nixtla.github.io/neuralforecast/models.tft.html).\n",
+"Temporal Fusion Transformer (TFT) proposed by Lim et al. [1] is one of the most popular transformer-based model for time-series forecasting. In summary, TFT combines gating layers, an LSTM recurrent encoder, with multi-head attention layers for a multi-step forecasting strategy decoder. For more details on the Nixtla's TFT implementation visit [this link](https://nixtlaverse.nixtla.io/neuralforecast/models.tft.html).\n",
 "\n",
 "In this notebook we show how to train the TFT model on the Texas electricity market load data (ERCOT). Accurately forecasting electricity markets is of great interest, as it is useful for planning distribution and consumption.\n",
 "\n",
@@ -27,7 +27,7 @@
 "source": [
 "You can run these experiments using GPU with Google Colab.\n",
 "\n",
-"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Forecasting_TFT.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+"<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/tutorials/forecasting_tft.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
 ]
 },
 {
@@ -226,7 +226,7 @@
 "metadata": {},
 "source": [
 ":::{.callout-tip}\n",
-"All our models can be used for both point and probabilistic forecasting. For producing probabilistic outputs, simply modify the loss to one of our `DistributionLoss`. The complete list of losses is available in [this link](https://nixtla.github.io/neuralforecast/losses.pytorch.html) \n",
+"All our models can be used for both point and probabilistic forecasting. For producing probabilistic outputs, simply modify the loss to one of our `DistributionLoss`. The complete list of losses is available in [this link](https://nixtlaverse.nixtla.io/neuralforecast/losses.pytorch.html) \n",
 ":::"
 ]
 },
@@ -453,7 +453,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"The `cross_validation` method allows you to simulate multiple historic forecasts, greatly simplifying pipelines by replacing for loops with `fit` and `predict` methods. See [this tutorial](https://nixtla.github.io/statsforecast/examples/getting_started_complete.html) for an animation of how the windows are defined. \n",
+"The `cross_validation` method allows you to simulate multiple historic forecasts, greatly simplifying pipelines by replacing for loops with `fit` and `predict` methods. See [this tutorial](https://nixtlaverse.nixtla.io/statsforecast/examples/getting_started_complete.html) for an animation of how the windows are defined. \n",
 "\n",
 "With time series data, cross validation is done by defining a sliding window across the historical data and predicting the period following it. This form of cross validation allows us to arrive at a better estimation of our model’s predictive abilities across a wider range of temporal instances while also keeping the data in the training set contiguous as is required by our models. The `cross_validation` method will use the validation set for hyperparameter selection, and will then produce the forecasts for the test set.\n",
 "\n",
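The last hunk above touches a cell describing sliding-window cross validation: a window moves across the history, and each cutoff trains on everything before it and tests on the period that follows. A pure-Python sketch of that windowing scheme, assuming evenly spaced cutoffs and a step equal to the horizon; the function name and defaults are illustrative, not `cross_validation`'s actual interface:

```python
def sliding_windows(n_obs, horizon, n_windows, step=None):
    """Yield (cutoff, test_indices) pairs for time-series cross validation.

    Each cutoff trains on observations [0, cutoff) and tests on the next
    `horizon` observations, keeping training data contiguous.
    """
    step = step or horizon
    first_cutoff = n_obs - horizon - (n_windows - 1) * step
    for i in range(n_windows):
        cutoff = first_cutoff + i * step
        yield cutoff, list(range(cutoff, cutoff + horizon))

# With 12 observations, horizon 2, and 3 windows, the cutoffs are 6, 8, 10
# and every test window immediately follows its training data.
windows = list(sliding_windows(n_obs=12, horizon=2, n_windows=3))
```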
