Merged
2 changes: 1 addition & 1 deletion docs/models.rnn.html.md
@@ -23,7 +23,7 @@ and adapted into $\mathbf{\hat{y}}_{[t+1:t+H],[q]}$ through MLPs.
**References**

- [Jeffrey L. Elman (1990). “Finding Structure in
- Time”.](https://onlinelibrary.wiley.com/doiabs/10.1207/s15516709cog1402_1)
+ Time”.](https://onlinelibrary.wiley.com/doi/abs/10.1207/s15516709cog1402_1)
- [Cho, K., van Merrienboer, B., Gülcehre, C., Bougares, F., Schwenk, H.,
& Bengio, Y. (2014). Learning phrase representations using RNN
encoder-decoder for statistical machine
2 changes: 1 addition & 1 deletion nbs/docs/capabilities/exogenous_variables.ipynb
@@ -52,7 +52,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Exogenous_Variables.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/exogenous_variables.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
6 changes: 3 additions & 3 deletions nbs/docs/capabilities/hyperparameter_tuning.ipynb
@@ -36,7 +36,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Automatic_Hyperparameter_Tuning.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/hyperparameter_tuning.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
@@ -64,7 +64,7 @@
"source": [
"## 2. Load Data\n",
"\n",
- "In this example we will use the `AirPasengers`, a popular dataset with monthly airline passengers in the US from 1949 to 1960. Load the data, available at our `utils` methods in the required format. See https://nixtla.github.io/neuralforecast/examples/data_format.html for more details on the data input format."
+ "In this example we will use the `AirPasengers`, a popular dataset with monthly airline passengers in the US from 1949 to 1960. Load the data, available at our `utils` methods in the required format. See https://nixtlaverse.nixtla.io/neuralforecast/utils.html#example-data for more details on the data input format."
]
},
{
@@ -273,7 +273,7 @@
"metadata": {},
"source": [
":::{.callout-important}\n",
- "Configuration dictionaries are not interchangeable between models since they have different hyperparameters. Refer to https://nixtla.github.io/neuralforecast/models.html for a complete list of each model's hyperparameters.\n",
+ "Configuration dictionaries are not interchangeable between models since they have different hyperparameters. Refer to https://nixtlaverse.nixtla.io/neuralforecast/models.html for a complete list of each model's hyperparameters.\n",
":::"
]
},
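As background for the configuration dictionaries this file's diff touches, here is a minimal, library-free sketch of how a config dict can be sampled during hyperparameter search. The keys and candidate values are illustrative only, not any model's actual defaults, and the real Auto models search with Ray Tune or Optuna rather than this toy loop.

```python
import random

# Hypothetical search space mirroring the shape of a model config dict;
# keys and values are illustrative, not NHITS's actual defaults.
config = {
    "input_size": [12, 24, 48],
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "scaler_type": ["identity", "robust", "standard"],
}

def sample_config(space, rng):
    """Draw one candidate configuration from the search space."""
    return {name: rng.choice(values) for name, values in space.items()}

rng = random.Random(0)
candidates = [sample_config(config, rng) for _ in range(5)]
# Each candidate would be scored on a validation set; here we only check
# that every draw stays inside the declared space.
for cand in candidates:
    assert all(cand[k] in config[k] for k in config)
```

Because the spaces differ per model, a dict sampled for one model generally will not validate against another, which is why the callout says config dicts are not interchangeable.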
2 changes: 1 addition & 1 deletion nbs/docs/capabilities/predictInsample.ipynb
@@ -31,7 +31,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/PredictInsample.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/predictInsample.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
2 changes: 1 addition & 1 deletion nbs/docs/capabilities/save_load_models.ipynb
@@ -36,7 +36,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Save_Load_models.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/save_load_models.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
6 changes: 3 additions & 3 deletions nbs/docs/capabilities/time_series_scaling.ipynb
@@ -32,7 +32,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Time_Series_Scaling.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/capabilities/time_series_scaling.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
@@ -497,7 +497,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Temporal normalization scales each instance of the batch separately at the window level. It is performed at each training iteration for each window of the batch, for both target variable and temporal exogenous covariates. For more details, see [Olivares et al. (2023)](https://arxiv.org/abs/2305.07089) and https://nixtla.github.io/neuralforecast/common.scalers.html."
+ "Temporal normalization scales each instance of the batch separately at the window level. It is performed at each training iteration for each window of the batch, for both target variable and temporal exogenous covariates. For more details, see [Olivares et al. (2023)](https://arxiv.org/abs/2305.07089) and https://nixtlaverse.nixtla.io/neuralforecast/common.scalers.html."
]
},
{
@@ -513,7 +513,7 @@
"source": [
"Temporal normalization is specified by the `scaler_type` argument. Currently, it is only supported for window-based models (`NHITS`, `NBEATS`, `MLP`, `TimesNet`, and all Transformers). In this example, we use the `TimesNet` model, recently proposed by Wu, Haixu, et al. (2022), with the `robust` scaler. First, instantiate the model with the desired parameters.\n",
"\n",
- "Visit https://nixtla.github.io/neuralforecast/common.scalers.html for a complete list of supported scalers."
+ "Visit https://nixtlaverse.nixtla.io/neuralforecast/common.scalers.html for a complete list of supported scalers."
]
},
{
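As a rough illustration of the window-level scaling this notebook's diff describes, the sketch below normalizes each window by its own median and median absolute deviation (MAD). Treating MAD as the spread estimate is an assumption made here for the example; the library's `robust` scaler may use a different formula.

```python
from statistics import median

def robust_scale_window(window):
    """Scale one window by its own median and MAD (robust location/spread).

    Sketch of window-level temporal normalization; the library's `robust`
    scaler may differ in its exact spread estimate.
    """
    loc = median(window)
    mad = median(abs(x - loc) for x in window)  # median absolute deviation
    scale = mad if mad > 0 else 1.0             # guard against flat windows
    return [(x - loc) / scale for x in window]

# Each window is normalized independently, as done per training iteration.
batch = [[10.0, 12.0, 11.0, 50.0], [0.1, 0.2, 0.15, 0.12]]
scaled = [robust_scale_window(w) for w in batch]
```

Note how the outlier 50.0 barely affects the location and spread of the first window, which is the point of a robust scaler.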
2 changes: 1 addition & 1 deletion nbs/docs/getting-started/datarequirements.ipynb
@@ -25,7 +25,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Data_Format.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/getting-started/datarequirements.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
2 changes: 1 addition & 1 deletion nbs/docs/getting-started/quickstart.ipynb
@@ -24,7 +24,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Getting_Started.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/getting-started/quickstart.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
2 changes: 1 addition & 1 deletion nbs/docs/tutorials/comparing_methods.ipynb
@@ -861,7 +861,7 @@
"source": [
"This machine doesn't have a GPU, but Google Colab offers some for free. \n",
"\n",
- "Using [Colab's GPU to train NeuralForecast](https://nixtla.github.io/neuralforecast/examples/intermittentdata.html).\n"
+ "Using [Colab's GPU to train NeuralForecast](https://nixtlaverse.nixtla.io/neuralforecast/docs/tutorials/intermittent_data.html).\n"
]
},
{
8 changes: 4 additions & 4 deletions nbs/docs/tutorials/forecasting_tft.ipynb
@@ -13,7 +13,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Temporal Fusion Transformer (TFT) proposed by Lim et al. [1] is one of the most popular transformer-based model for time-series forecasting. In summary, TFT combines gating layers, an LSTM recurrent encoder, with multi-head attention layers for a multi-step forecasting strategy decoder. For more details on the Nixtla's TFT implementation visit [this link](https://nixtla.github.io/neuralforecast/models.tft.html).\n",
+ "Temporal Fusion Transformer (TFT) proposed by Lim et al. [1] is one of the most popular transformer-based model for time-series forecasting. In summary, TFT combines gating layers, an LSTM recurrent encoder, with multi-head attention layers for a multi-step forecasting strategy decoder. For more details on the Nixtla's TFT implementation visit [this link](https://nixtlaverse.nixtla.io/neuralforecast/models.tft.html).\n",
"\n",
"In this notebook we show how to train the TFT model on the Texas electricity market load data (ERCOT). Accurately forecasting electricity markets is of great interest, as it is useful for planning distribution and consumption.\n",
"\n",
@@ -27,7 +27,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Forecasting_TFT.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/tutorials/forecasting_tft.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
@@ -226,7 +226,7 @@
"metadata": {},
"source": [
":::{.callout-tip}\n",
- "All our models can be used for both point and probabilistic forecasting. For producing probabilistic outputs, simply modify the loss to one of our `DistributionLoss`. The complete list of losses is available in [this link](https://nixtla.github.io/neuralforecast/losses.pytorch.html) \n",
+ "All our models can be used for both point and probabilistic forecasting. For producing probabilistic outputs, simply modify the loss to one of our `DistributionLoss`. The complete list of losses is available in [this link](https://nixtlaverse.nixtla.io/neuralforecast/losses.pytorch.html) \n",
":::"
]
},
@@ -453,7 +453,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "The `cross_validation` method allows you to simulate multiple historic forecasts, greatly simplifying pipelines by replacing for loops with `fit` and `predict` methods. See [this tutorial](https://nixtla.github.io/statsforecast/examples/getting_started_complete.html) for an animation of how the windows are defined. \n",
+ "The `cross_validation` method allows you to simulate multiple historic forecasts, greatly simplifying pipelines by replacing for loops with `fit` and `predict` methods. See [this tutorial](https://nixtlaverse.nixtla.io/statsforecast/examples/getting_started_complete.html) for an animation of how the windows are defined. \n",
"\n",
"With time series data, cross validation is done by defining a sliding window across the historical data and predicting the period following it. This form of cross validation allows us to arrive at a better estimation of our model’s predictive abilities across a wider range of temporal instances while also keeping the data in the training set contiguous as is required by our models. The `cross_validation` method will use the validation set for hyperparameter selection, and will then produce the forecasts for the test set.\n",
"\n",
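The sliding-window layout behind `cross_validation`, described in this notebook's diff, can be sketched in plain Python. The function name and signature below are illustrative assumptions, not the library's API; the sketch only shows how fixed-length test windows tile the end of the series while training data stays contiguous from the start.

```python
def cv_splits(n_obs, horizon, n_windows, step_size=None):
    """Yield (train_end, test_end) index pairs for sliding-window CV.

    Each split trains on [0, train_end) and tests on [train_end, test_end);
    the newest window ends exactly at the last observation. Illustrative
    sketch only, not the library's actual signature.
    """
    step = step_size or horizon
    for i in reversed(range(n_windows)):
        test_end = n_obs - i * step
        train_end = test_end - horizon
        yield train_end, test_end

# Three 12-step windows over 100 observations, oldest first.
splits = list(cv_splits(n_obs=100, horizon=12, n_windows=3))
# → [(64, 76), (76, 88), (88, 100)]
```

A smaller `step_size` would make consecutive test windows overlap, trading more evaluation points for correlated errors.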
2 changes: 1 addition & 1 deletion nbs/docs/tutorials/getting_started_complete.ipynb
@@ -58,7 +58,7 @@
"metadata": {},
"source": [
"::: {.callout-tip}\n",
- "You can use Colab to run this Notebook interactively <a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Getting_Started_complete.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+ "You can use Colab to run this Notebook interactively <a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/tutorials/getting_started_complete.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
"::: "
]
},
2 changes: 1 addition & 1 deletion nbs/docs/tutorials/hierarchical_forecasting.ipynb
@@ -34,7 +34,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/HierarchicalNetworks.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/tutorials/hierarchical_forecasting.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
10 changes: 5 additions & 5 deletions nbs/docs/tutorials/intermittent_data.ipynb
@@ -17,7 +17,7 @@
"source": [
"Intermittent or sparse data has very few non-zero observations. This type of data is hard to forecast because the zero values increase the uncertainty about the underlying patterns in the data. Furthermore, once a non-zero observation occurs, there can be considerable variation in its size. Intermittent time series are common in many industries, including finance, retail, transportation, and energy. Given the ubiquity of this type of series, special methods have been developed to forecast them. The first was [Croston (1972)](#ref), followed by several variants and different aggregation frameworks. \n",
"\n",
- "The models of [NeuralForecast](https://nixtla.github.io/statsforecast/) can be trained to model sparse or intermittent time series using a `Poisson` distribution loss. By the end of this tutorial, you'll have a good understanding of these models and how to use them. "
+ "The models of [NeuralForecast](https://nixtlaverse.nixtla.io/statsforecast/) can be trained to model sparse or intermittent time series using a `Poisson` distribution loss. By the end of this tutorial, you'll have a good understanding of these models and how to use them. "
]
},
{
@@ -39,7 +39,7 @@
"metadata": {},
"source": [
"::: {.callout-tip}\n",
- "You can use Colab to run this Notebook interactively <a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/IntermittentData.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
+ "You can use Colab to run this Notebook interactively <a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/tutorials/intermittent_data.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
"::: "
]
},
@@ -60,7 +60,7 @@
"source": [
"## 1. Install libraries \n",
"\n",
- "We assume that you have NeuralForecast already installed. If not, check this guide for instructions on [how to install NeuralForecast](https://nixtla.github.io/neuralforecast/examples/installation.html) \n",
+ "We assume that you have NeuralForecast already installed. If not, check this guide for instructions on [how to install NeuralForecast](https://nixtlaverse.nixtla.io/neuralforecast/examples/installation.html) \n",
"\n",
"Install the necessary packages using `pip install neuralforecast`"
]
@@ -135,7 +135,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Plot some series using the plot method from the `StatsForecast` class. This method prints 8 random series from the dataset and is useful for basic [EDA](https://nixtla.github.io/statsforecast/core.html#statsforecast.plot)."
+ "Plot some series using the plot method from the `StatsForecast` class. This method prints 8 random series from the dataset and is useful for basic [EDA](https://nixtlaverse.nixtla.io/statsforecast/core.html#statsforecast.plot)."
]
},
{
@@ -385,7 +385,7 @@
"\n",
"![](https://raw.githubusercontent.com/Nixtla/statsforecast/main/nbs/imgs/ChainedWindows.gif)\n",
"\n",
- "[NeuralForecast](https://nixtla.github.io/neuralforecast/) has an implementation of time series cross-validation that is fast and easy to use.\n",
+ "[NeuralForecast](https://nixtlaverse.nixtla.io/neuralforecast/) has an implementation of time series cross-validation that is fast and easy to use.\n",
"\n",
"The `cross_validation` method from the `NeuralForecast` class takes the following arguments.\n",
"\n",
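The `Poisson` distribution loss this tutorial relies on boils down to the Poisson negative log-likelihood. The helper below is a library-free sketch of that quantity under stated assumptions (a single rate per observation); the actual `DistributionLoss` additionally handles parameterization, batching, and numerical details.

```python
from math import lgamma, log

def poisson_nll(y, rate):
    """Negative log-likelihood of count y under a Poisson with mean `rate`.

    NLL(y; rate) = rate - y*log(rate) + log(y!), with log(y!) = lgamma(y+1).
    Sketch of the quantity a Poisson loss minimizes, not the library's code.
    """
    return rate - y * log(rate) + lgamma(y + 1)

# A sparse demand series: mostly zeros with occasional spikes. A small rate
# close to the series mean (4/7 ≈ 0.57) fits far better than a large one.
series = [0, 0, 3, 0, 1, 0, 0]
nll_small = sum(poisson_nll(y, 0.5) for y in series)
nll_large = sum(poisson_nll(y, 5.0) for y in series)
```

Minimizing this quantity pushes the predicted rate toward the conditional mean of the counts, which is why a Poisson head suits intermittent series better than a Gaussian one.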
2 changes: 1 addition & 1 deletion nbs/docs/tutorials/interpretable_decompositions.ipynb
@@ -23,7 +23,7 @@
"\n",
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/Signal_Decomposition.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/tutorials/interpretable_decompositions.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
4 changes: 2 additions & 2 deletions nbs/docs/tutorials/longhorizon_nhits.ipynb
@@ -14,7 +14,7 @@
"metadata": {},
"source": [
"\n",
- "Long-horizon forecasting is challenging because of the *volatility* of the predictions and the *computational complexity*. To solve this problem we created the [NHITS](https://arxiv.org/abs/2201.12886) model and made the code available [NeuralForecast library](https://nixtla.github.io/neuralforecast/models.nhits.html). `NHITS` specializes its partial outputs in the different frequencies of the time series through hierarchical interpolation and multi-rate input\n",
+ "Long-horizon forecasting is challenging because of the *volatility* of the predictions and the *computational complexity*. To solve this problem we created the [NHITS](https://arxiv.org/abs/2201.12886) model and made the code available [NeuralForecast library](https://nixtlaverse.nixtla.io/neuralforecast/models.nhits.html). `NHITS` specializes its partial outputs in the different frequencies of the time series through hierarchical interpolation and multi-rate input\n",
"processing. \n",
"\n",
"In this notebook we show how to use `NHITS` on the [ETTm2](https://github.com/zhouhaoyi/ETDataset) benchmark dataset. This dataset includes data from two electricity transformers at two stations, including load and oil temperature.\n",
@@ -29,7 +29,7 @@
"source": [
"You can run these experiments using GPU with Google Colab.\n",
"\n",
- "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/examples/LongHorizon_with_NHITS.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
+ "<a href=\"https://colab.research.google.com/github/Nixtla/neuralforecast/blob/main/nbs/docs/tutorials/longhorizon_nhits.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
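The hierarchical interpolation that `NHITS` uses, mentioned in this file's diff, can be pictured as upsampling a short block forecast to the full horizon. The sketch below uses simple linear interpolation between a few knots; treat it as an illustration of the idea under that assumption, not the model's exact interpolation operator.

```python
def interpolate_to_horizon(coarse, horizon):
    """Linearly upsample a coarse forecast (a few knots) to `horizon` steps.

    Illustrates NHITS-style hierarchical interpolation: a block emits few
    values and the full-resolution forecast is their interpolation.
    """
    if horizon == 1 or len(coarse) == 1:
        return [coarse[-1]] * horizon if len(coarse) == 1 else [coarse[-1]]
    out = []
    for t in range(horizon):
        pos = t * (len(coarse) - 1) / (horizon - 1)  # map step onto knot axis
        lo = int(pos)
        hi = min(lo + 1, len(coarse) - 1)
        frac = pos - lo
        out.append(coarse[lo] * (1 - frac) + coarse[hi] * frac)
    return out

# Two knots expanded to a 5-step horizon: a straight line between them.
print(interpolate_to_horizon([0.0, 4.0], 5))  # → [0.0, 1.0, 2.0, 3.0, 4.0]
```

Because each stack works at a different knot rate, cheap low-rate blocks can capture slow components while finer blocks add detail, which keeps long-horizon forecasts tractable.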