
Commit 7517372

fix review issues
1 parent b7a0b87 commit 7517372

File tree

3 files changed (+6 −6 lines)

articles/machine-learning/concept-automl-forecasting-deep-learning.md

Lines changed: 2 additions & 2 deletions
@@ -10,7 +10,7 @@ ms.service: machine-learning
 ms.subservice: automl
 ms.topic: conceptual
 ms.custom: contperf-fy21q1, automl, FY21Q4-aml-seo-hack, sdkv2, event-tier1-build-2022
-ms.date: 02/06/2023
+ms.date: 02/24/2023
 show_latex: true
 ---

@@ -66,7 +66,7 @@ We can give a more precise definition of the TCNForecaster architecture in terms

 :::image type="content" source="media/concept-automl-forecasting-deep-learning/tcn-equations.png" alt-text="Equations describing TCNForecaster operations.":::

-where $W_{e}$ is an [embedding](https://huggingface.co/blog/getting-started-with-embeddings) matrix for the categorical features, $n_{l} = n_{b}n_{c}$ is the total number of residual cells, the $H_{k}$ denote hidden layer outputs, and the $f_{q}$ are forecast outputs for given quantiles of the prediction distribution. To aid understanding, the dimensions of the these variables are in the following table:
+where $W_{e}$ is an [embedding](https://huggingface.co/blog/getting-started-with-embeddings) matrix for the categorical features, $n_{l} = n_{b}n_{c}$ is the total number of residual cells, the $H_{k}$ denote hidden layer outputs, and the $f_{q}$ are forecast outputs for given quantiles of the prediction distribution. To aid understanding, the dimensions of these variables are in the following table:

 |Variable|Description|Dimensions|
 |--|--|--|
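The hunk above concerns the TCNForecaster's stack of $n_{l} = n_{b}n_{c}$ residual cells, which are built from dilated causal convolutions. A useful property of such stacks is how quickly the receptive field grows with depth. The sketch below illustrates that arithmetic in plain Python; the kernel size and doubling dilation schedule are illustrative assumptions, not AutoML's actual defaults:

```python
def tcn_receptive_field(kernel_size: int, n_layers: int) -> int:
    """Receptive field of a stack of dilated causal convolutions
    where the dilation doubles at each layer (1, 2, 4, ...).

    Each layer with dilation d and kernel size k extends the
    receptive field by (k - 1) * d past time steps.
    """
    return 1 + (kernel_size - 1) * sum(2 ** i for i in range(n_layers))

# With kernel size 2, eight layers already see 256 past time steps,
# which is why dilated stacks cover long histories with few layers.
print(tcn_receptive_field(2, 8))  # 256
```

The geometric growth is the point: covering the same history with undilated convolutions would need a number of layers linear in the history length.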

articles/machine-learning/concept-automl-forecasting-methods.md

Lines changed: 1 addition & 1 deletion
@@ -137,7 +137,7 @@ Lags of feature columns | Optional
 Rolling window aggregations (for example, rolling average) of target quantity | Optional
 Seasonal decomposition ([STL](https://otexts.com/fpp3/stl.html)) | Optional

-You can configure featurization from the AutoML SDK via the [ForecastingJob](/python/api/azure-ai-ml/azure.ai.ml.automl.forecastingjob#azure-ai-ml-automl-forecastingjob-set-forecast-settings) class or from the [Azure Machine Learning Studio web interface](how-to-use-automated-ml-for-ml-models.md#customize-featurization).
+You can configure featurization from the AutoML SDK via the [ForecastingJob](/python/api/azure-ai-ml/azure.ai.ml.automl.forecastingjob#azure-ai-ml-automl-forecastingjob-set-forecast-settings) class or from the [Azure Machine Learning studio web interface](how-to-use-automated-ml-for-ml-models.md#customize-featurization).

 ### Non-stationary time series detection and handling
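The featurization table in this hunk lists lag features and rolling-window aggregations of the target. AutoML computes these internally, but the transformations themselves are simple; a minimal pure-Python sketch (the series values and feature names here are invented for illustration):

```python
def lag_features(series, lags):
    """Lag features: row t gets series[t - lag]; None before history starts."""
    return {f"lag_{k}": [series[t - k] if t >= k else None
                         for t in range(len(series))]
            for k in lags}

def rolling_mean(series, window):
    """Trailing rolling average over the most recent `window` observations."""
    return [sum(series[t - window + 1:t + 1]) / window if t >= window - 1 else None
            for t in range(len(series))]

y = [10, 12, 11, 13, 15, 14]
print(lag_features(y, [1, 2])["lag_1"])  # [None, 10, 12, 11, 13, 15]
print(rolling_mean(y, 3))                # [None, None, 11.0, 12.0, 13.0, 14.0]
```

The leading None entries correspond to rows that lack enough history, which is one reason forecast training requires a minimum number of observations per series.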

articles/machine-learning/how-to-automl-forecasting-faq.md

Lines changed: 3 additions & 3 deletions
@@ -51,9 +51,9 @@ There are four basic configurations supported by AutoML forecasting:

 |Configuration|Scenario|Pros|Cons|
 |--|--|--|--|
-|**Default AutoML**|Recommended if the dataset has a small number of time series that have roughly similar historic behavior.|- Simple to configure from code/SDK or Azure Machine Learning Studio <br><br> - AutoML has the chance to cross-learn across different time series since the regression models pool all series together in training. See the [model grouping](./concept-automl-forecasting-methods.md#model-grouping) section for more information.|- Regression models may be less accurate if the time series in the training data have divergent behavior <br> <br> - Time series models may take a long time to train if there are a large number of series in the training data. See the ["why is AutoML slow on my data"](#why-is-automl-slow-on-my-data) answer for more information.|
-|**AutoML with deep learning**|Recommended for datasets with more than 1000 observations and, potentially, numerous time series exhibiting complex patterns. When enabled, AutoML will sweep over [temporal convolutional neural network (TCN) models](./concept-automl-forecasting-deep-learning.md#introduction-to-tcnforecaster) during training. See the [enable deep learning](./how-to-auto-train-forecast.md#enable-deep-learning) section for more information.|- Simple to configure from code/SDK or Azure Machine Learning Studio <br> <br> - Cross-learning opportunities since the TCN pools data over all series <br> <br> - Potentially higher accuracy due to the large capacity of DNN models. See the [forecasting models in AutoML](./concept-automl-forecasting-methods.md#forecasting-models-in-automl) section for more information.|- Training can take much longer due to the complexity of DNN models <br> <br> - Series with small amounts of history are unlikely to benefit from these models.|
-|**Many Models**|Recommended if you need to train and manage a large number of forecasting models in a scalable way. See the [forecasting at scale](./how-to-auto-train-forecast.md#forecasting-at-scale) section for more information.|- Scalable <br> <br> - Potentially higher accuracy when time series have divergent behavior from one another.|- No cross-learning across time series <br> <br> - You can't configure or launch Many Models jobs from Azure Machine Learning Studio, only the code/SDK experience is currently available.|
+|**Default AutoML**|Recommended if the dataset has a small number of time series that have roughly similar historic behavior.|- Simple to configure from code/SDK or Azure Machine Learning studio <br><br> - AutoML has the chance to cross-learn across different time series since the regression models pool all series together in training. See the [model grouping](./concept-automl-forecasting-methods.md#model-grouping) section for more information.|- Regression models may be less accurate if the time series in the training data have divergent behavior <br> <br> - Time series models may take a long time to train if there are a large number of series in the training data. See the ["why is AutoML slow on my data"](#why-is-automl-slow-on-my-data) answer for more information.|
+|**AutoML with deep learning**|Recommended for datasets with more than 1000 observations and, potentially, numerous time series exhibiting complex patterns. When enabled, AutoML will sweep over [temporal convolutional neural network (TCN) models](./concept-automl-forecasting-deep-learning.md#introduction-to-tcnforecaster) during training. See the [enable deep learning](./how-to-auto-train-forecast.md#enable-deep-learning) section for more information.|- Simple to configure from code/SDK or Azure Machine Learning studio <br> <br> - Cross-learning opportunities since the TCN pools data over all series <br> <br> - Potentially higher accuracy due to the large capacity of DNN models. See the [forecasting models in AutoML](./concept-automl-forecasting-methods.md#forecasting-models-in-automl) section for more information.|- Training can take much longer due to the complexity of DNN models <br> <br> - Series with small amounts of history are unlikely to benefit from these models.|
+|**Many Models**|Recommended if you need to train and manage a large number of forecasting models in a scalable way. See the [forecasting at scale](./how-to-auto-train-forecast.md#forecasting-at-scale) section for more information.|- Scalable <br> <br> - Potentially higher accuracy when time series have divergent behavior from one another.|- No cross-learning across time series <br> <br> - You can't configure or launch Many Models jobs from Azure Machine Learning studio, only the code/SDK experience is currently available.|
 |**Hierarchical Time Series**|HTS is recommended if the series in your data have nested, hierarchical structure and you need to train or make forecasts at aggregated levels of the hierarchy. See the [hierarchical time series forecasting](how-to-auto-train-forecast.md#hierarchical-time-series-forecasting) section for more information.|- Training at aggregated levels can reduce noise in the leaf node time series and potentially lead to higher accuracy models. <br> <br> - Forecasts can be retrieved for any level of the hierarchy by aggregating or dis-aggregating forecasts from the training level.|- You need to provide the aggregation level for training. AutoML doesn't currently have an algorithm to find an optimal level.|

 > [!NOTE]
