
Commit 3ec7d9e

Merge pull request #207158 from v-rajagt/franlanglois
Link fixed.
2 parents 425199a + 6eaaa6a

File tree

1 file changed: +4 -4 lines changed


articles/azure-cache-for-redis/cache-ml.md

Lines changed: 4 additions & 4 deletions
@@ -33,7 +33,7 @@ Azure Cache for Redis is performant and scalable. When paired with an Azure Mach
 > * `model` - The registered model that will be deployed.
 > * `inference_config` - The inference configuration for the model.
 >
-> For more information on setting these variables, see [Deploy models with Azure Machine Learning](../machine-learning/how-to-deploy-and-where.md).
+> For more information on setting these variables, see [Deploy models with Azure Machine Learning](/azure/machine-learning/how-to-deploy-managed-online-endpoints).
 
 ## Create an Azure Cache for Redis instance

@@ -119,7 +119,7 @@ def run(data):
         return error
 ```
 
-For more information on entry script, see [Define scoring code.](../machine-learning/how-to-deploy-and-where.md?tabs=python#define-an-entry-script)
+For more information on entry script, see [Define scoring code.](/azure/machine-learning/how-to-deploy-managed-online-endpoints)
 
 * **Dependencies**, such as helper scripts or Python/Conda packages required to run the entry script or model
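The `return error` context in the hunk above comes from the article's entry script, which follows Azure Machine Learning's `init()`/`run(data)` contract. A minimal runnable sketch of that contract (illustrative only: the real script in the article also loads the registered model and talks to Azure Cache for Redis, which is omitted here):

```python
import json

def init():
    # In the real entry script, init() loads the registered model and
    # opens the connection to the Azure Cache for Redis instance.
    pass

def run(data):
    """Score one request. On failure, return the error text, mirroring
    the `return error` line visible in the diff context above."""
    try:
        payload = json.loads(data)   # the request body arrives as a JSON string
        result = {"echo": payload}   # stand-in for model.predict(...)
        return json.dumps(result)
    except Exception as e:
        error = str(e)
        return error
```

Azure Machine Learning calls `init()` once when the service container starts and `run()` once per scoring request, passing the request body to `run()` as a JSON string.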
@@ -144,7 +144,7 @@ These entities are encapsulated into an **inference configuration**. The inferen
 
 For more information on environments, see [Create and manage environments for training and deployment](../machine-learning/how-to-use-environments.md).
 
-For more information on inference configuration, see [Deploy models with Azure Machine Learning](../machine-learning/how-to-deploy-and-where.md?tabs=python#define-an-inference-configuration).
+For more information on inference configuration, see [Deploy models with Azure Machine Learning](/azure/machine-learning/how-to-deploy-managed-online-endpoints).
 
 > [!IMPORTANT]
 > When deploying to Functions, you do not need to create a **deployment configuration**.
@@ -162,7 +162,7 @@ pip install azureml-contrib-functions
 To create the Docker image that is deployed to Azure Functions, use [azureml.contrib.functions.package](/python/api/azureml-contrib-functions/azureml.contrib.functions) or the specific package function for the trigger you want to use. The following code snippet demonstrates how to create a new package with an HTTP trigger from the model and inference configuration:
 
 > [!NOTE]
-> The code snippet assumes that `model` contains a registered model, and that `inference_config` contains the configuration for the inference environment. For more information, see [Deploy models with Azure Machine Learning](../machine-learning/how-to-deploy-and-where.md).
+> The code snippet assumes that `model` contains a registered model, and that `inference_config` contains the configuration for the inference environment. For more information, see [Deploy models with Azure Machine Learning](/azure/machine-learning/how-to-deploy-managed-online-endpoints).
 
 ```python
 from azureml.contrib.functions import package
