
Commit 9acd3bf

Merge pull request #207224 from pritamso/Broken-link-fix-sidram

Broken link fixed

2 parents 8db4cbc + b884486

File tree

1 file changed: +2 −2 lines changed

articles/stream-analytics/machine-learning-udf.md

Lines changed: 2 additions & 2 deletions
@@ -17,13 +17,13 @@ You can implement machine learning models as a user-defined function (UDF) in yo

 Complete the following steps before you add a machine learning model as a function to your Stream Analytics job:

-1. Use Azure Machine Learning to [deploy your model as a web service](../machine-learning/how-to-deploy-and-where.md).
+1. Use Azure Machine Learning to [deploy your model as a web service](../machine-learning/how-to-deploy-managed-online-endpoints.md).

 2. Your machine learning endpoint must have an associated [swagger](../machine-learning/how-to-deploy-advanced-entry-script.md) that helps Stream Analytics understand the schema of the input and output. You can use this [sample swagger definition](https://github.com/Azure/azure-stream-analytics/blob/master/Samples/AzureML/asa-mlswagger.json) as a reference to ensure you have set it up correctly.

 3. Make sure your web service accepts and returns JSON serialized data.

-4. Deploy your model on [Azure Kubernetes Service](../machine-learning/how-to-deploy-and-where.md#choose-a-compute-target) for high-scale production deployments. If the web service is not able to handle the number of requests coming from your job, the performance of your Stream Analytics job will be degraded, which impacts latency. Models deployed on Azure Container Instances are supported only when you use the Azure portal.
+4. Deploy your model on [Azure Kubernetes Service](../machine-learning/how-to-deploy-managed-online-endpoints.md#use-different-cpu-and-gpu-instance-types) for high-scale production deployments. If the web service is not able to handle the number of requests coming from your job, the performance of your Stream Analytics job will be degraded, which impacts latency. Models deployed on Azure Container Instances are supported only when you use the Azure portal.

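The JSON requirement in step 3 of the changed file can be sketched as a minimal Azure Machine Learning entry (scoring) script. The `init`/`run` function names follow the Azure ML entry-script convention; the in-line "model" here is a hypothetical placeholder standing in for a real loaded model:

```python
import json

model = None


def init():
    # In a real deployment, load your trained model from disk here.
    # Placeholder "model": sums the features of each input row.
    global model
    model = lambda rows: [sum(row) for row in rows]


def run(raw_data: str) -> str:
    # Stream Analytics sends JSON-serialized input...
    data = json.loads(raw_data)["data"]
    predictions = model(data)
    # ...and expects JSON-serialized output back.
    return json.dumps({"result": predictions})
```

Under these assumptions, a job would POST a payload such as `{"data": [[1, 2], [3, 4]]}` and receive `{"result": [3, 7]}` back; the input/output schema is what the swagger definition in step 2 describes.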
## Add a machine learning model to your job