Commit bc385b7

Update articles/machine-learning/concept-endpoints-online.md
1 parent 83ecc0a commit bc385b7

File tree

1 file changed (+0, −1 lines)


articles/machine-learning/concept-endpoints-online.md

Lines changed: 0 additions & 1 deletion
@@ -148,7 +148,6 @@ Azure Machine Learning provides various ways to debug online endpoints locally a
 
 #### Local debugging with the Azure Machine Learning inference HTTP server
 
-[!INCLUDE [machine-learning-preview-generic-disclaimer](includes/machine-learning-preview-generic-disclaimer.md)]
 
 You can debug your scoring script locally by using the Azure Machine Learning inference HTTP server. The HTTP server is a Python package that exposes your scoring function as an HTTP endpoint and wraps the Flask server code and dependencies into a singular package. It's included in the [prebuilt Docker images for inference](concept-prebuilt-docker-images-inference.md) that are used when deploying a model with Azure Machine Learning. Using the package alone, you can deploy the model locally for production, and you can also easily validate your scoring (entry) script in a local development environment. If there's a problem with the scoring script, the server will return an error and the location where the error occurred.
 You can also use Visual Studio Code to debug with the Azure Machine Learning inference HTTP server.
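The article text quoted in the diff describes validating a scoring (entry) script locally with the inference HTTP server. As a rough illustration of the kind of entry script the server expects, here is a minimal sketch: the `init`/`run` function names are the ones the server looks for, but the stub "model" and the JSON input shape are invented for this example and are not from the docs.

```python
# score.py -- minimal scoring (entry) script sketch for local debugging with
# the Azure Machine Learning inference HTTP server. The stub model below is
# a placeholder; a real script would load a trained model in init().
import json

def init():
    # Called once when the server starts. Normally you would load the model
    # from disk here; a sum-of-inputs stub is enough for a local smoke test.
    global model
    model = lambda rows: [sum(row) for row in rows]

def run(raw_data):
    # Called per request; the server passes the request body as a JSON string.
    data = json.loads(raw_data)["data"]
    return json.dumps({"result": model(data)})
```

To exercise it locally you would typically install the server package (`pip install azureml-inference-server-http`), start it with `azmlinfsrv --entry_script score.py --port 5001`, and POST a JSON payload to the scoring route; if `run` raises, the server reports the error and where in the script it occurred, which is the debugging workflow the deleted section describes.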
