Commit 10cc996: Update code and text
1 parent dc36b68
2 files changed, +37 -44 lines changed

articles/machine-learning/toc.yml
Lines changed: 1 addition & 1 deletion

@@ -1087,7 +1087,7 @@ items:
 - name: Consume web service
   displayName: create client consume request response synchronous
   href: ./v1/how-to-consume-web-service.md
-- name: Advanced entry script authoring
+- name: Advanced entry scripts
   displayName: swagger inference schema binary cors
   href: ./v1/how-to-deploy-advanced-entry-script.md
 - name: Prebuilt Docker images

articles/machine-learning/v1/how-to-deploy-advanced-entry-script.md
Lines changed: 36 additions & 43 deletions

@@ -1,34 +1,35 @@
 ---
-title: Entry script authoring for advanced scenarios
+title: Entry scripts for advanced scenarios
 titleSuffix: Azure Machine Learning
-description: Learn how to write Azure Machine Learning entry scripts for pre- and post-processing during deployment.
+description: See how to write Azure Machine Learning entry scripts for advanced scenarios like schema generation, accepting raw data, and loading registered models.
 services: machine-learning
 ms.service: azure-machine-learning
 ms.subservice: mlops
 ms.topic: how-to
-ms.date: 03/12/2024
+ms.date: 02/03/2025
 author: msakande
 ms.author: mopeakande
 ms.reviewer: sehan
 ms.custom: UpdateFrequency5, deploy, sdkv1
+# customer intent: As a developer, I want to see how to use advanced entry scripts in Azure Machine Learning so that I can implement schema generation, accept raw data, and load registered models.
 ---

-# Advanced entry script authoring
+# Advanced entry scripts

 [!INCLUDE [sdk v1](../includes/machine-learning-sdk-v1.md)]

-This article explains how to write entry scripts for specialized use cases in Azure Machine Learning.
+This article explains how to write entry scripts for specialized use cases in Azure Machine Learning. An entry script, which is also called a scoring script, accepts requests, uses a model to score data, and returns a response.

 ## Prerequisites

-- A trained machine learning model that you intend to deploy with Azure Machine Learning. To learn more about model deployment, see [Deploy machine learning models to Azure](how-to-deploy-and-where.md).
+* A trained machine learning model that you intend to deploy with Azure Machine Learning. For more information about model deployment, see [Deploy machine learning models to Azure](how-to-deploy-and-where.md).

 ## Automatically generate a Swagger schema

 To automatically generate a schema for your web service, provide a sample of the input or output in the constructor for one of the defined type objects. The type and sample are used to automatically create the schema. Azure Machine Learning then creates an [OpenAPI specification](https://swagger.io/docs/specification/about/) (formerly, a Swagger specification) for the web service during deployment.

 > [!WARNING]
-> Don't use sensitive or private data for the sample input or output. The Swagger page for AML-hosted inferencing exposes the sample data.
+> Don't use sensitive or private data for the sample input or output. In Azure Machine Learning, the Swagger page for inferencing exposes the sample data.

 The following types are currently supported:

@@ -67,40 +68,40 @@ from inference_schema.parameter_types.pandas_parameter_type import PandasParamet

 def init():
     global model
-    # Replace filename if needed.
+    # Replace the file name if needed.
     model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')
     # Deserialize the model file back into a sklearn model.
     model = joblib.load(model_path)


-# providing 3 sample inputs for schema generation
+# Provide three sample inputs for schema generation.
 numpy_sample_input = NumpyParameterType(np.array([[1,2,3,4,5,6,7,8,9,10],[10,9,8,7,6,5,4,3,2,1]],dtype='float64'))
 pandas_sample_input = PandasParameterType(pd.DataFrame({'name': ['Sarah', 'John'], 'age': [25, 26]}))
 standard_sample_input = StandardPythonParameterType(0.0)

-# This is a nested input sample, any item wrapped by `ParameterType` will be described by schema
+# The following sample is a nested input sample. Any item wrapped by `ParameterType` is described by the schema.
 sample_input = StandardPythonParameterType({'input1': numpy_sample_input,
                                             'input2': pandas_sample_input,
                                             'input3': standard_sample_input})

-sample_global_parameters = StandardPythonParameterType(1.0) # this is optional
+sample_global_parameters = StandardPythonParameterType(1.0) # This line is optional.
 sample_output = StandardPythonParameterType([1.0, 1.0])
-outputs = StandardPythonParameterType({'Results':sample_output}) # 'Results' is case sensitive
+outputs = StandardPythonParameterType({'Results':sample_output}) # "Results" is case sensitive.

 @input_schema('Inputs', sample_input)
-# 'Inputs' is case sensitive
+# "Inputs" is case sensitive.

 @input_schema('GlobalParameters', sample_global_parameters)
-# this is optional, 'GlobalParameters' is case sensitive
+# The preceding line is optional. "GlobalParameters" is case sensitive.

 @output_schema(outputs)

 def run(Inputs, GlobalParameters):
-    # the parameters here have to match those in decorator, both 'Inputs' and
-    # 'GlobalParameters' here are case sensitive
+    # The parameters in the preceding line have to match those in the decorator. "Inputs" and
+    # "GlobalParameters" are case sensitive.
     try:
         data = Inputs['input1']
-        # data will be convert to target format
+        # The data gets converted to the target format.
         assert isinstance(data, np.ndarray)
         result = model.predict(data)
         return result.tolist()
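As context for the hunk above (not part of the commit), the decorated `run(Inputs, GlobalParameters)` signature implies a JSON request body keyed by those case-sensitive names. A minimal sketch of such a payload follows; the values are made up, and a column-oriented encoding is assumed for the pandas input, since the exact encoding depends on how the schema is configured:

```python
import json

# Hypothetical request body matching the run(Inputs, GlobalParameters)
# signature in the script above. Keys are case sensitive.
payload = {
    "Inputs": {
        "input1": [[1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
                   [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]],          # NumPy-typed input
        "input2": {"name": ["Sarah", "John"], "age": [25, 26]},  # pandas-typed input (assumed column orient)
        "input3": 0.0,                                           # plain Python scalar input
    },
    "GlobalParameters": 1.0,  # optional, mirroring the sample above
}

# Serialize the payload as a client would before POSTing it to the scoring URI.
body = json.dumps(payload)
```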
@@ -132,45 +133,44 @@ from azureml.contrib.services.aml_response import AMLResponse
 from PIL import Image
 import json

-
 def init():
     print("This is init()")
-

 @rawhttp
 def run(request):
     print("This is run()")

     if request.method == 'GET':
-        # For this example, just return the URL for GETs.
+        # For this example, return the URL for GET requests.
         respBody = str.encode(request.full_path)
         return AMLResponse(respBody, 200)
     elif request.method == 'POST':
         file_bytes = request.files["image"]
         image = Image.open(file_bytes).convert('RGB')
-        # For a real-world solution, you would load the data from reqBody
+        # For a real-world solution, load the data from the request body
         # and send it to the model. Then return the response.

-        # For demonstration purposes, this example just returns the size of the image as the response.
+        # For demonstration purposes, this example returns the size of the image as the response.
         return AMLResponse(json.dumps(image.size), 200)
     else:
         return AMLResponse("bad request", 500)
 ```

 > [!IMPORTANT]
-> The `AMLRequest` class is in the `azureml.contrib` namespace. Entities in this namespace are in preview. They change frequently as the service undergoes improvements. These entities aren't fully supported by Microsoft.
+> The `AMLRequest` class is in the `azureml.contrib` namespace. Entities in this namespace are in preview. They change frequently while the service undergoes improvements. Microsoft doesn't offer full support for these entities.
 >
-> If you need to test this code in your local development environment, you can install the components by using the following command:
+> If you need to test code that uses this class in your local development environment, you can install the components by using the following command:
 >
 > ```shell
 > pip install azureml-contrib-services
 > ```

 > [!NOTE]
-> We don't recommend using `500` as a custom status code. On the `azureml-fe` side, the status code is rewritten to `502`.
-> * The status code is passed through `azureml-fe` and then sent to the client.
-> * The `azureml-fe` code rewrites the `500` that's returned from the model side as `502`. The client receives a code of `502`.
-> * If the `azureml-fe` code itself returns `500`, the client side still receives a code of `500`.
+> We don't recommend using `500` as a custom status code. On the Azure Machine Learning inference router (`azureml-fe`) side, the status code is rewritten to `502`.
+>
+> * The status code is passed through `azureml-fe` and then sent to the client.
+> * The `azureml-fe` code rewrites the `500` that's returned from the model side as `502`. The client receives a code of `502`.
+> * If the `azureml-fe` code itself returns `500`, the client side still receives a code of `500`.

 When you use the `AMLRequest` class, you can access only the raw posted data in the score.py file. There's no client-side component. From a client, you can post data as usual. For example, the following Python code reads an image file and posts the data:
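As a standard-library illustration (not from the commit) of the kind of POST body a client could send to the `image` field handled by `request.files["image"]` above, here is a hand-rolled `multipart/form-data` encoder. In practice a library such as `requests` builds this encoding for you, and the field and file names here are hypothetical:

```python
import io
import uuid

def build_multipart(field_name, filename, file_bytes):
    """Build a multipart/form-data request body by hand (stdlib only).

    Illustrative sketch only; a real client would usually let an HTTP
    library handle this encoding.
    """
    boundary = uuid.uuid4().hex
    buf = io.BytesIO()
    # Opening boundary and the part headers for the file field.
    buf.write(f"--{boundary}\r\n".encode())
    buf.write(
        f'Content-Disposition: form-data; name="{field_name}"; '
        f'filename="{filename}"\r\n'.encode()
    )
    buf.write(b"Content-Type: application/octet-stream\r\n\r\n")
    # The raw file payload, followed by the closing boundary.
    buf.write(file_bytes)
    buf.write(f"\r\n--{boundary}--\r\n".encode())
    headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}
    return headers, buf.getvalue()

# Hypothetical usage with placeholder bytes standing in for an image file.
headers, body = build_multipart("image", "test.jpg", b"fake-image-bytes")
```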
@@ -240,17 +240,16 @@ def run(request):
 ```

 > [!IMPORTANT]
-> The `AMLRequest` class is in the `azureml.contrib` namespace. Entities in this namespace are in preview. They change frequently as the service undergoes improvements. These entities aren't fully supported by Microsoft.
+> The `AMLRequest` class is in the `azureml.contrib` namespace. Entities in this namespace are in preview. They change frequently while the service undergoes improvements. Microsoft doesn't offer full support for these entities.
 >
-> If you need to test code this in your local development environment, you can install the components by using the following command:
+> If you need to test code that uses this class in your local development environment, you can install the components by using the following command:
 >
 > ```shell
 > pip install azureml-contrib-services
 > ```

 > [!WARNING]
 > Azure Machine Learning only routes POST and GET requests to the containers that run the scoring service. Errors can result if browsers use OPTIONS requests to issue preflight requests.
->

 ## Load registered models
@@ -263,14 +262,14 @@ There are two ways to locate models in your entry script:

 `AZUREML_MODEL_DIR` is an environment variable that's created during service deployment. You can use this environment variable to find the location of deployed models.

-The following table describes the value of `AZUREML_MODEL_DIR` when a varying number of models are deployed:
+The following table describes possible values of `AZUREML_MODEL_DIR` for a varying number of deployed models:

 | Deployment | Environment variable value |
 | ----- | ----- |
 | Single model | The path to the folder that contains the model. |
 | Multiple models | The path to the folder that contains all models. Models are located by name and version in this folder in the format `<model-name>/<version>`. |

-During model registration and deployment, models are placed in the `AZUREML_MODEL_DIR` path, and their original filenames are preserved.
+During model registration and deployment, models are placed in the `AZUREML_MODEL_DIR` path, and their original file names are preserved.

 To get the path to a model file in your entry script, combine the environment variable with the file path you're looking for.
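Beyond the commit's own snippets, the `<model-name>/<version>` layout described in the table can also be discovered programmatically rather than hard-coded. The following self-contained sketch (not part of the commit) simulates that layout in a temporary directory so it runs anywhere; the model names are hypothetical, and in a real deployment `AZUREML_MODEL_DIR` is set by the service rather than by your code:

```python
import os
import tempfile

# Simulate the multiple-model layout described above:
# <deployment root>/<model-name>/<version>/<file>.
root = tempfile.mkdtemp()
for name, version, fname in [
    ("my_first_model", "1", "my_first_model.pkl"),
    ("my_second_model", "2", "my_second_model.pkl"),
]:
    model_dir = os.path.join(root, name, version)
    os.makedirs(model_dir)
    open(os.path.join(model_dir, fname), "wb").close()

# In a real deployment this variable is set for you during service deployment.
os.environ["AZUREML_MODEL_DIR"] = root

# Walk the deployment root and collect every model file, relative to the root.
found = sorted(
    os.path.relpath(os.path.join(dirpath, f), root)
    for dirpath, _, files in os.walk(os.getenv("AZUREML_MODEL_DIR"))
    for f in files
)
```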
@@ -281,10 +280,10 @@ The following example shows you how to find the path when you have a single mode
 ```python
 import os

-# Example when the model is a file
+# In the following example, the model is a file.
 model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')

-# Example when the model is a folder containing a file
+# In the following example, the model is a folder that contains a file.
 file_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'my_model_folder', 'sklearn_regression_model.pkl')
 ```
@@ -313,7 +312,7 @@ In the Docker image that hosts the service, the `AZUREML_MODEL_DIR` environment
 In this example, the path of the first model is `$AZUREML_MODEL_DIR/my_first_model/1/my_first_model.pkl`. The path of the second model is `$AZUREML_MODEL_DIR/my_second_model/2/my_second_model.pkl`.

 ```python
-# Example when the model is a file, and the deployment contains multiple models
+# In the following example, the model is a file, and the deployment contains multiple models.
 first_model_name = 'my_first_model'
 first_model_version = '1'
 first_model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), first_model_name, first_model_version, 'my_first_model.pkl')
@@ -335,16 +334,10 @@ For more entry script examples for specific machine learning use cases, see the
 * [PyTorch](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/ml-frameworks/pytorch)
 * [TensorFlow](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/ml-frameworks/tensorflow)
 * [Keras](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/ml-frameworks/keras/train-hyperparameter-tune-deploy-with-keras/train-hyperparameter-tune-deploy-with-keras.ipynb)
-* [AutoML](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/automated-machine-learning/classification-bank-marketing-all-features)
+* [Automated machine learning](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/automated-machine-learning/classification-bank-marketing-all-features)

 ## Related content

 * [Troubleshooting remote model deployment](how-to-troubleshoot-deployment.md)
-* [Deploy a model to an Azure Kubernetes Service cluster with v1](how-to-deploy-azure-kubernetes-service.md)
 * [Consume an Azure Machine Learning model deployed as a web service](how-to-consume-web-service.md)
 * [Update a deployed web service (v1)](how-to-deploy-update-web-service.md)
-* [Use a custom container to deploy a model to an online endpoint](../how-to-deploy-custom-container.md)
-* [Use TLS to secure a web service through Azure Machine Learning](how-to-secure-web-service.md)
-* [Monitor and collect data from ML web service endpoints](how-to-enable-app-insights.md)
-* [Collect data from models in production](how-to-enable-data-collection.md)
-* [Trigger applications, processes, or CI/CD workflows based on Azure Machine Learning events](../how-to-use-event-grid.md)
