Commit 0877ec1

Merge pull request #217475 from AbeOmor/patch-61
Adding docs for input and output of a model
2 parents d3664e0 + 1496cff commit 0877ec1

1 file changed: +173 -1 lines changed

articles/machine-learning/how-to-manage-models.md

Lines changed: 173 additions & 1 deletion
@@ -54,6 +54,10 @@ Type | Input/Output | `direct` | `download` | `ro_mount`
`mlflow` | Output | ✓ | ✓ | ✓ |

### Follow along in Jupyter Notebooks

You can follow along with this sample in a Jupyter Notebook. In the [azureml-examples](https://github.com/azure/azureml-examples) repository, open the notebook: [model.ipynb](https://github.com/Azure/azureml-examples/blob/main/sdk/python/assets/model/model.ipynb).

## Create a model in the model registry

[Model registration](concept-model-management-and-deployment.md) allows you to store and version your models in the Azure cloud, in your workspace. The model registry helps you organize and keep track of your trained models.
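As a quick illustration of the registration step (this snippet isn't part of this change; it's a minimal sketch that assumes an authenticated `MLClient` and a local `mlflow-model/model.pkl` file), you could register a local file as a `custom_model` with the Python SDK v2:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Model
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential

# Connect to the workspace (the subscription, resource group, and workspace names are placeholders)
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<AML_WORKSPACE_NAME>",
)

# Register a local model file as a versioned custom_model asset
file_model = Model(
    path="mlflow-model/model.pkl",
    type=AssetTypes.CUSTOM_MODEL,
    name="local-file-example",
    description="Model created from local file.",
)
ml_client.models.create_or_update(file_model)
```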
@@ -251,8 +255,176 @@ For a complete example, see the [model notebook](https://github.com/Azure/azurem

To create a model in Machine Learning from the UI, open the **Models** page. Select **Register model**, and select where your model is located. Fill out the required fields, and then select **Register**.

:::image type="content" source="./media/how-to-manage-models/register-model-local.png" alt-text="Screenshot of the UI to register a model." lightbox="./media/how-to-manage-models/register-model-local.png":::

---

## Use model as input in a job

# [Azure CLI](#tab/cli)

Create a job specification YAML file (`<file-name>.yml`). In the `inputs` section of the job, specify:

1. The `type`: whether the model is an `mlflow_model`, `custom_model`, or `triton_model`.
1. The `path` to where your model is located; this can be any of the paths outlined in the [Supported Paths](#supported-paths) section.

```yaml
$schema: https://azuremlschemas.azureedge.net/latest/commandJob.schema.json

# Possible Paths for models:
# AzureML Datastore: azureml://datastores/<datastore-name>/paths/<path_on_datastore>
# MLflow run: runs:/<run-id>/<path-to-model-relative-to-the-root-of-the-artifact-location>
# Job: azureml://jobs/<job-name>/outputs/<output-name>/paths/<path-to-model-relative-to-the-named-output-location>
# Model Asset: azureml:<my_model>:<version>

command: |
  ls ${{inputs.my_model}}
code: <folder where code is located>
inputs:
  my_model:
    type: <type> # mlflow_model, custom_model, or triton_model
    path: <path>
environment: azureml:AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest
compute: azureml:cpu-cluster
```

Next, run the following command in the CLI:

```azurecli
az ml job create -f <file-name>.yml
```
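As an optional follow-up that isn't part of the original article (it assumes your default resource group and workspace are already set with `az configure`), you can check the status of the submitted job from the CLI:

```azurecli
# Show the details and current status of the job created above
az ml job show --name <job-name>
```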
# [Python SDK](#tab/python)

The `Input` class allows you to define:

1. The `type`: whether the model is an `mlflow_model`, `custom_model`, or `triton_model`.
1. The `path` to where your model is located; this can be any of the paths outlined in the [Supported Paths](#supported-paths) section.

```python
from azure.ai.ml import command
from azure.ai.ml.entities import Model
from azure.ai.ml import Input
from azure.ai.ml.constants import AssetTypes
from azure.ai.ml import MLClient

# Possible Asset Types for Model:
# AssetTypes.MLFLOW_MODEL
# AssetTypes.CUSTOM_MODEL
# AssetTypes.TRITON_MODEL

# Possible Paths for Model:
# Local path: mlflow-model/model.pkl
# AzureML Datastore: azureml://datastores/<datastore-name>/paths/<path_on_datastore>
# MLflow run: runs:/<run-id>/<path-to-model-relative-to-the-root-of-the-artifact-location>
# Job: azureml://jobs/<job-name>/outputs/<output-name>/paths/<path-to-model-relative-to-the-named-output-location>
# Model Asset: azureml:<my_model>:<version>

my_job_inputs = {
    "input_model": Input(type=AssetTypes.MLFLOW_MODEL, path="mlflowmodel")
}

job = command(
    code="./src",  # local path where the code is stored
    command="ls ${{inputs.input_model}}",
    inputs=my_job_inputs,
    environment="AzureML-sklearn-0.24-ubuntu18.04-py37-cpu:9",
    compute="cpu-cluster",
)

# submit the command
returned_job = ml_client.jobs.create_or_update(job)
# get a URL for the status of the job
returned_job.services["Studio"].endpoint
```
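Not part of the original example, but if you want the script to wait for the job to finish and stream its logs to the console, you could add the SDK's streaming call (assuming the same `ml_client` and `returned_job` objects as above):

```python
# Block until the job completes, streaming its logs to stdout
ml_client.jobs.stream(returned_job.name)
```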

---

## Use model as output in a job

In your job, you can write a model to your cloud-based storage by using *outputs*.

# [Azure CLI](#tab/cli)

Create a job specification YAML file (`<file-name>.yml`), with the `outputs` section populated with the type and path of where you would like to write your model to:

```yaml
$schema: https://azuremlschemas.azureedge.net/latest/commandJob.schema.json

# Possible Paths for Model:
# Local path: mlflow-model/model.pkl
# AzureML Datastore: azureml://datastores/<datastore-name>/paths/<path_on_datastore>
# MLflow run: runs:/<run-id>/<path-to-model-relative-to-the-root-of-the-artifact-location>
# Job: azureml://jobs/<job-name>/outputs/<output-name>/paths/<path-to-model-relative-to-the-named-output-location>
# Model Asset: azureml:<my_model>:<version>

code: src
command: >-
  python load_write_model.py
  --input_model ${{inputs.input_model}}
  --custom_model_output ${{outputs.output_folder}}
inputs:
  input_model:
    type: <type> # mlflow_model, custom_model, or triton_model
    path: <path>
outputs:
  output_folder:
    type: <type> # mlflow_model, custom_model, or triton_model
environment: azureml:AzureML-sklearn-0.24-ubuntu18.04-py37-cpu:9
compute: azureml:cpu-cluster
```

Next, create a job using the CLI:

```azurecli
az ml job create --file <file-name>.yml
```
# [Python SDK](#tab/python)

```python
from azure.ai.ml import command
from azure.ai.ml.entities import Model
from azure.ai.ml import Input, Output
from azure.ai.ml.constants import AssetTypes

# Possible Asset Types for Model:
# AssetTypes.MLFLOW_MODEL
# AssetTypes.CUSTOM_MODEL
# AssetTypes.TRITON_MODEL

# Possible Paths for Model:
# Local path: mlflow-model/model.pkl
# AzureML Datastore: azureml://datastores/<datastore-name>/paths/<path_on_datastore>
# MLflow run: runs:/<run-id>/<path-to-model-relative-to-the-root-of-the-artifact-location>
# Job: azureml://jobs/<job-name>/outputs/<output-name>/paths/<path-to-model-relative-to-the-named-output-location>
# Model Asset: azureml:<my_model>:<version>

my_job_inputs = {
    "input_model": Input(type=AssetTypes.MLFLOW_MODEL, path="mlflow-model"),
    "input_data": Input(type=AssetTypes.URI_FILE, path="./mlflow-model/input_example.json"),
}

my_job_outputs = {
    "output_folder": Output(type=AssetTypes.CUSTOM_MODEL)
}

job = command(
    code="./src",  # local path where the code is stored
    command="python load_write_model.py --input_model ${{inputs.input_model}} --output_folder ${{outputs.output_folder}}",
    inputs=my_job_inputs,
    outputs=my_job_outputs,
    environment="AzureML-sklearn-0.24-ubuntu18.04-py37-cpu:9",
    compute="cpu-cluster",
)

# submit the command
returned_job = ml_client.create_or_update(job)
# get a URL for the status of the job
returned_job.services["Studio"].endpoint
```

---
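Both tabs above pass the model to a `load_write_model.py` script in the `src` folder, which this change doesn't include. As a rough, hypothetical sketch only (the argument names and the copy logic are assumptions, not taken from this PR), such a script might read the mounted input model and write it back out to the job's output folder:

```python
# load_write_model.py -- hypothetical sketch, not part of this PR
import argparse
import shutil
from pathlib import Path

parser = argparse.ArgumentParser()
parser.add_argument("--input_model", type=str, help="mounted path of the input model")
parser.add_argument("--output_folder", type=str, help="mounted path of the job output")
args = parser.parse_args()

input_path = Path(args.input_model)
output_path = Path(args.output_folder)
output_path.mkdir(parents=True, exist_ok=True)

# Copy every file from the input model location into the output folder,
# preserving the relative directory structure.
for item in input_path.rglob("*"):
    if item.is_file():
        destination = output_path / item.relative_to(input_path)
        destination.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy(item, destination)
        print(f"Copied {item} -> {destination}")
```

Note that the Azure CLI example names the output argument `--custom_model_output` while the Python SDK example uses `--output_folder`; whichever name the job command passes is the one the script needs to parse.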
## Next steps

* [Install and set up Python SDK v2](https://aka.ms/sdk-v2-install)
