articles/machine-learning/tutorial-train-deploy-model-cli.md (+7 −5)
@@ -241,7 +241,7 @@ The output of this command is similar to the following JSON:
> [!IMPORTANT]
> Copy the value of the `id` entry, as it is used in the next section.

-To check out a more comprehensive template for the JSON file that describe a dataset, use the following command:
+To see a more comprehensive template for a dataset, use the following command:
```azurecli-interactive
az ml dataset register --show-template
```
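The tip above says to copy the `id` value from the registration output by hand. As a minimal sketch of capturing it programmatically (the JSON here is illustrative; the real `az ml dataset register` output is a larger document that contains an `id` entry among other fields):

```python
import json

# Illustrative registration output; the real command prints a larger JSON
# document, but it does contain an `id` entry as described above.
registration_output = '{"id": "dataset-id-0000", "name": "mnist dataset"}'

dataset = json.loads(registration_output)
dataset_id = dataset["id"]  # this is the value used in the next section
print(dataset_id)
```

In practice you would pipe the CLI output into a tool like this instead of pasting the ID manually.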
@@ -283,17 +283,17 @@ data:

Change the value of the `id` entry to match the value returned when you registered the dataset. This value is used to load the data into the compute target during training.

-This YAML does the following:
+This YAML results in the following actions during training:

-* Mounts the dataset (based on the ID of the dataset) in the training environment, and stores the path to the mount point in the `mnist` environment variable..
+* Mounts the dataset (based on the ID of the dataset) in the training environment, and stores the path to the mount point in the `mnist` environment variable.
* Passes the location of the data (mount point) inside the training environment to the script using the `--data-folder` argument.

The runconfig file also contains information used to configure the environment used by the training run. If you inspect this file, you'll see that it references the `cpu-compute` compute target you created earlier. It also lists the number of nodes to use when training (`"nodeCount": "4"`), and contains a `"condaDependencies"` section that lists the Python packages needed to run the training script.

> [!TIP]
> While it is possible to manually create a runconfig file, the one in this example was created using the `generate-runconfig.py` file included in the repository. This file gets a reference to the registered dataset, creates a run config programmatically, and then persists it to a file.

-For more information on run configuration files, see [Set up and use compute targets for model training](how-to-set-up-training-targets.md#create-run-configuration-and-submit-run-using-azure-machine-learning-cli), or reference this [JSON file](https://github.com/microsoft/MLOps/blob/b4bdcf8c369d188e83f40be8b748b49821f71cf2/infra-as-code/runconfigschema.json) to see the full schema for a runconfig.
+For more information on run configuration files, see [Set up and use compute targets for model training](how-to-set-up-training-targets.md#create-run-configuration-and-submit-run-using-azure-machine-learning-cli). For a complete JSON reference, see the [runconfigschema.json](https://github.com/microsoft/MLOps/blob/b4bdcf8c369d188e83f40be8b748b49821f71cf2/infra-as-code/runconfigschema.json) file.

## Submit the training run
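A quick way to sanity-check the settings described above is to load the runconfig and inspect them. This sketch assumes a simplified JSON layout with the fields the paragraph names (`target`, `nodeCount`, `condaDependencies`); the real file generated by `generate-runconfig.py` contains many more settings:

```python
import json

# Simplified runconfig fragment; field names follow the description above,
# but the layout and values here are illustrative assumptions.
runconfig = json.loads("""
{
    "target": "cpu-compute",
    "nodeCount": "4",
    "environment": {
        "python": {
            "condaDependencies": {
                "dependencies": ["python=3.6.2", "scikit-learn"]
            }
        }
    }
}
""")

print("compute target:", runconfig["target"])
print("nodes:", runconfig["nodeCount"])
print("packages:", runconfig["environment"]["python"]["condaDependencies"]["dependencies"])
```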
@@ -374,7 +374,9 @@ az ml model deploy -n myservice -m "mymodel:1" --ic inferenceConfig.yml --dc aci

This command deploys a new service named `myservice`, using version 1 of the model that you registered previously.

-The `inferenceConfig.yml` file provides information on how to perform inference, such as the entry script (`score.py`) and software dependencies. For more information on the structure of this file, see the [Inference configuration schema](reference-azure-machine-learning-cli.md#inference-configuration-schema). For more information on entry scripts, see [Deploy models with the Azure Machine Learning](how-to-deploy-and-where.md#prepare-to-deploy).
+The `inferenceConfig.yml` file provides information on how to use the model for inference. For example, it references the entry script (`score.py`) and software dependencies.
+
+For more information on the structure of this file, see the [Inference configuration schema](reference-azure-machine-learning-cli.md#inference-configuration-schema). For more information on entry scripts, see [Deploy models with Azure Machine Learning](how-to-deploy-and-where.md#prepare-to-deploy).

The `aciDeploymentConfig.yml` describes the deployment environment used to host the service. The deployment configuration is specific to the compute type that you use for the deployment. In this case, an Azure Container Instance is used. For more information, see the [Deployment configuration schema](reference-azure-machine-learning-cli.md#deployment-configuration-schema).
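As a sketch of what an ACI deployment configuration might contain (the field names follow the deployment configuration schema linked above, but the resource values here are illustrative assumptions), a minimal config can also be generated programmatically:

```python
import json

# Minimal ACI deployment configuration sketch; `computeType` selects Azure
# Container Instances, and the CPU/memory values are illustrative.
deployment_config = {
    "computeType": "aci",
    "containerResourceRequirements": {
        "cpu": 0.5,
        "memoryInGB": 1.0,
    },
}

with open("aciDeploymentConfig.json", "w") as f:
    json.dump(deployment_config, f, indent=4)

print(open("aciDeploymentConfig.json").read())
```

The resulting file is passed to the deploy command through the `--dc` parameter.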
includes/machine-learning-service-inference-config.md (+57 −22)
@@ -2,7 +2,7 @@
author: Blackmist
ms.service: machine-learning
ms.topic: include
-ms.date: 11/06/2019
+ms.date: 01/28/2020
ms.author: larryfr
---
@@ -11,15 +11,15 @@ The entries in the `inferenceconfig.json` document map to the parameters for the
| JSON entity | Method parameter | Description |
| ----- | ----- | ----- |
|`entryScript`|`entry_script`| Path to a local file that contains the code to run for the image. |
-|`runtime`|`runtime`| Optional. Which runtime to use for the image. Current supported runtimes are `spark-py` and `python`. If `environment` is set, this gets ignored. |
-|`condaFile`|`conda_file`| Optional. Path to a local file that contains a Conda environment definition to use for the image. If `environment` is set, this gets ignored. |
-|`extraDockerFileSteps`|`extra_docker_file_steps`| Optional. Path to a local file that contains additional Docker steps to run when setting up the image. If `environment` is set, this gets ignored.|
-|`sourceDirectory`|`source_directory`| Optional. Path to folders that contain all files to create the image which makes it easy to access any files within this folder or subfolder. You can upload an entire folder from your local machine as dependencies for the Webservice. Note: your entry_script, conda_file, and extra_docker_file_steps paths are relative paths to the source_directory path. |
-|`enableGpu`|`enable_gpu`| Optional. Whether to enable GPU support in the image. The GPU image must be used on an Azure service, like Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. The default is False. If `environment` is set, this gets ignored.|
-|`baseImage`|`base_image`| Optional. Custom image to be used as a base image. If no base image is provided, the image will be based on the provided runtime parameter. If `environment` is set, this gets ignored. |
-|`baseImageRegistry`|`base_image_registry`| Optional. Image registry that contains the base image. If `environment` is set, this gets ignored.|
-|`cudaVersion`|`cuda_version`| Optional. Version of CUDA to install for images that need GPU support. The GPU image must be used on an Azure service, like Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. Supported versions are 9.0, 9.1, and 10.0. If `enable_gpu` is set, the default is 9.1. If `environment` is set, this gets ignored. |
-|`description`|`description`| A description for the image. If `environment` is set, this gets ignored. |
+|`runtime`|`runtime`| Optional. Which runtime to use for the image. Supported runtimes are `spark-py` and `python`. If `environment` is set, this entry is ignored. |
+|`condaFile`|`conda_file`| Optional. Path to a local file that contains a Conda environment definition to use for the image. If `environment` is set, this entry is ignored. |
+|`extraDockerFileSteps`|`extra_docker_file_steps`| Optional. Path to a local file that contains additional Docker steps to run when setting up the image. If `environment` is set, this entry is ignored.|
+|`sourceDirectory`|`source_directory`| Optional. Path to folders that contain all files to create the image, which makes it easy to access any files within this folder or subfolder. You can upload an entire folder from your local machine as dependencies for the Webservice. Note: your entry_script, conda_file, and extra_docker_file_steps paths are relative paths to the source_directory path. |
+|`enableGpu`|`enable_gpu`| Optional. Whether to enable GPU support in the image. The GPU image must be used on an Azure service, such as Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, or Azure Kubernetes Service. The default is False. If `environment` is set, this entry is ignored.|
+|`baseImage`|`base_image`| Optional. Custom image to be used as a base image. If no base image is provided, the image will be based on the provided runtime parameter. If `environment` is set, this entry is ignored. |
+|`baseImageRegistry`|`base_image_registry`| Optional. Image registry that contains the base image. If `environment` is set, this entry is ignored.|
+|`cudaVersion`|`cuda_version`| Optional. Version of CUDA to install for images that need GPU support. The GPU image must be used on an Azure service, such as Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, or Azure Kubernetes Service. Supported versions are 9.0, 9.1, and 10.0. If `enable_gpu` is set, the default is 9.1. If `environment` is set, this entry is ignored. |
+|`description`|`description`| A description for the image. If `environment` is set, this entry is ignored. |

The following JSON is an example inference configuration for use with the CLI:
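Several rows in the table above note that an entry is ignored when `environment` is set. A small sketch of that precedence rule (the helper function and the config values are hypothetical, written only to model the documented behavior):

```python
import json

# Entries the table marks as "ignored if `environment` is set".
ENV_OVERRIDDEN = {"runtime", "condaFile", "extraDockerFileSteps",
                  "enableGpu", "baseImage", "baseImageRegistry",
                  "cudaVersion", "description"}

def effective_entries(config):
    """Return the entries that actually take effect, per the table above."""
    if "environment" in config:
        return {k: v for k, v in config.items() if k not in ENV_OVERRIDDEN}
    return dict(config)

config = json.loads(
    '{"entryScript": "score.py", "runtime": "python", '
    '"environment": {"name": "myenv"}}'
)
print(effective_entries(config))  # `runtime` is dropped because `environment` is set
```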
@@ -37,30 +37,65 @@ The following JSON is an example inference configuration for use with the CLI:
}
```

-The following JSON is an example inference configuration that uses an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) with a specific version for use with the CLI:
+You can include full specifications of an Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) in the inference configuration file. If this environment doesn't exist in your workspace, Azure Machine Learning will create it. Otherwise, Azure Machine Learning will update the environment if necessary. The following JSON is an example:

The following JSON is an example inference configuration that uses an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) with the latest version for use with the CLI:

+You can also use an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) in separate CLI parameters and remove the "environment" key from the inference configuration file. Use `-e` for the environment name, and `--ev` for the environment version. If you don't specify `--ev`, the latest version will be used. Here is an example of an inference configuration file:

```json
{
    "entryScript": "score.py",
-    "environment": {
-        "name": "myenv",
-        "version": null
-    },
-    "condaFile": "myenv.yml",
    "sourceDirectory": null
}
```
+
+The following command demonstrates how to deploy a model using the previous inference configuration file (named myInferenceConfig.json).
+
+It also uses the latest version of an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) (named AzureML-Minimal).
+
+```azurecli-interactive
+az ml model deploy -m mymodel:1 --ic myInferenceConfig.json -e AzureML-Minimal --dc deploymentconfig.json
+```
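To tie the pieces together, this sketch writes the inference configuration from the example above and assembles the same deploy command as an argument list (file and environment names mirror the example; nothing here calls Azure):

```python
import json

# Inference configuration from the example above; the environment itself is
# supplied on the command line with -e instead of inside this file.
inference_config = {
    "entryScript": "score.py",
    "sourceDirectory": None,
}

with open("myInferenceConfig.json", "w") as f:
    json.dump(inference_config, f, indent=4)

# The deploy command from the hunk above, assembled as an argument list
# suitable for subprocess.run in a deployment script.
cmd = ["az", "ml", "model", "deploy",
       "-m", "mymodel:1",
       "--ic", "myInferenceConfig.json",
       "-e", "AzureML-Minimal",
       "--dc", "deploymentconfig.json"]
print(" ".join(cmd))
```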