
Commit 6a75169

Merge pull request #102504 from Blackmist/model-inference-config

Model inference config

2 parents 43e96cb + 33439d7

File tree

2 files changed: +64 −27 lines changed


articles/machine-learning/tutorial-train-deploy-model-cli.md

Lines changed: 7 additions & 5 deletions
@@ -241,7 +241,7 @@ The output of this command is similar to the following JSON:
 > [!IMPORTANT]
 > Copy the value of the `id` entry, as it is used in the next section.
 
-To check out a more comprehensive template for the JSON file that describe a dataset, use the following command:
+To see a more comprehensive template for a dataset, use the following command:
 ```azurecli-interactive
 az ml dataset register --show-template
 ```
@@ -283,17 +283,17 @@ data:
 
 Change the value of the `id` entry to match the value returned when you registered the dataset. This value is used to load the data into the compute target during training.
 
-This YAML does the following:
+This YAML results in the following actions during training:
 
-* Mounts the dataset (based on the ID of the dataset) in the training environment, and stores the path to the mount point in the `mnist` environment variable..
+* Mounts the dataset (based on the ID of the dataset) in the training environment, and stores the path to the mount point in the `mnist` environment variable.
 * Passes the location of the data (mount point) inside the training environment to the script using the `--data-folder` argument.
 
 The runconfig file also contains information used to configure the environment used by the training run. If you inspect this file, you'll see that it references the `cpu-compute` compute target you created earlier. It also lists the number of nodes to use when training (`"nodeCount": "4"`), and contains a `"condaDependencies"` section that lists the Python packages needed to run the training script.
 
 > [!TIP]
 > While it is possible to manually create a runconfig file, the one in this example was created using the `generate-runconfig.py` file included in the repository. This file gets a reference to the registered dataset, creates a run config programmatically, and then persists it to file.
 
-For more information on run configuration files, see [Set up and use compute targets for model training](how-to-set-up-training-targets.md#create-run-configuration-and-submit-run-using-azure-machine-learning-cli), or reference this [JSON file](https://github.com/microsoft/MLOps/blob/b4bdcf8c369d188e83f40be8b748b49821f71cf2/infra-as-code/runconfigschema.json) to see the full schema for a runconfig.
+For more information on run configuration files, see [Set up and use compute targets for model training](how-to-set-up-training-targets.md#create-run-configuration-and-submit-run-using-azure-machine-learning-cli). For a complete JSON reference, see the [runconfigschema.json](https://github.com/microsoft/MLOps/blob/b4bdcf8c369d188e83f40be8b748b49821f71cf2/infra-as-code/runconfigschema.json).
 
 ## Submit the training run
 
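The `generate-runconfig.py` script itself is not part of this diff. As a rough illustration only (the helper name, field values, dataset ID, and output path below are assumptions, not taken from the repository), a script that builds a runconfig-style document programmatically and persists it to file might look like this:

```python
# Illustrative sketch only -- this is NOT the repository's generate-runconfig.py.
# Field names echo those the article mentions ("nodeCount", "condaDependencies");
# the dataset ID and output path are placeholders.
import json

def build_runconfig(dataset_id: str, target: str = "cpu-compute", nodes: int = 4) -> dict:
    """Assemble a minimal runconfig-style document as a plain dict."""
    return {
        "target": target,
        # The article shows nodeCount serialized as a string ("4").
        "nodeCount": str(nodes),
        "data": {
            # Mount the registered dataset under the "mnist" name.
            "mnist": {"dataLocation": {"dataset": {"id": dataset_id}}},
        },
        "condaDependencies": {
            "dependencies": ["python=3.6.2", {"pip": ["azureml-defaults"]}],
        },
    }

# Placeholder ID; in practice this comes from the dataset registration output.
config = build_runconfig("00000000-0000-0000-0000-000000000000")
with open("mnist.runconfig", "w") as f:
    json.dump(config, f, indent=4)
```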
@@ -374,7 +374,9 @@ az ml model deploy -n myservice -m "mymodel:1" --ic inferenceConfig.yml --dc aci
 
 This command deploys a new service named `myservice`, using version 1 of the model that you registered previously.
 
-The `inferenceConfig.yml` file provides information on how to perform inference, such as the entry script (`score.py`) and software dependencies. For more information on the structure of this file, see the [Inference configuration schema](reference-azure-machine-learning-cli.md#inference-configuration-schema). For more information on entry scripts, see [Deploy models with the Azure Machine Learning](how-to-deploy-and-where.md#prepare-to-deploy).
+The `inferenceConfig.yml` file provides information on how to use the model for inference. For example, it references the entry script (`score.py`) and software dependencies.
+
+For more information on the structure of this file, see the [Inference configuration schema](reference-azure-machine-learning-cli.md#inference-configuration-schema). For more information on entry scripts, see [Deploy models with Azure Machine Learning](how-to-deploy-and-where.md#prepare-to-deploy).
 
 The `aciDeploymentConfig.yml` describes the deployment environment used to host the service. The deployment configuration is specific to the compute type that you use for the deployment. In this case, an Azure Container Instance is used. For more information, see the [Deployment configuration schema](reference-azure-machine-learning-cli.md#deployment-configuration-schema).
 
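For orientation, a minimal deployment configuration of the kind `aciDeploymentConfig.yml` describes might look like the following sketch in JSON form. The values here are illustrative assumptions, not the file from the tutorial; see the Deployment configuration schema linked above for the authoritative list of fields.

```json
{
    "computeType": "aci",
    "containerResourceRequirements": {
        "cpu": 1,
        "memoryInGB": 1
    },
    "authEnabled": false
}
```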
includes/machine-learning-service-inference-config.md

Lines changed: 57 additions & 22 deletions
@@ -2,7 +2,7 @@
 author: Blackmist
 ms.service: machine-learning
 ms.topic: include
-ms.date: 11/06/2019
+ms.date: 01/28/2020
 ms.author: larryfr
 ---
 
@@ -11,15 +11,15 @@ The entries in the `inferenceconfig.json` document map to the parameters for the
 | JSON entity | Method parameter | Description |
 | ----- | ----- | ----- |
 | `entryScript` | `entry_script` | Path to a local file that contains the code to run for the image. |
-| `runtime` | `runtime` | Optional. Which runtime to use for the image. Current supported runtimes are `spark-py` and `python`. If `environment` is set, this gets ignored. |
-| `condaFile` | `conda_file` | Optional. Path to a local file that contains a Conda environment definition to use for the image. If `environment` is set, this gets ignored. |
-| `extraDockerFileSteps` | `extra_docker_file_steps` | Optional. Path to a local file that contains additional Docker steps to run when setting up the image. If `environment` is set, this gets ignored.|
-| `sourceDirectory` | `source_directory` | Optional. Path to folders that contain all files to create the image which makes it easy to access any files within this folder or subfolder. You can upload an entire folder from your local machine as dependencies for the Webservice. Note: your entry_script, conda_file, and extra_docker_file_steps paths are relative paths to the source_directory path. |
-| `enableGpu` | `enable_gpu` | Optional. Whether to enable GPU support in the image. The GPU image must be used on an Azure service, like Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. The default is False. If `environment` is set, this gets ignored.|
-| `baseImage` | `base_image` | Optional. Custom image to be used as a base image. If no base image is provided, the image will be based on the provided runtime parameter. If `environment` is set, this gets ignored. |
-| `baseImageRegistry` | `base_image_registry` | Optional. Image registry that contains the base image. If `environment` is set, this gets ignored.|
-| `cudaVersion` | `cuda_version` | Optional. Version of CUDA to install for images that need GPU support. The GPU image must be used on an Azure service, like Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. Supported versions are 9.0, 9.1, and 10.0. If `enable_gpu` is set, the default is 9.1. If `environment` is set, this gets ignored. |
-| `description` | `description` | A description for the image. If `environment` is set, this gets ignored. |
+| `runtime` | `runtime` | Optional. Which runtime to use for the image. Supported runtimes are `spark-py` and `python`. If `environment` is set, this entry is ignored. |
+| `condaFile` | `conda_file` | Optional. Path to a local file that contains a Conda environment definition to use for the image. If `environment` is set, this entry is ignored. |
+| `extraDockerFileSteps` | `extra_docker_file_steps` | Optional. Path to a local file that contains additional Docker steps to run when setting up the image. If `environment` is set, this entry is ignored.|
+| `sourceDirectory` | `source_directory` | Optional. Path to folders that contain all files to create the image, which makes it easy to access any files within this folder or subfolder. You can upload an entire folder from your local machine as dependencies for the Webservice. Note: your entry_script, conda_file, and extra_docker_file_steps paths are relative paths to the source_directory path. |
+| `enableGpu` | `enable_gpu` | Optional. Whether to enable GPU support in the image. The GPU image must be used on an Azure service. For example, Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. The default is False. If `environment` is set, this entry is ignored.|
+| `baseImage` | `base_image` | Optional. Custom image to be used as a base image. If no base image is provided, the image will be based on the provided runtime parameter. If `environment` is set, this entry is ignored. |
+| `baseImageRegistry` | `base_image_registry` | Optional. Image registry that contains the base image. If `environment` is set, this entry is ignored.|
+| `cudaVersion` | `cuda_version` | Optional. Version of CUDA to install for images that need GPU support. The GPU image must be used on an Azure service. For example, Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. Supported versions are 9.0, 9.1, and 10.0. If `enable_gpu` is set, the default is 9.1. If `environment` is set, this entry is ignored. |
+| `description` | `description` | A description for the image. If `environment` is set, this entry is ignored. |
 | `environment` | `environment` | Optional. Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py).|
 
 The following JSON is an example inference configuration for use with the CLI:
@@ -37,30 +37,65 @@ The following JSON is an example inference configuration for use with the CLI:
 }
 ```
 
-The following JSON is an example inference configuration that uses an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) with a specific version for use with the CLI:
+You can include the full specification of an Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) in the inference configuration file. If this environment doesn't exist in your workspace, Azure Machine Learning will create it. Otherwise, Azure Machine Learning will update the environment if necessary. The following JSON is an example:
 
 ```json
 {
     "entryScript": "score.py",
-    "environment":{
-        "name": "myenv",
+    "environment": {
+        "docker": {
+            "arguments": [],
+            "baseDockerfile": null,
+            "baseImage": "mcr.microsoft.com/azureml/base:intelmpi2018.3-ubuntu16.04",
+            "enabled": false,
+            "sharedVolumes": true,
+            "shmSize": null
+        },
+        "environmentVariables": {
+            "EXAMPLE_ENV_VAR": "EXAMPLE_VALUE"
+        },
+        "name": "my-deploy-env",
+        "python": {
+            "baseCondaEnvironment": null,
+            "condaDependencies": {
+                "channels": [
+                    "conda-forge"
+                ],
+                "dependencies": [
+                    "python=3.6.2",
+                    {
+                        "pip": [
+                            "azureml-defaults",
+                            "azureml-telemetry",
+                            "scikit-learn",
+                            "inference-schema[numpy-support]"
+                        ]
+                    }
+                ],
+                "name": "project_environment"
+            },
+            "condaDependenciesFile": null,
+            "interpreterPath": "python",
+            "userManagedDependencies": false
+        },
         "version": "1"
-    },
-    "condaFile": "myenv.yml",
-    "sourceDirectory": null
+    }
 }
 ```
 
-The following JSON is an example inference configuration that uses an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) with latest version for use with the CLI:
+You can also use an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) by passing it as separate CLI parameters and removing the "environment" key from the inference configuration file. Use -e for the environment name, and --ev for the environment version. If you don't specify --ev, the latest version is used. Here is an example of an inference configuration file:
 
 ```json
 {
     "entryScript": "score.py",
-    "environment":{
-        "name": "myenv",
-        "version": null
-    },
-    "condaFile": "myenv.yml",
     "sourceDirectory": null
 }
 ```
+
+The following command demonstrates how to deploy a model using the previous inference configuration file (named myInferenceConfig.json).
+
+It also uses the latest version of an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) (named AzureML-Minimal).
+
+```azurecli-interactive
+az ml model deploy -m mymodel:1 --ic myInferenceConfig.json -e AzureML-Minimal --dc deploymentconfig.json
+```
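After a deployment like the one above succeeds, you can inspect the service from the same CLI extension. As a usage sketch (the service name `myservice` is carried over from the earlier example in this commit):

```azurecli-interactive
az ml service show -n myservice
az ml service get-logs -n myservice
```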
