includes/machine-learning-service-inference-config.md
---
author: Blackmist
ms.service: machine-learning
ms.topic: include
ms.date: 01/28/2020
ms.author: larryfr
---
The entries in the `inferenceconfig.json` document map to the parameters for the `InferenceConfig` class:

| JSON entity | Method parameter | Description |
| ----- | ----- | ----- |
| `entryScript` | `entry_script` | Path to a local file that contains the code to run for the image. |
| `runtime` | `runtime` | Optional. Which runtime to use for the image. Supported runtimes are `spark-py` and `python`. If `environment` is set, this entry is ignored. |
| `condaFile` | `conda_file` | Optional. Path to a local file that contains a Conda environment definition to use for the image. If `environment` is set, this entry is ignored. |
| `extraDockerFileSteps` | `extra_docker_file_steps` | Optional. Path to a local file that contains additional Docker steps to run when setting up the image. If `environment` is set, this entry is ignored. |
| `sourceDirectory` | `source_directory` | Optional. Path to the folder that contains all files needed to create the image, which makes it easy to access any files within this folder or its subfolders. You can upload an entire folder from your local machine as dependencies for the web service. Note: your `entry_script`, `conda_file`, and `extra_docker_file_steps` paths are relative to the `source_directory` path. |
| `enableGpu` | `enable_gpu` | Optional. Whether to enable GPU support in the image. The GPU image must be used on an Azure service, such as Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, or Azure Kubernetes Service. The default is False. If `environment` is set, this entry is ignored. |
| `baseImage` | `base_image` | Optional. Custom image to be used as a base image. If no base image is provided, the image is based on the provided runtime parameter. If `environment` is set, this entry is ignored. |
| `baseImageRegistry` | `base_image_registry` | Optional. Image registry that contains the base image. If `environment` is set, this entry is ignored. |
| `cudaVersion` | `cuda_version` | Optional. Version of CUDA to install for images that need GPU support. The GPU image must be used on an Azure service, such as Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, or Azure Kubernetes Service. Supported versions are 9.0, 9.1, and 10.0. If `enable_gpu` is set, the default is 9.1. If `environment` is set, this entry is ignored. |
| `description` | `description` | A description for the image. If `environment` is set, this entry is ignored. |
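The camelCase JSON entities above map one-to-one onto snake_case method parameters. As a minimal sketch of that mapping, the following hypothetical helper (not part of the Azure Machine Learning SDK) translates a JSON configuration into keyword arguments:

```python
import json

# Mapping from inferenceconfig.json entities to method parameters,
# exactly as listed in the table above.
KEY_MAP = {
    "entryScript": "entry_script",
    "runtime": "runtime",
    "condaFile": "conda_file",
    "extraDockerFileSteps": "extra_docker_file_steps",
    "sourceDirectory": "source_directory",
    "enableGpu": "enable_gpu",
    "baseImage": "base_image",
    "baseImageRegistry": "base_image_registry",
    "cudaVersion": "cuda_version",
    "description": "description",
}

def to_method_params(config_json: str) -> dict:
    """Translate an inferenceconfig.json document into keyword arguments."""
    raw = json.loads(config_json)
    unknown = set(raw) - set(KEY_MAP)
    if unknown:
        raise ValueError(f"Unknown entries: {sorted(unknown)}")
    return {KEY_MAP[key]: value for key, value in raw.items()}

params = to_method_params('{"entryScript": "score.py", "runtime": "python"}')
print(params)  # {'entry_script': 'score.py', 'runtime': 'python'}
```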
The following JSON is an example inference configuration for use with the CLI:
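The original example file is not shown in this excerpt; the following is an illustrative sketch assembled from the table entries above, with placeholder file names:

```json
{
    "entryScript": "score.py",
    "runtime": "python",
    "condaFile": "myenv.yml",
    "sourceDirectory": "./source_dir"
}
```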
You can also use an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) in the inference configuration.
The following command demonstrates how to deploy a model by using the previous inference configuration file (named myInferenceConfig.json). It also uses the latest version of an existing Azure Machine Learning [environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment.environment?view=azure-ml-py) (named AzureML-Minimal):
```azurecli-interactive
az ml model deploy -m mymodel:1 --ic myInferenceConfig.json -e AzureML-Minimal --dc deploymentconfig.json
```