Commit 7032a9a: switch to main branch
1 parent 359eb8c

34 files changed: +296 −296 lines changed

articles/machine-learning/how-to-access-resources-from-endpoints-managed-identities.md

Lines changed: 46 additions & 46 deletions
Large diffs are not rendered by default.

articles/machine-learning/how-to-autoscale-endpoints.md

Lines changed: 8 additions & 8 deletions
@@ -36,15 +36,15 @@ To enable autoscale for an endpoint, you first define an autoscale profile. This
 
 The following snippet sets the endpoint and deployment names:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/deploy-moe-autoscale.sh" ID="set_endpoint_deployment_name" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-moe-autoscale.sh" ID="set_endpoint_deployment_name" :::
 
 Next, get the Azure Resource Manager ID of the deployment and endpoint:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/deploy-moe-autoscale.sh" ID="set_other_env_variables" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-moe-autoscale.sh" ID="set_other_env_variables" :::
 
 The following snippet creates the autoscale profile:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/deploy-moe-autoscale.sh" ID="create_autoscale_profile" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-moe-autoscale.sh" ID="create_autoscale_profile" :::
 
 > [!NOTE]
 > For more, see the [reference page for autoscale](/cli/azure/monitor/autoscale?view=azure-cli-latest&preserve-view=true)
@@ -76,7 +76,7 @@ A common scaling out rule is one that increases the number of VM instances when
 
 # [Azure CLI](#tab/azure-cli)
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/deploy-moe-autoscale.sh" ID="scale_out_on_cpu_util" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-moe-autoscale.sh" ID="scale_out_on_cpu_util" :::
 
 The rule is part of the `my-scale-settings` profile (`autoscale-name` matches the `name` of the profile). The value of its `condition` argument says the rule should trigger when "The average CPU consumption among the VM instances exceeds 70% for five minutes." When that condition is satisfied, two more VM instances are allocated.
 
@@ -104,7 +104,7 @@ When load is light, a scaling in rule can reduce the number of VM instances. The
 
 # [Azure CLI](#tab/azure-cli)
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/deploy-moe-autoscale.sh" ID="scale_in_on_cpu_util" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-moe-autoscale.sh" ID="scale_in_on_cpu_util" :::
 
 # [Portal](#tab/azure-portal)
 
@@ -131,7 +131,7 @@ The previous rules applied to the deployment. Now, add a rule that applies to th
 
 # [Azure CLI](#tab/azure-cli)
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/deploy-moe-autoscale.sh" ID="scale_up_on_request_latency" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-moe-autoscale.sh" ID="scale_up_on_request_latency" :::
 
 # [Portal](#tab/azure-portal)
 
@@ -157,7 +157,7 @@ You can also create rules that apply only on certain days or at certain times. I
 
 # [Azure CLI](#tab/azure-cli)
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/deploy-moe-autoscale.sh" ID="weekend_profile" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-moe-autoscale.sh" ID="weekend_profile" :::
 
 # [Portal](#tab/azure-portal)
 
@@ -176,7 +176,7 @@ From the bottom of the page, select __+ Add a scale condition__. On the new scal
 
 If you are not going to use your deployments, delete them:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/deploy-managed-online-endpoint.sh" ID="delete_endpoint" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-managed-online-endpoint.sh" ID="delete_endpoint" :::
 
 ## Next steps

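The `deploy-moe-autoscale.sh` snippets referenced in this file wrap generic `az monitor autoscale` commands. The following is a minimal sketch of that flow, not the actual script: the profile name, instance counts, and the `CpuUtilizationPercentage` metric are illustrative assumptions, and `$SUBSCRIPTION_ID`, `$RESOURCE_GROUP`, and `$WORKSPACE` are assumed to be set already.

```shell
# Assumed placeholder names -- substitute your own.
ENDPOINT_NAME=my-endpoint
DEPLOYMENT_NAME=blue

# ARM ID of the deployment that the autoscale settings attach to
# (assumes $SUBSCRIPTION_ID, $RESOURCE_GROUP, and $WORKSPACE are set).
DEPLOYMENT_RESOURCE_ID="/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/onlineEndpoints/$ENDPOINT_NAME/deployments/$DEPLOYMENT_NAME"

# Create an autoscale profile: 2-5 instances, starting at 2.
az monitor autoscale create \
  --name my-scale-settings \
  --resource "$DEPLOYMENT_RESOURCE_ID" \
  --min-count 2 --max-count 5 --count 2

# Scale out by 2 instances when average CPU exceeds 70% for 5 minutes.
az monitor autoscale rule create \
  --autoscale-name my-scale-settings \
  --condition "CpuUtilizationPercentage > 70 avg 5m" \
  --scale out 2

# Scale back in by 1 instance when average CPU drops below 30% for 5 minutes.
az monitor autoscale rule create \
  --autoscale-name my-scale-settings \
  --condition "CpuUtilizationPercentage < 30 avg 5m" \
  --scale in 1
```

The `--condition` grammar (`<metric> <operator> <threshold> avg <window>`) is the standard `az monitor autoscale rule create` syntax; rules inherit the target resource from the profile.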
articles/machine-learning/how-to-configure-cli.md

Lines changed: 14 additions & 14 deletions
@@ -29,73 +29,73 @@ The `ml` extension (preview) to the [Azure CLI](/cli/azure/) is the enhanced int
 
 The new Machine Learning extension **requires Azure CLI version `>=2.15.0`**. Ensure this requirement is met:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/misc.sh" id="az_version":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_version":::
 
 If it isn't, [upgrade your Azure CLI](/cli/azure/update-azure-cli).
 
 Check the Azure CLI extensions you've installed:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/misc.sh" id="az_extension_list":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_extension_list":::
 
 Ensure no conflicting extension using the `ml` namespace is installed, including the `azure-cli-ml` extension:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/misc.sh" id="az_extension_remove":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_extension_remove":::
 
 Now, install the `ml` extension:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/setup.sh" id="az_ml_install":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/setup.sh" id="az_ml_install":::
 
 Run the help command to verify your installation and see available subcommands:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/misc.sh" id="az_ml_verify":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_ml_verify":::
 
 You can upgrade the extension to the latest version:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/misc.sh" id="az_ml_update":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_ml_update":::
 
 ### Installation on Linux
 
 If you're using Linux, the fastest way to install the necessary CLI version and the Machine Learning extension is:
 
-:::code language="bash" source="~/azureml-examples-cli-preview/cli/misc.sh" id="az_extension_install_linux":::
+:::code language="bash" source="~/azureml-examples-main/cli/misc.sh" id="az_extension_install_linux":::
 
 For more, see [Install the Azure CLI for Linux](/cli/azure/install-azure-cli-linux).
 
 ## Set up
 
 Login:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/misc.sh" id="az_login":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_login":::
 
 If you have access to multiple Azure subscriptions, you can set your active subscription:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/misc.sh" id="az_account_set":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_account_set":::
 
 Optionally, set up common variables in your shell for use in subsequent commands:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/setup-repo/azure-github.sh" id="set_variables":::
+:::code language="azurecli" source="~/azureml-examples-main/setup-repo/azure-github.sh" id="set_variables":::
 
 > [!WARNING]
 > This uses Bash syntax for setting variables -- adjust as needed for your shell. You can also replace the values in commands below inline rather than using variables.
 
 If it doesn't already exist, you can create the Azure resource group:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/setup-repo/azure-github.sh" id="az_group_create":::
+:::code language="azurecli" source="~/azureml-examples-main/setup-repo/azure-github.sh" id="az_group_create":::
 
 And create a machine learning workspace:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/setup-repo/azure-github.sh" id="az_ml_workspace_create":::
+:::code language="azurecli" source="~/azureml-examples-main/setup-repo/azure-github.sh" id="az_ml_workspace_create":::
 
 Machine learning subcommands require the `--workspace/-w` and `--resource-group/-g` parameters. To avoid typing these repeatedly, configure defaults:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/setup.sh" id="az_configure_defaults":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/setup.sh" id="az_configure_defaults":::
 
 > [!TIP]
 > Most code examples assume you have set a default workspace and resource group. You can override these on the command line.
 
 You can show your current defaults using `--list-defaults/-l`:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/misc.sh" id="list_defaults":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="list_defaults":::
 
 > [!TIP]
 > Combining with `--output/-o` allows for more readable output formats.

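Taken together, the snippets referenced in this file amount to a short setup script. Below is a hedged sketch of that flow: the resource group and workspace names are placeholders, and the version check is one possible way to test the `>=2.15.0` requirement, not necessarily the method `misc.sh` uses.

```shell
# One way to check the installed CLI against the >=2.15.0 floor:
# sort -V orders version strings, so if the minimum sorts first
# the installed version is new enough.
MIN_VERSION=2.15.0
INSTALLED_VERSION=$(az version --query '"azure-cli"' -o tsv 2>/dev/null || echo 0.0.0)
LOWEST=$(printf '%s\n%s\n' "$MIN_VERSION" "$INSTALLED_VERSION" | sort -V | head -n 1)
if [ "$LOWEST" != "$MIN_VERSION" ]; then
  echo "Azure CLI $INSTALLED_VERSION is older than $MIN_VERSION -- upgrade first."
fi

# Remove the conflicting legacy extension if present, then install and verify ml.
az extension remove -n azure-cli-ml 2>/dev/null
az extension add -n ml -y
az ml -h

# Configure defaults so -w/-g don't need repeating (placeholder names).
az configure --defaults group=my-resource-group workspace=my-workspace
```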
articles/machine-learning/how-to-deploy-automl-endpoint.md

Lines changed: 2 additions & 2 deletions
@@ -119,11 +119,11 @@ To create a managed online endpoint from the command line, you'll need to create
 
 __automl_endpoint.yml__
 
-::: code language="yaml" source="~/azureml-examples-cli-preview/cli/endpoints/online/managed/sample/endpoint.yml" :::
+::: code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/managed/sample/endpoint.yml" :::
 
 __automl_deployment.yml__
 
-::: code language="yaml" source="~/azureml-examples-cli-preview/cli/endpoints/online/managed/sample/blue-deployment.yml" :::
+::: code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/managed/sample/blue-deployment.yml" :::
 
 You'll need to modify this file to use the files you downloaded from the AutoML Models page.

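The two YAML files above aren't rendered in this diff, but managed online endpoint and deployment definitions generally follow the shape sketched below. This is an illustration, not the actual sample files: the names, model path, and instance type are placeholder assumptions.

```shell
# Sketch of automl_endpoint.yml (placeholder name and auth mode).
cat > automl_endpoint.yml <<'EOF'
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
name: my-automl-endpoint
auth_mode: key
EOF

# Sketch of automl_deployment.yml -- point the model path at the files
# you downloaded from the AutoML Models page.
cat > automl_deployment.yml <<'EOF'
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: blue
endpoint_name: my-automl-endpoint
model:
  path: ./model.pkl
instance_type: Standard_DS3_v2
instance_count: 1
EOF
```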
articles/machine-learning/how-to-deploy-batch-with-rest.md

Lines changed: 22 additions & 22 deletions
@@ -47,7 +47,7 @@ In this article, you learn how to use the new REST APIs to:
 > [!NOTE]
 > Batch endpoint names need to be unique at the Azure region level. For example, there can be only one batch endpoint with the name mybatchendpoint in westus2.
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="set_endpoint_name":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="set_endpoint_name":::
 
 ## Azure Machine Learning batch endpoints
 
@@ -63,18 +63,18 @@ In the following REST API calls, we use `SUBSCRIPTION_ID`, `RESOURCE_GROUP`, `LO
 
 Administrative REST requests require a [service principal authentication token](how-to-manage-rest.md#retrieve-a-service-principal-authentication-token). Replace `TOKEN` with your own value. You can retrieve this token with the following command:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" range="13":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" range="13":::
 
 The service provider uses the `api-version` argument to ensure compatibility. The `api-version` argument varies from service to service. Set the API version as a variable to accommodate future versions:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" range="11":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" range="11":::
 
 ### Create compute
 Batch scoring runs only on cloud computing resources, not locally. The cloud computing resource is a reusable virtual computer cluster where you can run batch scoring workflows.
 
 Create a compute cluster:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="create_compute":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_compute":::
 
 > [!TIP]
 > If you want to use an existing compute instead, you must specify the full Azure Resource Manager ID when [creating the batch deployment](#create-batch-deployment). The full ID uses the format `/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/computes/<your-compute-name>`.
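The ID format in the tip above is plain string assembly; with placeholder values it looks like this (all names here are illustrative):

```shell
# Placeholder values -- replace with your own.
SUBSCRIPTION_ID=00000000-0000-0000-0000-000000000000
RESOURCE_GROUP=my-resource-group
WORKSPACE=my-workspace
COMPUTE_NAME=batch-cluster

# Full ARM ID of an existing compute, in the format a batch deployment expects.
COMPUTE_ID="/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/computes/$COMPUTE_NAME"
echo "$COMPUTE_ID"
```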
@@ -85,41 +85,41 @@ To register the model and code, first they need to be uploaded to a storage acco
 
 You can use the tool [jq](https://stedolan.github.io/jq/) to parse the JSON result and get the required values. You can also use the Azure portal to find the same information:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="get_storage_details":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="get_storage_details":::
 
 ### Upload & register code
 
 Now that you have the datastore, you can upload the scoring script. Use the Azure Storage CLI to upload a blob into your default container:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="upload_code":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="upload_code":::
 
 > [!TIP]
 > You can also use other methods to upload, such as the Azure portal or [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/).
 
 Once you upload your code, you can specify your code with a PUT request:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="create_code":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_code":::
 
 ### Upload and register model
 
 Similarly to the code, upload the model files:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="upload_model":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="upload_model":::
 
 Now, register the model:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="create_model":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_model":::
 
 ### Create environment
 The deployment needs to run in an environment that has the required dependencies. Create the environment with a PUT request. Use a docker image from Microsoft Container Registry. You can configure the docker image with `image` and add conda dependencies with `condaFile`.
 
 Run the following code to read the `condaFile` defined in json. The source file is at `/cli/endpoints/batch/mnist/environment/conda.json` in the example repository:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="read_condafile":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="read_condafile":::
 
 Now, run the following snippet to create an environment:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="create_environment":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_environment":::
 
 ## Deploy with batch endpoints
 
@@ -129,19 +129,19 @@ Next, create the batch endpoint, a deployment, and set the default deployment.
 
 Create the batch endpoint:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="create_endpoint":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_endpoint":::
 
 ### Create batch deployment
 
 Create a batch deployment under the endpoint:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="create_deployment":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_deployment":::
 
 ### Set the default batch deployment under the endpoint
 
 There's only one default batch deployment under an endpoint; it's used when you invoke the endpoint to run a batch scoring job.
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="set_endpoint_defaults":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="set_endpoint_defaults":::
 
 ## Run batch scoring
 
@@ -151,23 +151,23 @@ Invoking a batch endpoint triggers a batch scoring job. A job `id` is returned i
 
 Get the scoring uri and access token to invoke the batch endpoint. First get the scoring uri:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="get_endpoint":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="get_endpoint":::
 
 Get the batch endpoint access token:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="get_access_token":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="get_access_token":::
 
 Now, invoke the batch endpoint to start a batch scoring job. The following example scores data publicly available in the cloud:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="score_endpoint_with_data_in_cloud":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="score_endpoint_with_data_in_cloud":::
 
 If your data is stored in an Azure Machine Learning registered datastore, you can invoke the batch endpoint with a dataset. The following code creates a new dataset:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="create_dataset":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_dataset":::
 
 Next, reference the dataset when invoking the batch endpoint:
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="score_endpoint_with_dataset":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="score_endpoint_with_dataset":::
 
 In the previous code snippet, a custom output location is provided by using `datastoreId`, `path`, and `outputFileName`. These settings allow you to configure where to store the batch scoring results.
 
@@ -176,7 +176,7 @@ In the previous code snippet, a custom output location is provided by using `dat
 
 For this example, the output is stored in the default blob storage for the workspace. The folder name is the same as the endpoint name, and the file name is randomly generated by the following code:
 
-:::code language="azurecli" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" ID="unique_output" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/batch-score-rest.sh" ID="unique_output" :::
 
 ### Check the batch scoring job
 
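The `unique_output` snippet itself isn't rendered in this diff; one plausible way to generate such a random file name in Bash (an assumption, not the actual script) is:

```shell
# $RANDOM is a Bash built-in yielding a small pseudo-random integer;
# appending it makes the output file name effectively unique per run.
export OUTPUT_FILE_NAME=predictions_$RANDOM.csv
echo "$OUTPUT_FILE_NAME"
```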
@@ -185,7 +185,7 @@ Batch scoring jobs usually take some time to process the entire set of inputs. M
 > [!TIP]
 > The example invokes the default deployment of the batch endpoint. To invoke a non-default deployment, use the `azureml-model-deployment` HTTP header and set the value to the deployment name. For example, using a parameter of `--header "azureml-model-deployment: $DEPLOYMENT_NAME"` with curl.
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="check_job":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="check_job":::
 
 ### Check batch scoring results
 
@@ -195,7 +195,7 @@ For information on checking the results, see [Check batch scoring results](how-t
 
 If you aren't going to use the batch endpoint, you should delete it with the below command (it deletes the batch endpoint and all the underlying deployments):
 
-:::code language="rest-api" source="~/azureml-examples-cli-preview/cli/batch-score-rest.sh" id="delete_endpoint":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="delete_endpoint":::
 
 ## Next steps
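The invoke-and-check calls referenced throughout this file are ordinary authenticated REST requests. Below is a curl sketch of invoking a non-default deployment, per the tip earlier in this file; the URI, token, deployment name, and especially the request body are placeholder assumptions -- consult the batch endpoint REST reference for the exact payload schema.

```shell
# Placeholders: the real values come from the get_endpoint and
# get_access_token steps.
SCORING_URI=https://myendpoint.westus2.inference.ml.azure.com/jobs
SCORING_TOKEN=placeholder-token
DEPLOYMENT_NAME=mybatchdeployment

# The azureml-model-deployment header routes the job to a named,
# non-default deployment; the JSON body below is schematic only.
curl --location --request POST "$SCORING_URI" \
  --header "Authorization: Bearer $SCORING_TOKEN" \
  --header "Content-Type: application/json" \
  --header "azureml-model-deployment: $DEPLOYMENT_NAME" \
  --data-raw '{}'
```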
