Commit 017cdc7: "switch to main"
1 parent 164ab47
11 files changed: +148 -148 lines

articles/machine-learning/how-to-access-resources-from-endpoints-managed-identities.md

Lines changed: 46 additions & 46 deletions
Large diffs are not rendered by default.

articles/machine-learning/how-to-create-component-pipelines-cli.md

Lines changed: 5 additions & 5 deletions
@@ -86,7 +86,7 @@ You should receive a JSON dictionary with information about the pipeline job, in

Open `ComponentA.yaml` to see how the first component is defined:

-:::code language="yaml" source="~/azureml-examples-march-cli-preview/cli/jobs/pipelines-with-components/basics/3a_basic_pipeline/componentA.yml":::
+:::code language="yaml" source="~/azureml-examples-main/cli/jobs/pipelines-with-components/basics/3a_basic_pipeline/componentA.yml":::

In the current preview, only components of type `command` are supported. The `name` is the unique identifier and is used in Studio to describe the component, and `display_name` is used for a display-friendly name. The `version` key-value pair allows you to evolve your pipeline components while maintaining reproducibility with older versions.
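To make that shape concrete without reproducing the repository file, here is a hypothetical component definition illustrating the `name`, `display_name`, `version`, and `type: command` keys described above. All values are illustrative assumptions, not the contents of `componentA.yml`:

```yaml
# Hypothetical component spec; every value here is illustrative.
name: component_a            # unique identifier, shown in Studio
display_name: Component A    # display-friendly name
version: 1                   # bump to evolve while keeping older versions reproducible
type: command                # only `command` components are supported in the preview
command: echo "hello ${{inputs.sample_input_string}}"
environment:
  image: python:3.8
inputs:
  sample_input_string:
    type: string
    default: hello_python_world
```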
@@ -105,7 +105,7 @@ For more information on components and their specification, see [What is an Azur

In the example directory, the `pipeline.yaml` file looks like the following code:

-:::code language="yaml" source="~/azureml-examples-march-cli-preview/cli/jobs/pipelines-with-components/basics/3a_basic_pipeline/pipeline.yml":::
+:::code language="yaml" source="~/azureml-examples-main/cli/jobs/pipelines-with-components/basics/3a_basic_pipeline/pipeline.yml":::

If you open the job's URL in Studio (the value of `services.Studio.endpoint` from the `job create` command when creating a job or `job show` after the job has been created), you'll see a graph representation of your pipeline:
@@ -172,11 +172,11 @@ Each of these phases may have multiple components. For instance, the data prepar

The `pipeline.yml` begins with the mandatory `type: pipeline` key-value pair. Then, it defines inputs and outputs as follows:

-:::code language="yaml" source="~/azureml-examples-march-cli-preview/cli/jobs/pipelines-with-components/nyc_taxi_data_regression/pipeline.yml" range="5-22":::
+:::code language="yaml" source="~/azureml-examples-main/cli/jobs/pipelines-with-components/nyc_taxi_data_regression/pipeline.yml" range="5-22":::

As described previously, these entries specify the input data to the pipeline, in this case the dataset in `./data`, and the intermediate and final outputs of the pipeline, which are stored in separate paths. The names within these input and output entries become values in the `inputs` and `outputs` entries of the individual jobs:

-:::code language="yaml" source="~/azureml-examples-march-cli-preview/cli/jobs/pipelines-with-components/nyc_taxi_data_regression/pipeline.yml" range="26-72":::
+:::code language="yaml" source="~/azureml-examples-main/cli/jobs/pipelines-with-components/nyc_taxi_data_regression/pipeline.yml" range="26-72":::

Notice how `parent.jobs.train-job.outputs.model_output` is used as an input to both the prediction job and the scoring job, as shown in the following diagram:
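That wiring can be sketched as follows. This is a hypothetical fragment, not the repository's `pipeline.yml`; the job, input, and component file names are assumptions used only to show one job's output feeding two downstream jobs:

```yaml
# Hypothetical pipeline fragment: train-job's output feeds two downstream jobs.
type: pipeline
inputs:
  pipeline_job_input:
    type: uri_folder
    path: ./data
outputs:
  pipeline_job_trained_model:
    mode: upload
jobs:
  train-job:
    type: command
    component: file:./train.yml
    inputs:
      training_data: ${{parent.inputs.pipeline_job_input}}
    outputs:
      model_output: ${{parent.outputs.pipeline_job_trained_model}}
  predict-job:
    type: command
    component: file:./predict.yml
    inputs:
      model_input: ${{parent.jobs.train-job.outputs.model_output}}
  score-job:
    type: command
    component: file:./score.yml
    inputs:
      model_input: ${{parent.jobs.train-job.outputs.model_output}}
```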
@@ -206,7 +206,7 @@ Click on a component. You'll see some basic information about the component, suc

In the `1b_e2e_registered_components` directory, open the `pipeline.yml` file. The keys and values in the `inputs` and `outputs` dictionaries are similar to those already discussed. The only significant difference is the value of the `component` entries in `jobs.<JOB_NAME>`. The `component` value is of the form `azureml:<COMPONENT_NAME>:<COMPONENT_VERSION>`. The `train-job` definition, for instance, specifies that the latest version of the registered component `Train` should be used:

-:::code language="yaml" source="~/azureml-examples-march-cli-preview/cli/jobs/pipelines-with-components/basics/1b_e2e_registered_components/pipeline.yml" range="29-40" highlight="4":::
+:::code language="yaml" source="~/azureml-examples-main/cli/jobs/pipelines-with-components/basics/1b_e2e_registered_components/pipeline.yml" range="29-40" highlight="4":::

## Caching & reuse
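As a hedged sketch of the `azureml:<COMPONENT_NAME>:<COMPONENT_VERSION>` form described above (the version number and input names here are hypothetical, not taken from the sample):

```yaml
# Hypothetical job entry that references the registered component `Train`
# by name and version instead of a local component file.
jobs:
  train-job:
    type: command
    component: azureml:Train:31   # illustrative version number
    inputs:
      training_data: ${{parent.inputs.pipeline_job_input}}
    outputs:
      model_output: ${{parent.outputs.pipeline_job_trained_model}}
```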

articles/machine-learning/how-to-deploy-automl-endpoint.md

Lines changed: 6 additions & 6 deletions
@@ -98,21 +98,21 @@ To deploy using these files, you can use either the studio or the Azure CLI.

To create a deployment from the CLI, you'll need the Azure CLI with the ML v2 extension. Run the following command to confirm that you have both:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/misc.sh" id="az_version":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_version":::

If you receive an error message or you don't see `Extensions: ml` in the response, follow the steps at [Install and set up the CLI (v2)](how-to-configure-cli.md).

Sign in:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/misc.sh" id="az_login":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_login":::

If you have access to multiple Azure subscriptions, you can set your active subscription:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/misc.sh" id="az_account_set":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/misc.sh" id="az_account_set":::

Set the default resource group and workspace to where you wish to create the deployment:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/setup.sh" id="az_configure_defaults":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/setup.sh" id="az_configure_defaults":::

## Put the scoring file in its own directory
@@ -124,11 +124,11 @@ To create an online endpoint from the command line, you'll need to create an *en

__automl_endpoint.yml__

-::: code language="yaml" source="~/azureml-examples-march-cli-preview/cli/endpoints/online/managed/sample/endpoint.yml" :::
+::: code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/managed/sample/endpoint.yml" :::

__automl_deployment.yml__

-::: code language="yaml" source="~/azureml-examples-march-cli-preview/cli/endpoints/online/managed/sample/blue-deployment.yml" :::
+::: code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/managed/sample/blue-deployment.yml" :::

You'll need to modify this file to use the files you downloaded from the AutoML Models page.
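The two referenced sample files aren't reproduced in this diff. As a rough sketch, under the assumption that they follow the usual managed online endpoint and deployment schemas, they might look like the following; every name, path, and value below is illustrative, not the actual sample content:

```yaml
# Hypothetical sketch of automl_endpoint.yml
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
name: my-automl-endpoint
auth_mode: key
---
# Hypothetical sketch of automl_deployment.yml
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: blue
endpoint_name: my-automl-endpoint
model:
  path: ./model.pkl            # replace with the model downloaded from the AutoML Models page
code_configuration:
  code: ./scoring
  scoring_script: score.py     # the scoring file placed in its own directory
environment:
  conda_file: ./conda_env.yml
  image: mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest
instance_type: Standard_DS3_v2
instance_count: 1
```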
articles/machine-learning/how-to-deploy-batch-with-rest.md

Lines changed: 22 additions & 22 deletions
@@ -48,7 +48,7 @@ In this article, you learn how to use the new REST APIs to:

> [!NOTE]
> Batch endpoint names need to be unique at the Azure region level. For example, there can be only one batch endpoint with the name `mybatchendpoint` in `westus2`.

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="set_endpoint_name":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="set_endpoint_name":::

## Azure Machine Learning batch endpoints
@@ -64,18 +64,18 @@ In the following REST API calls, we use `SUBSCRIPTION_ID`, `RESOURCE_GROUP`, `LO

Administrative REST requests require a [service principal authentication token](how-to-manage-rest.md#retrieve-a-service-principal-authentication-token). Replace `TOKEN` with your own value. You can retrieve this token with the following command:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" range="10":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" range="10":::

The service provider uses the `api-version` argument to ensure compatibility. The `api-version` argument varies from service to service. Set the API version as a variable to accommodate future versions:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" range="8":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" range="8":::

### Create compute

Batch scoring runs only on cloud computing resources, not locally. The cloud computing resource is a reusable virtual computer cluster where you can run batch scoring workflows.

Create a compute cluster:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="create_compute":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_compute":::

> [!TIP]
> If you want to use an existing compute instead, you must specify the full Azure Resource Manager ID when [creating the batch deployment](#create-batch-deployment). The full ID uses the format `/subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/computes/<your-compute-name>`.
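The body that the `create_compute` snippet PUTs isn't shown in this diff. A sketch follows, written as YAML for readability; the field names follow the usual shape of the Machine Learning `computes` REST API and should be treated as assumptions to verify against the API reference:

```yaml
# Hypothetical body for PUT .../computes/<cluster-name>?api-version=$API_VERSION
location: westus2
properties:
  computeType: AmlCompute
  properties:
    vmSize: STANDARD_D2_V2
    scaleSettings:
      minNodeCount: 0      # scale to zero when idle
      maxNodeCount: 4
```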
@@ -86,41 +86,41 @@ To register the model and code, first they need to be uploaded to a storage acco

You can use the tool [jq](https://stedolan.github.io/jq/) to parse the JSON result and get the required values. You can also use the Azure portal to find the same information:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="get_storage_details":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="get_storage_details":::

### Upload & register code

Now that you have the datastore, you can upload the scoring script. Use the Azure Storage CLI to upload a blob into your default container:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="upload_code":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="upload_code":::

> [!TIP]
> You can also use other methods to upload, such as the Azure portal or [Azure Storage Explorer](https://azure.microsoft.com/features/storage-explorer/).

Once you upload your code, you can specify your code with a PUT request:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="create_code":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_code":::

### Upload and register model

Similar to the code, upload the model files:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="upload_model":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="upload_model":::

Now, register the model:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="create_model":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_model":::

### Create environment

The deployment needs to run in an environment that has the required dependencies. Create the environment with a PUT request. Use a Docker image from Microsoft Container Registry. You can configure the Docker image with `image` and add conda dependencies with `condaFile`.

Run the following code to read the `condaFile` defined in JSON. The source file is at `/cli/endpoints/batch/mnist/environment/conda.json` in the example repository:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="read_condafile":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="read_condafile":::

Now, run the following snippet to create an environment:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="create_environment":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_environment":::
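As a sketch of that environment PUT body (written as YAML for readability): the `image` and `condaFile` keys come from the text above, while the surrounding structure and all values are assumptions:

```yaml
# Hypothetical body for the environment PUT request; condaFile carries
# the conda specification as an embedded string.
properties:
  image: mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest
  condaFile: |
    name: batch-scoring-env
    channels:
      - conda-forge
    dependencies:
      - python=3.8
      - pip:
          - azureml-defaults
```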
## Deploy with batch endpoints

@@ -130,19 +130,19 @@ Next, create the batch endpoint, a deployment, and set the default deployment.

Create the batch endpoint:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="create_endpoint":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_endpoint":::

### Create batch deployment

Create a batch deployment under the endpoint:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="create_deployment":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_deployment":::

### Set the default batch deployment under the endpoint

An endpoint has only one default batch deployment, which is used when you invoke the endpoint to run a batch scoring job.

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="set_endpoint_defaults":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="set_endpoint_defaults":::

## Run batch scoring
@@ -152,23 +152,23 @@ Invoking a batch endpoint triggers a batch scoring job. A job `id` is returned i

Get the scoring URI and access token to invoke the batch endpoint. First, get the scoring URI:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="get_endpoint":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="get_endpoint":::

Get the batch endpoint access token:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="get_access_token":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="get_access_token":::

Now, invoke the batch endpoint to start a batch scoring job. The following example scores data publicly available in the cloud:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="score_endpoint_with_data_in_cloud":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="score_endpoint_with_data_in_cloud":::

If your data is stored in an Azure Machine Learning registered datastore, you can invoke the batch endpoint with a dataset. The following code creates a new dataset:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="create_dataset":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="create_dataset":::

Next, reference the dataset when invoking the batch endpoint:

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="score_endpoint_with_dataset":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="score_endpoint_with_dataset":::

In the previous code snippet, a custom output location is provided by using `datastoreId`, `path`, and `outputFileName`. These settings allow you to configure where to store the batch scoring results.
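A sketch of an invoke body with a custom output location follows, written as YAML for readability. Only the `datastoreId`, `path`, and `outputFileName` keys come from the text; the surrounding structure and placeholder values are assumptions to verify against the REST API reference:

```yaml
# Hypothetical body for the batch endpoint invoke request.
properties:
  dataset:
    datasetName: <your-registered-dataset>   # placeholder
  outputDataset:
    datastoreId: /subscriptions/$SUBSCRIPTION_ID/resourceGroups/$RESOURCE_GROUP/providers/Microsoft.MachineLearningServices/workspaces/$WORKSPACE/datastores/workspaceblobstore
    path: <output-folder>                    # placeholder
  outputFileName: predictions.csv
```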
@@ -177,7 +177,7 @@ In the previous code snippet, a custom output location is provided by using `dat

For this example, the output is stored in the default blob storage for the workspace. The folder name is the same as the endpoint name, and the file name is randomly generated by the following code:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" ID="unique_output" :::
+:::code language="azurecli" source="~/azureml-examples-main/cli/batch-score-rest.sh" ID="unique_output" :::

### Check the batch scoring job
@@ -186,7 +186,7 @@ Batch scoring jobs usually take some time to process the entire set of inputs. M

> [!TIP]
> The example invokes the default deployment of the batch endpoint. To invoke a non-default deployment, use the `azureml-model-deployment` HTTP header and set the value to the deployment name. For example, using a parameter of `--header "azureml-model-deployment: $DEPLOYMENT_NAME"` with curl.

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="check_job":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="check_job":::

### Check batch scoring results
@@ -196,7 +196,7 @@ For information on checking the results, see [Check batch scoring results](how-t

If you aren't going to use the batch endpoint, delete it with the following command (it deletes the batch endpoint and all the underlying deployments):

-:::code language="rest-api" source="~/azureml-examples-march-cli-preview/cli/batch-score-rest.sh" id="delete_endpoint":::
+:::code language="rest-api" source="~/azureml-examples-main/cli/batch-score-rest.sh" id="delete_endpoint":::

## Next steps
articles/machine-learning/how-to-deploy-custom-container.md

Lines changed: 9 additions & 9 deletions
@@ -57,47 +57,47 @@ cd azureml-examples/cli

Define environment variables:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/deploy-tfserving.sh" id="initialize_variables":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-tfserving.sh" id="initialize_variables":::

## Download a TensorFlow model

Download and unzip a model that divides an input by two and adds 2 to the result:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/deploy-tfserving.sh" id="download_and_unzip_model":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-tfserving.sh" id="download_and_unzip_model":::

## Run a TF Serving image locally to test that it works

Use Docker to run your image locally for testing:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/deploy-tfserving.sh" id="run_image_locally_for_testing":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-tfserving.sh" id="run_image_locally_for_testing":::

### Check that you can send liveness and scoring requests to the image

First, check that the container is "alive," meaning that the process inside the container is still running. You should get a 200 (OK) response.

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/deploy-tfserving.sh" id="check_liveness_locally":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-tfserving.sh" id="check_liveness_locally":::

Then, check that you can get predictions about unlabeled data:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/deploy-tfserving.sh" id="check_scoring_locally":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-tfserving.sh" id="check_scoring_locally":::

### Stop the image

Now that you've tested locally, stop the image:

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/deploy-tfserving.sh" id="stop_image":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-tfserving.sh" id="stop_image":::

## Create a YAML file for your endpoint and deployment

You can configure your cloud deployment using YAML. Take a look at the sample YAML for this example:

__tfserving-endpoint.yml__

-:::code language="yaml" source="~/azureml-examples-march-cli-preview/cli/endpoints/online/custom-container/tfserving-endpoint.yml":::
+:::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/custom-container/tfserving-endpoint.yml":::

__tfserving-deployment.yml__

-:::code language="yaml" source="~/azureml-examples-march-cli-preview/cli/endpoints/online/custom-container/tfserving-deployment.yml":::
+:::code language="yaml" source="~/azureml-examples-main/cli/endpoints/online/custom-container/tfserving-deployment.yml":::

There are a few important concepts to notice in this YAML:
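The sample YAML files themselves aren't reproduced in this diff. As a hypothetical sketch of what such a custom-container deployment can look like (the model name, route paths, and port are assumptions based on TF Serving's `half_plus_two` sample, not the actual file):

```yaml
# Hypothetical tfserving-deployment.yml; all values are illustrative.
name: tfserving-deployment
endpoint_name: tfserving-endpoint
model:
  name: tfserving-mounted
  path: ./half_plus_two          # the downloaded, unzipped model directory
environment:
  image: docker.io/tensorflow/serving:latest
  inference_config:              # tell the endpoint where TF Serving listens
    liveness_route:
      port: 8501
      path: /v1/models/half_plus_two
    readiness_route:
      port: 8501
      path: /v1/models/half_plus_two
    scoring_route:
      port: 8501
      path: /v1/models/half_plus_two:predict
instance_type: Standard_F2s_v2
instance_count: 1
```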
@@ -170,7 +170,7 @@ az ml online-deployment create --name tfserving-deployment -f endpoints/online/c

Once your deployment completes, see if you can make a scoring request to the deployed endpoint.

-:::code language="azurecli" source="~/azureml-examples-march-cli-preview/cli/deploy-tfserving.sh" id="invoke_endpoint":::
+:::code language="azurecli" source="~/azureml-examples-main/cli/deploy-tfserving.sh" id="invoke_endpoint":::

### Delete endpoint and model