@@ -138,7 +138,7 @@ For more information on `az ml model register`, see the [reference documentation
You can register a model by providing the local path of the model. You can provide the path of either a folder or a single file on your local machine.
<!-- python nb call -->
- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=register-model-from-local-file-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=register-model-from-local-file-code)]

To include multiple files in the model registration, set `model_path` to the path of a folder that contains the files.
@@ -204,7 +204,7 @@ The two things you need to accomplish in your entry script are:
For your initial deployment, use a dummy entry script that prints the data it receives.

- :::code language="python" source="~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/source_dir/echo_score.py":::
+ :::code language="python" source="~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/source_dir/echo_score.py":::

Save this file as `echo_score.py` inside a directory called `source_dir`. This dummy script returns the data you send to it, so it doesn't use the model. But it is useful for testing that the scoring script is running.
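The referenced `echo_score.py` isn't reproduced in this diff. A minimal sketch of such an echo entry script might look like the following (the `init`/`run` function names are the contract the Azure ML scoring server expects; the exact body here is an assumption, not the file from the repository):

```python
import json

def init():
    # Called once when the scoring container starts. The echo script
    # has no model to load, so there is nothing to initialize.
    pass

def run(raw_data):
    # Called for every scoring request; raw_data is the request body.
    # Parse the JSON payload and simply echo it back to the caller.
    data = json.loads(raw_data)
    return {"echo": data}
```

Calling `run('{"data": [1, 2]}')` returns the parsed payload wrapped in an `echo` key, which makes it easy to confirm the service is wired up end to end.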
@@ -225,7 +225,7 @@ You can use any [Azure Machine Learning inference curated environments](../conce
A minimal inference configuration can be written as:

- :::code language="json" source="~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/dummyinferenceconfig.json":::
+ :::code language="json" source="~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/dummyinferenceconfig.json":::

Save this file with the name `dummyinferenceconfig.json`.
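The referenced file isn't shown inline here. As a rough sketch, a minimal v1 CLI inference configuration of this kind names little more than the entry script and its source directory (the keys below follow the v1 `inferenceconfig.json` schema, but treat this fragment as illustrative rather than the file from the repository):

```json
{
    "entryScript": "echo_score.py",
    "sourceDirectory": "./source_dir"
}
```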
@@ -236,7 +236,7 @@ Save this file with the name `dummyinferenceconfig.json`.
The following example demonstrates how to create a minimal environment with no pip dependencies, using the dummy scoring script you defined above.

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=inference-configuration-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=inference-configuration-code)]

For more information on environments, see [Create and manage environments for training and deployment](../how-to-use-environments.md).
@@ -263,7 +263,7 @@ For more information, see the [deployment schema](reference-azure-machine-learni
The following Python code demonstrates how to create a local deployment configuration:

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deployment-configuration-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deployment-configuration-code)]

---
@@ -290,9 +290,9 @@ az ml model deploy -n myservice \
# [Python SDK](#tab/python)

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-code)]

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-print-logs)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-print-logs)]

For more information, see the documentation for [Model.deploy()](/python/api/azureml-core/azureml.core.model.model#deploy-workspace--name--models--inference-config-none--deployment-config-none--deployment-target-none--overwrite-false-) and [Webservice](/python/api/azureml-core/azureml.core.webservice.webservice).
@@ -315,7 +315,7 @@ curl -v -X POST -H "content-type:application/json" \
# [Python SDK](#tab/python)
<!-- python nb call -->
- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-into-model-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-into-model-code)]

---
@@ -324,7 +324,7 @@ curl -v -X POST -H "content-type:application/json" \
Now it's time to actually load your model. First, modify your entry script:

- :::code language="python" source="~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/source_dir/score.py":::
+ :::code language="python" source="~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/source_dir/score.py":::

Save this file as `score.py` inside `source_dir`.
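As a rough illustration of the pattern `score.py` follows — the referenced file isn't shown in this diff — the sketch below assumes, purely for the example, that the registered model is a JSON file of linear coefficients named `model.json`; a real deployment would load whatever format the model was actually saved in:

```python
import json
import os

model = None

def init():
    # AZUREML_MODEL_DIR points at the folder where Azure ML placed
    # the registered model inside the scoring container.
    global model
    model_path = os.path.join(os.environ["AZUREML_MODEL_DIR"], "model.json")
    with open(model_path) as f:
        model = json.load(f)  # e.g. {"coefficients": [...], "intercept": ...}

def run(raw_data):
    # Score each input row with the hypothetical linear model.
    rows = json.loads(raw_data)["data"]
    coef, intercept = model["coefficients"], model["intercept"]
    preds = [sum(c * x for c, x in zip(coef, row)) + intercept for row in rows]
    return {"predictions": preds}
```

The important part is the shape, not the arithmetic: `init()` resolves the model path via `AZUREML_MODEL_DIR` and loads it once, and `run()` reuses the loaded model for every request.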
@@ -334,7 +334,7 @@ Notice the use of the `AZUREML_MODEL_DIR` environment variable to locate your re
[!INCLUDE [cli v1](../includes/machine-learning-cli-v1.md)]

- :::code language="json" source="~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/inferenceconfig.json":::
+ :::code language="json" source="~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/inferenceconfig.json":::

Save this file as `inferenceconfig.json`.
@@ -377,9 +377,9 @@ az ml model deploy -n myservice \
# [Python SDK](#tab/python)

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-model-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-model-code)]

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-model-print-logs)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-model-print-logs)]

For more information, see the documentation for [Model.deploy()](/python/api/azureml-core/azureml.core.model.model#deploy-workspace--name--models--inference-config-none--deployment-config-none--deployment-target-none--overwrite-false-) and [Webservice](/python/api/azureml-core/azureml.core.webservice.webservice).
@@ -398,7 +398,7 @@ curl -v -X POST -H "content-type:application/json" \
# [Python SDK](#tab/python)

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=send-post-request-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=send-post-request-code)]

---
@@ -418,15 +418,15 @@ Change your deploy configuration to correspond to the compute target you've chos
The options available for a deployment configuration differ depending on the compute target you choose.

- :::code language="json" source="~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/re-deploymentconfig.json":::
+ :::code language="json" source="~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/re-deploymentconfig.json":::

Save this file as `re-deploymentconfig.json`.

For more information, see [this reference](reference-azure-machine-learning-cli.md#deployment-configuration-schema).
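For orientation, a v1 deployment configuration that targets Azure Container Instances generally looks like the fragment below (the values are illustrative, and the deployment schema reference linked above is authoritative for the full set of keys):

```json
{
    "computeType": "aci",
    "containerResourceRequirements": {
        "cpu": 0.5,
        "memoryInGB": 1.0
    }
}
```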
# [Python SDK](#tab/python)

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-on-cloud-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=deploy-model-on-cloud-code)]

---
@@ -460,9 +460,9 @@ az ml service get-logs -n myservice \
# [Python SDK](#tab/python)

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-service-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-service-code)]

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-service-print-logs)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=re-deploy-service-print-logs)]

For more information, see the documentation for [Model.deploy()](/python/api/azureml-core/azureml.core.model.model#deploy-workspace--name--models--inference-config-none--deployment-config-none--deployment-target-none--overwrite-false-) and [Webservice](/python/api/azureml-core/azureml.core.webservice.webservice).
@@ -473,9 +473,9 @@ For more information, see the documentation for [Model.deploy()](/python/api/azu
When you deploy remotely, you may have key authentication enabled. The example below shows how to get your service key with Python in order to make an inference request.

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-remote-web-service-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-remote-web-service-code)]

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-remote-webservice-print-logs)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=call-remote-webservice-print-logs)]
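Independent of the SDK, the request that notebook cell makes can be sketched with the standard library alone. The scoring URI and key below are placeholders; in a real deployment, `service.scoring_uri` and `service.get_keys()` supply the actual values:

```python
import json
import urllib.request

def build_scoring_request(scoring_uri, key, payload):
    # With key authentication, the service key is passed as a Bearer
    # token in the Authorization header alongside a JSON body.
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        scoring_uri,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {key}",
        },
        method="POST",
    )

# urllib.request.urlopen(req) would send this to the live service.
req = build_scoring_request("http://localhost:6789/score", "dummy-key", {"data": [[1, 2]]})
```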
@@ -515,7 +515,7 @@ The following table describes the different service states:
[!INCLUDE [cli v1](../includes/machine-learning-cli-v1.md)]

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=delete-resource-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/2.deploy-local-cli.ipynb?name=delete-resource-code)]

```azurecli-interactive
az ml service delete -n myservice
@@ -531,7 +531,7 @@ Read more about [deleting a webservice](/cli/azure/ml(v1)/computetarget/create#a
# [Python SDK](#tab/python)

- [!Notebook-python[] (~/azureml-examples-main/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=delete-resource-code)]
+ [!Notebook-python[] (~/azureml-examples-archive/v1/python-sdk/tutorials/deploy-local/1.deploy-local.ipynb?name=delete-resource-code)]

To delete a deployed web service, use `service.delete()`.
To delete a registered model, use `model.delete()`.