
Commit 9f03743

Merge pull request #214980 from pritamso/Broken-link-fix-dem108
Broken link fixed
2 parents cd118da + 55cbf85 commit 9f03743

File tree: 4 files changed (+8 / -8 lines changed)


articles/machine-learning/how-to-access-resources-from-endpoints-managed-identities.md

Lines changed: 2 additions & 2 deletions

@@ -96,7 +96,7 @@ This guide assumes you don't have a managed identity, a storage account or an on
 git clone https://github.com/Azure/azureml-examples --depth 1
 cd azureml-examples/sdk/endpoints/online/managed/managed-identities
 ```
-* To follow along with this notebook, access the companion [example notebook](https://github.com/Azure/azureml-examples/blob/main/sdk/endpoints/online/managed/managed-identities/online-endpoints-managed-identity-sai.ipynb) within in the `sdk/endpoints/online/managed/managed-identities` directory.
+* To follow along with this notebook, access the companion [example notebook](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/managed/managed-identities/online-endpoints-managed-identity-uai.ipynb) within in the `sdk/endpoints/online/managed/managed-identities` directory.

 * Additional Python packages are required for this example:

@@ -134,7 +134,7 @@ Install them with the following code:
 git clone https://github.com/Azure/azureml-examples --depth 1
 cd azureml-examples/sdk/endpoints/online/managed/managed-identities
 ```
-* To follow along with this notebook, access the companion [example notebook](https://github.com/Azure/azureml-examples/blob/main/sdk/endpoints/online/managed/managed-identities/online-endpoints-managed-identity-uai.ipynb) within in the `sdk/endpoints/online/managed/managed-identities` directory.
+* To follow along with this notebook, access the companion [example notebook](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/managed/managed-identities/online-endpoints-managed-identity-uai.ipynb) within in the `sdk/endpoints/online/managed/managed-identities` directory.

 * Additional Python packages are required for this example:
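
The notebook linked in both changed lines shows how an online deployment uses a managed identity to reach a storage account. A minimal, hedged sketch of that pattern inside a scoring script (the `UAI_CLIENT_ID` environment variable, storage account, container, and blob names are illustrative assumptions, not taken from the notebook):

```python
import os
from azure.identity import ManagedIdentityCredential
from azure.storage.blob import BlobClient

# For a user-assigned identity, pass its client ID; a system-assigned identity needs no ID.
# The environment variable name and storage locations below are placeholders.
credential = ManagedIdentityCredential(client_id=os.environ.get("UAI_CLIENT_ID"))

blob = BlobClient(
    account_url="https://<STORAGE_ACCOUNT>.blob.core.windows.net",
    container_name="<CONTAINER>",
    blob_name="<BLOB_NAME>",
    credential=credential,
)

# Read the blob's contents using the token acquired through the managed identity.
data = blob.download_blob().readall()
```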

articles/machine-learning/how-to-deploy-custom-container.md

Lines changed: 1 addition & 1 deletion

@@ -381,4 +381,4 @@ ml_client.online_endpoints.begin_delete(name=online_endpoint_name)

 - [Safe rollout for online endpoints](how-to-safely-rollout-managed-endpoints.md)
 - [Troubleshooting online endpoints deployment](./how-to-troubleshoot-online-endpoints.md)
-- [Torch serve sample](https://github.com/Azure/azureml-examples/blob/main/cli/deploy-torchserve.sh)
+- [Torch serve sample](https://github.com/Azure/azureml-examples/blob/main/cli/deploy-custom-container-torchserve-densenet.sh)

articles/machine-learning/how-to-deploy-managed-online-endpoints.md

Lines changed: 3 additions & 3 deletions

@@ -286,7 +286,7 @@ For supported general-purpose and GPU instance types, see [Managed online endpoi

 ### Use more than one model

-Currently, you can specify only one model per deployment in the YAML. If you've more than one model, when you register the model, copy all the models as files or subdirectories into a folder that you use for registration. In your scoring script, use the environment variable `AZUREML_MODEL_DIR` to get the path to the model root folder. The underlying directory structure is retained. For an example of the scoring script for multi models, see [multimodel-minimal-score.py](https://github.com/Azure/azureml-examples/blob/main/cli/endpoints/online/custom-container/multimodel-minimal-score.py).
+Currently, you can specify only one model per deployment in the YAML. If you've more than one model, when you register the model, copy all the models as files or subdirectories into a folder that you use for registration. In your scoring script, use the environment variable `AZUREML_MODEL_DIR` to get the path to the model root folder. The underlying directory structure is retained. For an example of deploying multiple models to one deployment, see [Deploy multiple models to one deployment](https://github.com/Azure/azureml-examples/blob/main/cli/endpoints/online/custom-container/minimal/multimodel/README.md).

 ## Understand the scoring script
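
The changed paragraph above describes loading several models from the folder that `AZUREML_MODEL_DIR` points to. A minimal sketch of such a scoring script, assuming two hypothetical subfolders `model_a/` and `model_b/` holding joblib-serialized models (not taken from the linked sample):

```python
import json
import os

import joblib


def init():
    # AZUREML_MODEL_DIR points to the root of the registered model folder;
    # the subdirectory structure copied in at registration time is retained.
    global model_a, model_b
    model_root = os.environ["AZUREML_MODEL_DIR"]
    model_a = joblib.load(os.path.join(model_root, "model_a", "model.pkl"))
    model_b = joblib.load(os.path.join(model_root, "model_b", "model.pkl"))


def run(raw_data):
    # Routing logic is illustrative: pick a model based on a field in the request.
    data = json.loads(raw_data)
    model = model_a if data.get("model") == "a" else model_b
    return model.predict(data["inputs"]).tolist()
```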

@@ -384,7 +384,7 @@ The output should appear similar to the following JSON. The `provisioning_state`
 ml_client.online_endpoints.get(name=local_endpoint_name, local=True)
 ```

-The method returns [`ManagedOnlineEndpoint` entity](/python/api/azure-ai-ml/azure.ai.ml.entities.managedonlineendpoint.md). The `provisioning_state` is `Succeeded`.
+The method returns [`ManagedOnlineEndpoint` entity](/python/api/azure-ai-ml/azure.ai.ml.entities.managedonlineendpoint). The `provisioning_state` is `Succeeded`.

 ```python
 ManagedOnlineEndpoint({'public_network_access': None, 'provisioning_state': 'Succeeded', 'scoring_uri': 'http://localhost:49158/score', 'swagger_uri': None, 'name': 'local-10061534497697', 'description': 'this is a sample local endpoint', 'tags': {}, 'properties': {}, 'id': None, 'Resource__source_path': None, 'base_path': '/path/to/your/working/directory', 'creation_context': None, 'serialize': <msrest.serialization.Serializer object at 0x7ffb781bccd0>, 'auth_mode': 'key', 'location': 'local', 'identity': None, 'traffic': {}, 'mirror_traffic': {}, 'kind': None})

@@ -587,7 +587,7 @@ for endpoint in ml_client.online_endpoints.list():
     print(endpoint.name)
 ```

-The method returns list (iterator) of `ManagedOnlineEndpoint` entities. You can get other information by specifying [parameters](/python/api/azure-ai-ml/azure.ai.ml.entities.managedonlineendpoint.md#parameters).
+The method returns list (iterator) of `ManagedOnlineEndpoint` entities. You can get other information by specifying [parameters](/python/api/azure-ai-ml/azure.ai.ml.entities.managedonlineendpoint#parameters).

 For example, output the list of endpoints like a table:
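
The hunk ends just before the article's table-style listing. An illustrative sketch of that step, using only attributes visible in the `ManagedOnlineEndpoint` repr shown earlier (`name`, `provisioning_state`, `scoring_uri`):

```python
# Print each endpoint's name, provisioning state, and scoring URI in aligned columns.
print(f"{'Name':<32} {'Provisioning state':<20} Scoring URI")
for endpoint in ml_client.online_endpoints.list():
    print(f"{endpoint.name:<32} {endpoint.provisioning_state:<20} {endpoint.scoring_uri}")
```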

articles/machine-learning/how-to-deploy-with-triton.md

Lines changed: 2 additions & 2 deletions

@@ -163,7 +163,7 @@ This section shows how you can define a Triton deployment to deploy to a managed
 endpoint_name = f"endpoint-{random.randint(0, 10000)}"
 ```

-1. We use these details above in the `MLClient` from `azure.ai.ml` to get a handle to the required Azure Machine Learning workspace. Check the [configuration notebook](https://github.com/Azure/azureml-examples/tree/main/sdk/jobs/configuration.ipynb) for more details on how to configure credentials and connect to a workspace.
+1. We use these details above in the `MLClient` from `azure.ai.ml` to get a handle to the required Azure Machine Learning workspace. Check the [configuration notebook](https://github.com/Azure/azureml-examples/blob/main/sdk/python/jobs/configuration.ipynb) for more details on how to configure credentials and connect to a workspace.

 ```python
 from azure.ai.ml import MLClient
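
The changed line points to the configuration notebook for connecting `MLClient` to a workspace. A minimal sketch of that connection, with placeholder subscription, resource group, and workspace values:

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

# Placeholder identifiers; substitute your own subscription, resource group, and workspace.
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<AML_WORKSPACE_NAME>",
)
```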
@@ -352,7 +352,7 @@ Once your deployment completes, use the following command to make a scoring requ
 keys = ml_client.online_endpoints.list_keys(endpoint_name)
 auth_key = keys.primary_key

-1. The following scoring code uses the [Triton Inference Server Client](https://github.com/triton-inference-server/client) to submit the image of a peacock to the endpoint. This script is available in the companion notebook to this example - [Deploy a model to online endpoints using Triton](https://github.com/Azure/azureml-examples/blob/main/sdk/endpoints/online/triton/single-model/online-endpoints-triton.ipynb).
+1. The following scoring code uses the [Triton Inference Server Client](https://github.com/triton-inference-server/client) to submit the image of a peacock to the endpoint. This script is available in the companion notebook to this example - [Deploy a model to online endpoints using Triton](https://github.com/Azure/azureml-examples/blob/main/sdk/python/endpoints/online/triton/single-model/online-endpoints-triton.ipynb).

 ```python
 # Test the blue deployment with some sample data
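
The changed line describes scoring the endpoint with the Triton Inference Server Client. A hedged sketch of that kind of request; the endpoint host, key, model name, and tensor names are placeholders, not values from the example notebook:

```python
import numpy as np
import tritonclient.http as tritonhttpclient

# Placeholder values: host, key, model name, and tensor names are assumptions for illustration.
url = "<ENDPOINT_HOST>"              # host[:port] of the endpoint, without the scheme
auth_key = "<ENDPOINT_PRIMARY_KEY>"  # e.g. keys.primary_key from list_keys above
headers = {"Authorization": f"Bearer {auth_key}"}

client = tritonhttpclient.InferenceServerClient(url=url, ssl=True)

# Build one FP32 image tensor; adjust name, shape, and dtype to the deployed model's config.
img = np.random.rand(1, 3, 224, 224).astype(np.float32)
inp = tritonhttpclient.InferInput("data_0", list(img.shape), "FP32")
inp.set_data_from_numpy(img)

result = client.infer("model_1", inputs=[inp], headers=headers)
print(result.as_numpy("fc6_1"))
```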
