Commit b609187 ("acrolinx"), 1 parent: 8752432

File tree: 1 file changed (+5, −5 lines)


articles/machine-learning/how-to-deploy-custom-container.md

Lines changed: 5 additions & 5 deletions
@@ -185,7 +185,7 @@ endpoint = ManagedOnlineEndpoint(
 
 ### Configure online deployment
 
-A deployment is a set of resources required for hosting the model that does the actual inferencing. We will create a deployment for our endpoint using the `ManagedOnlineDeployment` class.
+A deployment is a set of resources required for hosting the model that does the actual inferencing. We'll create a deployment for our endpoint using the `ManagedOnlineDeployment` class.
 
 > [!TIP]
 > - `name` - Name of the deployment.
@@ -307,7 +307,7 @@ blue_deployment = ManagedOnlineDeployment(
 
 ---
 
-then your model will be located at `/var/tfserving-model-mount/tfserving-deployment/1` in your deployment. Note that it is no longer under `azureml-app/azureml-models`, but under the mount path you specified:
+then your model will be located at `/var/tfserving-model-mount/tfserving-deployment/1` in your deployment. Note that it's no longer under `azureml-app/azureml-models`, but under the mount path you specified:
 
 :::image type="content" source="./media/how-to-deploy-custom-container/mount-path-deployment-location.png" alt-text="Diagram showing a tree view of the deployment directory structure when using mount_model_path.":::
 
@@ -331,7 +331,7 @@ az ml online-deployment create --name tfserving-deployment -f endpoints/online/c
 
 # [Python SDK](#tab/python)
 
-Using the `MLClient` created earlier, we will now create the Endpoint in the workspace. This command will start the endpoint creation and return a confirmation response while the endpoint creation continues.
+Using the `MLClient` created earlier, we'll now create the Endpoint in the workspace. This command will start the endpoint creation and return a confirmation response while the endpoint creation continues.
 
 ```python
 ml_client.begin_create_or_update(endpoint)
@@ -355,12 +355,12 @@ Once your deployment completes, see if you can make a scoring request to the dep
 
 # [Python SDK](#tab/python)
 
-Using the `MLClient` created earlier, we will get a handle to the endpoint. The endpoint can be invoked using the `invoke` command with the following parameters:
+Using the `MLClient` created earlier, we'll get a handle to the endpoint. The endpoint can be invoked using the `invoke` command with the following parameters:
 - `endpoint_name` - Name of the endpoint
 - `request_file` - File with request data
 - `deployment_name` - Name of the specific deployment to test in an endpoint
 
-We will send a sample request using a json file. The sample json is in the [example repository](https://github.com/Azure/azureml-examples/tree/main/sdk/python/endpoints/online/custom-container).
+We'll send a sample request using a json file. The sample json is in the [example repository](https://github.com/Azure/azureml-examples/tree/main/sdk/python/endpoints/online/custom-container).
 
 ```python
 # test the blue deployment with some sample data
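For context on the last hunk: the `request_file` passed to `invoke` is a small JSON payload on disk. A minimal sketch of preparing one, assuming the TensorFlow Serving REST "predict" format (`{"instances": [...]}`) used by the custom-container example — the actual file in the example repository may differ:

```python
import json

# Hypothetical sample payload in TensorFlow Serving's REST "predict" format.
# The exact contents of the repository's sample json may differ.
payload = {"instances": [1.0, 2.0, 5.0]}

# Write the request file that would be passed to `invoke` as `request_file`.
with open("sample-request.json", "w") as f:
    json.dump(payload, f)
```

A file like this is then referenced by name in the `invoke` call described in the diff (for example, `request_file="sample-request.json"`).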
