
Commit dec85a0

review medimageparse and minor edits to other articles
1 parent 6f13809 commit dec85a0

File tree

3 files changed: +40 −32 lines

articles/ai-studio/how-to/healthcare-ai/deploy-cxrreportgen.md

Lines changed: 4 additions & 4 deletions
@@ -43,9 +43,9 @@ To use CXRReportGen model with Azure AI Studio or Azure Machine Learning studio,
 
 **Deployment to a self-hosted managed compute**
 
-CXRReportGen model can be deployed to our self-hosted managed inference solution, which allows you to customize and control all the details about how the model is served.
+CXRReportGen model can be deployed to our self-hosted managed inference solution, which allows you to customize and control all the details about how the model is served. You can deploy the model through the model catalog UI or programmatically.
 
-You can deploy the model through the model catalog UI or programmatically. To deploy through the UI,
+To __deploy the model through the UI__:
 
 - Go to the [model card in the catalog](https://aka.ms/cxrreportgenmodelcard).
 - On the model's overview page, select __Deploy__.
@@ -56,7 +56,7 @@ You can deploy the model through the model catalog UI or programmatically. To de
 > For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**
 - Select __Deploy__.
 
-To deploy the model programmatically, see [How to deploy and inference a managed compute deployment with code](../deploy-models-managed.md).
+To __deploy the model programmatically__, see [How to deploy and inference a managed compute deployment with code](../deploy-models-managed.md).
 
 
 ## Work with a grounded report generation model for chest X-ray analysis
@@ -76,7 +76,7 @@ credential = DefaultAzureCredential()
 ml_client_workspace = MLClient.from_config(credential)
 ```
 
-In the deployment configuration, you get to choose the authentication method. This example uses Azure Machine Learning token-based authentication. For more authentication options, see the [corresponding documentation page](../../../machine-learning/how-to-setup-authentication.md). Also note that the client is created from a configuration file that is created automatically for Azure Machine Learning virtual machines (VMs). Learn more on the [corresponding API documentation page](/python/api/azure-ai-ml/azure.ai.ml.mlclient?view=azure-python#azure-ai-ml-mlclient-from-config).
+In the deployment configuration, you get to choose the authentication method. This example uses Azure Machine Learning token-based authentication. For more authentication options, see the [corresponding documentation page](../../../machine-learning/how-to-setup-authentication.md). Also, note that the client is created from a configuration file that is created automatically for Azure Machine Learning virtual machines (VMs). Learn more on the [corresponding API documentation page](/python/api/azure-ai-ml/azure.ai.ml.mlclient?view=azure-python#azure-ai-ml-mlclient-from-config).
 
 ### Make basic calls to the model

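The hunk above ends at the article's "Make basic calls to the model" section, whose body this diff doesn't show. As a rough illustration of what such a call involves, the sketch below builds a JSON request body with a base64-encoded chest X-ray; the field names (`frontal_image`, `indication`) are hypothetical placeholders, not the model's documented schema, so consult the CXRReportGen model card for the real request format.

```python
import base64
import json


def build_request_payload(image_bytes: bytes, indication: str) -> str:
    """Encode an image and wrap it in a JSON request body.

    NOTE: the field names below are hypothetical placeholders;
    the actual schema is defined by the deployed model's scoring script.
    """
    encoded = base64.b64encode(image_bytes).decode("utf-8")
    payload = {
        "frontal_image": encoded,   # hypothetical field name
        "indication": indication,   # hypothetical field name
    }
    return json.dumps(payload)


# Example: a stand-in byte string in place of a real PNG/DICOM image.
body = build_request_payload(b"\x89PNG...", "Cough and fever")
```

A payload built this way could then be sent to the managed online endpoint with the `ml_client_workspace` object created earlier in the article.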
articles/ai-studio/how-to/healthcare-ai/deploy-medimageinsight.md

Lines changed: 4 additions & 4 deletions
@@ -41,9 +41,9 @@ To use MedImageInsight models with Azure AI Studio or Azure Machine Learning stu
 
 **Deployment to a self-hosted managed compute**
 
-MedImageInsight model can be deployed to our self-hosted managed inference solution, which allows you to customize and control all the details about how the model is served.
+MedImageInsight model can be deployed to our self-hosted managed inference solution, which allows you to customize and control all the details about how the model is served. You can deploy the model through the model catalog UI or programmatically.
 
-You can deploy the model through the model catalog UI or programmatically. To deploy through the UI,
+To __deploy the model through the UI__:
 
 - Go to the [model card in the catalog](https://aka.ms/mi2modelcard).
 - On the model's overview page, select __Deploy__.
@@ -55,7 +55,7 @@ You can deploy the model through the model catalog UI or programmatically. To de
 
 - Select __Deploy__.
 
-To deploy the model programmatically, see [How to deploy and inference a managed compute deployment with code](../deploy-models-managed.md).
+To __deploy the model programmatically__, see [How to deploy and inference a managed compute deployment with code](../deploy-models-managed.md).
 
 ## Work with an embedding model

@@ -74,7 +74,7 @@ credential = DefaultAzureCredential()
 ml_client_workspace = MLClient.from_config(credential)
 ```
 
-In the deployment configuration, you get to choose the authentication method. This example uses Azure Machine Learning token-based authentication. For more authentication options, see the [corresponding documentation page](../../../machine-learning/how-to-setup-authentication.md). Also note that the client is created from a configuration file that is created automatically for Azure Machine Learning virtual machines (VMs). Learn more on the [corresponding API documentation page](/python/api/azure-ai-ml/azure.ai.ml.mlclient?view=azure-python#azure-ai-ml-mlclient-from-config).
+In the deployment configuration, you get to choose the authentication method. This example uses Azure Machine Learning token-based authentication. For more authentication options, see the [corresponding documentation page](../../../machine-learning/how-to-setup-authentication.md). Also, note that the client is created from a configuration file that is created automatically for Azure Machine Learning virtual machines (VMs). Learn more on the [corresponding API documentation page](/python/api/azure-ai-ml/azure.ai.ml.mlclient?view=azure-python#azure-ai-ml-mlclient-from-config).
 
 ### Make basic calls to the model

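This second file covers MedImageInsight, which the article's "Work with an embedding model" heading frames as an embedding model. A typical downstream step after retrieving embeddings is comparing them; the stdlib-only sketch below computes cosine similarity, assuming the endpoint has already returned plain float vectors (the endpoint call itself is not shown in this diff).

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    if len(a) != len(b):
        raise ValueError("vectors must have the same length")
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Vectors pointing the same way score 1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 0.0], [2.0, 0.0]))  # → 1.0
```

Higher scores indicate images (or image–text pairs) the model considers more similar, which is the usual basis for retrieval or zero-shot classification on top of an embedding model.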
0 commit comments
