Commit ee57220

Update model-catalog-overview.md

1 parent 25607ef

File tree: 1 file changed, +2 −2 lines changed

articles/ai-studio/how-to/model-catalog-overview.md

Lines changed: 2 additions & 2 deletions
@@ -37,7 +37,7 @@ Some models in the **Curated by Azure AI** and **Open models from the Hugging Fa
 * **Compare:** Compare benchmarks across models and datasets available in the industry to assess which one meets your business scenario.
 * **Evaluate:** Evaluate if the model is suited for your specific workload by providing your own test data. Evaluation metrics make it easy to visualize how well the selected model performed in your scenario.
 * **Fine-tune:** Customize fine-tunable models using your own training data and pick the best model by comparing metrics across all your fine-tuning jobs. Built-in optimizations speed up fine-tuning and reduce the memory and compute needed for fine-tuning.
-* **Deploy:** Deploy pretrained models or fine-tuned models seamlessly for inference. Models that can be deployed to real-time endpoints can also be downloaded.
+* **Deploy:** Deploy pretrained models or fine-tuned models seamlessly for inference. Models that can be deployed to managed compute can also be downloaded.

 ## Model deployment: Managed compute and serverless API (pay-as-you-go)

@@ -86,7 +86,7 @@ Models available for deployment to a Managed compute can be deployed to Azure Ma

 ### Build Generative AI Apps with Managed computes

-Prompt flow offers a great experience for prototyping. You can use models deployed with Managed computes in Prompt Flow with the [Open Model LLM tool](../../machine-learning/prompt-flow/tools-reference/open-model-llm-tool.md). You can also use the REST API exposed by the Real-time endpoints in popular LLM tools like LangChain with the [Azure Machine Learning extension](https://python.langchain.com/docs/integrations/chat/azureml_chat_endpoint/).
+Prompt flow offers a great experience for prototyping. You can use models deployed with Managed computes in Prompt Flow with the [Open Model LLM tool](../../machine-learning/prompt-flow/tools-reference/open-model-llm-tool.md). You can also use the REST API exposed by managed compute in popular LLM tools like LangChain with the [Azure Machine Learning extension](https://python.langchain.com/docs/integrations/chat/azureml_chat_endpoint/).


 ### Content safety for models deployed as Managed Computes
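
The REST API mentioned in the changed line above is a plain HTTPS scoring call against the managed compute endpoint. As a minimal sketch, the snippet below builds such a request with the standard library without sending it; the endpoint URL, API key, and the chat-style request body are illustrative assumptions, since the exact schema depends on the model you deployed.

```python
import json
import urllib.request

# Hypothetical values -- substitute your endpoint's scoring URL and key.
ENDPOINT_URL = "https://my-endpoint.eastus2.inference.ml.azure.com/score"
API_KEY = "<your-endpoint-key>"

def build_scoring_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) a scoring request for a managed compute endpoint.

    The chat-style body below is an assumed schema; check the signature of
    your deployed model before using it.
    """
    body = json.dumps({
        "input_data": {
            "input_string": [{"role": "user", "content": prompt}],
            "parameters": {"temperature": 0.7, "max_new_tokens": 256},
        }
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_scoring_request("What is the model catalog?")
```

Tools like LangChain's Azure Machine Learning integration wrap this same call, handling authentication and response parsing for you.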
