
Commit bb081fa

Merge pull request #4885 from ssalgadodev/ad-ae-name-change
Azure direct and ecosystems name change
2 parents: 724ae6e + 1df5816

26 files changed (+90, -117 lines)

articles/ai-foundry/concepts/content-filtering.md

Lines changed: 1 addition & 1 deletion
@@ -26,7 +26,7 @@ author: PatrickFarley
 
 The content filtering system is powered by [Azure AI Content Safety](../../ai-services/content-safety/overview.md), and it works by running both the prompt input and completion output through a set of classification models designed to detect and prevent the output of harmful content. Variations in API configurations and application design might affect completions and thus filtering behavior.
 
-With Azure OpenAI model deployments, you can use the default content filter or create your own content filter (described later on). Models available through **standard deployments** have content filtering enabled by default. To learn more about the default content filter enabled for standard deployments, see [Content safety for Azure Direct Models](model-catalog-content-safety.md).
+With Azure OpenAI model deployments, you can use the default content filter or create your own content filter (described later on). Models available through **standard deployments** have content filtering enabled by default. To learn more about the default content filter enabled for standard deployments, see [Content safety for Models Sold Directly by Azure](model-catalog-content-safety.md).
 
 ## Language support

articles/ai-foundry/concepts/fine-tuning-overview.md

Lines changed: 1 addition & 1 deletion
@@ -92,7 +92,7 @@ For more information on fine-tuning using a managed compute (preview), see [Fine
 
 For details about Azure OpenAI in Azure AI Foundry Models that are available for fine-tuning, see the [Azure OpenAI in Foundry Models documentation](../../ai-services/openai/concepts/models.md#fine-tuning-models) or the [Azure OpenAI models table](#fine-tuning-azure-openai-models) later in this guide.
 
-For the Azure OpenAI Service models that you can fine tune, supported regions for fine-tuning include North Central US, Sweden Central, and more.
+For the Azure OpenAI models that you can fine-tune, supported regions for fine-tuning include North Central US, Sweden Central, and more.
 
 ### Fine-tuning Azure OpenAI models

articles/ai-foundry/concepts/foundry-models-overview.md

Lines changed: 16 additions & 27 deletions
@@ -30,30 +30,30 @@ Azure AI Foundry offers a comprehensive catalog of AI models. There are over 190
 
 Our catalog is organized into two main categories:
 
-* [Azure Direct Models](#azure-direct-models)
-* [Azure Ecosystem Models](#azure-ecosystem-models)
+* [Models sold directly by Azure](#models-sold-directly-by-azure)
+* [Models from Partners and Community](#models-from-partners-and-community)
 
 Understanding the distinction between these categories helps you choose the right models based on your specific requirements and strategic goals.
 
-## Azure Direct Models
+## Models Sold Directly by Azure
 
-Azure Direct Models are models that are hosted and sold by Microsoft under Microsoft Product Terms. These models have undergone rigorous evaluation and are deeply integrated into Azure's AI ecosystem. They offer enhanced integration, optimized performance, and direct Microsoft support, including enterprise-grade Service Level Agreements (SLAs).
+These are models that are hosted and sold by Microsoft under Microsoft Product Terms. These models have undergone rigorous evaluation and are deeply integrated into Azure's AI ecosystem. The models come from a variety of top providers, and they offer enhanced integration, optimized performance, and direct Microsoft support, including enterprise-grade Service Level Agreements (SLAs).
 
-Characteristics of Azure Direct Models:
+Characteristics of models sold directly by Azure:
 
 - Official first-party support from Microsoft
 - High level of integration with Azure services and infrastructure
 - Extensive performance benchmarking and validation
 - Adherence to Microsoft's Responsible AI standards
 - Enterprise-grade scalability, reliability, and security
 
-Azure Direct Models also have the benefit of flexible Provisioned Throughput, meaning you can use your quota and reservations across any of these models.
+These models also have the benefit of fungible Provisioned Throughput, meaning you can flexibly use your quota and reservations across any of these models.
 
-## Azure Ecosystem Models
+## Models from Partners and Community
 
-Models constitute the vast majority of the Azure AI Foundry Models. These models are provided by trusted third-party organizations, partners, research labs, and community contributors. These models offer specialized and diverse AI capabilities, covering a wide array of scenarios, industries, and innovations.
+These models constitute the vast majority of Azure AI Foundry Models. They're provided by trusted third-party organizations, partners, research labs, and community contributors, and they offer specialized and diverse AI capabilities covering a wide array of scenarios, industries, and innovations.
 
-Characteristics of Azure Ecosystem Models:
+Characteristics of Models from Partners and Community:
 * Developed and supported by external partners and community contributors
 * Diverse range of specialized models catering to niche or broad use cases
 * Typically validated by providers themselves, with integration guidelines provided by Azure
@@ -62,28 +62,17 @@ Characteristics of Azure Ecosystem Models:
 
 Models are deployable as Managed Compute or Standard (pay-go) deployment options. The model provider selects how the models are deployable.
 
-## Choosing between Azure Direct and Azure Ecosystem Models
+## Choosing between models sold directly by Azure and models from partners and community
 
 When selecting models from Azure AI Foundry Models, consider the following:
-* **Use Case and Requirements**: Azure Direct Models are ideal for scenarios requiring deep Azure integration, guaranteed support, and enterprise SLAs. Azure Ecosystem Models excel in specialized use cases and innovation-led scenarios.
-* **Support Expectations**: Azure Direct Models come with robust Microsoft-provided support and maintenance. Azure Ecosystem Models are supported by their providers, with varying levels of SLA and support structures.
-* **Innovation and Specialization**: Azure Ecosystem Models offer rapid access to specialized innovations and niche capabilities often developed by leading research labs and emerging AI providers.
-
-## Accessing Azure Ecosystem Models
+* **Use Case and Requirements**: Models sold directly by Azure are ideal for scenarios requiring deep Azure integration, guaranteed support, and enterprise SLAs. Models from Partners and Community excel in specialized use cases and innovation-led scenarios.
+* **Support Expectations**: Models sold directly by Azure come with robust Microsoft-provided support and maintenance. Models from Partners and Community are supported by their providers, with varying levels of SLA and support structures.
+* **Innovation and Specialization**: Models from Partners and Community offer rapid access to specialized innovations and niche capabilities often developed by leading research labs and emerging AI providers.
 
-Azure Ecosystem Models are accessible through Azure AI Foundry, providing:
-* Comprehensive details about the model's capabilities and integration requirements.
-* Community ratings, usage data, and qualitative feedback to guide your decisions.
-* Clear integration guidelines to help incorporate these models seamlessly into your Azure workflows.
-
-For more detailed guidance and exploration of available models, visit the [Azure AI Foundry documentation](/azure/ai-foundry/).
-
-Azure AI Foundry remains committed to providing a robust ecosystem, enabling customers to easily access the best AI innovations from Microsoft and our trusted partners.
 
 ## Model collections
 
-The model catalog organizes models into different collections:
+The model catalog organizes models into different collections, including:
 
 * **Azure OpenAI models exclusively available on Azure**: Flagship Azure OpenAI models available through an integration with Azure OpenAI in Foundry Models. Microsoft supports these models and their use according to the product terms and [SLA for Azure OpenAI](https://www.microsoft.com/licensing/docs/view/Service-Level-Agreements-SLA-for-Online-Services).

@@ -108,7 +97,7 @@ On the **model catalog filters**, you'll find:
 * **Batch**: best suited for cost-optimized batch jobs, and not latency. No playground support is provided for the batch deployment.
 * **Managed compute**: this option allows you to deploy a model on an Azure virtual machine. You will be billed for hosting and inferencing.
 * **Inference tasks**: you can filter models based on the inference task type.
-* **Finetune tasks**: you can filter models based on the finetune task type.
+* **Fine-tune tasks**: you can filter models based on the fine-tune task type.
 * **Licenses**: you can filter models based on the license type.
 
 On the **model card**, you'll find:
@@ -248,7 +237,7 @@ To set the public network access flag for the Azure AI Foundry hub:
 
 * If you have an Azure AI Foundry hub with a private endpoint created before July 11, 2024, standard deployments added to projects in this hub won't follow the networking configuration of the hub. Instead, you need to create a new private endpoint for the hub and create a new standard deployment in the project so that the new deployments can follow the hub's networking configuration.
 
-* If you have an Azure AI Foundry hub with MaaS deployments created before July 11, 2024, and you enable a private endpoint on this hub, the existing sstandard deployments won't follow the hub's networking configuration. For standard deployments in the hub to follow the hub's networking configuration, you need to create the deployments again.
+* If you have an Azure AI Foundry hub with MaaS deployments created before July 11, 2024, and you enable a private endpoint on this hub, the existing standard deployments won't follow the hub's networking configuration. For standard deployments in the hub to follow the hub's networking configuration, you need to create the deployments again.
 
 * Currently, [Azure OpenAI On Your Data](/azure/ai-services/openai/concepts/use-your-data) support isn't available for standard deployments in private hubs, because private hubs have the public network access flag disabled.

articles/ai-foundry/concepts/model-catalog-content-safety.md

Lines changed: 3 additions & 3 deletions
@@ -1,5 +1,5 @@
 ---
-title: Guardrails & controls for Azure Direct Models
+title: Guardrails & controls for Models Sold Directly by Azure
 titleSuffix: Azure AI Foundry
 description: Learn about content safety for models deployed using standard deployments, using Azure AI Foundry.
 manager: scottpolly
@@ -13,7 +13,7 @@ reviewer: ositanachi
 ms.custom:
 ---
 
-# Guardrails & controls for Azure Direct Models
+# Guardrails & controls for Models Sold Directly by Azure
 
 [!INCLUDE [feature-preview](../includes/feature-preview.md)]
 
@@ -34,7 +34,7 @@ Content filtering occurs synchronously as the service processes prompts to gener
 - When you first deploy a language model
 - Later, by selecting the content filtering toggle on the deployment details page
 
-Suppose you decide to use an API other than the [Azure AI Model Inference API](/azure/ai-studio/reference/reference-model-inference-api) to work with a model that is deployed via a standard deployment. In such a situation, content filtering (preview) isn't enabled unless you implement it separately by using Azure AI Content Safety. To get started with Azure AI Content Safety, see [Quickstart: Analyze text content](/azure/ai-services/content-safety/quickstart-text). You run a higher risk of exposing users to harmful content if you don't use content filtering (preview) when working with models that are deployed via standard deployments.
+Suppose you decide to use an API other than the [Foundry Models API](/azure/ai-studio/reference/reference-model-inference-api) to work with a model that is deployed via a standard deployment. In such a situation, content filtering (preview) isn't enabled unless you implement it separately by using Azure AI Content Safety. To get started with Azure AI Content Safety, see [Quickstart: Analyze text content](/azure/ai-services/content-safety/quickstart-text). You run a higher risk of exposing users to harmful content if you don't use content filtering (preview) when working with models that are deployed via standard deployments.
 
 [!INCLUDE [content-safety-harm-categories](../includes/content-safety-harm-categories.md)]
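The file above points developers at the Azure AI Content Safety quickstart when they bypass the built-in filtering. As a rough, offline sketch of what a call to the Content Safety text-analysis REST operation involves — the endpoint, key, and API version below are placeholder assumptions to verify against the Content Safety reference, not values from this commit:

```python
import json

# Assumed API version -- check the Content Safety REST reference for the
# version your resource supports.
API_VERSION = "2024-09-01"

def build_analyze_text_request(endpoint: str, key: str, text: str) -> dict:
    """Assemble the URL, headers, and JSON body for a text-analysis call."""
    return {
        "url": f"{endpoint}/contentsafety/text:analyze?api-version={API_VERSION}",
        "headers": {
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/json",
        },
        # The four harm categories mirror the classifiers the filtering
        # system runs over prompts and completions.
        "body": json.dumps({
            "text": text,
            "categories": ["Hate", "SelfHarm", "Sexual", "Violence"],
        }),
    }

# Placeholder endpoint and key; sending the request needs a live resource.
req = build_analyze_text_request(
    "https://my-resource.cognitiveservices.azure.com", "<key>", "Sample input"
)
print(req["url"])
```

Sending this request (for example with `urllib.request`) returns per-category severity scores that an application can threshold before showing output to users.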

articles/ai-foundry/concepts/model-lifecycle-retirement.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ reviewer: fkriti
 Azure AI Foundry Models in the model catalog are continually refreshed with newer and more capable models. As part of this process, model providers might deprecate and retire their older models, and you might need to update your applications to use a newer model. This document communicates information about the model lifecycle and deprecation timelines and explains how you're informed of model lifecycle stages.
 
 > [!IMPORTANT]
-> This article describes deprecation and retirement only for Azure Direct models and Azure Ecosystem models models in Foundry Models. For information about deprecation and retirement for Azure OpenAI in Foundry Models, see the [Azure OpenAI models lifecycle](../../ai-services/openai/concepts/model-retirements.md?context=/azure/ai-foundry/context/context) documentation.
+> This article describes deprecation and retirement only for Models Sold Directly by Azure and Models from Partners and Community in Foundry Models. For information about deprecation and retirement for Azure OpenAI in Foundry Models, see the [Azure OpenAI models lifecycle](../../ai-services/openai/concepts/model-retirements.md?context=/azure/ai-foundry/context/context) documentation.
 
 ## Model lifecycle stages

articles/ai-foundry/concepts/models-featured.md

Lines changed: 1 addition & 1 deletion
@@ -364,5 +364,5 @@ For examples of how to use Stability AI models, see the following examples:
 - [Deploy models as standard deployments](../how-to/deploy-models-serverless.md)
 - [Model catalog and collections in Azure AI Foundry portal](../how-to/model-catalog-overview.md)
 - [Region availability for models in standard deployments](../how-to/deploy-models-serverless-availability.md)
-- [Content safety for Azure Direct Models](model-catalog-content-safety.md)
+- [Content safety for Models Sold Directly by Azure](model-catalog-content-safety.md)

articles/ai-foundry/how-to/concept-data-privacy.md

Lines changed: 1 addition & 1 deletion
@@ -35,7 +35,7 @@ Deploying models to managed compute deploys model weights to dedicated virtual m
 
 You manage the infrastructure for these managed compute resources. Azure data, privacy, and security commitments apply. To learn more about Azure compliance offerings applicable to Azure AI Foundry, see the [Azure Compliance Offerings page](https://servicetrust.microsoft.com/DocumentPage/7adf2d9e-d7b5-4e71-bad8-713e6a183cf3).
 
-Although containers for **Azure Direct Models** are scanned for vulnerabilities that could exfiltrate data, not all models available through the model catalog are scanned. To reduce the risk of data exfiltration, you can [help protect your deployment by using virtual networks](configure-managed-network.md). You can also use [Azure Policy](../../ai-services/policy-reference.md) to regulate the models that your users can deploy.
+Although containers for **Models Sold Directly by Azure** are scanned for vulnerabilities that could exfiltrate data, not all models available through the model catalog are scanned. To reduce the risk of data exfiltration, you can [help protect your deployment by using virtual networks](configure-managed-network.md). You can also use [Azure Policy](../../ai-services/policy-reference.md) to regulate the models that your users can deploy.
 
 :::image type="content" source="../media/explore/subscription-service-cycle.png" alt-text="Diagram that shows the platform service life cycle." lightbox="../media/explore/subscription-service-cycle.png":::

articles/ai-foundry/how-to/configure-managed-network.md

Lines changed: 2 additions & 2 deletions
@@ -29,7 +29,7 @@ You need to configure following network isolation configurations.
 - Choose network isolation mode. You have two options: allow internet outbound mode or allow only approved outbound mode.
 - If you use Visual Studio Code integration with allow only approved outbound mode, create FQDN outbound rules described in the [use Visual Studio Code](#scenario-use-visual-studio-code) section.
 - If you use HuggingFace models in Models with allow only approved outbound mode, create FQDN outbound rules described in the [use HuggingFace models](#scenario-use-huggingface-models) section.
-- If you use one of the open-source models with allow only approved outbound mode, create FQDN outbound rules described in the [Azure Direct Models](#scenario-azure-direct-models) section.
+- If you use one of the open-source models with allow only approved outbound mode, create FQDN outbound rules described in the [Models Sold Directly by Azure](#scenario-models-sold-directly-by-azure) section.
 
 ## Network isolation architecture and isolation modes
 
@@ -812,7 +812,7 @@ If you plan to use __HuggingFace models__ with the hub, add outbound _FQDN_ rule
 * cnd.auth0.com
 * cdn-lfs.huggingface.co
 
-### Scenario: Azure Direct Models
+### Scenario: Models Sold Directly by Azure
 
 These models involve dynamic installation of dependencies at runtime, and require outbound _FQDN_ rules to allow traffic to the following hosts:
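In allow-only-approved-outbound mode, each host that a scenario requires becomes one FQDN outbound rule on the hub's managed network. A hypothetical helper that renders the corresponding Azure CLI calls for a list of hosts — the `az ml workspace outbound-rule set` syntax and the rule names here are assumptions to verify against your CLI version, and the hosts shown are the ones this file lists for the HuggingFace scenario:

```python
# Hosts taken from the HuggingFace scenario above; rule names are invented.
HOSTS = ["cnd.auth0.com", "cdn-lfs.huggingface.co"]

def outbound_rule_commands(resource_group: str, workspace: str, hosts: list) -> list:
    """Render one 'az ml workspace outbound-rule set' call per FQDN destination."""
    return [
        f"az ml workspace outbound-rule set --resource-group {resource_group} "
        f"--workspace-name {workspace} --rule allow-{host.replace('.', '-')} "
        f"--type fqdn --destination {host}"
        for host in hosts
    ]

for cmd in outbound_rule_commands("my-rg", "my-hub", HOSTS):
    print(cmd)
```

Generating the commands from one host list keeps the rule set reviewable and avoids typos when a scenario requires several destinations.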

articles/ai-foundry/how-to/deploy-models-serverless.md

Lines changed: 3 additions & 3 deletions
@@ -19,7 +19,7 @@ In this article, you learn how to deploy a model from the model catalog as a sta
 
 [!INCLUDE [models-preview](../includes/models-preview.md)]
 
-[Certain models in the model catalog](deploy-models-serverless-availability.md) can be deployed as a standard deployments. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need. This deployment option doesn't require quota from your subscription.
+[Certain models in the model catalog](deploy-models-serverless-availability.md) can be deployed as a standard deployment. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need. This deployment option doesn't require quota from your subscription.
 
 This article uses a Meta Llama model deployment for illustration. However, you can use the same steps to deploy any of the [models in the model catalog that are available for standard deployment](deploy-models-serverless-availability.md).
 
@@ -522,7 +522,7 @@ In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**.
 You can select the deployment, and note the endpoint's _Target URI_ and _Key_. Use them to call the deployment and generate predictions.
 
 > [!NOTE]
-> When using the [Azure portal](https://portal.azure.com), standard deployment aren't displayed by default on the resource group. Use the **Show hidden types** option to display them on the resource group.
+> When using the [Azure portal](https://portal.azure.com), standard deployments aren't displayed by default on the resource group. Use the **Show hidden types** option to display them on the resource group.
 
 # [Azure CLI](#tab/cli)
 
@@ -555,7 +555,7 @@ In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**.
 
 ## Use the standard deployment
 
-Models deployed in Azure Machine Learning and Azure AI Foundry in standard deployments support the [Foundry Models API](../../ai-foundry/model-inference/reference/reference-model-inference-api.md) that exposes a common set of capabilities for foundational models and that can be used by developers to consume predictions from a diverse set of models in a uniform and consistent way.
+Models deployed in Azure Machine Learning and Azure AI Foundry in standard deployments support the [Azure AI Foundry Models API](../../ai-foundry/model-inference/reference/reference-model-inference-api.md) that exposes a common set of capabilities for foundational models and that can be used by developers to consume predictions from a diverse set of models in a uniform and consistent way.
 
 Read more about the [capabilities of this API](../../ai-foundry/model-inference/reference/reference-model-inference-api.md#capabilities) and how [you can use it when building applications](../../ai-foundry/model-inference/reference/reference-model-inference-api.md#getting-started).
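This file describes calling a standard deployment with the endpoint's Target URI and Key through a common API surface. A minimal, offline sketch of what such a call can look like — the endpoint below stands in for a deployment's Target URI, and the `/chat/completions` route and `Authorization` header are assumptions to verify against the API reference:

```python
import json
import urllib.request

def chat_completion_request(endpoint: str, key: str, prompt: str) -> urllib.request.Request:
    """Build (but don't send) a chat-completions request for a standard deployment."""
    body = json.dumps({
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{endpoint}/chat/completions",  # assumed route; see the API reference
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {key}",
        },
        method="POST",
    )

# Placeholder Target URI and Key from a hypothetical deployment.
req = chat_completion_request(
    "https://meta-llama3-8b-qwerty.eastus2.models.ai.azure.com", "<key>", "Hello"
)
# urllib.request.urlopen(req) would send it; omitted because it needs a live deployment.
```

Because the same request shape works across the models behind this API, swapping deployments means changing only the endpoint and key, not the calling code.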
