`articles/ai-foundry/concepts/architecture.md`

Azure AI Foundry provides a comprehensive set of tools to support development teams in building, customizing, evaluating, and operating AI Agents and the models and tools that compose them.

This article provides IT security teams with details on the Azure service architecture, its components, and how it relates to other Azure resource types. Use this information to guide how you [customize](../how-to/configure-private-link.md) your Foundry deployment to your organization's requirements. For more guidance on rolling out AI Foundry in your organization, see [Azure AI Foundry Rollout](planning.md).

## Azure AI resource types and providers
Within the Azure AI product family, we distinguish three [Azure resource providers](https://learn.microsoft.com/azure/azure-resource-manager/management/resource-providers-and-types) supporting user needs at different layers in the stack.

| Resource provider | Purpose | Azure resource types |
| --- | --- | --- |
| Microsoft.CognitiveServices | Supports agentic and GenAI application development that composes and customizes prebuilt models. | Azure AI Foundry; Azure OpenAI service; Azure Speech; Azure Vision |
| Microsoft.Search | Supports knowledge retrieval over your data. | Azure AI Search |
| Microsoft.MachineLearningServices | Train, deploy, and operate custom and open-source machine learning models. | Azure AI Hub (and its projects); Azure Machine Learning workspace |

[Resource provider registration](/azure/azure-resource-manager/management/resource-providers-and-types#register-resource-provider) is required in your Azure subscription before you can create the resource types above.

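If you manage subscriptions from code, the following is a minimal sketch (using the general Azure SDK for Python rather than any Foundry-specific tooling) of checking and registering these providers; the subscription ID is a placeholder.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Placeholder subscription ID; the identity used must be allowed to register providers.
client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

for namespace in (
    "Microsoft.CognitiveServices",
    "Microsoft.Search",
    "Microsoft.MachineLearningServices",
):
    provider = client.providers.get(namespace)
    if provider.registration_state != "Registered":
        client.providers.register(namespace)  # registration completes asynchronously
    print(namespace, client.providers.get(namespace).registration_state)
```

The equivalent Azure CLI command is `az provider register --namespace <namespace>`.
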
The Azure AI Foundry resource is the primary resource for Azure AI and is recommended for most use cases. It is built on the same [Azure resource provider and resource type](https://learn.microsoft.com/azure/azure-resource-manager/management/resource-providers-and-types) as Azure OpenAI service, Azure Speech, Azure Vision, and Azure Language service. It provides access to the superset of capabilities of each individual service.

Resource types under the same provider namespace share the same control plane, and hence use similar [Azure RBAC](#link) actions, networking configurations, and aliases for Azure Policy configuration. If you're upgrading from Azure OpenAI to Azure AI Foundry, your existing custom Azure policies and Azure RBAC assignments continue to apply.

## Security-driven separation of concerns
Azure AI Foundry enforces a clear separation between management and development operations to ensure secure and scalable AI workloads.
- **Top-Level Resource Governance:** Management operations, such as configuring security, establishing connectivity with other Azure services, and managing deployments, are scoped to the top-level Azure AI Foundry resource. Development activities are isolated within dedicated project containers, which encapsulate use cases and provide boundaries for access control, files, agents, and evaluations.
- **Role-Based Access Control (RBAC):** Azure RBAC actions are designed to reflect this separation of concerns. Control plane actions (for example, creating deployments and projects) are distinct from data plane actions (for example, building agents, running evaluations, and uploading files). RBAC assignments can be scoped at both the top-level resource and the individual project level, as illustrated in the sketch after this list. [Managed identities](https://learn.microsoft.com/entra/identity/managed-identities-azure-resources/overview) can be assigned at either scope to support secure automation and service access.
- **Monitoring and Observability:** Azure Monitor metrics are segmented by scope. Management and usage metrics are available at the top-level resource, while project-specific metrics, such as evaluation performance or agent activity, are scoped to the individual project containers. A metrics query sketch also follows this list.

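As a rough illustration of this scoping, the sketch below uses the Azure SDK for Python (`azure-mgmt-authorization`) to create a role assignment at the project scope rather than the account scope; the subscription, resource group, account, project, role definition GUID, and principal ID are all placeholders, and which built-in roles apply depends on your Foundry configuration.

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"  # placeholder

# The project is a child resource of the Foundry account, so its scope is a sub-path.
account_scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.CognitiveServices/accounts/<foundry-account>"
)
project_scope = f"{account_scope}/projects/<project-name>"

auth_client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Grant a data-plane role to a user or managed identity at the project scope only;
# the role definition GUID and principal object ID are placeholders.
auth_client.role_assignments.create(
    scope=project_scope,
    role_assignment_name=str(uuid.uuid4()),
    parameters=RoleAssignmentCreateParameters(
        role_definition_id=(
            f"/subscriptions/{subscription_id}"
            "/providers/Microsoft.Authorization/roleDefinitions/<role-definition-guid>"
        ),
        principal_id="<user-or-managed-identity-object-id>",
    ),
)
```
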
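To illustrate the metric scope split, here's a minimal sketch with the `azure-monitor-query` package that lists and queries metrics on the top-level Foundry resource; the resource ID and metric name are placeholders, and the metrics actually emitted depend on your workloads.

```python
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient

client = MetricsQueryClient(DefaultAzureCredential())

# Resource ID of the top-level Foundry resource (placeholder values).
resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.CognitiveServices/accounts/<foundry-account>"
)

# The metric name below is illustrative; enumerate what the resource emits first.
for definition in client.list_metric_definitions(resource_id):
    print("available metric:", definition.name)

response = client.query_resource(resource_id, metric_names=["TotalCalls"])
for metric in response.metrics:
    for series in metric.timeseries:
        for point in series.data:
            print(metric.name, point.timestamp, point.total)
```
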
## Computing infrastructure
Azure AI Foundry uses a flexible compute architecture to support diverse [model access](../concepts/foundry-models-overview.md) and workload execution scenarios.

- **Model Hosting Architecture:** Foundry models can be accessed in different ways:

1. [Standard deployment in Azure AI Foundry resources](deployments-overview.md#standard-deployment-in-azure-ai-foundry-resources)
1. [Deployment to serverless API endpoints](deployments-overview.md#serverless-api-endpoint)
1. [Deployment to managed computes](deployments-overview.md#managed-compute)

For an overview of data, privacy, and security considerations with these deployment options, see [Data, privacy, and security for use of models](../how-to/concept-data-privacy.md). A minimal sketch of calling a model deployment from code follows this list.

- **Workload Execution:** Agents, Evaluations, and Batch jobs are executed on managed container compute, fully managed by Microsoft.
- **Networking Integration:** For enhanced security and compliance when your Agents connect with external systems, [container injection](../agents/how-to/virtual-networks.md) allows the platform network to host APIs and inject a subnet into your network, enabling local communication between your Azure resources within the same virtual network.

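As an illustrative sketch, not an official sample, the following shows what consuming a standard deployment from application code might look like with the `azure-ai-inference` package and Microsoft Entra ID authentication; the endpoint URL and deployment name are placeholders, and your resource's authentication setup may differ.

```python
from azure.identity import DefaultAzureCredential
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage

# Endpoint and deployment name are placeholders for a standard deployment
# in an Azure AI Foundry resource.
client = ChatCompletionsClient(
    endpoint="https://<your-foundry-resource>.services.ai.azure.com/models",
    credential=DefaultAzureCredential(),
    credential_scopes=["https://cognitiveservices.azure.com/.default"],
)

response = client.complete(
    model="<your-deployment-name>",
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize what a standard deployment is."),
    ],
)
print(response.choices[0].message.content)
```
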
## Data storage
Azure AI Foundry provides flexible and secure data storage options to support a wide range of AI workloads.
* **Managed Storage**:
In the default setup, Azure AI Foundry uses Microsoft-managed storage accounts that are logically separated and support direct file uploads for select use cases, such as OpenAI models, Assistants, and Agents, without requiring a customer-provided storage account.
* **Bring Your Own Storage (Optional)**:
Users can optionally connect their own Azure Storage accounts. Foundry tools can read inputs from and write outputs to these accounts, depending on the tool and use case; a storage access sketch appears at the end of this section.
* **Agent Data Storage:**
* In the basic configuration, the Agent service stores threads, messages, and files in Microsoft-managed multi-tenant storage, with logical separation.
* With the [Agent standard setup](../agents/how-to/use-your-own-resources.md), you may bring your own storage for thread and message data. In this configuration, data is isolated by project within the customer’s storage account.
* **Customer-Managed Key Encryption:**
When using customer-managed keys, data remains stored in Microsoft-managed multi-tenant infrastructure, encrypted using the customer’s keys. To support in-product search and optimized query performance, a dedicated Azure Search instance is provisioned for metadata indexing.
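
The following is a minimal sketch, assuming you've connected your own storage account and have Microsoft Entra ID access to it, of application code writing inputs to and listing outputs in that account with the `azure-storage-blob` package; the account URL, container, and blob names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder account URL and container; use the storage account you connected to Foundry.
service = BlobServiceClient(
    account_url="https://<your-storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
container = service.get_container_client("<your-container>")

# Upload an input file and list what the outputs prefix holds.
with open("input.jsonl", "rb") as data:
    container.upload_blob(name="inputs/input.jsonl", data=data, overwrite=True)

for blob in container.list_blobs(name_starts_with="outputs/"):
    print(blob.name, blob.size)
```
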
## Credential storage
When configuring Azure AI Foundry tools to connect with external Azure or non-Azure services, certain scenarios require storing sensitive credentials such as connection strings or API keys.
* **Default Configuration:**
By default, secrets are stored in Microsoft-managed Key Vault instances.
* **Bring Your Own Key Vault (Preview):**
As an optional configuration (preview), you can integrate your own Azure Key Vault instance for credential storage. This allows for greater control over secret management and aligns with enterprise-specific security policies.
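
The following is a minimal sketch, assuming a bring-your-own Key Vault and Microsoft Entra ID access to it, of storing and reading a credential with the `azure-keyvault-secrets` package; the vault URL, secret name, and value are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholder vault URL; point this at the Key Vault you bring to Foundry.
client = SecretClient(
    vault_url="https://<your-key-vault>.vault.azure.net",
    credential=DefaultAzureCredential(),
)

# Store an API key for an external service, then read it back.
client.set_secret("external-service-api-key", "<secret-value>")
retrieved = client.get_secret("external-service-api-key")
print(retrieved.name, "retrieved (value not printed)")
```
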
## Next steps
* [Azure AI Foundry Rollout Across My Organization](planning.md)
* [Customer-managed keys for encryption with Azure AI Foundry](encryption-keys-portal.md)
* [How to configure a private link for Azure AI Foundry](../how-to/configure-private-link.md)
* [Bring-your-own resources with Agent service](../agents/how-to/use-your-own-resources.md)