articles/ai-foundry/concepts/architecture.md (6 additions, 6 deletions)
@@ -56,7 +56,7 @@ Azure AI Foundry applies a flexible compute architecture to support diverse [mod
 1. [Deployment to serverless API endpoints in Azure AI Hub resources](deployments-overview.md#serverless-api-endpoint)
 1. [Deployment to managed computes in Azure AI Hub resources](deployments-overview.md#managed-compute)

-For an overview of data, privacy and security considerations with these deployment options, see [Data, privacy, and security for use of models](../how-to/concept-data-privacy.md)
+For an overview of data, privacy, and security considerations with these deployment options, see [Data, privacy, and security for use of models](../how-to/concept-data-privacy.md)

 - **Workload Execution:** Agents, Evaluations, and Batch jobs are executed as managed container compute, fully managed by Microsoft.
@@ -67,26 +67,26 @@ Azure AI Foundry applies a flexible compute architecture to support diverse [mod
 Azure AI Foundry provides flexible and secure data storage options to support a wide range of AI workloads.

 * **Managed storage for file upload**:
-  In the default setup, Azure AI Foundry uses Microsoft-managed storage accounts, that are logically separated, and support direct file uploads for select use cases—such as OpenAI models, Assistants, and Agents, without requiring a customer-provided storage account.
+  In the default setup, Azure AI Foundry uses Microsoft-managed storage accounts that are logically separated and support direct file uploads for select use cases, such as OpenAI models, Assistants, and Agents, without requiring a customer-provided storage account.

 * **Bring Your Own Storage (Optional)**:
   Users can optionally connect their own Azure Storage accounts. Foundry tools can read inputs from and write outputs to these accounts, depending on the tool and use case.

 * **Bring-your-own storage for storing Agent state:**

   * In the basic configuration, the Agent service stores threads, messages, and files in Microsoft-managed multi-tenant storage, with logical separation.
-  * With the [Agent standard setup](../agents/how-to/use-your-own-resources.md), you may bring your own storage for thread and message data. In this configuration, data is isolated by project within the customer’s storage account.
+  * With the [Agent standard setup](../agents/how-to/use-your-own-resources.md), you can bring your own storage for thread and message data. In this configuration, data is isolated by project within the customer’s storage account.

 * **Customer-Managed Key Encryption:**
   By default, Azure services use Microsoft-managed encryption keys to encrypt data in transit and at rest. Data is encrypted and decrypted using FIPS 140-2 compliant 256-bit AES encryption. Encryption and decryption are transparent, meaning encryption and access are managed for you. Your data is secure by default and you don't need to modify your code or applications to take advantage of encryption.

-  When using customer-managed keys, your data on Microsoft-managed infrastructure is encrypted using your keys for encryption.
+  When using customer-managed keys, your data on Microsoft-managed infrastructure is encrypted using your keys.

 To learn more about data encryption, see [customer-managed keys for encryption with Azure AI Foundry](encryption-keys-portal.md).

 ## Next steps

-* [Azure AI Foundry Rollout Across My Organization](planning.md)
+* [Azure AI Foundry rollout across my organization](planning.md)
 * [Customer-managed keys for encryption with Azure AI Foundry](encryption-keys-portal.md)
 * [How to configure a private link for Azure AI Foundry](../how-to/configure-private-link.md)
-* [Bring-your-own resources with Agent service](../agents/how-to/use-your-own-resources.md)
+* [Bring-your-own resources with the Agent service](../agents/how-to/use-your-own-resources.md)
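The customer-managed key change above is prose-only; as a rough companion, here is a minimal Terraform (AzAPI) sketch of what CMK encryption on an AI Foundry account can look like. The resource type, API version, `kind`, and the `encryption` property names follow the Cognitive Services ARM schema as commonly documented and are assumptions here, not content taken from the article; all names and IDs are hypothetical.

```hcl
# Hypothetical sketch: customer-managed key (CMK) encryption on an AI Foundry
# (Cognitive Services) account via the AzAPI provider. API version, kind, and
# property names are assumptions; verify against the current ARM schema.
variable "resource_group_id" {
  type        = string
  description = "Resource ID of an existing resource group"
}

resource "azapi_resource" "ai_foundry_cmk" {
  type      = "Microsoft.CognitiveServices/accounts@2025-04-01-preview" # assumed API version
  name      = "my-ai-foundry"                                           # hypothetical name
  location  = "eastus"
  parent_id = var.resource_group_id

  identity {
    type = "SystemAssigned" # this identity needs wrap/unwrap access to the key vault key
  }

  # With AzAPI v2 the body can be a plain HCL object; older releases need jsonencode().
  body = {
    kind = "AIServices" # assumed kind for an AI Foundry resource
    sku  = { name = "S0" }
    properties = {
      encryption = {
        keySource = "Microsoft.KeyVault"
        keyVaultProperties = {
          keyVaultUri = "https://my-key-vault.vault.azure.net/" # hypothetical vault
          keyName     = "my-cmk-key"                            # hypothetical key
          keyVersion  = "<key-version>"
        }
      }
    }
  }
}
```

In practice the account's managed identity usually has to be granted access to the key before the `encryption` block can take effect, so this is often split into two apply steps or given an explicit dependency on the key vault role assignment.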
articles/ai-foundry/how-to/create-resource-terraform.md (4 additions, 4 deletions)
@@ -20,23 +20,23 @@ ai-usage: ai-assisted
 In this article, you use Terraform to create an [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) resource. You learn how to use Terraform to create AI Foundry management configurations including projects, deployments, and connections.

-The examples used in article use the [AzAPI](https://learn.microsoft.com/azure/developer/terraform/overview-azapi-provider) Terraform provider. Similar [AzureRM](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/ai_services) provider support is available via the classic `AzureRM_AIServices` module (using `aiservices` kind as value), but is limited in functionality to resource and deployment creation.
+The examples in this article use the [AzAPI](/azure/developer/terraform/overview-azapi-provider) Terraform provider. Similar [AzureRM](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/ai_services) provider support is available via the classic `AzureRM_AIServices` module (using the `aiservices` kind as its value), but is limited in functionality to resource and deployment creation.
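For orientation alongside this diff, the following is a minimal, hypothetical sketch of the AzAPI route the article describes: one `azapi_resource` for the AI Foundry account and one for a project. The provider version, resource types, API version, `kind`, and SKU are assumptions for illustration, not an excerpt from the article, which also covers deployments and connections.

```hcl
terraform {
  required_providers {
    azapi = {
      source  = "Azure/azapi"
      version = "~> 2.0" # assumed; any recent AzAPI release exposes azapi_resource
    }
  }
}

provider "azapi" {}

variable "resource_group_id" {
  type        = string
  description = "Resource ID of an existing resource group to deploy into"
}

# Hypothetical sketch of an AI Foundry account created with azapi_resource.
# Type, API version, kind, and SKU are assumptions and may differ from the
# article's full example.
resource "azapi_resource" "ai_foundry" {
  type      = "Microsoft.CognitiveServices/accounts@2025-04-01-preview" # assumed API version
  name      = "my-ai-foundry"
  location  = "eastus"
  parent_id = var.resource_group_id

  body = {
    kind = "AIServices" # assumed kind for an AI Foundry resource
    sku  = { name = "S0" }
    properties = {
      customSubDomainName = "my-ai-foundry" # commonly required for keyless auth and network rules
    }
  }
}

# Hypothetical project, assuming projects are child resources of the account.
resource "azapi_resource" "project" {
  type      = "Microsoft.CognitiveServices/accounts/projects@2025-04-01-preview" # assumed type and version
  name      = "my-project"
  location  = "eastus"
  parent_id = azapi_resource.ai_foundry.id

  identity {
    type = "SystemAssigned"
  }

  body = {
    properties = {}
  }
}
```

A `terraform plan` against an existing resource group ID should show the two resources; the AzureRM path mentioned above would instead use its own resource blocks and, per the article, stops at resource and deployment creation.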