Commit 8532431

Author: Jill Grant
Merge pull request #2196 from eric-urban/eur/ai-foundry-text
rebrand from Azure OpenAI Studio
2 parents 30be1bf + 3f6c056 commit 8532431

23 files changed: +112, -252 lines changed

articles/ai-services/openai/concepts/use-your-data.md

Lines changed: 19 additions & 19 deletions
Large diffs are not rendered by default.

articles/ai-services/openai/how-to/fine-tuning.md

Lines changed: 3 additions & 3 deletions
@@ -1,7 +1,7 @@
 ---
 title: 'Customize a model with Azure OpenAI Service'
 titleSuffix: Azure OpenAI
-description: Learn how to create your own customized model with Azure OpenAI Service by using Python, the REST APIs, or Azure OpenAI Studio.
+description: Learn how to create your own customized model with Azure OpenAI Service by using Python, the REST APIs, or Azure AI Foundry portal.
 #services: cognitive-services
 manager: nitinme
 ms.service: azure-ai-openai
@@ -28,7 +28,7 @@ We use LoRA, or low rank approximation, to fine-tune models in a way that reduce

 ::: zone pivot="programming-language-studio"

-[!INCLUDE [Azure OpenAI Studio fine-tuning](../includes/fine-tuning-unified.md)]
+[!INCLUDE [Azure AI Foundry portal fine-tuning](../includes/fine-tuning-unified.md)]

 ::: zone-end

@@ -159,7 +159,7 @@ In order to successfully access fine-tuning, you need **Cognitive Services OpenA

 ### Why did my upload fail?

-If your file upload fails in Azure OpenAI Studio, you can view the error message under “data files in Azure OpenAI Studio. Hover your mouse over where it says “error” (under the status column) and an explanation of the failure will be displayed.
+If your file upload fails in Azure AI Foundry portal, you can view the error message under **Data files** in Azure AI Foundry portal. Hover your mouse over where it says “error” (under the status column) and an explanation of the failure will be displayed.

 :::image type="content" source="../media/fine-tuning/error.png" alt-text="Screenshot of fine-tuning error message." lightbox="../media/fine-tuning/error.png":::
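The article this hunk touches also covers the Python and REST API paths mentioned in its description. For orientation only, here is a minimal sketch of the programmatic route using the `openai` Python package against Azure OpenAI; the endpoint, key, API version, file name, and base model name are placeholders, not values taken from this commit.

```python
# Minimal sketch: upload a training file and start a fine-tuning job
# against Azure OpenAI with the openai Python SDK. Endpoint, key,
# API version, file name, and model name are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-api-key>",
    api_version="2024-05-01-preview",
)

# Upload the JSONL training data; a failed upload surfaces here as an
# exception or a file status of "error", mirroring the portal behavior
# described in the FAQ above.
training_file = client.files.create(
    file=open("training_set.jsonl", "rb"),
    purpose="fine-tune",
)
print(client.files.retrieve(training_file.id).status)

# Start the fine-tuning job on a fine-tunable base model.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-35-turbo-0613",
)
print(job.id, job.status)
```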

articles/ai-services/openai/how-to/on-your-data-configuration.md

Lines changed: 11 additions & 11 deletions
@@ -67,9 +67,9 @@ Azure OpenAI On Your Data lets you restrict the documents that can be used in re
 `group_ids` is the default field name. If you use a different field name like `my_group_ids`, you can map the field in [index field mapping](../concepts/use-your-data.md#index-field-mapping).

 1. Make sure each sensitive document in the index has this security field value set to the permitted groups of the document.
-1. In [Azure OpenAI Studio](https://oai.azure.com/portal), add your data source. in the [index field mapping](../concepts/use-your-data.md#index-field-mapping) section, you can map zero or one value to the **permitted groups** field, as long as the schema is compatible. If the **permitted groups** field isn't mapped, document level access is disabled.
+1. In [Azure AI Foundry portal](https://oai.azure.com/portal), add your data source. In the [index field mapping](../concepts/use-your-data.md#index-field-mapping) section, you can map zero or one value to the **permitted groups** field, as long as the schema is compatible. If the **permitted groups** field isn't mapped, document level access is disabled.

-**Azure OpenAI Studio**
+**Azure AI Foundry portal**

 Once the Azure AI Search index is connected, your responses in the studio have document access based on the Microsoft Entra permissions of the logged in user.
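To make the document-level access behavior above concrete, here is a hedged sketch of indexing a document that carries a `group_ids` security field, using the `azure-search-documents` Python package; the search endpoint, key, index name, document fields, and group ID are illustrative placeholders and are not part of this commit.

```python
# Illustrative sketch: index a document with a `group_ids` security field
# so Azure OpenAI On Your Data can filter responses by the caller's
# Microsoft Entra groups. All names, keys, and IDs are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="<your-index>",
    credential=AzureKeyCredential("<admin-key>"),
)

documents = [
    {
        "id": "doc-001",
        "content": "Quarterly planning notes (restricted).",
        # Permitted Microsoft Entra group object IDs for this document.
        "group_ids": ["89c2fd18-9be1-4f45-8e7c-0d2a9e3b5f10"],
    }
]

result = search_client.upload_documents(documents=documents)
print([r.succeeded for r in result])
```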

@@ -178,7 +178,7 @@ This step can be skipped only if you have a [shared private link](#create-shared

 You can disable public network access of your Azure OpenAI resource in the Azure portal.

-To allow access to your Azure OpenAI Service from your client machines, like using Azure OpenAI Studio, you need to create [private endpoint connections](/azure/ai-services/cognitive-services-virtual-networks?tabs=portal#use-private-endpoints) that connect to your Azure OpenAI resource.
+To allow access to your Azure OpenAI Service from your client machines, like using Azure AI Foundry portal, you need to create [private endpoint connections](/azure/ai-services/cognitive-services-virtual-networks?tabs=portal#use-private-endpoints) that connect to your Azure OpenAI resource.


 ## Configure Azure AI Search
@@ -202,7 +202,7 @@ For more information, see the [Azure AI Search RBAC article](/azure/search/searc

 You can disable public network access of your Azure AI Search resource in the Azure portal.

-To allow access to your Azure AI Search resource from your client machines, like using Azure OpenAI Studio, you need to create [private endpoint connections](/azure/search/service-create-private-endpoint) that connect to your Azure AI Search resource.
+To allow access to your Azure AI Search resource from your client machines, like using Azure AI Foundry portal, you need to create [private endpoint connections](/azure/search/service-create-private-endpoint) that connect to your Azure AI Search resource.


 ### Enable trusted service
@@ -256,7 +256,7 @@ In the Azure portal, navigate to your storage account networking tab, choose "Se

 You can disable public network access of your Storage Account in the Azure portal.

-To allow access to your Storage Account from your client machines, like using Azure OpenAI Studio, you need to create [private endpoint connections](/azure/storage/common/storage-private-endpoints) that connect to your blob storage.
+To allow access to your Storage Account from your client machines, like using Azure AI Foundry portal, you need to create [private endpoint connections](/azure/storage/common/storage-private-endpoints) that connect to your blob storage.

@@ -285,9 +285,9 @@ To enable the developers to use these resources to build applications, the admin

 |Role| Resource | Description |
 |--|--|--|
-| `Cognitive Services OpenAI Contributor` | Azure OpenAI | Call public ingestion API from Azure OpenAI Studio. The `Contributor` role is not enough, because if you only have `Contributor` role, you cannot call data plane API via Microsoft Entra ID authentication, and Microsoft Entra ID authentication is required in the secure setup described in this article. |
-| `Contributor` | Azure AI Search | List API-Keys to list indexes from Azure OpenAI Studio.|
-| `Contributor` | Storage Account | List Account SAS to upload files from Azure OpenAI Studio.|
+| `Cognitive Services OpenAI Contributor` | Azure OpenAI | Call public ingestion API from Azure AI Foundry portal. The `Contributor` role is not enough, because if you only have `Contributor` role, you cannot call data plane API via Microsoft Entra ID authentication, and Microsoft Entra ID authentication is required in the secure setup described in this article. |
+| `Contributor` | Azure AI Search | List API-Keys to list indexes from Azure AI Foundry portal.|
+| `Contributor` | Storage Account | List Account SAS to upload files from Azure AI Foundry portal.|
 | `Contributor` | The resource group or Azure subscription where the developer need to deploy the web app to | Deploy web app to the developer's Azure subscription.|
 | `Role Based Access Control Administrator` | Azure OpenAI | Permission to configure the necessary role assignment on the Azure OpenAI resource. Enables the web app to call Azure OpenAI. |
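As a companion to the role table, here is a rough sketch of the keyless, Microsoft Entra ID data-plane call that the `Cognitive Services OpenAI Contributor` assignment enables in the secure setup; the endpoint, API version, deployment name, and prompt are placeholders, not values defined by this commit.

```python
# Illustrative sketch: call the Azure OpenAI data plane with Microsoft
# Entra ID authentication instead of an API key. This is the kind of
# call that plain `Contributor` cannot make in the secure setup above.
# Endpoint, API version, and deployment name are placeholders.
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    azure_ad_token_provider=token_provider,
    api_version="2024-06-01",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # deployment name, not base model name
    messages=[{"role": "user", "content": "Summarize the secure setup steps."}],
)
print(response.choices[0].message.content)
```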

@@ -309,9 +309,9 @@ Configure your local machine `hosts` file to point your resources host names to
 10.0.0.7 contoso.blob.core.windows.net
 ```

-## Azure OpenAI Studio
+## Azure AI Foundry portal

-You should be able to use all Azure OpenAI Studio features, including both ingestion and inference, from your on-premises client machines.
+You should be able to use all Azure AI Foundry portal features, including both ingestion and inference, from your on-premises client machines.

 ## Web app
 The web app communicates with your Azure OpenAI resource. Since your Azure OpenAI resource has public network disabled, the web app needs to be set up to use the private endpoint in your virtual network to access your Azure OpenAI resource.
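The `hosts` entries shown earlier in this hunk are easy to sanity-check from a client machine before opening the portal. The following is a small, illustrative check; only the blob host name appears in the visible hunk, so the other contoso-style names and the private-IP expectation are assumptions about your own setup.

```python
# Quick local check (illustrative): confirm the resource host names from
# the hosts-file example resolve to private IP addresses on this machine.
# Host names follow the contoso placeholders used in the article and are
# assumptions, not values from this commit.
import ipaddress
import socket

hosts = [
    "contoso.openai.azure.com",
    "contoso.search.windows.net",
    "contoso.blob.core.windows.net",
]

for name in hosts:
    ip = socket.gethostbyname(name)
    is_private = ipaddress.ip_address(ip).is_private
    print(f"{name} -> {ip} (private: {is_private})")
```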
@@ -322,7 +322,7 @@ The web app needs to resolve your Azure OpenAI host name to the private IP of th
 1. [Add a DNS record](/azure/dns/private-dns-getstarted-portal#create-an-additional-dns-record). The IP is the private IP of the private endpoint for your Azure OpenAI resource, and you can get the IP address from the network interface associated with the private endpoint for your Azure OpenAI.
 1. [Link the private DNS zone to your virtual network](/azure/dns/private-dns-getstarted-portal#link-the-virtual-network) so the web app integrated in this virtual network can use this private DNS zone.

-When deploying the web app from Azure OpenAI Studio, select the same location with the virtual network, and select a proper SKU, so it can support the [virtual network integration feature](/azure/app-service/overview-vnet-integration).
+When deploying the web app from Azure AI Foundry portal, select the same location with the virtual network, and select a proper SKU, so it can support the [virtual network integration feature](/azure/app-service/overview-vnet-integration).

 After the web app is deployed, from the Azure portal networking tab, configure the web app outbound traffic virtual network integration, choose the third subnet that you reserved for web app.
articles/ai-services/openai/how-to/provisioned-get-started.md

Lines changed: 14 additions & 17 deletions
@@ -22,7 +22,6 @@ The following guide walks you through key steps in creating a provisioned deploy

 - An Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services?azure-portal=true)
 - Azure Contributor or Cognitive Services Contributor role
-- Access to Azure OpenAI Studio

 ## Obtain/verify PTU quota availability.
@@ -37,9 +36,9 @@ Creating a new deployment requires available (unused) quota to cover the desired

 Then 200 PTUs of quota are considered used, and there are 300 PTUs available for use to create new deployments.

-A default amount of global, data zone, and regional provisioned quota is assigned to eligible subscriptions in several regions. You can view the quota available to you in a region by visiting the Quotas pane in Azure AI Foundry and selecting the desired subscription and region. For example, the screenshot below shows a quota limit of 500 PTUs in West US for the selected subscription. Note that you might see lower values of available default quotas.
+A default amount of global, data zone, and regional provisioned quota is assigned to eligible subscriptions in several regions. You can view the quota available to you in a region by visiting the Quotas pane in Azure AI Foundry portal and selecting the desired subscription and region. For example, the screenshot below shows a quota limit of 500 PTUs in West US for the selected subscription. Note that you might see lower values of available default quotas.

-:::image type="content" source="../media/provisioned/available-quota.png" alt-text="A screenshot of the available quota in Azure OpenAI studio." lightbox="../media/provisioned/available-quota.png":::
+:::image type="content" source="../media/provisioned/available-quota.png" alt-text="A screenshot of the available quota in Azure AI Foundry portal." lightbox="../media/provisioned/available-quota.png":::

 Additional quota can be requested by clicking the Request Quota link to the right of the “Usage/Limit” column. (This is off-screen in the screenshot above).
@@ -54,16 +53,15 @@ Provisioned deployments are created via Azure OpenAI resource objects within Azu

 once you have verified your quota, you can create a deployment. To create a provisioned deployment, you can follow these steps; the choices described reflect the entries shown in the screenshot.

-:::image type="content" source="../media/provisioned/deployment-screen.png" alt-text="Screenshot of the Azure OpenAI Studio deployment page for a provisioned deployment." lightbox="../media/provisioned/deployment-screen.png":::
+:::image type="content" source="../media/provisioned/deployment-screen.png" alt-text="Screenshot of the Azure AI Foundry portal deployment page for a provisioned deployment." lightbox="../media/provisioned/deployment-screen.png":::



-1. Sign into [Azure AI Foundry](https://oai.azure.com)
+1. Sign into the [Azure AI Foundry portal](https://ai.azure.com).
 1. Choose the subscription that was enabled for provisioned deployments & select the desired resource in a region where you have the quota.
-
-3. Under **Management** in the left-nav select **Deployments**.
-4. Select Create new deployment and configure the following fields. Expand the **advanced options** drop-down menu.
-5. Fill out the values in each field. Here's an example:
+1. Under **Management** in the left-nav select **Deployments**.
+1. Select Create new deployment and configure the following fields. Expand the **advanced options** drop-down menu.
+1. Fill out the values in each field. Here's an example:

 | Field | Description | Example |
 |--|--|--|
@@ -74,8 +72,8 @@ once you have verified your quota, you can create a prov
 | Deployment Type |This impacts the throughput and performance. Choose Global Provisioned-Managed, DataZone Provisioned-Managed or Provisioned-Managed from the deployment dialog dropdown for your deployment | Provisioned-Managed |
 | Provisioned Throughput Units | Choose the amount of throughput you wish to include in the deployment. | 100 |

-Important things to note:
-* The deployment dialog contains a reminder that you can purchase an Azure Reservation for Azure OpenAI Provisioned to obtain a significant discount for a term commitment.
+> [!NOTE]
+> The deployment dialog contains a reminder that you can purchase an Azure Reservation for Azure OpenAI Provisioned to obtain a significant discount for a term commitment.

 Once you have entered the deployment settings, click **Confirm Pricing** to continue. A pricing confirmation dialog will appear that will display the list price for the deployment, if you choose to pay for it on an hourly basis, with no Azure Reservation to provide a term discount.
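The next hunk notes that REST, ARM templates, Bicep, and Terraform can also create these deployments. For orientation only, here is a hedged sketch of the same portal fields (deployment name, model, deployment type, PTU count) expressed with the `azure-mgmt-cognitiveservices` Python package; every identifier, the model version, and the exact property shape are assumptions to verify against the SDK and API version you use.

```python
# Hedged sketch (not the documented procedure): create a Provisioned-Managed
# deployment with the Azure management SDK. Subscription, resource group,
# resource name, model version, and PTU count are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.cognitiveservices import CognitiveServicesManagementClient

client = CognitiveServicesManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

poller = client.deployments.begin_create_or_update(
    resource_group_name="<resource-group>",
    account_name="<azure-openai-resource>",
    deployment_name="gpt-4o-provisioned",
    deployment={
        "sku": {"name": "ProvisionedManaged", "capacity": 100},  # 100 PTUs
        "properties": {
            "model": {"format": "OpenAI", "name": "gpt-4o", "version": "<model-version>"},
        },
    },
)
print(poller.result().properties.provisioning_state)
```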

@@ -108,16 +106,16 @@ REST, ARM template, Bicep, and Terraform can also be used to create deployments.

 Due to the dynamic nature of capacity availability, it is possible that the region of your selected resource might not have the service capacity to create the deployment of the specified model, version, and number of PTUs.

-In this event, Azure AI Foundry will direct you to other regions with available quota and capacity to create a deployment of the desired model. If this happens, the deployment dialog will look like this:
+In this event, the wizard in Azure AI Foundry portal will direct you to other regions with available quota and capacity to create a deployment of the desired model. If this happens, the deployment dialog will look like this:

-:::image type="content" source="../media/provisioned/deployment-screen-2.png" alt-text="Screenshot of the Azure OpenAI Studio deployment page for a provisioned deployment with no capacity available." lightbox="../media/provisioned/deployment-screen-2.png":::
+:::image type="content" source="../media/provisioned/deployment-screen-2.png" alt-text="Screenshot of the Azure AI Foundry portal deployment page for a provisioned deployment with no capacity available." lightbox="../media/provisioned/deployment-screen-2.png":::

 Things to notice:

 * A message displays showing you many PTUs you have in available quota, and how many can currently be deployed at this time.
 * If you select a number of PTUs greater than service capacity, a message will appear that provides options for you to obtain more capacity, and a button to allow you to select an alternate region. Clicking the "See other regions" button will display a dialog that shows a list of Azure OpenAI resources where you can create a deployment, along with the maximum sized deployment that can be created based on available quota and service capacity in each region.

-:::image type="content" source="../media/provisioned/choose-different-resource.png" alt-text="Screenshot of the Azure OpenAI Studio deployment page for choosing a different resource and region." lightbox="../media/provisioned/choose-different-resource.png":::
+:::image type="content" source="../media/provisioned/choose-different-resource.png" alt-text="Screenshot of the Azure AI Foundry portal deployment page for choosing a different resource and region." lightbox="../media/provisioned/choose-different-resource.png":::

 Selecting a resource and clicking **Switch resource** will cause the deployment dialog to redisplay using the selected resource. You can then proceed to create your deployment in the new region.

@@ -167,9 +165,8 @@ The inferencing code for provisioned deployments is the same a standard deployme

 ## Understanding expected throughput
 The amount of throughput that you can achieve on the endpoint is a factor of the number of PTUs deployed, input size, output size, and call rate. The number of concurrent calls and total tokens processed can vary based on these values. Our recommended way for determining the throughput for your deployment is as follows:
-1. Use the Capacity calculator for a sizing estimate. You can find the capacity calculator in Azure AI Foundry under the quotas page and Provisioned tab.
-
-2. Benchmark the load using real traffic workload. For more information about benchmarking, see the [benchmarking](#run-a-benchmark) section.
+1. Use the Capacity calculator for a sizing estimate. You can find the capacity calculator in Azure AI Foundry portal under the quotas page and Provisioned tab.
+1. Benchmark the load using real traffic workload. For more information about benchmarking, see the [benchmarking](#run-a-benchmark) section.


 ## Measuring your deployment utilization
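The hunk above notes that inferencing code for provisioned deployments is the same as for standard deployments, and that real-traffic benchmarking is the recommended way to validate throughput. As a toy illustration only (not the documented benchmarking tool), the following sketch sends a handful of requests and derives a rough tokens-per-minute figure from the reported usage; the endpoint, key, deployment name, and prompt are placeholders.

```python
# Toy illustration: send a few requests to a provisioned deployment and
# compute a rough tokens-per-minute figure from the reported usage.
# For real sizing, use the capacity calculator and the benchmarking
# guidance referenced above. Endpoint, key, and deployment are placeholders.
import time

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<your-api-key>",
    api_version="2024-06-01",
)

total_tokens = 0
start = time.monotonic()

for _ in range(5):  # a handful of sequential calls, not a load test
    response = client.chat.completions.create(
        model="<your-provisioned-deployment>",  # deployment name
        messages=[{"role": "user", "content": "Write one sentence about PTUs."}],
        max_tokens=64,
    )
    total_tokens += response.usage.total_tokens

elapsed_minutes = (time.monotonic() - start) / 60
print(f"Observed roughly {total_tokens / elapsed_minutes:.0f} tokens/minute over 5 calls")
```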