articles/ai-studio/concepts/ai-resources.md (2 additions, 2 deletions)
@@ -16,7 +16,7 @@ author: Blackmist
# Manage, collaborate, and organize with hubs
- Hubs are the primary top-level Azure resource for AI studio and provide a central way for a team to govern security, connectivity, and computing resources across playgrounds and projects. Once a hub is created, developers can create projects from it and access shared company resources without needing an IT administrator's repeated help.
+ Hubs are the primary top-level Azure resource for AI Studio and provide a central way for a team to govern security, connectivity, and computing resources across playgrounds and projects. Once a hub is created, developers can create projects from it and access shared company resources without needing an IT administrator's repeated help.
Project workspaces that are created using a hub inherit the same security settings and shared resource access. Teams can create project workspaces as needed to organize their work, isolate data, and/or restrict access.
@@ -99,7 +99,7 @@ Azure AI Studio layers on top of existing Azure services including Azure AI and
- When you create a new hub, a set of dependent Azure resources is required to store data that you upload or that gets generated when working in AI studio. If required and not provided by you, these resources are created automatically.
+ When you create a new hub, a set of dependent Azure resources is required to store data that you upload or that gets generated when working in AI Studio. If required and not provided by you, these resources are created automatically.
articles/ai-studio/concepts/safety-evaluations-transparency-note.md (2 additions, 2 deletions)
@@ -51,9 +51,9 @@ Azure AI Studio provisions an Azure OpenAI GPT-4 model and orchestrates adversar
The safety evaluations aren't intended for any purpose other than evaluating the content risks and jailbreak vulnerabilities of your generative AI application:
- - **Evaluating your generative AI application pre-deployment**: Using the evaluation wizard in the Azure AI studio or the Azure AI Python SDK, safety evaluations can automatically assess potential content or security risks.
+ - **Evaluating your generative AI application pre-deployment**: Using the evaluation wizard in the Azure AI Studio or the Azure AI Python SDK, safety evaluations can automatically assess potential content or security risks.
- **Augmenting your red-teaming operations**: Using the adversarial simulator, safety evaluations can simulate adversarial interactions with your generative AI application to attempt to uncover content and security risks.
- - **Communicating content and security risks to stakeholders**: Using the Azure AI studio, you can share access to your Azure AI Studio project, along with safety evaluation results, with auditors or compliance stakeholders.
+ - **Communicating content and security risks to stakeholders**: Using the Azure AI Studio, you can share access to your Azure AI Studio project, along with safety evaluation results, with auditors or compliance stakeholders.
articles/ai-studio/how-to/create-projects.md (1 addition, 1 deletion)
@@ -122,7 +122,7 @@ In addition, a number of resources are only accessible by users in your project
| workspacefilestore | {project-GUID}-code | Hosts files created on your compute and when using prompt flow |
> [!NOTE]
- > Storage connections are not created directly with the project when your storage account has public network access set to disabled. These are created instead when a first user accesses AI studio over a private network connection. [Troubleshoot storage connections](troubleshoot-secure-connection-project.md#troubleshoot-missing-storage-connections)
+ > Storage connections are not created directly with the project when your storage account has public network access set to disabled. These are created instead when a first user accesses AI Studio over a private network connection. [Troubleshoot storage connections](troubleshoot-secure-connection-project.md#troubleshoot-missing-storage-connections)
articles/ai-studio/how-to/deploy-models-serverless.md (1 addition, 1 deletion)
@@ -548,7 +548,7 @@ In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**.
## Use the serverless API endpoint
- Models deployed in Azure Machine Learning and Azure AI studio in Serverless API endpoints support the [Azure AI Model Inference API](../reference/reference-model-inference-api.md), which exposes a common set of capabilities for foundational models that developers can use to consume predictions from a diverse set of models in a uniform and consistent way.
+ Models deployed in Azure Machine Learning and Azure AI Studio in Serverless API endpoints support the [Azure AI Model Inference API](../reference/reference-model-inference-api.md), which exposes a common set of capabilities for foundational models that developers can use to consume predictions from a diverse set of models in a uniform and consistent way.
Read more about the [capabilities of this API](../reference/reference-model-inference-api.md#capabilities) and how [you can use it when building applications](../reference/reference-model-inference-api.md#getting-started).
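As a rough illustration of the section above, a minimal sketch of a chat-completions call against a serverless API endpoint might look like the following. The endpoint URL reuses the example deployment name from this diff, but the exact URL shape, the auth header name, and the request body are assumptions for illustration; take the real values from the deployment's details page and the Model Inference API reference.

```python
import json
import urllib.request  # used by the commented-out send step below

# Hypothetical endpoint values for illustration only; replace with the
# URL and key shown for your deployment in the portal.
ENDPOINT = "https://meta-llama3-8b-qwerty.eastus2.models.ai.azure.com"
API_KEY = "<your-endpoint-key>"

def build_chat_request(messages, max_tokens=256):
    """Build a JSON body for a chat-completions call (assumed shape)."""
    return {"messages": messages, "max_tokens": max_tokens}

body = build_chat_request(
    [{"role": "user", "content": "How many languages are in the world?"}]
)

# To actually send the request, uncomment with real credentials
# (header name may differ; check the API reference):
# req = urllib.request.Request(
#     f"{ENDPOINT}/chat/completions",
#     data=json.dumps(body).encode("utf-8"),
#     headers={"Content-Type": "application/json",
#              "Authorization": f"Bearer {API_KEY}"},
# )
# print(urllib.request.urlopen(req).read())
```

Because every serverless deployment exposes the same common API surface, the same request body works across different models behind such endpoints.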
- After logging your custom evaluator to your AI Studio project, you can view it in your [Evaluator library](../evaluate-generative-ai-app.md#view-and-manage-the-evaluators-in-the-evaluator-library) under the **Evaluation** tab in AI studio.
+ After logging your custom evaluator to your AI Studio project, you can view it in your [Evaluator library](../evaluate-generative-ai-app.md#view-and-manage-the-evaluators-in-the-evaluator-library) under the **Evaluation** tab in AI Studio.
### Prompt-based evaluators
@@ -328,7 +328,7 @@ result = evaluate(
"ground_truth": "${data.truth}"
}
},
- # Optionally provide your AI Studio project information to track your evaluation results in your Azure AI studio project
+ # Optionally provide your AI Studio project information to track your evaluation results in your Azure AI Studio project
azure_ai_project=azure_ai_project,
# Optionally provide an output path to dump a json of metric summary, row level data and metric and studio URL
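For context around the `evaluate(...)` fragment above, the project information is typically passed as a small dictionary. A hedged sketch follows; the key names (`subscription_id`, `resource_group_name`, `project_name`) are an assumption about the evaluation SDK's expected shape, and the values are placeholders:

```python
# Hypothetical placeholder values; the key names are an assumption about
# the SDK's expected shape, not taken from this document.
azure_ai_project = {
    "subscription_id": "<your-subscription-id>",
    "resource_group_name": "<your-resource-group>",
    "project_name": "<your-ai-studio-project>",
}

def validate_project_config(config):
    """Return a sorted list of expected keys missing from the config."""
    required = {"subscription_id", "resource_group_name", "project_name"}
    return sorted(required - set(config))

print(validate_project_config(azure_ai_project))  # -> []
```

A quick validation step like this catches a misspelled key before a long evaluation run is submitted.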
articles/ai-studio/how-to/develop/index-build-consume-sdk.md (4 additions, 4 deletions)
@@ -28,7 +28,7 @@ You must have:
- An [Azure AI Search service connection](../../how-to/connections-add.md#create-a-new-connection) to index the sample product and customer data. If you don't have an Azure AI Search service, you can create one from the [Azure portal](https://portal.azure.com/) or see the instructions [here](../../../search/search-create-service-portal.md).
- Models for embedding:
- You can use an ada-002 embedding model from Azure OpenAI. The instructions to deploy can be found [here](../deploy-models-openai.md).
- - OR you can use any other embedding model deployed in your AI studio project. In this example we use a Cohere multilingual embedding model. The instructions to deploy this model can be found [here](../deploy-models-cohere-embed.md).
+ - OR you can use any other embedding model deployed in your AI Studio project. In this example we use a Cohere multilingual embedding model. The instructions to deploy this model can be found [here](../deploy-models-cohere-embed.md).
## Build and consume an index locally
@@ -88,9 +88,9 @@ local_index_aoai=build_index(
The above code builds an index locally. It uses environment variables to get the AI Search service and also to connect to the Azure OpenAI embedding model.
- ### Build an index locally using other embedding models deployed in your AI studio project
+ ### Build an index locally using other embedding models deployed in your AI Studio project
- To create an index that uses an embedding model deployed in your AI studio project, we configure the connection to the model using a `ConnectionConfig` as shown below. The `subscription`, `resource_group`, and `workspace` refer to the project where the embedding model is installed. The `connection_name` refers to the connection name for the model, which can be found on the AI Studio project settings page.
+ To create an index that uses an embedding model deployed in your AI Studio project, we configure the connection to the model using a `ConnectionConfig` as shown below. The `subscription`, `resource_group`, and `workspace` refer to the project where the embedding model is installed. The `connection_name` refers to the connection name for the model, which can be found on the AI Studio project settings page.
```python
from promptflow.rag.config import ConnectionConfig
```
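Continuing from that import, a minimal sketch of the connection setup the paragraph describes might look like this. Treat it as a config fragment: the parameter names follow the prose (`subscription`, `resource_group`, `workspace`, `connection_name`), but the exact `ConnectionConfig` signature is an assumption, and all values are hypothetical placeholders.

```python
from promptflow.rag.config import ConnectionConfig

# Hypothetical values; point these at the project where the embedding
# model is deployed, and take the connection name from the AI Studio
# project settings page.
embedding_connection = ConnectionConfig(
    subscription="<your-subscription-id>",
    resource_group="<your-resource-group>",
    workspace="<your-ai-studio-project>",
    connection_name="<your-embedding-model-connection>",
)
```

This config object would then be passed to `build_index` in place of the environment-variable-based Azure OpenAI setup shown earlier.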
articles/ai-studio/how-to/disaster-recovery.md (1 addition, 1 deletion)
@@ -110,7 +110,7 @@ For more information, see [Availability zone service and regional support](/azur
Determine the level of business continuity that you're aiming for. The level might differ between the components of your solution. For example, you might want to have a hot/hot configuration for production pipelines or model deployments, and hot/cold for development.
- Azure AI studio is a regional service that stores data both service-side and in a storage account in your subscription. If a regional disaster occurs, service-side data can't be recovered. But you can recover the data that the service stores in the storage account in your subscription, provided storage redundancy is enforced. Service-side data is mostly metadata (tags, asset names, descriptions). Data stored in your storage account is typically non-metadata, for example, uploaded data.
+ Azure AI Studio is a regional service that stores data both service-side and in a storage account in your subscription. If a regional disaster occurs, service-side data can't be recovered. But you can recover the data that the service stores in the storage account in your subscription, provided storage redundancy is enforced. Service-side data is mostly metadata (tags, asset names, descriptions). Data stored in your storage account is typically non-metadata, for example, uploaded data.
For connections, we recommend creating two separate resources in two distinct regions and then creating two connections for the hub. For example, if AI Services is a critical resource for business continuity, creating two AI Services resources and two connections for the hub would be a good strategy. With this configuration, if one region goes down, the other remains operational.
articles/ai-studio/how-to/fine-tune-phi-3.md (10 additions, 10 deletions)
@@ -41,7 +41,7 @@ The model underwent a rigorous enhancement process, incorporating both supervise
## [Phi-3-mini](#tab/phi-3-mini)
- The following models are available in Azure AI studio for Phi-3 when fine-tuning as a service with pay-as-you-go:
+ The following models are available in Azure AI Studio for Phi-3 when fine-tuning as a service with pay-as-you-go:
- `Phi-3-mini-4k-instruct` (preview)
- `Phi-3-mini-128k-instruct` (preview)
@@ -50,7 +50,7 @@ Fine-tuning of Phi-3 models is currently supported in projects located in East U
## [Phi-3-medium](#tab/phi-3-medium)
- The following models are available in Azure AI studio for Phi-3 when fine-tuning as a service with pay-as-you-go:
+ The following models are available in Azure AI Studio for Phi-3 when fine-tuning as a service with pay-as-you-go:
- `Phi-3-medium-4k-instruct` (preview)
- `Phi-3-medium-128k-instruct` (preview)
@@ -117,13 +117,13 @@ To fine-tune a Phi-3 model:
1. On the model's **Details** page, select **fine-tune**.
1. Select the project in which you want to fine-tune your models. To use the pay-as-you-go model fine-tune offering, your workspace must belong to the **East US 2** region.
- 1. On the fine-tune wizard, select the link to **Azure AI studio Terms** to learn more about the terms of use. You can also select the **Azure AI studio offer details** tab to learn about pricing for the selected model.
- 1. If this is your first time fine-tuning the model in the project, you have to subscribe your project for the particular offering (for example, Phi-3-mini-128k-instruct) from Azure AI studio. This step requires that your account has the Azure subscription permissions and resource group permissions listed in the prerequisites. Each project has its own subscription to the particular Azure AI studio offering, which allows you to control and monitor spending. Select **Subscribe and fine-tune**.
+ 1. On the fine-tune wizard, select the link to **Azure AI Studio Terms** to learn more about the terms of use. You can also select the **Azure AI Studio offer details** tab to learn about pricing for the selected model.
+ 1. If this is your first time fine-tuning the model in the project, you have to subscribe your project for the particular offering (for example, Phi-3-mini-128k-instruct) from Azure AI Studio. This step requires that your account has the Azure subscription permissions and resource group permissions listed in the prerequisites. Each project has its own subscription to the particular Azure AI Studio offering, which allows you to control and monitor spending. Select **Subscribe and fine-tune**.
> [!NOTE]
- > Subscribing a project to a particular Azure AI studio offering (in this case, Phi-3-mini-128k-instruct) requires that your account has **Contributor** or **Owner** access at the subscription level where the project is created. Alternatively, your user account can be assigned a custom role that has the Azure subscription permissions and resource group permissions listed in the [prerequisites](#prerequisites).
+ > Subscribing a project to a particular Azure AI Studio offering (in this case, Phi-3-mini-128k-instruct) requires that your account has **Contributor** or **Owner** access at the subscription level where the project is created. Alternatively, your user account can be assigned a custom role that has the Azure subscription permissions and resource group permissions listed in the [prerequisites](#prerequisites).
- 1. Once you sign up the project for the particular Azure AI studio offering, subsequent fine-tuning of the _same_ offering in the _same_ project doesn't require subscribing again. Therefore, you don't need subscription-level permissions for subsequent fine-tune jobs. If this scenario applies to you, select **Continue to fine-tune**.
+ 1. Once you sign up the project for the particular Azure AI Studio offering, subsequent fine-tuning of the _same_ offering in the _same_ project doesn't require subscribing again. Therefore, you don't need subscription-level permissions for subsequent fine-tune jobs. If this scenario applies to you, select **Continue to fine-tune**.
1. Enter a name for your fine-tuned model and the optional tags and description.
1. Select training data to fine-tune your model. See [data preparation](#data-preparation) for more information.
@@ -154,13 +154,13 @@ To fine-tune a Phi-3 model:
1. On the model's **Details** page, select **fine-tune**.
1. Select the project in which you want to fine-tune your models. To use the pay-as-you-go model fine-tune offering, your workspace must belong to the **East US 2** region.
- 1. On the fine-tune wizard, select the link to **Azure AI studio Terms** to learn more about the terms of use. You can also select the **Azure AI studio offer details** tab to learn about pricing for the selected model.
- 1. If this is your first time fine-tuning the model in the project, you have to subscribe your project for the particular offering (for example, Phi-3-medium-128k-instruct) from Azure AI studio. This step requires that your account has the Azure subscription permissions and resource group permissions listed in the prerequisites. Each project has its own subscription to the particular Azure AI studio offering, which allows you to control and monitor spending. Select **Subscribe and fine-tune**.
+ 1. On the fine-tune wizard, select the link to **Azure AI Studio Terms** to learn more about the terms of use. You can also select the **Azure AI Studio offer details** tab to learn about pricing for the selected model.
+ 1. If this is your first time fine-tuning the model in the project, you have to subscribe your project for the particular offering (for example, Phi-3-medium-128k-instruct) from Azure AI Studio. This step requires that your account has the Azure subscription permissions and resource group permissions listed in the prerequisites. Each project has its own subscription to the particular Azure AI Studio offering, which allows you to control and monitor spending. Select **Subscribe and fine-tune**.
> [!NOTE]
- > Subscribing a project to a particular Azure AI studio offering (in this case, Phi-3-mini-128k-instruct) requires that your account has **Contributor** or **Owner** access at the subscription level where the project is created. Alternatively, your user account can be assigned a custom role that has the Azure subscription permissions and resource group permissions listed in the [prerequisites](#prerequisites).
+ > Subscribing a project to a particular Azure AI Studio offering (in this case, Phi-3-mini-128k-instruct) requires that your account has **Contributor** or **Owner** access at the subscription level where the project is created. Alternatively, your user account can be assigned a custom role that has the Azure subscription permissions and resource group permissions listed in the [prerequisites](#prerequisites).
- 1. Once you sign up the project for the particular Azure AI studio offering, subsequent fine-tuning of the _same_ offering in the _same_ project doesn't require subscribing again. Therefore, you don't need subscription-level permissions for subsequent fine-tune jobs. If this scenario applies to you, select **Continue to fine-tune**.
+ 1. Once you sign up the project for the particular Azure AI Studio offering, subsequent fine-tuning of the _same_ offering in the _same_ project doesn't require subscribing again. Therefore, you don't need subscription-level permissions for subsequent fine-tune jobs. If this scenario applies to you, select **Continue to fine-tune**.
1. Enter a name for your fine-tuned model and the optional tags and description.
1. Select training data to fine-tune your model. See [data preparation](#data-preparation) for more information.
articles/ai-studio/how-to/model-catalog-overview.md (1 addition, 1 deletion)
@@ -165,7 +165,7 @@ To set the PNA flag for the AI Studio hub:
* If you have an AI Studio hub with a private endpoint created before July 11, 2024, new MaaS endpoints added to projects in this hub won't follow the networking configuration of the hub. Instead, you need to create a new private endpoint for the hub and create new serverless API deployments in the project so that the new deployments can follow the hub's networking configuration.
- * If you have an AI studio hub with MaaS deployments created before July 11, 2024, and you enable a private endpoint on this hub, the existing MaaS deployments won't follow the hub's networking configuration. For serverless API deployments in the hub to follow the hub's networking configuration, you need to create the deployments again.
+ * If you have an AI Studio hub with MaaS deployments created before July 11, 2024, and you enable a private endpoint on this hub, the existing MaaS deployments won't follow the hub's networking configuration. For serverless API deployments in the hub to follow the hub's networking configuration, you need to create the deployments again.
* Currently, [Azure OpenAI On Your Data](/azure/ai-services/openai/concepts/use-your-data) support isn't available for MaaS deployments in private hubs, because private hubs have the PNA flag disabled.