
Commit 5ef39c2

Merge pull request #284676 from eric-urban/eur/model-edits
terminology and casing

Parents: 269fe00 + 00f9175

13 files changed: +41 / -41 lines

articles/ai-studio/concepts/ai-resources.md

Lines changed: 2 additions & 2 deletions
@@ -16,7 +16,7 @@ author: Blackmist
 
 # Manage, collaborate, and organize with hubs
 
-Hubs are the primary top-level Azure resource for AI studio and provide a central way for a team to govern security, connectivity, and computing resources across playgrounds and projects. Once a hub is created, developers can create projects from it and access shared company resources without needing an IT administrator's repeated help.
+Hubs are the primary top-level Azure resource for AI Studio and provide a central way for a team to govern security, connectivity, and computing resources across playgrounds and projects. Once a hub is created, developers can create projects from it and access shared company resources without needing an IT administrator's repeated help.
 
 Project workspaces that are created using a hub inherit the same security settings and shared resource access. Teams can create project workspaces as needed to organize their work, isolate data, and/or restrict access.
 
@@ -99,7 +99,7 @@ Azure AI Studio layers on top of existing Azure services including Azure AI and
 
 [!INCLUDE [Resource provider kinds](../includes/resource-provider-kinds.md)]
 
-When you create a new hub, a set of dependent Azure resources are required to store data that you upload or get generated when working in AI studio. If not provided by you, and required, these resources are automatically created.
+When you create a new hub, a set of dependent Azure resources are required to store data that you upload or get generated when working in AI Studio. If not provided by you, and required, these resources are automatically created.
 
 [!INCLUDE [Dependent Azure resources](../includes/dependent-resources.md)]

articles/ai-studio/concepts/architecture.md

Lines changed: 2 additions & 2 deletions
@@ -19,13 +19,13 @@ AI Studio provides a unified experience for AI developers and data scientists to
 
 The top level AI Studio resources (hub and project) are based on Azure Machine Learning. Connected resources, such as Azure OpenAI, Azure AI services, and Azure AI Search, are used by the hub and project in reference, but follow their own resource management lifecycle.
 
-- **AI hub**: The hub is the top-level resource in AI Studio. The Azure resource provider for a hub is `Microsoft.MachineLearningServices/workspaces`, and the kind of resource is `Hub`. It provides the following features:
+- **AI Studio hub**: The hub is the top-level resource in AI Studio. The Azure resource provider for a hub is `Microsoft.MachineLearningServices/workspaces`, and the kind of resource is `Hub`. It provides the following features:
     - Security configuration including a managed network that spans projects and model endpoints.
     - Compute resources for interactive development, finetuning, open source, and serverless model deployments.
     - Connections to other Azure services such as Azure OpenAI, Azure AI services, and Azure AI Search. Hub-scoped connections are shared with projects created from the hub.
     - Project management. A hub can have multiple child projects.
    - An associated Azure storage account for data upload and artifact storage.
-- **AI project**: A project is a child resource of the hub. The Azure resource provider for a project is `Microsoft.MachineLearningServices/workspaces`, and the kind of resource is `Project`. The project provides the following features:
+- **AI Studio project**: A project is a child resource of the hub. The Azure resource provider for a project is `Microsoft.MachineLearningServices/workspaces`, and the kind of resource is `Project`. The project provides the following features:
     - Access to development tools for building and customizing AI applications.
     - Reusable components including datasets, models, and indexes.
     - An isolated container to upload data to (within the storage inherited from the hub).
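
For illustration, a minimal sketch of creating a hub and a child project with the azure-ai-ml Python SDK, assuming a recent SDK version that exposes the `Hub` and `Project` entities (entity and parameter names vary across versions); resource names and the region are placeholders:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Hub, Project
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
)

# Create the hub: kind `Hub` under the Microsoft.MachineLearningServices/workspaces provider.
hub = ml_client.workspaces.begin_create(Hub(name="my-hub", location="eastus")).result()

# Create a project as a child of the hub: kind `Project`. It inherits the hub's
# security settings and hub-scoped connections.
project = ml_client.workspaces.begin_create(Project(name="my-project", hub_id=hub.id)).result()
print(project.id)
```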

articles/ai-studio/concepts/safety-evaluations-transparency-note.md

Lines changed: 2 additions & 2 deletions
@@ -51,9 +51,9 @@ Azure AI Studio provisions an Azure OpenAI GPT-4 model and orchestrates adversar
 
 The safety evaluations aren't intended to use for any purpose other than to evaluate content risks and jailbreak vulnerabilities of your generative AI application:
 
-- **Evaluating your generative AI application pre-deployment**: Using the evaluation wizard in the Azure AI studio or the Azure AI Python SDK, safety evaluations can assess in an automated way to evaluate potential content or security risks.
+- **Evaluating your generative AI application pre-deployment**: Using the evaluation wizard in the Azure AI Studio or the Azure AI Python SDK, safety evaluations can assess in an automated way to evaluate potential content or security risks.
 - **Augmenting your red-teaming operations**: Using the adversarial simulator, safety evaluations can simulate adversarial interactions with your generative AI application to attempt to uncover content and security risks.
-- **Communicating content and security risks to stakeholders**: Using the Azure AI studio, you can share access to your Azure AI Studio project with safety evaluations results with auditors or compliance stakeholders.
+- **Communicating content and security risks to stakeholders**: Using the Azure AI Studio, you can share access to your Azure AI Studio project with safety evaluations results with auditors or compliance stakeholders.
 
 #### Considerations when choosing a use case
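
As a sketch of the pre-deployment scenario above, here is one way a safety evaluator might be invoked with the promptflow-evals package; constructor and argument names (`project_scope`, `question`, `answer`) vary across SDK versions, and the project values are placeholders:

```python
from azure.identity import DefaultAzureCredential
from promptflow.evals.evaluators import ViolenceEvaluator

# Azure AI Studio project that backs the safety evaluation service (placeholders).
project_scope = {
    "subscription_id": "<subscription-id>",
    "resource_group_name": "<resource-group>",
    "project_name": "<ai-studio-project-name>",
}

violence_eval = ViolenceEvaluator(project_scope, credential=DefaultAzureCredential())

# Score a single question/answer pair for violence-related content risk.
result = violence_eval(
    question="What is the capital of France?",
    answer="Paris is the capital of France.",
)
print(result)  # severity label and score for the violence content risk
```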

articles/ai-studio/how-to/create-hub-terraform.md

Lines changed: 3 additions & 3 deletions
@@ -1,6 +1,6 @@
 ---
 title: 'Use Terraform to create an Azure AI Studio hub'
-description: In this article, you create an Azure AI hub, an AI project, an AI services resource, and more resources.
+description: In this article, you create an Azure AI Studio hub, an Azure AI Studio project, an AI services resource, and more resources.
 ms.topic: how-to
 ms.date: 07/12/2024
 titleSuffix: Azure AI Studio
@@ -27,8 +27,8 @@ In this article, you use Terraform to create an Azure AI Studio hub, a project,
 > * Set up a storage account
 > * Establish a key vault
 > * Configure AI services
-> * Build an Azure AI hub
-> * Develop an AI project
+> * Build an AI Studio hub
+> * Develop an AI Studio project
 > * Establish an AI services connection
 
 ## Prerequisites

articles/ai-studio/how-to/create-projects.md

Lines changed: 2 additions & 2 deletions
@@ -63,7 +63,7 @@ Use the following tabs to select the method you plan to use to create a project:
 
 For more information on authenticating, see [Authentication methods](/cli/azure/authenticate-azure-cli).
 
-1. Once the extension is installed and authenticated to your Azure subscription, use the following command to create a new Azure AI project from an existing Azure AI hub:
+1. Once the extension is installed and authenticated to your Azure subscription, use the following command to create a new Azure AI Studio project from an existing Azure AI Studio hub:
 
     ```azurecli
     az ml workspace create --kind project --hub-id {my_hub_ARM_ID} --resource-group {my_resource_group} --name {my_project_name}
@@ -122,7 +122,7 @@ In addition, a number of resources are only accessible by users in your project
 | workspacefilestore | {project-GUID}-code | Hosts files created on your compute and using prompt flow |
 
 > [!NOTE]
-> Storage connections are not created directly with the project when your storage account has public network access set to disabled. These are created instead when a first user accesses AI studio over a private network connection. [Troubleshoot storage connections](troubleshoot-secure-connection-project.md#troubleshoot-missing-storage-connections)
+> Storage connections are not created directly with the project when your storage account has public network access set to disabled. These are created instead when a first user accesses AI Studio over a private network connection. [Troubleshoot storage connections](troubleshoot-secure-connection-project.md#troubleshoot-missing-storage-connections)
 
 
 ## Next steps
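
For reference, a hedged sketch of looking up the hub's ARM ID (the `{my_hub_ARM_ID}` value in the command above) with the azure-ai-ml Python SDK; resource names are placeholders:

```python
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
)

# Hubs are workspaces of kind "hub", so the regular workspace lookup returns them.
hub = ml_client.workspaces.get("<my-hub-name>")
print(hub.id)  # ARM ID to pass as --hub-id
```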

articles/ai-studio/how-to/deploy-models-serverless.md

Lines changed: 4 additions & 4 deletions
@@ -548,19 +548,19 @@ In this section, you create an endpoint with the name **meta-llama3-8b-qwerty**.
 
 ## Use the serverless API endpoint
 
-Models deployed in Azure Machine Learning and Azure AI studio in Serverless API endpoints support the [Azure AI Model Inference API](../reference/reference-model-inference-api.md) that exposes a common set of capabilities for foundational models and that can be used by developers to consume predictions from a diverse set of models in a uniform and consistent way.
+Models deployed in Azure Machine Learning and Azure AI Studio in Serverless API endpoints support the [Azure AI Model Inference API](../reference/reference-model-inference-api.md) that exposes a common set of capabilities for foundational models and that can be used by developers to consume predictions from a diverse set of models in a uniform and consistent way.
 
 Read more about the [capabilities of this API](../reference/reference-model-inference-api.md#capabilities) and how [you can use it when building applications](../reference/reference-model-inference-api.md#getting-started).
 
 ## Network isolation
 
 Endpoints for models deployed as Serverless APIs follow the public network access (PNA) flag setting of the AI Studio Hub that has the project in which the deployment exists. To secure your MaaS endpoint, disable the PNA flag on your AI Studio Hub. You can secure inbound communication from a client to your endpoint by using a private endpoint for the hub.
 
-To set the PNA flag for the Azure AI hub:
+To set the PNA flag for the Azure AI Studio hub:
 
 1. Go to the [Azure portal](https://portal.azure.com).
-2. Search for the Resource group to which the hub belongs, and select your Azure AI hub from the resources listed for this Resource group.
-3. On the hub Overview page, use the left navigation pane to go to Settings > Networking.
+2. Search for the Resource group to which the hub belongs, and select the **Azure AI hub** from the resources listed for this resource group.
+3. From the hub **Overview** page on the left menu, select **Settings** > **Networking**.
 4. Under the **Public access** tab, you can configure settings for the public network access flag.
 5. Save your changes. Your changes might take up to five minutes to propagate.
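
To illustrate consuming a serverless API endpoint through the Azure AI Model Inference API, here is a hedged sketch using the azure-ai-inference Python package; the endpoint URL and key are placeholders for the values shown on the deployment's details page:

```python
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-serverless-endpoint>.<region>.models.ai.azure.com",
    credential=AzureKeyCredential("<endpoint-key>"),
)

# Send a chat completion request to the serverless endpoint.
response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="List three benefits of serverless API deployments."),
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```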

articles/ai-studio/how-to/develop/flow-evaluate-sdk.md

Lines changed: 3 additions & 3 deletions
@@ -221,7 +221,7 @@ ml_client.evaluators.download("answer_len_uploaded", version=1, download_path=".
 evaluator = load_flow(os.path.join("answer_len_uploaded", flex_flow_path))
 ```
 
-After logging your custom evaluator to your AI project, you can view it in your [Evaluator library](../evaluate-generative-ai-app.md#view-and-manage-the-evaluators-in-the-evaluator-library) under Evaluation tab in AI studio.
+After logging your custom evaluator to your AI Studio project, you can view it in your [Evaluator library](../evaluate-generative-ai-app.md#view-and-manage-the-evaluators-in-the-evaluator-library) under Evaluation tab in AI Studio.
 
 ### Prompt-based evaluators
 
@@ -307,7 +307,7 @@ ml_client.evaluators.download("prompty_uploaded", version=1, download_path=".")
 evaluator = load_flow(os.path.join("prompty_uploaded", "apology.prompty"))
 ```
 
-After logging your custom evaluator to your AI project, you can view it in your [Evaluator library](../evaluate-generative-ai-app.md#view-and-manage-the-evaluators-in-the-evaluator-library) under Evaluation tab in AI studio.
+After logging your custom evaluator to your AI Studio project, you can view it in your [Evaluator library](../evaluate-generative-ai-app.md#view-and-manage-the-evaluators-in-the-evaluator-library) under **Evaluation** tab in AI Studio.
 
 ## Evaluate on test dataset using `evaluate()`
 
@@ -328,7 +328,7 @@ result = evaluate(
             "ground_truth": "${data.truth}"
         }
     },
-    # Optionally provide your AI Studio project information to track your evaluation results in your Azure AI studio project
+    # Optionally provide your AI Studio project information to track your evaluation results in your Azure AI Studio project
     azure_ai_project = azure_ai_project,
     # Optionally provide an output path to dump a json of metric summary, row level data and metric and studio URL
     output_path="./myevalresults.json"
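
Alongside the `evaluate()` call above, a minimal sketch of a custom code-based evaluator and how it might be passed in, assuming the promptflow-evals package; the class name, metric name, and data file are illustrative:

```python
from promptflow.evals.evaluate import evaluate


class AnswerLengthEvaluator:
    """Custom code-based evaluator that reports answer length as a metric."""

    def __call__(self, *, answer: str, **kwargs):
        return {"answer_length": len(answer)}


result = evaluate(
    data="data.jsonl",  # JSON Lines file with an "answer" column (assumed)
    evaluators={"answer_len": AnswerLengthEvaluator()},
    # Optionally pass azure_ai_project, as above, to log results to your AI Studio project.
)
```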

articles/ai-studio/how-to/develop/index-build-consume-sdk.md

Lines changed: 4 additions & 4 deletions
@@ -28,7 +28,7 @@ You must have:
 - An [Azure AI Search service connection](../../how-to/connections-add.md#create-a-new-connection) to index the sample product and customer data. If you don't have an Azure AI Search service, you can create one from the [Azure portal](https://portal.azure.com/) or see the instructions [here](../../../search/search-create-service-portal.md).
 - Models for embedding:
     - You can use an ada-002 embedding model from Azure OpenAI. The instructions to deploy can be found [here](../deploy-models-openai.md).
-    - OR you can use any another embedding model deployed in your AI studio project. In this example we use Cohere multi-lingual embedding. The instructions to deploy this model can be found [here](../deploy-models-cohere-embed.md).
+    - OR you can use any another embedding model deployed in your AI Studio project. In this example we use Cohere multi-lingual embedding. The instructions to deploy this model can be found [here](../deploy-models-cohere-embed.md).
 
 ## Build and consume an index locally
 
@@ -88,9 +88,9 @@ local_index_aoai=build_index(
 
 The above code builds an index locally. It uses environment variables to get the AI Search service and also to connect to the Azure OpenAI embedding model.
 
-### Build an index locally using other embedding models deployed in your AI studio project
+### Build an index locally using other embedding models deployed in your AI Studio project
 
-To create an index that uses an embedding model deployed in your AI studio project, we configure the connection to the model using a `ConnectionConfig` as shown below. The `subscription`, `resource_group` and `workspace` refers to the project where the embedding model is installed. The `connection_name` refers to the connection name for the model, which can be found in the AI Studio project settings page.
+To create an index that uses an embedding model deployed in your AI Studio project, we configure the connection to the model using a `ConnectionConfig` as shown below. The `subscription`, `resource_group` and `workspace` refers to the project where the embedding model is installed. The `connection_name` refers to the connection name for the model, which can be found in the AI Studio project settings page.
 
 ```python
 from promptflow.rag.config import ConnectionConfig
@@ -245,7 +245,7 @@ embeddings_model_config = IndexModelConfiguration.from_connection(
     deployment_name="text-embedding-ada-002")
 ```
 
-You can connect to embedding model deployed in your AI studio project (non Azure OpenAI models) using the serverless connection.
+You can connect to embedding model deployed in your AI Studio project (non Azure OpenAI models) using the serverless connection.
 
 ```python
 from azure.ai.ml.entities import IndexModelConfiguration
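
As a sketch of the `ConnectionConfig` described in this section, following the parameter names used in the surrounding prose (`subscription`, `resource_group`, `workspace`, `connection_name`); exact keyword names may differ by promptflow-rag version, and the values are placeholders:

```python
from promptflow.rag.config import ConnectionConfig

embedding_connection = ConnectionConfig(
    subscription="<subscription-id>",      # subscription of the AI Studio project
    resource_group="<resource-group>",     # resource group of the project
    workspace="<project-name>",            # project where the embedding model is deployed
    connection_name="<connection-name>",   # from the AI Studio project settings page
)
```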

articles/ai-studio/how-to/disaster-recovery.md

Lines changed: 1 addition & 1 deletion
@@ -110,7 +110,7 @@ For more information, see [Availability zone service and regional support](/azur
 
 Determine the level of business continuity that you're aiming for. The level might differ between the components of your solution. For example, you might want to have a hot/hot configuration for production pipelines or model deployments, and hot/cold for development.
 
-Azure AI studio is a regional service and stores data both service-side and on a storage account in your subscription. If a regional disaster occurs, service data can't be recovered. But you can recover the data stored by the service on the storage account in your subscription given storage redundancy is enforced. Service-side stored data is mostly metadata (tags, asset names, descriptions). Stored on your storage account is typically non-metadata, for example, uploaded data.
+Azure AI Studio is a regional service and stores data both service-side and on a storage account in your subscription. If a regional disaster occurs, service data can't be recovered. But you can recover the data stored by the service on the storage account in your subscription given storage redundancy is enforced. Service-side stored data is mostly metadata (tags, asset names, descriptions). Stored on your storage account is typically non-metadata, for example, uploaded data.
 
 For connections, we recommend creating two separate resources in two distinct regions and then create two connections for the hub. For example, if AI Services is a critical resource for business continuity, creating two AI Services resources and two connections for the hub, would be a good strategy for business continuity. With this configuration, if one region goes down there's still one region operational.

articles/ai-studio/how-to/fine-tune-model-llama.md

Lines changed: 5 additions & 5 deletions
@@ -56,15 +56,15 @@ Fine-tuning of Llama 2 models is currently supported in projects located in West
 
 
 An Azure subscription with a valid payment method. Free or trial Azure subscriptions won't work. If you don't have an Azure subscription, create a [paid Azure account](https://azure.microsoft.com/pricing/purchase-options/pay-as-you-go) to begin.
-- An [Azure AI hub resource](../how-to/create-azure-ai-resource.md).
+- An [Azure AI Studio hub](../how-to/create-azure-ai-resource.md).
 
 > [!IMPORTANT]
-> For Meta Llama 3.1 models, the pay-as-you-go model fine-tune offering is only available with AI hubs created in **West US 3** regions.
+> For Meta Llama 3.1 models, the pay-as-you-go model fine-tune offering is only available with hubs created in **West US 3** regions.
 
-- An [Azure AI project](../how-to/create-projects.md) in Azure AI Studio.
+- An [Azure AI Studio project](../how-to/create-projects.md) in Azure AI Studio.
 - Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure AI Studio. To perform the steps in this article, your user account must be assigned the __owner__ or __contributor__ role for the Azure subscription. Alternatively, your account can be assigned a custom role that has the following permissions:
 
-    - On the Azure subscription—to subscribe the Azure AI project to the Azure Marketplace offering, once for each project, per offering:
+    - On the Azure subscription—to subscribe the AI Studio project to the Azure Marketplace offering, once for each project, per offering:
         - `Microsoft.MarketplaceOrdering/agreements/offers/plans/read`
         - `Microsoft.MarketplaceOrdering/agreements/offers/plans/sign/action`
         - `Microsoft.MarketplaceOrdering/offerTypes/publishers/offers/plans/agreements/read`
@@ -75,7 +75,7 @@ Fine-tuning of Llama 2 models is currently supported in projects located in West
         - `Microsoft.SaaS/resources/read`
         - `Microsoft.SaaS/resources/write`
 
-    - On the Azure AI project—to deploy endpoints (the Azure AI Developer role contains these permissions already):
+    - On the AI Studio project—to deploy endpoints (the Azure AI Developer role contains these permissions already):
        - `Microsoft.MachineLearningServices/workspaces/marketplaceModelSubscriptions/*`
        - `Microsoft.MachineLearningServices/workspaces/serverlessEndpoints/*`
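
For reference, a hedged sketch of a custom role definition covering the permissions visible in this diff (the full article lists additional required actions not shown here); the role name and scope are placeholders, and the JSON shape is the one accepted by `az role definition create --role-definition`:

```python
import json

custom_role = {
    "Name": "AI Studio serverless fine-tuning operator",  # hypothetical role name
    "Description": "Subscribe AI Studio projects to Marketplace offers and deploy serverless endpoints.",
    "Actions": [
        "Microsoft.MarketplaceOrdering/agreements/offers/plans/read",
        "Microsoft.MarketplaceOrdering/agreements/offers/plans/sign/action",
        "Microsoft.MarketplaceOrdering/offerTypes/publishers/offers/plans/agreements/read",
        "Microsoft.SaaS/resources/read",
        "Microsoft.SaaS/resources/write",
        "Microsoft.MachineLearningServices/workspaces/marketplaceModelSubscriptions/*",
        "Microsoft.MachineLearningServices/workspaces/serverlessEndpoints/*",
    ],
    "AssignableScopes": ["/subscriptions/<subscription-id>"],
}

# Write the definition to a file that can be passed to the Azure CLI.
with open("fine-tune-role.json", "w") as f:
    json.dump(custom_role, f, indent=2)
```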
