articles/ai-studio/concepts/ai-resources.md (+7 −7)
@@ -32,22 +32,22 @@ The Azure AI hub resource provides the collaboration environment for a team to b
## Central setup and management concepts
-Various management concepts are available on Azure AI hub resources to support team leads and admins to centrally manage a team's environment. In [Azure AI studio](https://ai.azure.com/), you find these on the **Manage** page.
+Various management concepts are available on Azure AI hub resources to support team leads and admins to centrally manage a team's environment. In [Azure AI Studio](https://ai.azure.com/), you find these on the **Manage** page.
* **Security configuration** including public network access, [virtual networking](#virtual-networking), customer-managed key encryption, and privileged access to whom can create projects for customization. Security settings configured on the Azure AI hub resource automatically pass down to each project. A managed virtual network is shared between all projects that share the same Azure AI hub resource.
* **Connections** are named and authenticated references to Azure and non-Azure resources like data storage providers. Use a connection as a means for making an external resource available to a group of developers without having to expose its stored credential to an individual.
-* **Compute and quota allocation** is managed as shared capacity for all projects in AI studio that share the same Azure AI hub resource. This includes compute instance as managed cloud-based workstation for an individual. Compute instance can be used across projects by the same user.
+* **Compute and quota allocation** is managed as shared capacity for all projects in AI Studio that share the same Azure AI hub resource. This includes compute instance as managed cloud-based workstation for an individual. Compute instance can be used across projects by the same user.
* **AI services access keys** to endpoints for prebuilt AI models are managed on the Azure AI hub resource scope. Use these endpoints to access foundation models from Azure OpenAI, Speech, Vision, and Content Safety with one [API key](#azure-ai-services-api-access-keys)
* **Policy** enforced in Azure on the Azure AI hub resource scope applies to all projects managed under it.
-* **Dependent Azure resources** are set up once per Azure AI hub resource and associated projects and used to store artifacts you generate while working in AI studio such as logs or when uploading data. See [Azure AI dependencies](#azure-ai-dependencies) for more details.
+* **Dependent Azure resources** are set up once per Azure AI hub resource and associated projects and used to store artifacts you generate while working in AI Studio such as logs or when uploading data. See [Azure AI dependencies](#azure-ai-dependencies) for more details.

## Organize work in projects for customization

-An Azure AI hub resource provides the hosting environment for **projects** in AI studio. A project is an organizational container that has tools for AI customization and orchestration, lets you organize your work, save state across different tools like prompt flow, and collaborate with others. For example, you can share uploaded files and connections to data sources.
+An Azure AI hub resource provides the hosting environment for **projects** in AI Studio. A project is an organizational container that has tools for AI customization and orchestration, lets you organize your work, save state across different tools like prompt flow, and collaborate with others. For example, you can share uploaded files and connections to data sources.

Multiple projects can use an Azure AI hub resource, and a project can be used by multiple users. A project also helps you keep track of billing, and manage access and provides data isolation. Every project has dedicated storage containers to let you upload files and share it with only other project members when using the 'data' experiences.

-Projects let you create and group reusable components that can be used across tools in AI studio:
+Projects let you create and group reusable components that can be used across tools in AI Studio:

| Asset | Description |
| --- | --- |
@@ -108,7 +108,7 @@ Connections can be set up as shared with all projects in the same Azure AI hub r
## Azure AI dependencies

-Azure AI studio layers on top of existing Azure services including Azure AI and Azure Machine Learning services. While this might not be visible on the display names in Azure portal, AI studio, or when using the SDK or CLI, some of these architectural details become apparent when you work with the Azure REST APIs, use Azure cost reporting, or use infrastructure-as-code templates such as Azure Bicep or Azure Resource Manager. From an Azure Resource Provider perspective, Azure AI studio resource types map to the following resource provider kinds:
+Azure AI Studio layers on top of existing Azure services including Azure AI and Azure Machine Learning services. While this might not be visible on the display names in Azure portal, AI Studio, or when using the SDK or CLI, some of these architectural details become apparent when you work with the Azure REST APIs, use Azure cost reporting, or use infrastructure-as-code templates such as Azure Bicep or Azure Resource Manager. From an Azure Resource Provider perspective, Azure AI Studio resource types map to the following resource provider kinds:

|Resource type|Resource provider|Kind|
|---|---|---|
@@ -117,7 +117,7 @@ Azure AI studio layers on top of existing Azure services including Azure AI and
|Azure AI services|Microsoft.CognitiveServices/account|AIServices|
|Azure AI OpenAI Service|Microsoft.CognitiveServices/account|OpenAI|

-When you create a new Azure AI hub resource, a set of dependent Azure resources are required to store data that you upload or get generated when working in AI studio. If not provided by you, these resources are automatically created.
+When you create a new Azure AI hub resource, a set of dependent Azure resources are required to store data that you upload or get generated when working in AI Studio. If not provided by you, these resources are automatically created.
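One way to see the resource-provider mapping from the table above outside the portal is to query Azure Resource Manager directly. The following is a minimal sketch, not part of the original article; it assumes the `azure-identity` and `azure-mgmt-resource` packages are installed and uses a placeholder subscription ID.

```python
# Minimal sketch (assumed packages: azure-identity, azure-mgmt-resource).
# Lists Cognitive Services accounts, whose "kind" distinguishes the
# AIServices and OpenAI resources shown in the table above.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<subscription-id>"  # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

for res in client.resources.list(
    filter="resourceType eq 'Microsoft.CognitiveServices/accounts'"
):
    print(res.name, res.kind, res.location)
```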
articles/ai-studio/concepts/deployments-overview.md (+1 −1)
@@ -32,7 +32,7 @@ Azure AI Studio simplifies deployments. A simple select or a line of code deploy
### Azure OpenAI models

-Azure OpenAI allows you to get access to the latest OpenAI models with the enterprise features from Azure. Learn more about [how to deploy OpenAI models in AI studio](../how-to/deploy-models-openai.md).
+Azure OpenAI allows you to get access to the latest OpenAI models with the enterprise features from Azure. Learn more about [how to deploy OpenAI models in AI Studio](../how-to/deploy-models-openai.md).
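For context on the deployment how-to linked in this hunk, a deployed Azure OpenAI model is typically called with the `openai` package's Azure client. This is an illustrative sketch only; the endpoint, key, API version, and deployment name are placeholders rather than values from the article.

```python
# Minimal sketch of calling an Azure OpenAI deployment; endpoint, key,
# API version, and deployment name below are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # the deployment created in AI Studio
    messages=[{"role": "user", "content": "Summarize what an Azure AI hub is."}],
)
print(response.choices[0].message.content)
```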
articles/ai-studio/concepts/evaluation-improvement-strategies.md (+1 −1)
@@ -22,7 +22,7 @@ Mitigating harms presented by large language models (LLMs) such as the Azure Ope
:::image type="content" source="../media/evaluations/mitigation-layers.png" alt-text="Diagram of strategy to mitigate potential harms of generative AI applications." lightbox="../media/evaluations/mitigation-layers.png":::

## Model layer
-At the model level, it's important to understand the models you use and what fine-tuning steps might have been taken by the model developers to align the model towards its intended uses and to reduce the risk of potentially harmful uses and outcomes. Azure AI studio's model catalog enables you to explore models from Azure OpenAI Service, Meta, etc., organized by collection and task. In the [model catalog](../how-to/model-catalog.md), you can explore model cards to understand model capabilities and limitations, experiment with sample inferences, and assess model performance. You can further compare multiple models side-by-side through benchmarks to select the best one for your use case. Then, you can enhance model performance by fine-tuning with your training data.
+At the model level, it's important to understand the models you use and what fine-tuning steps might have been taken by the model developers to align the model towards its intended uses and to reduce the risk of potentially harmful uses and outcomes. Azure AI Studio's model catalog enables you to explore models from Azure OpenAI Service, Meta, etc., organized by collection and task. In the [model catalog](../how-to/model-catalog.md), you can explore model cards to understand model capabilities and limitations, experiment with sample inferences, and assess model performance. You can further compare multiple models side-by-side through benchmarks to select the best one for your use case. Then, you can enhance model performance by fine-tuning with your training data.

## Safety systems layer
For most applications, it’s not enough to rely on the safety fine-tuning built into the model itself. LLMs can make mistakes and are susceptible to attacks like jailbreaks. In many applications at Microsoft, we use another AI-based safety system, [Azure AI Content Safety](https://azure.microsoft.com/products/ai-services/ai-content-safety/), to provide an independent layer of protection, helping you to block the output of harmful content.
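To make the safety-systems layer concrete, the sketch below shows one way to screen model output with the `azure-ai-contentsafety` package before returning it to a user. It's a hypothetical example; the endpoint, key, and severity threshold are placeholders, not guidance from this article.

```python
# Hypothetical sketch: screen a model's reply with Azure AI Content Safety
# before showing it to a user. Endpoint, key, and threshold are placeholders.
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint="https://<your-content-safety>.cognitiveservices.azure.com",
    credential=AzureKeyCredential("<your-key>"),
)

model_reply = "..."  # text produced by the LLM
result = client.analyze_text(AnalyzeTextOptions(text=model_reply))

# Each category (hate, sexual, violence, self-harm) returns a severity score.
if any(c.severity and c.severity >= 4 for c in result.categories_analysis):
    print("Blocked by the safety system layer.")
else:
    print(model_reply)
```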
articles/ai-studio/how-to/deploy-models-llama.md (+1 −1)
@@ -349,7 +349,7 @@ The following is an example response:
## Deploy Llama 2 models to real-time endpoints

-Llama 2 models can be deployed to real-time endpoints in AI studio. When deployed to real-time endpoints, you can select all the details about on the infrastructure running the model including the virtual machines used to run it and the number of instances to handle the load you're expecting. Models deployed in this modality consume quota from your subscription. All the models in the Llama family can be deployed to real-time endpoints.
+Llama 2 models can be deployed to real-time endpoints in AI Studio. When deployed to real-time endpoints, you can select all the details about on the infrastructure running the model including the virtual machines used to run it and the number of instances to handle the load you're expecting. Models deployed in this modality consume quota from your subscription. All the models in the Llama family can be deployed to real-time endpoints.
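For readers who prefer code to the studio UI, a real-time (managed online) deployment along the lines described here can be expressed with the `azure-ai-ml` SDK. The sketch below is an assumption-laden outline: the model asset ID, VM size, and all names are placeholders, not values from this article.

```python
# Hypothetical sketch of a real-time endpoint deployment with azure-ai-ml.
# Names, the model asset ID, and the VM size below are placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineEndpoint, ManagedOnlineDeployment
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<project-name>",
)

endpoint = ManagedOnlineEndpoint(name="llama-2-endpoint", auth_mode="key")
ml_client.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name=endpoint.name,
    model="azureml://registries/<registry>/models/<llama-model>/versions/<version>",
    instance_type="Standard_NC24s_v3",  # the virtual machine size you select
    instance_count=1,                   # the number of instances for expected load
)
ml_client.begin_create_or_update(deployment).result()
```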
articles/ai-studio/how-to/models-foundation-azure-ai.md (+1 −1)
@@ -86,7 +86,7 @@ You can conveniently access these links from a menu at the top-right corner of A
Prompt engineering is an important aspect of working with generative AI models as it allows users to have greater control, customization, and influence over the outputs. By skillfully designing prompts, users can harness the capabilities of generative AI models to generate desired content, address specific requirements, and cater to various application domains.

-The prompt samples are designed to assist AI studio users in finding and utilizing prompts for common use-cases and quickly get started. Users can explore the catalog, view available prompts, and easily open them in a playground for further customization and fine-tuning.
+The prompt samples are designed to assist AI Studio users in finding and utilizing prompts for common use-cases and quickly get started. Users can explore the catalog, view available prompts, and easily open them in a playground for further customization and fine-tuning.

> [!NOTE]
> These prompts serve as starting points to help users get started and we recommend users to tune and evaluate before using in production.
articles/ai-studio/how-to/prompt-flow-tools/python-tool.md (+1 −1)
@@ -118,7 +118,7 @@ Create a custom connection that stores all your LLM API KEY or other required cr
- azureml.flow.connection_type: Custom
- azureml.flow.module: promptflow.connections

-:::image type="content"source="./media/python-tool/custom-connection-meta.png" alt-text="Screenshot that shows add extra meta to custom connection in AI studio."lightbox="./media/python-tool/custom-connection-meta.png":::
+:::image type="content"source="./media/python-tool/custom-connection-meta.png" alt-text="Screenshot that shows add extra meta to custom connection in AI Studio."lightbox="./media/python-tool/custom-connection-meta.png":::
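As a quick illustration of how a custom connection like this is consumed, a prompt flow Python tool can accept it as a typed parameter and read its secrets. This is a hypothetical sketch; the secret key name `api_key` and the tool logic are assumptions, not part of the article.

```python
# Hypothetical sketch of a prompt flow Python tool that reads credentials
# from a custom connection like the one described above.
from promptflow import tool
from promptflow.connections import CustomConnection

@tool
def my_python_tool(question: str, connection: CustomConnection) -> str:
    # Secrets stored on the custom connection are available on the connection object.
    api_key = connection.secrets["api_key"]
    # ... call your LLM or other external service with api_key here ...
    return f"Received question of length {len(question)}"
```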
articles/ai-studio/how-to/prompt-flow-tools/serp-api-tool.md (+1 −1)
@@ -34,7 +34,7 @@ Create a Serp connection:
- azureml.flow.module: promptflow.connections
- api_key: Your_Serp_API_key, please mark it as a secret.

-:::image type="content" source="./media/serp-api-tool/serp-connection-meta.png" alt-text="Screenshot that shows add extra meta to custom connection in AI studio." lightbox = "./media/serp-api-tool/serp-connection-meta.png":::
+:::image type="content" source="./media/serp-api-tool/serp-connection-meta.png" alt-text="Screenshot that shows add extra meta to custom connection in AI Studio." lightbox = "./media/serp-api-tool/serp-connection-meta.png":::

The connection is the model used to establish connections with Serp API. Get your API key from the SerpAPI account dashboard.