
Commit a64c0a6

rebrand in machine-learning dir
1 parent 05e6c1f commit a64c0a6


56 files changed: 96 additions, 96 deletions


articles/machine-learning/breadcrumb/toc.yml

Lines changed: 1 addition & 1 deletion
@@ -107,7 +107,7 @@ items:
 tocHref: /security/benchmark/azure/
 topicHref: /security/benchmark/azure/index

-# AI Studio or Azure ML
+# AI Foundry or Azure ML
 - name: Azure
 tocHref: /ai/
 topicHref: /azure/index

articles/machine-learning/component-reference-v2/component-reference-v2.md

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ Azure Machine Learning designer components (Designer) allow users to create mach

 This reference content provides background on each of the custom components (v2) available in Azure Machine Learning designer.

-You can navigate to Custom components in Azure Machine Learning Studio as shown in the following image.
+You can navigate to Custom components in Azure Machine Learning studio as shown in the following image.

 :::image type="content" source="media/designer-new-pipeline.png" alt-text="Diagram showing the Designer UI for selecting a custom component.":::

articles/machine-learning/component-reference/score-vowpal-wabbit-model.md

Lines changed: 1 addition & 1 deletion
@@ -67,7 +67,7 @@ This section contains implementation details, tips, and answers to frequently as

 Vowpal Wabbit has many command-line options for choosing and tuning algorithms. A full discussion of these options is not possible here; we recommend that you view the [Vowpal Wabbit wiki page](https://github.com/JohnLangford/vowpal_wabbit/wiki/Command-line-arguments).

-The following parameters are not supported in Azure Machine Learning Studio (classic).
+The following parameters are not supported in Azure Machine Learning studio (classic).

 - The input/output options specified in [https://github.com/JohnLangford/vowpal_wabbit/wiki/Command-line-arguments](https://github.com/JohnLangford/vowpal_wabbit/wiki/Command-line-arguments)

articles/machine-learning/concept-hub-workspace.md

Lines changed: 8 additions & 8 deletions
@@ -33,11 +33,11 @@ In the transition from proving feasibility of an idea, to a funded project, many

 The goal of hubs is to take away this bottleneck, by letting IT set up a secure, preconfigured, and reusable environment for a team to prototype, build, and operate machine learning models.

-## Interoperability between ML studio and AI studio
+## Interoperability between ML studio and AI Foundry

-Hubs can be used as your team's collaboration environment for both ML studio and [AI studio](/azure/ai-studio/what-is-ai-studio). Use ML Studio for training and operationalizing custom machine learning models. Use AI studio as experience for building and operating AI applications responsibly.
+Hubs can be used as your team's collaboration environment for both ML studio and [AI Foundry](/azure/ai-studio/what-is-ai-studio). Use ML Studio for training and operationalizing custom machine learning models. Use AI Foundry as experience for building and operating AI applications responsibly.

-| Workspace Kind | ML Studio | AI Studio |
+| Workspace Kind | ML Studio | AI Foundry |
 | --- | --- | --- |
 | Default | Supported | - |
 | Hub | Supported | Supported |
@@ -54,7 +54,7 @@ Project workspaces that are created using a hub obtain the hub's security settin
 | Network settings | One [managed virtual network](how-to-managed-network.md) is shared between hub and project workspaces. To access content in the hub and project workspaces, create a single private link endpoint on the hub workspace. |
 | Encryption settings | Encryption settings pass down from hub to project. |
 | Storage for encrypted data | When you bring your customer-managed keys for encryption, hub and project workspaces share the same managed resource group for storing encrypted service data. |
-| Connections | Project workspaces can consume shared connections created on the hub. This feature is currently only supported in [AI studio]() |
+| Connections | Project workspaces can consume shared connections created on the hub. This feature is currently only supported in [AI Foundry]() |
 | Compute instance | Reuse a compute instance across all project workspaces associated to the same hub. |
 | Compute quota | Any compute quota consumed by project workspaces is deducted from the hub workspace quota balance. |
 | Storage | Associated resource for storing workspace data. Project workspaces use designated containers starting with a prefix {workspaceGUID}, and have a conditional [Azure Attribute Based Access](/azure/role-based-access-control/conditions-overview) role assignment for the workspace identity for accessing these containers only. |
@@ -69,7 +69,7 @@ Data that is uploaded in one project workspace, is stored in isolation from data
 Once a hub is created, there are multiple ways to create a project workspace using it:

 1. [Using ML Studio](how-to-manage-workspace.md?tabs=mlstudio)
-1. [Using AI Studio](/azure/ai-studio/how-to/create-projects)
+1. [Using AI Foundry](/azure/ai-studio/how-to/create-projects)
 2. [Using Azure SDK](how-to-manage-workspace.md?tabs=python)
 4. [Using automation templates](how-to-create-workspace-template.md)

@@ -93,11 +93,11 @@ Features that are supported using hub/project workspaces differ from regular wor
 | Feature | Default workspace | Hub workspace | Project workspace | Note |
 |--|--|--|--|--|
 |Self-serve create project workspaces from Studio| - | X | X | - |
-|Create shared connections on hub | |X|X| Only in AI studio |
+|Create shared connections on hub | |X|X| Only in AI Foundry portal |
 |Consume shared connections from hub | |X|X| - |
 |Reuse compute instance across workspaces|-|X|X| |
 |Share compute quota across workspaces|-|X|X||
-|Build GenAI apps in AI studio|-|X|X||
+|Build GenAI apps in AI Foundry portal|-|X|X||
 |Single private link endpoint across workspaces|-|X|X||
 |Managed virtual network|X|X|X|-|
 |BYO virtual network|X|-|-|Use alternative [managed virtual network](how-to-managed-network.md)|
@@ -115,6 +115,6 @@ To learn more about setting up Azure Machine Learning, see:
 + [Create and manage a workspace](how-to-manage-workspace.md)
 + [Get started with Azure Machine Learning](quickstart-create-resources.md)

-To learn more about hub workspace support in AI Studio, see:
+To learn more about hub workspace support in AI Foundry portal, see:

 + [How to configure a managed network for hubs](/azure/ai-studio/how-to/configure-managed-network)

articles/machine-learning/concept-model-catalog.md

Lines changed: 3 additions & 3 deletions
@@ -118,19 +118,19 @@ Models that are available for deployment as serverless APIs with pay-as-you-go b

 ### Pay for model usage in MaaS

-The discovery, subscription, and consumption experience for models deployed via MaaS is in the Azure AI Studio and Azure Machine Learning studio. Users accept license terms for use of the models, and pricing information for consumption is provided during deployment. Models from third party providers are billed through Azure Marketplace, in accordance with the [Commercial Marketplace Terms of Use](/legal/marketplace/marketplace-terms); models from Microsoft are billed using Azure meters as First Party Consumption Services. As described in the [Product Terms](https://www.microsoft.com/licensing/terms/welcome/welcomepage), First Party Consumption Services are purchased using Azure meters but aren't subject to Azure service terms; use of these models is subject to the license terms provided.
+The discovery, subscription, and consumption experience for models deployed via MaaS is in the Azure AI Foundry portal and Azure Machine Learning studio. Users accept license terms for use of the models, and pricing information for consumption is provided during deployment. Models from third party providers are billed through Azure Marketplace, in accordance with the [Commercial Marketplace Terms of Use](/legal/marketplace/marketplace-terms); models from Microsoft are billed using Azure meters as First Party Consumption Services. As described in the [Product Terms](https://www.microsoft.com/licensing/terms/welcome/welcomepage), First Party Consumption Services are purchased using Azure meters but aren't subject to Azure service terms; use of these models is subject to the license terms provided.

 ### Deploy models for inference through MaaS

 Deploying a model through MaaS allows users to get access to ready to use inference APIs without the need to configure infrastructure or provision GPUs, saving engineering time and resources. These APIs can be integrated with several LLM tools and usage is billed as described in the previous section.

 ### Fine-tune models through MaaS with Pay-as-you-go

-For models that are available through MaaS and support fine-tuning, users can take advantage of hosted fine-tuning with pay-as-you-go billing to tailor the models using data they provide. For more information, see [fine-tune a Llama 2 model](/azure/ai-studio/how-to/fine-tune-model-llama) in Azure AI Studio.
+For models that are available through MaaS and support fine-tuning, users can take advantage of hosted fine-tuning with pay-as-you-go billing to tailor the models using data they provide. For more information, see [fine-tune a Llama 2 model](/azure/ai-studio/how-to/fine-tune-model-llama) in Azure AI Foundry portal.

 ### RAG with models deployed through MaaS

-Azure AI Studio enables users to make use of Vector Indexes and Retrieval Augmented Generation. Models that can be deployed as serverless APIs can be used to generate embeddings and inferencing based on custom data to generate answers specific to their use case. For more information, see [Retrieval augmented generation and indexes](concept-retrieval-augmented-generation.md).
+Azure AI Foundry enables users to make use of Vector Indexes and Retrieval Augmented Generation. Models that can be deployed as serverless APIs can be used to generate embeddings and inferencing based on custom data to generate answers specific to their use case. For more information, see [Retrieval augmented generation and indexes](concept-retrieval-augmented-generation.md).

 ### Regional availability of offers and models

articles/machine-learning/how-to-create-compute-instance.md

Lines changed: 1 addition & 1 deletion
@@ -438,7 +438,7 @@ Assigned to user does not need compute write (create) permission to enable SSO.

 Here are the steps assigned to user needs to take. Please note creator of compute instance is not allowed to enable SSO on that compute instance due to security reasons.

-1. Click on compute in left navigation pane in Azure Machine Learning Studio.
+1. Click on compute in left navigation pane in Azure Machine Learning studio.
 1. Click on the name of compute instance where you need to enable SSO.
 1. Edit the Single sign-on details section.

articles/machine-learning/how-to-custom-dns.md

Lines changed: 2 additions & 2 deletions
@@ -25,7 +25,7 @@ When using an Azure Machine Learning workspace (including Azure AI hubs) with a
 - An Azure Virtual Network that uses [your own DNS server](/azure/virtual-network/virtual-networks-name-resolution-for-vms-and-role-instances#name-resolution-that-uses-your-own-dns-server).

 :::moniker range="azureml-api-2"
-- An Azure Machine Learning workspace with a private endpoint, including hub workspaces such as those used by Azure AI Studio. For more information, see [Create an Azure Machine Learning workspace](how-to-manage-workspace.md).
+- An Azure Machine Learning workspace with a private endpoint, including hub workspaces such as those used by Azure AI Foundry. For more information, see [Create an Azure Machine Learning workspace](how-to-manage-workspace.md).

 - If your workspace dependency resources are secured with an __Azure Virtual network__, familiarity with the [Network isolation during training & inference](./how-to-network-security-overview.md) article.
 :::moniker-end
@@ -57,7 +57,7 @@ Another option is to modify the `hosts` file on the client that is connecting to
 Access to a given Azure Machine Learning workspace via Private Link is done by communicating with the following Fully Qualified Domains (called the workspace FQDNs) listed below:

 > [!IMPORTANT]
-> If you are using a hub workspace (including Azure AI Studio hub), then you will have addtional entries for each project workspace created from the hub.
+> If you are using a hub workspace (including Azure AI Foundry hub), then you will have addtional entries for each project workspace created from the hub.

 **Azure Public regions**:
 - ```<per-workspace globally-unique identifier>.workspace.<region the workspace was created in>.api.azureml.ms```

articles/machine-learning/how-to-deploy-models-cohere-command.md

Lines changed: 1 addition & 1 deletion
@@ -2165,4 +2165,4 @@ For more information on how to track costs, see [Monitor costs for models offere
 * [Azure AI Model Inference API](reference-model-inference-api.md)
 * [Deploy models as serverless APIs](how-to-deploy-models-serverless.md)
 * [Region availability for models in serverless API endpoints](concept-endpoint-serverless-availability.md)
-* [Plan and manage costs for Azure AI Studio](concept-plan-manage-cost.md)
+* [Plan and manage costs for Azure AI Foundry](concept-plan-manage-cost.md)

articles/machine-learning/how-to-deploy-models-cohere-embed.md

Lines changed: 1 addition & 1 deletion
@@ -666,4 +666,4 @@ Quota is managed per deployment. Each deployment has a rate limit of 200,000 tok
 * [Azure AI Model Inference API](reference-model-inference-api.md)
 * [Deploy models as serverless APIs](how-to-deploy-models-serverless.md)
 * [Region availability for models in serverless API endpoints](concept-endpoint-serverless-availability.md)
-* [Plan and manage costs for Azure AI Studio](concept-plan-manage-cost.md)
+* [Plan and manage costs for Azure AI Foundry](concept-plan-manage-cost.md)

articles/machine-learning/how-to-deploy-models-cohere-rerank.md

Lines changed: 2 additions & 2 deletions
@@ -13,7 +13,7 @@ author: msakande
 ms.custom: references_regions, build-2024
 ms.collection: ce-skilling-ai-copilot

-#This functionality is also available in Azure AI Studio: /azure/ai-studio/how-to/deploy-models-cohere.md
+#This functionality is also available in Azure AI Foundry portal: /azure/ai-studio/how-to/deploy-models-cohere.md
 ---

 # How to deploy Cohere Rerank models with Azure Machine Learning studio
@@ -250,5 +250,5 @@ For more information on how to track costs, see [Monitor costs for models offere

 - [Model Catalog and Collections](concept-model-catalog.md)
 - [Deploy and score a machine learning model by using an online endpoint](how-to-deploy-online-endpoints.md)
-- [Plan and manage costs for Azure AI Studio](concept-plan-manage-cost.md)
+- [Plan and manage costs for Azure AI Foundry](concept-plan-manage-cost.md)
 - [Region availability for models in serverless API endpoints](concept-endpoint-serverless-availability.md)
