Commit b99d64b
fixing merge conflict
2 parents f9af1fb + 48d232c

221 files changed: +2017 −4463 lines


.openpublishing.redirection.json

Lines changed: 25 additions & 0 deletions
```diff
@@ -294,6 +294,31 @@
     "source_path_from_root": "/articles/ai-services/speech-service/text-to-speech-avatar/custom-avatar-endpoint.md",
     "redirect_url": "/azure/ai-services/speech-service/custom-avatar-create",
     "redirect_document_id": false
+  },
+  {
+    "source_path_from_root": "/articles/ai-services/speech-service/migration-overview-neural-voice.md",
+    "redirect_url": "/azure/ai-services/speech-service/custom-neural-voice",
+    "redirect_document_id": false
+  },
+  {
+    "source_path_from_root": "/articles/ai-services/speech-service/how-to-migrate-to-custom-neural-voice.md",
+    "redirect_url": "/azure/ai-services/speech-service/custom-neural-voice",
+    "redirect_document_id": false
+  },
+  {
+    "source_path_from_root": "/articles/ai-services/speech-service/how-to-migrate-to-prebuilt-neural-voice.md",
+    "redirect_url": "/azure/ai-services/speech-service/custom-neural-voice",
+    "redirect_document_id": false
+  },
+  {
+    "source_path_from_root": "/articles/ai-foundry/quickstarts/hear-speak-playground.md",
+    "redirect_url": "/azure/ai-foundry/quickstarts/get-started-playground",
+    "redirect_document_id": false
+  },
+  {
+    "source_path_from_root": "/articles/ai-services/language-service/tutorials/prompt-flow.md",
+    "redirect_url": "/azure/ai-services/language-service/tutorials/power-automate",
+    "redirect_document_id": false
   }
 ]
}
```
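Every entry added in this file follows the same three-key shape. As an illustrative sanity check only (this script is not part of the commit), the entries can be validated with a few lines of standard-library Python:

```python
import json

# Hypothetical validation sketch for .openpublishing.redirection.json
# entries; REQUIRED_KEYS mirrors the three keys each added entry uses.
REQUIRED_KEYS = {"source_path_from_root", "redirect_url", "redirect_document_id"}

entries = json.loads("""
[
  {
    "source_path_from_root": "/articles/ai-services/speech-service/migration-overview-neural-voice.md",
    "redirect_url": "/azure/ai-services/speech-service/custom-neural-voice",
    "redirect_document_id": false
  }
]
""")

for entry in entries:
    missing = REQUIRED_KEYS - entry.keys()
    assert not missing, f"entry missing keys: {missing}"
    # Source paths are repo-relative Markdown files; redirect URLs are
    # site-relative and drop the .md extension.
    assert entry["source_path_from_root"].startswith("/articles/")
    assert entry["source_path_from_root"].endswith(".md")
    assert entry["redirect_url"].startswith("/azure/")

print("redirect entries OK")
```

A check like this could run in CI before merging redirect changes.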

articles/ai-foundry/concepts/concept-model-distillation.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -5,7 +5,7 @@ description: Learn how to do distillation in Azure AI Foundry portal.
 manager: scottpolly
 ms.service: azure-ai-foundry
 ms.topic: how-to
-ms.date: 03/09/2025
+ms.date: 05/20/2025
 ms.reviewer: vkann
 reviewer: anshirga
 ms.author: ssalgado
```

articles/ai-foundry/concepts/concept-synthetic-data.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -5,7 +5,7 @@ description: Learn how to generate a synthetic dataset in Azure AI Foundry porta
 manager: scottpolly
 ms.service: azure-ai-foundry
 ms.topic: how-to
-ms.date: 03/11/2025
+ms.date: 05/20/2025
 ms.reviewer: vkann
 reviewer: anshirga
 ms.author: ssalgado
```

articles/ai-foundry/foundry-local/get-started.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -6,7 +6,7 @@ manager: scottpolly
 keywords: Azure AI services, cognitive, AI models, local inference
 ms.service: azure-ai-foundry
 ms.topic: quickstart
-ms.date: 05/20/2025
+ms.date: 05/23/2025
 ms.reviewer: samkemp
 ms.author: samkemp
 author: samuel100
@@ -24,7 +24,7 @@ This guide walks you through setting up Foundry Local to run AI models on your d
 
 Your system must meet the following requirements to run Foundry Local:
 
-- **Operating System**: Windows 10 (x64), Windows 11 (x64/ARM), macOS.
+- **Operating System**: Windows 10 (x64), Windows 11 (x64/ARM), Windows Server 2025, macOS.
 - **Hardware**: Minimum 8GB RAM, 3GB free disk space. Recommended 16GB RAM, 15GB free disk space.
 - **Network**: Internet connection for initial model download (optional for offline use)
 - **Acceleration (optional)**: NVIDIA GPU (2,000 series or newer), AMD GPU (6,000 series or newer), Qualcomm Snapdragon X Elite (8GB or more of memory), or Apple silicon.
```
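The disk-space minimums in the requirements above can be spot-checked locally. This is an illustrative sketch (not part of the quickstart) using only the Python standard library:

```python
import shutil

# Documented Foundry Local disk minimums: 3 GB free (15 GB recommended).
MIN_FREE_GB = 3
REC_FREE_GB = 15

# Check free space on the current drive; RAM is omitted here because a
# portable probe needs platform-specific calls.
free_gb = shutil.disk_usage(".").free / 10**9
if free_gb >= REC_FREE_GB:
    print(f"{free_gb:.1f} GB free: meets the recommended spec")
elif free_gb >= MIN_FREE_GB:
    print(f"{free_gb:.1f} GB free: meets the minimum spec")
else:
    print(f"{free_gb:.1f} GB free: below the documented minimum")
```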

articles/ai-foundry/how-to/configure-managed-network.md

Lines changed: 27 additions & 2 deletions
```diff
@@ -855,8 +855,33 @@ When you create a private endpoint for hub dependency resources, such as Azure S
 
 A private endpoint is automatically created for a connection if the target resource is an Azure resource listed previously. A valid target ID is expected for the private endpoint. A valid target ID for the connection can be the Azure Resource Manager ID of a parent resource. The target ID is also expected in the target of the connection or in `metadata.resourceid`. For more on connections, see [How to add a new connection in Azure AI Foundry portal](connections-add.md).
 
-> [!IMPORTANT]
-> As of April 30th 2025, the Azure AI Enterprise Network Connection Approver role must be assigned to the Azure AI Foundry hub's managed identity to approve private endpoints to securely access your Azure resources from the managed virtual network. This doesn't impact existing resources with approved private endpoints as the role is correctly assigned by the service. For new resources, ensure the role is assigned to the hub's managed identity. For Azure Data Factory, Azure Databricks, and Azure Function Apps, the Contributor role should instead be assigned to your hub's managed identity. This role assignment is applicable to both User-assigned identity and System-assigned identity workspaces.
+### Approval of Private Endpoints
+
+To establish Private Endpoint connections in managed virtual networks using Azure AI Foundry, the workspace managed identity, whether system-assigned or user-assigned, must have permissions to approve the Private Endpoint connections on the target resources. Previously, this was done through automatic role assignments by the Azure AI Foundry service. However, there are security concerns about the automatic role assignment. To improve security, starting April 30th, 2025, we will discontinue this automatic permission grant logic. We recommend assigning the [Azure AI Enterprise Network Connection Approver role](/azure/role-based-access-control/built-in-roles/ai-machine-learning), or a custom role with the necessary Private Endpoint connection permissions on the target resource types, to the Azure Machine Learning workspace's managed identity to allow Azure AI Foundry services to approve Private Endpoint connections to the target Azure resources.
+
+Here's the list of private endpoint target resource types covered by the Azure AI Enterprise Network Connection Approver role:
+
+* Azure Application Gateway
+* Azure Monitor
+* Azure AI Search
+* Event Hubs
+* Azure SQL Database
+* Azure Storage
+* Azure Machine Learning workspace
+* Azure Machine Learning registry
+* Azure AI Foundry
+* Azure Key Vault
+* Azure Cosmos DB
+* Azure Database for MySQL
+* Azure Database for PostgreSQL
+* Azure AI Services
+* Azure Cache for Redis
+* Container Registry
+* API Management
+
+For creating Private Endpoint outbound rules to target resource types not covered by the Azure AI Enterprise Network Connection Approver role, such as Azure Data Factory, Azure Databricks, and Azure Function Apps, a custom scoped-down role is recommended, defined only by the actions necessary to approve private endpoint connections on the target resource types.
+
+For creating Private Endpoint outbound rules to default workspace resources, the required permissions are automatically covered by the role assignments granted during workspace creation, so no additional action is needed.
 
 ## Select an Azure Firewall version for allowed only approved outbound
 
```
articles/ai-foundry/how-to/connections-add.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -53,7 +53,7 @@ Here's a table of some of the available connection types in Azure AI Foundry por
 |-------------------------------|:-------:|:--------------------------------------:|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
 | Azure AI Search | || Azure AI Search is an Azure resource that supports information retrieval over your vector and textual data stored in search indexes. |
 | Azure Storage | || Azure Storage is a cloud storage solution for storing unstructured data like documents, images, videos, and application installers. |
-| Azure Cosmos DB | || Azure Cosmos DB is a globally distributed, multi-model database service that offers low latency, high availability, and scalability across multiple geographical regions. |
+| Azure Cosmos DB | || Azure Cosmos DB is a globally distributed, multi-model database service that offers low latency, high availability, and scalability across multiple geographical regions. |
 | Azure OpenAI | | | Azure OpenAI is a service that provides access to OpenAI's models including the GPT-4o, GPT-4o mini, GPT-4, GPT-4 Turbo with Vision, GPT-3.5-Turbo, DALLE-3, and Embeddings model series with the security and enterprise capabilities of Azure. |
 | Application Insights | | | Azure Application Insights is a service within Azure Monitor that enables developers and DevOps teams to automatically detect performance anomalies, diagnose issues, and gain deep insights into application usage and behavior through powerful telemetry and analytics tools. |
 | API key | | | API Key connections handle authentication to your specified target on an individual basis. |
```

articles/ai-foundry/how-to/create-projects.md

Lines changed: 6 additions & 9 deletions
```diff
@@ -9,7 +9,7 @@ ms.custom:
 - build-2024
 - ignite-2024
 ms.topic: how-to
-ms.date: 04/11/2025
+ms.date: 05/20/2025
 ms.reviewer: deeikele
 ms.author: sgilley
 author: sdgilley
@@ -19,7 +19,7 @@ zone_pivot_groups: project-type
 
 # Create a project for Azure AI Foundry
 
-This article describes how to create an [Azure AI Foundry](https://ai.azure.com) project. Projects are folders that let you organize your work for exploring new ideas, and as you're prototyping on a particular use cases.
+This article describes how to create an [Azure AI Foundry](https://ai.azure.com) project. Projects let you organize your work for exploring new ideas and as you prototype on a particular use case.
 
 Azure AI Foundry supports two types of projects: a **[!INCLUDE [fdp](../includes/fdp-project-name.md)]** and a **[!INCLUDE [hub](../includes/hub-project-name.md)]**. For more information about the differences between these two project types, see [Types of projects](../what-is-azure-ai-foundry.md#project-types).
 
@@ -44,7 +44,6 @@ Azure AI Foundry supports two types of projects: a **[!INCLUDE [fdp](../includes
 * Evaluations
 * Playgrounds
 
-
 ## Prerequisites
 
 Use the following tabs to select the method you plan to use to create a [!INCLUDE [hub](../includes/hub-project-name.md)]:
@@ -70,7 +69,7 @@ Use the following tabs to select the method you plan to use to create a [!INCLUD
 
 ---
 
-## Create a project
+## Create a [!INCLUDE [hub-project-name](../includes/hub-project-name.md)]
 
 # [Azure AI Foundry portal](#tab/ai-foundry)
 
@@ -129,7 +128,7 @@ On the project **Overview** page, you can find information about the project.
 
 :::image type="content" source="../media/how-to/projects/project-settings.png" alt-text="Screenshot of an Azure AI Foundry project settings page." lightbox = "../media/how-to/projects/project-settings.png":::
 
-- Name: The name of the project appears in the top left corner. You can rename the project using the edit tool.
+- Name: The name of the project appears in the top left corner.
 - Subscription: The subscription that hosts the hub that hosts the project.
 - Resource group: The resource group that hosts the hub that hosts the project.
 
@@ -176,9 +175,7 @@ In addition, many resources are only accessible by users in your project workspa
 
 ## Related content
 
-- [Quickstart: Get started with Azure AI Foundry](../quickstarts/get-started-code.md?pivots=hub-project)
-
-- [Learn more about Azure AI Foundry](../what-is-azure-ai-foundry.md)
+- [Quickstart: Use the chat playground in Azure AI Foundry portal](../quickstarts/get-started-playground.md)
 
 - [Learn more about hubs](../concepts/ai-resources.md)
 
@@ -188,4 +185,4 @@ In addition, many resources are only accessible by users in your project workspa
 
 [!INCLUDE [create-project-fdp](../includes/create-project-fdp.md)]
 
-::: zone-end
+::: zone-end
```

articles/ai-foundry/how-to/data-add.md

Lines changed: 5 additions & 7 deletions
```diff
@@ -9,9 +9,9 @@ ms.custom:
 - build-2024
 - ignite-2024
 ms.topic: how-to
-ms.date: 02/11/2025
-ms.author: franksolomon
-author: fbsolo-ms1
+ms.date: 05/21/2025
+author: Blackmist
+ms.author: larryfr
 ---
 
 # How to add and manage data in your Azure AI Foundry project
@@ -29,12 +29,10 @@ Data can help when you need these capabilities:
 > - **Lineage:** For any given data, you can view which jobs or prompt flow pipelines consume the data.
 > - **Ease-of-use:** An Azure AI Foundry data resembles web browser bookmarks (favorites). Instead of remembering long storage paths that *reference* your frequently-used data on Azure Storage, you can create a data *version* and then access that version of the asset with a friendly name.
 
-## Prerequisites
 
-To create and work with data, you need:
+## Prerequisites
 
-- An Azure subscription. If you don't have one, create a [free account](https://azure.microsoft.com/free/).
-- An [Azure AI Foundry project](../how-to/create-projects.md).
+[!INCLUDE [hub-only-prereq](../includes/hub-only-prereq.md)]
 
 ## Create data
 
```
articles/ai-foundry/how-to/develop/langchain.md

Lines changed: 26 additions & 31 deletions
````diff
@@ -51,8 +51,11 @@ To use LLMs deployed in Azure AI Foundry portal, you need the endpoint and crede
 [!INCLUDE [tip-left-pane](../../includes/tip-left-pane.md)]
 
 1. Go to the [Azure AI Foundry](https://ai.azure.com/).
+
 1. Open the project where the model is deployed, if it isn't already open.
+
 1. Go to **Models + endpoints** and select the model you deployed as indicated in the prerequisites.
+
 1. Copy the endpoint URL and the key.
 
    :::image type="content" source="../../media/how-to/inference/serverless-endpoint-url-keys.png" alt-text="Screenshot of the option to copy endpoint URI and keys from an endpoint." lightbox="../../media/how-to/inference/serverless-endpoint-url-keys.png":::
@@ -63,11 +66,19 @@
 In this scenario, we placed both the endpoint URL and key in the following environment variables:
 
 ```bash
-export AZURE_INFERENCE_ENDPOINT="<your-model-endpoint-goes-here>"
+export AZURE_INFERENCE_ENDPOINT="https://<resource>.services.ai.azure.com/models"
 export AZURE_INFERENCE_CREDENTIAL="<your-key-goes-here>"
 ```
 
-Once configured, create a client to connect to the endpoint. In this case, we're working with a chat completions model hence we import the class `AzureAIChatCompletionsModel`.
+Once configured, create a client to connect with the chat model by using `init_chat_model`. For Azure OpenAI models, configure the client as indicated at [Using Azure OpenAI models](#using-azure-openai-models).
+
+```python
+from langchain.chat_models import init_chat_model
+
+llm = init_chat_model(model="mistral-large-2407", model_provider="azure_ai")
+```
+
+You can also use the class `AzureAIChatCompletionsModel` directly.
 
 ```python
 import os
@@ -80,8 +91,8 @@ model = AzureAIChatCompletionsModel(
 )
 ```
 
-> [!TIP]
-> For Azure OpenAI models, configure the client as indicated at [Using Azure OpenAI models](#using-azure-openai-models).
+> [!CAUTION]
+> **Breaking change:** Parameter `model_name` was renamed `model` in version `0.1.3`.
 
 You can use the following code to create the client if your endpoint supports Microsoft Entra ID:
 
@@ -93,7 +104,7 @@ from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
 model = AzureAIChatCompletionsModel(
     endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
     credential=DefaultAzureCredential(),
-    model_name="mistral-large-2407",
+    model="mistral-large-2407",
 )
 ```
@@ -111,11 +122,11 @@ from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
 model = AzureAIChatCompletionsModel(
     endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
     credential=DefaultAzureCredentialAsync(),
-    model_name="mistral-large-2407",
+    model="mistral-large-2407",
 )
 ```
 
-If your endpoint is serving one model, like with the standard deployment, you don't have to indicate `model_name` parameter:
+If your endpoint is serving one model, like with the standard deployment, you don't have to indicate the `model` parameter:
 
 ```python
 import os
@@ -180,21 +191,21 @@ chain.invoke({"language": "italian", "text": "hi"})
 
 Models deployed to Azure AI Foundry support the Foundry Models API, which is standard across all the models. Chain multiple LLM operations based on the capabilities of each model so you can optimize for the right model based on capabilities.
 
-In the following example, we create two model clients. One is a producer and another one is a verifier. To make the distinction clear, we're using a multi-model endpoint like the [Foundry Models API](../../model-inference/overview.md) and hence we're passing the parameter `model_name` to use a `Mistral-Large` and a `Mistral-Small` model, quoting the fact that **producing content is more complex than verifying it**.
+In the following example, we create two model clients. One is a producer and another one is a verifier. To make the distinction clear, we're using a multi-model endpoint like the [Foundry Models API](../../model-inference/overview.md) and hence we're passing the parameter `model` to use a `Mistral-Large` and a `Mistral-Small` model, quoting the fact that **producing content is more complex than verifying it**.
 
 ```python
 from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
 
 producer = AzureAIChatCompletionsModel(
     endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
     credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-    model_name="mistral-large-2407",
+    model="mistral-large-2407",
 )
 
 verifier = AzureAIChatCompletionsModel(
     endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
     credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-    model_name="mistral-small",
+    model="mistral-small",
 )
 ```
@@ -271,7 +282,7 @@ from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel
 embed_model = AzureAIEmbeddingsModel(
     endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
     credential=os.environ['AZURE_INFERENCE_CREDENTIAL'],
-    model_name="text-embedding-3-large",
+    model="text-embedding-3-large",
 )
 ```
@@ -305,31 +316,15 @@ for doc in results:
 
 ## Using Azure OpenAI models
 
-If you're using Azure OpenAI in Foundry Models or Foundry Models service with OpenAI models with `langchain-azure-ai` package, you might need to use `api_version` parameter to select a specific API version. The following example shows how to connect to an Azure OpenAI in Foundry Models deployment:
-
-```python
-from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
-
-llm = AzureAIChatCompletionsModel(
-    endpoint="https://<resource>.openai.azure.com/openai/deployments/<deployment-name>",
-    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-    api_version="2024-05-01-preview",
-)
-```
-
-> [!IMPORTANT]
-> Check which is the API version that your deployment is using. Using a wrong `api_version` or one not supported by the model results in a `ResourceNotFound` exception.
-
-If the deployment is hosted in Azure AI Services, you can use the Foundry Models service:
+If you're using Azure OpenAI models with the `langchain-azure-ai` package, use the following endpoint URL:
 
 ```python
 from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
 
 llm = AzureAIChatCompletionsModel(
-    endpoint="https://<resource>.services.ai.azure.com/models",
+    endpoint="https://<resource>.openai.azure.com/openai/v1",
     credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-    model_name="<model-name>",
-    api_version="2024-05-01-preview",
+    model="gpt-4o"
 )
 ```
@@ -370,7 +365,7 @@ from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
 model = AzureAIChatCompletionsModel(
     endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
     credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-    model_name="mistral-large-2407",
+    model="mistral-large-2407",
     client_kwargs={"logging_enable": True},
 )
 ```
````
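The `model_name` to `model` rename flagged by the new breaking-change callout in this file can be smoothed over in calling code. The following is a hypothetical compatibility shim (not part of `langchain-azure-ai`) that only demonstrates the keyword translation; it doesn't construct a client:

```python
# Hypothetical shim: translate the pre-0.1.3 `model_name` keyword to
# the `model` keyword used by langchain-azure-ai 0.1.3 and later.
def translate_model_kwarg(**kwargs):
    if "model_name" in kwargs and "model" not in kwargs:
        kwargs["model"] = kwargs.pop("model_name")
    return kwargs

# The translated kwargs could then be splatted into a client, e.g.
# AzureAIChatCompletionsModel(**translate_model_kwarg(...)).
print(translate_model_kwarg(model_name="mistral-large-2407"))
# → {'model': 'mistral-large-2407'}
```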

articles/ai-foundry/includes/chat-with-data.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -15,7 +15,7 @@ To complete this section, you need a local copy of product data. The [Azure-Samp
 Follow these steps to add your data in the chat playground to help the assistant answer questions about your products. You're not changing the deployed model itself. Your data is stored separately and securely in your Azure subscription.
 
 1. Go to your project in [Azure AI Foundry](https://ai.azure.com).
-1. Select **Playgrounds**.
+1. Select **Playgrounds** from the left pane.
 1. Select **Try the chat playground**.
 1. Select your deployed chat model from the **Deployment** dropdown.
 
```