
Commit 9504c1e

Merge branch 'main' into release-foundry-toc
2 parents: c7884a4 + 13cfb7c

22 files changed: +60 −49 lines

articles/ai-foundry/model-inference/concepts/deployment-types.md

Lines changed: 3 additions & 2 deletions

```diff
@@ -31,12 +31,13 @@ To learn more about deployment options for Azure OpenAI models see [Azure OpenAI
 
 Models from third-party model providers with pay-as-you-go billing (collectively called Models-as-a-Service), makes models available in Azure AI model inference under **standard** deployments with a Global processing option (`Global-Standard`).
 
-Models-as-a-Service offers regional deployment options under [Serverless API endpoints](../../../ai-studio/how-to/deploy-models-serverless.md) in Azure AI Foundry. Prompts and outputs are processed within the geography specified during deployment. However, those deployments can't be accessed using the Azure AI model inference endpoint in Azure AI Services.
-
 ### Global-Standard
 
 Global deployments leverage Azure's global infrastructure to dynamically route traffic to the data center with best availability for each request. Global standard provides the highest default quota and eliminates the need to load balance across multiple resources. Data stored at rest remains in the designated Azure geography, while data may be processed for inferencing in any Azure location. Learn more about [data residency](https://azure.microsoft.com/explore/global-infrastructure/data-residency/).
 
+> [!NOTE]
+> Models-as-a-Service offers regional deployment options under [Serverless API endpoints](../../../ai-studio/how-to/deploy-models-serverless.md) in Azure AI Foundry. Prompts and outputs are processed within the geography specified during deployment. However, those deployments can't be accessed using the Azure AI model inference endpoint in Azure AI Services.
+
 ## Control deployment options
 
 Administrators can control which model deployment types are available to their users by using Azure Policies. Learn more about [How to control AI model deployment with custom policies](../../../ai-studio/how-to/custom-policy-model-deployment.md).
```

articles/ai-foundry/model-inference/faq.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -5,7 +5,7 @@ metadata:
   description: Get answers to the most popular questions about Azure AI model inference
   #services: cognitive-services
   manager: nitinme
-  ms.service: azure-ai-models
+  ms.service: azure-ai-model-inference
   ms.topic: faq
   ms.date: 1/21/2025
   ms.author: fasantia
```

articles/ai-foundry/model-inference/index.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -6,7 +6,7 @@ summary: Azure AI model inference provides access to the most powerful models av
 metadata:
   title: Azure AI model inference documentation - Quickstarts, How-to's, API Reference - Azure AI Foundry | Microsoft Docs
   description: Learn how to use flagship models available in the Azure AI model catalog from the key model providers in the industry, including OpenAI, Microsoft, Meta, Mistral, Cohere, G42, and AI21 Labs.
-  ms.service: azure-ai-models
+  ms.service: azure-ai-model-inference
   ms.custom:
   ms.topic: landing-page
   author: mrbullwinkle
```

articles/ai-services/agents/includes/quickstart-csharp.md

Lines changed: 7 additions & 0 deletions

````diff
@@ -38,6 +38,12 @@ dotnet add package Azure.AI.Projects
 dotnet add package Azure.Identity
 ```
 
+Next, to authenticate your API requests and run the program, use the [az login](/cli/azure/authenticate-azure-cli-interactively) command to sign into your Azure subscription.
+
+```azurecli
+az login
+```
+
 Use the following code to create and run an agent. To run this code, you will need to create a connection string using information from your project. This string is in the format:
 
 `<HostName>;<AzureSubscriptionId>;<ResourceGroup>;<ProjectName>`
@@ -56,6 +62,7 @@ For example, your connection string may look something like:
 
 Set this connection string as an environment variable named `PROJECT_CONNECTION_STRING`.
 
+
 ```csharp
 // Copyright (c) Microsoft Corporation. All rights reserved.
 // Licensed under the MIT License.
````
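As a standalone illustration (not part of the commit), the four-part connection string format described in this quickstart can be read back out of the environment variable like this. The example value is hypothetical; real values come from your Azure AI Foundry project.

```python
import os

# Hypothetical example value; substitute the real values from your project.
os.environ["PROJECT_CONNECTION_STRING"] = (
    "eastus.api.azureml.ms;00000000-0000-0000-0000-000000000000;my-resource-group;my-project"
)

# Format documented in the quickstart:
# <HostName>;<AzureSubscriptionId>;<ResourceGroup>;<ProjectName>
host_name, subscription_id, resource_group, project_name = os.environ[
    "PROJECT_CONNECTION_STRING"
].split(";")

print(host_name)     # eastus.api.azureml.ms
print(project_name)  # my-project
```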

articles/ai-services/agents/includes/quickstart-python-openai.md

Lines changed: 6 additions & 0 deletions

````diff
@@ -41,6 +41,12 @@ pip install azure-identity
 pip install openai
 ```
 
+Next, to authenticate your API requests and run the program, use the [az login](/cli/azure/authenticate-azure-cli-interactively) command to sign into your Azure subscription.
+
+```azurecli
+az login
+```
+
 Use the following code to create and run an agent. To run this code, you will need to create a connection string using information from your project. This string is in the format:
 
 `<HostName>;<AzureSubscriptionId>;<ResourceGroup>;<ProjectName>`
````

articles/ai-services/agents/includes/quickstart-python.md

Lines changed: 6 additions & 1 deletion

````diff
@@ -38,6 +38,11 @@ Run the following commands to install the python packages.
 pip install azure-ai-projects
 pip install azure-identity
 ```
+Next, to authenticate your API requests and run the program, use the [az login](/cli/azure/authenticate-azure-cli-interactively) command to sign into your Azure subscription.
+
+```azurecli
+az login
+```
 
 Use the following code to create and run an agent. To run this code, you will need to create a connection string using information from your project. This string is in the format:
 
@@ -115,7 +120,7 @@ with project_client:
     print(f"Messages: {messages}")
 
     # Get the last message from the sender
-    last_msg = messages.get_last_text_message_by_sender("assistant")
+    last_msg = messages.get_last_text_message_by_role("assistant")
     if last_msg:
         print(f"Last Message: {last_msg.text.value}")
````
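The second hunk tracks an SDK rename: the lookup helper is now `get_last_text_message_by_role` rather than `get_last_text_message_by_sender`. As a standalone sketch of that role-based lookup pattern, with plain dicts standing in for the SDK's message objects (the real `azure-ai-projects` objects have a different API):

```python
# Simplified stand-ins for thread messages; real SDK message objects differ.
messages = [
    {"role": "user", "text": "Hi, draw a bar chart for me."},
    {"role": "assistant", "text": "Sure, here is the chart."},
    {"role": "user", "text": "Thanks!"},
    {"role": "assistant", "text": "You're welcome."},
]

def get_last_text_message_by_role(msgs, role):
    """Return the most recent message authored with the given role, or None."""
    for msg in reversed(msgs):
        if msg["role"] == role:
            return msg
    return None

last_msg = get_last_text_message_by_role(messages, "assistant")
if last_msg:
    print(f"Last Message: {last_msg['text']}")  # Last Message: You're welcome.
```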

articles/ai-services/immersive-reader/how-to-cache-token.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -61,7 +61,7 @@ public async Task<string> GetTokenAsync()
 
 The `AuthenticationResult` object has an `AccessToken` property, which is the actual token you use when launching the Immersive Reader using the SDK. It also has an `ExpiresOn` property that denotes when the token expires. Before launching the Immersive Reader, you can check whether the token is expired, and acquire a new token only if it expired.
 
-## Using Node.JS
+## Using Node.js
 
 Add the [request](https://www.npmjs.com/package/request) npm package to your project. Use the following code to acquire a token, using the authentication values you got when you [created the Immersive Reader resource](./how-to-create-immersive-reader.md).
```
articles/ai-services/openai/how-to/assistant.md

Lines changed: 1 addition & 6 deletions

```diff
@@ -7,7 +7,7 @@ manager: nitinme
 ms.service: azure-ai-openai
 ms.custom: references_regions
 ms.topic: how-to
-ms.date: 05/20/2024
+ms.date: 01/28/2025
 author: aahill
 ms.author: aahi
 recommendations: false
@@ -26,11 +26,6 @@ Azure OpenAI Assistants (Preview) allows you to create AI assistants tailored to
 
 Code interpreter is available in all regions supported by Azure OpenAI Assistants. The [models page](../concepts/models.md#assistants-preview) contains the most up-to-date information on regions/models where Assistants are currently supported.
 
-### API Versions
-
-- `2024-02-15-preview`
-- `2024-05-01-preview`
-
 ### Supported file types
 
 |File format|MIME Type|Code Interpreter |
```

articles/ai-services/openai/how-to/assistants-logic-apps.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -6,7 +6,7 @@ services: cognitive-services
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: how-to
-ms.date: 05/21/2024
+ms.date: 01/28/2025
 author: aahill
 ms.author: aahi
 recommendations: false
```

articles/ai-services/openai/how-to/code-interpreter.md

Lines changed: 3 additions & 4 deletions

````diff
@@ -6,7 +6,7 @@ services: cognitive-services
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: how-to
-ms.date: 10/15/2024
+ms.date: 01/28/2025
 author: aahill
 ms.author: aahi
 recommendations: false
@@ -32,8 +32,7 @@ We recommend using assistants with the latest models to take advantage of the ne
 
 ### API Versions
 
-- `2024-02-15-preview`
-- `2024-05-01-preview`
+- Starting in `2024-02-15-preview`
 
 ### Supported file types
 
@@ -94,7 +93,7 @@ assistant = client.beta.assistants.create(
 # [REST](#tab/rest)
 
 > [!NOTE]
-> With Azure OpenAI the `model` parameter requires model deployment name. If your model deployment name is different than the underlying model name then you would adjust your code to ` "model": "{your-custom-model-deployment-name}"`.
+> With Azure OpenAI the `model` parameter requires model deployment name. If your model deployment name is different than the underlying model name, then you would adjust your code to ` "model": "{your-custom-model-deployment-name}"`.
 
 ```console
 curl https://YOUR_RESOURCE_NAME.openai.azure.com/openai/assistants?api-version=2024-05-01-preview \
````
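The corrected note is the key detail when calling this endpoint programmatically: `model` takes the *deployment* name, not necessarily the underlying model name. A rough sketch of assembling the request URL and body (all names here are hypothetical placeholders, and nothing is actually sent):

```python
# Hypothetical placeholder names; substitute your own resource and deployment.
resource_name = "YOUR_RESOURCE_NAME"
api_version = "2024-05-01-preview"
url = f"https://{resource_name}.openai.azure.com/openai/assistants?api-version={api_version}"

# Per the note above, `model` must be the deployment name, which may differ
# from the underlying model name.
body = {
    "model": "my-gpt-4o-deployment",  # deployment name, not necessarily the model name
    "tools": [{"type": "code_interpreter"}],
}
print(url)
```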
