
Commit 2b72650

Merge pull request #265455 from MicrosoftDocs/main: 7/06 OOB Publish
2 parents: 0f093a4 + 38d7dd0


63 files changed: +2851 additions, -166 deletions

articles/ai-services/openai/api-version-deprecation.md

Lines changed: 1 addition & 0 deletions
@@ -22,6 +22,7 @@ Azure OpenAI API version 2023-12-01-preview is currently the latest preview rele
This version contains support for all the latest Azure OpenAI features including:

- [Text to speech](./text-to-speech-quickstart.md). [**Added in 2024-02-15-preview**]
- [Fine-tuning](./how-to/fine-tuning.md) `gpt-35-turbo`, `babbage-002`, and `davinci-002` models. [**Added in 2023-10-01-preview**]
- [Whisper](./whisper-quickstart.md). [**Added in 2023-09-01-preview**]
- [Function calling](./how-to/function-calling.md). [**Added in 2023-07-01-preview**]
Lines changed: 36 additions & 0 deletions
@@ -0,0 +1,36 @@
---
title: Quickstart - Getting started with Azure OpenAI Assistants (Preview)
titleSuffix: Azure OpenAI
description: Walkthrough on how to get started with Azure OpenAI assistants with new features like code interpreter and retrieval.
manager: nitinme
ms.service: azure-ai-openai
ms.topic: quickstart
author: mrbullwinkle
ms.author: mbullwin
ms.date: 02/01/2024
zone_pivot_groups: openai-quickstart
recommendations: false
---

# Quickstart: Get started using Azure OpenAI Assistants (Preview)

Azure OpenAI Assistants (Preview) allows you to create AI assistants tailored to your needs through custom instructions and augmented by advanced tools like code interpreter and custom functions.

::: zone pivot="programming-language-studio"

[!INCLUDE [Studio quickstart](includes/assistants-studio.md)]

::: zone-end

::: zone pivot="programming-language-python"

[!INCLUDE [Python SDK quickstart](includes/assistants-python.md)]

::: zone-end

::: zone pivot="rest-api"

[!INCLUDE [REST API quickstart](includes/assistants-rest.md)]

::: zone-end
Lines changed: 53 additions & 0 deletions
@@ -0,0 +1,53 @@
---
title: Azure OpenAI Service Assistant API concepts
titleSuffix: Azure OpenAI Service
description: Learn about the concepts behind the Azure OpenAI Assistants API.
ms.topic: conceptual
ms.date: 02/05/2023
manager: nitinme
author: mrbullwinkle
ms.author: mbullwin
recommendations: false
---

# Azure OpenAI Assistants API (Preview)

Assistants, a new feature of Azure OpenAI Service, is now available in public preview. The Assistants API makes it easier for developers to create applications with sophisticated copilot-like experiences that can sift through data, suggest solutions, and automate tasks.

## Overview

Previously, building custom AI assistants required heavy lifting even for experienced developers. While the chat completions API is lightweight and powerful, it's inherently stateless, which meant that developers had to manually manage conversation state and chat threads, tool integrations, retrieval documents and indexes, and code execution.

The Assistants API, as the stateful evolution of the chat completions API, provides a solution for these challenges. It supports persistent, automatically managed threads, which means that as a developer you no longer need to build conversation state management systems or work around a model's context window constraints. The Assistants API automatically handles the optimizations needed to keep a thread below the max context window of your chosen model. Once you create a Thread, you can simply append new messages to it as users respond. Assistants can also access multiple tools in parallel, if needed. These tools include:

- [Code Interpreter](../how-to/code-interpreter.md)
- [Function calling](../how-to/assistant-functions.md)

The Assistants API is built on the same capabilities that power OpenAI's GPT product. Possible use cases range from AI-powered product recommenders and sales analyst apps to coding assistants and employee Q&A chatbots. Start building in the no-code Assistants playground in Azure OpenAI Studio, or start building with the API.

> [!IMPORTANT]
> Retrieving untrusted data using Function calling, Code Interpreter with file input, and Assistant Threads functionalities could compromise the security of your Assistant, or the application that uses the Assistant. Learn about mitigation approaches [here](https://aka.ms/oai/assistant-rai).

## Assistants playground

We provide a walkthrough of the Assistants playground in our [quickstart guide](../assistants-quickstart.md). It provides a no-code environment to test out the capabilities of assistants.

## Assistants components

| **Component** | **Description** |
|---|---|
| **Assistant** | Custom AI that uses Azure OpenAI models in conjunction with tools. |
| **Thread** | A conversation session between an Assistant and a user. Threads store Messages and automatically handle truncation to fit content into a model's context. |
| **Message** | A message created by an Assistant or a user. Messages can include text, images, and other files. Messages are stored as a list on the Thread. |
| **Run** | Activation of an Assistant to begin running based on the contents of the Thread. The Assistant uses its configuration and the Thread's Messages to perform tasks by calling models and tools. As part of a Run, the Assistant appends Messages to the Thread. |
| **Run Step** | A detailed list of steps the Assistant took as part of a Run. An Assistant can call tools or create Messages during its Run. Examining Run Steps allows you to understand how the Assistant arrives at its final results. |

## See also

* Learn more about Assistants and [Code Interpreter](../how-to/code-interpreter.md)
* Learn more about Assistants and [function calling](../how-to/assistant-functions.md)
* [Azure OpenAI Assistants API samples](https://github.com/Azure-Samples/azureai-samples/tree/main/scenarios/Assistants)
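The Thread component's automatic truncation can be pictured with a short sketch. The service performs this server-side for you; the snippet below is only an illustration of the idea, with word count standing in for a real tokenizer:

```python
# Illustrative only: the Assistants API truncates threads server-side.
# Word count is a stand-in here for real token counting.

def truncate_thread(messages, max_tokens):
    """Keep the most recent messages whose combined size fits the context window."""
    kept = []
    total = 0
    for message in reversed(messages):  # walk from newest to oldest
        size = len(message.split())
        if total + size > max_tokens:
            break  # older messages no longer fit; drop them
        total += size
        kept.append(message)
    return list(reversed(kept))  # restore chronological order
```

As the conversation grows, the oldest messages fall out of the window first, which is why you can keep appending new messages to a Thread indefinitely.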

articles/ai-services/openai/concepts/models.md

Lines changed: 27 additions & 0 deletions
@@ -23,6 +23,7 @@ Azure OpenAI Service is powered by a diverse set of models with different capabi
| [Embeddings](#embeddings-models) | A set of models that can convert text into numerical vector form to facilitate text similarity. |
| [DALL-E](#dall-e-models-preview) (Preview) | A series of models in preview that can generate original images from natural language. |
| [Whisper](#whisper-models-preview) (Preview) | A series of models in preview that can transcribe and translate speech to text. |
| [Text to speech](#text-to-speech-models-preview) (Preview) | A series of models in preview that can synthesize text to speech. |

## GPT-4 and GPT-4 Turbo Preview

@@ -65,6 +66,12 @@ The Whisper models, currently in preview, can be used for speech to text.
You can also use the Whisper model via the Azure AI Speech [batch transcription](../../speech-service/batch-transcription-create.md) API. Check out [What is the Whisper model?](../../speech-service/whisper-overview.md) to learn more about when to use Azure AI Speech vs. Azure OpenAI Service.

## Text to speech (Preview)

The OpenAI text to speech models, currently in preview, can be used to synthesize text to speech.

You can also use the OpenAI text to speech voices via Azure AI Speech. To learn more, see the [OpenAI text to speech voices via Azure OpenAI Service or via Azure AI Speech](../../speech-service/openai-voices.md#openai-text-to-speech-voices-via-azure-openai-service-or-via-azure-ai-speech) guide.

## Model summary table and region availability

> [!IMPORTANT]
@@ -199,13 +206,33 @@ The following Embeddings models are available with [Azure Government](/azure/azu
| `babbage-002` | North Central US <br> Sweden Central | 16,384 | Sep 2021 |
| `davinci-002` | North Central US <br> Sweden Central | 16,384 | Sep 2021 |
| `gpt-35-turbo` (0613) | North Central US <br> Sweden Central | 4,096 | Sep 2021 |
| `gpt-35-turbo` (1106) | North Central US <br> Sweden Central | Input: 16,385<br> Output: 4,096 | Sep 2021 |

### Whisper models (Preview)

| Model ID | Model Availability | Max Request (audio file size) |
| --- | --- | :---: |
| `whisper` | North Central US <br> West Europe | 25 MB |

### Text to speech models (Preview)

| Model ID | Model Availability |
| --- | --- |
| `tts-1` | North Central US <br> Sweden Central |
| `tts-1-hd` | North Central US <br> Sweden Central |

### Assistants (Preview)

For Assistants, you need a combination of a supported model and a supported region. Certain tools and capabilities require the latest models. For example, [parallel function calling](../how-to/assistant-functions.md) requires the latest 1106 models.

| Region | `gpt-35-turbo (1106)` | `gpt-4 (1106-preview)` | `gpt-4 (0613)` | `gpt-4 (0314)` | `gpt-35-turbo (0301)` | `gpt-35-turbo (0613)` | `gpt-35-turbo-16k (0613)` | `gpt-4-32k (0314)` | `gpt-4-32k (0613)` |
|---|---|---|---|---|---|---|---|---|---|
| Sweden Central ||||||||||
| East US 2 ||||||||||
| Australia East ||||||||||

## Next steps

- [Learn more about working with Azure OpenAI models](../how-to/working-with-models.md)
Lines changed: 219 additions & 0 deletions
@@ -0,0 +1,219 @@
---
title: 'How to use Azure OpenAI Assistants function calling'
titleSuffix: Azure OpenAI
description: Learn how to use Assistants function calling
services: cognitive-services
manager: nitinme
ms.service: azure-ai-openai
ms.topic: how-to
ms.date: 02/01/2024
author: mrbullwinkle
ms.author: mbullwin
recommendations: false
---

# Azure OpenAI Assistants function calling

The Assistants API supports function calling, which allows you to describe the structure of functions to an Assistant and have it return the functions that should be called, along with their arguments.

## Function calling support

### Supported models

The [models page](../concepts/models.md#assistants-preview) contains the most up-to-date information on regions/models where Assistants are supported.

To use all features of function calling, including parallel functions, you need to use the latest models.

### API version

- `2024-02-15-preview`

## Example function definition

# [Python 1.x](#tab/python)

```python
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_KEY"),
    api_version="2024-02-15-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT")
)

assistant = client.beta.assistants.create(
    instructions="You are a weather bot. Use the provided functions to answer questions.",
    model="gpt-4-1106-preview",  # Replace with your model deployment name
    tools=[{
        "type": "function",
        "function": {
            "name": "getCurrentWeather",
            "description": "Get the weather in location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"},
                    "unit": {"type": "string", "enum": ["c", "f"]}
                },
                "required": ["location"]
            }
        }
    }, {
        "type": "function",
        "function": {
            "name": "getNickname",
            "description": "Get the nickname of a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"}
                },
                "required": ["location"]
            }
        }
    }]
)
```

# [REST](#tab/rest)

> [!NOTE]
> With Azure OpenAI the `model` parameter requires the model deployment name. If your model deployment name is different from the underlying model name, adjust your code to `"model": "{your-custom-model-deployment-name}"`.

```console
curl https://YOUR_RESOURCE_NAME.openai.azure.com/openai/assistants?api-version=2024-02-15-preview \
  -H "api-key: $AZURE_OPENAI_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "instructions": "You are a weather bot. Use the provided functions to answer questions.",
    "tools": [{
      "type": "function",
      "function": {
        "name": "getCurrentWeather",
        "description": "Get the weather in location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"},
            "unit": {"type": "string", "enum": ["c", "f"]}
          },
          "required": ["location"]
        }
      }
    },
    {
      "type": "function",
      "function": {
        "name": "getNickname",
        "description": "Get the nickname of a city",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"}
          },
          "required": ["location"]
        }
      }
    }],
    "model": "gpt-4-1106-preview"
  }'
```

---
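The definitions above only describe the functions to the model; your application still has to implement them. A minimal sketch of matching local implementations (the lookup data and return shapes are invented for illustration):

```python
def get_current_weather(location, unit="c"):
    # Stand-in data; a real implementation would call a weather service.
    temps_c = {"San Francisco, CA": 22, "Seattle, WA": 15}
    temp = temps_c.get(location, 20)
    if unit == "f":
        temp = round(temp * 9 / 5 + 32)  # convert Celsius to Fahrenheit
    return {"temperature": str(temp), "unit": "celsius" if unit == "c" else "fahrenheit"}

def get_nickname(location):
    # Stand-in data; a real implementation might query a database.
    nicknames = {"Los Angeles, CA": "LA", "New York, NY": "Big Apple"}
    return {"nickname": nicknames.get(location, location)}
```

These are the functions you call when the Run reports that tool outputs are required, as shown in the next section.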
## Reading the functions

When you initiate a **Run** with a user Message that triggers one of the functions, the **Run** enters a `pending` status. After it processes, the **Run** enters a `requires_action` state, which you can verify by retrieving the **Run**.

```json
{
  "id": "run_abc123",
  "object": "thread.run",
  "assistant_id": "asst_abc123",
  "thread_id": "thread_abc123",
  "status": "requires_action",
  "required_action": {
    "type": "submit_tool_outputs",
    "submit_tool_outputs": {
      "tool_calls": [
        {
          "id": "call_abc123",
          "type": "function",
          "function": {
            "name": "getCurrentWeather",
            "arguments": "{\"location\":\"San Francisco\"}"
          }
        },
        {
          "id": "call_abc456",
          "type": "function",
          "function": {
            "name": "getNickname",
            "arguments": "{\"location\":\"Los Angeles\"}"
          }
        }
      ]
    }
  },
  ...
```
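Each entry in `tool_calls` carries the function name and a JSON-encoded `arguments` string. A sketch of turning that payload into the `tool_outputs` list the submit step expects (the dispatch table maps function names to your own local implementations, which are hypothetical here):

```python
import json

def build_tool_outputs(run, dispatch):
    """Map each tool call in a requires_action Run to a tool output.

    `run` is the retrieved Run as a dict (shaped like the JSON above);
    `dispatch` maps function names to local Python callables.
    """
    outputs = []
    for call in run["required_action"]["submit_tool_outputs"]["tool_calls"]:
        fn = dispatch[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])  # decode the argument string
        outputs.append({
            "tool_call_id": call["id"],
            "output": json.dumps(fn(**args)),  # outputs are submitted as strings
        })
    return outputs
```

Keeping the `tool_call_id` with each output is what lets the service match your results back to the individual function calls.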
## Submitting function outputs

You can then complete the **Run** by submitting the tool output from the function(s) you call. Pass the `tool_call_id` referenced in the `required_action` object above to match output to each function call.

# [Python 1.x](#tab/python)

```python
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_KEY"),
    api_version="2024-02-15-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT")
)

# thread, run, and call_ids come from the earlier steps: the Thread you
# created, the Run you initiated, and the tool call IDs read from the
# requires_action payload.
run = client.beta.threads.runs.submit_tool_outputs(
    thread_id=thread.id,
    run_id=run.id,
    tool_outputs=[
        {
            "tool_call_id": call_ids[0],
            "output": "22C",
        },
        {
            "tool_call_id": call_ids[1],
            "output": "LA",
        },
    ]
)
```

# [REST](#tab/rest)

```console
curl https://YOUR_RESOURCE_NAME.openai.azure.com/openai/threads/thread_abc123/runs/run_123/submit_tool_outputs?api-version=2024-02-15-preview \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_KEY" \
  -d '{
    "tool_outputs": [{
      "tool_call_id": "call_abc123",
      "output": "{\"temperature\": \"22\", \"unit\": \"celsius\"}"
    }, {
      "tool_call_id": "call_abc456",
      "output": "{\"nickname\": \"LA\"}"
    }]
  }'
```

---

After you submit tool outputs, the **Run** enters the `queued` state before it continues execution.
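Because a Run moves through statuses such as `queued` and `in_progress` before settling on `requires_action` or a terminal state like `completed`, client code typically polls until the Run settles. A sketch of such a helper, where `fetch_run` is a stand-in for a zero-argument callable wrapping `client.beta.threads.runs.retrieve(...)` (status names follow the Assistants API lifecycle; the helper itself is illustrative):

```python
import time

# Statuses after which a Run will not change again on its own.
TERMINAL_STATUSES = {"completed", "failed", "cancelled", "expired"}

def poll_run(fetch_run, interval=1.0, max_polls=60):
    """Poll a Run until it settles.

    Returns the Run once it reaches a terminal state or `requires_action`,
    at which point you submit tool outputs and resume polling.
    """
    for _ in range(max_polls):
        run = fetch_run()
        if run.status in TERMINAL_STATUSES or run.status == "requires_action":
            return run
        time.sleep(interval)
    raise TimeoutError("Run did not reach a terminal state")
```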
## See also

* Learn more about how to use Assistants with our [How-to guide on Assistants](../how-to/assistant.md).
* [Azure OpenAI Assistants API samples](https://github.com/Azure-Samples/azureai-samples/tree/main/scenarios/Assistants)
