articles/ai-services/openai/how-to/batch.md (+8 −3)
@@ -13,7 +13,7 @@ recommendations: false
 zone_pivot_groups: openai-fine-tuning-batch
 ---

-# Getting started with Azure OpenAI global batch deployments (preview)
+# Getting started with Azure OpenAI global batch deployments

 The Azure OpenAI Batch API is designed to handle large-scale and high-volume processing tasks efficiently. Process asynchronous groups of requests with separate quota, with a 24-hour target turnaround, at [50% less cost than global standard](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/). With batch processing, rather than sending one request at a time, you send a large number of requests in a single file. Global batch requests have a separate enqueued token quota, avoiding any disruption to your online workloads.
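The paragraph above describes sending many requests in a single file. As a sketch of what building such a file could look like in Python — the JSONL request shape (`custom_id`, `method`, `url`, `body`) follows the batch input format, while the deployment name `gpt-4o-batch` and the prompts are hypothetical placeholders:

```python
import json

# Each line of the batch input file is one self-contained request.
# "custom_id" lets you match results back to requests later; "url" is
# the relative API path; "body" is a normal chat completions payload.
# "gpt-4o-batch" is a placeholder for your own deployment name.
tasks = [
    {
        "custom_id": f"task-{i}",
        "method": "POST",
        "url": "/chat/completions",
        "body": {
            "model": "gpt-4o-batch",
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Summarize document A.", "Summarize document B."])
]

# Write one JSON object per line (JSONL).
with open("batch_input.jsonl", "w") as f:
    for task in tasks:
        f.write(json.dumps(task) + "\n")
```

The resulting `batch_input.jsonl` is what you later upload and reference when creating the batch job.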
@@ -65,7 +65,12 @@ Refer to the [models page](../concepts/models.md) for the most up-to-date information.
 ### API support

-API support was first added with `2024-07-01-preview`. Use `2024-10-01-preview` to take advantage of the latest features.
+|| API Version |
+|---|---|
+|**Latest GA API release:**|`2024-10-21`|
+|**Latest Preview API release:**|`2024-10-01-preview`|
+
+Support first added in: `2024-07-01-preview`

 ### Feature support
@@ -75,7 +80,7 @@ The following aren't currently supported:
 - Integration with the Azure OpenAI On Your Data feature.

 > [!NOTE]
-> Structured outputs is now supported with Global Batch when used in conjunction with API version `2024-08-01-preview` or later. Use `2024-10-01-preview` for the latest features.
+> Structured outputs is now supported with Global Batch.
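As an illustration of the structured outputs support mentioned in the note above, one batch request line could carry a `response_format` of type `json_schema` in its body — a sketch, not taken from these docs: the response-format shape follows the standard chat completions structured outputs API, and the deployment name `my-gpt-4o` and the schema are hypothetical:

```python
import json

# One batch input line asking for output that conforms to a JSON schema.
# "my-gpt-4o" is a placeholder deployment name; the schema is illustrative.
line = {
    "custom_id": "extract-1",
    "method": "POST",
    "url": "/chat/completions",
    "body": {
        "model": "my-gpt-4o",
        "messages": [{"role": "user", "content": "Extract the event details."}],
        "response_format": {
            "type": "json_schema",
            "json_schema": {
                "name": "event",
                "strict": True,
                "schema": {
                    "type": "object",
                    "properties": {
                        "name": {"type": "string"},
                        "date": {"type": "string"},
                    },
                    "required": ["name", "date"],
                    "additionalProperties": False,
                },
            },
        },
    },
}

# Serialize as one JSONL line of the batch input file.
serialized = json.dumps(line)
```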
Once your file has uploaded successfully, you can submit the file for batch processing.

```http
-curl -X POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/batches?api-version=2024-10-01-preview \
+curl -X POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/batches?api-version=2024-10-21 \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "input_file_id": "file-abc123",
    "endpoint": "/chat/completions",
    "completion_window": "24h"
  }'
```

@@ -176,7 +176,7 @@ curl -X POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/batches?api-vers
 Once you've created a batch job successfully, you can monitor its progress either in the Studio or programmatically. When checking batch job progress, we recommend waiting at least 60 seconds between each status call.

 Cancels an in-progress batch. The batch will be in status `cancelling` for up to 10 minutes before changing to `cancelled`, where it will have partial results (if any) available in the output file.

 :::image type="content" source="../../media/how-to/global-batch/create-batch-job-empty.png" alt-text="Screenshot that shows the batch job creation experience in Azure AI Studio." lightbox="../../media/how-to/global-batch/create-batch-job-empty.png":::
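The monitoring guidance above (wait at least 60 seconds between status calls) can be sketched as a small polling loop. This is a sketch, not code from these docs: `get_status` stands in for whatever function fetches the batch's current status (for example, a GET against the batches endpoint with your `api-key` header), and the waiting logic is factored out so it can be exercised without the network:

```python
import time

# Batch states after which polling should stop.
TERMINAL = {"completed", "failed", "cancelled", "expired"}

def poll(get_status, sleep=time.sleep, interval=60, max_checks=1440):
    """Call get_status() until the batch reaches a terminal state,
    sleeping `interval` seconds between checks (60s per the guidance
    above). `get_status` is a caller-supplied zero-argument function."""
    status = get_status()
    checks = 1
    while status not in TERMINAL and checks < max_checks:
        sleep(interval)
        status = get_status()
        checks += 1
    return status
```

In practice `get_status` would wrap an HTTP GET of `https://YOUR_RESOURCE_NAME.openai.azure.com/openai/batches/{batch_id}?api-version=2024-10-21` and return the `status` field of the response.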
articles/ai-services/openai/whats-new.md (+25 −1)
@@ -10,7 +10,7 @@ ms.custom:
   - ignite-2023
   - references_regions
 ms.topic: whats-new
-ms.date: 10/01/2024
+ms.date: 10/22/2024
 recommendations: false
 ---
@@ -20,6 +20,30 @@ This article provides a summary of the latest releases and major documentation updates.

 ## October 2024

+### Global Batch GA
+
+Azure OpenAI global batch is now generally available.
+
+The Azure OpenAI Batch API is designed to handle large-scale and high-volume processing tasks efficiently. Process asynchronous groups of requests with separate quota, with a 24-hour target turnaround, at [50% less cost than global standard](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/). With batch processing, rather than sending one request at a time, you send a large number of requests in a single file. Global batch requests have a separate enqueued token quota, avoiding any disruption to your online workloads.
+
+Key use cases include:
+
+* **Large-Scale Data Processing:** Quickly analyze extensive datasets in parallel.
+* **Content Generation:** Create large volumes of text, such as product descriptions or articles.
+* **Document Review and Summarization:** Automate the review and summarization of lengthy documents.
+* **Customer Support Automation:** Handle numerous queries simultaneously for faster responses.
+* **Data Extraction and Analysis:** Extract and analyze information from vast amounts of unstructured data.
+* **Natural Language Processing (NLP) Tasks:** Perform tasks like sentiment analysis or translation on large datasets.
+* **Marketing and Personalization:** Generate personalized content and recommendations at scale.
+
+For more information, see [getting started with global batch deployments](./how-to/batch.md).

 ### o1-preview and o1-mini models limited access

 The `o1-preview` and `o1-mini` models are now available for API access and model deployment. **Registration is required, and access will be granted based on Microsoft's eligibility criteria**.