Refer to the [models page](../concepts/models.md) for the most up-to-date information.
### API support
API support was first added with `2024-07-01-preview`. Use `2024-10-01-preview` to take advantage of the latest features.
### Not supported
In the Studio UI, the deployment type will appear as `Global-Batch`.
:::image type="content" source="../media/how-to/global-batch/global-batch.png" alt-text="Screenshot that shows the model deployment dialog in Azure OpenAI Studio with Global-Batch deployment type highlighted." lightbox="../media/how-to/global-batch/global-batch.png":::
> [!TIP]
> Each line of your input file for batch processing has a `model` attribute that requires a global batch **deployment name**. For a given input file, all names must be the same deployment name. This is different from OpenAI where the concept of model deployments does not exist.
>
> For the best performance we recommend submitting large files for batch processing, rather than a large number of small files with only a few lines in each file.
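As a sketch of the tip above, a minimal JSONL input file can be generated like this. The deployment name `gpt-4o-batch` is a hypothetical placeholder for your own global batch deployment, and the per-line shape follows the Batch API request format:

```python
import json

# Hypothetical deployment name: replace with your own global batch deployment.
deployment_name = "gpt-4o-batch"

prompts = [
    "Summarize the benefits of batch processing.",
    "List three uses for asynchronous workloads.",
]

# Every line's body.model must carry the same deployment name.
with open("batch_input.jsonl", "w", encoding="utf-8") as f:
    for i, prompt in enumerate(prompts, start=1):
        line = {
            "custom_id": f"task-{i}",
            "method": "POST",
            "url": "/chat/completions",
            "body": {
                "model": deployment_name,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
        f.write(json.dumps(line) + "\n")
```

Fewer, larger files like this one are preferable to many small files with only a few lines each.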
List all batch jobs for a particular Azure OpenAI resource.
```python
client.batches.list()
```
### List batch (Preview)
Use the REST API to list all batch jobs with additional sorting/filtering options.
The examples below provide a `generate_time_filter` function to make constructing the filter string easier. If you prefer not to use this function, the filter string format looks like `created_at gt 1728773533 and created_at lt 1729032733 and status eq 'Completed'`.
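The full `generate_time_filter` helper isn't reproduced in this excerpt; a minimal sketch of what such a function might look like (the exact implementation in the article may differ) is:

```python
from datetime import datetime, timedelta

def generate_time_filter(time_range: timedelta, status: str = "Completed") -> str:
    """Build a filter string of the form:
    created_at gt <start> and created_at lt <end> and status eq '<status>'
    """
    end = datetime.now()
    start = end - time_range
    return (
        f"created_at gt {int(start.timestamp())} "
        f"and created_at lt {int(end.timestamp())} "
        f"and status eq '{status}'"
    )

# Example: completed jobs from the last three days.
filter_string = generate_time_filter(timedelta(days=3))
```

The resulting string is passed as the filter query parameter when listing batch jobs through the REST API.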
Once your file has uploaded successfully (that is, it has reached a status of `processed`), you can submit the file for batch processing.
```http
curl -X POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/batches?api-version=2024-10-01-preview \
-H "api-key: $AZURE_OPENAI_API_KEY" \
-H "Content-Type: application/json" \
-d '{
    "input_file_id": "<file-id>",
    "endpoint": "/chat/completions",
    "completion_window": "24h"
  }'
```
Once you have created a batch job successfully, you can monitor its progress either in the Studio or programmatically. When checking batch job progress, we recommend waiting at least 60 seconds between each status call.
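That cadence can be sketched as a simple polling loop. The terminal status names below are the common ones for batch jobs, and `get_status` stands in for a real status lookup such as `client.batches.retrieve(batch_id).status`:

```python
import time

TERMINAL_STATUSES = {"completed", "failed", "cancelled", "expired"}

def wait_for_batch(get_status, poll_interval: float = 60.0, sleep=time.sleep) -> str:
    """Poll a batch job until it reaches a terminal status,
    waiting at least `poll_interval` seconds between status calls."""
    while True:
        status = get_status()
        if status in TERMINAL_STATUSES:
            return status
        sleep(poll_interval)
```

With the SDK, you would call this as `wait_for_batch(lambda: client.batches.retrieve(batch_id).status)`; injecting `get_status` and `sleep` also makes the loop easy to test without a live resource.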
Cancels an in-progress batch. The batch will be in status `cancelling` for up to 10 minutes before changing to `cancelled`, where it will have partial results (if any) available in the output file.