articles/ai-services/openai/how-to/batch.md (57 additions, 3 deletions)
@@ -61,8 +61,6 @@ The following models support global batch:
|`gpt-35-turbo`| 1106 | text |
|`gpt-35-turbo`| 0613 | text |
-
-
Refer to the [models page](../concepts/models.md) for the most up-to-date information on regions/models where global batch is currently supported.
### API support
@@ -166,7 +164,63 @@ The `2024-10-01-preview` REST API adds two new response headers:
* `deployment-enqueued-tokens` - An approximate token count for your jsonl file calculated immediately after the batch request is submitted. This value represents an estimate based on the number of characters and is not the true token count.
* `deployment-maximum-enqueued-tokens` - The total enqueued tokens available for this global batch model deployment.
-These response headers are only available when making a POST request to begin batch processing of a file with the REST API. The language specific client libraries do not currently return these new response headers.
+These response headers are only available when making a POST request to begin batch processing of a file with the REST API. The language-specific client libraries do not currently return these new response headers. To return all response headers you can add `-i` to the standard REST request.
+
+```http
+curl -i -X POST https://YOUR_RESOURCE_NAME.openai.azure.com/openai/batches?api-version=2024-10-01-preview \
+```
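The curl request above is truncated after the first line. As a rough sketch of the same call, the request can be assembled (without being sent) in Python with only the standard library; the resource name, API key, and file id are placeholders, and the body fields (`input_file_id`, `endpoint`, `completion_window`) are assumptions based on the batch create format rather than copied from this diff:

```python
import json
import urllib.request

# Hedged sketch: constructing (not sending) the batch-create POST request.
# YOUR_RESOURCE_NAME, YOUR_API_KEY, and file-abc123 are placeholders.
resource = "YOUR_RESOURCE_NAME"
url = (f"https://{resource}.openai.azure.com/openai/batches"
       "?api-version=2024-10-01-preview")

payload = {
    "input_file_id": "file-abc123",   # id returned by the earlier file upload
    "endpoint": "/chat/completions",
    "completion_window": "24h",
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"api-key": "YOUR_API_KEY", "Content-Type": "application/json"},
    method="POST",
)

# On a real deployment, urllib.request.urlopen(req) would return a response
# whose .headers mapping includes deployment-enqueued-tokens and
# deployment-maximum-enqueued-tokens (the equivalent of curl's -i output).
print(req.get_method(), req.full_url)
```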
articles/ai-services/openai/includes/batch/batch-python.md (4 additions, 6 deletions)
@@ -144,7 +144,7 @@ file_id = file.id
## Create batch job
-Once your file has uploaded successfully by reaching a status of `processed`you can submit the file for batch processing.
+Once your file has uploaded successfully you can submit the file for batch processing.

```python
# Submit a batch job with the file
```
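The file submitted here is a `.jsonl` file of request lines. As a minimal, hypothetical sketch of what one such line looks like (the field names follow the published batch input format; the deployment name and file name are placeholders, not values from this diff):

```python
import json

# Hypothetical sketch of one line in a global batch input .jsonl file.
# "YOUR_DEPLOYMENT_NAME" and "batch_input.jsonl" are placeholders.
line = {
    "custom_id": "task-1",           # your identifier, echoed in the output file
    "method": "POST",
    "url": "/chat/completions",      # must match the endpoint used for the batch
    "body": {
        "model": "YOUR_DEPLOYMENT_NAME",  # global batch deployment name
        "messages": [{"role": "user", "content": "Hello"}],
    },
}

with open("batch_input.jsonl", "w") as f:
    f.write(json.dumps(line) + "\n")

print(open("batch_input.jsonl").read().strip())
```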
@@ -405,7 +405,7 @@ client.batches.list()
Use the REST API to list all batch jobs with additional sorting/filtering options.
-In the examples below we are providing the `generate_time_filter` function to make constructing the filter easier. If you don't wish to use this function the format of the filter string would look like `created_at gt 1728773533 and created_at lt 1729032733 and status eq 'Completed'`.
+In the examples below we are providing the `generate_time_filter` function to make constructing the filter easier. If you don't wish to use this function the format of the filter string would look like `created_at gt 1728860560 and status eq 'Completed'`.
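The `generate_time_filter` helper itself is not shown in this diff; as one possible reimplementation of the idea (not the article's actual code), a filter string of that shape can be built from `datetime` values, since `created_at` is compared against Unix timestamps:

```python
from datetime import datetime, timezone
from typing import Optional

# Hypothetical generate_time_filter-style helper: clauses compare created_at
# against Unix timestamps and are joined with "and".
def generate_time_filter(start: datetime,
                         end: Optional[datetime] = None,
                         status: Optional[str] = None) -> str:
    parts = [f"created_at gt {int(start.timestamp())}"]
    if end is not None:
        parts.append(f"created_at lt {int(end.timestamp())}")
    if status is not None:
        parts.append(f"status eq '{status}'")
    return " and ".join(parts)

start = datetime(2024, 10, 13, tzinfo=timezone.utc)
print(generate_time_filter(start, status="Completed"))
# → created_at gt 1728777600 and status eq 'Completed'
```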