Commit a85caa9

update
1 parent baf82dd commit a85caa9

1 file changed (+1, −1)
articles/ai-services/openai/how-to/batch.md

Lines changed: 1 addition & 1 deletion
@@ -160,7 +160,7 @@ Yes, from the quota page in the Studio UI. Default quota allocation can be found

The `2024-10-01-preview` REST API adds two new response headers:

-* `deployment-enqueued-tokens` - A approximate token count for your jsonl file calculating immediately after the batch request is submitted. This value represents an estimate based on the number of characters and is not the true token count.
+* `deployment-enqueued-tokens` - A approximate token count for your jsonl file calculated immediately after the batch request is submitted. This value represents an estimate based on the number of characters and is not the true token count.

* `deployment-maximum-enqueued-tokens` The total available enqueued tokens available for this global batch model deployment.

These response headers are only available when making a POST request to begin batch processing of a file with the REST API. The language specific client libraries do not currently return these new response headers.
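A minimal sketch of reading these headers when submitting a batch job over REST with Python's `requests` library. The endpoint path, request-body fields, and `api-key` header below are assumptions based on the typical shape of the Azure OpenAI batch create call, not details taken from this diff; only the two response header names come from the text above.

```python
import requests

# Hypothetical values -- replace with your own resource name, key, and file ID.
resource = "YOUR-RESOURCE-NAME"
api_key = "YOUR-API-KEY"
url = (
    f"https://{resource}.openai.azure.com/openai/batches"
    "?api-version=2024-10-01-preview"
)

# Assumed request body for creating a batch job from a previously uploaded
# .jsonl file; adjust to match your actual batch creation call.
payload = {
    "input_file_id": "file-abc123",
    "endpoint": "/chat/completions",
    "completion_window": "24h",
}

response = requests.post(
    url,
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json=payload,
)
response.raise_for_status()

# The two headers are returned only on this POST response when calling the
# REST API directly; client libraries do not currently surface them.
print("deployment-enqueued-tokens:",
      response.headers.get("deployment-enqueued-tokens"))
print("deployment-maximum-enqueued-tokens:",
      response.headers.get("deployment-maximum-enqueued-tokens"))
```

Comparing the two values gives a rough sense of how much of the deployment's available enqueued-token capacity a submitted job is estimated to consume.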
