
Commit 15d4f3c ("update")
1 parent ce63b89

File tree

1 file changed: +1 addition, −1 deletion

  • articles/ai-services/openai/how-to


articles/ai-services/openai/how-to/batch.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -34,7 +34,7 @@ Key use cases include:
 * **Marketing and Personalization:** Generate personalized content and recommendations at scale.
 
 > [!TIP]
-> If your batch jobs are so large that you are hitting the enqueued token limit even after maxing out the quota for your deployment, certain regions now support a new feature that allows you to queue multiple batch jobs with exponential backoff. Once one large batch job completes and your enqueued token quota is once again available, the next batch job can be created and kicked off automatically.To lear more, see [**automating retries of large batch jobs with exponential backoff**](../includes/batch/batch-python.md#queueing-batch-jobs).
+> If your batch jobs are so large that you are hitting the enqueued token limit even after maxing out the quota for your deployment, certain regions now support a new feature that allows you to queue multiple batch jobs with exponential backoff. Once one large batch job completes and your enqueued token quota is once again available, the next batch job can be created and kicked off automatically. To learn more, see [**automating retries of large batch jobs with exponential backoff**](#queueing-batch-jobs).
 
 > [!IMPORTANT]
 > We aim to process batch requests within 24 hours; we don't expire the jobs that take longer. You can [cancel](#cancel-batch) the job anytime. When you cancel the job, any remaining work is cancelled and any already completed work is returned. You'll be charged for any completed work.
```
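The retry behavior the TIP describes can be sketched in a few lines. This is a minimal illustration, not the real SDK surface: `EnqueuedTokenLimitError` and the `submit` callable are hypothetical stand-ins for whatever error and batch-creation call your client raises when the enqueued token quota is exhausted.

```python
import time

# Hypothetical error standing in for the quota failure the real batch API
# returns while an earlier large job still occupies the enqueued token quota.
class EnqueuedTokenLimitError(Exception):
    """Raised when the deployment's enqueued token quota is exhausted."""

def create_batch_with_backoff(submit, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry submit() with exponential backoff until it succeeds.

    submit is any zero-argument callable that raises EnqueuedTokenLimitError
    while quota is unavailable and returns a job handle once a prior batch
    job completes and capacity frees up.
    """
    for attempt in range(max_retries):
        try:
            return submit()
        except EnqueuedTokenLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            # Wait 1s, 2s, 4s, 8s, ... before trying to enqueue again.
            sleep(base_delay * (2 ** attempt))
```

In practice `submit` would wrap your batch-creation call (for example, a `client.batches.create(...)` invocation), and `sleep` is injectable so the backoff schedule can be tested without real delays.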
