#### `articles/ai-services/openai/how-to/batch.md` (+3, −4)
```diff
@@ -6,7 +6,7 @@ manager: nitinme
 ms.service: azure-ai-openai
 ms.custom:
 ms.topic: how-to
-ms.date: 08/01/2024
+ms.date: 08/04/2024
 author: mrbullwinkle
 ms.author: mbullwin
 recommendations: false
```
@@ -53,7 +53,6 @@ The following models support global batch:
53
53
| Model | Version | Supported |
54
54
|---|---|
55
55
|`gpt-4o`| 2024-05-13 |Yes (text + vision) |
56
-
|`gpt-4o-mini`| 2024-07-18 |Yes (text + vision) |
57
56
|`gpt-4`| turbo-2024-04-09 | Yes (text only) |
58
57
|`gpt-4`| 0613 | Yes |
59
58
|`gpt-35-turbo`| 0125 | Yes |
```diff
@@ -81,7 +80,7 @@ In the Studio UI the deployment type will appear as `Global-Batch`.
 :::image type="content" source="../media/how-to/global-batch/global-batch.png" alt-text="Screenshot that shows the model deployment dialog in Azure OpenAI Studio with Global-Batch deployment type highlighted." lightbox="../media/how-to/global-batch/global-batch.png":::

 > [!TIP]
-> Each line of your input file for batch processing requires the unique **deployment name** that you chose during model deployment to be present. This value wil be assigned to the `model`parameter. This is different from OpenAI where the concept of model deployments does not exist.
+> Each line of your input file for batch processing requires the unique **deployment name** that you chose during model deployment to be present. This value wil be assigned to the `model`attribute. This is different from OpenAI where the concept of model deployments does not exist.

 ::: zone pivot="programming-language-ai-studio"
```
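The tip above can be made concrete with a sketch of a single line of a batch `.jsonl` input file. The deployment name `gpt-4o-batch` is a hypothetical placeholder for whatever name was chosen at deployment time, and the exact `url` value may differ by service version:

```python
import json

# Hypothetical deployment name chosen when the Global-Batch model was deployed.
deployment_name = "gpt-4o-batch"

# One request = one line of the .jsonl input file. Note that `model` carries
# the deployment name, not the underlying base-model name.
request = {
    "custom_id": "task-1",
    "method": "POST",
    "url": "/chat/completions",
    "body": {
        "model": deployment_name,
        "messages": [{"role": "user", "content": "Hello, world"}],
    },
}

# Each request is serialized to a single line of the batch input file.
line = json.dumps(request)
print(line)
```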
```diff
@@ -132,7 +131,7 @@ In the Studio UI the deployment type will appear as `Global-Batch`.

 ### Can images be used with the batch API?

-This capability is limited to certain multi-modal models. Currently only GPT-4o and GPT-4o mini support images as part of batch requests. Images can be provided as input either via [image url or a base64 encoded representation of the image](#input-format). Images for batch are currently not supported with GPT-4 Turbo.
+This capability is limited to certain multi-modal models. Currently only GPT-4o support images as part of batch requests. Images can be provided as input either via [image url or a base64 encoded representation of the image](#input-format). Images for batch are currently not supported with GPT-4 Turbo.

 ### Can I use the batch API with fine-tuned models?
```
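The FAQ above mentions providing images as a base64-encoded representation. A hedged sketch of building such a content part (the data-URL shape follows the chat-completions image-input format; the bytes below are stand-in data, not a real image):

```python
import base64

# Stand-in bytes; in practice read from a real image file,
# e.g. open("image.png", "rb").read()
image_bytes = b"\x89PNG\r\n\x1a\nfakepixels"

# Base64-encode the raw bytes for inline transport in the request body.
b64 = base64.b64encode(image_bytes).decode("ascii")

# Content part for a chat message carrying the image inline as a data URL.
image_part = {
    "type": "image_url",
    "image_url": {"url": f"data:image/png;base64,{b64}"},
}
```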
#### `articles/ai-services/openai/includes/batch/batch-python.md` (+1, −1)
```diff
@@ -59,7 +59,7 @@ Like [fine-tuning](../../how-to/fine-tuning.md), global batch uses files in JSON
 The `custom_id` is required to allow you to identify which individual batch request corresponds to a given response. Responses won't be returned in identical order to the order defined in the `.jsonl` batch file.

-`model` should be set to match the **model deployment name** of the global batch model you wish to target for inference responses.
+`model`attribute should be set to match the name of the Global Batch deployment you wish to target for inference responses.
```
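Since responses are not returned in input order, the `custom_id` described above is what lets you join each response back to its request. A minimal, illustrative sketch (the output-record field names follow the batch output `.jsonl` shape; the sample answers are fabricated placeholders):

```python
import json

# Requests as written to the input .jsonl, keyed by custom_id.
requests = {
    "task-0": {"question": "2+2?"},
    "task-1": {"question": "capital of France?"},
}

# Simulated output lines: note they may arrive in any order.
output_lines = [
    '{"custom_id": "task-1", "response": {"body": {"choices": [{"message": {"content": "Paris"}}]}}}',
    '{"custom_id": "task-0", "response": {"body": {"choices": [{"message": {"content": "4"}}]}}}',
]

# Re-associate each response with its originating request via custom_id.
results = {}
for line in output_lines:
    record = json.loads(line)
    content = record["response"]["body"]["choices"][0]["message"]["content"]
    results[record["custom_id"]] = content

# results now maps every custom_id to its answer, regardless of output order.
```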
#### `articles/ai-services/openai/includes/batch/batch-rest.md` (+1, −1)
```diff
@@ -48,7 +48,7 @@ Like [fine-tuning](../../how-to/fine-tuning.md), global batch uses files in JSON
 The `custom_id` is required to allow you to identify which individual batch request corresponds to a given response. Responses won't be returned in identical order to the order defined in the `.jsonl` batch file.

-`model` should be set to match the **model deployment name** of the global batch model you wish to target for inference responses.
+`model`attribute should be set to match the name of the Global Batch deployment you wish to target for inference responses.
```
#### `articles/ai-services/openai/includes/batch/batch-studio.md` (+1, −1)
```diff
@@ -48,7 +48,7 @@ Like [fine-tuning](../../how-to/fine-tuning.md), global batch uses files in JSON
 The `custom_id` is required to allow you to identify which individual batch request corresponds to a given response. Responses won't be returned in identical order to the order defined in the `.jsonl` batch file.

-`model` should be set to match the **model deployment name** of the global batch model you wish to target for inference responses.
+`model`attribute should be set to match the name of the Global Batch deployment you wish to target for inference responses.
```