
Commit ee7d53c

Commit message: update
1 parent 36ba331 commit ee7d53c

2 files changed: +2 −2 lines changed

articles/ai-services/openai/how-to/batch.md

Lines changed: 1 addition & 1 deletion
@@ -80,7 +80,7 @@ In the Studio UI the deployment type will appear as `Global-Batch`.
:::image type="content" source="../media/how-to/global-batch/global-batch.png" alt-text="Screenshot that shows the model deployment dialog in Azure OpenAI Studio with Global-Batch deployment type highlighted." lightbox="../media/how-to/global-batch/global-batch.png":::

> [!TIP]
- > Each line of your input file for batch processing has a model attribute that requires a global batch deployment name. For a given input file, all names must be the same deployment name. This is different from OpenAI where the concept of model deployments does not exist.
+ > Each line of your input file for batch processing has a `model` attribute that requires a global batch **deployment name**. For a given input file, all names must be the same deployment name. This is different from OpenAI where the concept of model deployments does not exist.

::: zone pivot="programming-language-ai-studio"
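As an illustrative aside (not part of the commit), here's a minimal sketch of how such a batch input file might be assembled, assuming a hypothetical Global-Batch deployment named `gpt-4o-global-batch`; every line's `model` value carries that same deployment name rather than the underlying model name:

```python
import json

# Hypothetical Global-Batch deployment name -- must be identical on every line.
DEPLOYMENT_NAME = "gpt-4o-global-batch"

prompts = [
    "Summarize the benefits of offline batch scoring.",
    "List three workloads that are not latency sensitive.",
]

# Each JSONL line is one request; the `model` field holds the deployment name.
with open("batch_input.jsonl", "w", encoding="utf-8") as f:
    for i, prompt in enumerate(prompts):
        request = {
            "custom_id": f"task-{i}",
            "method": "POST",
            "url": "/chat/completions",
            "body": {
                "model": DEPLOYMENT_NAME,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
        f.write(json.dumps(request) + "\n")
```

The resulting JSONL file is what later gets uploaded and submitted as a batch job in the remainder of batch.md.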

articles/ai-services/openai/how-to/deployment-types.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ Azure OpenAI offers three types of deployments. These provide a varied level of
| **Offering** | **Global-Batch** | **Global-Standard** | **Standard** | **Provisioned** |
|---|:---|:---|:---|:---|
- | **Best suited for** | Offline scoring <br><br> Workloads that are not latency sensitive and can be completed in hours.<br><br> For use cases that do not have data processing residency requirements.| Recommended starting place for customers. <br><br>Global-Standard will have the higher default quota and larger number of models available than Standard. <br><br> For production applications that do not have data processing residency requirements. | For customers with data processing residency requirements. | For customers with data residency requirements. Optimized for low to medium volume. | Real-time scoring for large consistent volume. Includes the highest commitments and limits.|
+ | **Best suited for** | Offline scoring <br><br> Workloads that are not latency sensitive and can be completed in hours.<br><br> For use cases that do not have data processing residency requirements.| Recommended starting place for customers. <br><br>Global-Standard will have the higher default quota and larger number of models available than Standard. <br><br> For production applications that do not have data processing residency requirements. | For customers with data residency requirements. Optimized for low to medium volume. | Real-time scoring for large consistent volume. Includes the highest commitments and limits.|
| **How it works** | Offline processing via files |Traffic may be routed anywhere in the world | | |
| **Getting started** | [Global-Batch](./batch.md) | [Model deployment](./create-resource.md) | [Model deployment](./create-resource.md) | [Provisioned onboarding](./provisioned-throughput-onboarding.md) |
| **Cost** | [Least expensive option](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/) <br> 50% less cost compared to Global Standard prices. Access to all new models with larger quota allocations. | [Global deployment pricing](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/) | [Regional pricing](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/) | May experience cost savings for consistent usage |
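As a rough illustration of how the deployment types in the table above are selected in practice (again, not part of the commit), here's a hedged sketch using the Azure management SDK for Python; the resource identifiers are placeholders, and the SKU names (`GlobalBatch`, `GlobalStandard`, `Standard`, `ProvisionedManaged`) are assumed to map to the offerings in the table:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.cognitiveservices import CognitiveServicesManagementClient
from azure.mgmt.cognitiveservices.models import (
    Deployment,
    DeploymentModel,
    DeploymentProperties,
    Sku,
)

# Placeholder identifiers -- substitute your own subscription, resource group, and resource.
client = CognitiveServicesManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

# The deployment type is chosen via the SKU name, e.g. "GlobalBatch" for Global-Batch (assumed mapping).
poller = client.deployments.begin_create_or_update(
    resource_group_name="<resource-group>",
    account_name="<azure-openai-resource>",
    deployment_name="gpt-4o-global-batch",
    deployment=Deployment(
        sku=Sku(name="GlobalBatch", capacity=1),
        properties=DeploymentProperties(
            model=DeploymentModel(format="OpenAI", name="gpt-4o", version="2024-05-13"),
        ),
    ),
)
print(poller.result().name)
```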
