
Commit 4784493

Learn Editor: Update prompt-caching.md
1 parent be36a78 commit 4784493

1 file changed (+5 −1 lines changed)


articles/ai-services/openai/how-to/prompt-caching.md

Lines changed: 5 additions & 1 deletion
```diff
@@ -83,4 +83,8 @@ To improve the likelihood of cache hits occurring, you should structure your req
 
 ## Can I disable prompt caching?
 
-Prompt caching is enabled by default. There is no opt-out option.
+Prompt caching is enabled by default. There is no opt-out option.
+
+## How does prompt caching work for Provisioned deployments?
+
+For supported models on provisioned deployments, we discount up to 100% of cached input tokens. For more information, see our [Provisioned Throughput documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/provisioned-throughput).
```
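The billing effect of the added paragraph can be sketched with a short calculation. This is a minimal illustration, not an SDK API: the helper `billed_input_tokens` is hypothetical, and the default 100% rate reflects the "up to 100%" discount described in the diff; actual discounts vary by model and deployment.

```python
def billed_input_tokens(prompt_tokens: int, cached_tokens: int,
                        discount: float = 1.0) -> float:
    """Estimate billed input tokens when cached input tokens are discounted.

    Hypothetical helper for illustration only. discount=1.0 models the
    up-to-100% cached-token discount described for supported models on
    provisioned deployments.
    """
    uncached = prompt_tokens - cached_tokens
    # Cached tokens are billed at (1 - discount) of the normal rate.
    return uncached + cached_tokens * (1 - discount)

# A 2048-token prompt where 1024 tokens hit the cache, at a 100% discount:
print(billed_input_tokens(2048, 1024))  # 1024.0 — only uncached tokens billed
```

With a partial discount (say 50%), the same prompt would bill `1024 + 1024 * 0.5 = 1536` input tokens; the cached-token count itself is reported by the service in the response's usage details.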
