1 parent 5f22b8c commit 0715732
articles/ai-services/openai/how-to/prompt-caching.md
@@ -29,7 +29,7 @@ Official support for prompt caching was first added in API version `2024-10-01-p
## Getting started
-For a request to take advantage of prompt caching the request must be:
+For a request to take advantage of prompt caching the request must be both:
- A minimum of 1,024 tokens in length.
- The first 1,024 tokens in the prompt must be identical.
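The two requirements above can be sketched in code: keep a long, unchanging block of instructions at the start of every prompt so the first 1,024+ tokens are identical across requests, and append the variable content at the end. This is a minimal illustration, not the Azure OpenAI SDK; `STATIC_INSTRUCTIONS` is a hypothetical placeholder, and the token count here is a rough character-based estimate rather than a real tokenizer.

```python
# Sketch of structuring prompts for caching: the static prefix comes
# first so every request shares identical initial tokens, and only the
# tail varies. STATIC_INSTRUCTIONS is a hypothetical placeholder.

STATIC_INSTRUCTIONS = "You are a support assistant. " * 200  # long, unchanging prefix

def estimate_tokens(text: str) -> int:
    # Crude heuristic (~4 characters per token for English text);
    # the service counts real tokens, so treat this as an estimate.
    return len(text) // 4

def build_prompt(user_question: str) -> str:
    # Static content first, variable content last, so the shared
    # prefix stays identical across requests.
    return STATIC_INSTRUCTIONS + "\n\nUser question: " + user_question

prompt = build_prompt("How do I reset my password?")
# Caching only applies when the shared prefix is at least 1,024 tokens.
assert estimate_tokens(STATIC_INSTRUCTIONS) >= 1024
```

In practice you would verify a cache hit by inspecting the token usage details returned with the response, but the structural rule is the same: anything that changes per request belongs after the first 1,024 tokens.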