Commit 8350424

MeeraDi and sdgilley authored
Update articles/api-management/azure-openai-enable-semantic-caching.md
Co-authored-by: Sheri Gilley <[email protected]>
1 parent 9ac7221 commit 8350424

File tree

1 file changed: +1 −1 lines changed


articles/api-management/azure-openai-enable-semantic-caching.md

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@ Enable semantic caching of responses to Azure OpenAI API requests to reduce band

  ## Prerequisites

- * One or more Azure OpenAI in Foundry Models APIs must be added to your API Management instance. For more information, see [Add an Azure OpenAI in Foundry Models API to Azure API Management](azure-openai-api-from-specification.md).
+ * One or more Azure OpenAI in Model Inference APIs must be added to your API Management instance. For more information, see [Add an Azure OpenAI in Model Inference API to Azure API Management](azure-openai-api-from-specification.md).

  * Azure OpenAI must have deployments for the following:

    * Chat Completion API - Deployment used for API consumer calls

    * Embeddings API - Deployment used for semantic caching
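The two deployments listed in the prerequisites correspond to the two semantic-caching policies in API Management: the Embeddings deployment backs the cache lookup, and the Chat Completion deployment serves consumer calls whose responses get stored. A minimal policy sketch follows; the policy names are from the Azure API Management docs, while the backend id, score threshold, and cache duration are illustrative assumptions, not values from this commit:

```xml
<policies>
    <inbound>
        <base />
        <!-- Before forwarding to the Chat Completion deployment, look for a
             semantically similar cached response. "embeddings-backend" is an
             assumed backend id pointing at the Embeddings deployment. -->
        <azure-openai-semantic-cache-lookup
            score-threshold="0.05"
            embeddings-backend-id="embeddings-backend"
            embeddings-backend-auth="system-assigned" />
    </inbound>
    <outbound>
        <base />
        <!-- On a cache miss, store the completion response (duration in seconds). -->
        <azure-openai-semantic-cache-store duration="60" />
    </outbound>
</policies>
```

A lower `score-threshold` requires closer semantic similarity before a cached response is returned; tune it against your own prompts.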

0 commit comments
