Entra Authentication support in AI Model Inference #11966
Closed
BenoityipMSFT started this conversation in General
Replies: 1 comment 1 reply
-
You should be able to inject your own preconfigured client. Also tagging @rogerbarreto
-
Has anyone gotten Entra authentication working in Semantic Kernel?
How can I integrate the following 3 lines into Semantic Kernel?
```csharp
AzureAIInferenceClientOptions clientOptions = new AzureAIInferenceClientOptions();
BearerTokenAuthenticationPolicy tokenPolicy = new BearerTokenAuthenticationPolicy(
    credential, new string[] { "https://cognitiveservices.azure.com/.default" });
clientOptions.AddPolicy(tokenPolicy, HttpPipelinePosition.PerRetry);
```
https://learn.microsoft.com/en-us/azure/ai-foundry/model-inference/how-to/configure-entra-id?tabs=csharp&pivots=ai-foundry-portal
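Following the suggestion in the reply above, one way to wire these lines in is to construct the `ChatCompletionsClient` yourself with the token policy applied, then hand that client to Semantic Kernel rather than letting the connector build one. This is a sketch, not a verified solution: the endpoint and model name are placeholders, and it assumes the `Microsoft.SemanticKernel.Connectors.AzureAIInference` package exposes an `AddAzureAIInferenceChatCompletion` overload that accepts a prebuilt `ChatCompletionsClient` (check the overloads available in your connector version).

```csharp
using Azure.Core;
using Azure.Core.Pipeline;
using Azure.AI.Inference;
using Azure.Identity;
using Microsoft.SemanticKernel;

// The three lines from the question: attach an Entra ID bearer-token
// policy to the Azure AI Inference client pipeline.
TokenCredential credential = new DefaultAzureCredential();
AzureAIInferenceClientOptions clientOptions = new AzureAIInferenceClientOptions();
BearerTokenAuthenticationPolicy tokenPolicy = new BearerTokenAuthenticationPolicy(
    credential, new string[] { "https://cognitiveservices.azure.com/.default" });
clientOptions.AddPolicy(tokenPolicy, HttpPipelinePosition.PerRetry);

// Build the client yourself with those options (endpoint is a placeholder),
// then inject it into Semantic Kernel instead of passing an API key.
var chatClient = new ChatCompletionsClient(
    new Uri("https://<your-resource>.services.ai.azure.com/models"),
    credential,
    clientOptions);

Kernel kernel = Kernel.CreateBuilder()
    .AddAzureAIInferenceChatCompletion("<model-deployment-name>", chatClient)
    .Build();
```

If no client-accepting overload is available in your version, the fallback is to register the `ChatCompletionsClient` in the service collection yourself and resolve it from there.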