Replies: 2 comments
-
This is generally how it's done straight to an AOAI endpoint. Not sure how much applies here, but it looks like you can combine this with https://docs.litellm.ai/docs/providers/azure to get where you need to go.

I prefer to use ChainedTokenCredential, or the specific credential you need such as ManagedIdentityCredential(), over DefaultAzureCredential, since you get a little more control without needing a bunch of excludes. See the azure-identity docs for details on all of them. If you are using the system-assigned managed identity, ManagedIdentityCredential() will generally just pick it up; but if you are using a user-assigned managed identity, you'll need to specify the client ID, like ManagedIdentityCredential(client_id=<client_id>). You can look the client ID up in Entra or with the az CLI, e.g. `az identity show --resource-group <resource-group> --name <identity-name> --query clientId --output tsv`.

Here's some sample code that will try your az CLI creds first and then the managed identity; you can reorder them, or add credentials to and remove them from the chain as you see fit.
```python
from azure.identity import (
    AzureCliCredential,
    ChainedTokenCredential,
    ManagedIdentityCredential,
    get_bearer_token_provider,
)
from openai import AzureOpenAI

api_version = "2024-10-21"  # Ensure this is a valid API version, see: https://learn.microsoft.com/en-us/azure/ai-services/openai/api-version-deprecation#latest-ga-api-release
deployment_name = "<aoai deployment name>"  # Ensure this is a valid deployment name on your AOAI endpoint
endpoint = "<endpoint url>"

# Create individual credentials for inspection
azure_cli_cred = AzureCliCredential()  # Authenticates via the Azure CLI (az login); az logout clears the credentials

client_id = "<client id of the user assigned managed identity>"  # Ensure this is the client ID of the associated user-assigned managed identity; use Entra to find it
managed_identity_cred = ManagedIdentityCredential(client_id=client_id)  # User-assigned managed identity
# managed_identity_cred = ManagedIdentityCredential()  # Use this line instead for a system-assigned managed identity

# Create the chained credential: Azure CLI creds are tried first, then the managed identity
chained_cred = ChainedTokenCredential(azure_cli_cred, managed_identity_cred)
scope = "https://cognitiveservices.azure.com/.default"
token_provider = get_bearer_token_provider(chained_cred, scope)

client = AzureOpenAI(
    azure_endpoint=endpoint,
    azure_ad_token_provider=token_provider,
    api_version=api_version,
)

response = client.chat.completions.create(
    model=deployment_name,
    messages=[
        {
            "role": "user",
            "content": "Give a one word answer, what is the capital of France?",
        },
    ],
)
response_content = response.choices[0].message.content
print(response_content)
```
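
For the proxy/server side specifically, I haven't verified this end to end, but the LiteLLM Azure docs linked above describe a keyless mode where the proxy mints and refreshes Entra ID tokens itself via DefaultAzureCredential. A sketch of what the config might look like, assuming the `enable_azure_ad_token_refresh` setting from those docs; the model name, deployment, and endpoint below are placeholders:

```yaml
# litellm proxy config.yaml -- a sketch, not verified end to end;
# model_name, deployment name, and api_base are placeholders for your own values.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: azure/<aoai deployment name>
      api_base: https://<your-resource>.openai.azure.com/
      api_version: "2024-10-21"
      # Note: no api_key and no client_id/client_secret here.

litellm_settings:
  # Per https://docs.litellm.ai/docs/providers/azure, this tells the proxy to
  # fetch and refresh Entra ID tokens itself (DefaultAzureCredential under the hood).
  enable_azure_ad_token_refresh: true
```

Because DefaultAzureCredential honors the standard environment variables, setting `AZURE_CLIENT_ID=<client id of the user assigned managed identity>` in the proxy's environment should steer it to the user-assigned identity, with no client secret or access key anywhere in the config.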
-
Hi @drewmacphee, thanks for your efforts. However, I was a bit unclear in my initial question: I am more interested in the configuration for the server side, so as not to rely on service principals or access keys.
-
How can I configure the proxy to use a user-managed identity instead of a service principal with Azure OpenAI? The docs mention DefaultAzureCredential supports managed identities, but I can't find a code reference. Any guidance?
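
For context, the client-side pattern I'm referring to is roughly this (my own sketch from the azure-identity docs; the scope and placeholder values are assumptions):

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# With a user-assigned managed identity, setting AZURE_CLIENT_ID=<identity client id>
# in the environment is enough for DefaultAzureCredential to select it.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    azure_endpoint="<endpoint url>",
    azure_ad_token_provider=token_provider,
    api_version="2024-10-21",
)
```

What I can't find is how to express the equivalent in the proxy's configuration.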