# Overview

This sample shows how to use the OpenAI SDK to create and use a simple AI agent with any model hosted in Microsoft Foundry.

You can use models from Microsoft, OpenAI, DeepSeek, Hugging Face, Meta, xAI, or any other model you have deployed in Microsoft Foundry.

Note: Pick a model that suits your needs. For example, if you want to use function calling, make sure the model you choose supports it.
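To illustrate what a function-calling-capable model would be asked to handle, here is a minimal sketch of an OpenAI-style tool definition (shown in Python for brevity; `get_weather` and its `city` parameter are hypothetical examples, not part of this sample):

```python
# A sketch of an OpenAI-style tool (function) definition. A model that supports
# function calling can decide to "call" this function by returning its name and
# arguments; "get_weather" and "city" are hypothetical, for illustration only.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]
```

A model without function-calling support may ignore or reject a request that includes such a `tools` list, which is why the note above matters.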

## Prerequisites

Before you begin, ensure you have the following prerequisites:

- .NET 10 SDK or later
- A Microsoft Foundry resource
- A model deployment in your Microsoft Foundry resource. This example defaults to the Phi-4-mini-instruct model; to use a different model, set the AZURE_AI_MODEL_DEPLOYMENT_NAME environment variable to the name of your deployed model.
- An API key or role-based authentication to access the Microsoft Foundry resource

See here for more information on setting up these prerequisites.

Set the following environment variables:

```powershell
# Replace with your Microsoft Foundry resource endpoint.
# The URL must include the "/openai/v1/" path, which is required when using the OpenAI SDK to access Microsoft Foundry models.
$env:AZURE_OPENAI_ENDPOINT="https://ai-foundry-<myresourcename>.services.ai.azure.com/openai/v1/"

# Optional; defaults to Azure CLI authentication if not provided
$env:AZURE_OPENAI_API_KEY="************"

# Optional; defaults to Phi-4-mini-instruct
$env:AZURE_AI_MODEL_DEPLOYMENT_NAME="Phi-4-mini-instruct"
```
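The sample itself is written in .NET, but how these three variables are consumed can be sketched with the OpenAI Python SDK (this assumes API-key auth rather than the Azure CLI fallback; `resolve_config` and `ask_model` are hypothetical helper names, not part of the sample):

```python
import os

def resolve_config():
    """Read the sample's environment variables, applying its documented defaults."""
    endpoint = os.environ["AZURE_OPENAI_ENDPOINT"]      # required; must end in /openai/v1/
    api_key = os.environ.get("AZURE_OPENAI_API_KEY")    # optional in the sample (Azure CLI fallback)
    model = os.environ.get("AZURE_AI_MODEL_DEPLOYMENT_NAME", "Phi-4-mini-instruct")
    return endpoint, api_key, model

def ask_model(prompt):
    """Send a single chat message to the deployed model. Requires `pip install openai`."""
    from openai import OpenAI  # imported lazily so resolve_config() works without the SDK
    endpoint, api_key, model = resolve_config()
    client = OpenAI(base_url=endpoint, api_key=api_key)  # base_url points the SDK at Foundry
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```

Because the endpoint is passed as the SDK's `base_url`, the same code works for any model deployed in your resource; only the deployment name changes.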