Azure AI Foundry Plugin for Genkit Go

A comprehensive Azure AI Foundry plugin for Genkit Go that provides text generation and chat capabilities using Azure OpenAI and other models available through Azure AI Foundry.
- Text Generation: Support for GPT-5, GPT-5 mini, GPT-4o, GPT-4o mini, GPT-4 Turbo, GPT-4, and GPT-3.5 Turbo models
- Embeddings: Support for text-embedding-ada-002, text-embedding-3-small, and text-embedding-3-large models
- Image Generation: Support for creating images from text prompts
- Text-to-Speech: Convert text to natural-sounding speech with multiple voices
- Speech-to-Text: Transcribe audio to text, with subtitle support
- Streaming: Full streaming support for real-time responses
- Tool Calling: Complete function calling capabilities for GPT-5, GPT-4, and GPT-3.5 Turbo models
- Multimodal Support: Support for text + image inputs (vision models like GPT-5, GPT-4o and GPT-4 Turbo)
- Multi-turn Conversations: Full support for chat history and context management
- Type Safety: Robust type conversion and schema validation
- Flexible Authentication: Support for API keys, Azure Default Credential, and custom token credentials
- GPT-5: Latest advanced model (check Azure for availability)
- GPT-5 mini: Smaller, faster version of GPT-5
- GPT-4o: Multimodal model with vision capabilities
- GPT-4o mini: Smaller, faster version of GPT-4o
- GPT-4 Turbo: High-performance GPT-4 with vision support
- GPT-4: Standard GPT-4 model
- GPT-3.5 Turbo: Fast and cost-effective model
All GPT-5, GPT-4 and GPT-3.5-turbo models support function calling (tools).
go get github.com/xavidop/genkit-azure-foundry-go

package main
import (
"context"
"log"
"os"
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)
func main() {
ctx := context.Background()
// Initialize Azure AI Foundry plugin
azurePlugin := &azureaifoundry.AzureAIFoundry{
Endpoint: os.Getenv("AZURE_OPENAI_ENDPOINT"),
APIKey: os.Getenv("AZURE_OPENAI_API_KEY"),
}
// Initialize Genkit
g := genkit.Init(ctx,
genkit.WithPlugins(azurePlugin),
genkit.WithDefaultModel("azureaifoundry/gpt-5"),
)
// Optional: Define common models for easy access
azureaifoundry.DefineCommonModels(azurePlugin, g)
log.Println("Starting basic Azure AI Foundry example...")
// Example: Generate text (basic usage)
response, err := genkit.Generate(ctx, g,
ai.WithPrompt("What are the key benefits of using Azure AI Foundry?"),
)
if err != nil {
log.Printf("Error: %v", err)
} else {
log.Printf("Response: %s", response.Text())
}
}

package main
import (
"context"
"log"
"github.com/firebase/genkit/go/ai"
"github.com/firebase/genkit/go/genkit"
azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)
func main() {
ctx := context.Background()
azurePlugin := &azureaifoundry.AzureAIFoundry{
Endpoint: "https://your-resource.openai.azure.com/",
APIKey: "your-api-key",
}
g := genkit.Init(ctx,
genkit.WithPlugins(azurePlugin),
)
// Define a GPT-5 model (use your deployment name)
gpt5Model := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
Name: "gpt-5", // Your deployment name in Azure
Type: "chat",
SupportsMedia: true,
}, nil)
// Generate text
response, err := genkit.Generate(ctx, g,
ai.WithModel(gpt5Model),
ai.WithMessages(ai.NewUserMessage(
ai.NewTextPart("Explain quantum computing in simple terms."),
)),
)
if err != nil {
log.Fatal(err)
}
log.Println(response.Text())
}

The plugin supports various configuration options:
azurePlugin := &azureaifoundry.AzureAIFoundry{
Endpoint: "https://your-resource.openai.azure.com/",
APIKey: "your-api-key", // Use API key
// OR use Azure credential
// Credential: azidentity.NewDefaultAzureCredential(),
APIVersion: "2024-02-15-preview", // Optional
}

| Option | Type | Default | Description |
|---|---|---|---|
| `Endpoint` | `string` | required | Azure OpenAI endpoint URL |
| `APIKey` | `string` | `""` | API key for authentication |
| `Credential` | `azcore.TokenCredential` | `nil` | Azure credential (alternative to API key) |
| `APIVersion` | `string` | Latest | API version to use |
- Go to Azure Portal
- Navigate to your Azure OpenAI resource
- Go to "Keys and Endpoint" section
- Copy your endpoint URL and API key
The plugin supports multiple authentication methods to suit different deployment scenarios:
Best for: Development, testing, and simple scenarios
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_OPENAI_API_KEY="your-api-key"

import (
"os"
azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)
azurePlugin := &azureaifoundry.AzureAIFoundry{
Endpoint: os.Getenv("AZURE_OPENAI_ENDPOINT"),
APIKey: os.Getenv("AZURE_OPENAI_API_KEY"),
}

Best for: Production deployments, Azure-hosted applications
DefaultAzureCredential automatically tries multiple authentication methods in the following order:
- Environment variables (AZURE_CLIENT_ID, AZURE_CLIENT_SECRET, AZURE_TENANT_ID)
- Managed Identity (when deployed to Azure)
- Azure CLI credentials (for local development)
- Azure PowerShell credentials
- Interactive browser authentication
# Required environment variables
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_TENANT_ID="your-tenant-id"
# Optional: For service principal authentication
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"

import (
"fmt"
"os"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)
func main() {
endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
tenantID := os.Getenv("AZURE_TENANT_ID")
// Create DefaultAzureCredential
credential, err := azidentity.NewDefaultAzureCredential(&azidentity.DefaultAzureCredentialOptions{
TenantID: tenantID,
})
if err != nil {
fmt.Fprintf(os.Stderr, "ERROR: %s\n", err)
return
}
// Initialize plugin with credential
azurePlugin := &azureaifoundry.AzureAIFoundry{
Endpoint: endpoint,
Credential: credential,
}
// Use the plugin with Genkit...
}

Best for: Applications deployed to Azure (App Service, Container Apps, VMs, AKS)
When deployed to Azure, Managed Identity provides authentication without storing credentials:
import (
"os"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)
func main() {
endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
// Use Managed Identity
credential, err := azidentity.NewManagedIdentityCredential(nil)
if err != nil {
panic(err)
}
azurePlugin := &azureaifoundry.AzureAIFoundry{
Endpoint: endpoint,
Credential: credential,
}
}

Best for: CI/CD pipelines, automated deployments
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_TENANT_ID="your-tenant-id"
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"

import (
"os"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)
func main() {
endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
tenantID := os.Getenv("AZURE_TENANT_ID")
clientID := os.Getenv("AZURE_CLIENT_ID")
clientSecret := os.Getenv("AZURE_CLIENT_SECRET")
credential, err := azidentity.NewClientSecretCredential(tenantID, clientID, clientSecret, nil)
if err != nil {
panic(err)
}
azurePlugin := &azureaifoundry.AzureAIFoundry{
Endpoint: endpoint,
Credential: credential,
}
}

Best for: Local development with Azure CLI installed
# Login to Azure CLI first
az login
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"

import (
"os"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)
func main() {
endpoint := os.Getenv("AZURE_OPENAI_ENDPOINT")
// Use Azure CLI credentials
credential, err := azidentity.NewAzureCLICredential(nil)
if err != nil {
panic(err)
}
azurePlugin := &azureaifoundry.AzureAIFoundry{
Endpoint: endpoint,
Credential: credential,
}
}

Important: The `Name` in `ModelDefinition` should match your deployment name in Azure, not the model name. For example:

- If you deployed `gpt-5` with deployment name `my-gpt5-deployment`, use `"my-gpt5-deployment"`
- If you deployed `gpt-4o` with deployment name `gpt-4o`, use `"gpt-4o"`
The repository includes comprehensive examples:
- `examples/basic/` - Simple text generation
- `examples/streaming/` - Real-time streaming responses
- `examples/chat/` - Multi-turn conversation with context
- `examples/embeddings/` - Text embeddings generation
- `examples/tool_calling/` - Function calling with multiple tools
- `examples/vision/` - Multimodal image analysis
- `examples/image_generation/` - Generate images
- `examples/text_to_speech/` - Convert text to speech
- `examples/speech_to_text/` - Transcribe audio to text
# Set environment variables
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_OPENAI_API_KEY="your-api-key"
# Run basic example
cd examples/basic
go run main.go
# Run streaming example
cd ../streaming
go run main.go
# Run chat example
cd ../chat
go run main.go
# Run tool calling example
cd ../tool_calling
go run main.go
# Run vision example
cd ../vision
go run main.go
# Run image generation example
cd ../image_generation
go run main.go
# Run text-to-speech example
cd ../text_to_speech
go run main.go
# Run speech-to-text example (requires audio files)
cd ../speech_to_text
go run main.go

// Define a tool
weatherTool := genkit.DefineTool(g, "get_weather",
"Get current weather",
func(ctx *ai.ToolContext, input struct {
Location string `json:"location"`
Unit string `json:"unit,omitempty"`
}) (string, error) {
return getWeather(input.Location, input.Unit)
},
)
// Use the tool
response, err := genkit.Generate(ctx, g,
ai.WithModel(gpt4Model),
ai.WithTools(weatherTool),
ai.WithPrompt("What's the weather in San Francisco?"),
)

GPT-5 and GPT-4o support image inputs:
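The vision call below passes an `imageDataURL`. One way to build it from raw image bytes or a local file using only the standard library (the helper names here are illustrative, not part of the plugin):

```go
package main

import (
	"encoding/base64"
	"fmt"
	"os"
)

// toDataURL encodes raw bytes as a data URL, e.g. "data:image/jpeg;base64,...".
func toDataURL(mimeType string, data []byte) string {
	return fmt.Sprintf("data:%s;base64,%s", mimeType, base64.StdEncoding.EncodeToString(data))
}

// imageFileToDataURL reads an image file and returns a data URL suitable
// for passing to ai.NewMediaPart.
func imageFileToDataURL(path, mimeType string) (string, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return "", err
	}
	return toDataURL(mimeType, data), nil
}

func main() {
	// Demo with the three JPEG magic bytes instead of a real file.
	url := toDataURL("image/jpeg", []byte{0xFF, 0xD8, 0xFF})
	fmt.Println(url) // data:image/jpeg;base64,/9j/
}
```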
response, err := genkit.Generate(ctx, g,
ai.WithModel(gpt5Model),
ai.WithMessages(ai.NewUserMessage(
ai.NewTextPart("What's in this image?"),
ai.NewMediaPart("image/jpeg", imageDataURL),
)),
)

streamCallback := func(ctx context.Context, chunk *ai.ModelResponseChunk) error {
for _, part := range chunk.Content {
if part.IsText() {
fmt.Print(part.Text)
}
}
return nil
}
response, err := genkit.Generate(ctx, g,
ai.WithModel(gpt4Model),
ai.WithPrompt("Tell me a story"),
ai.WithStreaming(streamCallback),
)

// First message
response1, _ := genkit.Generate(ctx, g,
ai.WithModel(gpt4Model),
ai.WithMessages(
ai.NewSystemMessage(ai.NewTextPart("You are a helpful assistant.")),
ai.NewUserTextMessage("What is Azure?"),
),
)
// Follow-up message with context
response2, _ := genkit.Generate(ctx, g,
ai.WithModel(gpt4Model),
ai.WithMessages(
ai.NewSystemMessage(ai.NewTextPart("You are a helpful assistant.")),
ai.NewUserTextMessage("What is Azure?"),
response1.Message, // Previous assistant message
ai.NewUserTextMessage("What are its key services?"),
),
)

import (
"github.com/firebase/genkit/go/ai"
azureaifoundry "github.com/xavidop/genkit-azure-foundry-go"
)
// Define an embedder (use your deployment name)
embedder := azurePlugin.DefineEmbedder(g, "text-embedding-3-small")
// Or use common embedders helper
embedders := azureaifoundry.DefineCommonEmbedders(azurePlugin, g)
// Generate embeddings
response, err := genkit.Embed(ctx, g,
ai.WithEmbedder(embedder),
ai.WithEmbedText("Azure AI Foundry provides powerful AI capabilities"),
)
if err != nil {
log.Fatal(err)
}
// Access the embedding vector
embedding := response.Embeddings[0].Embedding // []float32
log.Printf("Embedding dimensions: %d", len(embedding))

Generate images with DALL-E models using the standard genkit.Generate() method:
// Define DALL-E model
dallE3 := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
Name: azureaifoundry.ModelDallE3,
Type: "chat",
}, nil)
// Generate image
response, err := genkit.Generate(ctx, g,
ai.WithModel(dallE3),
ai.WithPrompt("A serene landscape with mountains at sunset"),
ai.WithConfig(map[string]interface{}{
"quality": "hd",
"size": "1024x1024",
"style": "vivid",
}),
)
if err != nil {
log.Fatal(err)
}
log.Printf("Image URL: %s", response.Text())

Convert text to speech using the standard genkit.Generate() method:
import "encoding/base64"
// Define TTS model
ttsModel := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
Name: azureaifoundry.ModelTTS1HD,
Type: "chat",
}, nil)
// Generate speech
response, err := genkit.Generate(ctx, g,
ai.WithModel(ttsModel),
ai.WithPrompt("Hello! Welcome to Azure AI Foundry."),
ai.WithConfig(map[string]interface{}{
"voice": "nova",
"response_format": "mp3",
"speed": 1.5,
}),
)
if err != nil {
log.Fatal(err)
}
// Decode base64 audio and save file
audioData, _ := base64.StdEncoding.DecodeString(response.Text())
os.WriteFile("output.mp3", audioData, 0644)

Transcribe audio to text using the standard genkit.Generate() method:
import "encoding/base64"
// Define Whisper model with media support (required for audio input)
whisperModel := azurePlugin.DefineModel(g, azureaifoundry.ModelDefinition{
Name: azureaifoundry.ModelWhisper1,
Type: "chat",
SupportsMedia: true, // Required for media parts (audio)
}, nil)
// Read and encode audio file
audioData, _ := os.ReadFile("audio.mp3")
base64Audio := base64.StdEncoding.EncodeToString(audioData)
// Transcribe audio
response, err := genkit.Generate(ctx, g,
ai.WithModel(whisperModel),
ai.WithMessages(ai.NewUserMessage(
ai.NewMediaPart("audio/mp3", "data:audio/mp3;base64,"+base64Audio),
)),
ai.WithConfig(map[string]interface{}{
"language": "en",
}),
)
if err != nil {
log.Fatal(err)
}
log.Printf("Transcription: %s", response.Text())

- "Endpoint is required" Error
  - Verify `AZURE_OPENAI_ENDPOINT` is set correctly
  - Ensure the endpoint URL includes `https://` and a trailing `/`
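Those two endpoint requirements can be checked up front; a small stdlib sketch (the `validateEndpoint` helper is illustrative, not part of the plugin):

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// validateEndpoint checks that an endpoint uses https:// and ends with a
// trailing slash. It returns a descriptive message, or "" when valid.
func validateEndpoint(endpoint string) string {
	u, err := url.Parse(endpoint)
	if err != nil || u.Host == "" {
		return "endpoint is not a valid URL"
	}
	if u.Scheme != "https" {
		return "endpoint must use https://"
	}
	if !strings.HasSuffix(endpoint, "/") {
		return "endpoint must end with a trailing /"
	}
	return ""
}

func main() {
	fmt.Println(validateEndpoint("https://your-resource.openai.azure.com/") == "") // true
	fmt.Println(validateEndpoint("http://your-resource.openai.azure.com/"))        // endpoint must use https://
}
```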
- "Deployment not found" Error
  - Check that the deployment name in your code matches the actual deployment name in Azure
  - Verify the model is deployed in your Azure OpenAI resource
- Authentication Errors
  - Ensure your API key is correct
  - Check that your Azure subscription is active
  - Verify network connectivity to Azure
- Rate Limit Errors
  - Implement exponential backoff retry logic
  - Consider upgrading to higher rate limits
  - Distribute requests across time
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Follow Conventional Commits format
- Commit your changes (`git commit -m 'feat: add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Apache 2.0 - see LICENSE file for details.
- Genkit team for the excellent Go framework
- Azure AI team for the comprehensive AI platform
- The open source community for inspiration and feedback
Built with ❤️ for the Genkit Go community