Features • Getting Started • Quickstart • Guidance
This template provides a complete foundation for building scalable AI-powered applications and integration solutions on Azure with proper security, monitoring, and API management practices. All services, configurations, and samples are defined as infrastructure-as-code using Bicep modules.
The deployment follows a multi-resource group design with the following services:
- Web App: Sample Python app deployed to App Service, including a staging slot and the routes `/` (sample web page), `/api/helloworld` (sample API endpoint) and `/health` (health check endpoint); a minimal sketch of these routes appears after this list.
- Function App: Sample Python Azure Function (HTTP trigger) exposing `/api/helloworld`, returning Python runtime info and publishing a message to a Service Bus queue (`sbq-sample-01`) for every request. The enqueued payload includes timestamp, method, URL and all inbound HTTP headers.
- Logic App (Standard): Sample stateful workflow (`wf-sample-01`) triggered by messages in the Service Bus queue `sbq-sample-01` (peek-lock). Implements a simple try/catch pattern: the Try scope completes the message; on failure the Catch scope dead-letters the message and terminates the run as Failed.
- API Management: Publishes both backends with versioned routes `/web/v1/helloworld` and `/function/v1/helloworld`, secured via subscription key.
- Service Bus: Deploys a namespace with one sample queue (`sbq-sample-01`) and one sample topic (`sbt-sample-01`).
- AI Foundry with a sample project (`proj-sample-01`) and the following model deployments: GPT-4o, GPT-4.1 and text-embedding-3-small (GlobalStandard SKUs).
- AI Search with semantic search set to the Free tier for experimentation.
- Key Vault for secret management.
- Cosmos DB account with a sample database (`database-01`) and container (`db-01-container-01`).
- Storage account including blob, queue, table and file services.
- Log Analytics workspace and Application Insights wired to all apps to capture logs and telemetry data.
- Dashboard for real-time monitoring using Application Insights data.
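As a rough illustration of the Web App routes listed above, a FastAPI app exposing the same three paths might look like the sketch below. This is illustrative only; the actual implementation lives in `src/web-app/` and may differ in details such as the response bodies.

```python
# Minimal sketch of the Web App routes described above (FastAPI).
# Illustrative only; the sample's real code is in src/web-app/main.py.
import sys

from fastapi import FastAPI
from fastapi.responses import HTMLResponse

app = FastAPI()

@app.get("/", response_class=HTMLResponse)
def home() -> str:
    # Sample web page
    return "<html><body><h1>Sample web app</h1></body></html>"

@app.get("/api/helloworld")
def helloworld() -> dict:
    # Sample API endpoint (response body here is a placeholder)
    return {"message": "Hello, world!", "python_version": sys.version}

@app.get("/health")
def health() -> dict:
    # Health check endpoint
    return {"status": "healthy"}
```

Run it locally with `uvicorn main:app --host 0.0.0.0 --port 8000`, as described in the local development section below.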
Note
All apps are deployed with individual App Service plans. The Function App and Logic App each have their own storage account.
This template uses both Managed Identity and Key Vault. Managed identities are assigned to the Web App, Function App, Logic App, and API Management; RBAC modules grant access to Key Vault, Cosmos DB, Service Bus, Storage, and AI services. Secrets can be centralized in Key Vault and accessed via managed identity.
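For example, an app can read a secret from Key Vault with its managed identity via the Azure SDK for Python. The environment variable and secret name below are placeholders for illustration, not values defined by this template:

```python
# Sketch: reading a Key Vault secret with the app's managed identity.
# DefaultAzureCredential uses the managed identity when running in Azure and
# falls back to developer credentials (e.g. `azd auth login`) when run locally.
import os

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

vault_url = os.environ["KEY_VAULT_URI"]  # placeholder, e.g. https://<vault-name>.vault.azure.net/
client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

secret = client.get_secret("my-sample-secret")  # hypothetical secret name
print(secret.value)
```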
The following RBAC role assignments are set by the templates in the respective service modules:
| Service | Access to | Role(s) Assigned |
|---|---|---|
| Web App | Key Vault | Key Vault Secrets User |
| Web App | Cosmos DB | Cosmos DB Operator; Cosmos DB Account Reader Role |
| Web App | AI Foundry | Azure AI User |
| Function App | Key Vault | Key Vault Secrets User |
| Function App | Service Bus | Azure Service Bus Data Owner |
| Function App | AI Foundry | Azure AI User |
| Logic App | Key Vault | Key Vault Secrets User |
| Logic App | Service Bus | Azure Service Bus Data Owner |
| Logic App | AI Foundry | Azure AI User |
| API Management | Key Vault | Key Vault Secrets User |
| AI Search | Storage Account | Storage Blob Data Contributor |
| AI Foundry | Key Vault | Key Vault Secrets User |
| AI Foundry | Cosmos DB | Cosmos DB Operator; Cosmos DB Account Reader Role |
| AI Foundry | Storage Account | Storage Blob Data Contributor |
| AI Foundry | AI Search | Search Service Contributor; Search Index Data Contributor |
- Client calls the APIM endpoint `GET https://{apim-instance}.azure-api.net/function/v1/helloworld`.
- The APIM gateway forwards the request to the backend service (Function).
- The Function responds to the request and enqueues a message (queue `sbq-sample-01`; payload includes timestamp, method, URL and headers), as sketched below.
- The Logic App trigger (peek-lock) fires on new messages.
- The workflow's `Try` scope completes the message (removes it from the queue). On failure, the `Catch` scope dead-letters the message and terminates the run with an error.
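The Function App leg of this flow can be pictured with the Azure Functions Python v2 programming model. The sketch below mirrors the behaviour described above (return runtime info, enqueue timestamp/method/URL/headers) but is not the template's exact code; see `src/function-app/` for the real implementation.

```python
# Sketch: HTTP trigger that returns Python runtime info and enqueues the
# request details to sbq-sample-01 via a Service Bus output binding.
# Route, auth level and payload shape are illustrative.
import datetime
import json
import sys

import azure.functions as func

app = func.FunctionApp()

@app.route(route="helloworld", auth_level=func.AuthLevel.ANONYMOUS)
@app.service_bus_queue_output(
    arg_name="msg",
    queue_name="sbq-sample-01",
    connection="ServiceBusConnection",
)
def helloworld(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
    payload = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "method": req.method,
        "url": req.url,
        "headers": dict(req.headers),
    }
    msg.set(json.dumps(payload))  # picked up by the Logic App trigger
    return func.HttpResponse(f"Hello from Python {sys.version}")
```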
There are 3 options for getting started:
- Run the template virtually by using GitHub Codespaces, which sets up tools automatically (quickest way).
- Run in your local VS Code using the VS Code Dev Containers extension.
- Set up a local environment (macOS, Linux or Windows).
Prerequisites:
- Azure subscription with permissions to create resource groups and deploy resources.
- GitHub account.
Steps:
- Open the repository in GitHub Codespaces
- Configure the settings and create the Codespace (this may take several minutes)
Prerequisites:
- Azure subscription with permissions to create resource groups and deploy resources.
- Dev Containers extension for VS Code
- Docker Desktop
Steps:
- Start Docker Desktop
- Open the project in a VS Code Dev Container (this may take several minutes)
Prerequisites:
- Azure subscription with permissions to create resource groups and deploy resources.
- Install Azure Developer CLI
  - Windows: `winget install microsoft.azd`
  - Linux: `curl -fsSL https://aka.ms/install-azd.sh | bash`
  - macOS: `brew tap azure/azd && brew install azd`
- Install Python 3.12+ for local development
- Install Azure Functions Core Tools for local development
- Install Azure CLI for advanced scenarios (optional)
Steps:
- Clone the repository locally: `git clone <repository-url>` followed by `cd <repository-folder>`
- Install Python dependencies: `pip install -r src/web-app/requirements.txt` and `pip install -r src/function-app/requirements.txt`
- Open VS Code and load the local project folder
- In VS Code (locally, or VS Code for the Web if using a Codespace via the browser), open a terminal window.
- Sign in to your Azure account: `azd auth login --use-device-code`
- Initialize the environment (optional): `azd init`
- Provision Azure resources and deploy the app code: `azd up`
This will:
- Prompt for an environment name (if not initialized yet)
- Prompt for Azure subscription and region (if not selected yet)
- Provision all Azure resources
- Deploy the source code of all three applications (web, function and logic app)
- Configure API Management endpoints and backends
Note
Alternative deployment methods:
- For infrastructure provisioning only, use `azd provision`
- For code deployment only, use `azd deploy`
- Configure the GitHub CI/CD pipeline (optional, when using your own repository): `azd pipeline config`
- Test the web application using a browser: visit the web app URL (typically `https://{app-service-name}.azurewebsites.net`)
- Test the sample APIs via API Management using your subscription key:
  - `curl -X GET "https://{apim-instance}.azure-api.net/web/v1/helloworld" -H "Ocp-Apim-Subscription-Key: <SUBSCRIPTION_KEY>"`
  - `curl -X GET "https://{apim-instance}.azure-api.net/function/v1/helloworld" -H "Ocp-Apim-Subscription-Key: <SUBSCRIPTION_KEY>"`
- Requires Python and FastAPI: `cd src/web-app` then `uvicorn main:app --host 0.0.0.0 --port 8000`
- Access the web app interface at `http://localhost:8000`
- Access the web app API at `http://localhost:8000/api/helloworld`
- Requires Python and Azure Functions Core Tools: `cd src/function-app` then `func host start`
- Access the function at `http://localhost:7071/api/helloworld`
- Each request also enqueues a JSON message to the Service Bus queue `sbq-sample-01` via an output binding (`@app.service_bus_queue_output`). The binding uses the connection string setting `ServiceBusConnection`.
- Create a `local.settings.json` file inside the `src/function-app/` folder (not committed) with a valid Service Bus connection string if you want to run the queue flow locally (see the verification sketch after this list): `{ "IsEncrypted": false, "Values": { "AzureWebJobsStorage": "UseDevelopmentStorage=true", "FUNCTIONS_WORKER_RUNTIME": "python", "ServiceBusConnection": "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>" } }`
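If you want to confirm locally that messages are reaching the queue, a small script using the `azure-servicebus` package (installed separately; not necessarily part of the sample's requirements) can peek at `sbq-sample-01` without completing or dead-lettering anything:

```python
# Sketch: peek at messages in sbq-sample-01 using the same connection string
# configured as ServiceBusConnection in local.settings.json.
# Requires `pip install azure-servicebus`.
import os

from azure.servicebus import ServiceBusClient

conn_str = os.environ["ServiceBusConnection"]  # or paste the connection string directly

with ServiceBusClient.from_connection_string(conn_str) as client:
    with client.get_queue_receiver(queue_name="sbq-sample-01") as receiver:
        # peek_messages does not lock or remove messages, so the Logic App
        # workflow can still process them afterwards.
        for message in receiver.peek_messages(max_message_count=5):
            print(str(message))
```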
- Deployed via azd. You can edit the workflow under `src/logic-app/{workflow-name}/` and redeploy with `azd up` or `azd deploy`.
- API definitions live in `infra/modules/apim/api/*.yaml`. On deploy, these are published under `/web/v1` and `/function/v1` with subscription-key security.
Important
The applications are designed to work with managed identities in Azure. For local development, you can use connection strings or service principal authentication. When running locally, you may need to configure environment variables for Azure service connections.
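One way to follow that guidance is to branch on whichever settings are present, so the same code uses a connection string locally and `DefaultAzureCredential` (managed identity) in Azure. The sketch below assumes a hypothetical `SERVICEBUS_FQDN` environment variable for the namespace; only `ServiceBusConnection` is a setting actually used by the sample.

```python
# Sketch: choose connection-string auth locally and managed identity in Azure.
import os

from azure.identity import DefaultAzureCredential
from azure.servicebus import ServiceBusClient


def make_servicebus_client() -> ServiceBusClient:
    conn_str = os.environ.get("ServiceBusConnection")
    if conn_str:
        # Local development: connection string from local.settings.json / env vars
        return ServiceBusClient.from_connection_string(conn_str)
    # Running in Azure: authenticate with the app's managed identity
    namespace = os.environ["SERVICEBUS_FQDN"]  # hypothetical, e.g. <namespace>.servicebus.windows.net
    return ServiceBusClient(
        fully_qualified_namespace=namespace,
        credential=DefaultAzureCredential(),
    )
```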
Once the application is developed, the Bicep files may need to be extended with additional service modules and configuration. The solution follows a modular architecture with the following structure:
infra/
├── main.bicep # Main orchestration file
├── main.parameters.json # Environment-specific parameters
└── modules/
├── ai/ # AI services (Azure OpenAI, AI Search)
├── apim/ # API Management configuration
├── app/ # Application services (Web App, Functions, Logic Apps)
├── cosmosdb/ # Cosmos DB configuration
├── keyvault/ # Key Vault configuration
├── monitor/ # Monitoring and logging
├── security/ # RBAC configurations
├── servicebus/ # Service Bus messaging
└── storage/ # Storage account configuration
To add new services:
- Add or reuse a module under `infra/modules/<service>` and expose the outputs you need.
- Reference the module from `infra/main.bicep`, passing shared settings (tags, location) and linking to existing outputs.
- Grant RBAC via the security modules under `infra/modules/security`, using the managed identities of your apps.
- Update `main.parameters.json` with any required parameters.
API version discovery
Use Azure CLI locally or in Codespaces/Dev Containers to list provider API versions when introducing new resources:
`az provider show --namespace Microsoft.Web --query "resourceTypes[?resourceType=='sites'].apiVersions" -o tsv`

To remove all resources at once, including the resource groups, and purge any soft-deleted services, just run:

`azd down --purge`

Note
azd will scan and list all resources to be deleted, and their respective resource groups, within the current environment, asking for confirmation before proceeding. Keep the terminal open until the process completes.
This template deploys Azure AI Foundry model deployments (gpt-4o, gpt-4.1, text-embedding-3-small) which may not be available in all regions or SKUs. Check the up-to-date model availability and choose a supported region for your subscription:
- Model availability: https://learn.microsoft.com/azure/ai-services/openai/concepts/models#standard-deployment-model-availability
- Region selection: availability varies by model; pick a region listed as supporting the "GlobalStandard" SKU for the models you need and verify current availability before deploying.
Apart from choosing a supported region, ensure your subscription has quota for the selected models/SKUs. Check and request increases via the Azure portal or with Azure CLI:
- Azure OpenAI quotas and limits: https://learn.microsoft.com/azure/ai-services/openai/quotas-limits
- Azure portal: Help + support > New support request > Service and subscription limits (quotas)
You can estimate the cost of this project's architecture with Azure's pricing calculator.
- Azure Developer CLI (azd)
- Azure AI Foundry and OpenAI
- Azure AI Search
- Azure API Management
- Azure App Service (Linux, Python)
- Azure Functions Python Developer Guide
- Azure Logic Apps
- Azure Well-Architected Framework
- Develop Python apps that use Azure AI services
- FastAPI Documentation
- Bicep Documentation

