To deploy this solution, ensure you have access to an Azure subscription with the necessary permissions to create resource groups, resources, app registrations, and assign roles at the resource group level. This includes the Contributor role at the subscription level and Role Based Access Control (RBAC) permissions at the subscription and/or resource group level. Follow the steps in Azure Account Set Up and Fabric Capacity Set Up.
Check the Azure Products by Region page and select a region where the following services are available:
Here are some example regions where the services are available: East US, East US 2, Australia East, UK South, and France Central.
If you encounter issues running PowerShell scripts because they are not digitally signed, you can temporarily adjust the execution policy by running the following command in an elevated PowerShell session:

```powershell
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
```

This allows the scripts to run for the current session without permanently changing your system's policy.
- Follow the steps in Fabric Deployment to create a Fabric workspace.
Pick from the options below to see step-by-step instructions for GitHub Codespaces, VS Code Dev Containers, VS Code (Web), Local Environments, and Bicep deployments.
### Deploy in GitHub Codespaces

You can run this solution using GitHub Codespaces. The button will open a web-based VS Code instance in your browser:

1. Open the solution accelerator (this may take several minutes).
2. Accept the default values on the create Codespaces page.
3. Open a terminal window if it is not already open.
4. Continue with the deploying steps.
### Deploy in VS Code Dev Containers

You can run this solution in VS Code Dev Containers, which will open the project in your local VS Code using the Dev Containers extension:

1. Start Docker Desktop (install it if not already installed).
2. Open the project.
3. In the VS Code window that opens, once the project files show up (this may take several minutes), open a terminal window.
4. Continue with the deploying steps.
### Deploy in Visual Studio Code (Web)

You can run this solution in VS Code Web. The button will open a web-based VS Code instance in your browser:

1. Open the solution accelerator (this may take several minutes).
2. When prompted, sign in using the Microsoft account linked to your Azure subscription, and select the appropriate subscription to continue.
3. Once the solution opens, the AI Foundry terminal will automatically run the following command to install the required dependencies:

   ```shell
   sh install.sh
   ```

   During this process, you'll be prompted with the message:

   ```
   What would you like to do with these files?
   - Overwrite with versions from template
   - Keep my existing files unchanged
   ```

   Choose "Overwrite with versions from template" and provide a unique environment name when prompted.
4. Continue with the deploying steps.
### Deploy in your Local Environment

If you're not using one of the above options for opening the project, then you'll need to:

1. Make sure the following tools are installed:
   - PowerShell (v7.0+), available for Windows, macOS, and Linux
   - Azure Developer CLI (azd) (v1.15.0+)
   - Python 3.9+
   - Docker Desktop
   - Git
   - Microsoft ODBC Driver 17
2. Clone the repository or download the project code via command line:

   ```shell
   azd init -t microsoft/agentic-applications-for-unified-data-foundation-solution-accelerator/
   ```

3. Open the project folder in your terminal or editor.
4. Continue with the deploying steps.
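As a quick sanity check before deploying locally, you can verify that the CLI prerequisites above are on your PATH with a short script like this (a minimal sketch: it only checks presence, not minimum versions):

```shell
#!/usr/bin/env bash
# Report which of the local-deployment prerequisites are on PATH.
# Missing tools are listed rather than aborting the script.
check_tools() {
  for tool in pwsh azd python docker git; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "found: $tool"
    else
      echo "MISSING: $tool"
    fi
  done
}

check_tools
```

Here `pwsh` is the cross-platform PowerShell 7 binary; the ODBC driver is a system library rather than a CLI tool, so it is not checked.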
Consider the following settings during your deployment:

### Configurable Deployment Settings

When you start the deployment, most parameters will have default values, but you can update the following settings:
| Setting | Description | Default value |
|---|---|---|
| Azure Region | The region where resources will be created. | (empty) |
| Environment Name | A 3–20 character alphanumeric value used to generate a unique ID to prefix the resources. | env_name |
| Backend Programming Language | Programming language for the backend API: python or dotnet. | (empty) |
| Use Case | Use case: Retail-sales-analysis or Insurance-improve-customer-meetings. | (empty) |
| Deployment Type | Select from a drop-down list (allowed: Standard, GlobalStandard). | GlobalStandard |
| GPT Model | Choose from gpt-4, gpt-4o, gpt-4o-mini. | gpt-4o-mini |
| GPT Model Version | The version of the selected GPT model. | 2024-07-18 |
| OpenAI API Version | The Azure OpenAI API version to use. | 2025-01-01-preview |
| GPT Model Deployment Capacity | Configure capacity for GPT models (in thousands). | 30k |
| Image Tag | Docker image tag to deploy. Common values: latest, dev, hotfix. | latest |
| Use Local Build | Boolean flag to determine if local container builds should be used. | false |
| Existing Log Analytics Workspace | To reuse an existing Log Analytics Workspace ID. | (empty) |
| Existing Azure AI Foundry Project | To reuse an existing Azure AI Foundry Project ID instead of creating a new one. | (empty) |
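Per the Environment Name row above, the value must be 3–20 alphanumeric characters. If you want to validate a candidate name before starting the deployment, a minimal sketch (the helper name is hypothetical, not part of the accelerator):

```shell
# Check a candidate environment name against the documented
# 3-20 character alphanumeric constraint.
validate_env_name() {
  case "$1" in
    *[!a-zA-Z0-9]*) echo "invalid"; return 1 ;;
  esac
  if [ "${#1}" -ge 3 ] && [ "${#1}" -le 20 ]; then
    echo "valid"
  else
    echo "invalid"
    return 1
  fi
}

validate_env_name "daapp"   # prints "valid"
```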
### [Optional] Quota Recommendations

By default, the GPT-4o-mini model capacity in deployment is set to 30k tokens, so we recommend the following update:

- For Global Standard | GPT-4o-mini: increase the capacity to at least 150k tokens post-deployment for optimal performance.

Depending on your subscription quota and capacity, you can adjust quota settings to better meet your specific needs. You can also adjust the deployment parameters for additional optimization.
### Reusing an Existing Log Analytics Workspace

- Guide to get your Existing Workspace ID
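A full Log Analytics workspace ID is an Azure resource ID. If you already know the subscription, resource group, and workspace name, you can assemble it yourself; a minimal sketch using the standard Azure resource ID layout (the values shown are placeholders):

```shell
# Assemble a Log Analytics workspace resource ID from its parts,
# following the standard Azure resource ID layout.
law_resource_id() {
  # $1: subscription id, $2: resource group, $3: workspace name
  printf '/subscriptions/%s/resourceGroups/%s/providers/Microsoft.OperationalInsights/workspaces/%s\n' \
    "$1" "$2" "$3"
}

law_resource_id "00000000-0000-0000-0000-000000000000" "my-rg" "my-law"
```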
### Reusing an Existing Azure AI Foundry Project

- Guide to get your Existing Project ID
### Deploying

Once you've opened the project in Codespaces, Dev Containers, Visual Studio Code (Web), or locally, you can deploy it to Azure by following these steps:
1. Login to Azure:

   ```shell
   azd auth login
   ```

   To authenticate against a specific tenant:

   ```shell
   azd auth login --tenant-id <tenant-id>
   ```

2. Provision and deploy all the resources:

   ```shell
   azd up
   ```

3. Provide an `azd` environment name (e.g., "daapp").
4. Select a subscription from your Azure account and choose a location that has quota for all the resources.
5. Choose the programming language for the backend API:
   - Python
   - .NET (dotnet)
6. Choose the use case:
   - Retail-sales-analysis
   - Insurance-improve-customer-meetings
   This deployment will take 7-10 minutes to provision the resources in your account and set up the solution with sample data.

   If you encounter an error or timeout during deployment, changing the location may help, as there could be availability constraints for the resources.
7. Once the deployment has completed successfully, copy the two bash commands from the terminal (e.g., `bash ./infra/scripts/agent_scripts/run_create_agents_scripts.sh` and `bash ./infra/scripts/fabric_scripts/run_fabric_items_scripts.sh <fabric-workspaceId>`) for later use.
   Note: If you are running this deployment in GitHub Codespaces, VS Code Dev Containers, or Visual Studio Code (Web), skip to step 9.
8. Create and activate a virtual environment:

   ```shell
   python -m venv .venv
   source .venv/Scripts/activate
   ```

   On macOS and Linux, the activation script is at `.venv/bin/activate` instead.
9. Login to Azure:

   ```shell
   az login
   ```

   Alternatively, login to Azure using a device code (recommended when using VS Code Web):

   ```shell
   az login --use-device-code
   ```

   Note: you will need to open a Git Bash terminal to complete steps 10 and 11.
10. Run the first bash script from the output of the azd deployment. The script will look like the following:

    ```shell
    bash ./infra/scripts/agent_scripts/run_create_agents_scripts.sh
    ```

    If you don't have an azd environment, you need to pass the parameters along with the command:

    ```shell
    bash ./infra/scripts/agent_scripts/run_create_agents_scripts.sh <project-endpoint> <solution-name> <gpt-model-name> <ai-foundry-resource-id> <api-app-name> <resource-group>
    ```

11. Run the second bash script from the output of the azd deployment, replacing `<fabric-workspaceId>` with the Fabric workspace ID created in the previous steps. The script will look like the following:

    ```shell
    bash ./infra/scripts/fabric_scripts/run_fabric_items_scripts.sh <fabric-workspaceId>
    ```

    If you don't have an azd environment, you need to pass the parameters along with the command:

    ```shell
    bash ./infra/scripts/fabric_scripts/run_fabric_items_scripts.sh <fabric-workspaceId> <solutionname> <ai-foundry-name> <backend-api-mid-principal> <backend-api-mid-client> <api-app-name> <resourcegroup>
    ```

12. Once the script has run successfully, go to the deployed resource group, find the App Service, and get the app URL from **Default domain**.
13. If you are done trying out the application, you can delete the resources by running `azd down`.
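When steps 10 and 11 ask for explicit parameters, the values usually already live in your azd environment. `azd env get-values` prints them as `KEY="value"` lines, which you can parse with a helper like this (a sketch; the sample keys shown are illustrative, not necessarily the accelerator's actual variable names):

```shell
# Pull one value out of KEY="value" lines such as those printed
# by `azd env get-values`. A sample is inlined for illustration.
get_env_value() {
  # $1: key name, $2: the KEY="value" text
  printf '%s\n' "$2" | sed -n "s/^$1=\"\(.*\)\"\$/\1/p"
}

sample='AZURE_RESOURCE_GROUP="rg-daapp"
GPT_MODEL_NAME="gpt-4o-mini"'

get_env_value GPT_MODEL_NAME "$sample"   # prints gpt-4o-mini
```

In practice you would substitute the output of `azd env get-values` for the inline sample.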
### Add App Authentication

Follow the steps in App Authentication to configure authentication in the App Service. Note: authentication changes can take up to 10 minutes.
### Deleting Resources After a Failed Deployment

Follow the steps in Delete Resource Group if your deployment fails and/or you need to clean up the resources.
### Cleaning Up Fabric Resources

If you are done trying out the accelerator and want to clean up the Fabric resources (lakehouse, SQL database, and role assignments), run the following script:

```shell
bash ./infra/scripts/fabric_scripts/delete_fabric_items_scripts.sh <fabric-workspaceId>
```

If you don't have an azd environment, you need to pass the parameters along with the command:

```shell
bash ./infra/scripts/fabric_scripts/delete_fabric_items_scripts.sh <fabric-workspaceId> <solutionname> <backend-api-principal-id>
```

Note: This script removes the lakehouse, SQL database, and service principal role assignments from the Fabric workspace. To completely remove all Azure resources, use `azd down`.
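Both the deployment and cleanup scripts take the Fabric workspace ID. If you have the workspace open in a browser, the GUID can be pulled from the address bar, assuming the usual `.../groups/<workspace-id>/...` shape of Fabric workspace URLs (a sketch; the URL below is a placeholder):

```shell
# Extract the workspace GUID from a Fabric workspace URL,
# assuming the ".../groups/<id>/..." URL shape.
workspace_id_from_url() {
  printf '%s\n' "$1" | sed -n 's#.*/groups/\([0-9a-fA-F-]*\).*#\1#p'
}

workspace_id_from_url \
  "https://app.fabric.microsoft.com/groups/aaaabbbb-0000-1111-2222-ccccddddeeee/lakehouses"
# prints aaaabbbb-0000-1111-2222-ccccddddeeee
```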
To help you get started, here are some Sample Questions you can ask in the app:
For Retail sales analysis use case:
- Show total revenue by year for last 5 years as a line chart.
- Show top 10 products by Revenue in the last year in a table.
- Show as a donut chart.
For Insurance improve customer meetings use case:
- I'm meeting Ida Abolina. Can you summarize her customer information and tell me the number of claims, payments, and communications she's had?
- Can you provide details of her communications?
- Based on Ida's policy data has she ever missed a payment?
These questions serve as a great starting point to explore insights from the data.
- Follow the steps in CopilotStudioDeployment
To set up and run the application locally for development, see the Local Development Setup Guide.