
Commit d7413b1

removed local deployment file as content is similar to DeploymentGuide
2 parents c476b99 + 53124b4 commit d7413b1

14 files changed: +593, -174 lines

.github/workflows/test.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -51,7 +51,7 @@ jobs:
       - name: Run tests with coverage
         if: env.skip_tests == 'false'
         run: |
-          pytest --cov=. --cov-report=term-missing --cov-report=xml
+          pytest --cov=. --cov-report=term-missing --cov-report=xml --ignore=tests/e2e-test/tests

       - name: Skip coverage report if no tests
         if: env.skip_tests == 'true'
```

docs/CustomizingAzdParameters.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -9,8 +9,8 @@ By default this template will use the environment name as the prefix to prevent
 | Name | Type | Default Value | Purpose |
 | ------------------------------- | ------ | ----------------- | --------------------------------------------------------------------------------------------------- |
 | `AZURE_ENV_NAME` | string | `macae` | Used as a prefix for all resource names to ensure uniqueness across environments. |
-| `AZURE_LOCATION` | string | `swedencentral` | Location of the Azure resources. Controls where the infrastructure will be deployed. |
-| `AZURE_ENV_OPENAI_LOCATION` | string | `swedencentral` | Specifies the region for OpenAI resource deployment. |
+| `AZURE_LOCATION` | string | `<User selects during deployment>` | Location of the Azure resources. Controls where the infrastructure will be deployed. |
+| `AZURE_ENV_OPENAI_LOCATION` | string | `<User selects during deployment>` | Specifies the region for OpenAI resource deployment. |
 | `AZURE_ENV_MODEL_DEPLOYMENT_TYPE` | string | `GlobalStandard` | Defines the deployment type for the AI model (e.g., Standard, GlobalStandard). |
 | `AZURE_ENV_MODEL_NAME` | string | `gpt-4o` | Specifies the name of the GPT model to be deployed. |
 | `AZURE_ENV_FOUNDRY_PROJECT_ID` | string | `<Existing Workspace Id>` | Set this if you want to reuse an AI Foundry Project instead of creating a new one. |
```
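For reference, parameters like these are normally overridden before running `azd up` with `azd env set`. A minimal Python sketch of that workflow is shown below; the override values are placeholders for illustration, not recommendations from this commit.

```python
import subprocess

# Placeholder values -- substitute your own. The parameter names come from the table above.
overrides = {
    "AZURE_ENV_NAME": "macae-dev2",          # hypothetical environment name
    "AZURE_LOCATION": "eastus2",             # hypothetical region
    "AZURE_ENV_OPENAI_LOCATION": "eastus2",  # hypothetical region for the OpenAI resource
    "AZURE_ENV_MODEL_NAME": "gpt-4o",
}

for name, value in overrides.items():
    # `azd env set` stores the value in the current azd environment,
    # so the next `azd provision` / `azd up` picks it up.
    subprocess.run(["azd", "env", "set", name, value], check=True)
```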

docs/DeploymentGuide.md

Lines changed: 19 additions & 4 deletions

````diff
@@ -233,7 +233,7 @@ The easiest way to run this accelerator is in a VS Code Dev Containers, which wi
 
 ## Detailed Development Container setup instructions
 
-The solution contains a [development container](https://code.visualstudio.com/docs/remote/containers) with all the required tooling to develop and deploy the accelerator. To deploy the Chat With Your Data accelerator using the provided development container you will also need:
+The solution contains a [development container](https://code.visualstudio.com/docs/remote/containers) with all the required tooling to develop and deploy the accelerator. To deploy the Multi-Agent solutions accelerator using the provided development container you will also need:
 
 - [Visual Studio Code](https://code.visualstudio.com)
 - [Remote containers extension for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers)
@@ -287,7 +287,7 @@ The files for the dev container are located in `/.devcontainer/` folder.
 
 - You can use the Bicep extension for VSCode (Right-click the `.bicep` file, then select "Show deployment plan") or use the Azure CLI:
   ```bash
-  az deployment group create -g <resource-group-name> -f deploy/macae-dev.bicep --query 'properties.outputs'
+  az deployment group create -g <resource-group-name> -f infra/main.bicep --query 'properties.outputs'
   ```
 - **Note**: You will be prompted for a `principalId`, which is the ObjectID of your user in Entra ID. To find it, use the Azure Portal or run:
@@ -301,7 +301,7 @@ The files for the dev container are located in `/.devcontainer/` folder.
 
 **Role Assignments in Bicep Deployment:**
 
-The **macae-dev.bicep** deployment includes the assignment of the appropriate roles to AOAI and Cosmos services. If you want to modify an existing implementation—for example, to use resources deployed as part of the simple deployment for local debugging—you will need to add your own credentials to access the Cosmos and AOAI services. You can add these permissions using the following commands:
+The **main.bicep** deployment includes the assignment of the appropriate roles to AOAI and Cosmos services. If you want to modify an existing implementation—for example, to use resources deployed as part of the simple deployment for local debugging—you will need to add your own credentials to access the Cosmos and AOAI services. You can add these permissions using the following commands:
 
 ```bash
 az cosmosdb sql role assignment create --resource-group <solution-accelerator-rg> --account-name <cosmos-db-account-name> --role-definition-name "Cosmos DB Built-in Data Contributor" --principal-id <aad-user-object-id> --scope /subscriptions/<subscription-id>/resourceGroups/<solution-accelerator-rg>/providers/Microsoft.DocumentDB/databaseAccounts/<cosmos-db-account-name>
@@ -321,6 +321,10 @@ The files for the dev container are located in `/.devcontainer/` folder.
 5. **Create a `.env` file:**
 
    - Navigate to the `src\backend` folder and create a `.env` file based on the provided `.env.sample` file.
+   - Update the `.env` file with the required values from your Azure resource group in Azure Portal App Service environment variables.
+   - Alternatively, if resources were
+     provisioned using `azd provision` or `azd up`, a `.env` file is automatically generated in the `.azure/<env-name>/.env`
+     file. To get your `<env-name>` run `azd env list` to see which env is default.
 
 6. **Fill in the `.env` file:**
 
@@ -338,8 +342,19 @@ The files for the dev container are located in `/.devcontainer/` folder.
   ```bash
   pip install -r requirements.txt
   ```
+
+9. **Build the frontend (important):**
 
-9. **Run the application:**
+   - Before running the frontend server, you must build the frontend to generate the necessary `build/assets` directory.
+
+     From the `src/frontend` directory, run:
+
+     ```bash
+     npm install
+     npm run build
+     ```
+
+10. **Run the application:**
 
    - From the src/backend directory:
 
````
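The added step 5 notes that `azd provision` / `azd up` leave a ready-made `.env` at `.azure/<env-name>/.env`. A small helper along the lines below can copy it into `src/backend/.env`; only those two paths come from the doc, while the script name and argument handling are illustrative.

```python
import shutil
import sys
from pathlib import Path


def copy_azd_env(env_name: str, repo_root: Path = Path(".")) -> Path:
    """Copy the azd-generated .env into src/backend/.env (paths per the guide above)."""
    source = repo_root / ".azure" / env_name / ".env"
    target = repo_root / "src" / "backend" / ".env"
    if not source.exists():
        sys.exit(f"No .env found at {source}; run `azd provision` or `azd up` first.")
    shutil.copy(source, target)
    return target


if __name__ == "__main__":
    # Usage: python copy_azd_env.py <env-name>   (find the name with `azd env list`)
    print(copy_azd_env(sys.argv[1]))
```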

docs/LocalDeployment.md

Lines changed: 180 additions & 0 deletions (new file)

# Guide to local development

## Requirements:

- Python 3.10 or higher + PIP
- Azure CLI, and an Azure Subscription
- Visual Studio Code IDE

# Local setup

> **Note for macOS Developers**: If you are using macOS on Apple Silicon (ARM64) the DevContainer will **not** work. This is due to a limitation with the Azure Functions Core Tools (see [here](https://github.com/Azure/azure-functions-core-tools/issues/3112)). We recommend using the [Non DevContainer Setup](./NON_DEVCONTAINER_SETUP.md) instructions to run the accelerator locally.

The easiest way to run this accelerator is in a VS Code Dev Container, which will open the project in your local VS Code using the [Dev Containers extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers):

1. Start Docker Desktop (install it if not already installed)
1. Open the project:
   [![Open in Dev Containers](https://img.shields.io/static/v1?style=for-the-badge&label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/microsoft/Multi-Agent-Custom-Automation-Engine-Solution-Accelerator)
1. In the VS Code window that opens, once the project files show up (this may take several minutes), open a terminal window

## Detailed Development Container setup instructions

The solution contains a [development container](https://code.visualstudio.com/docs/remote/containers) with all the required tooling to develop and deploy the accelerator. To deploy the Multi Agent Solution accelerator using the provided development container you will also need:

* [Visual Studio Code](https://code.visualstudio.com)
* [Remote containers extension for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers)

If you are running this on Windows, we recommend you clone this repository in [WSL](https://code.visualstudio.com/docs/remote/wsl):

```cmd
git clone https://github.com/microsoft/Multi-Agent-Custom-Automation-Engine-Solution-Accelerator
```

Open the cloned repository in Visual Studio Code and connect to the development container.

```cmd
code .
```

!!! tip
    Visual Studio Code should recognize the available development container and ask you to open the folder using it. For additional details on connecting to remote containers, please see the [Open an existing folder in a container](https://code.visualstudio.com/docs/remote/containers#_quick-start-open-an-existing-folder-in-a-container) quickstart.

When you start the development container for the first time, the container will be built. This usually takes a few minutes. **Please use the development container for all further steps.**

The files for the dev container are located in the `/.devcontainer/` folder.

## Local deployment and debugging:

1. **Clone the repository.**

2. **Log into the Azure CLI:**

   - Check your login status using:
     ```bash
     az account show
     ```
   - If not logged in, use:
     ```bash
     az login
     ```
   - To specify a tenant, use:
     ```bash
     az login --tenant <tenant_id>
     ```

3. **Create a Resource Group:**

   - You can create it either through the Azure Portal or the Azure CLI:
     ```bash
     az group create --name <resource-group-name> --location EastUS2
     ```

4. **Deploy the Bicep template:**

   - You can use the Bicep extension for VSCode (Right-click the `.bicep` file, then select "Show deployment plan") or use the Azure CLI:
     ```bash
     az deployment group create -g <resource-group-name> -f infra/main.bicep --query 'properties.outputs'
     ```
   - **Note**: You will be prompted for a `principalId`, which is the ObjectID of your user in Entra ID. To find it, use the Azure Portal or run:
     ```bash
     az ad signed-in-user show --query id -o tsv
     ```
     You will also be prompted for locations for Cosmos and OpenAI services. This is to allow separate regions where there may be service quota restrictions.

   - **Additional Notes**:

     **Role Assignments in Bicep Deployment:**

     The **macae-dev.bicep** deployment includes the assignment of the appropriate roles to AOAI and Cosmos services. If you want to modify an existing implementation—for example, to use resources deployed as part of the simple deployment for local debugging—you will need to add your own credentials to access the Cosmos and AOAI services. You can add these permissions using the following commands:
     ```bash
     az cosmosdb sql role assignment create --resource-group <solution-accelerator-rg> --account-name <cosmos-db-account-name> --role-definition-name "Cosmos DB Built-in Data Contributor" --principal-id <aad-user-object-id> --scope /subscriptions/<subscription-id>/resourceGroups/<solution-accelerator-rg>/providers/Microsoft.DocumentDB/databaseAccounts/<cosmos-db-account-name>
     ```

     ```bash
     az role assignment create --assignee <aad-user-upn> --role "Azure AI User" --scope /subscriptions/<subscription-id>/resourceGroups/<solution-accelerator-rg>/providers/Microsoft.CognitiveServices/accounts/<azure-ai-foundry-name>
     ```

     **Using a Different Database in Cosmos:**

     You can set the solution up to use a different database in Cosmos. For example, you can name it something like autogen-dev. To do this:
     1. Change the environment variable **COSMOSDB_DATABASE** to the new database name.
     2. You will need to create the database in the Cosmos DB account. You can do this from the Data Explorer pane in the portal: click on the drop down labeled “_+ New Container_” and provide all the necessary details.

5. **Create a `.env` file:**

   - Navigate to the `src\backend` folder and create a `.env` file based on the provided `.env.sample` file.
   - Update the `.env` file with the required values from your Azure resource group in Azure Portal App Service environment variables.
   - Alternatively, if resources were provisioned using `azd provision` or `azd up`, a `.env` file is automatically generated in the `.azure/<env-name>/.env` file. To get your `<env-name>`, run `azd env list` to see which env is default.

6. **Fill in the `.env` file:**

   - Use the output from the deployment or check the Azure Portal under "Deployments" in the resource group.
   - Make sure to set APP_ENV to "**dev**" in the `.env` file.

7. **(Optional) Set up a virtual environment:**

   - If you are using `venv`, create and activate your virtual environment for both the frontend and backend folders.

8. **Install requirements - frontend and backend:**

   - In each of the frontend and backend folders under `src`, open a terminal and run:
     ```bash
     pip install -r requirements.txt
     ```

9. **Build the frontend (important):**

   - Before running the frontend server, you must build the frontend to generate the necessary `build/assets` directory.

     From the `src/frontend` directory, run:

     ```bash
     npm install
     npm run build
     ```

10. **Run the application:**

    - From the `src/backend` directory:
      ```bash
      python app_kernel.py
      ```
    - In a new terminal, from the `src/frontend` directory:
      ```bash
      python frontend_server.py
      ```

11. Open a browser and navigate to `http://localhost:3000`
12. To see Swagger API documentation, you can navigate to `http://localhost:8000/docs`

## Debugging the solution locally

You can debug the API backend running locally with VSCode using the following launch.json entry:

```
{
    "name": "Python Debugger: Backend",
    "type": "debugpy",
    "request": "launch",
    "cwd": "${workspaceFolder}/src/backend",
    "module": "uvicorn",
    "args": ["app:app", "--reload"],
    "jinja": true
}
```

To debug the python server in the frontend directory (frontend_server.py) and related, add the following launch.json entry:

```
{
    "name": "Python Debugger: Frontend",
    "type": "debugpy",
    "request": "launch",
    "cwd": "${workspaceFolder}/src/frontend",
    "module": "uvicorn",
    "args": ["frontend_server:app", "--port", "3000", "--reload"],
    "jinja": true
}
```
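Before launching `app_kernel.py` and `frontend_server.py`, it can help to sanity-check the `src/backend/.env` created in step 5. The sketch below only looks for the two variables this guide names explicitly (`APP_ENV`, which should be `dev` locally, and `COSMOSDB_DATABASE`), parsing the file with the standard library rather than assuming any particular dotenv package.

```python
from pathlib import Path


def read_env(path: Path) -> dict[str, str]:
    """Parse KEY=VALUE lines from a .env file, skipping blanks and comments."""
    values: dict[str, str] = {}
    for line in path.read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip().strip('"')
    return values


env = read_env(Path("src/backend/.env"))

# APP_ENV must be "dev" for local debugging (see step 6 above).
assert env.get("APP_ENV") == "dev", "Set APP_ENV=dev in src/backend/.env for local runs"

# COSMOSDB_DATABASE only matters if you pointed the solution at a different database.
print("COSMOSDB_DATABASE:", env.get("COSMOSDB_DATABASE", "<using default>"))
```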

infra/main.bicep

Lines changed: 1 addition & 0 deletions

```diff
@@ -1738,3 +1738,4 @@ output AZURE_AI_MODEL_DEPLOYMENT_NAME string = aiFoundryAiServicesModelDeploymen
 // output APPLICATIONINSIGHTS_CONNECTION_STRING string = applicationInsights.outputs.connectionString
 output AZURE_AI_AGENT_MODEL_DEPLOYMENT_NAME string = aiFoundryAiServicesModelDeployment.name
 output AZURE_AI_AGENT_ENDPOINT string = aiFoundryAiServices.outputs.aiProjectInfo.apiEndpoint
+output APP_ENV string = 'Prod'
```
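The new `APP_ENV` output reports `Prod` for deployed environments, while the local-development docs above ask for `dev`. A hedged sketch of how code might branch on it; the helper name is illustrative, not taken from the repo.

```python
import os


def is_local_dev() -> bool:
    # "Prod" matches the new Bicep output; the docs set APP_ENV=dev for local runs.
    return os.getenv("APP_ENV", "Prod").lower() == "dev"


print("local development" if is_local_dev() else "production settings")
```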

src/backend/app_kernel.py

Lines changed: 29 additions & 40 deletions

```diff
@@ -202,69 +202,55 @@ async def input_task_endpoint(input_task: InputTask, request: Request):
     if not input_task.session_id:
         input_task.session_id = str(uuid.uuid4())
 
+    # Wrap initialization and agent creation in its own try block for setup errors
     try:
-        # Create all agents instead of just the planner agent
-        # This ensures other agents are created first and the planner has access to them
         kernel, memory_store = await initialize_runtime_and_context(
             input_task.session_id, user_id
         )
-        client = None
-        try:
-            client = config.get_ai_project_client()
-        except Exception as client_exc:
-            logging.error(f"Error creating AIProjectClient: {client_exc}")
-
+        client = config.get_ai_project_client()
         agents = await AgentFactory.create_all_agents(
             session_id=input_task.session_id,
             user_id=user_id,
             memory_store=memory_store,
             client=client,
         )
+    except Exception as setup_exc:
+        logging.error(f"Failed to initialize agents or context: {setup_exc}")
+        track_event_if_configured(
+            "InputTaskSetupError",
+            {"session_id": input_task.session_id, "error": str(setup_exc)},
+        )
+        raise HTTPException(
+            status_code=500, detail="Could not initialize services for your request."
+        ) from setup_exc
 
+    try:
         group_chat_manager = agents[AgentType.GROUP_CHAT_MANAGER.value]
-
-        # Convert input task to JSON for the kernel function, add user_id here
-
-        # Use the planner to handle the task
         await group_chat_manager.handle_input_task(input_task)
 
-        # Get plan from memory store
         plan = await memory_store.get_plan_by_session(input_task.session_id)
-
-        if not plan:  # If the plan is not found, raise an error
+        if not plan:
             track_event_if_configured(
                 "PlanNotFound",
-                {
-                    "status": "Plan not found",
-                    "session_id": input_task.session_id,
-                    "description": input_task.description,
-                },
+                {"status": "Plan not found", "session_id": input_task.session_id},
             )
             raise HTTPException(status_code=404, detail="Plan not found")
-        # Log custom event for successful input task processing
+
         track_event_if_configured(
             "InputTaskProcessed",
-            {
-                "status": f"Plan created with ID: {plan.id}",
-                "session_id": input_task.session_id,
-                "plan_id": plan.id,
-                "description": input_task.description,
-            },
+            {"status": f"Plan created with ID: {plan.id}", "session_id": input_task.session_id},
         )
-        if client:
-            try:
-                client.close()
-            except Exception as e:
-                logging.error(f"Error sending to AIProjectClient: {e}")
         return {
             "status": f"Plan created with ID: {plan.id}",
             "session_id": input_task.session_id,
             "plan_id": plan.id,
             "description": input_task.description,
         }
-
+    except HTTPException:
+        # Re-raise HTTPExceptions so they are not caught by the generic block
+        raise
     except Exception as e:
-        # Extract clean error message for rate limit errors
+        # This now specifically handles errors during task processing
         error_msg = str(e)
         if "Rate limit is exceeded" in error_msg:
             match = re.search(r"Rate limit is exceeded\. Try again in (\d+) seconds?\.", error_msg)
@@ -273,13 +259,16 @@ async def input_task_endpoint(input_task: InputTask, request: Request):
 
         track_event_if_configured(
             "InputTaskError",
-            {
-                "session_id": input_task.session_id,
-                "description": input_task.description,
-                "error": str(e),
-            },
+            {"session_id": input_task.session_id, "error": str(e)},
         )
-        raise HTTPException(status_code=400, detail=f"{error_msg}") from e
+        raise HTTPException(status_code=400, detail=f"Error processing plan: {error_msg}") from e
+    finally:
+        # Ensure the client is closed even if an error occurs
+        if 'client' in locals() and client:
+            try:
+                client.close()
+            except Exception as e:
+                logging.error(f"Error closing AIProjectClient: {e}")
 
 
 @app.post("/api/human_feedback")
```
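Stripped of the endpoint-specific details, the refactor above follows a reusable shape: a setup `try` that maps failures to a 500, a processing `try` that re-raises `HTTPException` before the generic handler, and a `finally` that closes the client on every path. A minimal standalone sketch of that shape is below; names like `make_client` and `process` are illustrative stand-ins, not taken from the repo.

```python
import logging

from fastapi import FastAPI, HTTPException

app = FastAPI()


def make_client():
    # Stand-in for config.get_ai_project_client() in the real code.
    return object()


async def process(client, payload: dict) -> dict:
    # Stand-in for the real task handling.
    return {"status": "ok", **payload}


@app.post("/api/example")
async def example_endpoint(payload: dict):
    client = None
    try:
        # Setup phase: failures here become a generic 500.
        client = make_client()
    except Exception as setup_exc:
        logging.error(f"Setup failed: {setup_exc}")
        raise HTTPException(status_code=500, detail="Could not initialize services.") from setup_exc

    try:
        # Processing phase: domain errors become a 400.
        return await process(client, payload)
    except HTTPException:
        raise  # let deliberate HTTP errors pass through untouched
    except Exception as exc:
        raise HTTPException(status_code=400, detail=f"Error processing request: {exc}") from exc
    finally:
        # Cleanup runs on success and on every error path.
        close = getattr(client, "close", None)
        if callable(close):
            close()
```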
