
Commit 2123669

feat: add Log Analytics workspace and diagnostics for deployment infrastructure (#300)
* feat: add Log Analytics workspace and diagnostics for deployment container
  - Added a Log Analytics workspace resource to monitor deployment activities.
  - Implemented diagnostic settings for the deployment container and blob service.
  - Updated variables for log analytics retention and included options for enabling/disabling log analytics.
  - Adjusted scale capacity for cognitive deployments to avoid exceeding quotas.
  - Enhanced troubleshooting documentation for deployment errors and quota issues.
* feat: add search index configuration script with data fetching and uploading functionality
* feat: add time sleep resources for network readiness and OpenAI provisioning, and create private endpoint for Azure OpenAI service
* fix: adjust dependency declaration for OpenAI provisioning wait and add triggers for readiness check
* fix: update AZURE_ENV_NAME to handle pull request context
* feat: add diagnostic settings for Azure resources to enhance monitoring and logging capabilities
* fix: remove unused azd_env_seed variable from naming configuration
* fix: update naming conventions for Power Platform NSGs in diagnostic settings
* fix: update comment for Azure AI Search Service diagnostic settings and remove unused categories
1 parent 670ed2a commit 2123669

12 files changed: 615 additions, 130 deletions

docs/app_registration_setup.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ To enable secure automation and integration with Azure and Power Platform, you n
 1. Login to your Power Platform:

    ```shell
-   pac auth create
+   pac auth create --deviceCode
    ```

 1. Create new **App Registration**:

docs/cicd.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ All infrastructure for CI/CD lives under `cicd/` and can be customized to meet y

 ## Prerequisites

-- Working local environment of this template. If you do not have one, Follow the step by step instructions for setting up your [**Local Environment**](../README.md#local-environment)
+- Working local environment of this template. If you do not have one, Follow the step by step instructions for setting up your [**Local Environment**](../README.md#local-environment).
 - An Azure subscription with either User Access Administrator or Owner permissions to create workload identity resources like service principal, and OIDC to be used by the GitHub Actions.
 - GitHub CLI (`gh`) installed and authenticated to trigger the bootstrap workflow from your terminal.

docs/troubleshooting.md

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
+# Troubleshooting tips
+
+## Quota error during deployment
+
+If you see an InsufficientQuota error mentioning "Tokens Per Minute", the requested `scale.capacity` (thousands of TPM) exceeds your subscription's available quota — lower `scale.capacity` in TFVARS or request a quota increase in the Azure portal.
+
+## Private endpoint fails with AccountProvisioningStateInvalid
+
+This occurs when Terraform tries to create the private endpoint before the Azure OpenAI (Cognitive Services) account leaves the `Accepted` state; wait until the resource shows `Succeeded` (portal or `az resource show`) and re-run the provisioning (`azd provision`).
+
+## Use GitHub Copilot to help troubleshoot
+
+If you're unsure how to fix a deployment error, open the relevant files in VS Code and use GitHub Copilot for suggestions. Copilot can propose TFVARS overrides, sample values, terraform plan snippets, or concise support-request wording; always review and test generated suggestions before applying them.
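For the quota error covered above, the usual fix is to request less capacity per model deployment. Below is a minimal, hypothetical `*.tfvars` sketch; the `cognitive_deployments` variable name, the model details, and the overall shape are illustrative assumptions, so match them to the deployment variable this template actually defines in its `variables.tf`.

```hcl
# Hypothetical tfvars override. The variable name and structure are illustrative,
# not this template's actual schema; scale.capacity is the value that matters here.
cognitive_deployments = {
  chat = {
    name = "gpt-4o"
    model = {
      format  = "OpenAI"
      name    = "gpt-4o"
      version = "2024-05-13"
    }
    scale = {
      type     = "Standard"
      capacity = 10 # thousands of tokens per minute; lower this to stay within quota
    }
  }
}
```

Capacity is expressed in thousands of TPM, so `capacity = 10` requests 10,000 tokens per minute; pick a value at or below the unused quota shown for the region in the Azure portal.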
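The AccountProvisioningStateInvalid failure described above is also what the `infra/main.ai.tf` change in this commit works around: a `time_sleep` resource sits between the Azure OpenAI module and the private endpoint so the account can reach `Succeeded` before the endpoint is created. A condensed sketch of that ordering pattern follows; the resource names and the 60-second delay are taken from the diff further down, and the duration should be treated as tunable rather than definitive.

```hcl
# Delay private endpoint creation until the Cognitive Services account has had
# time to finish provisioning (condensed from infra/main.ai.tf in this commit).
resource "time_sleep" "wait_for_openai_provisioning" {
  depends_on      = [module.azure_open_ai]
  create_duration = "60s"

  # Recreate the wait whenever the OpenAI account ID changes.
  triggers = {
    openai_id = module.azure_open_ai.resource.id
  }
}

resource "azurerm_private_endpoint" "openai_pe" {
  name                = "pe-${azurecaf_name.main_names.results["azurerm_cognitive_account"]}"
  location            = local.primary_azure_region
  resource_group_name = local.resource_group_name
  subnet_id           = local.pe_primary_subnet_id

  private_service_connection {
    name                           = "pe_endpoint_connection"
    private_connection_resource_id = module.azure_open_ai.resource.id
    subresource_names              = ["account"]
    is_manual_connection           = false
  }

  # The explicit dependency is what enforces the ordering.
  depends_on = [time_sleep.wait_for_openai_provisioning]
}
```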

infra/main.ai.tf

Lines changed: 40 additions & 10 deletions
@@ -1,6 +1,14 @@
 # Copyright (c) Microsoft Corporation.
 # Licensed under the MIT license.

+# Wait for network infrastructure to be ready
+resource "time_sleep" "wait_for_network_ready" {
+  depends_on = [
+    module.copilot_studio
+  ]
+  create_duration = "30s"
+}
+
 module "azure_open_ai" {
   # checkov:skip=CKV2_AZURE_22: Customer-managed keys should be added in production usage but are not included here for simplicity.
   # checkov:skip=CKV_AZURE_236: The Power Platform AI Search connector only supports service principal, API key, or interactive auth.
@@ -32,19 +40,41 @@ module "azure_open_ai" {
     ]
   }

-  private_endpoints = {
-    pe_endpoint = {
-      name                            = "pe-${azurecaf_name.main_names.results["azurerm_cognitive_account"]}"
-      private_service_connection_name = "pe_endpoint_connection"
-      subnet_resource_id              = local.pe_primary_subnet_id
-    }
-  }
   managed_identities = {
     system_assigned = true
   }
   tags = var.tags

-  depends_on = [module.copilot_studio]
+  depends_on = [time_sleep.wait_for_network_ready]
+}
+
+# Wait for Azure OpenAI service to be fully provisioned
+resource "time_sleep" "wait_for_openai_provisioning" {
+  depends_on      = [module.azure_open_ai]
+  create_duration = "60s"
+
+  # Ensure the OpenAI service is in a ready state before proceeding with private endpoint creation
+  triggers = {
+    openai_id = module.azure_open_ai.resource.id
+  }
+}
+
+# Create private endpoint separately to ensure OpenAI service is fully ready
+resource "azurerm_private_endpoint" "openai_pe" {
+  name                = "pe-${azurecaf_name.main_names.results["azurerm_cognitive_account"]}"
+  location            = local.primary_azure_region
+  resource_group_name = local.resource_group_name
+  subnet_id           = local.pe_primary_subnet_id
+  tags                = var.tags
+
+  private_service_connection {
+    name                           = "pe_endpoint_connection"
+    private_connection_resource_id = module.azure_open_ai.resource.id
+    subresource_names              = ["account"]
+    is_manual_connection           = false
+  }
+
+  depends_on = [time_sleep.wait_for_openai_provisioning]
 }

 # Private DNS zone for Azure OpenAI private endpoint resolution
@@ -69,12 +99,12 @@ resource "azurerm_private_dns_zone_virtual_network_link" "aoai_dns_links" {
 }

 # DNS A record for Azure OpenAI private endpoint
-# The module creates the private endpoint, so we reference it from the module outputs
+# Reference the separately created private endpoint
 resource "azurerm_private_dns_a_record" "aoai_dns_record" {
   name                = module.azure_open_ai.resource.name
   zone_name           = azurerm_private_dns_zone.aoai_dns.name
   resource_group_name = local.resource_group_name
   ttl                 = 10
-  records             = [module.azure_open_ai.private_endpoints["pe_endpoint"].private_service_connection[0].private_ip_address]
+  records             = [azurerm_private_endpoint.openai_pe.private_service_connection[0].private_ip_address]
   tags                = var.tags
 }

infra/main.app_insights.tf

Lines changed: 14 additions & 0 deletions
@@ -3,13 +3,27 @@

 resource "random_uuid" "uid" {}

+resource "azurerm_log_analytics_workspace" "monitoring" {
+  count = var.include_log_analytics ? 1 : 0
+
+  daily_quota_gb      = -1
+  location            = local.primary_azure_region
+  name                = azurecaf_name.main_names.results["azurerm_log_analytics_workspace"]
+  resource_group_name = local.resource_group_name
+  retention_in_days   = var.log_analytics_retention_in_days
+  sku                 = "PerGB2018"
+  tags                = var.tags
+}
+
 resource "azurerm_application_insights" "insights" {
   count = var.include_app_insights ? 1 : 0

   application_type    = "web"
   location            = local.primary_azure_region
   name                = "${var.resource_prefix}-appinsights-${var.resource_suffix}"
   resource_group_name = local.resource_group_name
+  workspace_id        = var.include_log_analytics ? azurerm_log_analytics_workspace.monitoring[0].id : null
+  tags                = var.tags
 }

 resource "azurerm_application_insights_workbook" "workbook" {