Commit ddb7de7

Merge of 2 parents: 679f875 + a6b05ea

29 files changed: 3,568 additions & 76 deletions

.github/workflows/test.yml

Lines changed: 42 additions & 44 deletions
@@ -6,6 +6,7 @@ on:
       - main
       - dev
       - demo
+      - psl-backend-unittest
   pull_request:
     types:
       - opened
@@ -18,49 +19,48 @@ on:
       - demo
 
 jobs:
-  # frontend_tests:
-  #   runs-on: ubuntu-latest
-
-  #   steps:
-  #     - name: Checkout code
-  #       uses: actions/checkout@v3
-
-  #     - name: Set up Node.js
-  #       uses: actions/setup-node@v3
-  #       with:
-  #         node-version: '20'
-
-  #     - name: Check if Frontend Test Files Exist
-  #       id: check_frontend_tests
-  #       run: |
-  #         if [ -z "$(find App/frontend/src -type f -name '*.test.js' -o -name '*.test.ts' -o -name '*.test.tsx')" ]; then
-  #           echo "No frontend test files found, skipping frontend tests."
-  #           echo "skip_frontend_tests=true" >> $GITHUB_ENV
-  #         else
-  #           echo "Frontend test files found, running tests."
-  #           echo "skip_frontend_tests=false" >> $GITHUB_ENV
-  #         fi
-
-  #     - name: Install Frontend Dependencies
-  #       if: env.skip_frontend_tests == 'false'
-  #       run: |
-  #         cd App/frontend
-  #         npm install
-
-  #     - name: Run Frontend Tests with Coverage
-  #       if: env.skip_frontend_tests == 'false'
-  #       run: |
-  #         cd App/frontend
-  #         npm run test -- --coverage
-
-  #     - name: Skip Frontend Tests
-  #       if: env.skip_frontend_tests == 'true'
-  #       run: |
-  #         echo "Skipping frontend tests because no test files were found."
+  # frontend_tests:
+  #   runs-on: ubuntu-latest
+
+  #   steps:
+  #     - name: Checkout code
+  #       uses: actions/checkout@v3
+
+  #     - name: Set up Node.js
+  #       uses: actions/setup-node@v3
+  #       with:
+  #         node-version: '20'
+
+  #     - name: Check if Frontend Test Files Exist
+  #       id: check_frontend_tests
+  #       run: |
+  #         if [ -z "$(find App/frontend/src -type f -name '*.test.js' -o -name '*.test.ts' -o -name '*.test.tsx')" ]; then
+  #           echo "No frontend test files found, skipping frontend tests."
+  #           echo "skip_frontend_tests=true" >> $GITHUB_ENV
+  #         else
+  #           echo "Frontend test files found, running tests."
+  #           echo "skip_frontend_tests=false" >> $GITHUB_ENV
+  #         fi
+
+  #     - name: Install Frontend Dependencies
+  #       if: env.skip_frontend_tests == 'false'
+  #       run: |
+  #         cd App/frontend
+  #         npm install
+
+  #     - name: Run Frontend Tests with Coverage
+  #       if: env.skip_frontend_tests == 'false'
+  #       run: |
+  #         cd App/frontend
+  #         npm run test -- --coverage
+
+  #     - name: Skip Frontend Tests
+  #       if: env.skip_frontend_tests == 'true'
+  #       run: |
+  #         echo "Skipping frontend tests because no test files were found."
 
   backend_tests:
     runs-on: ubuntu-latest
-
 
     steps:
       - name: Checkout code
@@ -69,7 +69,7 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v4
         with:
-          python-version: '3.11'
+          python-version: "3.11"
 
       - name: Install Backend Dependencies
        run: |
@@ -81,7 +81,7 @@ jobs:
      - name: Check if Backend Test Files Exist
        id: check_backend_tests
        run: |
-          if [ -z "$(find src/api -type f -name 'test_*.py')" ]; then
+          if [ -z "$(find src/tests/api -type f -name 'test_*.py')" ]; then
          echo "No backend test files found, skipping backend tests."
          echo "skip_backend_tests=true" >> $GITHUB_ENV
        else
@@ -94,8 +94,6 @@ jobs:
       run: |
         pytest --cov=. --cov-report=term-missing --cov-report=xml
 
-
-
      - name: Skip Backend Tests
        if: env.skip_backend_tests == 'true'
        run: |
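The backend job's test-discovery check and coverage run can be reproduced locally. A minimal sketch, assuming the same repository layout and that pytest plus pytest-cov are installed:

```shell
# Mirror the workflow's check: look for backend test files under src/tests/api
if [ -z "$(find src/tests/api -type f -name 'test_*.py')" ]; then
  echo "No backend test files found, skipping backend tests."
else
  # Same coverage invocation the workflow uses
  pytest --cov=. --cov-report=term-missing --cov-report=xml
fi
```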

documents/CustomizingAzdParameters.md

Lines changed: 5 additions & 0 deletions
@@ -45,4 +45,9 @@ Change the Embedding Deployment Capacity (choose a number based on available emb
 
 ```shell
 azd env set AZURE_OPENAI_EMBEDDING_MODEL_CAPACITY 80
+```
+
+Set the Log Analytics Workspace Id if you need to reuse the existing workspace which is already existing
+```shell
+azd env set AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID '/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>'
 ```
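For reference, the full resource ID that this variable expects can usually be looked up with the Azure CLI. A minimal sketch, assuming the workspace already exists; the resource group and workspace names are placeholders:

```shell
# Print the resource ID of an existing Log Analytics workspace (placeholder names)
az monitor log-analytics workspace show \
  --resource-group <resource-group> \
  --workspace-name <workspace-name> \
  --query id --output tsv

# Pass that ID to the deployment before provisioning
azd env set AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID "<resource-id-from-the-command-above>"
```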

documents/DeploymentGuide.md

Lines changed: 2 additions & 0 deletions
@@ -116,6 +116,8 @@ When you start the deployment, most parameters will have **default values**, but
 | **GPT Model Deployment Capacity** | Configure capacity for **GPT models**. | 30k |
 | **Embedding Model** | Default: **text-embedding-ada-002**. | text-embedding-ada-002 |
 | **Embedding Model Capacity** | Set the capacity for **embedding models**. | 80k |
+| **Existing Log analytics workspace** | To reuse the existing Log analytics workspace Id. | |
+
 
 </details>

documents/Fabric_deployment.md

Lines changed: 7 additions & 5 deletions
@@ -3,11 +3,13 @@
 ### How to customize
 
 If you'd like to customize the solution accelerator, here are some ways you might do that:
-- Ingest your own [audio conversation files](./ConversationalDataFormat.md) by uploading them into the `cu_audio_files_all` lakehouse folder and run the data pipeline
-- Deploy with Microsoft Fabric by following the steps in [Fabric_deployment.md](./Fabric_deployment.md)
+1. Ingest your own [audio conversation files](./ConversationalDataFormat.md) by uploading them into the `cu_audio_files_all` lakehouse folder and run the data pipeline
+2. Deploy with Microsoft Fabric by following the steps in [Fabric_deployment.md](./Fabric_deployment.md)
 
 
-3. **Create Fabric workspace**
+3. **Create or Use an Existing Microsoft Fabric Workspace**
+
+   > ℹ️ **Note:** If you already have an existing Microsoft Fabric Workspace, you can skip workspace creation and **continue from Point 5 (Environment Creation)**.
    1. Navigate to ([Fabric Workspace](https://app.fabric.microsoft.com/))
    2. Click on Data Engineering experience
    3. Click on Workspaces from left Navigation
@@ -38,7 +40,7 @@ If you'd like to customize the solution accelerator, here are some ways you migh
 4. ```cd ./Conversation-Knowledge-Mining-Solution-Accelerator/Deployment/scripts/fabric_scripts```
 5. ```sh ./run_fabric_items_scripts.sh keyvault_param workspaceid_param solutionprefix_param```
    1. keyvault_param - the name of the keyvault that was created in Step 1
-   2. workspaceid_param - the workspaceid created in Step 2
+   2. workspaceid_param - Existing workspaceid or workspaceid created in Step 3
    3. solutionprefix_param - prefix used to append to lakehouse upon creation
 5. **Add App Authentication**
 
@@ -53,4 +55,4 @@ All files WAV files can be uploaded in the corresponding Lakehouse in the data/F
 
 ### Post-deployment
 - To process additional files, manually execute the pipeline_notebook after uploading new files.
-- The OpenAI prompt can be modified within the Fabric notebooks.
+- The OpenAI prompt can be modified within the Fabric notebooks.
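For illustration only, a concrete invocation of the script above might look like the sketch below; the key vault name, workspace ID, and prefix are hypothetical values that you would replace with the key vault created in Step 1, your new or existing Fabric workspace ID, and your own prefix:

```shell
cd ./Conversation-Knowledge-Mining-Solution-Accelerator/Deployment/scripts/fabric_scripts

# All three arguments are placeholders for keyvault_param, workspaceid_param, solutionprefix_param
sh ./run_fabric_items_scripts.sh kv-ckm-demo 00000000-0000-0000-0000-000000000000 ckmdemo
```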

infra/deploy_ai_foundry.bicep

Lines changed: 20 additions & 3 deletions
@@ -5,11 +5,14 @@ param keyVaultName string
 param cuLocation string
 param deploymentType string
 param gptModelName string
+param gptModelVersion string
 param azureOpenAIApiVersion string
 param gptDeploymentCapacity int
 param embeddingModel string
 param embeddingDeploymentCapacity int
 param managedIdentityObjectId string
+param existingLogAnalyticsWorkspaceId string = ''
+
 var abbrs = loadJsonContent('./abbreviations.json')
 var storageName = '${solutionName}hubstorage'
 var storageSkuName = 'Standard_LRS'
@@ -32,6 +35,7 @@ var aiModelDeployments = [
       name: deploymentType
       capacity: gptDeploymentCapacity
     }
+    version: gptModelVersion
     raiPolicyName: 'Microsoft.Default'
   }
   {
@@ -47,11 +51,21 @@ var aiModelDeployments = [
 
 var containerRegistryNameCleaned = replace(containerRegistryName, '-', '')
 
+var useExisting = !empty(existingLogAnalyticsWorkspaceId)
+var existingLawSubscription = useExisting ? split(existingLogAnalyticsWorkspaceId, '/')[2] : ''
+var existingLawResourceGroup = useExisting ? split(existingLogAnalyticsWorkspaceId, '/')[4] : ''
+var existingLawName = useExisting ? split(existingLogAnalyticsWorkspaceId, '/')[8] : ''
+
 resource keyVault 'Microsoft.KeyVault/vaults@2022-07-01' existing = {
   name: keyVaultName
 }
 
-resource logAnalytics 'Microsoft.OperationalInsights/workspaces@2023-09-01' = {
+resource existingLogAnalyticsWorkspace 'Microsoft.OperationalInsights/workspaces@2023-09-01' existing = if (useExisting) {
+  name: existingLawName
+  scope: resourceGroup(existingLawSubscription ,existingLawResourceGroup)
+}
+
+resource logAnalytics 'Microsoft.OperationalInsights/workspaces@2023-09-01' = if (!useExisting){
   name: workspaceName
   location: location
   tags: {}
@@ -71,7 +85,7 @@ resource applicationInsights 'Microsoft.Insights/components@2020-02-02' = {
     Application_Type: 'web'
     publicNetworkAccessForIngestion: 'Enabled'
     publicNetworkAccessForQuery: 'Disabled'
-    WorkspaceResourceId: logAnalytics.id
+    WorkspaceResourceId: useExisting ? existingLogAnalyticsWorkspace.id : logAnalytics.id
   }
 }
 
@@ -542,7 +556,10 @@ output aiSearchService string = aiSearch.name
 output aiProjectName string = aiProject.name
 
 output applicationInsightsId string = applicationInsights.id
-output logAnalyticsWorkspaceResourceName string = logAnalytics.name
+output logAnalyticsWorkspaceResourceName string = useExisting ? existingLogAnalyticsWorkspace.name : logAnalytics.name
+output logAnalyticsWorkspaceResourceGroup string = useExisting ? existingLawResourceGroup : resourceGroup().name
+output logAnalyticsWorkspaceSubscription string = useExisting ? existingLawSubscription : subscription().subscriptionId
+
 output storageAccountName string = storageNameCleaned
 
 output azureOpenAIKeyName string = azureOpenAIApiKeyEntry.name
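The `split(existingLogAnalyticsWorkspaceId, '/')[2]`, `[4]`, and `[8]` lookups above rely on the fixed segment layout of an Azure resource ID; because the ID starts with `/`, element 0 of the split is empty. A small shell sketch with a placeholder ID shows which pieces those zero-based indexes select:

```shell
# Placeholder workspace resource ID
id='/subscriptions/11111111-2222-3333-4444-555555555555/resourceGroups/rg-ckm/providers/Microsoft.OperationalInsights/workspaces/law-ckm'

# Split on '/': element 0 is empty, so subscription = 2, resource group = 4, workspace name = 8
IFS='/' read -r -a seg <<< "$id"
echo "subscription:   ${seg[2]}"
echo "resource group: ${seg[4]}"
echo "workspace name: ${seg[8]}"
```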

infra/deploy_post_deployment_scripts.bicep

Lines changed: 3 additions & 1 deletion
@@ -17,11 +17,13 @@ param sqlDbName string
 param sqlUsers array = [
 ]
 param logAnalyticsWorkspaceResourceName string
+param logAnalyticsWorkspaceResourceGroup string
+param logAnalyticsWorkspaceSubscription string
 var resourceGroupName = resourceGroup().name
 
 resource logAnalytics 'Microsoft.OperationalInsights/workspaces@2020-10-01' existing = {
   name: logAnalyticsWorkspaceResourceName
-  scope: resourceGroup()
+  scope: resourceGroup(logAnalyticsWorkspaceSubscription, logAnalyticsWorkspaceResourceGroup)
 }
 
 resource containerAppEnv 'Microsoft.App/managedEnvironments@2022-03-01' = {

infra/main.bicep

Lines changed: 11 additions & 9 deletions
@@ -6,6 +6,9 @@ var abbrs = loadJsonContent('./abbreviations.json')
 @description('A unique prefix for all resources in this deployment. This should be 3-20 characters long:')
 param environmentName string
 
+@description('Optional: Existing Log Analytics Workspace Resource ID')
+param existingLogAnalyticsWorkspaceId string = ''
+
 @minLength(1)
 @description('Location for the Content Understanding service deployment:')
 @allowed(['swedencentral', 'australiaeast'])
@@ -28,18 +31,12 @@ param secondaryLocation string
 ])
 param deploymentType string = 'GlobalStandard'
 
-@minLength(1)
 @description('Name of the GPT model to deploy:')
-@allowed([
-  'gpt-4o-mini'
-  'gpt-4o'
-  'gpt-4'
-])
 param gptModelName string = 'gpt-4o-mini'
 
-// @minLength(1)
-// @description('Version of the GPT model to deploy:')
-// param gptModelVersion string = '2024-02-15-preview' //'2024-08-06'
+@description('Version of the GPT model to deploy:')
+param gptModelVersion string = '2024-07-18'
+
 var azureOpenAIApiVersion = '2025-01-01-preview'
 
 @minValue(10)
@@ -102,11 +99,14 @@ module aifoundry 'deploy_ai_foundry.bicep' = {
     cuLocation: contentUnderstandingLocation
     deploymentType: deploymentType
     gptModelName: gptModelName
+    gptModelVersion: gptModelVersion
     azureOpenAIApiVersion: azureOpenAIApiVersion
     gptDeploymentCapacity: gptDeploymentCapacity
     embeddingModel: embeddingModel
     embeddingDeploymentCapacity: embeddingDeploymentCapacity
     managedIdentityObjectId: managedIdentityModule.outputs.managedIdentityOutput.objectId
+    existingLogAnalyticsWorkspaceId: existingLogAnalyticsWorkspaceId
+
   }
   scope: resourceGroup(resourceGroup().name)
 }
@@ -168,6 +168,8 @@ module uploadFiles 'deploy_post_deployment_scripts.bicep' = {
     managedIdentityClientId:managedIdentityModule.outputs.managedIdentityOutput.clientId
     keyVaultName:aifoundry.outputs.keyvaultName
     logAnalyticsWorkspaceResourceName: aifoundry.outputs.logAnalyticsWorkspaceResourceName
+    logAnalyticsWorkspaceResourceGroup: aifoundry.outputs.logAnalyticsWorkspaceResourceGroup
+    logAnalyticsWorkspaceSubscription: aifoundry.outputs.logAnalyticsWorkspaceSubscription
     sqlServerName: sqlDBModule.outputs.sqlServerName
     sqlDbName: sqlDBModule.outputs.sqlDbName
     sqlUsers: [
infra/main.bicepparam

Lines changed: 2 additions & 0 deletions
@@ -6,6 +6,8 @@ param contentUnderstandingLocation = readEnvironmentVariable('AZURE_CONTENT_UNDE
 param secondaryLocation = readEnvironmentVariable('AZURE_SECONDARY_LOCATION', 'eastus2')
 param deploymentType = readEnvironmentVariable('AZURE_OPEN_AI_MODEL_DEPLOYMENT_TYPE', 'GlobalStandard')
 param gptModelName = readEnvironmentVariable('AZURE_OPEN_AI_DEPLOYMENT_MODEL', 'gpt-4o-mini')
+param gptModelVersion = readEnvironmentVariable('AZURE_ENV_MODEL_VERSION', '2024-07-18')
 param gptDeploymentCapacity = int(readEnvironmentVariable('AZURE_OPEN_AI_DEPLOYMENT_MODEL_CAPACITY', '30'))
 param embeddingModel = readEnvironmentVariable('AZURE_OPENAI_EMBEDDING_MODEL', 'text-embedding-ada-002')
 param embeddingDeploymentCapacity = int(readEnvironmentVariable('AZURE_OPENAI_EMBEDDING_MODEL_CAPACITY', '80'))
+param existingLogAnalyticsWorkspaceId = readEnvironmentVariable('AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID', '')
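Both new parameters fall back to their defaults when the corresponding environment variables are unset, so existing azd environments keep working. To override them, the variables can be set before provisioning; the values below are placeholders:

```shell
# Placeholders: pick the model version you deploy and the workspace you want to reuse
azd env set AZURE_ENV_MODEL_VERSION '2024-07-18'
azd env set AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID '/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.OperationalInsights/workspaces/<workspace-name>'
azd up
```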
