
Commit dfff049

Authored by Prasanjeet-Microsoft, Roopan-Microsoft, AjitPadhi-Microsoft, Pavan-Microsoft, and ross-p-smith
chore: merge dev into main (features, bug fixes, infra updates) (#1820)
Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: Roopan-Microsoft <[email protected]>
Co-authored-by: Ajit Padhi <[email protected]>
Co-authored-by: Roopan P M <[email protected]>
Co-authored-by: Pavan-Microsoft <[email protected]>
Co-authored-by: Ross Smith <[email protected]>
Co-authored-by: gpickett <[email protected]>
Co-authored-by: Francia Riesco <[email protected]>
Co-authored-by: Francia Riesco <[email protected]>
Co-authored-by: Prajwal D C <[email protected]>
Co-authored-by: Harmanpreet-Microsoft <[email protected]>
Co-authored-by: UtkarshMishra-Microsoft <[email protected]>
Co-authored-by: Priyanka-Microsoft <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Kiran-Siluveru-Microsoft <[email protected]>
Co-authored-by: Prashant-Microsoft <[email protected]>
Co-authored-by: Rohini-Microsoft <[email protected]>
Co-authored-by: Avijit-Microsoft <[email protected]>
Co-authored-by: RaviKiran-Microsoft <[email protected]>
Co-authored-by: Somesh Joshi <[email protected]>
Co-authored-by: Himanshi Agrawal <[email protected]>
Co-authored-by: pradeepjha-microsoft <[email protected]>
Co-authored-by: Harmanpreet Kaur <[email protected]>
Co-authored-by: Bangarraju-Microsoft <[email protected]>
Co-authored-by: Harsh-Microsoft <[email protected]>
Co-authored-by: Kanchan-Microsoft <[email protected]>
Co-authored-by: Cristopher Coronado <[email protected]>
Co-authored-by: Cristopher Coronado Moreira <[email protected]>
Co-authored-by: Vamshi-Microsoft <[email protected]>
1 parent 04d970c commit dfff049

33 files changed: +1339 −7 lines

.env.sample

Lines changed: 1 addition & 0 deletions
@@ -64,6 +64,7 @@ AZURE_KEY_VAULT_ENDPOINT=
 CONVERSATION_FLOW=
 # Chat History CosmosDB Integration Settings
 AZURE_COSMOSDB_ACCOUNT_NAME=
+AZURE_COSMOSDB_ACCOUNT_KEY=
 AZURE_COSMOSDB_DATABASE_NAME=
 AZURE_COSMOSDB_CONVERSATIONS_CONTAINER_NAME=
 AZURE_COSMOSDB_ENABLE_FEEDBACK=
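
The new AZURE_COSMOSDB_ACCOUNT_KEY setting joins the existing chat-history CosmosDB values. As an illustration only (the accelerator's own env helper is not part of this diff), a Python consumer of these variables might look like the sketch below; the azure-cosmos usage and the *.documents.azure.com endpoint format are assumptions, not code from this commit.

    # Illustrative sketch (not from this commit): reading the chat-history
    # CosmosDB settings listed in .env.sample. Assumes the azure-cosmos package
    # and the standard https://<account>.documents.azure.com endpoint format.
    import os
    from azure.cosmos import CosmosClient

    account_name = os.environ["AZURE_COSMOSDB_ACCOUNT_NAME"]
    account_key = os.environ.get("AZURE_COSMOSDB_ACCOUNT_KEY", "")  # new in this change
    database_name = os.environ["AZURE_COSMOSDB_DATABASE_NAME"]
    container_name = os.environ["AZURE_COSMOSDB_CONVERSATIONS_CONTAINER_NAME"]

    client = CosmosClient(
        f"https://{account_name}.documents.azure.com:443/", credential=account_key
    )
    container = client.get_database_client(database_name).get_container_client(container_name)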

.github/workflows/ci.yml

Lines changed: 1 addition & 1 deletion
@@ -103,7 +103,7 @@ jobs:
       imageName: ghcr.io/azure-samples/chat-with-your-data-solution-accelerator
       cacheFrom: ghcr.io/azure-samples/chat-with-your-data-solution-accelerator
       imageTag: ${{ env.imageTag }}
-      runCmd: make ci && make deploy
+      runCmd: export optional_args="./code/tests" && make ci && make deploy
       refFilterForPush: refs/heads/${{ github.event_name == 'schedule' && 'main' || github.ref_name }}
       env: |
         AZURE_CLIENT_ID
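
The exported optional_args value scopes the CI test run to ./code/tests. Assuming the Makefile's ci target forwards $optional_args to pytest (an assumption; the Makefile is not part of this diff), the container now effectively runs something like:

    # Hypothetical equivalent of the narrowed test invocation inside `make ci`,
    # assuming optional_args is forwarded to pytest (Makefile not shown here).
    import sys
    import pytest

    sys.exit(pytest.main(["./code/tests"]))  # collect only the unit-test tree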

.github/workflows/test-automation.yml

Lines changed: 106 additions & 0 deletions
@@ -0,0 +1,106 @@
+name: Test Automation CWYD
+
+on:
+  push:
+    branches:
+      - main
+      - dev
+    paths:
+      - 'tests/e2e-test/**'
+  schedule:
+    - cron: '0 13 * * 3'  # Runs at 1 PM UTC once a week on Wednesday
+  workflow_dispatch:
+
+env:
+  web_url: ${{ vars.CWYD_WEB_URL }}
+  admin_url: ${{ vars.CWYD_ADMIN_URL }}
+  accelerator_name: "Chat with your Data"
+
+jobs:
+  test:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: '3.13'
+
+      - name: Install dependencies
+        run: |
+          python -m pip install --upgrade pip
+          pip install -r tests/e2e-test/requirements.txt
+
+      - name: Ensure browsers are installed
+        run: python -m playwright install --with-deps chromium
+
+      - name: Run tests(1)
+        id: test1
+        run: |
+          xvfb-run pytest --headed --html=report/report.html --self-contained-html
+        working-directory: tests/e2e-test
+        continue-on-error: true
+
+      - name: Sleep for 30 seconds
+        if: ${{ steps.test1.outcome == 'failure' }}
+        run: sleep 30s
+        shell: bash
+
+      - name: Run tests(2)
+        id: test2
+        if: ${{ steps.test1.outcome == 'failure' }}
+        run: |
+          xvfb-run pytest --headed --html=report/report.html --self-contained-html
+        working-directory: tests/e2e-test
+        continue-on-error: true
+
+      - name: Sleep for 60 seconds
+        if: ${{ steps.test2.outcome == 'failure' }}
+        run: sleep 60s
+        shell: bash
+
+      - name: Run tests(3)
+        id: test3
+        if: ${{ steps.test2.outcome == 'failure' }}
+        run: |
+          xvfb-run pytest --headed --html=report/report.html --self-contained-html
+        working-directory: tests/e2e-test
+
+      - name: Upload test report
+        id: upload_report
+        uses: actions/upload-artifact@v4
+        if: ${{ !cancelled() }}
+        with:
+          name: cwyd-test-report
+          path: tests/e2e-test/report/*
+
+      - name: Send Notification
+        if: always()
+        run: |
+          RUN_URL="https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}"
+          REPORT_URL=${{ steps.upload_report.outputs.artifact-url }}
+          IS_SUCCESS=${{ steps.test1.outcome == 'success' || steps.test2.outcome == 'success' || steps.test3.outcome == 'success' }}
+          # Construct the email body
+          if [ "$IS_SUCCESS" = "true" ]; then
+            EMAIL_BODY=$(cat <<EOF
+          {
+            "body": "<p>Dear Team,</p><p>We would like to inform you that the ${{ env.accelerator_name }} Test Automation process has completed successfully.</p><p><strong>Run URL:</strong> <a href=\"${RUN_URL}\">${RUN_URL}</a><br></p><p><strong>Test Report:</strong> <a href=\"${REPORT_URL}\">${REPORT_URL}</a></p><p>Best regards,<br>Your Automation Team</p>",
+            "subject": "${{ env.accelerator_name }} Test Automation - Success"
+          }
+          EOF
+          )
+          else
+            EMAIL_BODY=$(cat <<EOF
+          {
+            "body": "<p>Dear Team,</p><p>We would like to inform you that the ${{ env.accelerator_name }} Test Automation process has encountered an issue and has failed to complete successfully.</p><p><strong>Run URL:</strong> <a href=\"${RUN_URL}\">${RUN_URL}</a><br> ${OUTPUT}</p><p><strong>Test Report:</strong> <a href=\"${REPORT_URL}\">${REPORT_URL}</a></p><p>Please investigate the matter at your earliest convenience.</p><p>Best regards,<br>Your Automation Team</p>",
+            "subject": "${{ env.accelerator_name }} Test Automation - Failure"
+          }
+          EOF
+          )
+          fi
+
+          # Send the notification
+          curl -X POST "${{ secrets.EMAILNOTIFICATION_LOGICAPP_URL_TA }}" \
+            -H "Content-Type: application/json" \
+            -d "$EMAIL_BODY" || echo "Failed to send notification"

.github/workflows/tests.yml

Lines changed: 1 addition & 1 deletion
@@ -78,7 +78,7 @@ jobs:
 
           echo "MIN_COVERAGE=$MIN_COVERAGE" >> "$GITHUB_OUTPUT"
       - name: Run Python Tests
-        run: make python-test optional_args="--junitxml=coverage-junit.xml --cov=. --cov-report xml:coverage.xml --cov-fail-under ${{ steps.coverage-value.outputs.MIN_COVERAGE }}"
+        run: make python-test optional_args="--junitxml=coverage-junit.xml --cov=. --cov-report xml:coverage.xml --cov-fail-under ${{ steps.coverage-value.outputs.MIN_COVERAGE }} ./code/tests"
       - uses: actions/upload-artifact@v4
         if: ${{ !cancelled() }}
         with:
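
As with the ci.yml change above, ./code/tests is appended to optional_args so pytest collects only the unit-test tree, while the coverage gate is kept. Under the same assumption that make python-test forwards optional_args to pytest, the step is roughly equivalent to the sketch below; the "80" is a placeholder for the MIN_COVERAGE step output.

    # Rough equivalent of the updated "Run Python Tests" step; assumes the
    # Makefile forwards optional_args to pytest and that pytest-cov is installed.
    import sys
    import pytest

    min_coverage = "80"  # placeholder for steps.coverage-value.outputs.MIN_COVERAGE

    sys.exit(pytest.main([
        "--junitxml=coverage-junit.xml",
        "--cov=.",
        "--cov-report", "xml:coverage.xml",
        "--cov-fail-under", min_coverage,
        "./code/tests",  # newly appended: restrict collection to the unit tests
    ]))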

.gitignore

Lines changed: 2 additions & 0 deletions
@@ -418,6 +418,8 @@ temp/
 # so that Azure App Service can install the dependencies
 requirements.txt
 !infra/prompt-flow/cwyd/requirements.txt
+!tests/e2e-test/requirements.txt
+!tests/llm-evaluator/requirements.txt
 
 # Cypress UI tests screenshots folder
 tests/integration/ui/cypress/screenshots/

code/backend/pages/01_Ingest_Data.py

Lines changed: 5 additions & 0 deletions
@@ -64,6 +64,11 @@ def sanitize_metadata_value(value):
 
 
 def add_url_embeddings(urls: list[str]):
+    has_valid_url = bool(list(filter(str.strip, urls)))
+    if not has_valid_url:
+        st.error("Please enter at least one valid URL.")
+        return
+
     params = {}
     if env_helper.FUNCTION_KEY is not None:
         params["code"] = env_helper.FUNCTION_KEY

infra/main.bicep

Lines changed: 6 additions & 2 deletions
@@ -1380,8 +1380,11 @@ var azureOpenAIEmbeddingModelInfo = string({
 
 var azureCosmosDBInfo = string({
   account_name: databaseType == 'CosmosDB' ? cosmosDBModule.outputs.cosmosOutput.cosmosAccountName : ''
+  account_key: databaseType == 'CosmosDB' && useKeyVault ? storekeys.outputs.COSMOS_ACCOUNT_KEY_NAME : ''
   database_name: databaseType == 'CosmosDB' ? cosmosDBModule.outputs.cosmosOutput.cosmosDatabaseName : ''
-  container_name: databaseType == 'CosmosDB' ? cosmosDBModule.outputs.cosmosOutput.cosmosContainerName : ''
+  conversations_container_name: databaseType == 'CosmosDB'
+    ? cosmosDBModule.outputs.cosmosOutput.cosmosContainerName
+    : ''
 })
 
 var azurePostgresDBInfo = string({

@@ -1448,7 +1451,7 @@ var azureOpenaiConfigurationInfo = string({
   max_tokens: azureOpenAIMaxTokens
   top_p: azureOpenAITopP
   temperature: azureOpenAITemperature
-  version: azureOpenAIApiVersion
+  api_version: azureOpenAIApiVersion
   resource: azureOpenAIResourceName
   api_key: useKeyVault ? storekeys.outputs.OPENAI_KEY_NAME : ''
 })

@@ -1499,5 +1502,6 @@ output AZURE_ML_WORKSPACE_NAME string = orchestrationStrategy == 'prompt_flow'
 output RESOURCE_TOKEN string = resourceToken
 output AZURE_COSMOSDB_INFO string = azureCosmosDBInfo
 output AZURE_POSTGRESQL_INFO string = azurePostgresDBInfo
+output DATABASE_TYPE string = databaseType
 output OPEN_AI_FUNCTIONS_SYSTEM_PROMPT string = openAIFunctionsSystemPrompt
 output SEMENTIC_KERNEL_SYSTEM_PROMPT string = semanticKernelSystemPrompt
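
AZURE_COSMOSDB_INFO is emitted as a JSON-encoded string, so downstream consumers see the new account_key entry (the Key Vault secret name when useKeyVault is enabled, otherwise empty) and the renamed conversations_container_name key once they parse it. A small sketch with an illustrative value, not one from a real deployment:

    # Sketch of parsing the AZURE_COSMOSDB_INFO deployment output, which the
    # template serializes as a JSON string. The sample value below is illustrative.
    import json

    azure_cosmosdb_info = (
        '{"account_name": "my-cosmos-account",'
        ' "account_key": "AZURE-COSMOSDB-ACCOUNT-KEY",'
        ' "database_name": "my-cosmos-database",'
        ' "conversations_container_name": "conversations"}'
    )

    info = json.loads(azure_cosmosdb_info)
    print(info["account_key"])                   # new: Key Vault secret name or ""
    print(info["conversations_container_name"])  # renamed from "container_name"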

infra/main.json

Lines changed: 7 additions & 3 deletions
@@ -5,7 +5,7 @@
     "_generator": {
       "name": "bicep",
       "version": "0.35.1.17967",
-      "templateHash": "4060171277545073989"
+      "templateHash": "6933035325950046645"
     }
   },
   "parameters": {

@@ -12202,7 +12202,7 @@
     },
     "AZURE_OPENAI_CONFIGURATION_INFO": {
       "type": "string",
-      "value": "[string(createObject('service_name', parameters('speechServiceName'), 'stream', parameters('azureOpenAIStream'), 'system_message', parameters('azureOpenAISystemMessage'), 'stop_sequence', parameters('azureOpenAIStopSequence'), 'max_tokens', parameters('azureOpenAIMaxTokens'), 'top_p', parameters('azureOpenAITopP'), 'temperature', parameters('azureOpenAITemperature'), 'version', parameters('azureOpenAIApiVersion'), 'resource', parameters('azureOpenAIResourceName'), 'api_key', if(parameters('useKeyVault'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'storekeys'), '2022-09-01').outputs.OPENAI_KEY_NAME.value, '')))]"
+      "value": "[string(createObject('service_name', parameters('speechServiceName'), 'stream', parameters('azureOpenAIStream'), 'system_message', parameters('azureOpenAISystemMessage'), 'stop_sequence', parameters('azureOpenAIStopSequence'), 'max_tokens', parameters('azureOpenAIMaxTokens'), 'top_p', parameters('azureOpenAITopP'), 'temperature', parameters('azureOpenAITemperature'), 'api_version', parameters('azureOpenAIApiVersion'), 'resource', parameters('azureOpenAIResourceName'), 'api_key', if(parameters('useKeyVault'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'storekeys'), '2022-09-01').outputs.OPENAI_KEY_NAME.value, '')))]"
     },
     "AZURE_OPENAI_EMBEDDING_MODEL_INFO": {
       "type": "string",

@@ -12278,12 +12278,16 @@
     },
     "AZURE_COSMOSDB_INFO": {
       "type": "string",
-      "value": "[string(createObject('account_name', if(equals(parameters('databaseType'), 'CosmosDB'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'deploy_cosmos_db'), '2022-09-01').outputs.cosmosOutput.value.cosmosAccountName, ''), 'database_name', if(equals(parameters('databaseType'), 'CosmosDB'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'deploy_cosmos_db'), '2022-09-01').outputs.cosmosOutput.value.cosmosDatabaseName, ''), 'container_name', if(equals(parameters('databaseType'), 'CosmosDB'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'deploy_cosmos_db'), '2022-09-01').outputs.cosmosOutput.value.cosmosContainerName, '')))]"
+      "value": "[string(createObject('account_name', if(equals(parameters('databaseType'), 'CosmosDB'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'deploy_cosmos_db'), '2022-09-01').outputs.cosmosOutput.value.cosmosAccountName, ''), 'account_key', if(and(equals(parameters('databaseType'), 'CosmosDB'), parameters('useKeyVault')), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'storekeys'), '2022-09-01').outputs.COSMOS_ACCOUNT_KEY_NAME.value, ''), 'database_name', if(equals(parameters('databaseType'), 'CosmosDB'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'deploy_cosmos_db'), '2022-09-01').outputs.cosmosOutput.value.cosmosDatabaseName, ''), 'conversations_container_name', if(equals(parameters('databaseType'), 'CosmosDB'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'deploy_cosmos_db'), '2022-09-01').outputs.cosmosOutput.value.cosmosContainerName, '')))]"
     },
     "AZURE_POSTGRESQL_INFO": {
       "type": "string",
       "value": "[string(createObject('host_name', if(equals(parameters('databaseType'), 'PostgreSQL'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'deploy_postgres_sql'), '2022-09-01').outputs.postgresDbOutput.value.postgreSQLServerName, ''), 'database_name', if(equals(parameters('databaseType'), 'PostgreSQL'), reference(extensionResourceId(format('/subscriptions/{0}/resourceGroups/{1}', subscription().subscriptionId, variables('rgName')), 'Microsoft.Resources/deployments', 'deploy_postgres_sql'), '2022-09-01').outputs.postgresDbOutput.value.postgreSQLDatabaseName, ''), 'user', ''))]"
     },
+    "DATABASE_TYPE": {
+      "type": "string",
+      "value": "[parameters('databaseType')]"
+    },
     "OPEN_AI_FUNCTIONS_SYSTEM_PROMPT": {
       "type": "string",
       "value": "[variables('openAIFunctionsSystemPrompt')]"

tests/e2e-test/.gitignore

Lines changed: 168 additions & 0 deletions
@@ -0,0 +1,168 @@
+# Byte-compiled / optimized / DLL files
+__pycache__/
+*.py[cod]
+*$py.class
+
+# C extensions
+*.so
+
+# Distribution / packaging
+.Python
+build/
+develop-eggs/
+dist/
+downloads/
+eggs/
+.eggs/
+lib/
+lib64/
+parts/
+sdist/
+var/
+wheels/
+share/python-wheels/
+*.egg-info/
+.installed.cfg
+*.egg
+MANIFEST
+
+# PyInstaller
+# Usually these files are written by a python script from a template
+# before PyInstaller builds the exe, so as to inject date/other infos into it.
+*.manifest
+*.spec
+
+# Installer logs
+pip-log.txt
+pip-delete-this-directory.txt
+
+# Unit test / coverage reports
+htmlcov/
+.tox/
+.nox/
+.coverage
+.coverage.*
+.cache
+nosetests.xml
+coverage.xml
+*.cover
+*.py,cover
+.hypothesis/
+.pytest_cache/
+cover/
+report.html
+
+# Translations
+*.mo
+*.pot
+
+# Django stuff:
+*.log
+local_settings.py
+db.sqlite3
+db.sqlite3-journal
+
+# Flask stuff:
+instance/
+.webassets-cache
+
+# Scrapy stuff:
+.scrapy
+
+# Sphinx documentation
+docs/_build/
+
+# PyBuilder
+.pybuilder/
+target/
+
+# Jupyter Notebook
+.ipynb_checkpoints
+
+# IPython
+profile_default/
+ipython_config.py
+
+# pyenv
+# For a library or package, you might want to ignore these files since the code is
+# intended to run in multiple environments; otherwise, check them in:
+# .python-version
+
+# pipenv
+# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
+# However, in case of collaboration, if having platform-specific dependencies or dependencies
+# having no cross-platform support, pipenv may install dependencies that don't work, or not
+# install all needed dependencies.
+#Pipfile.lock
+
+# poetry
+# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
+# This is especially recommended for binary packages to ensure reproducibility, and is more
+# commonly ignored for libraries.
+# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
+#poetry.lock
+
+# pdm
+# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
+#pdm.lock
+# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
+# in version control.
+# https://pdm.fming.dev/latest/usage/project/#working-with-version-control
+.pdm.toml
+.pdm-python
+.pdm-build/
+
+# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
+__pypackages__/
+
+# Celery stuff
+celerybeat-schedule
+celerybeat.pid
+
+# SageMath parsed files
+*.sage.py
+
+# Environments
+.env
+.venv
+env/
+venv/
+ENV/
+env.bak/
+venv.bak/
+microsoft/
+
+# Spyder project settings
+.spyderproject
+.spyproject
+
+# Rope project settings
+.ropeproject
+
+# mkdocs documentation
+/site
+
+# mypy
+.mypy_cache/
+.dmypy.json
+dmypy.json
+
+# Pyre type checker
+.pyre/
+
+# pytype static type analyzer
+.pytype/
+
+# Cython debug symbols
+cython_debug/
+
+# PyCharm
+# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
+# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
+# and can be added to the global gitignore or merged into this file. For a more nuclear
+# option (not recommended) you can uncomment the following to ignore the entire idea folder.
+.idea/
+archive/
+report/
+screenshots/
+videos/
