4 changes: 2 additions & 2 deletions .github/workflows/CAdeploy.yml
@@ -10,8 +10,8 @@ on:
- cron: '0 6,18 * * *' # Runs at 6:00 AM and 6:00 PM GMT

env:
GPT_MIN_CAPACITY: 250
TEXT_EMBEDDING_MIN_CAPACITY: 40
GPT_MIN_CAPACITY: 200
TEXT_EMBEDDING_MIN_CAPACITY: 80
BRANCH_NAME: ${{ github.head_ref || github.ref_name }}

jobs:
3 changes: 2 additions & 1 deletion docs/CustomizingAzdParameters.md
@@ -19,9 +19,10 @@ By default this template will use the environment name as the prefix to prevent
| `AZURE_ENV_EMBEDDING_MODEL_CAPACITY` | integer | `80` | Set the capacity for embedding model deployment. |
| `AZURE_ENV_IMAGETAG` | string | `latest` | Set the image tag (allowed values: `latest`, `dev`, `hotfix`). |
| `AZURE_LOCATION` | string | `japaneast` | Sets the Azure region for resource deployment. |
| `AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID` | string | `<Existing Workspace Id>` | Reuses an existing Log Analytics Workspace instead of provisioning a new one. |
| `AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID` | string | See the [guide to get your existing Workspace ID](/docs/re-use-log-analytics.md) | Reuses an existing Log Analytics Workspace instead of provisioning a new one. |
| `AZURE_EXISTING_AI_PROJECT_RESOURCE_ID` | string | `<Existing AI Foundry Project Resource Id>` | Reuses an existing AI Foundry Project Resource Id instead of provisioning a new one. |


## How to Set a Parameter
To customize any of the above values, run the following command **before** `azd up`:

8 changes: 8 additions & 0 deletions docs/DeploymentGuide.md
@@ -137,6 +137,14 @@ To adjust quota settings, follow these [steps](./AzureGPTQuotaSettings.md).

</details>

<details>

<summary><b>Reusing an Existing Log Analytics Workspace</b></summary>

Follow the [guide to get your existing Workspace ID](/docs/re-use-log-analytics.md).
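
Once you have the Workspace ID, point the deployment at it before running `azd up` (this is the same command shown in the linked guide):

```bash
azd env set AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID '<Existing Log Analytics Workspace Id>'
```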

</details>

### Deploying with AZD

Once you've opened the project in [Codespaces](#github-codespaces), [Dev Containers](#vs-code-dev-containers), or [locally](#local-environment), you can deploy it to Azure by following these steps:
8 changes: 4 additions & 4 deletions docs/QuotaCheck.md
@@ -11,7 +11,7 @@ azd auth login

### 📌 Default Models & Capacities:
```
gpt-4o-mini:30, text-embedding-ada-002:80
gpt-4o-mini:200, text-embedding-ada-002:80
```
### 📌 Default Regions:
```
@@ -37,19 +37,19 @@ eastus, uksouth, eastus2, northcentralus, swedencentral, westus, westus2, southc
```
✔️ Check specific model(s) in default regions:
```
./quota_check_params.sh --models gpt-4o-mini:30,text-embedding-ada-002:80
./quota_check_params.sh --models gpt-4o-mini:200,text-embedding-ada-002:80
```
✔️ Check default models in specific region(s):
```
./quota_check_params.sh --regions eastus,westus
```
✔️ Passing Both models and regions:
```
./quota_check_params.sh --models gpt-4o-mini:30 --regions eastus,westus2
./quota_check_params.sh --models gpt-4o-mini:200 --regions eastus,westus2
```
✔️ All parameters combined:
```
./quota_check_params.sh --models gpt-4o-mini:30,text-embedding-ada-002:80 --regions eastus,westus --verbose
./quota_check_params.sh --models gpt-4o-mini:200,text-embedding-ada-002:80 --regions eastus,westus --verbose
```

### **Sample Output**
Binary file added docs/images/re_use_log/logAnalytics.png
Binary file added docs/images/re_use_log/logAnalyticsJson.png
Binary file added docs/images/re_use_log/logAnalyticsList.png
31 changes: 31 additions & 0 deletions docs/re-use-log-analytics.md
@@ -0,0 +1,31 @@
[← Back to *DEPLOYMENT* guide](/docs/DeploymentGuide.md#deployment-options--steps)

# Reusing an Existing Log Analytics Workspace
To configure your environment to use an existing Log Analytics Workspace, follow these steps:

---
### 1. Go to Azure Portal
Go to https://portal.azure.com

### 2. Search for Log Analytics
In the search bar at the top, type "Log Analytics workspaces", open the service, and then select the workspace you want to reuse.

![Log Analytics workspaces list](../docs/images/re_use_log/logAnalyticsList.png)

### 3. Copy Resource ID
In the Overview pane, click **JSON View**.

![Log Analytics workspace Overview pane with JSON View](../docs/images/re_use_log/logAnalytics.png)

Copy the **Resource ID**; this is your Workspace ID.

![Resource JSON view showing the Resource ID](../docs/images/re_use_log/logAnalyticsJson.png)
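
If you prefer the command line, the following sketch retrieves the same Resource ID with the Azure CLI (assuming you are signed in with `az login`; the resource group and workspace names below are placeholders):

```bash
# List workspaces to find the one you want to reuse
az monitor log-analytics workspace list --output table

# Print the full Resource ID (this is the Workspace ID used in the next step)
az monitor log-analytics workspace show \
  --resource-group <resource-group-name> \
  --workspace-name <workspace-name> \
  --query id --output tsv
```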

### 4. Set the Workspace ID in Your Environment
Run the following command in your terminal:
```bash
azd env set AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID '<Existing Log Analytics Workspace Id>'
```
Replace `<Existing Log Analytics Workspace Id>` with the value obtained from Step 3.
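
Optionally, confirm the value was saved to the current azd environment (a quick check, assuming a shell with `grep`):

```bash
azd env get-values | grep AZURE_ENV_LOG_ANALYTICS_WORKSPACE_ID
```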

### 5. Continue Deployment
Proceed with the next steps in the [deployment guide](/docs/DeploymentGuide.md#deployment-options--steps).
28 changes: 18 additions & 10 deletions infra/scripts/copy_kb_files.sh
@@ -123,46 +123,54 @@ echo "Uploading files to Azure Blob Storage"
# Using az storage blob upload-batch to upload files with managed identity authentication, as the az storage fs directory upload command is not working with managed identity authentication.
az storage blob upload-batch --account-name "$storageAccount" --destination data/"$extractedFolder1" --source $extractionPath1 --auth-mode login --pattern '*' --overwrite --output none
if [ $? -ne 0 ]; then
retries=3
maxRetries=5
retries=$maxRetries
sleepTime=10
echo "Error: Failed to upload files to Azure Blob Storage. Retrying upload...($((4 - retries)) of 3)"
attempt=1
while [ $retries -gt 0 ]; do
echo "Error: Failed to upload files to Azure Blob Storage. Retrying upload...$attempt of $maxRetries in $sleepTime seconds"
sleep $sleepTime
az storage blob upload-batch --account-name "$storageAccount" --destination data/"$extractedFolder1" --source $extractionPath1 --auth-mode login --pattern '*' --overwrite --output none
if [ $? -eq 0 ]; then
echo "Files uploaded successfully to Azure Blob Storage."
break
else
((retries--))
echo "Retrying upload... ($((4 - retries)) of 3)"
((attempt++))
sleepTime=$((sleepTime * 2))
sleep $sleepTime
fi
done
exit 1
if [ $retries -eq 0 ]; then
echo "Error: Failed to upload files after all retry attempts."
exit 1
fi
else
echo "Files uploaded successfully to Azure Blob Storage."
fi

az storage blob upload-batch --account-name "$storageAccount" --destination data/"$extractedFolder2" --source $extractionPath2 --auth-mode login --pattern '*' --overwrite --output none
if [ $? -ne 0 ]; then
retries=3
maxRetries=5
retries=$maxRetries
attempt=1
sleepTime=10
echo "Error: Failed to upload files to Azure Blob Storage. Retrying upload...($((4 - retries)) of 3)"
while [ $retries -gt 0 ]; do
echo "Error: Failed to upload files to Azure Blob Storage. Retrying upload...$attempt of $maxRetries in $sleepTime seconds"
sleep $sleepTime
az storage blob upload-batch --account-name "$storageAccount" --destination data/"$extractedFolder2" --source $extractionPath2 --auth-mode login --pattern '*' --overwrite --output none
if [ $? -eq 0 ]; then
echo "Files uploaded successfully to Azure Blob Storage."
break
else
((retries--))
echo "Retrying upload... ($((4 - retries)) of 3)"
((attempt++))
sleepTime=$((sleepTime * 2))
sleep $sleepTime
fi
done
exit 1
if [ $retries -eq 0 ]; then
echo "Error: Failed to upload files after all retry attempts."
exit 1
fi
else
echo "Files uploaded successfully to Azure Blob Storage."
fi
2 changes: 1 addition & 1 deletion infra/scripts/quota_check_params.sh
@@ -47,7 +47,7 @@ log_verbose() {
}

# Default Models and Capacities (Comma-separated in "model:capacity" format)
DEFAULT_MODEL_CAPACITY="gpt-4o-mini:30,text-embedding-ada-002:80"
DEFAULT_MODEL_CAPACITY="gpt-4o-mini:200,text-embedding-ada-002:80"

# Convert the comma-separated string into an array
IFS=',' read -r -a MODEL_CAPACITY_PAIRS <<< "$DEFAULT_MODEL_CAPACITY"