This document describes how to deploy the system. It covers:

- How to create a new deployment, which is only needed if no deployment exists yet or the original deployment breaks somehow.
- Setting up Microsoft Entra ID (Azure Active Directory) permissions for SQL Server, which is needed to allow users to access the data in the SQL database.
- Updating the Power BI reports, which is needed to ensure that they use the nightly data.
## Creating a new deployment

Generally you never have to do this - a deployment already exists. You might need to if somebody deletes the deployment, or if it needs to be updated or repaired.

Before you can create a deployment, you need the following:
- A PC to run the tooling on. The tooling was tested on Linux, but anything running `bash` should be fine, including a Mac or WSL on Windows. This PC must have the required utilities installed, including `bash` and the Azure CLI (`az`).
- An Azure subscription. This will contain the various components that get deployed.
- Power BI running on a Windows PC. Sorry - if you want to use Power BI, you need to use Windows.
- An access key for Safety Culture. You can create this by logging into the Safety Culture website.
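Before starting, it may be worth checking that the required tools are actually on your `PATH`. A minimal sketch - the tool list is an assumption based on the commands used later in this document, so extend it as needed:

```shell
# Check that the tools used in this document are installed.
# The list (bash, the Azure CLI) is inferred from the commands below;
# add any other utilities your config file relies on.
for tool in bash az; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool" >&2
  fi
done
```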
Then follow these steps.

- Set up a config file. TODO: document with an example.

  Before running any of the bash commands, source this config file:

  ```sh
  . config/my_config_file.sh
  ```
- Ensure that you have created an Azure subscription to use, and that you are logged into Azure with that subscription as the default:

  ```sh
  az login --use-device-code
  az account show
  ```

  If necessary, you can log in using a different account, or use `az account set` to change which subscription is in use.
- Run the deploy script.

  ```sh
  bash scripts/deploy.sh
  ```

  This will fail because the container apps job (rather tediously) refuses to be created until the image has been uploaded. Ignore this initial error.

  TODO: we should do better here, but this is not blocking anything for now.
- Build and push the container image, then redeploy. This should succeed.

  ```sh
  bash scripts/build.sh
  bash scripts/deploy.sh
  ```

The access token needs to be added to a secret named `accessToken` in the key vault.
- Go to the Azure portal.
- Find the key vault and click on it.
- Select `Access policies` from the left-hand pane to create an access policy allowing you access to secrets in the key vault.
  - Select `Create`.
  - In the `Permissions` screen, choose `Select all` under `Secret permissions`, and click the `Next` button.
  - In the `Principal` screen, search for your own account, click it, then click the `Next` button.
  - Ignore the `Application` screen: select nothing and click the `Next` button.
  - Finally, click the `Create` button.
- Select `Objects` from the left-hand pane to create the secret.
  - Select `Secrets`.
  - Click `Generate/Import`.
  - Enter `accessToken` as the name.
  - Enter your Safety Culture API token as the `Secret value`.
  - Click the `Create` button.
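If you prefer the CLI to the portal, the same secret can be created with `az keyvault secret set`. A sketch - `KEYVAULT_NAME` and `SAFETY_CULTURE_TOKEN` are placeholder variables, not names defined by the config file, and you still need permission to write secrets (the access policy created above) for this to succeed:

```shell
# Placeholders - substitute your key vault's name and the real
# Safety Culture API token (or export these in your config file).
KEYVAULT_NAME="my-key-vault"
SAFETY_CULTURE_TOKEN="paste-token-here"

# The secret name must be exactly "accessToken", matching the field
# name this document requires.
az keyvault secret set \
  --vault-name "$KEYVAULT_NAME" \
  --name accessToken \
  --value "$SAFETY_CULTURE_TOKEN"
```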
- Upload the various CSV files containing historical data.
  - Download the files from the SharePoint site.
  - Go to the Azure portal, and log in with the correct identity.
  - Select the resource group containing the exporter (you can find a list of resource groups in the menu). This will show a list of resources in the resource group.
  - Find the upload button for the storage account.
    - Expand the `Data storage` option on the left, and click on `Containers`.
    - There should be one container in the list, named `csvdata`. Click on it.
    - There should be an `Upload` button at the top of the screen.
  - Select the CSV files to upload and click the `Upload` button. The full list of files is as follows.
    - `places.csv`
    - `historical_nightly.csv`
    - `historic_all_suf.csv`
    - `historic_welfare_checks.csv`
    - `place_synonyms.csv`
    - `location_corrections.csv`
- Once the files are uploaded, you can continue.
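As an alternative to the portal upload, the same files can be pushed with the Azure CLI. A sketch, assuming the CSV files are in the current directory and `STORAGEACCOUNTNAME` is set (for example by the config file sourced earlier); `--auth-mode login` also assumes your identity has a data-plane role such as Storage Blob Data Contributor on the storage account:

```shell
# Upload each historical CSV into the csvdata container.
for f in places.csv historical_nightly.csv historic_all_suf.csv \
         historic_welfare_checks.csv place_synonyms.csv location_corrections.csv; do
  az storage blob upload \
    --account-name "${STORAGEACCOUNTNAME}" \
    --container-name csvdata \
    --name "$f" \
    --file "$f" \
    --auth-mode login \
    --overwrite
done
```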
The places file should update itself automatically, or we will all get very bored uploading it manually every time it changes. This is done with a Power Automate flow that pushes the file into the `automation` container.

You'll need a SAS token for the upload - you can generate one like this:

```sh
az storage container generate-sas \
  --account-name ${STORAGEACCOUNTNAME} \
  --name automation \
  --permissions w \
  --expiry 2040-12-31T23:59:00Z \
  --https-only \
  --output tsv
```

To create the Power Automate flow, go to Power Automate and create a flow.
- Trigger: `Sharepoint: When a file is created or modified (properties only)`
  - Set the site address to the IT operations site from the dropdown.
  - Set the library name to `Data Warehouse` from the dropdown.
  - Find the `Places list` in the dropdown for the folder.
- Next, add a `Sharepoint: Get File Content` action that pulls the content of `places.xlsx`.
- Finally, add an `HTTP` action (the plain HTTP one, not one of the variants).
  - Set the URI to `https://STORAGEACCOUNTNAME.blob.core.windows.net/automation/places.csv?SASTOKEN`, substituting the storage account name and SAS token.
  - Method is `PUT`.
  - Headers:
    - `x-ms-blob-type`: `BlockBlob`
    - `Content-Type`: `application/octet-stream`
  - Turn off chunking in the options.
  - Body is the file content from the previous action.
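To check the SAS token before wiring it into the flow, you can perform the same `PUT` from the command line with `curl`. A sketch - the `STORAGEACCOUNTNAME` and `SAS_TOKEN` values are placeholders for the values generated above, and `places.csv` is assumed to be a local copy of the file:

```shell
# Placeholders - substitute the real storage account name and SAS token.
STORAGEACCOUNTNAME="examplestore"
SAS_TOKEN="paste-sas-token-here"
URL="https://${STORAGEACCOUNTNAME}.blob.core.windows.net/automation/places.csv?${SAS_TOKEN}"

# Mimic the flow's HTTP action: PUT the local file as a block blob,
# with the same headers the flow sets.
if [ -f places.csv ]; then
  curl --fail -X PUT \
    -H "x-ms-blob-type: BlockBlob" \
    -H "Content-Type: application/octet-stream" \
    --data-binary @places.csv \
    "$URL"
fi
```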
## Setting up Microsoft Entra ID permissions for SQL Server

This must be done before users can actually use the provisioned data. This process sets you up as the Entra-managed admin for the SQL Server database. It only needs to be done once, unless the admin leaves and needs to be replaced.
- Find the SQL Server instance (not the database) in the portal (look in the resource group), and click on it.
- Configure Microsoft Entra ID access, with yourself as the admin, as follows.
  - On the left, click on `Microsoft Entra ID` under `Settings` to see the screen of Microsoft Entra options.
  - Ensure that the `Support only Microsoft Entra ID` option is not checked.
  - Click on `Set admin` at the top of the screen.
  - Select your own account, and click `Select`.
  - Click `Save` at the top of the screen, so the change is not immediately forgotten.
- Enable access from your IP address.
  - Click on `Networking` under `Security` to see the networking screen.
  - Click on `Add your client IPv4 address`.
  - Click the `Save` button.
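The firewall rule can also be added from the CLI, which is handy when your IP address changes. A sketch - `RESOURCE_GROUP` and `SQL_SERVER_NAME` are placeholders for your deployment's names, and the IP lookup service is just one convenient option:

```shell
# Placeholders - substitute the names from your deployment.
RESOURCE_GROUP="my-resource-group"
SQL_SERVER_NAME="my-sql-server"

# Find the public IPv4 address this machine appears from.
MY_IP=$(curl -s https://api.ipify.org)

# Allow that single address through the SQL Server firewall.
az sql server firewall-rule create \
  --resource-group "$RESOURCE_GROUP" \
  --server "$SQL_SERVER_NAME" \
  --name "allow-my-ip" \
  --start-ip-address "$MY_IP" \
  --end-ip-address "$MY_IP"
```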
## Granting users access

This must be done to allow individual users to run Power BI reports. It involves entering either their IDs, or the ID of a security group they are a member of, into the portal.
- Click on `SQL databases` under `Settings` of the SQL Server instance.
- Click on the `sqldb` database.
- Click on `Query editor`.
- Do not enter a password; instead click the `Continue as yourmail@yourdomain` button.
- Run the following for each user you wish to grant rights to, assuming that their email is `user@domain`:

  ```sql
  CREATE USER [user@domain] FROM EXTERNAL PROVIDER;
  ALTER ROLE db_datareader ADD MEMBER [user@domain];
  ```

- If you want to grant permissions to a security group, create the group in the Entra admin centre, and then run:

  ```sql
  CREATE USER [SecurityGroupName] FROM EXTERNAL PROVIDER;
  ALTER ROLE db_datareader ADD MEMBER [SecurityGroupName];
  ```
## Updating the Power BI reports

This section describes how to update the Power BI reports to use the latest data. The process is as follows.
- Download the `pbix` file.
- Open the `pbix` file in Power BI Desktop.
- Click on `Transform data` to open the Power Query editor, then click on `Data source settings` in the top menu. Update the SQL data source to have the correct server name (from the SQL Server instance created above).
- Save the `pbix` file, and upload it to the Power BI service.
- In the Power BI service, go to the dataset settings for the report.
  - Click on `Data source credentials` and ensure that the credentials are set to use the `Microsoft Entra ID` authentication method.
  - Click on `Scheduled refresh` and ensure that the refresh is set to run daily at 08:00.
  - Click on `Apply` to save the changes.
- In the Power BI service, go to the report and click on `Refresh now` to ensure that the report is using the latest data.