    4. Open the Workspace
    5. Retrieve the Workspace ID from the URL; refer to the documentation for additional assistance ([here](https://learn.microsoft.com/en-us/fabric/admin/portal-workspace#identify-your-workspace-id))
3. **Deploy Fabric resources and artifacts**
    1. Navigate to the [Azure Portal](https://portal.azure.com/)
    2. Click on Azure Cloud Shell in the top-right of the navigation menu (add image)
    3. Run the following command:
        1. ```az login``` (follow the login instructions shown in Azure Cloud Shell)
        1. keyvault_param - the name of the Key Vault that was created in Step 1
        2. workspaceid_param - the Workspace ID retrieved in Step 2
        3. solutionprefix_param - the prefix appended to the lakehouse name upon creation
    4. Get the Fabric Lakehouse connection details:
    5. Once deployment is complete, navigate to the Fabric workspace
    6. Find the Lakehouse in the workspace (e.g., lakehouse_*solutionprefix_param*)
    7. Click on the ```...``` next to the SQL Analytics Endpoint
    8. Click on ```Copy SQL connection string```
    9. Click the Copy button in the popup window.
    10. Wait 10-15 minutes to allow the data pipelines to finish processing, then proceed to the next step.
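The Cloud Shell steps above can be sketched as follows. The parameter names come from the list above, but the values and the deployment-script name are hypothetical placeholders, not this repo's actual command:

```shell
# Sketch only: values and the script name below are hypothetical placeholders.
keyvault_param="kv-example"                               # Key Vault name from Step 1
workspaceid_param="00000000-0000-0000-0000-000000000000"  # Workspace ID from Step 2
solutionprefix_param="demo"                               # prefix appended to the lakehouse name

# az login    # uncomment in Azure Cloud Shell and follow the login prompts
# ./deploy.sh "$keyvault_param" "$workspaceid_param" "$solutionprefix_param"  # hypothetical invocation

# The created lakehouse is named with the prefix, e.g.:
echo "lakehouse_${solutionprefix_param}"
```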
4. **Open Power BI report**
    1. Download the .pbix file from the [Reports folder](Deployment/Reports/).
    2. Open the report in Power BI Desktop
    3. Click the `Transform Data` menu option in the task bar
    4. Click `Data source settings`
    5. Click `Change Source...`
    6. Input the server link (the SQL connection string from the Fabric workspace)
    7. Input the database name (the lakehouse name from the Fabric workspace)
    8. Click `OK`
    9. Click `Edit Permissions`
    10. If not signed in, sign in with your credentials, then click OK
    11. Click `Close`
    12. The report should refresh with the new connection.
5. **Publish Power BI**
    1. Click `Publish` (from the report in the Power BI Desktop application)
    2. Select the Fabric workspace
    3. Click `Select`
    4. After publishing is complete, navigate to the Fabric workspace
    5. Click `...` next to the semantic model for the Power BI report
    6. Click on `Settings`
    7. Click on `Edit credentials` (under Data source credentials)
    8. Select `OAuth2` as the Authentication method
    9. Select an option for `Privacy level setting for this data source`
    10. Click `Sign in`
    11. Navigate back to the Fabric workspace and click on the Power BI report
6. **Schedule Post-Processing Notebook**
    The dates in the deployed data are set relative to the current day at the time of deployment, so they must be advanced daily. Because the Power BI report relies on the current date, we highly recommend scheduling or running the `03_post_processing` notebook daily in the workspace. Note that this process modifies the original dates of the processed data; if you do not want that, do not execute the `03_post_processing` notebook.
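As a minimal sketch of the date-advance idea (illustrative only; the actual `03_post_processing` notebook logic may differ, and the record shape and function name here are hypothetical):

```python
from datetime import date

def advance_dates(records, deployment_date, today):
    """Shift each record's date forward by the days elapsed since deployment,
    so data that was 'current' at deployment time stays current today."""
    offset = today - deployment_date
    return [{**r, "call_date": r["call_date"] + offset} for r in records]

# Example: data generated at deployment on 2024-01-01, post-processed 10 days later
records = [{"call_date": date(2024, 1, 1)}, {"call_date": date(2023, 12, 30)}]
shifted = advance_dates(records, date(2024, 1, 1), date(2024, 1, 11))
```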
    To schedule the notebook, follow these steps:
### Process audio files
Currently, audio files are not processed during deployment. To manually process audio files, follow these steps:
- Open the `pipeline_notebook`
- Comment out cell 2 (only if there are zero files in the `conversation_input` data folder waiting for JSON processing)
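The zero-files condition can be checked before editing the notebook; a small sketch, assuming the `conversation_input` folder sits at the current working directory and using a helper name of our own:

```python
from pathlib import Path

def safe_to_comment_out_cell2(folder: str) -> bool:
    """True only when the folder holds zero files awaiting JSON processing."""
    p = Path(folder)
    return not p.exists() or not any(p.iterdir())

# Only comment out cell 2 when this reports True
print(safe_to_comment_out_cell2("conversation_input"))
```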