Commit 8ecf50b

Fix Markdown syntax issues
1 parent c66ebb8 commit 8ecf50b

18 files changed (+257, -244 lines)

Deployment-Pipelines/README.md

Lines changed: 11 additions & 13 deletions
@@ -9,8 +9,7 @@ Last updated: 2025-04-15
 
 ------------------------------------------
 
-> Lakehouse Schema and Deployment Pipelines
-
+> Lakehouse Schema and Deployment Pipelines
 
 <details>
 <summary><b> List of References </b> (Click to expand)</summary>
@@ -33,15 +32,14 @@ Last updated: 2025-04-15
 
 - [Overview](#overview)
 - [Demo](#demo)
-- [Create a Workspace](#create-a-workspace)
-- [Create a Lakehouse](#create-a-lakehouse)
-- [Create a New Semantic Model](#create-a-new-semantic-model)
-- [Auto-Generate Report with Copilot](#auto-generate-report-with-copilot)
-- [Create a Deployment Pipeline](#create-a-deployment-pipeline)
-- [Deploy to Production](#deploy-to-production)
+- [Create a Workspace](#create-a-workspace)
+- [Create a Lakehouse](#create-a-lakehouse)
+- [Create a New Semantic Model](#create-a-new-semantic-model)
+- [Auto-Generate Report with Copilot](#auto-generate-report-with-copilot)
+- [Create a Deployment Pipeline](#create-a-deployment-pipeline)
+- [Deploy to Production](#deploy-to-production)
 - [How to refresh the data](#how-to-refresh-the-data)
 
-
 </details>
 
 ## Overview
@@ -61,12 +59,12 @@ Process Overview:
 > `Specifics for Lakehouse:` For lakehouses, the deployment process typically `includes the structure and metadata but not the actual data tables`. This is why you might see the structure and semantic models deployed, but the tables themselves need to be manually refreshed or reloaded in the target environment.<br/> <br/>
 > `Deployment Rules:` You can set deployment rules to manage different stages and change content settings during deployment. For example, you can specify default lakehouses for notebooks to avoid manual changes post-deployment.
 
-## Demo
+## Demo
 
 <div align="center">
 <img src="https://github.com/user-attachments/assets/46045fa8-34e8-4fff-a343-d49f36eece89" alt="Centered Image" style="border: 2px solid #4CAF50; border-radius: 2px; padding: 2px; width: 500px; height: auto;"/>
 </div>
-
+
 
 ### Create a Workspace
 
 1. Navigate to the Microsoft Fabric portal.
@@ -122,11 +120,10 @@ Process Overview:
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/25bda38f-41fa-45f9-aa01-e647d9f4bd84" />
 
-4. At this point, you should see something similar like following: 
+4. At this point, you should see something similar like following:
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/e1f88782-ddc6-4fb8-947a-af23b92a8415" />
 
-
 ### Auto-Generate Report with Copilot
 
 > [!NOTE]
@@ -193,6 +190,7 @@ Process Overview:
 | **Incremental Refresh** | Refreshes only the data that has changed since the last refresh, improving efficiency. Click [here to understand more about incremental refresh](../Workloads-Specific/PowerBi/IncrementalRefresh.md)| - **Evaluate Changes**: Checks for changes in the data source based on a DateTime column.<br>- **Retrieve Data**: Only changed data is retrieved and loaded.<br>- **Replace Data**: Updated data is processed and replaced. |
 
 Steps to Set Up Incremental Refresh:
+
 1. **Create or Open a Dataflow**: Start by creating a new Dataflow Gen2 or opening an existing one.
 2. **Configure the Query**: Ensure your query includes a DateTime column that can be used to filter the data.
 3. **Enable Incremental Refresh**: Right-click the query and select Incremental Refresh. Configure the settings, such as the DateTime column and the time range for data extraction.
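The evaluate/retrieve/replace cycle described for incremental refresh above can be sketched in plain Python. This is a minimal illustration of the filtering idea only, not Dataflow Gen2's actual implementation; the `modified_at` field and the `incremental_refresh` helper are hypothetical names.

```python
from datetime import datetime

# Hypothetical stored table and fresh source rows; "modified_at" stands in
# for the DateTime column that incremental refresh evaluates.
stored = [
    {"id": 1, "modified_at": datetime(2025, 4, 1)},
    {"id": 2, "modified_at": datetime(2025, 4, 14)},
]
source = [
    {"id": 2, "modified_at": datetime(2025, 4, 14)},
    {"id": 3, "modified_at": datetime(2025, 4, 15)},
]

def incremental_refresh(stored, source, since):
    # Evaluate changes: rows at or after `since` fall inside the refresh window.
    kept = [r for r in stored if r["modified_at"] < since]     # untouched history
    window = [r for r in source if r["modified_at"] >= since]  # retrieve only changed data
    return kept + window                                       # replace the window
```

With `since = datetime(2025, 4, 10)`, only rows 2 and 3 are re-retrieved from the source, while row 1 is kept from the stored table rather than reloaded.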
Lines changed: 0 additions & 1 deletion
@@ -1,3 +1,2 @@
 
-
 Last updated: 2025-04-15

GitHub-Integration.md

Lines changed: 3 additions & 3 deletions
@@ -1,8 +1,8 @@
-# Integrating GitHub with Microsoft Fabric - Overview
+# Integrating GitHub with Microsoft Fabric - Overview
 
 Costa Rica
 
-[![GitHub](https://badgen.net/badge/icon/github?icon=github&label)](https://github.com)
+[![GitHub](https://badgen.net/badge/icon/github?icon=github&label)](https://github.com)
 [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
 [brown9804](https://github.com/brown9804)
 
@@ -34,7 +34,7 @@ Last updated: 2025-04-15
 
 </details>
 
-https://github.com/user-attachments/assets/64f099a1-b749-47a6-b723-fa1cb5c575a3
+<https://github.com/user-attachments/assets/64f099a1-b749-47a6-b723-fa1cb5c575a3>
 
 ## Connect a workspace to a Git repo
 

Monitoring-Observability/FabricActivatorRulePipeline/README.md

Lines changed: 18 additions & 17 deletions
@@ -10,6 +10,7 @@ Last updated: 2025-04-15
 ----------
 
 > This process shows how to set up Microsoft Fabric Activator to automate workflows by detecting file creation events in a storage system and triggering another pipeline to run. <br/>
+>
 > 1. **First Pipeline**: The process starts with a pipeline that ends with a `Copy Data` activity. This activity uploads data into the `Lakehouse`. <br/>
 > 2. **Event Stream Setup**: An `Event Stream` is configured in Activator to monitor the Lakehouse for file creation or data upload events. <br/>
 > 3. **Triggering the Second Pipeline**: Once the event is detected (e.g., a file is uploaded), the Event Stream triggers the second pipeline to continue the workflow.
@@ -25,19 +26,19 @@ Last updated: 2025-04-15
 <details>
 <summary><b>List of Content </b> (Click to expand)</summary>
 
-- [Set Up the First Pipeline](#set-up-the-first-pipeline)
-- [Configure Activator to Detect the Event](#configure-activator-to-detect-the-event)
-- [Set Up the Second Pipeline](#set-up-the-second-pipeline)
-- [Define the Rule in Activator](#define-the-rule-in-activator)
-- [Test the Entire Workflow](#test-the-entire-workflow)
-- [Troubleshooting If Needed](#troubleshooting-if-needed)
+- [Set Up the First Pipeline](#set-up-the-first-pipeline)
+- [Configure Activator to Detect the Event](#configure-activator-to-detect-the-event)
+- [Set Up the Second Pipeline](#set-up-the-second-pipeline)
+- [Define the Rule in Activator](#define-the-rule-in-activator)
+- [Test the Entire Workflow](#test-the-entire-workflow)
+- [Troubleshooting If Needed](#troubleshooting-if-needed)
 
 </details>
 
 > [!NOTE]
 > This code generates random data with fields such as id, name, age, email, and created_at, organizes it into a PySpark DataFrame, and saves it to a specified Lakehouse path using the Delta format. Click here to see the [example script](./GeneratesRandomData.ipynb)
 
-https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d
+<https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d>
 
 ## Set Up the First Pipeline
 
@@ -50,14 +51,14 @@ https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d
 - Ensure the file name and path are consistent and predictable (e.g., `trigger_file.json` in a specific folder).
 3. **Publish and Test**: Publish the pipeline and test it to ensure the trigger file is created successfully.
 
-https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831
+<https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831>
 
 ## Configure Activator to Detect the Event
 
 > [!TIP]
 > Event options:
 
-https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
+<https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d>
 
 1. **Set Up an Event**:
 - Create a new event to monitor the location where the trigger file is created (e.g., ADLS or OneLake). Click on `Real-Time`:
@@ -71,18 +72,18 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/94e5556b-5d56-4a42-9edd-83b514e7c953" />
 
 - Add a source:
-
+
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/9709a690-f3b5-453b-b3d9-c67d4b1a9465" />
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/8dcadd23-4abb-47ee-82ca-f3868cb818e1" />
 
-https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b
+<https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b>
 
 2. **Test Event Detection**:
 - Save the event and test it by manually running the first pipeline to ensure Activator detects the file creation.
 - Check the **Event Details** screen in Activator to confirm the event is logged.
 
-https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd
+<https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd>
 
 ## Set Up the Second Pipeline
 
@@ -91,13 +92,13 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
 - Ensure it is configured to accept external triggers.
 2. **Publish the Pipeline**: Publish the second pipeline and ensure it is ready to be triggered.
 
-https://github.com/user-attachments/assets/5b630579-a0ec-4d5b-b973-d9b4fdd8254c
+<https://github.com/user-attachments/assets/5b630579-a0ec-4d5b-b973-d9b4fdd8254c>
 
 ## Define the Rule in Activator
 
 1. **Setup the Activator**:
 
-https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568
+<https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568>
 
 2. **Create a New Rule**:
 - In `Activator`, create a rule that responds to the event you just configured.
@@ -109,17 +110,18 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
 - Save the rule and activate it.
 - Ensure the rule is enabled and ready to respond to the event.
 
-https://github.com/user-attachments/assets/5f139eeb-bab0-4d43-9f22-bbe44503ed75
+<https://github.com/user-attachments/assets/5f139eeb-bab0-4d43-9f22-bbe44503ed75>
 
 ## Test the Entire Workflow
 
 1. **Run the First Pipeline**: Execute the first pipeline and verify that the trigger file is created.
 2. **Monitor Activator**: Check the `Event Details` and `Rule Activation Details` in Activator to ensure the event is detected and the rule is activated.
 3. **Verify the Second Pipeline**: Confirm that the second pipeline is triggered and runs successfully.
 
-https://github.com/user-attachments/assets/0a1dab70-2317-4636-b0be-aa0cb301b496
+<https://github.com/user-attachments/assets/0a1dab70-2317-4636-b0be-aa0cb301b496>
 
 ## Troubleshooting (If Needed)
+
 - If the second pipeline does not trigger:
 1. Double-check the rule configuration in Activator.
 2. Review the logs in Activator for any errors or warnings.
@@ -128,4 +130,3 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
 <h3 style="color: #4CAF50;">Total Visitors</h3>
 <img src="https://profile-counter.glitch.me/brown9804/count.svg" alt="Visitor Count" style="border: 2px solid #4CAF50; border-radius: 5px; padding: 5px;"/>
 </div>
-

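The [!NOTE] in the diff above describes `GeneratesRandomData.ipynb` as generating rows with id, name, age, email, and created_at and saving them to a Lakehouse path in Delta format. A rough stand-alone sketch of that generation step, under the assumption of those field names, might look like the following; the Spark/Delta write is shown only as a comment because it requires a Fabric notebook session, and the value ranges and table path are illustrative, not taken from the actual notebook.

```python
import random
import string
from datetime import datetime, timezone

def random_record(i):
    """Build one record with the fields the notebook is described as using."""
    name = "".join(random.choices(string.ascii_lowercase, k=8))
    return {
        "id": i,
        "name": name,
        "age": random.randint(18, 90),                      # assumed range
        "email": f"{name}@example.com",                     # illustrative domain
        "created_at": datetime.now(timezone.utc).isoformat(),
    }

records = [random_record(i) for i in range(100)]

# In a Fabric notebook the records would then go into a PySpark DataFrame
# and be saved to the Lakehouse in Delta format, roughly:
#   df = spark.createDataFrame(records)
#   df.write.format("delta").mode("overwrite").save("Tables/random_data")
```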
Monitoring-Observability/MonitorUsage.md

Lines changed: 26 additions & 25 deletions
@@ -27,22 +27,22 @@ Last updated: 2025-04-15
 
 </details>
 
-## Content
+## Content
 
 - [Microsoft Fabric Capacity Metrics app](#microsoft-fabric-capacity-metrics-app)
-- [Installation Steps](#installation-steps)
-- [Configuration Steps](#configuration-steps)
-- [Troubleshooting](#troubleshooting)
+- [Installation Steps](#installation-steps)
+- [Configuration Steps](#configuration-steps)
+- [Troubleshooting](#troubleshooting)
 - [Admin monitoring](#admin-monitoring)
-- [Configure the Admin Monitoring Workspace](#configure-the-admin-monitoring-workspace)
-- [How to Use Data from Admin Monitoring Workspace Custom Reports](#how-to-use-data-from-admin-monitoring-workspace-custom-reports)
+- [Configure the Admin Monitoring Workspace](#configure-the-admin-monitoring-workspace)
+- [How to Use Data from Admin Monitoring Workspace Custom Reports](#how-to-use-data-from-admin-monitoring-workspace-custom-reports)
 - [Monitor Hub](#monitor-hub)
-- [How to Access and Use the Monitor Hub](#how-to-access-and-use-the-monitor-hub)
-- [Extending Activity History](#extending-activity-history)
+- [How to Access and Use the Monitor Hub](#how-to-access-and-use-the-monitor-hub)
+- [Extending Activity History](#extending-activity-history)
 
-## Microsoft Fabric Capacity Metrics app
+## Microsoft Fabric Capacity Metrics app
 
-> The `Microsoft Fabric Capacity Metrics app` is designed to provide comprehensive monitoring capabilities for Microsoft Fabric capacities. It helps administrators track capacity consumption, identify performance bottlenecks, and make informed decisions about scaling and resource allocation. The app provides detailed insights into capacity utilization, throttling, and system events, enabling proactive management of resources to ensure optimal performance. <br/> <br/>
+> The `Microsoft Fabric Capacity Metrics app` is designed to provide comprehensive monitoring capabilities for Microsoft Fabric capacities. It helps administrators track capacity consumption, identify performance bottlenecks, and make informed decisions about scaling and resource allocation. The app provides detailed insights into capacity utilization, throttling, and system events, enabling proactive management of resources to ensure optimal performance. <br/> <br/>
 > This app is essential for maintaining the health and efficiency of your Microsoft Fabric capacities
 
 | **Feature** | **Description** |
@@ -58,7 +58,7 @@ Last updated: 2025-04-15
 - Navigate to [Microsoft Fabric](https://app.fabric.microsoft.com/). In the left panel, locate the `Apps` icon and click on `Get apps`.
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/931eb614-bb29-4e03-9637-4a9ef0cc3e7a">
-
+
 - Search for `Microsoft Fabric Capacity Metrics`:
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/a743d770-f1ea-474b-8d2c-c363e2a40e13">
@@ -72,6 +72,7 @@ Last updated: 2025-04-15
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/49aa432b-a3fd-4a2d-a504-ce08841b681e">
 
 ### Configuration Steps
+
 1. **Run the App for the First Time**:
 - In Microsoft Fabric, go to **Apps** and select the Microsoft Fabric Capacity Metrics app.
 - When prompted with `You have to connect to your own data to view this report`, select **Connect**.
@@ -85,7 +86,7 @@ Last updated: 2025-04-15
 - Go to the Power BI service and sign in with your admin account.
 - Click on the `Settings` gear icon in the top right corner.
 - Select `Admin Portal` from the dropdown menu.
-
+
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/5b3a8f2a-8062-4a2f-b121-96522088c2d7">
 
 2. Access Capacity Settings:
@@ -101,7 +102,7 @@ Last updated: 2025-04-15
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/8b028365-bf3d-4dfd-a40e-95f355c27ff4">
 
-- **UTC_offset**: Enter your organization's standard time in UTC (e.g., for Central Standard Time, enter `-6`).
+- **UTC_offset**: Enter your organization's standard time in UTC (e.g., for Central Standard Time, enter `-6`).
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/b2f5e435-d3a0-4e2b-ae80-0376caa2e00b">
 
@@ -128,7 +129,7 @@ Last updated: 2025-04-15
 - If the app doesn't show data or can't refresh, try deleting the old app and reinstalling the latest version.
 - Update the semantic model credentials if needed.
 
-## Admin monitoring
+## Admin monitoring
 
 > `Admin monitoring workspace` in Microsoft Fabric is a powerful tool for administrators to track and analyze usage metrics across their organization. This workspace provides detailed insights into how different features and services are being utilized, helping admins make informed decisions to optimize performance and resource allocation.
 
@@ -148,11 +149,11 @@ Benefits of Using Admin Monitoring Workspace:
 3. **Optimize Resources**: Make data-driven decisions about scaling and resource allocation to ensure optimal performance.
 4. **Ensure Compliance**: Use the Purview Hub to monitor data governance and compliance, ensuring that your organization adheres to relevant regulations and standards.
 
-
 ### Configure the Admin Monitoring Workspace
 
 > [!IMPORTANT]
-> - **Permissions**: `Only users with direct admin roles can set up the Admin Monitoring workspace`. If the admin role `is assigned through a group, data refreshes may fail`. <br/>
+>
+> - **Permissions**: `Only users with direct admin roles can set up the Admin Monitoring workspace`. If the admin role `is assigned through a group, data refreshes may fail`. <br/>
 > - **Read-Only Workspace**: The `Admin Monitoring workspace is read-only`. Users, including admins, cannot edit or view properties of items such as semantic models and reports within the workspace. `Admins can share reports and semantic models within the workspace with other users by assigning them a workspace viewer role or providing direct access links.`
 > - **Reinitializing the Workspace**: If needed, `you can reinitialize the workspace by executing an API call to delete the semantic model and then reinstalling the workspace`.
@@ -202,7 +203,7 @@ Benefits of Using Admin Monitoring Workspace:
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/827780f4-193a-4c04-82ae-0edaa7c0312b">
 
 2. **Create Custom Reports**: You can utilize copilot capabilities to automatically create your report and edit it. Request additional pages with your content or even ask questions about your data.
-
+
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/12368a38-cd80-4bdb-b249-efb2b9225260">
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/c928cae7-4bb3-48b9-8bf9-7f05f2f0b7e2">
@@ -213,7 +214,7 @@ Benefits of Using Admin Monitoring Workspace:
 
 | Semantic model access | Workspace access |
 | --- | --- |
-| <img width="550" alt="image" src="https://github.com/user-attachments/assets/a35a7168-a84c-43c1-9aa9-8e81c93b92fc"> | <img width="550" alt="image" src="https://github.com/user-attachments/assets/b4e3eb6a-be98-4194-8800-7702d72b27a9"> |
+| <img width="550" alt="image" src="https://github.com/user-attachments/assets/a35a7168-a84c-43c1-9aa9-8e81c93b92fc"> | <img width="550" alt="image" src="https://github.com/user-attachments/assets/b4e3eb6a-be98-4194-8800-7702d72b27a9"> |
 
 ## Monitor Hub
 
@@ -239,16 +240,15 @@ Benefits of Using Admin Monitoring Workspace:
 
 > For example:
 
-https://github.com/user-attachments/assets/0f7fecfb-0b04-422b-abca-fcbe8827e2a2
+<https://github.com/user-attachments/assets/0f7fecfb-0b04-422b-abca-fcbe8827e2a2>
 
 3. **Search and Filter**:
 - Use the keyword search box to find specific activities.
 - Apply filters to narrow down the results based on status, time period, item type, owner, and workspace location.
-
-| Column Options | Filter Options |
-| --- | --- |
-| <img width="550" alt="image" src="https://github.com/user-attachments/assets/67c12153-1ddc-40e3-8c82-1514f3afc6a8"> | <img width="550" alt="image" src="https://github.com/user-attachments/assets/3dfcdc57-fd54-42f2-8e3c-9694fa7dca88"> |
 
+| Column Options | Filter Options |
+| --- | --- |
+| <img width="550" alt="image" src="https://github.com/user-attachments/assets/67c12153-1ddc-40e3-8c82-1514f3afc6a8"> | <img width="550" alt="image" src="https://github.com/user-attachments/assets/3dfcdc57-fd54-42f2-8e3c-9694fa7dca88"> |
 
 5. **Take Actions**: If you have the necessary permissions, you can perform actions on activities by selecting the More options (...) next to the activity name.
 
@@ -262,11 +262,12 @@ https://github.com/user-attachments/assets/0f7fecfb-0b04-422b-abca-fcbe8827e2a2
 
 ### Extending Activity History
 
-> To extend your activity tracking beyond 30 day, you can use `Microsoft Purview`: <br/>
+> To extend your activity tracking beyond 30 day, you can use `Microsoft Purview`: <br/>
+>
 > - Provides extended audit log retention up to 1 year with appropriate licensing. <br>
 > - Use the Purview portal to view and export detailed activity logs. <br>
 > - Utilize the Purview REST API to access scan history beyond 30 days.
-
+
 Steps to Access Microsoft Purview via Audit Logs:
 
 1. **Navigate to the Admin Portal**:
