
Commit d4b7b9e

Merge pull request #10 from MicrosoftCloudEssentials-LearningHub/ai-fabric
overview + demo
2 parents 79c8e35 + c61196b commit d4b7b9e

File tree

22 files changed: +1965 −245 lines


.github/.markdownlint.json

Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
+{
+  "default": true,
+  "MD005": false,
+  "MD013": false,
+  "MD028": false,
+  "MD029": false,
+  "MD033": false,
+  "MD048": false,
+  "MD040": false,
+  "MD041": false
+}
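As a quick sanity check of the config above, the disabled rules can be listed with a few lines of Python (the JSON content is inlined here purely for illustration; in the repo it lives at `.github/.markdownlint.json`):

```python
import json

# The .markdownlint.json content from the diff above, inlined for illustration.
config_text = """
{
  "default": true,
  "MD005": false,
  "MD013": false,
  "MD028": false,
  "MD029": false,
  "MD033": false,
  "MD048": false,
  "MD040": false,
  "MD041": false
}
"""

config = json.loads(config_text)
# "default": true enables all rules; the MD* keys set to false opt out individually.
disabled = sorted(rule for rule, enabled in config.items()
                  if rule.startswith("MD") and enabled is False)
print(disabled)
```

So with this config, markdownlint enforces everything except the eight rules listed (line length, inline HTML, and friends).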
Lines changed: 44 additions & 0 deletions
@@ -0,0 +1,44 @@
+name: Validate and Fix Markdown
+
+on:
+  pull_request:
+    branches:
+      - main
+  push:
+    branches:
+      - main
+
+permissions:
+  contents: write
+
+jobs:
+  validate-and-fix-markdown:
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+
+      - name: Set up Node.js
+        uses: actions/setup-node@v3
+        with:
+          node-version: '16'
+
+      - name: Install Markdown Linter
+        run: npm install -g markdownlint-cli
+
+      - name: Lint and Fix Markdown files
+        run: markdownlint '**/*.md' --fix --config .github/.markdownlint.json
+
+      - name: Configure Git
+        run: |
+          git config --global user.email "github-actions[bot]@users.noreply.github.com"
+          git config --global user.name "github-actions[bot]"
+
+      - name: Commit changes
+        run: |
+          git add -A
+          git commit -m "Fix Markdown syntax issues" || echo "No changes to commit"
+          git push origin HEAD:${{ github.event.pull_request.head.ref }}
Lines changed: 57 additions & 0 deletions
@@ -0,0 +1,57 @@
+name: Validate and Fix Notebook
+
+on:
+  pull_request:
+    branches:
+      - main
+  push:
+    branches:
+      - main
+
+permissions:
+  contents: write
+
+jobs:
+  validate-and-fix-notebook:
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+
+      - name: Set up Python
+        uses: actions/setup-python@v4
+        with:
+          python-version: '3.x'
+
+      - name: Install Jupyter and nbformat
+        run: |
+          pip install jupyter nbformat
+
+      - name: Validate and Fix Notebook
+        run: |
+          python -c "
+          import nbformat
+          import glob
+          for file in glob.glob('**/*.ipynb', recursive=True):
+              with open(file, 'r') as f:
+                  nb = nbformat.read(f, as_version=4)
+              nbformat.validate(nb)
+              if 'application/vnd.beylor-adapt+notebook' not in nb.metadata:
+                  nb.metadata['application/vnd.beylor-adapt+notebook'] = {'version': '1.0'}
+                  with open(file, 'w') as f:
+                      nbformat.write(nb, f)
+          "
+
+      - name: Configure Git
+        run: |
+          git config --global user.email "github-actions[bot]@users.noreply.github.com"
+          git config --global user.name "github-actions[bot]"
+
+      - name: Commit changes
+        run: |
+          git add -A
+          git commit -m "Fix notebook format issues" || echo "No changes to commit"
+          git push origin HEAD:${{ github.event.pull_request.head.ref }}
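The inline `python -c` script above is hard to read inside a YAML `run` block. A dependency-free sketch of the same metadata fix-up is shown below — notebooks are plain JSON, so the stdlib suffices for the metadata step; `nbformat.validate()` in the workflow additionally checks the notebook schema, which this sketch does not reproduce. The metadata key is taken verbatim from the workflow:

```python
import json
from pathlib import Path

# Metadata key copied from the workflow above.
METADATA_KEY = "application/vnd.beylor-adapt+notebook"

def ensure_notebook_metadata(path: Path) -> bool:
    """Add the marker metadata to an .ipynb file if it is missing.

    Returns True when the file was modified. This mirrors only the
    metadata fix-up from the workflow's inline script, not the schema
    validation that nbformat.validate() performs.
    """
    nb = json.loads(path.read_text(encoding="utf-8"))
    if METADATA_KEY in nb.get("metadata", {}):
        return False
    nb.setdefault("metadata", {})[METADATA_KEY] = {"version": "1.0"}
    path.write_text(json.dumps(nb, indent=1), encoding="utf-8")
    return True

# Smoke test against a minimal empty v4 notebook.
demo = Path("demo.ipynb")
demo.write_text(json.dumps({"cells": [], "metadata": {},
                            "nbformat": 4, "nbformat_minor": 5}))
changed = ensure_notebook_metadata(demo)
print(changed)
```

Running it a second time on the same file returns `False`, matching the workflow's behavior of only rewriting notebooks that lack the key.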

Deployment-Pipelines/README.md

Lines changed: 12 additions & 14 deletions
@@ -5,12 +5,11 @@ Costa Rica
 [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
 [brown9804](https://github.com/brown9804)
 
-Last updated: 2025-04-15
+Last updated: 2025-04-21
 
 ------------------------------------------
 
-> Lakehouse Schema and Deployment Pipelines
-
+> Lakehouse Schema and Deployment Pipelines
 
 <details>
 <summary><b> List of References </b> (Click to expand)</summary>
@@ -33,15 +32,14 @@ Last updated: 2025-04-15
 
 - [Overview](#overview)
 - [Demo](#demo)
-- [Create a Workspace](#create-a-workspace)
-- [Create a Lakehouse](#create-a-lakehouse)
-- [Create a New Semantic Model](#create-a-new-semantic-model)
-- [Auto-Generate Report with Copilot](#auto-generate-report-with-copilot)
-- [Create a Deployment Pipeline](#create-a-deployment-pipeline)
-- [Deploy to Production](#deploy-to-production)
+- [Create a Workspace](#create-a-workspace)
+- [Create a Lakehouse](#create-a-lakehouse)
+- [Create a New Semantic Model](#create-a-new-semantic-model)
+- [Auto-Generate Report with Copilot](#auto-generate-report-with-copilot)
+- [Create a Deployment Pipeline](#create-a-deployment-pipeline)
+- [Deploy to Production](#deploy-to-production)
 - [How to refresh the data](#how-to-refresh-the-data)
 
-
 </details>
 
 ## Overview
@@ -61,12 +59,12 @@ Process Overview:
 > `Specifics for Lakehouse:` For lakehouses, the deployment process typically `includes the structure and metadata but not the actual data tables`. This is why you might see the structure and semantic models deployed, but the tables themselves need to be manually refreshed or reloaded in the target environment.<br/> <br/>
 > `Deployment Rules:` You can set deployment rules to manage different stages and change content settings during deployment. For example, you can specify default lakehouses for notebooks to avoid manual changes post-deployment.
 
-## Demo
+## Demo
 
 <div align="center">
 <img src="https://github.com/user-attachments/assets/46045fa8-34e8-4fff-a343-d49f36eece89" alt="Centered Image" style="border: 2px solid #4CAF50; border-radius: 2px; padding: 2px; width: 500px; height: auto;"/>
 </div>
-
+
 ### Create a Workspace
 
 1. Navigate to the Microsoft Fabric portal.
@@ -122,11 +120,10 @@ Process Overview:
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/25bda38f-41fa-45f9-aa01-e647d9f4bd84" />
 
-4. At this point, you should see something similar like following: 
+4. At this point, you should see something similar to the following:
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/e1f88782-ddc6-4fb8-947a-af23b92a8415" />
 
-
 ### Auto-Generate Report with Copilot
 
 > [!NOTE]
@@ -193,6 +190,7 @@ Process Overview:
 | **Incremental Refresh** | Refreshes only the data that has changed since the last refresh, improving efficiency. Click [here to understand more about incremental refresh](../Workloads-Specific/PowerBi/IncrementalRefresh.md)| - **Evaluate Changes**: Checks for changes in the data source based on a DateTime column.<br>- **Retrieve Data**: Only changed data is retrieved and loaded.<br>- **Replace Data**: Updated data is processed and replaced. |
 
 Steps to Set Up Incremental Refresh:
+
 1. **Create or Open a Dataflow**: Start by creating a new Dataflow Gen2 or opening an existing one.
 2. **Configure the Query**: Ensure your query includes a DateTime column that can be used to filter the data.
 3. **Enable Incremental Refresh**: Right-click the query and select Incremental Refresh. Configure the settings, such as the DateTime column and the time range for data extraction.
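The incremental-refresh flow in the diff above (evaluate changes against a DateTime column, retrieve only the changed rows, replace them) can be sketched in plain Python. The record layout and the `modified_at` field are illustrative stand-ins, not a Fabric or Dataflow API:

```python
from datetime import datetime, timezone

# Hypothetical source rows; in a Dataflow Gen2 this filter runs against
# the source query using the configured DateTime column.
rows = [
    {"id": 1, "modified_at": datetime(2025, 4, 10, tzinfo=timezone.utc)},
    {"id": 2, "modified_at": datetime(2025, 4, 18, tzinfo=timezone.utc)},
    {"id": 3, "modified_at": datetime(2025, 4, 20, tzinfo=timezone.utc)},
]

def incremental_batch(rows, last_refresh):
    """Return only rows changed since the previous refresh — the
    'Evaluate Changes' / 'Retrieve Data' steps from the table above."""
    return [r for r in rows if r["modified_at"] > last_refresh]

last_refresh = datetime(2025, 4, 15, tzinfo=timezone.utc)
changed = incremental_batch(rows, last_refresh)
print([r["id"] for r in changed])
```

Only the rows newer than the last refresh timestamp are retrieved and reloaded, which is why incremental refresh is cheaper than a full refresh on large tables.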
Lines changed: 1 addition & 2 deletions
@@ -1,3 +1,2 @@
 
-
-Last updated: 2025-04-15
+Last updated: 2025-04-21

GitHub-Integration.md

Lines changed: 4 additions & 4 deletions
@@ -1,12 +1,12 @@
-# Integrating GitHub with Microsoft Fabric - Overview
+# Integrating GitHub with Microsoft Fabric - Overview
 
 Costa Rica
 
-[![GitHub](https://badgen.net/badge/icon/github?icon=github&label)](https://github.com)
+[![GitHub](https://badgen.net/badge/icon/github?icon=github&label)](https://github.com)
 [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
 [brown9804](https://github.com/brown9804)
 
-Last updated: 2025-04-15
+Last updated: 2025-04-21
 
 ----------
 
@@ -34,7 +34,7 @@ Last updated: 2025-04-15
 
 </details>
 
-https://github.com/user-attachments/assets/64f099a1-b749-47a6-b723-fa1cb5c575a3
+<https://github.com/user-attachments/assets/64f099a1-b749-47a6-b723-fa1cb5c575a3>
 
 ## Connect a workspace to a Git repo

Monitoring-Observability/FabricActivatorRulePipeline/README.md

Lines changed: 19 additions & 18 deletions
@@ -5,11 +5,12 @@ Costa Rica
 [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
 [brown9804](https://github.com/brown9804)
 
-Last updated: 2025-04-15
+Last updated: 2025-04-21
 
 ----------
 
 > This process shows how to set up Microsoft Fabric Activator to automate workflows by detecting file creation events in a storage system and triggering another pipeline to run. <br/>
+>
 > 1. **First Pipeline**: The process starts with a pipeline that ends with a `Copy Data` activity. This activity uploads data into the `Lakehouse`. <br/>
 > 2. **Event Stream Setup**: An `Event Stream` is configured in Activator to monitor the Lakehouse for file creation or data upload events. <br/>
 > 3. **Triggering the Second Pipeline**: Once the event is detected (e.g., a file is uploaded), the Event Stream triggers the second pipeline to continue the workflow.
@@ -25,19 +26,19 @@ Last updated: 2025-04-15
 <details>
 <summary><b>List of Content </b> (Click to expand)</summary>
 
-- [Set Up the First Pipeline](#set-up-the-first-pipeline)
-- [Configure Activator to Detect the Event](#configure-activator-to-detect-the-event)
-- [Set Up the Second Pipeline](#set-up-the-second-pipeline)
-- [Define the Rule in Activator](#define-the-rule-in-activator)
-- [Test the Entire Workflow](#test-the-entire-workflow)
-- [Troubleshooting If Needed](#troubleshooting-if-needed)
+- [Set Up the First Pipeline](#set-up-the-first-pipeline)
+- [Configure Activator to Detect the Event](#configure-activator-to-detect-the-event)
+- [Set Up the Second Pipeline](#set-up-the-second-pipeline)
+- [Define the Rule in Activator](#define-the-rule-in-activator)
+- [Test the Entire Workflow](#test-the-entire-workflow)
+- [Troubleshooting If Needed](#troubleshooting-if-needed)
 
 </details>
 
 > [!NOTE]
 > This code generates random data with fields such as id, name, age, email, and created_at, organizes it into a PySpark DataFrame, and saves it to a specified Lakehouse path using the Delta format. Click here to see the [example script](./GeneratesRandomData.ipynb)
 
-https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d
+<https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d>
 
 ## Set Up the First Pipeline
 
@@ -50,14 +51,14 @@ https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d
 - Ensure the file name and path are consistent and predictable (e.g., `trigger_file.json` in a specific folder).
 3. **Publish and Test**: Publish the pipeline and test it to ensure the trigger file is created successfully.
 
-https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831
+<https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831>
 
 ## Configure Activator to Detect the Event
 
 > [!TIP]
 > Event options:
 
-https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
+<https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d>
 
 1. **Set Up an Event**:
 - Create a new event to monitor the location where the trigger file is created (e.g., ADLS or OneLake). Click on `Real-Time`:
@@ -71,18 +72,18 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/94e5556b-5d56-4a42-9edd-83b514e7c953" />
 
 - Add a source:
-
+
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/9709a690-f3b5-453b-b3d9-c67d4b1a9465" />
 
 <img width="550" alt="image" src="https://github.com/user-attachments/assets/8dcadd23-4abb-47ee-82ca-f3868cb818e1" />
 
-https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b
+<https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b>
 
 2. **Test Event Detection**:
 - Save the event and test it by manually running the first pipeline to ensure Activator detects the file creation.
 - Check the **Event Details** screen in Activator to confirm the event is logged.
 
-https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd
+<https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd>
 
 ## Set Up the Second Pipeline
 
@@ -91,13 +92,13 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
 - Ensure it is configured to accept external triggers.
 2. **Publish the Pipeline**: Publish the second pipeline and ensure it is ready to be triggered.
 
-https://github.com/user-attachments/assets/5b630579-a0ec-4d5b-b973-d9b4fdd8254c
+<https://github.com/user-attachments/assets/5b630579-a0ec-4d5b-b973-d9b4fdd8254c>
 
 ## Define the Rule in Activator
 
 1. **Setup the Activator**:
 
-https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568
+<https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568>
 
 2. **Create a New Rule**:
 - In `Activator`, create a rule that responds to the event you just configured.
@@ -109,17 +110,18 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
 - Save the rule and activate it.
 - Ensure the rule is enabled and ready to respond to the event.
 
-https://github.com/user-attachments/assets/5f139eeb-bab0-4d43-9f22-bbe44503ed75
+<https://github.com/user-attachments/assets/5f139eeb-bab0-4d43-9f22-bbe44503ed75>
 
 ## Test the Entire Workflow
 
 1. **Run the First Pipeline**: Execute the first pipeline and verify that the trigger file is created.
 2. **Monitor Activator**: Check the `Event Details` and `Rule Activation Details` in Activator to ensure the event is detected and the rule is activated.
 3. **Verify the Second Pipeline**: Confirm that the second pipeline is triggered and runs successfully.
 
-https://github.com/user-attachments/assets/0a1dab70-2317-4636-b0be-aa0cb301b496
+<https://github.com/user-attachments/assets/0a1dab70-2317-4636-b0be-aa0cb301b496>
 
 ## Troubleshooting (If Needed)
+
 - If the second pipeline does not trigger:
 1. Double-check the rule configuration in Activator.
 2. Review the logs in Activator for any errors or warnings.
@@ -128,4 +130,3 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
 <h3 style="color: #4CAF50;">Total Visitors</h3>
 <img src="https://profile-counter.glitch.me/brown9804/count.svg" alt="Visitor Count" style="border: 2px solid #4CAF50; border-radius: 5px; padding: 5px;"/>
 </div>
-
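The trigger-file pattern in this walkthrough (the first pipeline writes `trigger_file.json`, Activator detects the file-creation event and starts the second pipeline) can be simulated locally. The path and the callback below are illustrative stand-ins, not the Activator or Fabric API:

```python
import json
from pathlib import Path

# Illustrative path; in the walkthrough this lives in the Lakehouse.
TRIGGER = Path("landing") / "trigger_file.json"

def first_pipeline():
    """Stand-in for the first pipeline's Copy Data step: write the trigger file."""
    TRIGGER.parent.mkdir(parents=True, exist_ok=True)
    TRIGGER.write_text(json.dumps({"status": "done"}))

def activator_rule(run_second_pipeline):
    """Stand-in for the Activator rule: fire the action once the file exists."""
    if TRIGGER.exists():
        run_second_pipeline()
        return True
    return False

runs = []
first_pipeline()
fired = activator_rule(lambda: runs.append("second-pipeline"))
print(fired, runs)
```

The real setup replaces the `exists()` check with an Event Stream that pushes file-creation events to the rule, so the second pipeline starts without polling.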
