Merged

Changes from all commits (29 commits)
2e5acbd
overview + demo
brown9804 Apr 21, 2025
f305725
Merge 2e5acbd5fadfe7a1aeb99c1b457b292b52790df1 into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
c35eb84
Update last modified date in Markdown files
github-actions[bot] Apr 21, 2025
bc3ee34
notebook
brown9804 Apr 21, 2025
ee9b78a
Merge bc3ee342956e6ffac94d8df6ad44f2c9fd6cbd2c into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
1ac1f37
change format
brown9804 Apr 21, 2025
91b6c61
Merge 1ac1f37a751f2a9bf5c5b39cb8906ebc5d5fe865 into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
45921fe
pipeline + validate_and_fix_markdown.yml
brown9804 Apr 21, 2025
bfb4d6c
Merge 45921fe3eb56ac2880f0cd4263a6720cde7cf246 into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
3fa3cc3
fixing lbs issue
brown9804 Apr 21, 2025
42209eb
Merge 3fa3cc39f4cd055d07865e38b14875a6887022d0 into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
f0d2168
trying to fix list and fix markdown
brown9804 Apr 21, 2025
471d303
Merge f0d21688492a24120f79ebd8e980b77be5400fe1 into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
b8f4bae
+ .markdownlint.json
brown9804 Apr 21, 2025
e6461b1
Merge b8f4bae5099e562865e0308ff113ff39b20a613e into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
a97aa9e
changing path
brown9804 Apr 21, 2025
55d2536
Merge a97aa9efd3325ac770e2aeaa6920a46daa206729 into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
10b4ad8
paht
brown9804 Apr 21, 2025
52262f9
Merge 10b4ad87106c0d85eaca8515bc8a0a32f88cad8b into 4a4d2369d8ee044fe…
brown9804 Apr 21, 2025
fadedfd
clean format
brown9804 Apr 21, 2025
79c8e35
format
brown9804 Apr 21, 2025
61e665a
mk lint updated
brown9804 Apr 21, 2025
1238349
Merge 61e665acec0b93d080da97648e410ed21396300d into 79c8e357405c504f6…
brown9804 Apr 21, 2025
70c8f28
omit roles for format
brown9804 Apr 21, 2025
c66ebb8
Merge 70c8f28b5999c69582160e12ef95a6136ea9297a into 79c8e357405c504f6…
brown9804 Apr 21, 2025
8ecf50b
Fix Markdown syntax issues
github-actions[bot] Apr 21, 2025
4965128
+ validate_and_fix_notebook.yml
brown9804 Apr 21, 2025
2beed83
Merge 496512861be7df214bdb57bcb8c0d89564ea4d2c into 79c8e357405c504f6…
brown9804 Apr 21, 2025
c61196b
Update last modified date in Markdown files
github-actions[bot] Apr 21, 2025
11 changes: 11 additions & 0 deletions .github/.markdownlint.json
@@ -0,0 +1,11 @@
{
  "default": true,
  "MD005": false,
  "MD013": false,
  "MD028": false,
  "MD029": false,
  "MD033": false,
  "MD048": false,
  "MD040": false,
  "MD041": false
}
44 changes: 44 additions & 0 deletions .github/workflows/validate_and_fix_markdown.yml
@@ -0,0 +1,44 @@
name: Validate and Fix Markdown

on:
  pull_request:
    branches:
      - main
  push:
    branches:
      - main

permissions:
  contents: write

jobs:
  validate-and-fix-markdown:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Set up Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Install Markdown Linter
        run: npm install -g markdownlint-cli

      - name: Lint and Fix Markdown files
        run: markdownlint '**/*.md' --fix --config .github/.markdownlint.json

      - name: Configure Git
        run: |
          git config --global user.email "github-actions[bot]@users.noreply.github.com"
          git config --global user.name "github-actions[bot]"

      - name: Commit changes
        run: |
          git add -A
          git commit -m "Fix Markdown syntax issues" || echo "No changes to commit"
          git push origin HEAD:${{ github.event.pull_request.head.ref }}
57 changes: 57 additions & 0 deletions .github/workflows/validate_and_fix_notebook.yml
@@ -0,0 +1,57 @@
name: Validate and Fix Notebook

on:
  pull_request:
    branches:
      - main
  push:
    branches:
      - main

permissions:
  contents: write

jobs:
  validate-and-fix-notebook:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.x'

      - name: Install Jupyter and nbformat
        run: |
          pip install jupyter nbformat

      - name: Validate and Fix Notebook
        run: |
          python -c "
          import nbformat
          import glob
          # Glob for the standard .ipynb extension so notebook files are actually matched.
          for file in glob.glob('**/*.ipynb', recursive=True):
              with open(file, 'r') as f:
                  nb = nbformat.read(f, as_version=4)
              nbformat.validate(nb)
              if 'application/vnd.beylor-adapt+notebook' not in nb.metadata:
                  nb.metadata['application/vnd.beylor-adapt+notebook'] = {'version': '1.0'}
              with open(file, 'w') as f:
                  nbformat.write(nb, f)
          "

      - name: Configure Git
        run: |
          git config --global user.email "github-actions[bot]@users.noreply.github.com"
          git config --global user.name "github-actions[bot]"

      - name: Commit changes
        run: |
          git add -A
          git commit -m "Fix notebook format issues" || echo "No changes to commit"
          git push origin HEAD:${{ github.event.pull_request.head.ref }}
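For a quick local check before pushing, a read-only sketch of the same validation (assuming `nbformat` is installed and the command is run from the repository root) could look like this:

```python
import glob

import nbformat

# Report invalid notebooks without rewriting them.
invalid = []
for path in glob.glob("**/*.ipynb", recursive=True):
    with open(path, "r", encoding="utf-8") as f:
        nb = nbformat.read(f, as_version=4)
    try:
        nbformat.validate(nb)
    except nbformat.ValidationError as err:
        invalid.append((path, str(err)))

for path, err in invalid:
    print(f"INVALID: {path}\n  {err}")
print(f"{len(invalid)} invalid notebook(s) found")
```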
26 changes: 12 additions & 14 deletions Deployment-Pipelines/README.md
@@ -5,12 +5,11 @@ Costa Rica
[![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
[brown9804](https://github.com/brown9804)

Last updated: 2025-04-15
Last updated: 2025-04-21

------------------------------------------

> Lakehouse Schema and Deployment Pipelines

> Lakehouse Schema and Deployment Pipelines

<details>
<summary><b> List of References </b> (Click to expand)</summary>
@@ -33,15 +32,14 @@ Last updated: 2025-04-15

- [Overview](#overview)
- [Demo](#demo)
- [Create a Workspace](#create-a-workspace)
- [Create a Lakehouse](#create-a-lakehouse)
- [Create a New Semantic Model](#create-a-new-semantic-model)
- [Auto-Generate Report with Copilot](#auto-generate-report-with-copilot)
- [Create a Deployment Pipeline](#create-a-deployment-pipeline)
- [Deploy to Production](#deploy-to-production)
- [Create a Workspace](#create-a-workspace)
- [Create a Lakehouse](#create-a-lakehouse)
- [Create a New Semantic Model](#create-a-new-semantic-model)
- [Auto-Generate Report with Copilot](#auto-generate-report-with-copilot)
- [Create a Deployment Pipeline](#create-a-deployment-pipeline)
- [Deploy to Production](#deploy-to-production)
- [How to refresh the data](#how-to-refresh-the-data)


</details>

## Overview
@@ -61,12 +59,12 @@ Process Overview:
> `Specifics for Lakehouse:` For lakehouses, the deployment process typically `includes the structure and metadata but not the actual data tables`. This is why you might see the structure and semantic models deployed, but the tables themselves need to be manually refreshed or reloaded in the target environment.<br/> <br/>
> `Deployment Rules:` You can set deployment rules to manage different stages and change content settings during deployment. For example, you can specify default lakehouses for notebooks to avoid manual changes post-deployment.

## Demo
## Demo

<div align="center">
<img src="https://github.com/user-attachments/assets/46045fa8-34e8-4fff-a343-d49f36eece89" alt="Centered Image" style="border: 2px solid #4CAF50; border-radius: 2px; padding: 2px; width: 500px; height: auto;"/>
</div>

### Create a Workspace

1. Navigate to the Microsoft Fabric portal.
@@ -122,11 +120,10 @@ Process Overview:

<img width="550" alt="image" src="https://github.com/user-attachments/assets/25bda38f-41fa-45f9-aa01-e647d9f4bd84" />

4. At this point, you should see something similar to the following:
4. At this point, you should see something similar to the following:

<img width="550" alt="image" src="https://github.com/user-attachments/assets/e1f88782-ddc6-4fb8-947a-af23b92a8415" />


### Auto-Generate Report with Copilot

> [!NOTE]
@@ -193,6 +190,7 @@ Process Overview:
| **Incremental Refresh** | Refreshes only the data that has changed since the last refresh, improving efficiency. Click [here to understand more about incremental refresh](../Workloads-Specific/PowerBi/IncrementalRefresh.md)| - **Evaluate Changes**: Checks for changes in the data source based on a DateTime column.<br>- **Retrieve Data**: Only changed data is retrieved and loaded.<br>- **Replace Data**: Updated data is processed and replaced. |

Steps to Set Up Incremental Refresh (an illustrative PySpark sketch of the filter logic follows the list):

1. **Create or Open a Dataflow**: Start by creating a new Dataflow Gen2 or opening an existing one.
2. **Configure the Query**: Ensure your query includes a DateTime column that can be used to filter the data.
3. **Enable Incremental Refresh**: Right-click the query and select Incremental Refresh. Configure the settings, such as the DateTime column and the time range for data extraction.
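
The Dataflow Gen2 UI performs this filtering for you; purely as an illustration of the idea, a notebook-style PySpark sketch of pulling only the rows changed since the last refresh might look like this (the `sales_orders` and `refresh_log` table names and the `modified_at` watermark column are assumptions, not objects in this repo):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # in a Fabric notebook this session already exists

# Evaluate changes: find the last successful refresh time (assumed `refresh_log` table).
last_refresh = spark.table("refresh_log").agg(F.max("refreshed_at")).first()[0]

# Retrieve data: keep only rows modified since that watermark.
changed = spark.table("sales_orders").where(F.col("modified_at") > F.lit(last_refresh))

# Replace data: write just the changed slice to the staging/target table.
changed.write.format("delta").mode("append").saveAsTable("sales_orders_staging")
```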
3 changes: 1 addition & 2 deletions Deployment-Pipelines/samples/data/readme.md
@@ -1,3 +1,2 @@


Last updated: 2025-04-15
Last updated: 2025-04-21
10 changes: 4 additions & 6 deletions GitHub-Integration.md
@@ -1,12 +1,12 @@
# Integrating GitHub with Microsoft Fabric - Overview
# Integrating GitHub with Microsoft Fabric - Overview

Costa Rica

[![GitHub](https://badgen.net/badge/icon/github?icon=github&label)](https://github.com)
[![GitHub](https://badgen.net/badge/icon/github?icon=github&label)](https://github.com)
[![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
[brown9804](https://github.com/brown9804)

Last updated: 2025-04-15
Last updated: 2025-04-21

----------

@@ -25,8 +25,6 @@ Last updated: 2025-04-15
<details>
<summary><b>Table of Content </b> (Click to expand)</summary>

- [Wiki](#wiki)
- [Content](#content)
- [Connect a workspace to a Git repo](#connect-a-workspace-to-a-git-repo)
- [Connecting to a workspace Already linked to GitHub](#connecting-to-a-workspace-already-linked-to-github)
- [Commit changes to git](#commit-changes-to-git)
@@ -36,7 +34,7 @@ Last updated: 2025-04-15

</details>

https://github.com/user-attachments/assets/64f099a1-b749-47a6-b723-fa1cb5c575a3
<https://github.com/user-attachments/assets/64f099a1-b749-47a6-b723-fa1cb5c575a3>

## Connect a workspace to a Git repo

37 changes: 19 additions & 18 deletions Monitoring-Observability/FabricActivatorRulePipeline/README.md
@@ -5,11 +5,12 @@ Costa Rica
[![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
[brown9804](https://github.com/brown9804)

Last updated: 2025-04-15
Last updated: 2025-04-21

----------

> This process shows how to set up Microsoft Fabric Activator to automate workflows by detecting file creation events in a storage system and triggering another pipeline to run. <br/>
>
> 1. **First Pipeline**: The process starts with a pipeline that ends with a `Copy Data` activity. This activity uploads data into the `Lakehouse`. <br/>
> 2. **Event Stream Setup**: An `Event Stream` is configured in Activator to monitor the Lakehouse for file creation or data upload events. <br/>
> 3. **Triggering the Second Pipeline**: Once the event is detected (e.g., a file is uploaded), the Event Stream triggers the second pipeline to continue the workflow.
@@ -25,19 +26,19 @@ Last updated: 2025-04-15
<details>
<summary><b>List of Content </b> (Click to expand)</summary>

- [Set Up the First Pipeline](#set-up-the-first-pipeline)
- [Configure Activator to Detect the Event](#configure-activator-to-detect-the-event)
- [Set Up the Second Pipeline](#set-up-the-second-pipeline)
- [Define the Rule in Activator](#define-the-rule-in-activator)
- [Test the Entire Workflow](#test-the-entire-workflow)
- [Troubleshooting If Needed](#troubleshooting-if-needed)
- [Set Up the First Pipeline](#set-up-the-first-pipeline)
- [Configure Activator to Detect the Event](#configure-activator-to-detect-the-event)
- [Set Up the Second Pipeline](#set-up-the-second-pipeline)
- [Define the Rule in Activator](#define-the-rule-in-activator)
- [Test the Entire Workflow](#test-the-entire-workflow)
- [Troubleshooting If Needed](#troubleshooting-if-needed)

</details>

> [!NOTE]
> This code generates random data with fields such as id, name, age, email, and created_at, organizes it into a PySpark DataFrame, and saves it to a specified Lakehouse path using the Delta format. Click here to see the [example script](./GeneratesRandomData.ipynb)

https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d
<https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d>
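
The note above links to `GeneratesRandomData.ipynb`; for orientation, a minimal PySpark sketch of what it describes could look like the following (the row count, session setup, and target table path are illustrative assumptions rather than the contents of the linked script):

```python
import random
import uuid
from datetime import datetime, timedelta

from pyspark.sql import Row, SparkSession

# In a Fabric notebook a SparkSession named `spark` already exists;
# getOrCreate() simply reuses it when run there.
spark = SparkSession.builder.getOrCreate()

# Rows with the fields the note mentions: id, name, age, email, created_at.
rows = [
    Row(
        id=str(uuid.uuid4()),
        name=f"user_{i}",
        age=random.randint(18, 80),
        email=f"user_{i}@example.com",
        created_at=datetime.utcnow() - timedelta(minutes=random.randint(0, 10_000)),
    )
    for i in range(100)  # illustrative row count
]

df = spark.createDataFrame(rows)

# Illustrative Lakehouse destination; the real script may use a different path or table name.
df.write.format("delta").mode("overwrite").save("Tables/random_people")
```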

## Set Up the First Pipeline

@@ -50,14 +51,14 @@ https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d
- Ensure the file name and path are consistent and predictable (e.g., `trigger_file.json` in a specific folder).
3. **Publish and Test**: Publish the pipeline and test it to ensure the trigger file is created successfully.

https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831
<https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831>

## Configure Activator to Detect the Event

> [!TIP]
> Event options:

https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
<https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d>

1. **Set Up an Event**:
- Create a new event to monitor the location where the trigger file is created (e.g., ADLS or OneLake). Click on `Real-Time`:
@@ -71,18 +72,18 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
<img width="550" alt="image" src="https://github.com/user-attachments/assets/94e5556b-5d56-4a42-9edd-83b514e7c953" />

- Add a source:

<img width="550" alt="image" src="https://github.com/user-attachments/assets/9709a690-f3b5-453b-b3d9-c67d4b1a9465" />

<img width="550" alt="image" src="https://github.com/user-attachments/assets/8dcadd23-4abb-47ee-82ca-f3868cb818e1" />

https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b
<https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b>

2. **Test Event Detection**:
- Save the event and test it by manually running the first pipeline to ensure Activator detects the file creation.
- Check the **Event Details** screen in Activator to confirm the event is logged.

https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd
<https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd>

## Set Up the Second Pipeline

@@ -91,13 +92,13 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
- Ensure it is configured to accept external triggers.
2. **Publish the Pipeline**: Publish the second pipeline and ensure it is ready to be triggered.

https://github.com/user-attachments/assets/5b630579-a0ec-4d5b-b973-d9b4fdd8254c
<https://github.com/user-attachments/assets/5b630579-a0ec-4d5b-b973-d9b4fdd8254c>

## Define the Rule in Activator

1. **Setup the Activator**:

https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568
<https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568>

2. **Create a New Rule**:
- In `Activator`, create a rule that responds to the event you just configured.
@@ -109,17 +110,18 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
- Save the rule and activate it.
- Ensure the rule is enabled and ready to respond to the event.

https://github.com/user-attachments/assets/5f139eeb-bab0-4d43-9f22-bbe44503ed75
<https://github.com/user-attachments/assets/5f139eeb-bab0-4d43-9f22-bbe44503ed75>

## Test the Entire Workflow

1. **Run the First Pipeline**: Execute the first pipeline and verify that the trigger file is created.
2. **Monitor Activator**: Check the `Event Details` and `Rule Activation Details` in Activator to ensure the event is detected and the rule is activated.
3. **Verify the Second Pipeline**: Confirm that the second pipeline is triggered and runs successfully.

https://github.com/user-attachments/assets/0a1dab70-2317-4636-b0be-aa0cb301b496
<https://github.com/user-attachments/assets/0a1dab70-2317-4636-b0be-aa0cb301b496>

## Troubleshooting (If Needed)

- If the second pipeline does not trigger:
1. Double-check the rule configuration in Activator.
2. Review the logs in Activator for any errors or warnings.
@@ -128,4 +130,3 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d
<h3 style="color: #4CAF50;">Total Visitors</h3>
<img src="https://profile-counter.glitch.me/brown9804/count.svg" alt="Visitor Count" style="border: 2px solid #4CAF50; border-radius: 5px; padding: 5px;"/>
</div>
