learn-pr/wwl/orchestrate-processes-in-fabric/includes/1-introduction.md (1 addition, 1 deletion)

@@ -20,4 +20,4 @@ The articles covered in this module include:
 - Event-based triggers and scheduling in Microsoft Fabric.
 - practice some dynamic features of Microsoft Fabric notebooks
 
-Students learn how Data Factory in Microsoft Fabric is used for modern data integration, including ingesting, preparing, and transforming data from various sources. They understand how to create pipelines, automate processes with triggers, and utilize Fast Copy for rapid data movement to Lakehouse and Data Warehouse. Additionally, they explore Notebooks in Microsoft Fabric for interactive data exploration, multi-language support, data visualization, and collaboration. The training provides an overview of integration with other Microsoft Fabric services, event-based triggers, scheduling, and dynamic features of notebooks, providing an understanding of data workflows within the Microsoft Fabric ecosystem.
+Students learn how Data Factory in Microsoft Fabric is used for modern data integration, including ingesting, preparing, and transforming data from various sources. They understand how to create pipelines, automate processes with triggers, and utilize Fast Copy for rapid data movement to Lakehouse and Data Warehouse. Additionally, they explore Notebooks in Microsoft Fabric for interactive data exploration, multi-language support, data visualization, and collaboration. The training provides an overview of integration with other Microsoft Fabric services, event-based triggers, scheduling, and dynamic features of notebooks, providing an understanding of data workflows within Microsoft Fabric.
learn-pr/wwl/orchestrate-processes-in-fabric/includes/2-choose-between-pipeline-notebook.md (3 additions, 3 deletions)

@@ -16,7 +16,7 @@ Dataflows offer a low-code interface to ingest data from numerous sources and tr
 
 ### Power Query integration
 
-Dataflows are built using the Power Query experience, available across Microsoft products like Excel, Power BI, Power Platform, and Dynamics 365 Insights. Power Query enables users, from beginners to professionals, to perform data ingestion and transformations with ease. It supports joins, aggregations, data cleansing, custom transformations, and more, all through a user-friendly, visual, low-code interface.
+Dataflows are built using the Power Query experience, available across Microsoft products like Excel, Power BI, and Power Platform. Power Query enables users, from beginners to professionals, to perform data ingestion and transformations with ease. It supports joins, aggregations, data cleansing, custom transformations, and more, all through a user-friendly, visual, low-code interface.
 
 ### Real-World Uses Cases for Dataflows
 
@@ -44,7 +44,7 @@ Data pipelines offer powerful workflow capabilities at cloud-scale, enabling you
 ### Important features of data Pipelines
 
 -**Complex Workflows**: Build workflows that can refresh dataflows, move large volumes of data, and define control flow pipelines.
--**ETL and Data Factory Workflows**: Create complex ETL (Extract, Transform, Load) and data factory workflows to perform various tasks at scale.
+-**ETL and Data Factory Workflows**: Create complex ETL (Extract, Transform, Load) and data factory workflows which perform various tasks at scale.
 -**Control Flow Capabilities**: Utilize built-in control flow features to build workflow logic with loops and conditionals.
 
 ### End-to-End ETL Data Pipeline
@@ -57,6 +57,6 @@ Combine a configuration-driven copy activity with your low-code dataflow refresh
 -**Multi-language Support:** Users can write and execute code in multiple languages within the same notebook, enhancing flexibility and collaboration.
 -**Visualization:** Notebooks support rich data visualization, enabling users to create charts, graphs, and other visual representations of data.
 -**Collaboration:** Notebooks facilitate collaboration by allowing multiple users to work on the same document simultaneously, share insights, and track changes.
--**Integration with Fabric Services:** Notebooks seamlessly integrate with other Microsoft Fabric services, such as Data Factory, Synapse Data Engineering, and Synapse Data Science. This provides a unified platform for end-to-end data workflows.
+-**Integration with Fabric Services:** Notebooks seamlessly integrate with other Microsoft Fabric services, such as Data Factory, Synapse Data Engineering, and Synapse Data Science. This approach provides a unified platform for end-to-end data workflows.
 
 When comparing these technologies, it's important to note that while Data Factory focuses on data integration and pipeline automation, notebooks in Microsoft Fabric provide an interactive and ***collaborative*** environment for data exploration, documentation, transformation, and analysis. Both tools complement each other, offering a comprehensive solution for managing and analyzing data within the Microsoft Fabric ecosystem.
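To make the multi-language and visualization bullets above concrete, here is a minimal notebook sketch. It assumes a Fabric notebook with PySpark as the default language, the built-in `display()` output helper, and the `%%sql` cell magic for switching languages; the lakehouse table `sales_orders` and its columns are hypothetical placeholders, not part of the module content.

```python
# Cell 1 (PySpark): read a lakehouse table and render a built-in chart.
# display() is the notebook's rich-output helper; the chart type is picked in the output pane.
df = spark.read.table("sales_orders")            # hypothetical lakehouse table
display(df.groupBy("region").sum("amount"))      # aggregate, then visualize the result

# Cell 2 would switch languages with a cell magic, e.g. Spark SQL in the same notebook:
# %%sql
# SELECT region, SUM(amount) AS total_amount
# FROM sales_orders
# GROUP BY region
```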
learn-pr/wwl/orchestrate-processes-in-fabric/includes/4-exercise-implement-dynamic-patterns-to-notebooks.md (1 addition, 1 deletion)

@@ -6,4 +6,4 @@ Now it's your chance to build and schedule a pipeline with a dynamic notebook in
 
 Launch the exercise and follow the instructions.
 
-[](https://go.microsoft.com/fwlink/?linkid=2260722)
+[](https://go.microsoft.com/fwlink/?linkid=2260721)
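Since the exercise centers on a dynamic (parameterized) notebook driven by a pipeline, the sketch below shows the general pattern. It assumes a Fabric notebook cell designated as the parameters cell so a pipeline Notebook activity can override the defaults at run time; the parameter names, table, and column are hypothetical and not taken from the exercise itself.

```python
# Parameters cell -- designate this cell as the notebook's parameters cell
# (via the cell's more-commands menu) so a pipeline Notebook activity can
# override these defaults through its base parameters.
table_name = "sales_orders"      # hypothetical default table
load_date = "2024-01-01"         # hypothetical default load date

# Subsequent cells consume whatever values the pipeline injected,
# letting one notebook serve many scheduled or event-driven runs.
df = spark.read.table(table_name).filter(f"order_date = '{load_date}'")
display(df)
```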
learn-pr/wwl/orchestrate-processes-in-fabric/includes/6-summary.md (2 additions, 2 deletions)

@@ -1,6 +1,6 @@
-In this module, you learned about the functionalities and applications of Microsoft Fabric's Data Factory. The module covered how Data Factory offers a modern approach to data integration, allowing for the collection, preparation, and transformation of data from various sources. You learned about Dataflows, a low-code interface that enables data ingestion from numerous sources and transformation using over 300 data transformations. The module also discussed the use of Notebooks for interactive data exploration, multi-language coding, rich data visualization, collaboration, and integration with other Fabric services. Lastly, you learned about the different options for scheduling jobs in Fabric, including traditional, proactive, and event-driven methods.
+In this module, you learned about the functionalities and applications of Microsoft Fabric's Data Factory. The module covered how Data Factory offers a modern approach to data integration, allowing for the collection, preparation, and transformation of data from various sources. You learned about Dataflows, a low-code interface that enables data ingestion from numerous sources and transformation using over 300 data transformations.
 
-The main takeaways from this module include understanding how Data Factory and Notebooks complement each other in managing and analyzing data within the Microsoft Fabric ecosystem. You also learned about the real-world use cases of Dataflows, such as data consolidation for reporting, data preparation for machine learning, real-time data processing, data migration, and self-service data preparation. Additionally, you gained insights into how to initiate data pipeline runs either manually or automatically and how to automate tasks based on events in storage accounts using storage event triggers.
+The main takeaways from this module include understanding how Data Factory and Notebooks complement each other in managing and analyzing data within the Microsoft Fabric ecosystem. Additionally, you gained insights into how to initiate data pipeline runs either manually or automatically and how to automate tasks based on events in storage accounts using storage event triggers.
 
 More Reading:
 1.[Getting started with Microsoft Fabric](https://docs.microsoft.com/azure/data-factory/introduction)
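On the summary's point about initiating pipeline runs manually or automatically: one hedged illustration of the manual path is calling the Fabric REST API's on-demand job-instances endpoint from a script. The sketch below assumes that endpoint shape, a valid Microsoft Entra access token, and placeholder workspace and pipeline item IDs; it is an illustrative sketch, not the method prescribed by this module.

```python
import requests

# Placeholders -- substitute real IDs and a valid Microsoft Entra access token.
WORKSPACE_ID = "<workspace-guid>"
PIPELINE_ID = "<pipeline-item-guid>"
TOKEN = "<entra-access-token>"

# Assumed endpoint: Fabric's "run on demand item job" call for a pipeline item.
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
# An accepted request typically returns 202; the job-instance status URL,
# if provided, arrives in the Location response header.
print(resp.status_code, resp.headers.get("Location"))
```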