`learn-pr/wwl/orchestrate-processes-in-fabric/includes/2-choose-between-pipeline-notebook.md`

### Real-World Use Cases for Dataflows

**Data Consolidation for Reporting**:
Organizations often have data spread across multiple sources such as databases, cloud storage, and on-premises systems. Dataflows can consolidate this data into a single, unified dataset, which can then be used for reporting and analytics. For example, a company might use Dataflows to combine sales data from different regions into a single dataset for a comprehensive sales report. That unified dataset can be further curated and promoted into a semantic model for use by a larger audience.
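
To make the scenario concrete, here's a minimal Power Query M sketch of the consolidation step (Dataflows are authored in Power Query). The regional queries, inline table literals, and column names are illustrative assumptions; in a real dataflow, each regional table would be its own query connected to its own source.

```powerquery-m
let
    // Hypothetical regional queries; a real dataflow would connect these
    // to separate data sources instead of using inline table literals
    SalesEast = #table(
        {"Region", "OrderID", "Amount"},
        {{"East", 1001, 250.00}, {"East", 1002, 75.50}}
    ),
    SalesWest = #table(
        {"Region", "OrderID", "Amount"},
        {{"West", 2001, 310.25}}
    ),
    // Append the regional tables into one unified dataset
    AllSales = Table.Combine({SalesEast, SalesWest}),
    // Set explicit column types so downstream reports behave consistently
    Typed = Table.TransformColumnTypes(
        AllSales,
        {{"OrderID", Int64.Type}, {"Amount", type number}}
    )
in
    Typed
```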

**Data Preparation for Machine Learning**:
Dataflows can be used to prepare and clean data for machine learning models. This preparation includes tasks such as data cleansing, transformation, and feature engineering. For instance, a data science team might use Dataflows to preprocess customer data, removing duplicates and normalizing values before feeding it into a machine learning model.
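
A minimal M sketch of that kind of preprocessing, assuming a hypothetical customer table with an AnnualSpend column, might remove duplicate rows and min-max normalize the numeric column:

```powerquery-m
let
    // Hypothetical customer data; the table and column names are assumptions
    Customers = #table(
        {"CustomerID", "AnnualSpend"},
        {{1, 100.0}, {1, 100.0}, {2, 400.0}, {3, 250.0}}
    ),
    // Drop exact duplicate rows
    Deduped = Table.Distinct(Customers),
    // Min-max normalize AnnualSpend into the range [0, 1]
    MinSpend = List.Min(Deduped[AnnualSpend]),
    MaxSpend = List.Max(Deduped[AnnualSpend]),
    Normalized = Table.TransformColumns(
        Deduped,
        {{"AnnualSpend", each (_ - MinSpend) / (MaxSpend - MinSpend), type number}}
    )
in
    Normalized
```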

**Real-Time Data Processing**:
Dataflows can handle real-time data ingestion and transformation, making them ideal for scenarios where timely data processing is crucial. For example, an e-commerce platform might use Dataflows to process incoming transaction data, updating inventory levels and generating up-to-date sales reports.

**Data Migration**:
When migrating data from legacy systems to modern platforms, Dataflows can be used to extract, transform, and load (ETL) data into the new system. This process ensures that data is transferred accurately and efficiently, minimizing downtime and data loss. For instance, a company migrating from an on-premises database to Azure SQL Database might use Dataflows to handle the migration.
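
As an illustrative sketch (the legacy table and target column names are assumptions), a migration query might rename and retype legacy columns before the dataflow's output destination is pointed at the new system:

```powerquery-m
let
    // Hypothetical legacy table; in practice this would come from a
    // connection to the on-premises source
    Legacy = #table(
        {"cust_id", "cust_name", "created"},
        {{"7", "Contoso", "2019-03-01"}}
    ),
    // Rename legacy columns to match the target schema
    Renamed = Table.RenameColumns(
        Legacy,
        {{"cust_id", "CustomerID"}, {"cust_name", "CustomerName"}, {"created", "CreatedDate"}}
    ),
    // Convert text columns to the types the destination expects
    Typed = Table.TransformColumnTypes(
        Renamed,
        {{"CustomerID", Int64.Type}, {"CreatedDate", type date}}
    )
in
    Typed
```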

**Self-Service Data Preparation**:
Dataflows provide a low-code interface that allows business users to prepare their own data without needing extensive technical knowledge. This approach empowers users to create their own dataflows for tasks such as data cleansing, transformation, and enrichment, reducing the dependency on IT teams. For example, a marketing team might use Dataflows to prepare campaign data for analysis.

These use cases demonstrate the flexibility of Dataflows in handling various data integration and transformation tasks, and they highlight Dataflows as a powerful self-service feature. Self-service might be more appealing to your organization's business users while still providing a roadmap to a larger ELT project that utilizes pipelines and notebooks.