Commit 72ccb67

added colons back to text
1 parent d17dd2a commit 72ccb67

File tree

1 file changed: +5 −5 lines changed


learn-pr/wwl/orchestrate-processes-in-fabric/includes/2-choose-between-pipeline-notebook.md

Lines changed: 5 additions & 5 deletions
@@ -20,20 +20,20 @@ Dataflows are built using the Power Query experience, available across Microsoft

 ### Real-World Use Cases for Dataflows

-**Data Consolidation for Reporting**
+**Data Consolidation for Reporting**:

 Organizations often have data spread across multiple sources such as databases, cloud storage, and on-premises systems. Dataflows can be used to consolidate this data into a single, unified dataset, which can then be used for reporting and analytics. For example, a company might use Dataflows to combine sales data from different regions into a single dataset for a comprehensive sales report. This single dataset can be further curated and promoted into a semantic model for use by a larger audience.

-**Data Preparation for Machine Learning**
+**Data Preparation for Machine Learning**:

 Dataflows can be used to prepare and clean data for machine learning models. This preparation includes tasks such as data cleansing, transformation, and feature engineering. For instance, a data science team might use Dataflows to preprocess customer data, removing duplicates and normalizing values before feeding it into a machine learning model.

-**Real-Time Data Processing**
+**Real-Time Data Processing**:

 Dataflows can handle real-time data ingestion and transformation, making them ideal for scenarios where timely data processing is crucial. For example, an e-commerce platform might use Dataflows to process real-time transaction data, updating inventory levels and generating real-time sales reports.

-**Data Migration**
+**Data Migration**:

 When migrating data from legacy systems to modern platforms, Dataflows can be used to extract, transform, and load (ETL) data into the new system. This process ensures that data is accurately and efficiently transferred, minimizing downtime and data loss. For instance, a company migrating from an on-premises database to Azure SQL Database might use Dataflows to handle the data migration process.

-**Self-Service Data Preparation**
+**Self-Service Data Preparation**:

 Dataflows provide a low-code interface that allows business users to prepare their own data without needing extensive technical knowledge. This approach empowers users to create their own dataflows for tasks such as data cleansing, transformation, and enrichment, reducing the dependency on IT teams. For example, a marketing team might use Dataflows to prepare campaign data for analysis.

 These use cases demonstrate the flexibility and power of Dataflows in handling various data integration and transformation tasks, and they showcase a powerful self-service feature. Self-service might be more appealing to your organization's business users while still providing a roadmap to a larger ELT project that utilizes pipelines and notebooks.

0 commit comments

Comments
 (0)