
Commit ceb94d8

Resolving renamed images
1 parent 6401588 commit ceb94d8

File tree

articles/data-factory/tutorial-data-flow-delta-lake.md
articles/data-factory/tutorial-data-flow-dynamic-columns.md
articles/data-factory/tutorial-data-flow-write-to-lake.md

3 files changed: +6 -11 lines changed

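All three diffs below make the same mechanical change: references to activity1.png and dataflow1.png are updated to the renamed activity.png and dataflow.png, and the stale activity2.png reference (together with the step that used it) is dropped. As a rough illustration only, here is a minimal Python sketch of the string-replacement part of that cleanup, assuming a local checkout laid out like the file tree above; the repo-root default and the function name are hypothetical, and the manual removal of the activity2.png step is not covered.

```python
from pathlib import Path

# Old reference -> new reference, exactly as shown in the three diffs.
RENAMES = {
    "media/tutorial-data-flow/activity1.png": "media/tutorial-data-flow/activity.png",
    "media/tutorial-data-flow/dataflow1.png": "media/tutorial-data-flow/dataflow.png",
}

# The three articles touched by this commit.
ARTICLES = [
    "articles/data-factory/tutorial-data-flow-delta-lake.md",
    "articles/data-factory/tutorial-data-flow-dynamic-columns.md",
    "articles/data-factory/tutorial-data-flow-write-to-lake.md",
]


def apply_renames(repo_root: str = ".") -> None:
    """Rewrite renamed image references in place (repo_root is an assumed checkout path)."""
    for rel in ARTICLES:
        article = Path(repo_root, rel)
        text = article.read_text(encoding="utf-8")
        for old, new in RENAMES.items():
            text = text.replace(old, new)
        article.write_text(text, encoding="utf-8")
        print(f"updated {rel}")


if __name__ == "__main__":
    apply_renames()
```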

articles/data-factory/tutorial-data-flow-delta-lake.md

Lines changed: 2 additions & 4 deletions
@@ -54,13 +54,11 @@ In this step, you create a pipeline that contains a data flow activity.
 1. In the **General** tab for the pipeline, enter **DeltaLake** for **Name** of the pipeline.
 1. In the **Activities** pane, expand the **Move and Transform** accordion. Drag and drop the **Data Flow** activity from the pane to the pipeline canvas.

-:::image type="content" source="media/tutorial-data-flow/activity1.png" alt-text="Screenshot that shows the pipeline canvas where you can drop the Data Flow activity.":::
-1. In the **Adding Data Flow** pop-up, select **Create new Data Flow** and then name your data flow **DeltaLake**. Select Finish when done.
+:::image type="content" source="media/tutorial-data-flow/activity.png" alt-text="Screenshot that shows the pipeline canvas where you can drop the Data Flow activity.":::

-:::image type="content" source="media/tutorial-data-flow/activity2.png" alt-text="Screenshot that shows where you name your data flow when you create a new data flow.":::
 1. In the top bar of the pipeline canvas, slide the **Data Flow debug** slider on. Debug mode allows for interactive testing of transformation logic against a live Spark cluster. Data Flow clusters take 5-7 minutes to warm up and users are recommended to turn on debug first if they plan to do Data Flow development. For more information, see [Debug Mode](concepts-data-flow-debug-mode.md).

-:::image type="content" source="media/tutorial-data-flow/dataflow1.png" alt-text="Screenshot that shows where is the Data flow debug slider.":::
+:::image type="content" source="media/tutorial-data-flow/dataflow.png" alt-text="Screenshot that shows where is the Data flow debug slider.":::

 ## Build transformation logic in the data flow canvas
articles/data-factory/tutorial-data-flow-dynamic-columns.md

Lines changed: 2 additions & 2 deletions
@@ -45,10 +45,10 @@ In this step, you'll create a pipeline that contains a data flow activity.
 1. In the **General** tab for the pipeline, enter **DeltaLake** for **Name** of the pipeline.
 1. In the factory top bar, slide the **Data Flow debug** slider on. Debug mode allows for interactive testing of transformation logic against a live Spark cluster. Data Flow clusters take 5-7 minutes to warm up and users are recommended to turn on debug first if they plan to do Data Flow development. For more information, see [Debug Mode](concepts-data-flow-debug-mode.md).

-:::image type="content" source="media/tutorial-data-flow/dataflow1.png" alt-text="Data Flow Activity":::
+:::image type="content" source="media/tutorial-data-flow/dataflow.png" alt-text="Data Flow Activity":::
 1. In the **Activities** pane, expand the **Move and Transform** accordion. Drag and drop the **Data Flow** activity from the pane to the pipeline canvas.

-:::image type="content" source="media/tutorial-data-flow/activity1.png" alt-text="Screenshot that shows the pipeline canvas where you can drop the Data Flow activity.":::
+:::image type="content" source="media/tutorial-data-flow/activity.png" alt-text="Screenshot that shows the pipeline canvas where you can drop the Data Flow activity.":::
 1. In the **Adding Data Flow** pop-up, select **Create new Data Flow** and then name your data flow **DynaCols**. Select Finish when done.

 ## Build dynamic column mapping in data flows

articles/data-factory/tutorial-data-flow-write-to-lake.md

Lines changed: 2 additions & 5 deletions
@@ -53,13 +53,10 @@ In this step, you'll create a pipeline that contains a data flow activity.
 1. In the **General** tab for the pipeline, enter **DeltaLake** for **Name** of the pipeline.
 1. In the factory top bar, slide the **Data Flow debug** slider on. Debug mode allows for interactive testing of transformation logic against a live Spark cluster. Data Flow clusters take 5-7 minutes to warm up and users are recommended to turn on debug first if they plan to do Data Flow development. For more information, see [Debug Mode](concepts-data-flow-debug-mode.md).

-:::image type="content" source="media/tutorial-data-flow/dataflow1.png" alt-text="Data Flow Activity":::
+:::image type="content" source="media/tutorial-data-flow/dataflow.png" alt-text="Data Flow Activity":::
 1. In the **Activities** pane, expand the **Move and Transform** accordion. Drag and drop the **Data Flow** activity from the pane to the pipeline canvas.

-:::image type="content" source="media/tutorial-data-flow/activity1.png" alt-text="Screenshot that shows the pipeline canvas where you can drop the Data Flow activity.":::
-1. In the **Adding Data Flow** pop-up, select **Create new Data Flow** and then name your data flow **DeltaLake**. Click Finish when done.
-
-:::image type="content" source="media/tutorial-data-flow/activity2.png" alt-text="Screenshot that shows where you name your data flow when you create a new data flow.":::
+:::image type="content" source="media/tutorial-data-flow/activity.png" alt-text="Screenshot that shows the pipeline canvas where you can drop the Data Flow activity.":::

 ## Build transformation logic in the data flow canvas

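Since the point of the commit is to resolve references to renamed images, a natural follow-up check is that none of the articles still points at a media file that no longer exists. Below is a minimal verification sketch, again assuming a local checkout with the layout shown in the file tree; it keys off the :::image ... source="..."::: directive form that appears in the diffs, and the glob pattern and regex are illustrative rather than part of the commit.

```python
import re
from pathlib import Path

# Pulls the source="..." attribute out of :::image ...::: directives,
# matching the form used in the articles above.
IMAGE_REF = re.compile(r':::image\s[^:]*?source="([^"]+)"')


def stale_image_refs(article: Path) -> list[str]:
    """Return referenced image paths that do not exist relative to the article."""
    text = article.read_text(encoding="utf-8")
    return [
        ref
        for ref in IMAGE_REF.findall(text)
        if not (article.parent / ref).exists()
    ]


if __name__ == "__main__":
    docs_dir = Path("articles/data-factory")  # assumed local checkout layout
    for md in sorted(docs_dir.glob("tutorial-data-flow-*.md")):
        for ref in stale_image_refs(md):
            print(f"{md}: missing {ref}")
```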