
Commit 8cfc181

Resolved blocking issues
1 parent b6d5c42

4 files changed: +7 -7 lines changed

articles/data-factory/connector-azure-blob-storage.md

Lines changed: 2 additions & 2 deletions
@@ -629,7 +629,7 @@ Format specific settings are located in the documentation for that format. For m

In source transformation, you can read from a container, folder, or individual file in Azure Blob Storage. Use the **Source options** tab to manage how the files are read.

-:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Source options":::
+:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Screenshot of source options tab in mapping data flow source transformation.":::

**Wildcard paths:** Using a wildcard pattern will instruct the service to loop through each matching folder and file in a single source transformation. This is an effective way to process multiple files within a single flow. Add multiple wildcard matching patterns with the plus sign that appears when you hover over your existing wildcard pattern.

@@ -651,7 +651,7 @@ Wildcard examples:

First, set a wildcard to include all paths that are the partitioned folders plus the leaf files that you want to read.

-:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Partition source file settings":::
+:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Screenshot of partition source file settings in mapping data flow source transformation.":::

Use the **Partition root path** setting to define what the top level of the folder structure is. When you view the contents of your data via a data preview, you'll see that the service will add the resolved partitions found in each of your folder levels.
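The wildcard and partition root path behavior described in the context lines above can be pictured with a short sketch. The folder layout, the `sales` partition root, and the `resolved_partitions` helper below are hypothetical; this is an illustration of the concept rather than the service's implementation. The idea: a wildcard such as `sales/**/*.csv` matches the partitioned leaf files, and the key=value folder levels under the partition root come back as extra columns.

```python
# Illustration only: assumed folder layout, not the connector's implementation.
from pathlib import PurePosixPath

# Leaf files a wildcard like sales/**/*.csv might match.
matched_files = [
    "sales/year=2020/month=01/part-0001.csv",
    "sales/year=2020/month=02/part-0002.csv",
    "sales/year=2021/month=01/part-0003.csv",
]

PARTITION_ROOT = "sales"  # what the Partition root path setting would point at

def resolved_partitions(path: str, root: str) -> dict:
    """Collect the key=value folder segments between the partition root and the leaf file."""
    relative = PurePosixPath(path).relative_to(root)
    partitions = {}
    for segment in relative.parts[:-1]:      # skip the leaf file name
        key, _, value = segment.partition("=")
        partitions[key] = value
    return partitions

for path in matched_files:
    print(path, resolved_partitions(path, PARTITION_ROOT))
# sales/year=2020/month=01/part-0001.csv {'year': '2020', 'month': '01'}
# sales/year=2020/month=02/part-0002.csv {'year': '2020', 'month': '02'}
# sales/year=2021/month=01/part-0003.csv {'year': '2021', 'month': '01'}
```

The two Data Lake Storage articles changed below describe the same wildcard and partition root path behavior, so the same sketch applies to them.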

articles/data-factory/connector-azure-data-lake-storage.md

Lines changed: 2 additions & 2 deletions
@@ -543,7 +543,7 @@ Format specific settings are located in the documentation for that format. For m

In the source transformation, you can read from a container, folder, or individual file in Azure Data Lake Storage Gen2. The **Source options** tab lets you manage how the files get read.

-:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Source options":::
+:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Screenshot of source options tab in mapping data flow source transformation.":::

**Wildcard path:** Using a wildcard pattern will instruct ADF to loop through each matching folder and file in a single Source transformation. This is an effective way to process multiple files within a single flow. Add multiple wildcard matching patterns with the + sign that appears when hovering over your existing wildcard pattern.

@@ -565,7 +565,7 @@ Wildcard examples:

First, set a wildcard to include all paths that are the partitioned folders plus the leaf files that you wish to read.

-:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Partition source file settings":::
+:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Screenshot of partition source file settings in mapping data flow source transformation.":::

Use the Partition Root Path setting to define what the top level of the folder structure is. When you view the contents of your data via a data preview, you'll see that ADF will add the resolved partitions found in each of your folder levels.

articles/data-factory/connector-azure-data-lake-store.md

Lines changed: 2 additions & 2 deletions
@@ -439,7 +439,7 @@ Format-specific settings are located in the documentation for that format. For m

In the source transformation, you can read from a container, folder, or individual file in Azure Data Lake Storage Gen1. The **Source options** tab lets you manage how the files get read.

-:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Source options":::
+:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Screenshot of source options tab in mapping data flow source transformation.":::

**Wildcard path:** Using a wildcard pattern will instruct the service to loop through each matching folder and file in a single Source transformation. This is an effective way to process multiple files within a single flow. Add multiple wildcard matching patterns with the + sign that appears when hovering over your existing wildcard pattern.

@@ -461,7 +461,7 @@ Wildcard examples:

First, set a wildcard to include all paths that are the partitioned folders plus the leaf files that you wish to read.

-:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Partition source file settings":::
+:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Screenshot of partition source file settings in mapping data flow source transformation.":::

Use the Partition Root Path setting to define what the top level of the folder structure is. When you view the contents of your data via a data preview, you'll see that the service will add the resolved partitions found in each of your folder levels.

articles/data-factory/connector-azure-sql-data-warehouse.md

Lines changed: 1 addition & 1 deletion
@@ -1015,7 +1015,7 @@ By default, a data flow run will fail on the first error it gets. You can choose

**Report success on error:** If enabled, the data flow will be marked as a success even if error rows are found.

-:::image type="content" source="media/data-flow/sql-error-row-handling.png" alt-text="Screenshot that shows the error row handling":::
+:::image type="content" source="media/data-flow/sql-error-row-handling.png" alt-text="Diagram that shows the error row handling in mapping data flow sink transformation.":::

## Lookup activity properties
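The error row handling options referred to by the changed alt text above can be summarized with a minimal sketch. The flag names `continue_on_error` and `report_success_on_error` are assumptions that only mirror the behavior described in the context lines ("fail on the first error" versus "marked as a success even if error rows are found"); they are not the connector's actual configuration schema.

```python
# Minimal sketch of the error row handling semantics described above;
# assumed flag names, not the connector's actual settings.
from typing import Callable, Iterable

def run_sink(rows: Iterable[dict],
             write_row: Callable[[dict], None],
             continue_on_error: bool = False,
             report_success_on_error: bool = False) -> str:
    """Write rows, optionally continuing past failures, and report a run status."""
    error_rows = []
    for row in rows:
        try:
            write_row(row)
        except Exception:
            if not continue_on_error:
                return "Failed"        # default: the run fails on the first error
            error_rows.append(row)     # collect the error row and keep writing
    if error_rows and not report_success_on_error:
        return "Failed"
    return "Succeeded"                 # no errors, or "report success on error" is set

# Usage: the second row fails, but the run still reports success.
def write_row(row: dict) -> None:
    if row.get("bad"):
        raise ValueError("constraint violation")

print(run_sink([{"id": 1}, {"id": 2, "bad": True}, {"id": 3}],
               write_row, continue_on_error=True, report_success_on_error=True))
# Succeeded
```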
