Commit b5edc05

Merge pull request #209863 from jess-hu-340/0831-update-screenshot15
Update screenshot in doc (Blob Storage, SQL Database, Amazon S3)
2 parents 4b4ada9 + 8cfc181

15 files changed (+14, -14 lines)
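The renames in this commit move media filenames from camelCase to kebab-case (e.g. `sourceOptions1.png` becomes `source-options-1.png`). A minimal sketch of that conversion, not a script used by the docs team; note that `partfile2.png` → `part-file-2.png` also splits a word boundary (`part`/`file`) that no automatic rule can recover:

```python
import re

def to_kebab_case(filename: str) -> str:
    """Convert a camelCase media filename to kebab-case,
    separating a trailing digit run into its own segment
    (e.g. sourceOptions1.png -> source-options-1.png)."""
    stem, dot, ext = filename.partition(".")
    # Insert a hyphen before each uppercase letter or digit run
    # that follows a letter, then lowercase the result.
    kebab = re.sub(r"(?<=[a-zA-Z])([A-Z]|\d+)", r"-\1", stem).lower()
    return kebab + dot + ext

print(to_kebab_case("sourceOptions1.png"))  # source-options-1.png
```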

articles/data-factory/connector-amazon-simple-storage-service.md
Lines changed: 3 additions & 3 deletions
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 06/29/2022
+ms.date: 09/01/2022
 ---
 
 # Copy and transform data in Amazon Simple Storage Service using Azure Data Factory or Azure Synapse Analytics
@@ -302,7 +302,7 @@ Format specific settings are located in the documentation for that format. For m
 
 In source transformation, you can read from a container, folder, or individual file in Amazon S3. Use the **Source options** tab to manage how the files are read.
 
-:::image type="content" source="media/data-flow/sourceOptions1.png" alt-text="Screenshot of Source options.":::
+:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Screenshot of Source options.":::
 
 **Wildcard paths:** Using a wildcard pattern will instruct the service to loop through each matching folder and file in a single source transformation. This is an effective way to process multiple files within a single flow. Add multiple wildcard matching patterns with the plus sign that appears when you hover over your existing wildcard pattern.
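The wildcard behavior described in the hunk above (loop through every folder and file matching a pattern in one source transformation) can be illustrated with standard glob-style matching; the object keys and patterns below are hypothetical, not taken from the docs, and `fnmatch` semantics are only an approximation of the service's own matching:

```python
from fnmatch import fnmatch

# Hypothetical S3-style object keys.
keys = [
    "container/2022/01/data_a.csv",
    "container/2022/02/data_b.csv",
    "container/2022/02/notes.txt",
    "container/archive/old.csv",
]

# A wildcard path similar in spirit to what the Source options tab accepts;
# multiple patterns can be combined, each one matched against every key.
patterns = ["container/2022/*/data_*.csv"]

matched = [k for k in keys if any(fnmatch(k, p) for p in patterns)]
print(matched)  # ['container/2022/01/data_a.csv', 'container/2022/02/data_b.csv']
```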
@@ -324,7 +324,7 @@ Wildcard examples:
 
 First, set a wildcard to include all paths that are the partitioned folders plus the leaf files that you want to read.
 
-:::image type="content" source="media/data-flow/partfile2.png" alt-text="Screenshot of partition source file settings.":::
+:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Screenshot of partition source file settings.":::
 
 Use the **Partition root path** setting to define what the top level of the folder structure is. When you view the contents of your data via a data preview, you'll see that the service will add the resolved partitions found in each of your folder levels.
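The **Partition root path** behavior described above (folder levels below the root resolve to partition values shown in data preview) can be sketched as follows; the file path and the Hive-style `key=value` folder layout are assumptions for illustration, not the service's actual implementation:

```python
from pathlib import PurePosixPath

def resolve_partitions(file_path: str, partition_root: str) -> dict:
    """Return the key=value partition folders found between the
    partition root and the leaf file (Hive-style layout assumed)."""
    rel = PurePosixPath(file_path).relative_to(partition_root)
    partitions = {}
    for part in rel.parts[:-1]:  # every folder level below the root
        key, sep, value = part.partition("=")
        if sep:  # only folders shaped like key=value become partitions
            partitions[key] = value
    return partitions

print(resolve_partitions(
    "sales/year=2022/month=09/part-0001.parquet", "sales"))
# {'year': '2022', 'month': '09'}
```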

articles/data-factory/connector-azure-blob-storage.md

Lines changed: 3 additions & 3 deletions
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 08/24/2022
+ms.date: 09/01/2022
 ---
 
 # Copy and transform data in Azure Blob Storage by using Azure Data Factory or Azure Synapse Analytics
@@ -629,7 +629,7 @@ Format specific settings are located in the documentation for that format. For m
 
 In source transformation, you can read from a container, folder, or individual file in Azure Blob Storage. Use the **Source options** tab to manage how the files are read.
 
-:::image type="content" source="media/data-flow/sourceOptions1.png" alt-text="Source options":::
+:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Screenshot of source options tab in mapping data flow source transformation.":::
 
 **Wildcard paths:** Using a wildcard pattern will instruct the service to loop through each matching folder and file in a single source transformation. This is an effective way to process multiple files within a single flow. Add multiple wildcard matching patterns with the plus sign that appears when you hover over your existing wildcard pattern.

@@ -651,7 +651,7 @@ Wildcard examples:
 
 First, set a wildcard to include all paths that are the partitioned folders plus the leaf files that you want to read.
 
-:::image type="content" source="media/data-flow/partfile2.png" alt-text="Partition source file settings":::
+:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Screenshot of partition source file settings in mapping data flow source transformation.":::
 
 Use the **Partition root path** setting to define what the top level of the folder structure is. When you view the contents of your data via a data preview, you'll see that the service will add the resolved partitions found in each of your folder levels.

articles/data-factory/connector-azure-data-lake-storage.md

Lines changed: 3 additions & 3 deletions
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 08/15/2022
+ms.date: 09/01/2022
 ---
 
 # Copy and transform data in Azure Data Lake Storage Gen2 using Azure Data Factory or Azure Synapse Analytics
@@ -543,7 +543,7 @@ Format specific settings are located in the documentation for that format. For m
 
 In the source transformation, you can read from a container, folder, or individual file in Azure Data Lake Storage Gen2. The **Source options** tab lets you manage how the files get read.
 
-:::image type="content" source="media/data-flow/sourceOptions1.png" alt-text="Source options":::
+:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Screenshot of source options tab in mapping data flow source transformation.":::
 
 **Wildcard path:** Using a wildcard pattern will instruct ADF to loop through each matching folder and file in a single Source transformation. This is an effective way to process multiple files within a single flow. Add multiple wildcard matching patterns with the + sign that appears when hovering over your existing wildcard pattern.

@@ -565,7 +565,7 @@ Wildcard examples:
 
 First, set a wildcard to include all paths that are the partitioned folders plus the leaf files that you wish to read.
 
-:::image type="content" source="media/data-flow/partfile2.png" alt-text="Partition source file settings":::
+:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Screenshot of partition source file settings in mapping data flow source transformation.":::
 
 Use the Partition Root Path setting to define what the top level of the folder structure is. When you view the contents of your data via a data preview, you'll see that ADF will add the resolved partitions found in each of your folder levels.

articles/data-factory/connector-azure-data-lake-store.md

Lines changed: 3 additions & 3 deletions
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 07/04/2022
+ms.date: 09/01/2022
 ---
 
 # Copy data to or from Azure Data Lake Storage Gen1 using Azure Data Factory or Azure Synapse Analytics
@@ -439,7 +439,7 @@ Format-specific settings are located in the documentation for that format. For m
 
 In the source transformation, you can read from a container, folder, or individual file in Azure Data Lake Storage Gen1. The **Source options** tab lets you manage how the files get read.
 
-:::image type="content" source="media/data-flow/sourceOptions1.png" alt-text="Source options":::
+:::image type="content" source="media/data-flow/source-options-1.png" alt-text="Screenshot of source options tab in mapping data flow source transformation.":::
 
 **Wildcard path:** Using a wildcard pattern will instruct the service to loop through each matching folder and file in a single Source transformation. This is an effective way to process multiple files within a single flow. Add multiple wildcard matching patterns with the + sign that appears when hovering over your existing wildcard pattern.

@@ -461,7 +461,7 @@ Wildcard examples:
 
 First, set a wildcard to include all paths that are the partitioned folders plus the leaf files that you wish to read.
 
-:::image type="content" source="media/data-flow/partfile2.png" alt-text="Partition source file settings":::
+:::image type="content" source="media/data-flow/part-file-2.png" alt-text="Screenshot of partition source file settings in mapping data flow source transformation.":::
 
 Use the Partition Root Path setting to define what the top level of the folder structure is. When you view the contents of your data via a data preview, you'll see that the service will add the resolved partitions found in each of your folder levels.

articles/data-factory/connector-azure-sql-data-warehouse.md

Lines changed: 1 addition & 1 deletion
@@ -1015,7 +1015,7 @@ By default, a data flow run will fail on the first error it gets. You can choose
 
 **Report success on error:** If enabled, the data flow will be marked as a success even if error rows are found.
 
-:::image type="content" source="media/data-flow/sql-error-row-handling.png" alt-text="Screenshot that shows the error row handling" border="false":::
+:::image type="content" source="media/data-flow/sql-error-row-handling.png" alt-text="Diagram that shows the error row handling in mapping data flow sink transformation.":::
 
 ## Lookup activity properties

articles/data-factory/connector-azure-sql-database.md

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ ms.service: data-factory
 ms.subservice: data-movement
 ms.topic: conceptual
 ms.custom: synapse
-ms.date: 08/10/2022
+ms.date: 09/02/2022
 ---
 
 # Copy and transform data in Azure SQL Database by using Azure Data Factory or Azure Synapse Analytics
Binary screenshot files changed (13.9 KB, 1.02 KB, 28.9 KB; one additional binary file not shown).