
Commit e6802da

Merge pull request #72328 from linda33wj/master

Update ADF content per customer feedback

2 parents b8c2da1 + 87a0f37

File tree

2 files changed (+6 -5 lines)


articles/data-factory/copy-activity-overview.md

Lines changed: 5 additions & 4 deletions
@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na

 ms.topic: conceptual
-ms.date: 02/15/2019
+ms.date: 04/08/2019
 ms.author: jingwang

 ---
@@ -51,14 +51,15 @@ Copy Activity goes through the following stages to copy data from a source to a

 You can use Copy Activity to **copy files as-is** between two file-based data stores, in which case the data is copied efficiently without any serialization/deserialization.

-Copy Activity also supports reading from and writing to files in specified formats: **Text, JSON, Avro, ORC, and Parquet**, and compression codec **GZip, Deflate, BZip2, and ZipDeflate** are supported. See [Supported file and compression formats](supported-file-formats-and-compression-codecs.md) with details.
+Copy Activity also supports reading from and writing to files in specified formats: **Text, JSON, Avro, ORC, and Parquet**, and compressing and decompressing files with the following codecs: **GZip, Deflate, BZip2, and ZipDeflate**. See [Supported file and compression formats](supported-file-formats-and-compression-codecs.md) for details.

 For example, you can do the following copy activities:

-* Copy data in on-premises SQL Server and write to Azure Data Lake Store in ORC format.
+* Copy data from an on-premises SQL Server database and write to Azure Data Lake Storage Gen2 in Parquet format.
 * Copy files in text (CSV) format from on-premises File System and write to Azure Blob in Avro format.
-* Copy zipped files from on-premises File System and decompress then land to Azure Data Lake Store.
+* Copy zipped files from on-premises File System, decompress them, and then land them in Azure Data Lake Storage Gen2.
 * Copy data in GZip compressed text (CSV) format from Azure Blob and write to Azure SQL Database.
+* And many more scenarios that need serialization/deserialization or compression/decompression.

 ## Supported regions
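The GZip-compressed-CSV scenario in the updated list above can be sketched as an ADF v2 dataset definition, where the `compression` property is what makes the copy decompress on the fly. This is a minimal sketch, assuming a blob dataset; the names `GzipCsvBlobDataset`, `AzureStorageLinkedService`, and the folder path are hypothetical placeholders, and exact property names should be checked against the Copy Activity reference.

```json
{
  "name": "GzipCsvBlobDataset",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": {
      "referenceName": "AzureStorageLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "folderPath": "input/csv",
      "format": {
        "type": "TextFormat",
        "columnDelimiter": ","
      },
      "compression": {
        "type": "GZip"
      }
    }
  }
}
```

A Copy activity that uses this dataset as its input reads the GZip files, decompresses them, and parses the CSV rows before writing to the sink, without any extra configuration on the activity itself.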

articles/data-factory/tutorial-bulk-copy-portal.md

Lines changed: 1 addition & 1 deletion
@@ -211,7 +211,7 @@ The **GetTableListAndTriggerCopyData** pipeline performs two steps:
 * Looks up the Azure SQL Database system table to get the list of tables to be copied.
 * Triggers the pipeline **IterateAndCopySQLTables** to do the actual data copy.

-The **GetTableListAndTriggerCopyData** takes a list of tables as a parameter. For each table in the list, it copies data from the table in Azure SQL Database to Azure SQL Data Warehouse using staged copy and PolyBase.
+The **IterateAndCopySQLTables** pipeline takes a list of tables as a parameter. For each table in the list, it copies data from the table in Azure SQL Database to Azure SQL Data Warehouse using staged copy and PolyBase.

 ### Create the pipeline IterateAndCopySQLTables
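The table-list handoff described in the corrected sentence above happens inside **GetTableListAndTriggerCopyData**: a Lookup activity retrieves the table names, and an Execute Pipeline activity passes them to **IterateAndCopySQLTables** as a parameter. The sketch below assumes an activity named `LookupTableList` and a parameter named `tableList` (both hypothetical illustration names); the expression syntax should be verified against the tutorial.

```json
{
  "name": "TriggerCopy",
  "type": "ExecutePipeline",
  "dependsOn": [
    {
      "activity": "LookupTableList",
      "dependencyConditions": [ "Succeeded" ]
    }
  ],
  "typeProperties": {
    "pipeline": {
      "referenceName": "IterateAndCopySQLTables",
      "type": "PipelineReference"
    },
    "parameters": {
      "tableList": "@activity('LookupTableList').output.value"
    },
    "waitOnCompletion": true
  }
}
```

Inside **IterateAndCopySQLTables**, a ForEach activity would then iterate over the `tableList` parameter and run the staged copy for each table.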

0 commit comments
