Commit d783dd0

Merge pull request #57480 from linda33wj/master
Update ADF copy content
2 parents: 92e3d4c + c5a1ea6

5 files changed (+10 additions, -13 deletions)


articles/data-factory/connector-amazon-simple-storage-service.md

Lines changed: 2 additions & 2 deletions
@@ -9,7 +9,7 @@ ms.reviewer: douglasl
 ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
-ms.date: 09/13/2018
+ms.date: 11/08/2018
 ms.author: jingwang

 ---
@@ -90,7 +90,7 @@ To copy data from Amazon S3, set the type property of the dataset to **AmazonS3O
 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The type property of the dataset must be set to: **AmazonS3Object** |Yes |
-| bucketName | The S3 bucket name. Wildcard filter is not supported. |Yes |
+| bucketName | The S3 bucket name. Wildcard filter is not supported. |Yes for Copy activity, No for Lookup/GetMetadata activity |
 | key | The **name or wildcard filter** of S3 object key under the specified bucket. Applies only when "prefix" property is not specified. <br/><br/>The wildcard filter is only supported for file name part but not folder part. Allowed wildcards are: `*` (matches zero or more characters) and `?` (matches zero or single character).<br/>- Example 1: `"key": "rootfolder/subfolder/*.csv"`<br/>- Example 2: `"key": "rootfolder/subfolder/???20180427.txt"`<br/>Use `^` to escape if your actual file name has wildcard or this escape char inside. |No |
 | prefix | Prefix for the S3 object key. Objects whose keys start with this prefix are selected. Applies only when "key" property is not specified. |No |
 | version | The version of the S3 object, if S3 versioning is enabled. |No |
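For reference, a minimal sketch of an AmazonS3Object dataset that combines bucketName with the wildcard key filter from Example 1 in the table above. The dataset name and linked service reference are placeholders, not part of the changed article:

```json
{
    "name": "AmazonS3Dataset",
    "properties": {
        "type": "AmazonS3Object",
        "linkedServiceName": {
            "referenceName": "<Amazon S3 linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "bucketName": "testbucket",
            "key": "rootfolder/subfolder/*.csv"
        }
    }
}
```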

articles/data-factory/connector-azure-blob-storage.md

Lines changed: 2 additions & 2 deletions
@@ -8,7 +8,7 @@ ms.reviewer: douglasl
 ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
-ms.date: 10/31/2018
+ms.date: 11/08/2018
 ms.author: jingwang

 ---
@@ -243,7 +243,7 @@ To copy data to and from Blob storage, set the type property of the dataset to *
 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The type property of the dataset must be set to **AzureBlob**. |Yes |
-| folderPath | Path to the container and folder in the blob storage. Wildcard filter is not supported. An example is myblobcontainer/myblobfolder/. |Yes |
+| folderPath | Path to the container and folder in the blob storage. Wildcard filter is not supported. An example is myblobcontainer/myblobfolder/. |Yes for Copy activity, No for Lookup/GetMetadata activity |
 | fileName | **Name or wildcard filter** for the blob(s) under the specified "folderPath". If you don't specify a value for this property, the dataset points to all blobs in the folder. <br/><br/>For filter, allowed wildcards are: `*` (matches zero or more characters) and `?` (matches zero or single character).<br/>- Example 1: `"fileName": "*.csv"`<br/>- Example 2: `"fileName": "???20180427.txt"`<br/>Use `^` to escape if your actual file name has wildcard or this escape char inside.<br/><br/>When fileName isn't specified for an output dataset and **preserveHierarchy** isn't specified in the activity sink, the copy activity automatically generates the blob name with the following pattern: "*Data.[activity run id GUID].[GUID if FlattenHierarchy].[format if configured].[compression if configured]*". An example is "Data.0a405f8a-93ff-4c6f-b3be-f69616f1df7a.txt.gz". |No |
 | format | If you want to copy files as is between file-based stores (binary copy), skip the format section in both the input and output dataset definitions.<br/><br/>If you want to parse or generate files with a specific format, the following file format types are supported: **TextFormat**, **JsonFormat**, **AvroFormat**, **OrcFormat**, and **ParquetFormat**. Set the **type** property under **format** to one of these values. For more information, see the [Text format](supported-file-formats-and-compression-codecs.md#text-format), [JSON format](supported-file-formats-and-compression-codecs.md#json-format), [Avro format](supported-file-formats-and-compression-codecs.md#avro-format), [Orc format](supported-file-formats-and-compression-codecs.md#orc-format), and [Parquet format](supported-file-formats-and-compression-codecs.md#parquet-format) sections. |No (only for binary copy scenario) |
 | compression | Specify the type and level of compression for the data. For more information, see [Supported file formats and compression codecs](supported-file-formats-and-compression-codecs.md#compression-support).<br/>Supported types are **GZip**, **Deflate**, **BZip2**, and **ZipDeflate**.<br/>Supported levels are **Optimal** and **Fastest**. |No |
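Likewise, a minimal sketch of an AzureBlob dataset that uses the folderPath example from the table above together with the wildcard fileName from Example 1; the dataset name and linked service reference are placeholders:

```json
{
    "name": "AzureBlobDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "<Azure Blob storage linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "myblobcontainer/myblobfolder/",
            "fileName": "*.csv"
        }
    }
}
```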

articles/data-factory/connector-azure-sql-data-warehouse.md

Lines changed: 2 additions & 3 deletions
@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na
 ms.devlang: na
 ms.topic: conceptual
-ms.date: 07/28/2018
+ms.date: 11/08/2018
 ms.author: jingwang

 ---
@@ -213,7 +213,7 @@ To copy data from or to Azure SQL Data Warehouse, set the **type** property of t
 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The **type** property of the dataset must be set to **AzureSqlDWTable**. | Yes |
-| tableName | The name of the table or view in the Azure SQL Data Warehouse instance that the linked service refers to. | Yes |
+| tableName | The name of the table or view in the Azure SQL Data Warehouse instance that the linked service refers to. | No for source, Yes for sink |

 #### Dataset properties example

@@ -253,7 +253,6 @@ To copy data from Azure SQL Data Warehouse, set the **type** property in the Cop

 - If the **sqlReaderQuery** is specified for the **SqlSource**, the Copy Activity runs this query against the Azure SQL Data Warehouse source to get the data. Or you can specify a stored procedure. Specify the **sqlReaderStoredProcedureName** and **storedProcedureParameters** if the stored procedure takes parameters.
 - If you don't specify either **sqlReaderQuery** or **sqlReaderStoredProcedureName**, the columns defined in the **structure** section of the dataset JSON are used to construct a query. `select column1, column2 from mytable` runs against Azure SQL Data Warehouse. If the dataset definition doesn't have the **structure**, all columns are selected from the table.
-- When you use **sqlReaderStoredProcedureName**, you still need to specify a dummy **tableName** property in the dataset JSON.

 #### SQL query example
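For context, a minimal sketch of an AzureSqlDWTable dataset matching the updated table above; per the new Required column, the tableName line can be omitted when the dataset is used only as a copy source that supplies its own query or stored procedure. The dataset, linked service, and table names are placeholders:

```json
{
    "name": "AzureSqlDWDataset",
    "properties": {
        "type": "AzureSqlDWTable",
        "linkedServiceName": {
            "referenceName": "<Azure SQL Data Warehouse linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "tableName": "MyTable"
        }
    }
}
```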

articles/data-factory/connector-azure-sql-database.md

Lines changed: 2 additions & 3 deletions
@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na
 ms.devlang: na
 ms.topic: conceptual
-ms.date: 09/12/2018
+ms.date: 11/08/2018
 ms.author: jingwang

 ---
@@ -204,7 +204,7 @@ To copy data from or to Azure SQL Database, set the **type** property of the dat
 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The **type** property of the dataset must be set to **AzureSqlTable**. | Yes |
-| tableName | The name of the table or view in the Azure SQL Database instance that the linked service refers to. | Yes |
+| tableName | The name of the table or view in the Azure SQL Database instance that the linked service refers to. | No for source, Yes for sink |

 #### Dataset properties example

@@ -244,7 +244,6 @@ To copy data from Azure SQL Database, set the **type** property in the Copy Acti

 - If the **sqlReaderQuery** is specified for the **SqlSource**, Copy Activity runs this query against the Azure SQL Database source to get the data. Or you can specify a stored procedure. Specify **sqlReaderStoredProcedureName** and **storedProcedureParameters** if the stored procedure takes parameters.
 - If you don't specify either **sqlReaderQuery** or **sqlReaderStoredProcedureName**, the columns defined in the **structure** section of the dataset JSON are used to construct a query. `select column1, column2 from mytable` runs against Azure SQL Database. If the dataset definition doesn't have the **structure**, all columns are selected from the table.
-- When you use **sqlReaderStoredProcedureName**, you still need to specify a dummy **tableName** property in the dataset JSON.

 #### SQL query example
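To illustrate the stored-procedure path described in the first bullet above (the one the removed dummy-tableName caveat applied to), here is a rough sketch of a Copy activity whose SqlSource reads from Azure SQL Database via a stored procedure. The activity name, dataset references, stored procedure name, parameter, and the BlobSink destination are invented for illustration:

```json
{
    "name": "CopyFromAzureSqlDatabase",
    "type": "Copy",
    "inputs": [ { "referenceName": "<Azure SQL Database input dataset>", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "<output dataset>", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "SqlSource",
            "sqlReaderStoredProcedureName": "usp_GetChangedRows",
            "storedProcedureParameters": {
                "changedSince": { "value": "2018-11-08" }
            }
        },
        "sink": {
            "type": "BlobSink"
        }
    }
}
```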

articles/data-factory/connector-sql-server.md

Lines changed: 2 additions & 3 deletions
@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na
 ms.devlang: na
 ms.topic: conceptual
-ms.date: 09/12/2018
+ms.date: 11/08/2018
 ms.author: jingwang

 ---
@@ -115,7 +115,7 @@ To copy data from/to SQL Server database, set the type property of the dataset t
 | Property | Description | Required |
 |:--- |:--- |:--- |
 | type | The type property of the dataset must be set to: **SqlServerTable** | Yes |
-| tableName |Name of the table or view in the SQL Server database instance that linked service refers to. | Yes |
+| tableName |Name of the table or view in the SQL Server database instance that linked service refers to. | No for source, Yes for sink |

 **Example:**

@@ -155,7 +155,6 @@ To copy data from SQL Server, set the source type in the copy activity to **SqlS

 - If the **sqlReaderQuery** is specified for the SqlSource, the Copy Activity runs this query against the SQL Server source to get the data. Alternatively, you can specify a stored procedure by specifying the **sqlReaderStoredProcedureName** and **storedProcedureParameters** (if the stored procedure takes parameters).
 - If you do not specify either "sqlReaderQuery" or "sqlReaderStoredProcedureName", the columns defined in the "structure" section of the dataset JSON are used to construct a query (`select column1, column2 from mytable`) to run against the SQL Server. If the dataset definition does not have the "structure", all columns are selected from the table.
-- When you use **sqlReaderStoredProcedureName**, you still need to specify a dummy **tableName** property in the dataset JSON.

 **Example: using SQL query**
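Since the updated tables mark tableName as required only for sinks, here is a rough sketch of a Copy activity writing into SQL Server through a SqlSink; the referenced output dataset is where tableName still has to be set. The activity name, dataset references, BlobSource input, and writeBatchSize value are placeholders chosen for illustration:

```json
{
    "name": "CopyToSqlServer",
    "type": "Copy",
    "inputs": [ { "referenceName": "<source dataset>", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "<SqlServerTable dataset with tableName set>", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "BlobSource" },
        "sink": {
            "type": "SqlSink",
            "writeBatchSize": 10000
        }
    }
}
```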

0 commit comments
