
Commit e725d63

Merge pull request #48374 from linda33wj/master
Add tips for ADF connectors
2 parents 710e8fa + 45a86b5 commit e725d63

File tree

3 files changed: +14 -9 lines changed

articles/data-factory/connector-azure-cosmos-db.md

Lines changed: 3 additions & 3 deletions
@@ -31,9 +31,9 @@ You can copy data from Azure Cosmos DB to any supported sink data store, or copy
 Specifically, this Azure Cosmos DB connector supports:
 
 - Cosmos DB [SQL API](https://docs.microsoft.com/azure/cosmos-db/documentdb-introduction).
-- Importing/exporting JSON documents as-is, or copying data from/to tabular dataset e.g. SQL database, CSV files, etc.
+- Importing/exporting JSON documents as-is, or copying data from/to a tabular dataset such as a SQL database or CSV files. To copy documents as-is to/from JSON files or another Cosmos DB collection, see [Import/Export JSON documents](#importexport-json-documents).
 
-To copy documents as-is to/from JSON files or another Cosmos DB collection, see [Import/Export JSON documents](#importexport-json-documents).
+Data Factory integrates with the [Cosmos DB bulk executor library](https://github.com/Azure/azure-cosmosdb-bulkexecutor-dotnet-getting-started) to provide the best performance when writing into Cosmos DB.
 
 ## Getting started

@@ -162,7 +162,7 @@ To copy data to Azure Cosmos DB, set the sink type in the copy activity to **Doc
 |:--- |:--- |:--- |
 | type | The type property of the copy activity sink must be set to: **DocumentDbCollectionSink** |Yes |
 | writeBehavior | Describes how to write data into Cosmos DB. Allowed values are: `insert` and `upsert`.<br/>The behavior of **upsert** is to replace the document if a document with the same id already exists; otherwise, insert it. Note that ADF automatically generates an id for the document if one is not specified in the original document or by column mapping, so make sure your document has an "id" for upsert to work as expected. |No, default is insert |
-| writeBatchSize | Data Factory use [Cosmos DB bulk executor](https://github.com/Azure/azure-cosmosdb-bulkexecutor-dotnet-getting-started) to write data into Cosmos DB. "writeBatchSize" controls the size of documents we provide to the library each time. You can try increase writeBatchSize to improve performance. |No |
+| writeBatchSize | Data Factory uses the [Cosmos DB bulk executor library](https://github.com/Azure/azure-cosmosdb-bulkexecutor-dotnet-getting-started) to write data into Cosmos DB. "writeBatchSize" controls the number of documents provided to the library in each batch. You can try increasing writeBatchSize to improve performance. |No, default is 10,000 |
 | nestingSeparator | A special character in the source column name to indicate that a nested document is needed.<br/><br/>For example, `Name.First` in the output dataset structure generates the following JSON structure in the Cosmos DB document: `"Name": {"First": "[value maps to this column from source]"}` when the nestingSeparator is dot. |No (default is dot `.`) |
 
 **Example:**
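
As a sketch of how these sink properties fit together in a copy activity (the activity name, source type, and values below are illustrative, not part of this commit):

```json
{
    "name": "CopyToCosmosDB",
    "type": "Copy",
    "typeProperties": {
        "source": { "type": "BlobSource" },
        "sink": {
            "type": "DocumentDbCollectionSink",
            "writeBehavior": "upsert",
            "writeBatchSize": 10000,
            "nestingSeparator": "."
        }
    }
}
```

With `upsert`, make sure an "id" is present in the source documents or mapped from a source column, so reruns replace existing documents instead of creating duplicates.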

articles/data-factory/connector-azure-data-lake-storage.md

Lines changed: 5 additions & 2 deletions
@@ -9,7 +9,7 @@ ms.reviewer: douglasl
 ms.service: data-factory
 ms.workload: data-services
 ms.topic: conceptual
-ms.date: 06/26/2018
+ms.date: 08/07/2018
 ms.author: jingwang
 
 ---
@@ -26,7 +26,10 @@ You can copy data from any supported source data store to Data Lake Storage Gen2
 Specifically, this connector supports:
 
 - Copying data by using account key.
-- Copying files as is or parsing or generating files with [supported file formats and compression codecs](supported-file-formats-and-compression-codecs.md).
+- Copying files as-is, or parsing or generating files with [supported file formats and compression codecs](supported-file-formats-and-compression-codecs.md).
+
+>[!TIP]
+>If you enable the hierarchical namespace, there is currently no interoperability of operations between the Blob and ADLS Gen2 APIs. If you hit the error "ErrorCode=FilesystemNotFound" with the detailed message "The specified filesystem does not exist.", the cause is that the specified sink file system was created via the Blob API instead of the ADLS Gen2 API elsewhere. To fix the issue, specify a file system name that does not already exist, and ADF will copy the data properly.
 
 ## Get started
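
As a sketch of that workaround, the sink dataset can point at a file system name that does not exist yet, so ADF creates it through the ADLS Gen2 API. The names below are illustrative, and the `AzureBlobFSFile` dataset type is assumed from the rest of this connector article (the first segment of `folderPath` is the file system name):

```json
{
    "name": "ADLSGen2OutputDataset",
    "properties": {
        "type": "AzureBlobFSFile",
        "linkedServiceName": {
            "referenceName": "ADLSGen2LinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "freshfilesystem/output"
        }
    }
}
```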

articles/data-factory/connector-sap-business-warehouse.md

Lines changed: 6 additions & 4 deletions
@@ -12,7 +12,7 @@ ms.workload: data-services
 ms.tgt_pltfrm: na
 ms.devlang: na
 ms.topic: conceptual
-ms.date: 02/07/2018
+ms.date: 08/07/2018
 ms.author: jingwang
 
 ---
@@ -38,10 +38,12 @@ Specifically, this SAP Business Warehouse connector supports:
 To use this SAP Business Warehouse connector, you need to:
 
 - Set up a Self-hosted Integration Runtime. See the [Self-hosted Integration Runtime](create-self-hosted-integration-runtime.md) article for details.
-- Install the **SAP NetWeaver library** on the Integration Runtime machine. You can get the SAP Netweaver library from your SAP administrator, or directly from the [SAP Software Download Center](https://support.sap.com/swdc). Search for the **SAP Note #1025361** to get the download location for the most recent version. Make sure that you pick the **64-bit** SAP NetWeaver library, which matches your Integration Runtime installation. Then install all files included in the SAP NetWeaver RFC SDK according to the SAP Note. The SAP NetWeaver library is also included in the SAP Client Tools installation.
+- Install the **SAP NetWeaver library** on the Integration Runtime machine. You can get the SAP NetWeaver library from your SAP administrator, or directly from the [SAP Software Download Center](https://support.sap.com/swdc). Search for **SAP Note #1025361** to get the download location for the most recent version. Make sure that you pick the **64-bit** SAP NetWeaver library that matches your Integration Runtime installation, then install all files included in the SAP NetWeaver RFC SDK according to the SAP Note. The SAP NetWeaver library is also included in the SAP Client Tools installation.
 
-> [!TIP]
-> Put the dlls extracted from the NetWeaver RFC SDK into system32 folder.
+>[!TIP]
+>To troubleshoot connectivity issues to SAP BW, make sure that:
+>- All dependency libraries extracted from the NetWeaver RFC SDK are in place in the %windir%\system32 folder. Usually these include icudt34.dll, icuin34.dll, icuuc34.dll, libicudecnumber.dll, librfc32.dll, libsapucum.dll, sapcrypto.dll, sapcryto_old.dll, and sapnwrfc.dll.
+>- The ports needed to connect to the SAP server are enabled on the Self-hosted IR machine, usually ports 3300 and 3201.
 
 ## Getting started
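
To connect the prerequisites above, a linked service sketch might look like the following, assuming the `SapBw` linked service type this connector uses; the server, system number, client, and Integration Runtime reference are placeholders:

```json
{
    "name": "SapBWLinkedService",
    "properties": {
        "type": "SapBw",
        "typeProperties": {
            "server": "<server name>",
            "systemNumber": "<system number, e.g. 00>",
            "clientId": "<client id, e.g. 800>",
            "userName": "<SAP user name>",
            "password": {
                "type": "SecureString",
                "value": "<SAP password>"
            }
        },
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

The RFC port is typically 33&lt;system number&gt; (3300 for system number 00), which is why the tip calls out port 3300.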
