### Azure Database for PostgreSQL as sink
To copy data to Azure Database for PostgreSQL, set the sink type in the copy activity to **AzurePostgreSQLSink**. The following properties are supported in the copy activity **sink** section:
| Property | Description | Required | Connector support version |
|:--- |:--- |:--- |:--- |
| type | The type property of the copy activity sink must be set to **AzurePostgreSQLSink**. | Yes | Version 1.0 & Version 2.0 |
| preCopyScript | Specify a SQL query for the copy activity to execute before you write data into Azure Database for PostgreSQL in each run. You can use this property to clean up the preloaded data. | No | Version 1.0 & Version 2.0 |
| writeMethod | The method used to write data into Azure Database for PostgreSQL.<br>Allowed values are: **CopyCommand** (default, which is more performant), **BulkInsert**, and **Upsert** (Version 2.0 only). | No | Version 1.0 & Version 2.0 |
| upsertSettings | Specify the group of settings for write behavior.<br/>Applies when `writeMethod` is `Upsert`. | No | Version 2.0 |
| ***Under `upsertSettings`:*** | | | |
| keys | Specify the column names for unique row identification. Either a single key or a series of keys can be used. Keys must be a primary key or a unique column. If not specified, the primary key is used. | No | Version 2.0 |
| writeBatchSize | The number of rows loaded into Azure Database for PostgreSQL per batch.<br>Allowed value is an integer that represents the number of rows. | No (default is 1,000,000) | Version 1.0 & Version 2.0 |
| writeBatchTimeout | The wait time for the batch insert operation to complete before it times out.<br>Allowed values are timespan strings, for example, 00:30:00 (30 minutes). | No (default is 00:30:00) | Version 1.0 & Version 2.0 |
**Example 1: Copy Command**
```json
"activities":[
    {
        "name": "CopyToAzureDatabaseForPostgreSQL",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<Azure Database for PostgreSQL output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "<source type>"
            },
            "sink": {
                "type": "AzurePostgreSQLSink",
                "writeMethod": "CopyCommand"
            }
        }
    }
]
```
Copy activity natively supports upsert operations. To perform an upsert, provide key columns that are either primary keys or unique columns. If you don't provide key columns, the primary key columns of the sink table are used. Copy activity updates the non-key columns of sink rows whose key values match those in the source table; otherwise, it inserts new rows.
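The upsert behavior described above can be sketched as the following sink configuration. This is a minimal illustration, not a complete activity definition; the column name is a placeholder:

```json
"sink": {
    "type": "AzurePostgreSQLSink",
    "writeMethod": "Upsert",
    "upsertSettings": {
        "keys": [ "<key column name>" ]
    }
}
```

If `keys` is omitted, the sink table's primary key columns are used for row matching.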
## Parallel copy from Azure Database for PostgreSQL
The Azure Database for PostgreSQL connector in copy activity provides built-in data partitioning to copy data in parallel. You can find data partitioning options on the **Source** tab of the copy activity.
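As an illustration of how partitioning options appear in an activity definition (a sketch following the copy activity source pattern; bounds and the column name are placeholders), a dynamic-range partitioned source might look like:

```json
"source": {
    "type": "AzurePostgreSQLSource",
    "partitionOption": "DynamicRange",
    "partitionSettings": {
        "partitionColumnName": "<partition column>",
        "partitionLowerBound": "<lower bound>",
        "partitionUpperBound": "<upper bound>"
    }
}
```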
> Script activity is only supported in the version 2.0 connector.

> [!IMPORTANT]
> Multi-query statements using output parameters are not supported. It is recommended that you split any output queries into separate script blocks within the same or a different script activity.
>
> Multi-query statements using positional parameters are not supported. It is recommended that you split any positional queries into separate script blocks within the same or a different script activity.

For more information about script activity, see [Script activity](transform-data-using-script.md).
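To show what splitting statements into separate script blocks looks like in practice, here is a hedged sketch of a Script activity definition (the activity name, linked service reference, and SQL text are placeholders):

```json
{
    "name": "RunPostgreSQLScripts",
    "type": "Script",
    "linkedServiceName": {
        "referenceName": "<Azure Database for PostgreSQL linked service>",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "scripts": [
            {
                "type": "NonQuery",
                "text": "TRUNCATE TABLE <staging table>;"
            },
            {
                "type": "Query",
                "text": "SELECT COUNT(*) FROM <target table>;"
            }
        ]
    }
}
```

Each entry in `scripts` runs as its own block, which avoids the unsupported multi-query patterns called out above.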
## Lookup activity properties
For more information about the properties, see [Lookup activity](control-flow-lookup-activity.md).
articles/data-factory/transform-data-using-script.md
ms.topic: conceptual
author: nabhishek
ms.author: abnarain
ms.custom: synapse
ms.date: 08/01/2025
ms.subservice: orchestration
---
Using the script activity, you can execute common operations with Data Manipulation Language (DML) and Data Definition Language (DDL).

You can use the Script activity to invoke a SQL script in one of the following data stores in your enterprise or on an Azure virtual machine (VM):

- Azure Database for PostgreSQL (Version 2.0)
- Azure SQL Database
- Azure Synapse Analytics
- SQL Server Database. If you're using SQL Server, install the Self-hosted integration runtime on the same machine that hosts the database or on a separate machine that has access to the database. The Self-hosted integration runtime is a component that connects data sources on-premises or on an Azure VM with cloud services in a secure and managed way. See the [Self-hosted integration runtime](create-self-hosted-integration-runtime.md) article for details.
- Oracle
- Snowflake

The script can contain either a single SQL statement or multiple SQL statements that run sequentially. You can use the Script task for the following purposes: