You can copy data from Azure Database for PostgreSQL to any supported sink data store. Or, you can copy data from any supported source data store to Azure Database for PostgreSQL. For a list of data stores that the copy activity supports as sources and sinks, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.
Azure Data Factory provides a built-in driver to enable connectivity. Therefore, you don't need to manually install any driver to use this connector.
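
As an illustration, a basic linked service definition that supplies the connection string inline might look like the following sketch. The property layout follows the common Data Factory linked service shape; the name and all server, database, and credential values are placeholders:

```json
{
    "name": "AzurePostgreSqlLinkedService",
    "properties": {
        "type": "AzurePostgreSql",
        "typeProperties": {
            "connectionString": "Server=<server>.postgres.database.azure.com;Database=<database>;Port=<port>;UID=<username>;Password=<password>"
        }
    }
}
```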
**Example**:
***Store password in Azure Key Vault***
```json
{
    "name": "AzurePostgreSqlLinkedService",
    "properties": {
        "type": "AzurePostgreSql",
        "typeProperties": {
            "connectionString": "Server=<server>.postgres.database.azure.com;Database=<database>;Port=<port>;UID=<username>;",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<Azure Key Vault linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<secretName>"
            }
        }
    }
}
```
## Dataset properties
For a full list of sections and properties available for defining datasets, see [Datasets in Azure Data Factory](concepts-datasets-linked-services.md). This section provides a list of properties that Azure Database for PostgreSQL supports in datasets.
To copy data from Azure Database for PostgreSQL, set the type property of the dataset to **AzurePostgreSqlTable**. The following properties are supported:
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property of the dataset must be set to **AzurePostgreSqlTable**. | Yes |
| tableName | Name of the table. | No (if "query" in activity source is specified) |
**Example**:
```json
{
    "name": "AzurePostgreSqlDataset",
    "properties": {
        "type": "AzurePostgreSqlTable",
        "linkedServiceName": {
            "referenceName": "<AzurePostgreSql linked service name>",
            "type": "LinkedServiceReference"
        }
    }
}
```
## Copy activity properties
For a full list of sections and properties available for defining activities, see [Pipelines and activities in Azure Data Factory](concepts-pipelines-activities.md). This section provides a list of properties supported by an Azure Database for PostgreSQL source.
### Azure Database for PostgreSQL as source
To copy data from Azure Database for PostgreSQL, set the source type in the copy activity to **AzurePostgreSqlSource**. The following properties are supported in the copy activity **source** section:
| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property of the copy activity source must be set to **AzurePostgreSqlSource**. | Yes |
| query | Use a custom SQL query to read data. For example: `"SELECT * FROM MyTable"`. | No (if the tableName property in the dataset is specified) |
**Example**:
```json
"activities":[
    {
        "name": "CopyFromAzurePostgreSql",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<AzurePostgreSql input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "AzurePostgreSqlSource",
                "query": "SELECT * FROM MyTable"
            },
            "sink": {
                "type": "<sink type>"
            }
        }
    }
]
```

To copy data to Azure Database for PostgreSQL, the following properties are supported in the copy activity **sink** section:
| Property | Description | Required |
|:--- |:--- |:--- |
| writeBatchSize | Inserts data into the Azure Database for PostgreSQL table when the buffer size reaches writeBatchSize.<br>Allowed value is an integer that represents the number of rows. | No (default is 10,000) |
| writeBatchTimeout | Wait time for the batch insert operation to complete before it times out.<br>Allowed values are Timespan strings. An example is 00:30:00 (30 minutes). | No (default is 00:00:30) |
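
For instance, a sink that tunes both batching properties might be configured as in the following sketch. The sink type name **AzurePostgreSqlSink** is assumed here by analogy with the source type, and the values shown are illustrative:

```json
"sink": {
    "type": "AzurePostgreSqlSink",
    "writeBatchSize": 10000,
    "writeBatchTimeout": "00:30:00"
}
```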