<!-- File: articles/data-factory/connector-amazon-redshift.md -->
author: jianleishen
ms.subservice: data-movement
ms.custom: synapse
ms.topic: conceptual
ms.date: 05/28/2025
---

# Copy data from Amazon Redshift using Azure Data Factory or Synapse Analytics

This article outlines how to use the Copy Activity in Azure Data Factory and Synapse Analytics pipelines to copy data from Amazon Redshift. It builds on the [copy activity overview](copy-activity-overview.md) article that presents a general overview of copy activity.

> [!IMPORTANT]
> The Amazon Redshift connector version 2.0 (Preview) provides improved native Amazon Redshift support. If you're using version 1.0 in your solution, we recommend that you [upgrade your Amazon Redshift connector](#upgrade-the-amazon-redshift-connector) at your earliest convenience. See [this section](#differences-between-amazon-redshift-connector-version-20-and-version-10) for details on the differences between version 2.0 (Preview) and version 1.0.

## Supported capabilities

This Amazon Redshift connector is supported for the following capabilities:

For a list of data stores that are supported as sources or sinks by the copy activity, see the [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats) table.

For version 2.0 (Preview), you need to [install the Amazon Redshift ODBC driver](https://docs.aws.amazon.com/redshift/latest/mgmt/odbc20-install-win.html) manually. For version 1.0, the service provides a built-in driver to enable connectivity, so you don't need to install a driver manually.

The Amazon Redshift connector supports retrieving data from Redshift using query or built-in Redshift UNLOAD support.

The connector supports the Windows versions in this [article](create-self-hosted-integration-runtime.md#prerequisites).
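When troubleshooting version 2.0 (Preview) connectivity outside the service, it can help to verify the ODBC driver with a standalone client. The sketch below only assembles an ODBC connection string; the driver name and cluster endpoint are illustrative assumptions, not values from this article, so substitute the exact driver name your ODBC Data Source Administrator shows after installation:

```python
# Sketch: assemble an ODBC connection string for an Amazon Redshift cluster.
# The driver name below is an assumption; check the installed driver's exact
# name in the ODBC Data Source Administrator before using it.

def redshift_odbc_connection_string(server, database, user, password, port=5439):
    """Build an ODBC connection string; 5439 is Redshift's default port."""
    return (
        "Driver={Amazon Redshift ODBC Driver (x64)};"
        f"Server={server};Port={port};Database={database};"
        f"UID={user};PWD={password}"
    )

conn_str = redshift_odbc_connection_string(
    "examplecluster.abc123.us-west-2.redshift.amazonaws.com",
    "dev", "awsuser", "<password>")
print(conn_str)
```

You could pass a string like this to any ODBC client (for example, `pyodbc.connect(conn_str)`) to confirm the driver and network path work before testing the pipeline itself.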
## Prerequisites

* If you are copying data to an on-premises data store using [Self-hosted Integration Runtime](create-self-hosted-integration-runtime.md), grant the Integration Runtime (use the IP address of the machine) access to the Amazon Redshift cluster. See [Authorize access to the cluster](https://docs.aws.amazon.com/redshift/latest/gsg/rs-gsg-authorize-cluster-access.html) for instructions. If you use version 2.0, your self-hosted integration runtime version should be 5.54.0.0 or above.
* If you are copying data to an Azure data store, see [Azure Data Center IP Ranges](https://www.microsoft.com/download/details.aspx?id=41653) for the Compute IP address and SQL ranges used by the Azure data centers.

## Getting started

## Linked service properties

The following properties are supported for the Amazon Redshift linked service:

| Property | Description | Required |
|:--- |:--- |:--- |
| type | The type property must be set to: **AmazonRedshift** | Yes |
| version | The version that you specify. | Yes for version 2.0 (Preview). |
| server | IP address or host name of the Amazon Redshift server. | Yes |
| port | The number of the TCP port that the Amazon Redshift server uses to listen for client connections. | No, default is 5439 |
| database | Name of the Amazon Redshift database. | Yes |
| username | Name of the user who has access to the database. | Yes |
| password | Password for the user account. Mark this field as a SecureString to store it securely, or [reference a secret stored in Azure Key Vault](store-credentials-in-key-vault.md). | Yes |
| connectVia | The [Integration Runtime](concepts-integration-runtime.md) to be used to connect to the data store. <br>If you select version 2.0 (Preview), you can only use the self-hosted integration runtime, and its version should be 5.54.0.0 or above.<br>If you select version 1.0, you can use the Azure Integration Runtime or a Self-hosted Integration Runtime (if your data store is located in a private network). If not specified, it uses the default Azure Integration Runtime. | No |

**Example: version 2.0 (Preview)**

```json
{
    "name": "AmazonRedshiftLinkedService",
    "properties": {
        "type": "AmazonRedshift",
        "version": "2.0",
        "typeProperties": {
            "server": "<server name>",
            "database": "<database name>",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

**Example: version 1.0**

```json
{
    "name": "AmazonRedshiftLinkedService",
    "properties": {
        "type": "AmazonRedshift",
        "typeProperties": {
            "server": "<server name>",
            "database": "<database name>",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

## Data type mapping for Amazon Redshift
When you copy data from Amazon Redshift, the following mappings apply from Amazon Redshift's data types to the internal data types used by the service. To learn about how the copy activity maps the source schema and data type to the sink, see [Schema and data type mappings](copy-activity-schema-and-type-mapping.md).

| Amazon Redshift data type | Interim service data type (for version 2.0 (Preview)) | Interim service data type (for version 1.0) |
|:--- |:--- |:--- |
| BIGINT | Int64 | Int64 |
| BOOLEAN | Boolean | String |
| CHAR | String | String |
| DATE | DateTime | DateTime |
| DECIMAL | String | Decimal |
| DOUBLE PRECISION | Double | Double |
| INTEGER | Int32 | Int32 |
| REAL | Single | Single |
| SMALLINT | Int16 | Int16 |
| TEXT | String | String |
| TIMESTAMP | DateTime | DateTime |
| VARCHAR | String | String |

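One practical consequence of the mapping above: in version 2.0 (Preview), Redshift DECIMAL values arrive as strings rather than decimals. A downstream consumer can restore exact numerics from that string form; here is a minimal sketch (the helper function is illustrative, not part of the connector):

```python
from decimal import Decimal

# In connector version 2.0 (Preview), DECIMAL columns surface as strings.
# Converting with Decimal (not float) preserves exact scale and precision.
def parse_decimal_column(values):
    """Convert string-typed DECIMAL values back to Decimal, passing None through."""
    return [Decimal(v) if v is not None else None for v in values]

row = parse_decimal_column(["19.99", "0.001", None])
print(row)
```

Using `Decimal` avoids the rounding you would get from `float("0.001")`-style conversions when the values are re-inserted into an exact-numeric sink column.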
## Lookup activity properties
To learn details about the properties, check [Lookup activity](control-flow-lookup-activity.md).
## Upgrade the Amazon Redshift connector

Here are the steps to upgrade the Amazon Redshift connector:

1. On the **Edit linked service** page, select version 2.0 (Preview) and configure the linked service by referring to [Linked service properties](#linked-service-properties).

2. The data type mapping for the Amazon Redshift linked service version 2.0 (Preview) is different from that for version 1.0. To learn the latest data type mapping, see [Data type mapping for Amazon Redshift](#data-type-mapping-for-amazon-redshift).

3. Use a self-hosted integration runtime with version 5.54.0.0 or above. The Azure integration runtime is not supported by version 2.0 (Preview).

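Step 3 hinges on a version check, and four-part versions like 5.54.0.0 compare incorrectly as plain strings ("5.9" sorts after "5.54"). A small sketch of a numeric comparison against the minimum version stated above (the helper is illustrative, not a service API):

```python
# Minimum self-hosted integration runtime version for connector 2.0 (Preview),
# per the upgrade steps above.
MIN_SHIR_VERSION = (5, 54, 0, 0)

def meets_minimum(version_string, minimum=MIN_SHIR_VERSION):
    """Compare a dotted version string, e.g. '5.54.1.0', numerically against the minimum."""
    parts = tuple(int(p) for p in version_string.split("."))
    # Pad with zeros so shorter strings like '5.54' compare as (5, 54, 0, 0).
    parts += (0,) * (len(minimum) - len(parts))
    return parts >= minimum

print(meets_minimum("5.54.1.0"))  # numerically at or above the minimum: True
print(meets_minimum("5.9.0.0"))   # string-wise "greater", but numerically too old
```

The tuple comparison evaluates components left to right as integers, which is why "5.9.0.0" correctly fails the check even though it sorts after "5.54.0.0" alphabetically.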

## <a name="differences-between-amazon-redshift-connector-version-20-and-version-10"></a>Differences between Amazon Redshift connector version 2.0 (Preview) and version 1.0

The Amazon Redshift connector version 2.0 (Preview) offers new functionalities and is compatible with most features of version 1.0. The following table shows the feature differences between version 2.0 (Preview) and version 1.0.

| Version 2.0 (Preview) | Version 1.0 |
| :----------- | :------- |
| Only supports the self-hosted integration runtime with version 5.54.0.0 or above. | Supports the Azure integration runtime and the self-hosted integration runtime. |
| The following mappings are used from Amazon Redshift data types to interim service data types:<br><br>BOOLEAN -> Boolean<br>DECIMAL -> String | The following mappings are used from Amazon Redshift data types to interim service data types:<br><br>BOOLEAN -> String<br>DECIMAL -> Decimal |

## Related content
For a list of data stores supported as sources and sinks by the copy activity, see [supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).