articles/data-factory/concepts-linked-services.md (+9 -9)
@@ -17,7 +17,7 @@ ms.date: 09/25/2024
This article describes what linked services are, how they're defined in JSON format, and how they're used in Azure Data Factory and Azure Synapse Analytics.
- To learn more read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse](../synapse-analytics/overview-what-is.md).
+ To learn more, read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse](../synapse-analytics/overview-what-is.md).
## Overview
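Since the article above presents linked services as JSON definitions, here is a minimal skeleton for orientation. This is a sketch only: the name is a placeholder, and the `type` and `typeProperties` values vary by connector.

```json
{
    "name": "MyLinkedService",
    "properties": {
        "type": "<ConnectorType>",
        "typeProperties": {}
    }
}
```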
@@ -27,7 +27,7 @@ Now, a **dataset** is a named view of data that simply points to or references t
Before you create a dataset, you must create a **linked service** to link your data store to the Data Factory or Synapse Workspace. Linked services are much like connection strings, which define the connection information needed for the service to connect to external resources. Think of it this way: the dataset represents the structure of the data within the linked data stores, and the linked service defines the connection to the data source. For example, an Azure Storage linked service links a storage account to the service. An Azure Blob dataset represents the blob container and the folder within that Azure Storage account that contains the input blobs to be processed.
- Here is a sample scenario. To copy data from Blob storage to a SQL Database, you create two linked services: Azure Storage and Azure SQL Database. Then, create two datasets: Azure Blob dataset (which refers to the Azure Storage linked service) and Azure SQL Table dataset (which refers to the Azure SQL Database linked service). The Azure Storage and Azure SQL Database linked services contain connection strings that the service uses at runtime to connect to your Azure Storage and Azure SQL Database, respectively. The Azure Blob dataset specifies the blob container and blob folder that contains the input blobs in your Blob storage. The Azure SQL Table dataset specifies the SQL table in your SQL Database to which the data is to be copied.
+ Here's a sample scenario. To copy data from Blob storage to a SQL Database, you create two linked services: Azure Storage and Azure SQL Database. Then, create two datasets: Azure Blob dataset (which refers to the Azure Storage linked service) and Azure SQL Table dataset (which refers to the Azure SQL Database linked service). The Azure Storage and Azure SQL Database linked services contain connection strings that the service uses at runtime to connect to your Azure Storage and Azure SQL Database, respectively. The Azure Blob dataset specifies the blob container and blob folder that contains the input blobs in your Blob storage. The Azure SQL Table dataset specifies the SQL table in your SQL Database to which the data is to be copied.
The following diagram shows the relationships among pipeline, activity, dataset, and linked service in the service:
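To ground the scenario above, here are hedged sketches of the two linked services it names; the connection string values are illustrative placeholders, not taken from the article.

```json
{
    "name": "AzureStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
        }
    }
}
```

```json
{
    "name": "AzureSqlDatabaseLinkedService",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": "Server=tcp:<server>.database.windows.net,1433;Database=<db>;User ID=<user>;Password=<password>"
        }
    }
}
```

The two datasets in the scenario would then point at these definitions through a `linkedServiceName` reference.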
@@ -37,21 +37,21 @@ The following diagram shows the relationships among pipeline, activity, dataset,
# [Azure Data Factory](#tab/data-factory)
- To create a new linked service in Azure Data Factory Studio, select the **Manage** tab and then **linked services**, where you can see any existing linked services you defined. Select **New** to create a new linked service.
+ To create a new linked service in Azure Data Factory Studio, select the **Manage** tab and then **linked services**, where you can see any existing linked services you defined. Select **+ New** to create a new linked service.
:::image type="content" source="media/concepts-linked-services/create-linked-service.png" alt-text="Shows the Azure Data Factory studio Manage tab with linked services and the New button highlighted.":::
- After selecting New to create a new linked service you will be able to choose any of the supported connectors and configure its details accordingly. Thereafter you can use the linked service in any pipelines you create.
+ After selecting **+ New** to create a new linked service, you can choose any of the supported connectors and configure its details accordingly. Thereafter you can use the linked service in any pipelines you create.
:::image type="content" source="media/concepts-linked-services/new-linked-service-window.png" alt-text="Shows the new linked service window.":::
# [Synapse Analytics](#tab/synapse-analytics)
- To create a new linked service in Synapse Studio, select the **Manage** tab and then **linked services**, where you can see any existing linked services you defined. Select **New** to create a new linked service.
+ To create a new linked service in Synapse Studio, select the **Manage** tab and then **linked services**, where you can see any existing linked services you defined. Select **+ New** to create a new linked service.
:::image type="content" source="media/concepts-linked-services/create-linked-service-synapse.png" alt-text="Shows the Azure Data Factory studio Manage tab with linked services and the New button highlighted.":::
- After selecting New to create a new linked service you will be able to choose any of the supported connectors and configure its details accordingly. Thereafter you can use the linked service in any pipelines you create.
+ After selecting **+ New** to create a new linked service, you can choose any of the supported connectors and configure its details accordingly. Thereafter you can use the linked service in any pipelines you create.
:::image type="content" source="media/concepts-linked-services/new-linked-service-window.png" alt-text="Shows the new linked service window.":::
@@ -113,15 +113,15 @@ Linked services can be created in the Azure Data Factory UX via the [management
You can create linked services by using one of these tools or SDKs: [.NET API](quickstart-create-data-factory-dot-net.md), [PowerShell](quickstart-create-data-factory-powershell.md), [REST API](quickstart-create-data-factory-rest-api.md), [Azure Resource Manager Template](quickstart-create-data-factory-resource-manager-template.md), and [Azure portal](quickstart-create-data-factory-portal.md).
- When creating a linked service, the user needs appropriate authorization to the designated service. If sufficient access is not granted, the user will not be able to see the available resources and will need to use manual entry option.
+ When creating a linked service, the user needs appropriate authorization to the designated service. If sufficient access isn't granted, the user can't see the available resources and needs to use the manual entry option.
## Data store linked services
- You can find the list of supported data stores in the [connector overview](copy-activity-overview.md#supported-data-stores-and-formats) article. Click a data store to learn the supported connection properties.
+ You can find the list of supported data stores in the [connector overview](copy-activity-overview.md#supported-data-stores-and-formats) article. Select a data store to learn the supported connection properties.
## Compute linked services
- Reference [compute environments supported](compute-linked-services.md) for details about different compute environments you can connect to from your service as well as the different configurations.
+ Reference [compute environments supported](compute-linked-services.md) for details about different compute environments you can connect to from your service and the different configurations.
articles/data-factory/connector-troubleshoot-guide.md (+13 -13)
@@ -29,7 +29,7 @@ You can refer to the troubleshooting pages for each connector to see problems sp
- [DB2](connector-troubleshoot-db2.md)
- [Delimited text format](connector-troubleshoot-delimited-text.md)
- [Dynamics 365, Dataverse (Common Data Service), and Dynamics CRM](connector-troubleshoot-dynamics-dataverse.md)
- - [FTP, SFTP and HTTP](connector-troubleshoot-ftp-sftp-http.md)
+ - [FTP, SFTP, and HTTP](connector-troubleshoot-ftp-sftp-http.md)
- [Hive](connector-troubleshoot-hive.md)
- [Oracle](connector-troubleshoot-oracle.md)
- [ORC format](connector-troubleshoot-orc.md)
@@ -41,7 +41,7 @@ You can refer to the troubleshooting pages for each connector to see problems sp
## General copy activity errors
- The errors below are general to the copy activity and could occur with any connector.
+ The following errors are general to the copy activity and could occur with any connector.
#### Error code: 20000
@@ -56,9 +56,9 @@ The errors below are general to the copy activity and could occur with any conne
- **Message**: `An error occurred when invoking Java Native Interface.`
- - **Cause**: If the error message contains "Cannot create JVM: JNI return code [-6][JNI call failed: Invalid arguments.]", the possible cause is that JVM can't be created because some illegal (global) arguments are set.
+ - **Cause**: If the error message contains "Can't create JVM: JNI return code [-6][JNI call failed: Invalid arguments.]," the possible cause is that JVM can't be created because some illegal (global) arguments are set.
- - **Recommendation**: Log in to the machine that hosts *each node* of your self-hosted integration runtime. Check to ensure that the system variable is set correctly, as follows: `_JAVA_OPTIONS "-Xms256m -Xmx16g" with memory bigger than 8G`. Restart all the integration runtime nodes, and then rerun the pipeline.
+ - **Recommendation**: Sign in to the machine that hosts *each node* of your self-hosted integration runtime. Check to ensure that the system variable is set correctly, as follows: `_JAVA_OPTIONS "-Xms256m -Xmx16g" with memory bigger than 8G`. Restart all the integration runtime nodes, and then rerun the pipeline.
#### Error code: 20020
@@ -75,9 +75,9 @@ The errors below are general to the copy activity and could occur with any conne
- **Cause**: This error might occur when you copy data with connectors such as Azure Blob, SFTP, and so on. Federal Information Processing Standards (FIPS) defines a certain set of cryptographic algorithms that are allowed to be used. When FIPS mode is enabled on the machine, some cryptographic classes that copy activity depends on are blocked in some scenarios.
- - **Resolution**: Learn [why we’re not recommending “FIPS Mode” anymore](https://techcommunity.microsoft.com/t5/microsoft-security-baselines/why-we-8217-re-not-recommending-8220-fips-mode-8221-anymore/ba-p/701037), and evaluate whether you can disable FIPS on your self-hosted IR machine.
+ - **Resolution**: Learn [why we’re not recommending "FIPS Mode" anymore](https://techcommunity.microsoft.com/t5/microsoft-security-baselines/why-we-8217-re-not-recommending-8220-fips-mode-8221-anymore/ba-p/701037), and evaluate whether you can disable FIPS on your self-hosted IR machine.
- Alternatively, if you only want to bypass FIPS and make the activity runs succeed, do the following:
+ Alternatively, if you only want to bypass FIPS and make the activity runs succeed, take the following steps:
1. Open the folder where Self-hosted IR is installed. The path is usually *C:\Program Files\Microsoft Integration Runtime \<IR version>\Shared*.
@@ -107,7 +107,7 @@ The errors below are general to the copy activity and could occur with any conne
- **Message**: `The toke type '%tokenType;' from your authorization server is not supported, supported types: '%tokenTypes;'.`
- - **Cause**: Your authorization server is not supported.
+ - **Cause**: Your authorization server isn't supported.
- **Recommendation**: Use an authorization server that can return tokens with supported token types.
@@ -129,7 +129,7 @@ The errors below are general to the copy activity and could occur with any conne
- **Message**: `The format settings are missing in dataset %dataSetName;.`
- - **Cause**: The dataset type is Binary, which is not supported.
+ - **Cause**: The dataset type is Binary, which isn't supported.
- **Recommendation**: Use the DelimitedText, Json, Avro, Orc, or Parquet dataset instead.
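As a hedged illustration of the recommendation above, a DelimitedText dataset supplies the format settings that a Binary dataset lacks. The names and location values here are assumptions for the sketch, not taken from the guide.

```json
{
    "name": "InputDelimitedTextDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input",
                "fileName": "data.csv"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
```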
@@ -147,13 +147,13 @@ The errors below are general to the copy activity and could occur with any conne
- **Message**: `Failed to retrieve source file ('%name;') metadata to validate data consistency.`
- - **Cause**: There is a transient issue on the sink data store, or retrieving metadata from the sink data store is not allowed.
+ - **Cause**: There's a transient issue on the sink data store, or retrieving metadata from the sink data store isn't allowed.
#### Error code: 20703
- **Message**: `Failed to retrieve sink file ('%name;') metadata to validate data consistency.`
- - **Cause**: There is a transient issue on the sink data store, or retrieving metadata from the sink data store is not allowed.
+ - **Cause**: There's a transient issue on the sink data store, or retrieving metadata from the sink data store isn't allowed.
#### Error code: 20704
@@ -245,15 +245,15 @@ The errors below are general to the copy activity and could occur with any conne
- **Cause**: You provide a wrong or invalid query to fetch the data/schemas.
- - **Recommendation**: Verify your query is valid and can return data/schemas. Use [Script activity](transform-data-using-script.md) if you want to execute non-query scripts and your data store is supported. Alternatively, consider to use stored procedure that returns a dummy result to execute your non-query scripts.
+ - **Recommendation**: Verify your query is valid and can return data/schemas. Use [Script activity](transform-data-using-script.md) if you want to execute nonquery scripts and your data store is supported. Alternatively, consider using a stored procedure that returns a dummy result to execute your nonquery scripts.
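A hedged sketch of the Script activity route mentioned in the recommendation above; the activity name, linked service name, and statement are placeholders. The `NonQuery` entry in `scripts` is what lets a statement that returns no rows run without being treated as a source query.

```json
{
    "name": "RunNonQueryScript",
    "type": "Script",
    "linkedServiceName": {
        "referenceName": "AzureSqlDatabaseLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "scripts": [
            {
                "type": "NonQuery",
                "text": "TRUNCATE TABLE dbo.StagingTable"
            }
        ]
    }
}
```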
#### Error code: 11775
- **Message**: `Failed to connect to your instance of Azure Database for PostgreSQL flexible server. '%'`
- - **Cause**: Exact cause depends on the text returned in `'%'`. If it is **The operation has timed out**, it can be because the instance of PostgreSQL is stopped or because the network connectivity method configured for your instance doesn't allow connections from the Integration Runtime selected. User or password provided are incorrect. If it is **28P01: password authentication failed for user "*youruser*"**, it means that the user provided doesn't exist in the instance or that the password is incorrect. If it is **28000: no pg_hba.conf entry for host "*###.###.###.###*", user "*youruser*", database "*yourdatabase*", no encryption**, it means that the encryption method selected is not compatible with the configuration of the server.
+ - **Cause**: Exact cause depends on the text returned in `'%'`. If it's **The operation has timed out**, it can be because the instance of PostgreSQL is stopped or because the network connectivity method configured for your instance doesn't allow connections from the Integration Runtime selected. User or password provided are incorrect. If it's **28P01: password authentication failed for user <youruser>**, it means that the user provided doesn't exist in the instance or that the password is incorrect. If it's **28000: no pg_hba.conf entry for host "*###.###.###.###*", user "<youruser>", database "<yourdatabase>", no encryption**, it means that the encryption method selected isn't compatible with the configuration of the server.
- - **Recommendation**: Confirm that the user provided exists in your instance of PostgreSQL and that the password corresponds to the one currently assigned to that user. Make sure that the encryption method selected is accepted by your instance of PostgreSQL, based on its current configuration. If the network connectivity method of your instance is configured for Private access (VNet integration), use a Self-Hosted Integration Runtime (IR) to connect to it. If it is configured for Public access (allowed IP addresses), it is recommended to use an Azure IR with managed virtual network and deploy a managed private endpoint to connect to your instance. When it is configured for Public access (allowed IP addresses) a less recommended alternative consists in creating firewall rules in your instance to allow traffic originating on the IP addresses used by the Azure IR you're using.
+ - **Recommendation**: Confirm that the user provided exists in your instance of PostgreSQL and that the password corresponds to the one currently assigned to that user. Make sure that the encryption method selected is accepted by your instance of PostgreSQL, based on its current configuration. If the network connectivity method of your instance is configured for Private access (virtual network integration), use a Self-Hosted Integration Runtime (IR) to connect to it. If it's configured for Public access (allowed IP addresses), it's recommended to use an Azure IR with managed virtual network and deploy a managed private endpoint to connect to your instance. When it's configured for Public access (allowed IP addresses), a less recommended alternative is to create firewall rules in your instance to allow traffic originating on the IP addresses used by the Azure IR you're using.
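To ground the integration runtime advice above, a hedged sketch of pinning a linked service's connection to a self-hosted IR through `connectVia`; the connector type, names, and connection string shown are assumptions for illustration, not taken from the guide.

```json
{
    "name": "PostgreSqlFlexibleServerLinkedService",
    "properties": {
        "type": "AzurePostgreSql",
        "typeProperties": {
            "connectionString": "Server=<server>.postgres.database.azure.com;Database=<db>;Port=5432;UID=<user>;Password=<password>"
        },
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```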