
Commit 1cf6e8f

Acrolinx improvements
1 parent 44eabbb commit 1cf6e8f

File tree: 4 files changed, +243 -243 lines changed


articles/data-factory/concepts-linked-services.md

Lines changed: 9 additions & 9 deletions
@@ -17,7 +17,7 @@ ms.date: 09/25/2024
 This article describes what linked services are, how they're defined in JSON format, and how they're used in Azure Data Factory and Azure Synapse Analytics.

-To learn more read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse](../synapse-analytics/overview-what-is.md).
+To learn more, read the introductory article for [Azure Data Factory](introduction.md) or [Azure Synapse](../synapse-analytics/overview-what-is.md).

 ## Overview

@@ -27,7 +27,7 @@ Now, a **dataset** is a named view of data that simply points to or references t
 Before you create a dataset, you must create a **linked service** to link your data store to the Data Factory or Synapse Workspace. Linked services are much like connection strings, which define the connection information needed for the service to connect to external resources. Think of it this way: the dataset represents the structure of the data within the linked data stores, and the linked service defines the connection to the data source. For example, an Azure Storage linked service links a storage account to the service. An Azure Blob dataset represents the blob container and the folder within that Azure Storage account that contains the input blobs to be processed.

-Here is a sample scenario. To copy data from Blob storage to a SQL Database, you create two linked services: Azure Storage and Azure SQL Database. Then, create two datasets: Azure Blob dataset (which refers to the Azure Storage linked service) and Azure SQL Table dataset (which refers to the Azure SQL Database linked service). The Azure Storage and Azure SQL Database linked services contain connection strings that the service uses at runtime to connect to your Azure Storage and Azure SQL Database, respectively. The Azure Blob dataset specifies the blob container and blob folder that contains the input blobs in your Blob storage. The Azure SQL Table dataset specifies the SQL table in your SQL Database to which the data is to be copied.
+Here's a sample scenario. To copy data from Blob storage to a SQL Database, you create two linked services: Azure Storage and Azure SQL Database. Then, create two datasets: an Azure Blob dataset (which refers to the Azure Storage linked service) and an Azure SQL Table dataset (which refers to the Azure SQL Database linked service). The Azure Storage and Azure SQL Database linked services contain connection strings that the service uses at runtime to connect to your Azure Storage and Azure SQL Database, respectively. The Azure Blob dataset specifies the blob container and blob folder that contains the input blobs in your Blob storage. The Azure SQL Table dataset specifies the SQL table in your SQL Database to which the data is to be copied.
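To ground the scenario, here's a minimal sketch of how the Azure Storage linked service described above could be defined in JSON and deployed with the Az.DataFactory PowerShell module. All resource names and the connection-string placeholders are illustrative assumptions, not part of this commit:

```powershell
# Minimal sketch (assumed names and placeholders): define an Azure Blob Storage
# linked service in JSON, then deploy it with the Az.DataFactory module.
# Requires an authenticated session (Connect-AzAccount).
$definition = @'
{
    "name": "AzureStorageLinkedService",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
        }
    }
}
'@
Set-Content -Path .\AzureStorageLinkedService.json -Value $definition

Set-AzDataFactoryV2LinkedService -ResourceGroupName '<resource-group>' `
    -DataFactoryName '<factory-name>' -Name 'AzureStorageLinkedService' `
    -DefinitionFile '.\AzureStorageLinkedService.json'
```

The Azure SQL Database linked service in the scenario would follow the same pattern, with `"type": "AzureSqlDatabase"` and its own connection string.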

 The following diagram shows the relationships among pipeline, activity, dataset, and linked service in the service:

@@ -37,21 +37,21 @@ The following diagram shows the relationships among pipeline, activity, dataset,
 # [Azure Data Factory](#tab/data-factory)

-To create a new linked service in Azure Data Factory Studio, select the **Manage** tab and then **linked services**, where you can see any existing linked services you defined. Select **New** to create a new linked service.
+To create a new linked service in Azure Data Factory Studio, select the **Manage** tab and then **linked services**, where you can see any existing linked services you defined. Select **+ New** to create a new linked service.

 :::image type="content" source="media/concepts-linked-services/create-linked-service.png" alt-text="Shows the Azure Data Factory studio Manage tab with linked services and the New button highlighted.":::

-After selecting New to create a new linked service you will be able to choose any of the supported connectors and configure its details accordingly. Thereafter you can use the linked service in any pipelines you create.
+After selecting **+ New** to create a new linked service, you can choose any of the supported connectors and configure its details accordingly. Thereafter you can use the linked service in any pipelines you create.

 :::image type="content" source="media/concepts-linked-services/new-linked-service-window.png" alt-text="Shows the new linked service window.":::

 # [Synapse Analytics](#tab/synapse-analytics)

-To create a new linked service in Synapse Studio, select the **Manage** tab and then **linked services**, where you can see any existing linked services you defined. Select **New** to create a new linked service.
+To create a new linked service in Synapse Studio, select the **Manage** tab and then **linked services**, where you can see any existing linked services you defined. Select **+ New** to create a new linked service.

 :::image type="content" source="media/concepts-linked-services/create-linked-service-synapse.png" alt-text="Shows the Azure Data Factory studio Manage tab with linked services and the New button highlighted.":::

-After selecting New to create a new linked service you will be able to choose any of the supported connectors and configure its details accordingly. Thereafter you can use the linked service in any pipelines you create.
+After selecting **+ New** to create a new linked service, you can choose any of the supported connectors and configure its details accordingly. Thereafter you can use the linked service in any pipelines you create.

 :::image type="content" source="media/concepts-linked-services/new-linked-service-window.png" alt-text="Shows the new linked service window.":::

@@ -113,15 +113,15 @@ Linked services can be created in the Azure Data Factory UX via the [management
 You can create linked services by using one of these tools or SDKs: [.NET API](quickstart-create-data-factory-dot-net.md), [PowerShell](quickstart-create-data-factory-powershell.md), [REST API](quickstart-create-data-factory-rest-api.md), [Azure Resource Manager Template](quickstart-create-data-factory-resource-manager-template.md), and [Azure portal](quickstart-create-data-factory-portal.md).

-When creating a linked service, the user needs appropriate authorization to the designated service. If sufficient access is not granted, the user will not be able to see the available resources and will need to use manual entry option.
+When creating a linked service, the user needs appropriate authorization to the designated service. If sufficient access isn't granted, the user can't see the available resources and needs to use the manual entry option.
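As one concrete example from the tool list above, the REST API route can be called directly. A hedged sketch, assuming the Az.Accounts module for the bearer token; the subscription, resource group, and factory names are placeholders:

```powershell
# Hedged sketch: create or update a linked service through the Data Factory REST API.
# Assumes Connect-AzAccount has been run; all IDs below are placeholders.
$token = (Get-AzAccessToken -ResourceUrl 'https://management.azure.com/').Token
$uri = 'https://management.azure.com/subscriptions/<subscription-id>' +
       '/resourceGroups/<resource-group>/providers/Microsoft.DataFactory' +
       '/factories/<factory-name>/linkedservices/AzureStorageLinkedService' +
       '?api-version=2018-06-01'
$body = @'
{
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": { "connectionString": "<connection-string>" }
  }
}
'@
Invoke-RestMethod -Method Put -Uri $uri -Body $body -ContentType 'application/json' `
    -Headers @{ Authorization = "Bearer $token" }
```

The authorization point in the paragraph above applies here too: the identity behind the token needs rights on the factory, and on the target store if you want to browse its resources rather than enter them manually.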
 ## Data store linked services

-You can find the list of supported data stores in the [connector overview](copy-activity-overview.md#supported-data-stores-and-formats) article. Click a data store to learn the supported connection properties.
+You can find the list of supported data stores in the [connector overview](copy-activity-overview.md#supported-data-stores-and-formats) article. Select a data store to learn the supported connection properties.

 ## Compute linked services

-Reference [compute environments supported](compute-linked-services.md) for details about different compute environments you can connect to from your service as well as the different configurations.
+See [supported compute environments](compute-linked-services.md) for details about the different compute environments you can connect to from your service and their configurations.

 ## Related content

articles/data-factory/connector-troubleshoot-guide.md

Lines changed: 13 additions & 13 deletions
@@ -29,7 +29,7 @@ You can refer to the troubleshooting pages for each connector to see problems sp
 - [DB2](connector-troubleshoot-db2.md)
 - [Delimited text format](connector-troubleshoot-delimited-text.md)
 - [Dynamics 365, Dataverse (Common Data Service), and Dynamics CRM](connector-troubleshoot-dynamics-dataverse.md)
-- [FTP, SFTP and HTTP](connector-troubleshoot-ftp-sftp-http.md)
+- [FTP, SFTP, and HTTP](connector-troubleshoot-ftp-sftp-http.md)
 - [Hive](connector-troubleshoot-hive.md)
 - [Oracle](connector-troubleshoot-oracle.md)
 - [ORC format](connector-troubleshoot-orc.md)
@@ -41,7 +41,7 @@ You can refer to the troubleshooting pages for each connector to see problems sp
 ## General copy activity errors

-The errors below are general to the copy activity and could occur with any connector.
+The following errors are general to the copy activity and could occur with any connector.

 #### Error code: 20000

@@ -56,9 +56,9 @@ The errors below are general to the copy activity and could occur with any conne
 - **Message**: `An error occurred when invoking Java Native Interface.`

-- **Cause**: If the error message contains "Cannot create JVM: JNI return code [-6][JNI call failed: Invalid arguments.]", the possible cause is that JVM can't be created because some illegal (global) arguments are set.
+- **Cause**: If the error message contains "Can't create JVM: JNI return code [-6][JNI call failed: Invalid arguments.]," the possible cause is that the JVM can't be created because some illegal (global) arguments are set.

-- **Recommendation**: Log in to the machine that hosts *each node* of your self-hosted integration runtime. Check to ensure that the system variable is set correctly, as follows: `_JAVA_OPTIONS "-Xms256m -Xmx16g" with memory bigger than 8G`. Restart all the integration runtime nodes, and then rerun the pipeline.
+- **Recommendation**: Sign in to the machine that hosts *each node* of your self-hosted integration runtime. Check to ensure that the system variable is set correctly, as follows: `_JAVA_OPTIONS "-Xms256m -Xmx16g" with memory bigger than 8G`. Restart all the integration runtime nodes, and then rerun the pipeline.
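As a companion to this recommendation, a hedged sketch of checking and setting the system variable on one IR node, run from an elevated PowerShell session; the service name is the one typically used by the self-hosted IR, but verify it on your install:

```powershell
# Hedged sketch: inspect, then set, _JAVA_OPTIONS at machine scope on an IR node.
[Environment]::GetEnvironmentVariable('_JAVA_OPTIONS', 'Machine')

# Apply the bounds from the recommendation; assumes the node has more than 8 GB of RAM.
[Environment]::SetEnvironmentVariable('_JAVA_OPTIONS', '-Xms256m -Xmx16g', 'Machine')

# Restart the self-hosted integration runtime service so the value takes effect.
# 'DIAHostService' is its typical service name; confirm with Get-Service first.
Restart-Service -Name 'DIAHostService'
```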
 #### Error code: 20020

@@ -75,9 +75,9 @@ The errors below are general to the copy activity and could occur with any conne
 - **Cause**: This error might occur when you copy data with connectors such as Azure Blob, SFTP, and so on. Federal Information Processing Standards (FIPS) defines a certain set of cryptographic algorithms that are allowed to be used. When FIPS mode is enabled on the machine, some cryptographic classes that copy activity depends on are blocked in some scenarios.

-- **Resolution**: Learn [why we’re not recommending FIPS Mode anymore](https://techcommunity.microsoft.com/t5/microsoft-security-baselines/why-we-8217-re-not-recommending-8220-fips-mode-8221-anymore/ba-p/701037), and evaluate whether you can disable FIPS on your self-hosted IR machine.
+- **Resolution**: Learn [why we’re not recommending "FIPS Mode" anymore](https://techcommunity.microsoft.com/t5/microsoft-security-baselines/why-we-8217-re-not-recommending-8220-fips-mode-8221-anymore/ba-p/701037), and evaluate whether you can disable FIPS on your self-hosted IR machine.

-Alternatively, if you only want to bypass FIPS and make the activity runs succeed, do the following:
+Alternatively, if you only want to bypass FIPS and make the activity runs succeed, take the following steps:

 1. Open the folder where Self-hosted IR is installed. The path is usually *C:\Program Files\Microsoft Integration Runtime \<IR version>\Shared*.
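Before walking through the bypass steps, it can help to confirm that FIPS mode is actually enabled on the IR machine. A hedged sketch that reads the standard policy value from the registry:

```powershell
# Hedged sketch: check the FIPS algorithm policy on the self-hosted IR machine.
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\FipsAlgorithmPolicy'
$enabled = (Get-ItemProperty -Path $key -Name Enabled).Enabled
if ($enabled -eq 1) { 'FIPS mode is enabled.' } else { 'FIPS mode is disabled.' }
```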

@@ -107,7 +107,7 @@ The errors below are general to the copy activity and could occur with any conne
 - **Message**: `The toke type '%tokenType;' from your authorization server is not supported, supported types: '%tokenTypes;'.`

-- **Cause**: Your authorization server is not supported.
+- **Cause**: Your authorization server isn't supported.

 - **Recommendation**: Use an authorization server that can return tokens with supported token types.

@@ -129,7 +129,7 @@ The errors below are general to the copy activity and could occur with any conne
 - **Message**: `The format settings are missing in dataset %dataSetName;.`

-- **Cause**: The dataset type is Binary, which is not supported.
+- **Cause**: The dataset type is Binary, which isn't supported.

 - **Recommendation**: Use the DelimitedText, Json, Avro, Orc, or Parquet dataset instead.
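To illustrate the recommendation, here's a minimal sketch of a DelimitedText dataset whose format settings would satisfy this check, deployed with the Az.DataFactory module. The dataset, linked service, container, and folder names are placeholders:

```powershell
# Hedged sketch: a DelimitedText dataset with explicit format settings,
# deployed with Set-AzDataFactoryV2Dataset. All names are placeholders.
$dataset = @'
{
    "name": "InputDelimitedText",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "<container>",
                "folderPath": "<folder>"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
'@
Set-Content -Path .\InputDelimitedText.json -Value $dataset
Set-AzDataFactoryV2Dataset -ResourceGroupName '<resource-group>' `
    -DataFactoryName '<factory-name>' -Name 'InputDelimitedText' `
    -DefinitionFile '.\InputDelimitedText.json'
```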

@@ -147,13 +147,13 @@ The errors below are general to the copy activity and could occur with any conne
 - **Message**: `Failed to retrieve source file ('%name;') metadata to validate data consistency.`

-- **Cause**: There is a transient issue on the sink data store, or retrieving metadata from the sink data store is not allowed.
+- **Cause**: There's a transient issue on the source data store, or retrieving metadata from the source data store isn't allowed.

 #### Error code: 20703

 - **Message**: `Failed to retrieve sink file ('%name;') metadata to validate data consistency.`

-- **Cause**: There is a transient issue on the sink data store, or retrieving metadata from the sink data store is not allowed.
+- **Cause**: There's a transient issue on the sink data store, or retrieving metadata from the sink data store isn't allowed.

 #### Error code: 20704

@@ -245,15 +245,15 @@ The errors below are general to the copy activity and could occur with any conne
 - **Cause**: You provide a wrong or invalid query to fetch the data/schemas.

-- **Recommendation**: Verify your query is valid and can return data/schemas. Use [Script activity](transform-data-using-script.md) if you want to execute non-query scripts and your data store is supported. Alternatively, consider to use stored procedure that returns a dummy result to execute your non-query scripts.
+- **Recommendation**: Verify your query is valid and can return data/schemas. Use [Script activity](transform-data-using-script.md) if you want to execute nonquery scripts and your data store is supported. Alternatively, consider using a stored procedure that returns a dummy result to execute your nonquery scripts.

 #### Error code: 11775

 - **Message**: `Failed to connect to your instance of Azure Database for PostgreSQL flexible server. '%'`

-- **Cause**: Exact cause depends on the text returned in `'%'`. If it is **The operation has timed out**, it can be because the instance of PostgreSQL is stopped or because the network connectivity method configured for your instance doesn't allow connections from the Integration Runtime selected. User or password provided are incorrect. If it is **28P01: password authentication failed for user "*youruser*"**, it means that the user provided doesn't exist in the instance or that the password is incorrect. If it is **28000: no pg_hba.conf entry for host "*###.###.###.###*", user "*youruser*", database "*yourdatabase*", no encryption**, it means that the encryption method selected is not compatible with the configuration of the server.
+- **Cause**: The exact cause depends on the text returned in `'%'`. If it's **The operation has timed out**, it can be because the instance of PostgreSQL is stopped or because the network connectivity method configured for your instance doesn't allow connections from the Integration Runtime selected. If it's **28P01: password authentication failed for user &lt;youruser&gt;**, it means that the user provided doesn't exist in the instance or that the password is incorrect. If it's **28000: no pg_hba.conf entry for host "*###.###.###.###*", user "&lt;youruser&gt;", database "&lt;yourdatabase&gt;", no encryption**, it means that the encryption method selected isn't compatible with the configuration of the server.

-- **Recommendation**: Confirm that the user provided exists in your instance of PostgreSQL and that the password corresponds to the one currently assigned to that user. Make sure that the encryption method selected is accepted by your instance of PostgreSQL, based on its current configuration. If the network connectivity method of your instance is configured for Private access (VNet integration), use a Self-Hosted Integration Runtime (IR) to connect to it. If it is configured for Public access (allowed IP addresses), it is recommended to use an Azure IR with managed virtual network and deploy a managed private endpoint to connect to your instance. When it is configured for Public access (allowed IP addresses) a less recommended alternative consists in creating firewall rules in your instance to allow traffic originating on the IP addresses used by the Azure IR you're using.
+- **Recommendation**: Confirm that the user provided exists in your instance of PostgreSQL and that the password corresponds to the one currently assigned to that user. Make sure that the encryption method selected is accepted by your instance of PostgreSQL, based on its current configuration. If the network connectivity method of your instance is configured for Private access (virtual network integration), use a Self-Hosted Integration Runtime (IR) to connect to it. If it's configured for Public access (allowed IP addresses), it's recommended to use an Azure IR with a managed virtual network and deploy a managed private endpoint to connect to your instance. When it's configured for Public access (allowed IP addresses), a less recommended alternative is to create firewall rules in your instance that allow traffic originating from the IP addresses used by the Azure IR you're using.
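For the timeout case in particular, a quick first triage step is to test reachability of the server from the machine that hosts the IR. A hedged sketch; the server name is a placeholder:

```powershell
# Hedged sketch: confirm the flexible server answers on the PostgreSQL port (5432)
# from the machine that hosts the integration runtime.
Test-NetConnection -ComputerName '<yourserver>.postgres.database.azure.com' -Port 5432
```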
 ## Related content
