Commit 9c724cc

Merge pull request #302387 from whhender/resolving-order
Updating order to resolve confusion
2 parents a902230 + 5647a64 commit 9c724cc

File tree

1 file changed: +29 -33 lines


articles/data-factory/connector-microsoft-fabric-warehouse.md

Lines changed: 29 additions & 33 deletions
@@ -35,33 +35,6 @@ This Microsoft Fabric Warehouse connector is supported for the following capabil
 
 [!INCLUDE [data-factory-v2-connector-get-started](includes/data-factory-v2-connector-get-started.md)]
 
-## Create a Microsoft Fabric Warehouse linked service using UI
-
-Use the following steps to create a Microsoft Fabric Warehouse linked service in the Azure portal UI.
-
-1. Browse to the Manage tab in your Azure Data Factory or Synapse workspace and select Linked Services, then select New:
-
-# [Azure Data Factory](#tab/data-factory)
-
-:::image type="content" source="media/doc-common-process/new-linked-service.png" alt-text="Screenshot of creating a new linked service with Azure Data Factory UI.":::
-
-# [Azure Synapse](#tab/synapse-analytics)
-
-:::image type="content" source="media/doc-common-process/new-linked-service-synapse.png" alt-text="Screenshot of creating a new linked service with Azure Synapse UI.":::
-
-2. Search for Warehouse and select the connector.
-
-:::image type="content" source="media/connector-microsoft-fabric-warehouse/microsoft-fabric-warehouse-connector.png" alt-text="Screenshot showing select Microsoft Fabric Warehouse connector.":::
-
-1. Configure the service details, test the connection, and create the new linked service.
-
-:::image type="content" source="media/connector-microsoft-fabric-warehouse/configure-microsoft-fabric-warehouse-linked-service.png" alt-text="Screenshot of configuration for Microsoft Fabric Warehouse linked service.":::
-
-
-## Connector configuration details
-
-The following sections provide details about properties that are used to define Data Factory entities specific to Microsoft Fabric Warehouse.
-
 ## Linked service properties
 
 The Microsoft Fabric Warehouse connector supports the following authentication types. See the corresponding sections for details:
@@ -78,17 +51,17 @@ To use service principal authentication, follow these steps.
 - Client secret value, which is the service principal key in the linked service.
 - Tenant ID
 
-2. Grant the service principal at least the **Contributor** role in Microsoft Fabric workspace. Follow these steps:
+1. Grant the service principal at least the **Contributor** role in Microsoft Fabric workspace. Follow these steps:
 1. Go to your Microsoft Fabric workspace, select **Manage access** on the top bar. Then select **Add people or groups**.
-
+
 :::image type="content" source="media/connector-microsoft-fabric-warehouse/fabric-workspace-manage-access.png" alt-text="Screenshot shows selecting Fabric workspace Manage access.":::
 
 :::image type="content" source="media/connector-microsoft-fabric-warehouse/manage-access-pane.png" alt-text=" Screenshot shows Fabric workspace Manage access pane.":::
-
+
 1. In **Add people** pane, enter your service principal name, and select your service principal from the drop-down list.
-
+
 1. Specify the role as **Contributor** or higher (Admin, Member), then select **Add**.
-
+
 :::image type="content" source="media/connector-microsoft-fabric-warehouse/select-workspace-role.png" alt-text="Screenshot shows adding Fabric workspace role.":::
 
 1. Your service principal is displayed on **Manage access** pane.
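
For orientation, the service principal configured in these steps is what the article's linked service JSON consumes. A minimal sketch of such a payload follows; the property names (`endpoint`, `workspaceId`, `artifactId`, `servicePrincipalId`, `servicePrincipalKey`, `tenant`) reflect the connector's typical `Warehouse` linked service shape and are assumptions for illustration, not lines quoted from this commit:

```json
{
    "name": "MicrosoftFabricWarehouseLinkedService",
    "properties": {
        "type": "Warehouse",
        "typeProperties": {
            "endpoint": "<Microsoft Fabric Warehouse connection endpoint>",
            "workspaceId": "<Microsoft Fabric workspace ID>",
            "artifactId": "<Microsoft Fabric Warehouse ID>",
            "tenant": "<tenant ID or domain, for example contoso.onmicrosoft.com>",
            "servicePrincipalId": "<service principal client ID>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<service principal key>"
            }
        }
    }
}
```
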
@@ -136,6 +109,28 @@ You can also store service principal key in Azure Key Vault.
 }
 ```
 
+## Create a Microsoft Fabric Warehouse linked service using UI
+
+Use the following steps to create a Microsoft Fabric Warehouse linked service in the Azure portal UI.
+
+1. Browse to the Manage tab in your Azure Data Factory or Synapse workspace and select Linked Services, then select New:
+
+# [Azure Data Factory](#tab/data-factory)
+
+:::image type="content" source="media/doc-common-process/new-linked-service.png" alt-text="Screenshot of creating a new linked service with Azure Data Factory UI.":::
+
+# [Azure Synapse](#tab/synapse-analytics)
+
+:::image type="content" source="media/doc-common-process/new-linked-service-synapse.png" alt-text="Screenshot of creating a new linked service with Azure Synapse UI.":::
+
+1. Search for Warehouse and select the connector.
+
+:::image type="content" source="media/connector-microsoft-fabric-warehouse/microsoft-fabric-warehouse-connector.png" alt-text="Screenshot showing select Microsoft Fabric Warehouse connector.":::
+
+1. Configure the service details, test the connection, and create the new linked service.
+
+:::image type="content" source="media/connector-microsoft-fabric-warehouse/configure-microsoft-fabric-warehouse-linked-service.png" alt-text="Screenshot of configuration for Microsoft Fabric Warehouse linked service.":::
+
 ## Dataset properties
 
 For a full list of sections and properties available for defining datasets, see the [Datasets](concepts-datasets-linked-services.md) article.
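
A dataset that points at a warehouse table is likewise a small JSON document. The `WarehouseTable` type name and the `schema`/`table` properties below follow the connector's usual dataset shape and are sketched here for context under that assumption:

```json
{
    "name": "FabricWarehouseTable",
    "properties": {
        "type": "WarehouseTable",
        "linkedServiceName": {
            "referenceName": "MicrosoftFabricWarehouseLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "schema": "<schema_name>",
            "table": "<table_name>"
        }
    }
}
```
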
@@ -195,7 +190,6 @@ To copy data from Microsoft Fabric Warehouse, set the **type** property in the C
 | partitionUpperBound | The maximum value of the partition column for partition range splitting. This value is used to decide the partition stride, not for filtering the rows in table. All rows in the table or query result will be partitioned and copied. If not specified, copy activity auto detect the value. <br>Apply when the partition option is `DynamicRange`. For an example, see the [Parallel copy from Microsoft Fabric Warehouse](#parallel-copy-from-microsoft-fabric-warehouse) section. | No |
 | partitionLowerBound | The minimum value of the partition column for partition range splitting. This value is used to decide the partition stride, not for filtering the rows in table. All rows in the table or query result will be partitioned and copied. If not specified, copy activity auto detect the value.<br>Apply when the partition option is `DynamicRange`. For an example, see the [Parallel copy from Microsoft Fabric Warehouse](#parallel-copy-from-microsoft-fabric-warehouse) section. | No |
 
-
 >[!Note]
 >When using stored procedure in source to retrieve data, note if your stored procedure is designed as returning different schema when different parameter value is passed in, you may encounter failure or see unexpected result when importing schema from UI or when copying data to Microsoft Fabric Warehouse with auto table creation.
 
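
To see how the partition properties in the table above fit together, here is a hedged sketch of a copy activity source that requests dynamic-range partitioning; the `WarehouseSource` type and `partitionSettings` nesting are assumed from the connector's usual copy activity JSON and are not part of this change:

```json
{
    "name": "CopyFromFabricWarehouse",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "WarehouseSource",
            "partitionOption": "DynamicRange",
            "partitionSettings": {
                "partitionColumnName": "<partition_column_name>",
                "partitionUpperBound": "<upper value; auto-detected when omitted>",
                "partitionLowerBound": "<lower value; auto-detected when omitted>"
            }
        },
        "sink": {
            "type": "<sink type>"
        }
    }
}
```
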
@@ -532,6 +526,7 @@ Settings specific to Microsoft Fabric Warehouse are available in the Source Opti
 >Read via staging is not supported. CDC support for Microsoft Fabric Warehouse source is currently not available.
 
 ### Microsoft Fabric Warehouse as the sink
+
 Settings specific to Microsoft Fabric Warehouse are available in the Settings tab of the sink transformation.
 
 | Name | Description | Required | Allowed Values | Data flow script property |
@@ -557,6 +552,7 @@ If the staging storage location has a firewall enabled, access issues may occur.
 
 
 ### Error row handling
+
 By default, a data flow run will fail on the first error it gets. You can choose to Continue on error that allows your data flow to complete even if individual rows have errors. The service provides different options for you to handle these error rows.
 
 Transaction Commit: Choose whether your data gets written in a single transaction or in batches. Single transaction will provide better performance and no data written will be visible to others until the transaction completes. Batch transactions have worse performance but can work for large datasets.
