[Azure Data Factory](../../data-factory/introduction.md) is a cloud-based
[ETL](https://en.wikipedia.org/wiki/Extract,_transform,_load) and data integration service. It allows you to create data-driven workflows to move and transform data at scale.
Using Data Factory, you can create and schedule data-driven workflows
(called pipelines) that ingest data from disparate data stores. Pipelines can
In Data Factory, you can use the **Copy** activity to copy data among
data stores located on-premises and in the cloud to Azure Cosmos DB for PostgreSQL. If you're
new to Data Factory, here's a quick guide on how to get started:
1. Once Data Factory is provisioned, go to your data factory and launch Azure Data Factory Studio. You see the Data Factory home page as shown in the following image:
:::image type="content" source="media/howto-ingestion/azure-data-factory-home.png" alt-text="Screenshot showing the landing page of Azure Data Factory.":::
2. On the Azure Data Factory Studio home page, select **Orchestrate**.
:::image type="content" source="media/howto-ingestion/azure-data-factory-orchestrate.png" alt-text="Screenshot showing the 'Orchestrate' page of Azure Data Factory.":::
3. Under **Properties**, enter a name for the pipeline.
5. Configure **Source**.
1. On the **Activities** page, select the **Source** tab. Select **New** to create a source dataset.
2. In the **New Dataset** dialog box, select **Azure Blob Storage**, and then select **Continue**.
3. Choose the format type of your data, and then select **Continue**.
4. On the **Set properties** page, under **Linked service**, select **New**.
5. On the **New linked service** page, enter a name for the linked service, and select your storage account from the **Storage account name** list.
1. On the **Activities** page, select the **Sink** tab. Select **New** to create a sink dataset.
2. In the **New Dataset** dialog box, select **Azure Database for PostgreSQL**, and then select **Continue**.
3. On the **Set properties** page, under **Linked service**, select **New**.
4. On the **New linked service** page, enter a name for the linked service, and select **Enter manually** in the **Account selection method**.
5. Enter your cluster's coordinator name in the **Fully qualified domain name** field. You can copy the coordinator's name from the *Overview* page of your Azure Cosmos DB for PostgreSQL cluster.
6. Leave the default port 5432 in the **Port** field for a direct connection to the coordinator, or replace it with port 6432 to connect to the [managed PgBouncer](./concepts-connection-pool.md) port.
7. Enter the database name on your cluster and provide credentials to connect to it.
8. Select **SSL** in the **Encryption method** drop-down list.
:::image type="content" source="media/howto-ingestion/azure-data-factory-configure-sink.png" alt-text="Screenshot that shows configuring Sink in Azure Data Factory.":::
9. Select **Test connection** at the bottom of the panel to validate the sink configuration.
10. Select **Create** to save the configuration.
11. On the **Set properties** screen, select **OK**.
12. In the **Sink** tab on the **Activities** page, select **Open** next to the *Sink dataset* drop-down list, and select the table on the destination cluster where you want to ingest the data.
13. Under **Write method**, select **Copy command**.
:::image type="content" source="media/howto-ingestion/azure-data-factory-copy-command.png" alt-text="Screenshot that shows selecting the table and Copy command.":::
7. From the toolbar above the canvas, select **Validate** to validate pipeline settings. Fix any errors, revalidate, and ensure that the pipeline is successfully validated.
8. Select **Debug** from the toolbar to execute the pipeline.
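The sink linked-service settings above are standard PostgreSQL connection parameters, so you can sanity-check them outside Data Factory before debugging the pipeline. The following sketch assembles them into a libpq-style connection string; the coordinator name, database, user, and password shown are placeholders, not values from this article:

```python
# Sketch: combine the same parameters the Data Factory linked service
# uses for an Azure Cosmos DB for PostgreSQL sink into a libpq-style
# connection string. All concrete values below are placeholders.

def build_connection_string(host: str, database: str, user: str,
                            password: str, use_pgbouncer: bool = False) -> str:
    """Return a libpq keyword/value connection string for the cluster.

    Port 5432 connects directly to the coordinator; port 6432 goes
    through the managed PgBouncer, matching the Port field in the
    linked service. sslmode=require mirrors selecting SSL as the
    encryption method.
    """
    port = 6432 if use_pgbouncer else 5432
    return (f"host={host} port={port} dbname={database} "
            f"user={user} password={password} sslmode=require")

print(build_connection_string(
    "c-mycluster.postgres.cosmos.azure.com", "citus", "citus", "<password>"))
```

You can pass the resulting string to `psql` or any libpq-based driver to confirm that the cluster accepts connections with the same host, port, and SSL settings you entered in the linked service.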
## Next steps
- Learn how to create a [real-time dashboard](tutorial-design-database-realtime.md) with Azure Cosmos DB for PostgreSQL.
- Learn how to [move your workload to Azure Cosmos DB for PostgreSQL](./quickstart-build-scalable-apps-overview.md).