---
title: Use Confluent Connectors in Azure (preview)
description: Learn how to use Confluent Connectors in Azure (preview) to connect an instance of Apache Kafka® & Apache Flink on Confluent Cloud to Azure Blob Storage.
# customerIntent: As a developer, I want to use Confluent Connectors in Azure.
ms.topic: how-to
ms.date: 05/28/2024
ms.author: malev
author: maud-lv
---

# Use Confluent Connectors in Azure (preview)

Confluent Cloud offers a solution designed to help developers connect their Confluent clusters to popular data sources and sinks. This solution is available in Azure through the Confluent Connectors feature.

> [!NOTE]
> Currently, Apache Kafka® & Apache Flink® on Confluent Cloud™ - An Azure Native ISV Service only supports Confluent Connectors for Azure Blob Storage, including source and sink connectors.

In this guide, you learn how to connect an instance of Apache Kafka & Apache Flink on Confluent Cloud to Azure Blob Storage.

## Prerequisites

* An Azure account with an active subscription. [Create an account for free](https://azure.microsoft.com/free).
* An [Azure Blob Storage](/azure/storage/blobs/storage-quickstart-blobs-portal) resource. A minimal creation sketch follows this list.
* A [Confluent organization](./create.md) created on Azure Native ISV Services.
* The Azure subscription Owner or subscription Contributor role is required. If necessary, contact your subscription administrator to assign you one of these roles.
* A [configured environment, cluster, and topic](https://docs.confluent.io/cloud/current/get-started/index.html) inside the Confluent organization. If you don't have them already, go to Confluent to create these constructs.
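
If you still need to create the Blob Storage container, the following minimal sketch does so with the `azure-identity` and `azure-storage-blob` Python packages. It assumes the example names used later in this article (*storageaccount1*, *container1*) and that your signed-in identity has data-plane access to the storage account (for example, the Storage Blob Data Contributor role); adjust as needed.

```python
# pip install azure-identity azure-storage-blob
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

ACCOUNT_URL = "https://storageaccount1.blob.core.windows.net"  # example storage account from this article
CONTAINER_NAME = "container1"                                  # example container from this article

# Authenticate with whatever credential is available locally (Azure CLI login, managed identity, and so on).
service = BlobServiceClient(account_url=ACCOUNT_URL, credential=DefaultAzureCredential())

# Create the container only if it doesn't exist yet.
container = service.get_container_client(CONTAINER_NAME)
if not container.exists():
    container.create_container()
    print(f"Created container '{CONTAINER_NAME}'.")
else:
    print(f"Container '{CONTAINER_NAME}' already exists.")
```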

## Create a Confluent sink connector for Azure Blob Storage (preview)

Follow these steps to create a sink connector for Azure Blob Storage.

1. Open your Confluent organization and select **Confluent** > **Confluent Connectors (Preview)** from the left menu.

   :::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot from the Azure portal showing the Confluent Connectors menu.":::

2. Select **Create new connector**. A connector pane opens on the right-hand side. Select or enter the following information under **Create a new connector**.

### Basics

Set the basic settings below, then select **Next**.

| Setting             | Example value             | Description |
|---------------------|---------------------------|-------------|
| **Connector Type**  | *Sink*                    | A sink connector pulls data from Kafka topics and pushes it into an external database or system for storage or further processing. |
| **Connector Class** | *Azure Blob Storage Sink* | Select the Azure service you want to connect. Azure Blob Storage is currently the only available option. |
| **Connector name**  | *blob-sink-connector*     | Enter a name for your connector. |
| **Environment**     | *env1*                    | Select the environment where you would like to create this connector. |
| **Cluster**         | *cluster1*                | Select the cluster where you would like to create this connector. |
| **Topics**          | *topic_1*                 | Select one or more topics from which the data needs to be pulled. If there are no topics in the selected cluster, create one by selecting **new topic**, which opens the Confluent website. |
| **Subscription**    | *My subscription*         | Select the Azure subscription for the Azure Blob Storage where the data needs to be pushed. |
| **Storage Account** | *storageaccount1*         | Select the storage account where the data needs to be pushed. If needed, select **Create new** to create a new [storage account](../../storage/common/storage-account-create.md#basics-tab). |
| **Container**       | *container1*              | Select the container within the storage account where the data needs to be pushed. If needed, [create a new container](../../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container). |

:::image type="content" source="./media/confluent-connectors/basic-sink.png" alt-text="Screenshot from the Azure portal showing the Basic tab, creating a sink connector.":::

### Authentication

Configure the authentication of your Kafka cluster via API keys. **Create New** is selected by default, which means that API keys will be automatically generated and configured when the connector is created. Proceed to the next tab.

:::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot from the Azure portal showing the Authentication tab.":::

### Configuration

| Setting                | Example value | Description |
|------------------------|---------------|-------------|
| **Input Data Format**  | *JSON*        | Select an input Kafka record data format type among the following options: AVRO, JSON, string, Protobuf. |
| **Output Data Format** | *JSON*        | Select an output data format among the following options: AVRO, JSON, string, Protobuf. |
| **Time interval**      | *Hourly*      | Select the time interval in which you would like the data to be grouped. Choose between hourly and daily. |
| **Flush size**         | *1000*        | Optionally enter a flush size. The default flush size is 1000. |
| **Number of tasks**    | *1*           | Optionally enter the maximum number of tasks you would like your connector to run simultaneously. The default is 1. |

:::image type="content" source="./media/confluent-connectors/configuration-sink.png" alt-text="Screenshot from the Azure portal showing the Configuration tab for a sink connector.":::

Select **Review + create** to continue.
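
For reference, the selections above map onto a connector configuration that Confluent manages for you. The following sketch only assembles and prints an illustrative configuration; the property names (such as `azblob.container.name`) reflect Confluent's fully managed Azure Blob Storage Sink connector as of this writing and may change, and the credential values are placeholders, so treat it as an illustration rather than a template.

```python
import json

# Illustrative only: example values from this article plus placeholder credentials.
connector = {
    "name": "blob-sink-connector",
    "config": {
        "connector.class": "AzureBlobSink",        # Confluent's fully managed Azure Blob Storage Sink
        "name": "blob-sink-connector",
        "topics": "topic_1",
        "input.data.format": "JSON",
        "output.data.format": "JSON",
        "kafka.auth.mode": "KAFKA_API_KEY",
        "kafka.api.key": "<kafka-api-key>",        # generated for you on the Authentication tab
        "kafka.api.secret": "<kafka-api-secret>",
        "azblob.account.name": "storageaccount1",
        "azblob.account.key": "<storage-account-key>",
        "azblob.container.name": "container1",
        "time.interval": "HOURLY",
        "flush.size": "1000",
        "tasks.max": "1",
    },
}

print(json.dumps(connector, indent=2))
```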

### Review + Create

Review the listed settings for your new connector to make sure the details are correct. Once done, select **Create** to begin the connector deployment.

A notification in the top right shows the status of the deployment. Once it shows *created*, refresh the **Confluent Connectors (Preview)** page. You can now see the new connector tile on this page.
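
After the sink connector has been running for a while and the selected topics have received messages, you can spot-check that objects are landing in the container. This is a minimal sketch, assuming the example names from this article and a signed-in identity with read access to the container.

```python
# pip install azure-identity azure-storage-blob
from azure.identity import DefaultAzureCredential
from azure.storage.blob import ContainerClient

# Example names from this article; replace with your own.
container = ContainerClient(
    account_url="https://storageaccount1.blob.core.windows.net",
    container_name="container1",
    credential=DefaultAzureCredential(),
)

# List what the sink connector has written so far; output is grouped by the configured time interval.
for blob in container.list_blobs():
    print(blob.name, blob.size, blob.last_modified)
```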

## Create a Confluent source connector for Azure Blob Storage (preview)

1. Open your Confluent organization and select **Confluent** > **Confluent Connectors (Preview)** from the left menu.

   :::image type="content" source="./media/confluent-connectors/create-new-connector.png" alt-text="Screenshot from the Azure portal showing the Confluent Connectors menu.":::

2. Select **Create new connector**. A connector pane opens on the right-hand side. Select or enter the following information under **Create a new connector**.

### Basics

Set the basic settings below, then select **Next**.

| Setting             | Example value           | Description |
|---------------------|-------------------------|-------------|
| **Connector Type**  | *Source*                | A source connector pulls data from an external database or system and pushes it into Kafka topics. |
| **Connector Class** | *Azure Blob Storage*    | Select the Azure service you want to connect. Azure Blob Storage is currently the only available option. |
| **Connector name**  | *blob-source-connector* | Enter a name for your connector. |
| **Environment**     | *env1*                  | Select the environment where you would like to create this connector. |
| **Cluster**         | *cluster1*              | Select the cluster where you would like to create this connector. |
| **Subscription**    | *My subscription*       | Select the Azure subscription for the Azure Blob Storage from which the data needs to be pulled. |
| **Storage Account** | *storageaccount1*       | Select the storage account from which the data needs to be pulled. If needed, select **Create new** to create a new [storage account](../../storage/common/storage-account-create.md#basics-tab). |
| **Container**       | *container1*            | Select the container within the storage account from which the data needs to be pulled. If needed, [create a new container](../../storage/blobs/storage-quickstart-blobs-portal.md#create-a-container). |

:::image type="content" source="./media/confluent-connectors/basic-source.png" alt-text="Screenshot from the Azure portal showing the Basic tab, creating a source connector.":::

### Authentication

Configure the authentication of your Kafka cluster via API keys. **Create New** is selected by default, which means that API keys will be automatically generated and configured when the connector is created. Proceed to the next tab.

:::image type="content" source="./media/confluent-connectors/authentication.png" alt-text="Screenshot from the Azure portal showing the Authentication tab.":::

### Configuration

| Setting                  | Example value        | Description |
|--------------------------|----------------------|-------------|
| **Input Data Format**    | *JSON*               | Select an input Kafka record data format type among the following options: AVRO, JSON, string, Protobuf. |
| **Output Data Format**   | *JSON*               | Select an output data format among the following options: AVRO, JSON, string, Protobuf. |
| **Topic name and regex** | `my-topic:.*\.json+` | Configure the topic name and the regex pattern used to map files to that topic, as shown in the sketch below. For example, `my-topic:.*\.json+` moves all the files ending with *.json* into *my-topic*. |
| **Number of tasks**      | *1*                  | Optionally enter the maximum number of tasks you would like your connector to run simultaneously. The default is 1. |

:::image type="content" source="./media/confluent-connectors/configuration-source.png" alt-text="Screenshot from the Azure portal showing the Configuration tab, creating a source connector.":::
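
If you want to check which blob names a pattern like the example above would match before creating the connector, you can test it locally with Python's `re` module. This sketch only illustrates the regex itself; whether the service anchors the match against the full blob name is an implementation detail of the connector.

```python
import re

# The same "topic:pattern" form as the example value above.
mapping = r"my-topic:.*\.json+"
topic, pattern = mapping.split(":", 1)

blob_names = [
    "orders/2024/05/28/orders-000001.json",
    "orders/2024/05/28/orders-000002.json.gz",
    "logs/app.log",
]

# Full-match the pattern against each blob name.
for name in blob_names:
    matched = re.fullmatch(pattern, name) is not None
    print(f"{name} -> {topic if matched else 'skipped'}")
```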

Select **Review + create** to continue.

### Review + Create

Review the listed settings for your new connector to make sure the details are correct. Once done, select **Create** to begin the connector deployment.

A notification in the top right shows the status of the deployment. Once it shows *completed*, refresh the **Confluent Connectors** page. You can now see the new connector tile on this page.
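
One way to confirm that files from the container are being produced into the topic is to consume a few records with the `confluent-kafka` Python client. The bootstrap server, API key, and API secret below are placeholders to copy from your Confluent cluster settings; *my-topic* is the example topic name used in this article.

```python
# pip install confluent-kafka
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<bootstrap-server>:9092",   # from your Confluent cluster settings
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<kafka-api-key>",
    "sasl.password": "<kafka-api-secret>",
    "group.id": "blob-source-smoke-test",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["my-topic"])  # example topic from this article

try:
    # Read up to ten records to confirm the connector is producing data.
    for _ in range(10):
        msg = consumer.poll(timeout=10.0)
        if msg is None:
            break
        if msg.error():
            print("Error:", msg.error())
            continue
        print(msg.topic(), msg.partition(), msg.offset(), msg.value()[:80])
finally:
    consumer.close()
```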

## Manage Azure Confluent Connectors (preview)

1. Open your Confluent organization and select **Confluent** > **Confluent Connectors** from the left menu.
1. Select your **Environment** and **Cluster** from the dropdown menus. The Azure portal now displays the list of Azure connectors in the respective environment and cluster. The following optional actions are available:

   * Filter connectors by **Type** (**Source** or **Sink**) and **Status** (**Running**, **Failed**, **Provisioning**, or **Paused**).
   * Search for a connector by entering a name in the search bar on the right-hand side.

   :::image type="content" source="./media/confluent-connectors/display-connectors.png" alt-text="Screenshot of the Azure platform that shows a list of existing connectors in the Confluent Connectors (Preview) tab." lightbox="./media/confluent-connectors/display-connectors.png":::

To learn more about a connector, select the connector tile, which opens the Confluent UI. On this page, you can see the connector's health, throughput, and other stats, and edit or delete the connector.

## Next steps

- For help with troubleshooting, see [Troubleshooting Apache Kafka & Apache Flink on Confluent Cloud solutions](troubleshoot.md).
- Get started with Apache Kafka & Apache Flink on Confluent Cloud - An Azure Native ISV Service on

> [!div class="nextstepaction"]
> [Azure portal](https://portal.azure.com/#view/HubsExtension/BrowseResource/resourceType/Microsoft.Confluent%2Forganizations)

> [!div class="nextstepaction"]
> [Azure Marketplace](https://azuremarketplace.microsoft.com/marketplace/apps/confluentinc.confluent-cloud-azure-prod?tab=Overview)
