Commit bf446dd

Update Kafka ClickPipe setup instructions
1 parent d0951c3 commit bf446dd

9 files changed: +32 additions, −43 deletions

docs/integrations/data-ingestion/clickpipes/kafka.md

Lines changed: 32 additions & 43 deletions
@@ -18,13 +18,11 @@ import cp_step1 from '@site/static/images/integrations/data-ingestion/clickpipes
 import cp_step2 from '@site/static/images/integrations/data-ingestion/clickpipes/cp_step2.png';
 import cp_step3 from '@site/static/images/integrations/data-ingestion/clickpipes/cp_step3.png';
 import cp_step4a from '@site/static/images/integrations/data-ingestion/clickpipes/cp_step4a.png';
-import cp_step4a3 from '@site/static/images/integrations/data-ingestion/clickpipes/cp_step4a3.png';
 import cp_step4b from '@site/static/images/integrations/data-ingestion/clickpipes/cp_step4b.png';
 import cp_step5 from '@site/static/images/integrations/data-ingestion/clickpipes/cp_step5.png';
-import cp_success from '@site/static/images/integrations/data-ingestion/clickpipes/cp_success.png';
 import cp_remove from '@site/static/images/integrations/data-ingestion/clickpipes/cp_remove.png';
-import cp_destination from '@site/static/images/integrations/data-ingestion/clickpipes/cp_destination.png';
 import cp_overview from '@site/static/images/integrations/data-ingestion/clickpipes/cp_overview.png';
+import cp_table_settings from '@site/static/images/integrations/data-ingestion/clickpipes/cp_table_settings.png';
 import Image from '@theme/IdealImage';
 
 # Integrating Kafka with ClickHouse Cloud
@@ -33,26 +31,24 @@ You have familiarized yourself with the [ClickPipes intro](./index.md).
 
 ## Creating your first Kafka ClickPipe {#creating-your-first-kafka-clickpipe}
 
-1. Access the SQL Console for your ClickHouse Cloud Service.
+<VerticalStepper type="numbered" headerLevel="h3">
 
-   <Image img={cp_service} alt="ClickPipes service" size="md" border/>
+### Navigate to Data Sources {#1-load-sql-console}
+Select the `Data Sources` button on the left-side menu and click on "Set up a ClickPipe".
+<Image img={cp_step0} alt="Select imports" size="md"/>
 
+### Select a data source {#2-select-data-source}
+Select your data source.
+<Image img={cp_step1} alt="Select data source type" size="md"/>
 
-2. Select the `Data Sources` button on the left-side menu and click on "Set up a ClickPipe"
+### Configure the data source {#3-configure-data-source}
+Fill out the form by providing your ClickPipe with a name, a description (optional), your credentials, and other connection details.
+<Image img={cp_step2} alt="Fill out connection details" size="md"/>
 
-   <Image img={cp_step0} alt="Select imports" size="lg" border/>
-
-3. Select your data source.
-
-   <Image img={cp_step1} alt="Select data source type" size="lg" border/>
-
-4. Fill out the form by providing your ClickPipe with a name, a description (optional), your credentials, and other connection details.
-
-   <Image img={cp_step2} alt="Fill out connection details" size="lg" border/>
-
-5. Configure the schema registry. A valid schema is required for Avro streams and optional for JSON. This schema will be used to parse [AvroConfluent](../../../interfaces/formats.md/#data-format-avro-confluent) or validate JSON messages on the selected topic.
+### Configure a schema registry (Optional) {#4-configure-your-schema-registry}
+A valid schema is required for Avro streams and optional for JSON. This schema will be used to parse [AvroConfluent](../../../interfaces/formats.md/#data-format-avro-confluent) or validate JSON messages on the selected topic.
 - Avro messages that cannot be parsed or JSON messages that fail validation will generate an error.
-- the "root" path of the schema registry. For example, a Confluent Cloud schema registry URL is just an HTTPS url with no path, like `https://test-kk999.us-east-2.aws.confluent.cloud` If only the root
+- The "root" path of the schema registry. For example, a Confluent Cloud schema registry URL is just an HTTPS URL with no path, like `https://test-kk999.us-east-2.aws.confluent.cloud`. If only the root
 path is specified, the schema used to determine column names and types in step 4 will be determined by the id embedded in the sampled Kafka messages.
 - the path `/schemas/ids/[ID]` to the schema document by the numeric schema id. A complete url using a schema id would be `https://registry.example.com/schemas/ids/1000`
 - the path `/subjects/[subject_name]` to the schema document by subject name. Optionally, a specific version can be referenced by appending `/versions/[version]` to the url (otherwise ClickPipes
@@ -61,45 +57,38 @@ will retrieve the latest version). A complete url using a schema subject would
 Note that in all cases ClickPipes will automatically retrieve an updated or different schema from the registry if indicated by the schema ID embedded in the message. If the message is written
 without an embedded schema id, then the specific schema ID or subject must be specified to parse all messages.
 
-6. Select your topic and the UI will display a sample document from the topic.
-
-   <Image img={cp_step3} alt="Set data format and topic" size="lg" border/>
+### Configure a Reverse Private Endpoint (Optional) {#5-configure-reverse-private-endpoint}
+Configure a Reverse Private Endpoint to allow ClickPipes to connect to your Kafka cluster using AWS PrivateLink.
+See our [AWS PrivateLink documentation](./aws-privatelink.md) for more information.
 
-7. In the next step, you can select whether you want to ingest data into a new ClickHouse table or reuse an existing one. Follow the instructions in the screen to modify your table name, schema, and settings. You can see a real-time preview of your changes in the sample table at the top.
+### Select your topic {#6-select-your-topic}
+Select your topic and the UI will display a sample document from the topic.
+<Image img={cp_step3} alt="Set your topic" size="md"/>
 
-   <Image img={cp_step4a} alt="Set table, schema, and settings" size="lg" border/>
+### Configure your destination table {#7-configure-your-destination-table}
 
-   You can also customize the advanced settings using the controls provided
+In the next step, you can select whether you want to ingest data into a new ClickHouse table or reuse an existing one. Follow the instructions on the screen to modify your table name, schema, and settings. You can see a real-time preview of your changes in the sample table at the top.
 
-   <Image img={cp_step4a3} alt="Set advanced controls" size="lg" border/>
+<Image img={cp_step4a} alt="Set table, schema, and settings" size="md"/>
 
-8. Alternatively, you can decide to ingest your data in an existing ClickHouse table. In that case, the UI will allow you to map fields from the source to the ClickHouse fields in the selected destination table.
+You can also customize the advanced settings using the controls provided.
 
-   <Image img={cp_step4b} alt="Use an existing table" size="lg" border/>
+<Image img={cp_table_settings} alt="Set advanced controls" size="md"/>
 
-9. Finally, you can configure permissions for the internal ClickPipes user.
 
-   **Permissions:** ClickPipes will create a dedicated user for writing data into a destination table. You can select a role for this internal user using a custom role or one of the predefined role:
+### Configure permissions {#8-configure-permissions}
+ClickPipes will create a dedicated user for writing data into a destination table. You can select a role for this internal user using a custom role or one of the predefined roles:
 - `Full access`: with the full access to the cluster. It might be useful if you use Materialized View or Dictionary with the destination table.
 - `Only destination table`: with the `INSERT` permissions to the destination table only.
 
-   <Image img={cp_step5} alt="Permissions" size="lg" border/>
-
-10. By clicking on "Complete Setup", the system will register you ClickPipe, and you'll be able to see it listed in the summary table.
-
-   <Image img={cp_success} alt="Success notice" size="sm" border/>
-
-   <Image img={cp_remove} alt="Remove notice" size="lg" border/>
-
-   The summary table provides controls to display sample data from the source or the destination table in ClickHouse
-
-   <Image img={cp_destination} alt="View destination" size="lg" border/>
+<Image img={cp_step5} alt="Permissions" size="md"/>
 
-As well as controls to remove the ClickPipe and display a summary of the ingest job.
+### Complete setup {#9-complete-setup}
+Clicking on "Create ClickPipe" will create and run your ClickPipe. It will now be listed in the Data Sources section.
 
-   <Image img={cp_overview} alt="View overview" size="lg" border/>
+<Image img={cp_overview} alt="View overview" size="md"/>
 
-11. **Congratulations!** you have successfully set up your first ClickPipe. If this is a streaming ClickPipe it will be continuously running, ingesting data in real-time from your remote data source.
+</VerticalStepper>
 
 ## Supported data sources {#supported-data-sources}
 

(The remaining eight changed files are binary image assets under `@site/static/images/.../clickpipes/` — updated screenshots; previews are not shown here.)
