<Image img={cp_step1} alt="Select data source type" size="lg" border/>

Fill out the form by providing your ClickPipe with a name, a description (optional), your credentials, and other connection details.

<Image img={cp_step2} alt="Fill out connection details" size="lg" border/>
### Configure a schema registry (Optional) {#4-configure-your-schema-registry}
A valid schema is required for Avro streams and optional for JSON. This schema will be used to parse [AvroConfluent](../../../interfaces/formats.md/#data-format-avro-confluent) or validate JSON messages on the selected topic.

Avro messages that cannot be parsed or JSON messages that fail validation will generate an error.

The schema registry can be specified in one of the following ways:

- The "root" path of the schema registry. For example, a Confluent Cloud schema registry URL is just an HTTPS URL with no path, like `https://test-kk999.us-east-2.aws.confluent.cloud`. If only the root path is specified, the schema used to derive column names and types will be chosen based on the ID embedded in the sampled Kafka messages.
57
53
- the path `/schemas/ids/[ID]` to the schema document by the numeric schema id. A complete url using a schema id would be `https://registry.example.com/schemas/ids/1000`
58
54
- the path `/subjects/[subject_name]` to the schema document by subject name. Optionally, a specific version can be referenced by appending `/versions/[version]` to the url (otherwise ClickPipes
@@ -61,45 +57,38 @@ will retrieve the latest version). A complete url using a schema subject would
61
57
Note that in all cases ClickPipes will automatically retrieve an updated or different schema from the registry if indicated by the schema ID embedded in the message. If the message is written
62
58
without an embedded schema id, then the specific schema ID or subject must be specified to parse all messages.
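
If you want to sanity-check a schema registry outside of ClickPipes, the same `AvroConfluent` format can be exercised directly in ClickHouse SQL. The sketch below is a minimal, hypothetical example: it assumes a local file `sample_messages.avro` containing raw Confluent-framed Avro messages and reuses the example registry root URL from above.

```sql
-- Hypothetical sanity check: parse Confluent-framed Avro messages using the
-- same schema registry root URL you would configure for the ClickPipe.
SET format_avro_schema_registry_url = 'https://test-kk999.us-east-2.aws.confluent.cloud';

-- 'sample_messages.avro' is a hypothetical file of raw messages copied from the topic.
SELECT *
FROM file('sample_messages.avro', 'AvroConfluent')
LIMIT 5;
```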
### Configure a Reverse Private Endpoint (Optional) {#5-configure-reverse-private-endpoint}

Configure a Reverse Private Endpoint to allow ClickPipes to connect to your Kafka cluster using AWS PrivateLink.

See our [AWS PrivateLink documentation](./aws-privatelink.md) for more information.
### Select your topic {#6-select-your-topic}

Select your topic and the UI will display a sample document from the topic.

<Image img={cp_step3} alt="Set your topic" size="md"/>
### Configure your destination table {#7-configure-your-destination-table}

In the next step, you can select whether you want to ingest data into a new ClickHouse table or reuse an existing one. Follow the instructions on the screen to modify your table name, schema, and settings. You can see a real-time preview of your changes in the sample table at the top.

<Image img={cp_step4a} alt="Set table, schema, and settings" size="md"/>

You can also customize the advanced settings using the controls provided.

Alternatively, you can decide to ingest your data into an existing ClickHouse table. In that case, the UI will allow you to map fields from the source to the ClickHouse fields in the selected destination table.

<Image img={cp_step4b} alt="Use an existing table" size="lg" border/>
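
Whichever option you choose, the destination is an ordinary ClickHouse table. As a rough illustration, a table for a simple topic might look something like the sketch below; the database, table, and column names are hypothetical, and your actual columns and types come from your topic's schema and the choices you make in this step.

```sql
-- Hypothetical destination table for a topic whose messages carry
-- an id, a timestamp, and a message payload.
CREATE TABLE my_database.kafka_events
(
    `id` UInt64,
    `timestamp` DateTime64(3),
    `message` String
)
ENGINE = MergeTree
ORDER BY (id, timestamp);
```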
Finally, you can configure permissions for the internal ClickPipes user.
ClickPipes will create a dedicated user for writing data into the destination table. You can select a role for this internal user, either a custom role or one of the predefined roles:
- `Full access`: with full access to the cluster. This can be useful if you use a materialized view or dictionary with the destination table.
- `Only destination table`: with `INSERT` permissions on the destination table only (see the sketch below).
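
As a rough sketch of what the `Only destination table` option amounts to, the internal user is limited to inserting into the configured table, along the lines of the statements below. The role, database, and table names are hypothetical; ClickPipes creates and manages the real user and grants for you.

```sql
-- Hypothetical illustration of the scope of the `Only destination table` role.
CREATE ROLE IF NOT EXISTS clickpipes_writer;
GRANT INSERT ON my_database.kafka_events TO clickpipes_writer;
```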
**Congratulations!** You have successfully set up your first ClickPipe. If this is a streaming ClickPipe, it will run continuously, ingesting data in real time from your remote data source.
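
Once the pipe is running, a quick way to confirm that data is arriving is to query the destination table directly (using the hypothetical table name from the sketch above):

```sql
-- Row count and a peek at the most recent rows in the destination table.
SELECT count() FROM my_database.kafka_events;

SELECT *
FROM my_database.kafka_events
ORDER BY timestamp DESC
LIMIT 5;
```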
</VerticalStepper>
## Supported data sources {#supported-data-sources}