Commit 40e1236

Merge branch 'main' of https://github.com/ClickHouse/clickhouse-docs into anchor_fixes

2 parents 215b4b5 + 742ebf2

File tree

10 files changed (+126, -23 lines)

docs/cloud/reference/changelog.md

Lines changed: 13 additions & 0 deletions
@@ -31,6 +31,19 @@ import dashboards from '@site/static/images/cloud/reference/may-30-dashboards.png';

 In addition to this ClickHouse Cloud changelog, please see the [Cloud Compatibility](/cloud/reference/cloud-compatibility.md) page.

+## July 11, 2025 {#july-11-2025}
+
+- New services now store database and table metadata in a central **SharedCatalog**,
+  a new model for coordination and object lifecycles which enables:
+  - **Cloud-scale DDL**, even under high concurrency
+  - **Resilient deletion and new DDL operations**
+  - **Fast spin-up and wake-ups**, as stateless nodes now launch with no disk dependencies
+  - **Stateless compute across both native and open formats**, including Iceberg and Delta Lake
+
+  Read more about SharedCatalog in our [blog](https://clickhouse.com/blog/clickhouse-cloud-stateless-compute).
+
+- We now support the ability to launch HIPAA-compliant services in GCP `europe-west4`
+
 ## June 27, 2025 {#june-27-2025}

 - We now officially support a Terraform provider for managing database privileges

docs/guides/sre/keeper/index.md

Lines changed: 12 additions & 5 deletions
@@ -450,11 +450,18 @@ Example of feature flag config that disables `multi_read` and enables `check_not_exists`

 The following features are available:

-- `multi_read` - support for read multi request. Default: `1`
-- `filtered_list` - support for list request which filters results by the type of node (ephemeral or persistent). Default: `1`
-- `check_not_exists` - support for `CheckNotExists` request, which asserts that node doesn't exists. Default: `0`
-- `create_if_not_exists` - support for `CreateIfNotExists` request, which will try to create a node if it doesn't exist. If it exists, no changes are applied and `ZOK` is returned. Default: `0`
-- `remove_recursive` - support for `RemoveRecursive` request, which removes the node along with its subtree. Default: `0`
+| Feature                | Description                                                                                                                                              | Default |
+|------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------|---------|
+| `multi_read`           | Support for read multi request                                                                                                                            | `1`     |
+| `filtered_list`        | Support for list request which filters results by the type of node (ephemeral or persistent)                                                              | `1`     |
+| `check_not_exists`     | Support for `CheckNotExists` request, which asserts that node doesn't exist                                                                               | `1`     |
+| `create_if_not_exists` | Support for `CreateIfNotExists` request, which will try to create a node if it doesn't exist. If it exists, no changes are applied and `ZOK` is returned  | `1`     |
+| `remove_recursive`     | Support for `RemoveRecursive` request, which removes the node along with its subtree                                                                      | `1`     |
+
+:::note
+Some of the feature flags are enabled by default from version 25.7.
+The recommended way of upgrading Keeper to 25.7+ is to first upgrade to version 24.9+.
+:::

 ### Migration from ZooKeeper {#migration-from-zookeeper}

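For context on the hunk above, a minimal sketch of a feature-flag block in the `keeper_server` config, mirroring the example named in the hunk header (XML layout as used elsewhere in the Keeper docs; values are illustrative):

```xml
<clickhouse>
    <keeper_server>
        <feature_flags>
            <!-- 0 disables a feature, 1 enables it -->
            <multi_read>0</multi_read>
            <check_not_exists>1</check_not_exists>
        </feature_flags>
    </keeper_server>
</clickhouse>
```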
docs/integrations/data-ingestion/clickpipes/secure-kinesis.md

Lines changed: 3 additions & 1 deletion
@@ -88,7 +88,9 @@ IAM policy (Please replace `{STREAM_NAME}` with your Kinesis stream name):
 {
   "Action": [
     "kinesis:SubscribeToShard",
-    "kinesis:DescribeStreamConsumer"
+    "kinesis:DescribeStreamConsumer",
+    "kinesis:RegisterStreamConsumer",
+    "kinesis:DeregisterStreamConsumer"
   ],
   "Resource": [
     "arn:aws:kinesis:region:account-id:stream/{STREAM_NAME}/*"
docs/integrations/data-ingestion/kafka/confluent/confluent-cloud.md

Lines changed: 65 additions & 0 deletions
@@ -0,0 +1,65 @@
+---
+sidebar_label: 'Kafka Connector Sink on Confluent Cloud'
+sidebar_position: 2
+slug: /integrations/kafka/cloud/confluent/custom-connector-cloud
+description: 'Guide to using the fully managed ClickHouse Connector Sink on Confluent Cloud'
+title: 'Integrating Confluent Cloud with ClickHouse'
+keywords: ['Kafka', 'Confluent Cloud']
+---
+
+import ConnectionDetails from '@site/docs/_snippets/_gather_your_details_http.mdx';
+import Image from '@theme/IdealImage';
+
+# Integrating Confluent Cloud with ClickHouse
+
+<div class='vimeo-container'>
+  <iframe src="//www.youtube.com/embed/SQAiPVbd3gg"
+    width="640"
+    height="360"
+    frameborder="0"
+    allow="autoplay;
+    fullscreen;
+    picture-in-picture"
+    allowfullscreen>
+  </iframe>
+</div>
+
+## Prerequisites {#prerequisites}
+We assume you are familiar with:
+* [ClickHouse Connector Sink](../kafka-clickhouse-connect-sink.md)
+* Confluent Cloud
+
+## The official Kafka connector from ClickHouse with Confluent Cloud {#the-official-kafka-connector-from-clickhouse-with-confluent-cloud}
+
+### Installing on Confluent Cloud {#installing-on-confluent-cloud}
+This is meant to be a quick guide to get you started with the ClickHouse Sink Connector on Confluent Cloud.
+For more details, please refer to the [official Confluent documentation](https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#uploading-and-launching-the-connector).
+
+#### Create a Topic {#create-a-topic}
+Creating a topic on Confluent Cloud is fairly simple, and there are detailed instructions [here](https://docs.confluent.io/cloud/current/client-apps/topics/manage.html).
+
+#### Important notes {#important-notes}
+
+* The Kafka topic name must be the same as the ClickHouse table name. The way to tweak this is by using a transformer (for example [`ExtractTopic`](https://docs.confluent.io/platform/current/connect/transforms/extracttopic.html)).
+* More partitions do not always mean more performance - see our upcoming guide for more details and performance tips.
+
+#### Gather your connection details {#gather-your-connection-details}
+<ConnectionDetails />
+
+
+#### Install Connector {#install-connector}
+Install the fully managed ClickHouse Sink Connector on Confluent Cloud following the [official documentation](https://docs.confluent.io/cloud/current/connectors/cc-clickhouse-sink-connector/cc-clickhouse-sink.html).
+
+
+#### Configure the Connector {#configure-the-connector}
+During the configuration of the ClickHouse Sink Connector, you will need to provide the following details:
+- hostname of your ClickHouse server
+- port of your ClickHouse server (default is 8443)
+- username and password for your ClickHouse server
+- database name in ClickHouse where the data will be written
+- topic name in Kafka that will be used to write data to ClickHouse
+
+The Confluent Cloud UI supports advanced configuration options to adjust poll intervals, batch sizes, and other parameters to optimize performance.
+
+#### Known limitations {#known-limitations}
+* See the list of [Connectors limitations in the official docs](https://docs.confluent.io/cloud/current/connectors/cc-clickhouse-sink-connector/cc-clickhouse-sink.html#limitations)
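A hedged sketch of the connection details listed in this new page, written as connector settings. The property names below are modeled on the self-managed ClickHouse Connector Sink and are assumptions for the fully managed connector, whose Confluent Cloud configuration form may label the fields differently:

```properties
# Hypothetical property names, based on the self-managed ClickHouse Connector Sink;
# the Confluent Cloud UI may present these fields differently.
hostname=my-service.clickhouse.cloud   # ClickHouse server hostname (placeholder)
port=8443                              # HTTPS port; 8443 is the default noted in the page
username=default                       # ClickHouse user
password=<password>                    # ClickHouse password
database=default                       # database the data will be written to
topics=my_table                        # topic name, which must match the table name
```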

docs/integrations/data-ingestion/kafka/confluent/custom-connector.md

Lines changed: 7 additions & 7 deletions
@@ -1,6 +1,6 @@
 ---
 sidebar_label: 'Kafka Connector Sink on Confluent Platform'
-sidebar_position: 2
+sidebar_position: 3
 slug: /integrations/kafka/cloud/confluent/custom-connector
 description: 'Using ClickHouse Connector Sink with Kafka Connect and ClickHouse'
 title: 'Integrating Confluent Cloud with ClickHouse'
@@ -10,7 +10,7 @@ import ConnectionDetails from '@site/docs/_snippets/_gather_your_details_http.mdx';
 import Image from '@theme/IdealImage';
 import AddCustomConnectorPlugin from '@site/static/images/integrations/data-ingestion/kafka/confluent/AddCustomConnectorPlugin.png';

-# Integrating Confluent Cloud with ClickHouse
+# Integrating Confluent Platform with ClickHouse

 <div class='vimeo-container'>
   <iframe src="//www.youtube.com/embed/SQAiPVbd3gg"
@@ -27,16 +27,16 @@ import AddCustomConnectorPlugin from '@site/static/images/integrations/data-ingestion/kafka/confluent/AddCustomConnectorPlugin.png';
 ## Prerequisites {#prerequisites}
 We assume you are familiar with:
 * [ClickHouse Connector Sink](../kafka-clickhouse-connect-sink.md)
-* Confluent Cloud and [Custom Connectors](https://docs.confluent.io/cloud/current/connectors/bring-your-connector/overview.html).
+* Confluent Platform and [Custom Connectors](https://docs.confluent.io/cloud/current/connectors/bring-your-connector/overview.html).

-## The official Kafka connector from ClickHouse with Confluent Cloud {#the-official-kafka-connector-from-clickhouse-with-confluent-cloud}
+## The official Kafka connector from ClickHouse with Confluent Platform {#the-official-kafka-connector-from-clickhouse-with-confluent-platform}

-### Installing on Confluent Cloud {#installing-on-confluent-cloud}
-This is meant to be a quick guide to get you started with the ClickHouse Sink Connector on Confluent Cloud.
+### Installing on Confluent Platform {#installing-on-confluent-platform}
+This is meant to be a quick guide to get you started with the ClickHouse Sink Connector on Confluent Platform.
 For more details, please refer to the [official Confluent documentation](https://docs.confluent.io/cloud/current/connectors/bring-your-connector/custom-connector-qs.html#uploading-and-launching-the-connector).

 #### Create a Topic {#create-a-topic}
-Creating a topic on Confluent Cloud is fairly simple, and there are detailed instructions [here](https://docs.confluent.io/cloud/current/client-apps/topics/manage.html).
+Creating a topic on Confluent Platform is fairly simple, and there are detailed instructions [here](https://docs.confluent.io/cloud/current/client-apps/topics/manage.html).

 #### Important Notes {#important-notes}

docs/integrations/data-ingestion/kafka/confluent/index.md

Lines changed: 2 additions & 1 deletion
@@ -10,5 +10,6 @@ title: 'Integrating Confluent Cloud with ClickHouse'

 The Confluent platform provides the following options for integrating with ClickHouse:

-* [ClickHouse Connect Sink on Confluent Cloud](./custom-connector.md) using the custom connectors feature
+* [ClickHouse Connect Sink on Confluent Cloud](./confluent-cloud.md)
+* [ClickHouse Connect Sink on Confluent Platform](./custom-connector.md) using the custom connectors feature
 * [HTTP Sink Connector for Confluent Platform](./kafka-connect-http.md) that integrates Apache Kafka with an API via HTTP or HTTPS

docs/integrations/data-ingestion/kafka/confluent/kafka-connect-http.md

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 ---
 sidebar_label: 'HTTP Sink Connector for Confluent Platform'
-sidebar_position: 3
+sidebar_position: 4
 slug: /integrations/kafka/cloud/confluent/http
 description: 'Using HTTP Connector Sink with Kafka Connect and ClickHouse'
 title: 'Confluent HTTP Sink Connector'

scripts/translate/requirements.txt

Lines changed: 1 addition & 1 deletion
@@ -15,6 +15,6 @@ sniffio==1.3.1
 tqdm==4.67.1
 typing_extensions==4.12.2
 xxhash==3.5.0
-llama_index==0.12.28
+llama_index==0.12.41
 python-frontmatter
 markdown-it-py

sidebars.js

Lines changed: 1 addition & 0 deletions
@@ -839,6 +839,7 @@ const sidebars = {
       items: [
         "integrations/data-ingestion/kafka/index",
         "integrations/data-ingestion/kafka/kafka-clickhouse-connect-sink",
+        "integrations/data-ingestion/kafka/confluent/confluent-cloud",
         "integrations/data-ingestion/kafka/confluent/custom-connector",
         "integrations/data-ingestion/kafka/msk/index",
         "integrations/data-ingestion/kafka/kafka-vector",

src/components/RelatedBlogs/styles.module.scss

Lines changed: 21 additions & 7 deletions
@@ -1,17 +1,17 @@
+@use '../../css/breakpoints' as breakpoints;
+
 /* Container for the entire component */
 .relatedBlogsContainer {
   display: flex;
   flex-direction: column;
   padding-top: 32px;
 }

-/* Container for just the cards (excluding header) */
 .blogCardsContainer {
-  display: grid;
-  grid-template-columns: repeat(3, minmax(0, 1fr));
-  gap: 0.5em;
-  width: 100%;
-  height: 100%
+  display: flex;
+  flex-direction: column;
+  gap: 1em;
+  width: 100%;
 }

 /* Container for individual card */
@@ -68,10 +68,11 @@
 /* Container for the bottom half of the card */
 [data-theme='light'] .cardBottom {
-  background-color: #c0c0c0;
+  background-color: #f7f7fa;
   border-left: 1px solid #c7c7c7;
   border-right: 1px solid #c7c7c7;
   border-bottom: 1px solid #c7c7c7;
+  border-top: 1px solid #c7c7c7;
 }

 /* Container for the bottom half of the card */
@@ -186,4 +187,17 @@
   100% {
     opacity: 0.6;
   }
+}
+
+@media (min-width: breakpoints.$laptop-breakpoint) {
+
+  /* Container for just the cards (excluding header) */
+  .blogCardsContainer {
+    display: grid;
+    grid-template-columns: repeat(3, minmax(0, 1fr));
+    gap: 0.5em;
+    width: 100%;
+    height: 100%
+  }
+
 }
