
Commit 07b67a3

Merge pull request #3849 from ClickHouse/pg/remove-beta
Remove mentions of public beta for PG CDC connector
2 parents: 3fcf59d + 20849ee

11 files changed: +13, −22 lines


docs/cloud/reference/changelog.md

Lines changed: 2 additions & 3 deletions
@@ -115,8 +115,7 @@ within a secure customer environment.
 
 ### Postgres CDC connector for ClickPipes {#postgres-cdc-connector-for-clickpipes}
 
-Postgres CDC connector for ClickPipes is now in public beta. This feature allows
-users to seamlessly replicate their Postgres databases to ClickHouse Cloud.
+Postgres CDC connector for ClickPipes allows users to seamlessly replicate their Postgres databases to ClickHouse Cloud.
 
 - To get started, refer to the [documentation](https://clickhouse.com/docs/integrations/clickpipes/postgres) for ClickPipes Postgres CDC connector.
 - For more information on customer use cases and features, please refer to the [landing page](https://clickhouse.com/cloud/clickpipes/postgres-cdc-connector) and the [launch blog](https://clickhouse.com/blog/postgres-cdc-connector-clickpipes-public-beta).

@@ -249,7 +248,7 @@ Org Admins can now add more email addresses to a specific notification as additi
 
 Bring Your Own Cloud for AWS is now available in Beta. This deployment model allows you to deploy and run ClickHouse Cloud in your own AWS account. We support deployments in 11+ AWS regions, with more coming soon. Please [contact support](https://clickhouse.com/support/program) for access. Note that this deployment is reserved for large-scale deployments.
 
-### Postgres Change-Data-Capture (CDC) Connector in ClickPipes (Public Beta) {#postgres-change-data-capture-cdc-connector-in-clickpipes-public-beta}
+### Postgres Change-Data-Capture (CDC) Connector in ClickPipes {#postgres-change-data-capture-cdc-connector-in-clickpipes}
 
 This turnkey integration enables customers to replicate their Postgres databases to ClickHouse Cloud in just a few clicks and leverage ClickHouse for blazing-fast analytics. You can use this connector for both continuous replication and one-time migrations from Postgres.

docs/guides/inserting-data.md

Lines changed: 1 addition & 1 deletion
@@ -148,7 +148,7 @@ See [HTTP Interface](/interfaces/http) for further details.
 For loading data from Postgres, users can use:
 
 - `PeerDB by ClickHouse`, an ETL tool specifically designed for PostgreSQL database replication. This is available in both:
-  - ClickHouse Cloud - available through our [new connector](/integrations/clickpipes/postgres) (Private Preview) in ClickPipes, our managed ingestion service. Interested users [sign up here](https://clickpipes.peerdb.io/).
+  - ClickHouse Cloud - available through our [new connector](/integrations/clickpipes/postgres) in ClickPipes, our managed ingestion service.
   - Self-managed - via the [open-source project](https://github.com/PeerDB-io/peerdb).
 - The [PostgreSQL table engine](/integrations/postgresql#using-the-postgresql-table-engine) to read data directly as shown in previous examples. Typically appropriate if batch replication based on a known watermark, e.g., timestamp, is sufficient or if it's a one-off migration. This approach can scale to 10's millions of rows. Users looking to migrate larger datasets should consider multiple requests, each dealing with a chunk of the data. Staging tables can be used for each chunk prior to its partitions being moved to a final table. This allows failed requests to be retried. For further details on this bulk-loading strategy, see here.
 - Data can be exported from PostgreSQL in CSV format. This can then be inserted into ClickHouse from either local files or via object storage using table functions.
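As context for the chunked bulk-load strategy described in the bullets above, here is a minimal sketch in Python, assuming the clickhouse-connect driver, a numeric watermark column, and illustrative table and credential names (none of these appear in the commit itself):

```python
# Illustrative sketch of the chunked bulk-load strategy (assumed names throughout).
import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost", username="default", password="")

CHUNK = 1_000_000
low, high = 0, 10_000_000  # assumed watermark range (e.g. an `id` column) in the source table

for start in range(low, high, CHUNK):
    # Stage one chunk; if this request fails, the chunk can simply be re-run.
    client.command("TRUNCATE TABLE staging_events")
    client.command(f"""
        INSERT INTO staging_events
        SELECT * FROM postgresql('pg-host:5432', 'mydb', 'events', 'pg_user', 'pg_password')
        WHERE id >= {start} AND id < {start + CHUNK}
    """)
    # Copy the staged chunk into the final table
    # (moving partitions with ALTER TABLE ... MOVE PARTITION is an alternative).
    client.command("INSERT INTO events SELECT * FROM staging_events")
```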

docs/integrations/data-ingestion/clickpipes/index.md

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@ import Image from '@theme/IdealImage';
 | DigitalOcean Spaces | <DOsvg class="image" alt="Digital Ocean logo" style={{width: '3rem', height: 'auto'}}/> | Object Storage | Stable | Configure ClickPipes to ingest large volumes of data from object storage.
 | Azure Blob Storage | <ABSsvg class="image" alt="Azure Blob Storage logo" style={{width: '3rem', height: 'auto'}}/> | Object Storage | Private Beta | Configure ClickPipes to ingest large volumes of data from object storage.
 | Amazon Kinesis | <Amazonkinesis class="image" alt="Amazon Kenesis logo" style={{width: '3rem', height: 'auto'}}/> |Streaming| Stable | Configure ClickPipes and start ingesting streaming data from Amazon Kinesis into ClickHouse cloud. |
-| Postgres | <Postgressvg class="image" alt="Postgres logo" style={{width: '3rem', height: 'auto'}}/> |DBMS| Public Beta | Configure ClickPipes and start ingesting data from Postgres into ClickHouse Cloud. |
+| Postgres | <Postgressvg class="image" alt="Postgres logo" style={{width: '3rem', height: 'auto'}}/> |DBMS| Stable | Configure ClickPipes and start ingesting data from Postgres into ClickHouse Cloud. |
 | MySQL | <Mysqlsvg class="image" alt="MySQL logo" style={{width: '3rem', height: 'auto'}}/> |DBMS| Private Beta | Configure ClickPipes and start ingesting data from MySQL into ClickHouse Cloud. |

docs/integrations/data-ingestion/clickpipes/postgres/deduplication.md

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ Updates and deletes replicated from Postgres to ClickHouse result in duplicated
 
 ### PostgreSQL logical decoding {#PostgreSQL-logical-decoding}
 
-ClickPipes uses [Postgres Logical Decoding](https://www.pgedge.com/blog/logical-replication-evolution-in-chronological-order-clustering-solution-built-around-logical-replication) to consume changes as they happen in Postgres. The Logical Decoding process in Postgres enables clients like ClickPipes to receive changes in a human-readable format, i.e., a series of INSERTs, UPDATEs, and DELETEs.
+ClickPipes uses [Postgres Logical Decoding](https://www.pgedge.com/blog/logical-replication-evolution-in-chronological-order-clustering-solution-built-around-logical-replication) to consume changes as they happen in Postgres. The Logical Decoding process in Postgres enables clients like ClickPipes to receive changes in a human-readable format, i.e., a series of INSERTs, UPDATEs, and DELETEs.
 
 ### ReplacingMergeTree {#replacingmergetree}
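For background on the logical decoding mechanism this page refers to, the sketch below shows roughly what a decoded change stream looks like. It assumes the psycopg2 driver, Postgres's built-in test_decoding output plugin, wal_level=logical on the source, and illustrative connection details; ClickPipes manages its own slots and output format, so this is only a demonstration of the underlying feature.

```python
# Rough illustration of Postgres logical decoding (assumed connection details).
import psycopg2

conn = psycopg2.connect("host=pg-host dbname=mydb user=pg_user password=pg_password")
conn.autocommit = True
cur = conn.cursor()

# Create a slot that decodes WAL changes into human-readable rows.
cur.execute("SELECT pg_create_logical_replication_slot('demo_slot', 'test_decoding')")

# ... run some INSERT / UPDATE / DELETE statements against the source tables ...

# Peek at the decoded changes without consuming them from the slot.
cur.execute("SELECT lsn, xid, data FROM pg_logical_slot_peek_changes('demo_slot', NULL, NULL)")
for lsn, xid, data in cur.fetchall():
    print(lsn, xid, data)  # e.g. "table public.events: UPDATE: id[integer]:42 ..."

cur.execute("SELECT pg_drop_replication_slot('demo_slot')")
```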

docs/integrations/data-ingestion/clickpipes/postgres/index.md

Lines changed: 1 addition & 9 deletions
@@ -18,16 +18,8 @@ import Image from '@theme/IdealImage';
 
 # Ingesting Data from Postgres to ClickHouse (using CDC)
 
-<BetaBadge/>
-
-:::info
-Currently, ingesting data from Postgres to ClickHouse Cloud via ClickPipes is in Public Beta.
-:::
-
-
 You can use ClickPipes to ingest data from your source Postgres database into ClickHouse Cloud. The source Postgres database can be hosted on-premises or in the cloud including Amazon RDS, Google Cloud SQL, Azure Database for Postgres, Supabase and others.
 
-
 ## Prerequisites {#prerequisites}
 
 To get started, you first need to make sure that your Postgres database is set up correctly. Depending on your source Postgres instance, you may follow any of the following guides:

@@ -153,6 +145,6 @@ You can configure the Advanced settings if needed. A brief description of each s
 
 ## What's next? {#whats-next}
 
-Once you've moved data from Postgres to ClickHouse, the next obvious question is how to query and model your data in ClickHouse to make the most of it. Please refer to the [migration guide](/migrations/postgresql/overview) to a step by step approaches on how to migrate from PostgreSQL to ClickHouse. Alongside the migration guide, make sure to check the pages about [Deduplication strategies (using CDC)](/integrations/clickpipes/postgres/deduplication) and [Ordering Keys](/integrations/clickpipes/postgres/ordering_keys) to understand how to handle duplicates and customize ordering keys when using CDC.
+Once you've moved data from Postgres to ClickHouse, the next obvious question is how to query and model your data in ClickHouse to make the most of it. Please refer to the [migration guide](/migrations/postgresql/overview) to a step by step approaches on how to migrate from PostgreSQL to ClickHouse. Alongside the migration guide, make sure to check the pages about [Deduplication strategies (using CDC)](/integrations/clickpipes/postgres/deduplication) and [Ordering Keys](/integrations/clickpipes/postgres/ordering_keys) to understand how to handle duplicates and customize ordering keys when using CDC.
 
 Finally, please refer to the ["ClickPipes for Postgres FAQ"](/integrations/clickpipes/postgres/faq) page for more information about common issues and how to resolve them.

docs/integrations/data-ingestion/dbms/postgresql/connecting-to-postgresql.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ import ExperimentalBadge from '@theme/badges/ExperimentalBadge';
 
 This page covers following options for integrating PostgreSQL with ClickHouse:
 
-- using [ClickPipes](/integrations/clickpipes/postgres), the managed integration service for ClickHouse Cloud powered by PeerDB - now in public beta!
+- using [ClickPipes](/integrations/clickpipes/postgres), the managed integration service for ClickHouse Cloud powered by PeerDB.
 - using [PeerDB](https://github.com/PeerDB-io/peerdb), an open-source CDC tool specifically designed for PostgreSQL database replication to both self-hosted ClickHouse and ClickHouse Cloud.
 - using the `PostgreSQL` table engine, for reading from a PostgreSQL table
 - using the experimental `MaterializedPostgreSQL` database engine, for syncing a database in PostgreSQL with a database in ClickHouse
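For the `PostgreSQL` table engine option listed above, a minimal sketch follows, assuming the clickhouse-connect driver and illustrative host, table, and credential names (not taken from this commit):

```python
# Minimal sketch of the PostgreSQL table engine with assumed names.
import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost", username="default", password="")

# A ClickHouse table backed by a live PostgreSQL table.
client.command("""
    CREATE TABLE pg_events
    (
        id UInt64,
        payload String
    )
    ENGINE = PostgreSQL('pg-host:5432', 'mydb', 'events', 'pg_user', 'pg_password')
""")

# Reads are pushed down to PostgreSQL at query time.
print(client.query("SELECT count() FROM pg_events").result_rows)
```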

docs/integrations/data-ingestion/dbms/postgresql/inserting-data.md

Lines changed: 1 addition & 2 deletions
@@ -9,8 +9,7 @@ We recommend reading [this guide](/guides/inserting-data) to learn best practice
 
 For bulk loading data from PostgreSQL, users can use:
 
-- using [ClickPipes](/integrations/clickpipes/postgres), the managed integration service for ClickHouse Cloud - now in public beta. Please [sign up here](https://clickpipes.peerdb.io/)
+- using [ClickPipes](/integrations/clickpipes/postgres), the managed integration service for ClickHouse Cloud.
 - `PeerDB by ClickHouse`, an ETL tool specifically designed for PostgreSQL database replication to both self-hosted ClickHouse and ClickHouse Cloud.
-  - PeerDB is now available natively in ClickHouse Cloud - Blazing-fast Postgres to ClickHouse CDC with our [new ClickPipe connector](/integrations/clickpipes/postgres) - now in public beta. Please [sign up here](https://clickhouse.com/cloud/clickpipes/postgres-cdc-connector)
 - The [Postgres Table Function](/sql-reference/table-functions/postgresql) to read data directly. This is typically appropriate for if batch replication based on a known watermark, e.g. a timestamp. is sufficient or if it's a once-off migration. This approach can scale to 10's of millions of rows. Users looking to migrate larger datasets should consider multiple requests, each dealing with a chunk of the data. Staging tables can be used for each chunk prior to its partitions being moved to a final table. This allows failed requests to be retried. For further details on this bulk-loading strategy, see here.
 - Data can be exported from Postgres in CSV format. This can then be inserted into ClickHouse from either local files or via object storage using table functions.
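For the CSV route in the final bullet, here is a minimal sketch, assuming psycopg2 for the export, clickhouse-connect for the insert, and illustrative table and column names:

```python
# Illustrative sketch of the CSV export/import path (assumed names throughout).
import csv
import psycopg2
import clickhouse_connect

# 1. Export from Postgres to a local CSV file.
pg = psycopg2.connect("host=pg-host dbname=mydb user=pg_user password=pg_password")
with pg.cursor() as cur, open("events.csv", "w", newline="") as f:
    cur.copy_expert("COPY (SELECT id, payload FROM events) TO STDOUT WITH CSV HEADER", f)

# 2. Insert the rows into an existing ClickHouse table with matching columns.
client = clickhouse_connect.get_client(host="localhost", username="default", password="")
with open("events.csv", newline="") as f:
    reader = csv.DictReader(f)
    rows = [[int(r["id"]), r["payload"]] for r in reader]
client.insert("events", rows, column_names=["id", "payload"])
```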

i18n/jp/docusaurus-plugin-content-docs/current/cloud/reference/changelog.md

Lines changed: 1 addition & 1 deletion
@@ -189,7 +189,7 @@ APIキーの有効期限オプションを制限し、有効期限のないOpenA
 
 AWS向けのBring Your Own Cloudが現在ベータ版で利用可能です。このデプロイメントモデルにより、ClickHouse Cloudを独自のAWSアカウントで展開および実行できます。11以上のAWSリージョンでのデプロイメントをサポートし、今後さらに追加される予定です。アクセスについては、[サポートにお問い合わせください](https://clickhouse.com/support/program)。このデプロイは、大規模なデプロイメントにのみ予約されています。
 
-### Postgres Change-Data-Capture (CDC) Connector in ClickPipes (Public Beta) {#postgres-change-data-capture-cdc-connector-in-clickpipes-public-beta}
+### Postgres Change-Data-Capture (CDC) Connector in ClickPipes {#postgres-change-data-capture-cdc-connector-in-clickpipes}
 
 このターンキー統合により、顧客は数回のクリックでPostgresデータベースをClickHouse Cloudにレプリケートし、ClickHouseを利用して瞬時に分析できます。このコネクタを使用して、Postgresからの継続的なレプリケーションと1回限りのマイグレーションの両方を行うことができます。

i18n/zh/docusaurus-plugin-content-docs/current/cloud/reference/changelog.md

Lines changed: 2 additions & 3 deletions
@@ -115,8 +115,7 @@ within a secure customer environment.
 
 ### Postgres CDC connector for ClickPipes {#postgres-cdc-connector-for-clickpipes}
 
-Postgres CDC connector for ClickPipes is now in public beta. This feature allows
-users to seamlessly replicate their Postgres databases to ClickHouse Cloud.
+Postgres CDC connector for ClickPipes allows users to seamlessly replicate their Postgres databases to ClickHouse Cloud.
 
 - To get started, refer to the [documentation](https://clickhouse.com/docs/integrations/clickpipes/postgres) for ClickPipes Postgres CDC connector.
 - For more information on customer use cases and features, please refer to the [landing page](https://clickhouse.com/cloud/clickpipes/postgres-cdc-connector) and the [launch blog](https://clickhouse.com/blog/postgres-cdc-connector-clickpipes-public-beta).

@@ -249,7 +248,7 @@ Org Admins can now add more email addresses to a specific notification as additi
 
 Bring Your Own Cloud for AWS is now available in Beta. This deployment model allows you to deploy and run ClickHouse Cloud in your own AWS account. We support deployments in 11+ AWS regions, with more coming soon. Please [contact support](https://clickhouse.com/support/program) for access. Note that this deployment is reserved for large-scale deployments.
 
-### Postgres Change-Data-Capture (CDC) Connector in ClickPipes (Public Beta) {#postgres-change-data-capture-cdc-connector-in-clickpipes-public-beta}
+### Postgres Change-Data-Capture (CDC) Connector in ClickPipes {#postgres-change-data-capture-cdc-connector-in-clickpipes}
 
 This turnkey integration enables customers to replicate their Postgres databases to ClickHouse Cloud in just a few clicks and leverage ClickHouse for blazing-fast analytics. You can use this connector for both continuous replication and one-time migrations from Postgres.

styles/ClickHouse/Headings.yml

Lines changed: 1 addition & 0 deletions
@@ -42,3 +42,4 @@ exceptions:
 - MySQL
 - ClickPipe
 - ClickPipes
+- CDC
