
Commit d56eca0 ("update"), 1 parent: f609713

File tree: 10 files changed (+16, -17 lines)


docs/integrations/data-ingestion/data-ingestion-index.md

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ For more information check out the pages below:
 | [Azure Synapse](/integrations/azure-synapse) | A fully managed, cloud-based analytics service provided by Microsoft Azure, combining big data and data warehousing to simplify data integration, transformation, and analytics at scale using SQL, Apache Spark, and data pipelines. |
 | [Azure Data Factory](/integrations/azure-data-factory) | A cloud-based data integration service that enables you to create, schedule, and orchestrate data workflows at scale. |
 | [Apache Beam](/integrations/apache-beam) | An open-source, unified programming model that enables developers to define and execute both batch and stream (continuous) data processing pipelines. |
-| [BladePipe](/integrations/bladepipe) | A real-time end-to-end data integration tool, boosting seamless data flow across platforms. |
+| [BladePipe](/integrations/bladepipe) | A real-time end-to-end data integration tool with sub-second latency, boosting seamless data flow across platforms. |
 | [dbt](/integrations/dbt) | Enables analytics engineers to transform data in their warehouses by simply writing select statements. |
 | [dlt](/integrations/data-ingestion/etl-tools/dlt-and-clickhouse) | An open-source library that you can add to your Python scripts to load data from various and often messy data sources into well-structured, live datasets. |
 | [Fivetran](/integrations/fivetran) | An automated data movement platform moving data out of, into and across your cloud data platforms. |

docs/integrations/data-ingestion/etl-tools/bladepipe-and-clickhouse.md

Lines changed: 15 additions & 16 deletions
@@ -24,18 +24,18 @@ import ClickHouseSupportedBadge from '@theme/badges/ClickHouseSupported';
 <ClickHouseSupportedBadge/>


-<a href="https://www.bladepipe.com/" target="_blank">BladePipe</a> is a real-time end-to-end data integration tool, boosting seamless data flow across platforms.
+<a href="https://www.bladepipe.com/" target="_blank">BladePipe</a> is a real-time end-to-end data integration tool with sub-second latency, boosting seamless data flow across platforms.

-ClickHouse is one of BladePipe's pre-built connectors, allowing users to integrate data from various sources into ClickHouse. This page will show how to load data into ClickHouse in real time step by step.
+ClickHouse is one of BladePipe's pre-built connectors, allowing users to integrate data from various sources into ClickHouse automatically. This page shows, step by step, how to load data into ClickHouse in real time.


-## 1. Run BladePipe {#1-run-bladepipe}
+## 1. Download and run BladePipe {#1-run-bladepipe}
 1. Log in to <a href="https://www.bladepipe.com/" target="_blank">BladePipe Cloud</a>.

 2. Follow the instructions in <a href="https://doc.bladepipe.com/productOP/byoc/installation/install_worker_docker" target="_blank">Install Worker (Docker)</a> or <a href="https://doc.bladepipe.com/productOP/byoc/installation/install_worker_binary" target="_blank">Install Worker (Binary)</a> to download and install a BladePipe Worker.

 :::note
-Alternatively, you can download and deploy <a href="https://doc.bladepipe.com/productOP/onPremise/installation/install_all_in_one_binary" target="_blank">BladePipe Enterprise</a>
+Alternatively, you can download and deploy <a href="https://doc.bladepipe.com/productOP/onPremise/installation/install_all_in_one_binary" target="_blank">BladePipe Enterprise</a>.
 :::

 ## 2. Add ClickHouse as a target {#2-add-clickhouse-as-a-target}
@@ -45,38 +45,38 @@ ClickHouse is one of BladePipe's pre-built connectors, allowing users to integra
 2. To use ClickHouse as a target, make sure that the user has SELECT, INSERT and common DDL permissions.
 :::
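The target-side permissions mentioned in the note above could be granted with statements along these lines. This is a hedged sketch, not part of the commit: `bp_user` and `my_db` are hypothetical names, and the exact DDL grants you need may vary with your ClickHouse version and table engines.

```sql
-- Hypothetical example: privileges for the BladePipe user on the target
-- database (names are placeholders).
GRANT SELECT, INSERT ON my_db.* TO bp_user;
-- Common DDL permissions, so the connector can create and evolve tables:
GRANT CREATE TABLE, DROP TABLE, ALTER ON my_db.* TO bp_user;
```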

-1. In BladePipe, click **DataSource** > **Add DataSource**.
+1. In BladePipe, click "DataSource" > "Add DataSource".

-2. Select ClickHouse, and fill out the settings by providing your ClickHouse host and port, username and password, and click **Test Connection**.
+2. Select `ClickHouse`, and fill out the settings by providing your ClickHouse host and port, username and password, and click "Test Connection".

 <Image img={bp_ck_1} size="lg" border alt="Add ClickHouse as a target" />

-3. Click **Add DataSource** at the bottom, and a ClickHouse instance is added.
+3. Click "Add DataSource" at the bottom, and a ClickHouse instance is added.

 ## 3. Add MySQL as a source {#3-add-mysql-as-a-source}
-In this tutorial, we use a MySQL instance as the source, and expalin the process of loading MySQL data to ClickHouse.
+In this tutorial, we use a MySQL instance as the source, and explain the process of loading MySQL data to ClickHouse.
 :::note
 To use MySQL as a source, make sure that the user has the <a href="https://doc.bladepipe.com/dataMigrationAndSync/datasource_func/MySQL/privs_for_mysql" target="_blank">required permissions</a>.
 :::
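The linked BladePipe page is authoritative on the exact source-side privileges. As a rough sketch, log-based CDC from MySQL typically requires read access to the source schema plus the global replication privileges; `bp_user` and `my_db` below are hypothetical placeholders.

```sql
-- Hypothetical example: a typical privilege set for a CDC user.
GRANT SELECT ON my_db.* TO 'bp_user'@'%';
-- Replication privileges are global in MySQL, so they are granted ON *.*
GRANT REPLICATION SLAVE, REPLICATION CLIENT ON *.* TO 'bp_user'@'%';
```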

-1. In BladePipe, click **DataSource** > **Add DataSource**.
+1. In BladePipe, click "DataSource" > "Add DataSource".

-2. Select MySQL, and fill out the settings by providing your MySQL host and port, username and password, and click **Test Connection**.
+2. Select `MySQL`, and fill out the settings by providing your MySQL host and port, username and password, and click "Test Connection".

 <Image img={bp_ck_2} size="lg" border alt="Add MySQL as a source" />

-3. Click **Add DataSource** at the bottom, and a MySQL instance is added.
+3. Click "Add DataSource" at the bottom, and a MySQL instance is added.


 ## 4. Create a pipeline {#4-create-a-pipeline}

-1. In BladePipe, click **DataJob** > **Create DataJob**.
+1. In BladePipe, click "DataJob" > "Create DataJob".

-2. Select the added MySQL and ClickHouse instances and click **Test Connection** to ensure BladePipe is connected to the instances. Then, select the databases to be moved.
+2. Select the added MySQL and ClickHouse instances and click "Test Connection" to ensure BladePipe is connected to the instances. Then, select the databases to be moved.
 <Image img={bp_ck_3} size="lg" border alt="Select source and target" />

-3. Select **Incremental** for DataJob Type, together with the **Full Data** option.
+3. Select "Incremental" for DataJob Type, together with the "Full Data" option.
 <Image img={bp_ck_4} size="lg" border alt="Select sync type" />

@@ -93,12 +93,11 @@ To use MySQL as a source, make sure that the user has the <a href="https://doc.b

 ## 5. Verify the data {#5-verify-the-data}
 1. Stop data writes in the MySQL instance and wait for ClickHouse to merge data.
-
 :::note
 Due to the unpredictable timing of ClickHouse's automatic merging, you can manually trigger a merge by running the `OPTIMIZE TABLE xxx FINAL;` command. Note that this manual merge may not always succeed.

 Alternatively, you can run the `CREATE VIEW xxx_v AS SELECT * FROM xxx FINAL;` command to create a view and perform queries on the view to ensure the data is fully merged.
 :::
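Applied to a concrete table, the two commands from the note look like this. This is an illustrative sketch added for clarity: `orders` is a hypothetical table name, not one from the tutorial.

```sql
-- Force a merge of the table's data parts (may not always succeed):
OPTIMIZE TABLE orders FINAL;

-- Or query through a view that applies FINAL on every read:
CREATE VIEW orders_v AS SELECT * FROM orders FINAL;
SELECT count() FROM orders_v;
```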

-2. Create a <a href="https://doc.bladepipe.com/operation/job_manage/create_job/create_period_verification_correction_job" target="_blank">Verification DataJob</a>. Once the Verification DataJob is completed, review the results to confirm that the data in ClickHouse are the same as the data in MySQL.
+2. Create a <a href="https://doc.bladepipe.com/operation/job_manage/create_job/create_period_verification_correction_job" target="_blank">Verification DataJob</a>. Once the Verification DataJob is completed, review the results to confirm that the data in ClickHouse is the same as the data in MySQL.
 <Image img={bp_ck_9} size="lg" border alt="Verify data" />
