2 changes: 1 addition & 1 deletion .dlc.json
@@ -10,7 +10,7 @@
"pattern": "^{"
},
{
"pattern": "^https://repo1.maven.org/maven2/org/apache/flink.*SNAPSHOT.*"
"pattern": "^https://repo1.maven.org/maven2/org/apache/flink.*"
},
{
"pattern": "^https://mvnrepository.com"
6 changes: 3 additions & 3 deletions docs/config.toml
@@ -30,9 +30,9 @@ pygmentsUseClasses = true
ShowOutDatedWarning = false

# This is the version referenced in the docs. Please only use these variables
# to reference a specific Flink version, because this is the only place where
# we change the version for the complete docs when forking of a release branch
# etc.
# to reference a specific Flink CDC version, because this is the only place
# where we change the version for the complete docs when forking of a release
# branch etc.
# The full version string as referenced in Maven (e.g. 1.2.1)
Version = "3.3-SNAPSHOT"

4 changes: 2 additions & 2 deletions docs/content.zh/docs/connectors/flink-sources/db2-cdc.md
@@ -48,7 +48,7 @@ using a build automation tool (such as Maven or SBT) and SQL Client with SQL JAR

### SQL Client JAR

Download [flink-sql-connector-db2-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-db2-cdc/3.1.0/flink-sql-connector-db2-cdc-3.1.0.jar) and
Download [flink-sql-connector-db2-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-db2-cdc/{{< param Version >}}/flink-sql-connector-db2-cdc-{{< param Version >}}.jar) and
put it under `<FLINK_HOME>/lib/`.
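
Once the JAR is under `<FLINK_HOME>/lib/`, the connector is driven entirely through SQL. As a hedged orientation sketch (connection values are placeholders, and the `db2-cdc` option names below are assumed to follow the connector's documented SQL options), the same DDL can also be issued from Java via the Table API:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class Db2CdcSqlSketch {
    public static void main(String[] args) {
        // Streaming Table API environment; the db2-cdc SQL connector JAR must be on the classpath.
        TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Register a changelog table backed by the db2-cdc connector (placeholder values).
        tEnv.executeSql(
                "CREATE TABLE products (\n"
                        + "  ID INT NOT NULL,\n"
                        + "  NAME STRING,\n"
                        + "  PRIMARY KEY (ID) NOT ENFORCED\n"
                        + ") WITH (\n"
                        + "  'connector' = 'db2-cdc',\n"
                        + "  'hostname' = 'localhost',\n"
                        + "  'port' = '50000',\n"
                        + "  'username' = 'db2inst1',\n"
                        + "  'password' = 'flinkpw',\n"
                        + "  'database-name' = 'testdb',\n"
                        + "  'schema-name' = 'DB2INST1',\n"
                        + "  'table-name' = 'PRODUCTS'\n"
                        + ")");

        // Continuously print the change stream of the captured table.
        tEnv.executeSql("SELECT * FROM products").print();
    }
}
```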

**Note:** Refer to
@@ -354,7 +354,7 @@ public class Db2SourceExample {
}
```

The DB2 CDC incremental connector (after 3.1.0) can be used as shown below:
The DB2 CDC incremental connector (since 3.1.0) can be used as shown below:
```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
@@ -47,7 +47,7 @@ MongoDB CDC 连接器允许从 MongoDB 读取快照数据和增量数据。 本

```The download link is only available for stable releases.```

Download [flink-sql-connector-mongodb-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mongodb-cdc/3.1.0/flink-sql-connector-mongodb-cdc-3.1.0.jar) and put it under `<FLINK_HOME>/lib/`.
Download [flink-sql-connector-mongodb-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mongodb-cdc/{{< param Version >}}/flink-sql-connector-mongodb-cdc-{{< param Version >}}.jar) and put it under `<FLINK_HOME>/lib/`.

**Note:** Refer to [flink-sql-connector-mongodb-cdc](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-mongodb-cdc); released versions are available from the Maven Central repository.
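
In the same spirit as the Db2 sketch earlier, a minimal, hedged example of registering a table on the `mongodb-cdc` connector from Java (connection values are placeholders; the option names follow the connector's documented SQL options and should be treated as assumptions here):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class MongoDbCdcSqlSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Changelog table backed by the mongodb-cdc connector (placeholder values).
        tEnv.executeSql(
                "CREATE TABLE customers (\n"
                        + "  _id STRING,\n"
                        + "  name STRING,\n"
                        + "  PRIMARY KEY (_id) NOT ENFORCED\n"
                        + ") WITH (\n"
                        + "  'connector' = 'mongodb-cdc',\n"
                        + "  'hosts' = 'localhost:27017',\n"
                        + "  'username' = 'flinkuser',\n"
                        + "  'password' = 'flinkpw',\n"
                        + "  'database' = 'mydb',\n"
                        + "  'collection' = 'customers'\n"
                        + ")");

        // Print the change stream of the captured collection.
        tEnv.executeSql("SELECT * FROM customers").print();
    }
}
```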

2 changes: 1 addition & 1 deletion docs/content.zh/docs/connectors/flink-sources/mysql-cdc.md
@@ -54,7 +54,7 @@ MySQL CDC 连接器允许从 MySQL 数据库读取快照数据和增量数据。

```The download link is only available for released versions; please use the version picker at the bottom-left of the documentation site to browse a released version.```

Download [flink-sql-connector-mysql-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mysql-cdc/3.1.0/flink-sql-connector-mysql-cdc-3.1.0.jar) into the `<FLINK_HOME>/lib/` directory.
Download [flink-sql-connector-mysql-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mysql-cdc/{{< param Version >}}/flink-sql-connector-mysql-cdc-{{< param Version >}}.jar) into the `<FLINK_HOME>/lib/` directory.

**Note:** Refer to [flink-sql-connector-mysql-cdc](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-mysql-cdc); all released versions are available from the Maven Central repository.
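
Besides the SQL JAR, this connector is commonly consumed from the DataStream API. A minimal hedged sketch follows; connection values are placeholders, and the package names assume the `org.apache.flink.cdc` namespace used since the 3.x releases:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.cdc.connectors.mysql.source.MySqlSource;
import org.apache.flink.cdc.debezium.JsonDebeziumDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MySqlCdcSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection settings; replace them with your own instance.
        MySqlSource<String> source = MySqlSource.<String>builder()
                .hostname("localhost")
                .port(3306)
                .databaseList("mydb")             // database(s) to capture
                .tableList("mydb.orders")         // table(s) to capture
                .username("flinkuser")
                .password("flinkpw")
                .deserializer(new JsonDebeziumDeserializationSchema()) // emit records as JSON strings
                .build();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.enableCheckpointing(3000); // checkpointing is required for exactly-once CDC reads

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "MySQL CDC Source")
                .print();

        env.execute("MySQL CDC sketch");
    }
}
```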

@@ -109,7 +109,7 @@ OceanBase CDC 源端读取方案:

```The download link is only available for released versions; please use the version picker at the bottom-left of the documentation site to browse a released version.```

Download [flink-sql-connector-oceanbase-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-oceanbase-cdc/3.1.0/flink-sql-connector-oceanbase-cdc-3.1.0.jar) into the `<FLINK_HOME>/lib/` directory.
Download [flink-sql-connector-oceanbase-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-oceanbase-cdc/{{< param Version >}}/flink-sql-connector-oceanbase-cdc-{{< param Version >}}.jar) into the `<FLINK_HOME>/lib/` directory.

**Note:** Refer to [flink-sql-connector-oceanbase-cdc](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-oceanbase-cdc); all released versions are available from the Maven Central repository.

@@ -41,7 +41,7 @@ In order to setup the Oracle CDC connector, the following table provides depende

**Download link is available only for stable releases.**

Download [flink-sql-connector-oracle-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-oracle-cdc/3.1.0/flink-sql-connector-oracle-cdc-3.1.0.jar) and put it under `<FLINK_HOME>/lib/`.
Download [flink-sql-connector-oracle-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-oracle-cdc/{{< param Version >}}/flink-sql-connector-oracle-cdc-{{< param Version >}}.jar) and put it under `<FLINK_HOME>/lib/`.

**Note:** Refer to [flink-sql-connector-oracle-cdc](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-oracle-cdc); more released versions are available in the Maven Central repository.

@@ -41,7 +41,7 @@ In order to setup the Postgres CDC connector, the following table provides depen

```Download link is available only for stable releases.```

Download [flink-sql-connector-postgres-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-postgres-cdc/3.1.0/flink-sql-connector-postgres-cdc-3.1.0.jar) and put it under `<FLINK_HOME>/lib/`.
Download [flink-sql-connector-postgres-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-postgres-cdc/{{< param Version >}}/flink-sql-connector-postgres-cdc-{{< param Version >}}.jar) and put it under `<FLINK_HOME>/lib/`.

**Note:** Refer to [flink-sql-connector-postgres-cdc](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-postgres-cdc); more released versions are available in the Maven Central repository.

@@ -41,7 +41,7 @@ In order to setup the SQLServer CDC connector, the following table provides depe

```Download link is available only for stable releases.```

Download [flink-sql-connector-sqlserver-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-sqlserver-cdc/3.1.0/flink-sql-connector-sqlserver-cdc-3.1.0.jar) and put it under `<FLINK_HOME>/lib/`.
Download [flink-sql-connector-sqlserver-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-sqlserver-cdc/{{< param Version >}}/flink-sql-connector-sqlserver-cdc-{{< param Version >}}.jar) and put it under `<FLINK_HOME>/lib/`.

**Note:** Refer to [flink-sql-connector-sqlserver-cdc](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-sqlserver-cdc); more released versions are available in the Maven Central repository.

2 changes: 1 addition & 1 deletion docs/content.zh/docs/connectors/flink-sources/tidb-cdc.md
@@ -41,7 +41,7 @@ In order to setup the TiDB CDC connector, the following table provides dependenc

```Download link is available only for stable releases.```

Download [flink-sql-connector-tidb-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-tidb-cdc/3.1.0/flink-sql-connector-tidb-cdc-3.1.0.jar) and put it under `<FLINK_HOME>/lib/`.
Download [flink-sql-connector-tidb-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-tidb-cdc/{{< param Version >}}/flink-sql-connector-tidb-cdc-{{< param Version >}}.jar) and put it under `<FLINK_HOME>/lib/`.

**Note:** Refer to [flink-sql-connector-tidb-cdc](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-tidb-cdc); more released versions are available in the Maven Central repository.

@@ -113,7 +113,7 @@ volumes:

**The download links are only valid for released versions; SNAPSHOT versions need to be built locally.**

- [flink-sql-connector-mysql-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mysql-cdc/3.1.0/flink-sql-connector-mysql-cdc-3.1.0.jar)
- [flink-sql-connector-mysql-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mysql-cdc/{{< param Version >}}/flink-sql-connector-mysql-cdc-{{< param Version >}}.jar)
- [flink-shaded-hadoop-2-uber-2.7.5-10.0.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.7.5-10.0/flink-shaded-hadoop-2-uber-2.7.5-10.0.jar)
- [iceberg-flink-1.13-runtime-0.13.0-SNAPSHOT.jar](https://raw.githubusercontent.com/luoyuxia/flink-cdc-tutorial/main/flink-cdc-iceberg-demo/sql-client/lib/iceberg-flink-1.13-runtime-0.13.0-SNAPSHOT.jar)

@@ -100,8 +100,8 @@ docker-compose up -d

**The download links are only valid for released versions; SNAPSHOT versions need to be built locally.**
- [flink-sql-connector-elasticsearch7-3.0.1-1.17.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/3.0.1-1.17/flink-sql-connector-elasticsearch7-3.0.1-1.17.jar)
- [flink-sql-connector-mysql-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mysql-cdc/3.1.0/flink-sql-connector-mysql-cdc-3.1.0.jar)
- [flink-sql-connector-postgres-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-postgres-cdc/3.1.0/flink-sql-connector-postgres-cdc-3.1.0.jar)
- [flink-sql-connector-mysql-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mysql-cdc/{{< param Version >}}/flink-sql-connector-mysql-cdc-{{< param Version >}}.jar)
- [flink-sql-connector-postgres-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-postgres-cdc/{{< param Version >}}/flink-sql-connector-postgres-cdc-{{< param Version >}}.jar)
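
For later reference, once these JARs are in `lib/` the tutorial's pipeline reduces to two DDL statements plus an `INSERT`. A hedged sketch of the same wiring from Java (table schemas and connection values are placeholders, not the tutorial's exact definitions):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class MySqlToElasticsearchSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Source table backed by the mysql-cdc connector (placeholder connection values).
        tEnv.executeSql(
                "CREATE TABLE orders_source (\n"
                        + "  order_id INT,\n"
                        + "  customer_name STRING,\n"
                        + "  price DECIMAL(10, 2),\n"
                        + "  PRIMARY KEY (order_id) NOT ENFORCED\n"
                        + ") WITH (\n"
                        + "  'connector' = 'mysql-cdc',\n"
                        + "  'hostname' = 'localhost',\n"
                        + "  'port' = '3306',\n"
                        + "  'username' = 'flinkuser',\n"
                        + "  'password' = 'flinkpw',\n"
                        + "  'database-name' = 'mydb',\n"
                        + "  'table-name' = 'orders'\n"
                        + ")");

        // Sink table backed by the elasticsearch-7 connector.
        tEnv.executeSql(
                "CREATE TABLE orders_sink (\n"
                        + "  order_id INT,\n"
                        + "  customer_name STRING,\n"
                        + "  price DECIMAL(10, 2),\n"
                        + "  PRIMARY KEY (order_id) NOT ENFORCED\n"
                        + ") WITH (\n"
                        + "  'connector' = 'elasticsearch-7',\n"
                        + "  'hosts' = 'http://localhost:9200',\n"
                        + "  'index' = 'enriched_orders'\n"
                        + ")");

        // Continuously replicate changes from MySQL into Elasticsearch.
        tEnv.executeSql("INSERT INTO orders_sink SELECT * FROM orders_source");
    }
}
```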

### Prepare data
#### Prepare data in the MySQL database
@@ -136,7 +136,7 @@ db.customers.insertMany([
```The download links are only valid for released versions; SNAPSHOT versions need to be built locally.```

- [flink-sql-connector-elasticsearch7-3.0.1-1.17.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/3.0.1-1.17/flink-sql-connector-elasticsearch7-3.0.1-1.17.jar)
- [flink-sql-connector-mongodb-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mongodb-cdc/3.1.0/flink-sql-connector-mongodb-cdc-3.1.0.jar)
- [flink-sql-connector-mongodb-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mongodb-cdc/{{< param Version >}}/flink-sql-connector-mongodb-cdc-{{< param Version >}}.jar)

4. Then start the Flink cluster, followed by the SQL CLI.

@@ -156,7 +156,7 @@ VALUES (default, '2020-07-30 10:08:22', 'Jark', 50.50, 102, false),
```The download links are only valid for released versions; SNAPSHOT versions need to be built locally.```

- [flink-sql-connector-elasticsearch7-3.0.1-1.17.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/3.0.1-1.17/flink-sql-connector-elasticsearch7-3.0.1-1.17.jar)
- [flink-sql-connector-oceanbase-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-oceanbase-cdc/3.1.0/flink-sql-connector-oceanbase-cdc-3.1.0.jar)
- [flink-sql-connector-oceanbase-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-oceanbase-cdc/{{< param Version >}}/flink-sql-connector-oceanbase-cdc-{{< param Version >}}.jar)

### Create tables using Flink DDL in the Flink SQL CLI

@@ -81,7 +81,7 @@ docker-compose down
*The download links are only valid for released versions; SNAPSHOT versions need to be built locally.*

- [flink-sql-connector-elasticsearch7-3.0.1-1.17.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/3.0.1-1.17/flink-sql-connector-elasticsearch7-3.0.1-1.17.jar)
- [flink-sql-connector-oracle-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-oracle-cdc/3.1.0/flink-sql-connector-oracle-cdc-3.1.0.jar)
- [flink-sql-connector-oracle-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-oracle-cdc/{{< param Version >}}/flink-sql-connector-oracle-cdc-{{< param Version >}}.jar)


**Prepare data in the Oracle database**
@@ -135,7 +135,7 @@ VALUES (default, '2020-07-30 10:08:22', 'Jark', 50.50, 102, false),
2. Download the dependencies listed below and put them into the `flink-1.17.0/lib/` directory

```The download links are only valid for released versions; SNAPSHOT versions need to be built locally.```
- For subscribing to the PolarDB-X binlog: [flink-sql-connector-mysql-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mysql-cdc/3.1.0/flink-sql-connector-mysql-cdc-3.1.0.jar)
- For subscribing to the PolarDB-X binlog: [flink-sql-connector-mysql-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-mysql-cdc/{{< param Version >}}/flink-sql-connector-mysql-cdc-{{< param Version >}}.jar)
- For writing to Elasticsearch: [flink-sql-connector-elasticsearch7-3.0.1-1.17.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/3.0.1-1.17/flink-sql-connector-elasticsearch7-3.0.1-1.17.jar)
3. Start the Flink service:
```shell
@@ -90,7 +90,7 @@ docker-compose down
```The download links are only valid for released versions; SNAPSHOT versions need to be built locally.```

- [flink-sql-connector-elasticsearch7-3.0.1-1.17.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/3.0.1-1.17/flink-sql-connector-elasticsearch7-3.0.1-1.17.jar)
- [flink-sql-connector-sqlserver-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-sqlserver-cdc/3.1.0/flink-sql-connector-sqlserver-cdc-3.1.0.jar)
- [flink-sql-connector-sqlserver-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-sqlserver-cdc/{{< param Version >}}/flink-sql-connector-sqlserver-cdc-{{< param Version >}}.jar)


**Prepare data in the SqlServer database**
@@ -143,7 +143,7 @@ docker-compose down
```The download links are only valid for released versions; SNAPSHOT versions need to be built locally.```

- [flink-sql-connector-elasticsearch7-3.0.1-1.17.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/3.0.1-1.17/flink-sql-connector-elasticsearch7-3.0.1-1.17.jar)
- [flink-sql-connector-tidb-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-tidb-cdc/3.1.0/flink-sql-connector-tidb-cdc-3.1.0.jar)
- [flink-sql-connector-tidb-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-tidb-cdc/{{< param Version >}}/flink-sql-connector-tidb-cdc-{{< param Version >}}.jar)


**Prepare data in the TiDB database**
@@ -40,7 +40,7 @@ In order to setup the Vitess CDC connector, the following table provides depende

### SQL Client JAR

Download [flink-sql-connector-vitess-cdc-3.1.0.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-vitess-cdc/3.1.0/flink-sql-connector-vitess-cdc-3.1.0.jar) and put it under `<FLINK_HOME>/lib/`.
Download [flink-sql-connector-vitess-cdc-{{< param Version >}}.jar](https://repo1.maven.org/maven2/org/apache/flink/flink-sql-connector-vitess-cdc/{{< param Version >}}/flink-sql-connector-vitess-cdc-{{< param Version >}}.jar) and put it under `<FLINK_HOME>/lib/`.

**Note:** Refer to
[flink-sql-connector-vitess-cdc](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-vitess-cdc),
26 changes: 13 additions & 13 deletions docs/content.zh/docs/deployment/kubernetes.md
@@ -159,9 +159,9 @@ Job Description: Sync MySQL Database to Doris
Assuming your Docker build directory is `/opt/docker/flink-cdc`, the file structure under that directory is as follows:
```text
/opt/docker/flink-cdc
├── flink-cdc-3.1.0-bin.tar.gz
├── flink-cdc-pipeline-connector-doris-3.1.0.jar
├── flink-cdc-pipeline-connector-mysql-3.1.0.jar
├── flink-cdc-{{< param Version >}}-bin.tar.gz
├── flink-cdc-pipeline-connector-doris-{{< param Version >}}.jar
├── flink-cdc-pipeline-connector-mysql-{{< param Version >}}.jar
├── mysql-connector-java-8.0.27.jar
└── ...
```
@@ -170,23 +170,23 @@ Job Description: Sync MySQL Database to Doris
FROM flink:1.18.0-java8
ADD *.jar $FLINK_HOME/lib/
ADD flink-cdc*.tar.gz $FLINK_HOME/
RUN mv $FLINK_HOME/flink-cdc-3.1.0/lib/flink-cdc-dist-3.1.0.jar $FLINK_HOME/lib/
RUN mv $FLINK_HOME/flink-cdc-{{< param Version >}}/lib/flink-cdc-dist-{{< param Version >}}.jar $FLINK_HOME/lib/
```
The final layout of the Docker image build directory is as follows:
```text
/opt/docker/flink-cdc
├── Dockerfile
├── flink-cdc-3.1.0-bin.tar.gz
├── flink-cdc-pipeline-connector-doris-3.1.0.jar
├── flink-cdc-pipeline-connector-mysql-3.1.0.jar
├── flink-cdc-{{< param Version >}}-bin.tar.gz
├── flink-cdc-pipeline-connector-doris-{{< param Version >}}.jar
├── flink-cdc-pipeline-connector-mysql-{{< param Version >}}.jar
├── mysql-connector-java-8.0.27.jar
└── ...
```
3. Build the custom image and push it to your registry:
```bash
docker build -t flink-cdc-pipeline:3.1.0 .
docker build -t flink-cdc-pipeline:{{< param Version >}} .

docker push flink-cdc-pipeline:3.1.0
docker push flink-cdc-pipeline:{{< param Version >}}
```

### Create a ConfigMap for mounting the Flink CDC configuration files
@@ -237,14 +237,14 @@ spec:
state.checkpoints.dir: 'file:///tmp/checkpoints'
state.savepoints.dir: 'file:///tmp/savepoints'
flinkVersion: v1_18
image: 'flink-cdc-pipeline:3.1.0'
image: 'flink-cdc-pipeline:{{< param Version >}}'
imagePullPolicy: Always
job:
args:
- '--use-mini-cluster'
- /opt/flink/flink-cdc-3.1.0/conf/mysql-to-doris.yaml
- /opt/flink/flink-cdc-{{< param Version >}}/conf/mysql-to-doris.yaml
entryClass: org.apache.flink.cdc.cli.CliFrontend
jarURI: 'local:///opt/flink/flink-cdc-3.1.0/lib/flink-cdc-dist-3.1.0.jar'
jarURI: 'local:///opt/flink/flink-cdc-{{< param Version >}}/lib/flink-cdc-dist-{{< param Version >}}.jar'
parallelism: 1
state: running
upgradeMode: savepoint
@@ -261,7 +261,7 @@ spec:
# don't modify this name
- name: flink-main-container
volumeMounts:
- mountPath: /opt/flink/flink-cdc-3.1.0/conf
- mountPath: /opt/flink/flink-cdc-{{< param Version >}}/conf
name: flink-cdc-pipeline-config
volumes:
- configMap: