
Commit d8dc341

Authored by Jill Grant

Merge pull request #270478 from sreekzz/patch-20

Updated flink as 1.17

2 parents 627d59f + b565243 · commit d8dc341

File tree

5 files changed: +82 −34 lines changed


articles/hdinsight-aks/flink/flink-catalog-iceberg-hive.md

Lines changed: 7 additions & 5 deletions
@@ -3,7 +3,7 @@ title: Table API and SQL - Use Iceberg Catalog type with Hive in Apache Flink®
 description: Learn how to create Iceberg Catalog in Apache Flink® on HDInsight on AKS
 ms.service: hdinsight-aks
 ms.topic: how-to
-ms.date: 10/27/2023
+ms.date: 3/28/2024
 ---
 
 # Create Iceberg Catalog in Apache Flink® on HDInsight on AKS
@@ -23,8 +23,10 @@ In this article, we learn how to use Iceberg Table managed in Hive catalog, with
 Once you launch the Secure Shell (SSH), let us start downloading the dependencies required to the SSH node, to illustrate the Iceberg table managed in Hive catalog.
 
 ```
-wget https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-flink-runtime-1.16/1.3.0/iceberg-flink-runtime-1.16-1.3.0.jar -P $FLINK_HOME/lib
+wget https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-flink-runtime-1.17/1.4.0/iceberg-flink-runtime-1.17-1.4.0.jar -P $FLINK_HOME/lib
 wget https://repo1.maven.org/maven2/org/apache/parquet/parquet-column/1.12.2/parquet-column-1.12.2.jar -P $FLINK_HOME/lib
+wget https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs-client/3.3.4/hadoop-hdfs-client-3.3.4.jar -P $FLINK_HOME
+export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$FLINK_HOME/hadoop-hdfs-client-3.3.4.jar
 ```
 
 ## Start the Apache Flink SQL Client
@@ -43,7 +45,7 @@ With the following steps, we illustrate how you can create Flink-Iceberg Catalog
     'uri'='thrift://hive-metastore:9083',
     'clients'='5',
     'property-version'='1',
-    'warehouse'='abfs://container@storage_account.dfs.core.windows.net/ieberg-output');
+    'warehouse'='abfs://container@storage_account.dfs.core.windows.net/iceberg-output');
 ```
 > [!NOTE]
 > - In the above step, the container and storage account *need not be same* as specified during the cluster creation.
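
The `warehouse` fragment above is the tail of a longer `CREATE CATALOG` statement that this hunk does not show in full. As a hedged sketch, assuming the catalog name `hive_catalog` and the `'type'`/`'catalog-type'` lines (neither is visible in this diff), the complete statement would look roughly like:

```sql
-- Sketch only: the catalog name and the 'type'/'catalog-type' properties are
-- assumptions; only the last four properties appear in the hunk above.
CREATE CATALOG hive_catalog WITH (
    'type'='iceberg',
    'catalog-type'='hive',
    'uri'='thrift://hive-metastore:9083',
    'clients'='5',
    'property-version'='1',
    'warehouse'='abfs://container@storage_account.dfs.core.windows.net/iceberg-output');
```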
@@ -56,8 +58,8 @@ With the following steps, we illustrate how you can create Flink-Iceberg Catalog
 #### Add dependencies to server classpath
 
 ```sql
-ADD JAR '/opt/flink-webssh/lib/iceberg-flink-runtime-1.16-1.3.0.jar';
-ADD JAR '/opt/flink-webssh/lib/parquet-column-1.12.2.jar';
+ADD JAR '/opt/flink-webssh/lib/iceberg-flink-runtime-1.17-1.4.0.jar';
+ADD JAR '/opt/flink-webssh/lib/parquet-column-1.12.2.jar';
 ```
 #### Create Database

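The body of the "Create Database" step falls outside this hunk. A minimal sketch of what such a step typically looks like in the Flink SQL client, with purely illustrative names:

```sql
-- Hypothetical names; the actual catalog and database names are not shown in this diff.
USE CATALOG hive_catalog;
CREATE DATABASE IF NOT EXISTS iceberg_db;
USE iceberg_db;
```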
articles/hdinsight-aks/flink/monitor-changes-postgres-table-flink.md

Lines changed: 72 additions & 26 deletions
@@ -3,7 +3,7 @@ title: Change Data Capture (CDC) of PostgreSQL table using Apache Flink®
 description: Learn how to perform CDC on PostgreSQL table using Apache Flink®
 ms.service: hdinsight-aks
 ms.topic: how-to
-ms.date: 10/27/2023
+ms.date: 03/28/2024
 ---
 
 # Change Data Capture (CDC) of PostgreSQL table using Apache Flink®
@@ -12,7 +12,7 @@ ms.date: 10/27/2023
 
 Change Data Capture (CDC) is a technique you can use to track row-level changes in database tables in response to create, update, and delete operations. In this article, we use [CDC Connectors for Apache Flink®](https://github.com/ververica/flink-cdc-connectors), which offer a set of source connectors for Apache Flink. The connectors integrate [Debezium®](https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/formats/debezium/#debezium-format) as the engine to capture the data changes.
 
-Flink supports to interpret Debezium JSON and Avro messages as INSERT/UPDATE/DELETE messages into Apache Flink SQL system.
+Flink supports interpreting Debezium JSON and Avro messages as INSERT/UPDATE/DELETE messages in the Apache Flink SQL system.
 
 This support is useful in many cases to:
 
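To make the Debezium-format support above concrete before the next hunk: a hedged sketch of a Flink SQL table that interprets Debezium JSON change events from a Kafka topic as INSERT/UPDATE/DELETE rows. The topic, broker address, and column names are invented for illustration:

```sql
-- Illustrative only: topic, broker, and schema are assumptions,
-- not taken from this commit.
CREATE TABLE shipments_from_debezium (
    shipment_id INT,
    order_id INT,
    origin STRING,
    destination STRING,
    is_arrived BOOLEAN
) WITH (
    'connector' = 'kafka',
    'topic' = 'server1.public.shipments',
    'properties.bootstrap.servers' = 'kafka-broker:9092',
    'scan.startup.mode' = 'earliest-offset',
    'format' = 'debezium-json'
);
```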
@@ -91,7 +91,7 @@ Now, let's learn how to monitor changes on PostgreSQL table using Flink-SQL CDC.
     <dependency>
         <groupId>com.ververica</groupId>
         <artifactId>flink-sql-connector-postgres-cdc</artifactId>
-        <version>2.3.0</version>
+        <version>2.4.2</version>
     </dependency>
 </dependencies>
 </project>
@@ -113,41 +113,87 @@ Now, let's learn how to monitor changes on PostgreSQL table using Flink-SQL CDC.
 
 ```sql
 /opt/flink-webssh/bin/sql-client.sh -j
-/opt/flink-webssh/target/flink-sql-connector-postgres-cdc-2.3.0.jar -j
+/opt/flink-webssh/target/flink-sql-connector-postgres-cdc-2.4.2.jar -j
 /opt/flink-webssh/target/slf4j-api-1.7.15.jar -j
 /opt/flink-webssh/target/hamcrest-2.1.jar -j
-/opt/flink-webssh/target/flink-shaded-guava-30.1.1-jre-16.0.jar -j
+/opt/flink-webssh/target/flink-shaded-guava-31.1-jre-17.0.jar -j
 /opt/flink-webssh/target/awaitility-4.0.1.jar -j
 /opt/flink-webssh/target/jsr308-all-1.1.2.jar
 ```
 These commands start the sql client with the dependencies as follows:
 
-:::image type="content" source="./media/monitor-changes-postgres-table-flink/start-the-sql-client.png" alt-text="Screenshot showing start-the-sql-client." border="true" lightbox="./media/monitor-changes-postgres-table-flink/start-the-sql-client.png":::
-
-:::image type="content" source="./media/monitor-changes-postgres-table-flink/sql-client-status.png" alt-text="Screenshot showing sql-client-status." border="true" lightbox="./media/monitor-changes-postgres-table-flink/sql-client-status.png":::
-
+```
+user@sshnode-0 [ ~ ]$ bin/sql-client.sh -j flink-sql-connector-postgres-cdc-2.4.2.jar -j slf4j-api-1.7.15.jar -j hamcrest-2.1.jar -j flink-shaded-guava-31.1-jre-17.0.jar -j awaitility-4.0.1.jar -j jsr308-all-1.1.2.jar
+
+   (Apache Flink squirrel logo and "Flink SQL Client - BETA" ASCII-art banner)
+
+Welcome! Enter 'HELP;' to list all available commands. 'QUIT;' to exit.
+
+Command history file path: /home/xcao/.flink-sql-history
+
+Flink SQL>
+```
 
 - Create a Flink PostgreSQL CDC table using CDC connector
 
 ```
 CREATE TABLE shipments (
-   shipment_id INT,
-   order_id INT,
-   origin STRING,
-   destination STRING,
-   is_arrived BOOLEAN,
-   PRIMARY KEY (shipment_id) NOT ENFORCED
-) WITH (
-   'connector' = 'postgres-cdc',
-   'hostname' = 'flinkpostgres.postgres.database.azure.com',
-   'port' = '5432',
-   'username' = 'username',
-   'password' = 'admin',
-   'database-name' = 'postgres',
-   'schema-name' = 'public',
-   'table-name' = 'shipments',
-   'decoding.plugin.name' = 'pgoutput'
-);
+   shipment_id INT,
+   order_id INT,
+   origin STRING,
+   destination STRING,
+   is_arrived BOOLEAN,
+   PRIMARY KEY (shipment_id) NOT ENFORCED
+) WITH (
+   'connector' = 'postgres-cdc',
+   'hostname' = 'flinkpostgres.postgres.database.azure.com',
+   'port' = '5432',
+   'username' = 'username',
+   ....
+   'database-name' = 'postgres',
+   'schema-name' = 'public',
+   'table-name' = 'shipments',
+   'decoding.plugin.name' = 'pgoutput',
+   'slot.name' = 'flink'
+);
 ```
 ## Validation
 
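The body of the Validation section is not part of this diff. As a hedged sketch, validating a CDC table like `shipments` typically amounts to running a continuous query in the Flink SQL client and watching inserts, updates, and deletes made in PostgreSQL appear as changelog rows:

```sql
-- Sketch: a continuous query over the CDC table defined above.
SELECT * FROM shipments;
```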
articles/hdinsight-aks/flink/process-and-consume-data.md

Lines changed: 3 additions & 3 deletions
@@ -3,7 +3,7 @@ title: Using Apache Kafka® on HDInsight with Apache Flink® on HDInsight on AKS
 description: Learn how to use Apache Kafka® on HDInsight with Apache Flink® on HDInsight on AKS
 ms.service: hdinsight-aks
 ms.topic: how-to
-ms.date: 10/27/2023
+ms.date: 03/28/2024
 ---
 
 # Using Apache Kafka® on HDInsight with Apache Flink® on HDInsight on AKS
@@ -12,7 +12,7 @@ ms.date: 10/27/2023
 
 A well-known use case for Apache Flink is stream analytics. A popular choice for many users is to consume data streams that are ingested using Apache Kafka. Typical installations of Flink and Kafka start with event streams being pushed to Kafka, which can be consumed by Flink jobs.
 
-This example uses HDInsight on AKS clusters running Flink 1.16.0 to process streaming data consuming and producing Kafka topic.
+This example uses HDInsight on AKS clusters running Flink 1.17.0 to process streaming data, consuming and producing a Kafka topic.
 
 > [!NOTE]
 > FlinkKafkaConsumer is deprecated and will be removed with Flink 1.17; please use KafkaSource instead.
@@ -39,7 +39,7 @@ Flink provides an [Apache Kafka Connector](https://nightlies.apache.org/flink/fl
 <dependency>
     <groupId>org.apache.flink</groupId>
     <artifactId>flink-connector-kafka</artifactId>
-    <version>1.16.0</version>
+    <version>1.17.0</version>
 </dependency>
 ```

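The dependency above targets the DataStream Kafka connector. As a hedged illustration of the consume-and-produce pattern the article describes, the same topics can also be read from Flink SQL with the Kafka table connector (shipped separately as `flink-sql-connector-kafka`); every name and address below is invented:

```sql
-- Illustrative sketch only: topic, broker, group id, and schema are assumptions.
CREATE TABLE click_events (
    user_id STRING,
    url STRING,
    ts TIMESTAMP(3)
) WITH (
    'connector' = 'kafka',
    'topic' = 'click_events',
    'properties.bootstrap.servers' = 'kafka-broker:9092',
    'properties.group.id' = 'flink-demo-group',
    'scan.startup.mode' = 'earliest-offset',
    'format' = 'json'
);
```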