Commit 4ac24b2

Merge pull request #269911 from sreekzz/patch-15

Flink updates Batch 2

2 parents: 670a9e1 + a5eb558

16 files changed: +250 / -88 lines
articles/hdinsight-aks/flink/change-data-capture-connectors-for-apache-flink.md

Lines changed: 17 additions & 4 deletions
@@ -3,7 +3,7 @@ title: How to perform Change Data Capture of SQL Server with Apache Flink® Data
 description: Learn how to perform Change Data Capture of SQL Server with Apache Flink® DataStream API and DataStream Source.
 ms.service: hdinsight-aks
 ms.topic: how-to
-ms.date: 08/29/2023
+ms.date: 03/22/2024
 ---

 # Change Data Capture of SQL Server with Apache Flink® DataStream API and DataStream Source on HDInsight on AKS
@@ -155,10 +155,10 @@ In the below snippet, we use HDInsight Kafka 2.4.1. Based on your usage, update
 <properties>
 <maven.compiler.source>1.8</maven.compiler.source>
 <maven.compiler.target>1.8</maven.compiler.target>
-<flink.version>1.16.0</flink.version>
+<flink.version>1.17.0</flink.version>
 <java.version>1.8</java.version>
 <scala.binary.version>2.12</scala.binary.version>
-<kafka.version>2.4.1</kafka.version> // Replace with 3.2 if you're using HDInsight Kafka 3.2
+<kafka.version>3.2.0</kafka.version> // Replace with 3.2 if you're using HDInsight Kafka 3.2
 </properties>
 <dependencies>
 <dependency>
@@ -284,6 +284,19 @@ public class mssqlSinkToKafka {
 }
 ```

+### Submit job to Flink
+
+* On Webssh pod
+
+```
+bin/flink run -c contoso.example.mssqlSinkToKafka -j FlinkSQLServerCDCDemo-1.0-SNAPSHOT.jar
+Job has been submitted with JobID abccf644ae13a8028d7e232b85bd507f
+```
+* On Flink UI make the following change.
+
+:::image type="content" source="./media/change-data-capture-connectors-for-apache-flink/flink-ui.png" alt-text="Screenshot showing the Flink UI.":::
+
 ### Validation

 - Insert four rows into table order on sqlserver, then check on Kafka
@@ -306,7 +319,7 @@ public class mssqlSinkToKafka {

 :::image type="content" source="./media/change-data-capture-connectors-for-apache-flink/check-changes-on-kafka-for-id-107.png" alt-text="Screenshot showing changes in Kafka for updated ID 107.":::

-- Delete `product_id=107` on sqlserver
+- Delete `product_id=107` on

 :::image type="content" source="./media/change-data-capture-connectors-for-apache-flink/delete-product-id-107-on-sql-server.png" alt-text="Screenshot showing how to delete product ID 107.":::
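Editor's aside on the validation steps in this hunk: the insert, update, and delete checks read change events from a Kafka topic, and Flink's SQL Server CDC connector serializes them in the Debezium envelope, where an `op` field marks the operation and `before`/`after` carry the row images. The sketch below is illustrative only and not part of the commit; the envelope shape is Debezium's documented format, while the field values are made up.

```python
import json

# Map Debezium operation codes to readable names.
OPS = {"c": "insert", "u": "update", "d": "delete", "r": "snapshot read"}

def classify(raw: str) -> str:
    """Classify a Debezium-style CDC event and report the affected key."""
    event = json.loads(raw)
    kind = OPS.get(event.get("op"), "unknown")
    # Deletes carry the old row in `before` and a null `after`;
    # inserts/updates carry the new row in `after`.
    row = event.get("after") or event.get("before") or {}
    return f"{kind}: product_id={row.get('product_id')}"

# Example: a delete event for product_id 107, as in the article's last check.
delete_event = '{"op": "d", "before": {"product_id": 107}, "after": null}'
print(classify(delete_event))  # delete: product_id=107
```

A consumer-side check like this is one way to confirm that each of the four inserts, the update, and the delete performed on SQL Server arrived on Kafka as the expected event kind.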

articles/hdinsight-aks/flink/datastream-api-mongodb.md

Lines changed: 10 additions & 10 deletions
@@ -1,9 +1,9 @@
 ---
 title: Use DataStream API for MongoDB as a source and sink with Apache Flink®
-description: Learn how to use Apache Flink® DataStream API on HDInsight on AKS for MongoDB as a source and sink
+description: Learn how to use Apache Flink® DataStream API on HDInsight on AKS for MongoDB as a source and sink.
 ms.service: hdinsight-aks
 ms.topic: how-to
-ms.date: 10/30/2023
+ms.date: 03/22/2024
 ---

 # Use Apache Flink® DataStream API on HDInsight on AKS for MongoDB as a source and sink
@@ -12,24 +12,24 @@ ms.date: 10/30/2023

 Apache Flink provides a MongoDB connector for reading and writing data from and to MongoDB collections with at-least-once guarantees.

-This example demonstrates on how to use Apache Flink 1.16.0 on HDInsight on AKS along with your existing MongoDB as Sink and Source with Flink DataStream API MongoDB connector.
+This example demonstrates on how to use Apache Flink 1.17.0 on HDInsight on AKS along with your existing MongoDB as Sink and Source with Flink DataStream API MongoDB connector.

-MongoDB is a non-relational document database that provides support for JSON-like storage that helps store complex structures easily.
+MongoDB is a nonrelational document database that provides support for JSON-like storage that helps store complex structures easily.

 In this example, you learn how to use MongoDB to source and sink with DataStream API.

 ## Prerequisites

-* [Flink cluster 1.16.0 on HDInsight on AKS](../flink/flink-create-cluster-portal.md)
+* [HDInsight on AKS - Flink 1.17.0 Cluster](../flink/flink-create-cluster-portal.md)
 * For this demonstration, use a Window VM as maven project develop env in the same VNET as HDInsight on AKS.
 * We use the [MongoDB Connector](https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/datastream/mongodb/)
 * For this demonstration, use an Ubuntu VM in the same VNET as HDInsight on AKS, install a MongoDB on this VM.

 ## Installation of MongoDB on Ubuntu VM

-[Install MongoDB on Ubuntu](https://www.mongodb.com/docs/manual/tutorial/install-mongodb-on-ubuntu/)
+[Install MongoDB on Ubuntu](https://www.mongodb.com/docs/manual/tutorial/install-mongodb-on-ubuntu/).

-[MongoDB Shell commands](https://www.mongodb.com/docs/mongodb-shell/run-commands/)
+[MongoDB Shell commands](https://www.mongodb.com/docs/mongodb-shell/run-commands/).

 **Prepare MongoDB environment**:
 ```
@@ -97,7 +97,7 @@ net:

 ## Get started

-### Create a maven project on IdeaJ, to prepare the pom.xml for MongoDB Collection
+### Create a maven project on IdeaJ to prepare the pom.xml for MongoDB Collection

 ``` xml
 <?xml version="1.0" encoding="UTF-8"?>
@@ -300,7 +300,7 @@ public class Event {
 ```
 ### Use MongoDB as a source and sink to ADLS Gen2

-Write a program for MongoDB as a source and sink to ADLS Gen2
+Write a program for MongoDB as a source and sink to ADLS Gen2.

 **MongoDBSourceDemo.java**
 ``` java
@@ -373,7 +373,7 @@ public class MongoDBSourceDemo {
 ```
 ### Package the maven jar, and submit to Apache Flink UI

-Package the maven jar, upload it to Storage and then wget it to [Flink CLI](./flink-web-ssh-on-portal-to-flink-sql.md) or directly upload to Flink UI to run.
+Package the maven jar, and upload it to Storage and then wget it to [Flink CLI](./flink-web-ssh-on-portal-to-flink-sql.md) or directly upload to Flink UI to run.

 :::image type="content" source="./media/datastream-api-mongodb/step-3-1-maven-jar-upload-abfs.png" alt-text="Screenshot displays how to upload package to storage." border="true" lightbox="./media/datastream-api-mongodb/step-3-1-maven-jar-upload-abfs.png":::
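Editor's aside: the article above notes that the MongoDB connector gives at-least-once guarantees, which means a sink may receive the same event more than once after a failure and restart. A hedged sketch of why `_id`-keyed upserts make redelivery harmless (a plain dict stands in for a MongoDB collection here; this models the idea, not the connector's actual implementation):

```python
# Redelivered events overwrite by key instead of piling up as duplicates,
# which is what makes at-least-once delivery safe for a keyed MongoDB sink.
def upsert(collection: dict, doc: dict) -> None:
    # Replace-by-key, analogous to a replaceOne(..., upsert=True) on _id.
    collection[doc["_id"]] = doc

events = [
    {"_id": 1, "user": "alice"},
    {"_id": 2, "user": "bob"},
    {"_id": 1, "user": "alice"},  # same event delivered a second time
]

store: dict = {}
for e in events:
    upsert(store, e)

print(len(store))  # 2 -- the redelivered event did not create a duplicate
```

With an append-only sink the third delivery would have produced a third record; keying writes by `_id` turns duplicates into no-op overwrites.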

articles/hdinsight-aks/flink/fabric-lakehouse-flink-datastream-api.md

Lines changed: 10 additions & 6 deletions
@@ -3,7 +3,7 @@ title: Microsoft Fabric with Apache Flink® in HDInsight on AKS
 description: An introduction to lakehouse on Microsoft Fabric with Apache Flink® on HDInsight on AKS
 ms.service: hdinsight-aks
 ms.topic: conceptual
-ms.date: 08/29/2023
+ms.date: 03/23/2024
 ---
 # Connect to OneLake in Microsoft Fabric with HDInsight on AKS cluster for Apache Flink®

@@ -56,7 +56,7 @@ This step illustrates, that we package dependencies needed for onelakeDemo
 <properties>
 <maven.compiler.source>1.8</maven.compiler.source>
 <maven.compiler.target>1.8</maven.compiler.target>
-<flink.version>1.16.0</flink.version>
+<flink.version>1.17.0</flink.version>
 <java.version>1.8</java.version>
 <scala.binary.version>2.12</scala.binary.version>
 </properties>
@@ -202,13 +202,17 @@ public class onelakeDemo {
     }
 }
 ```
-### Package the jar and submit to Flink
+### Package the jar and upload it into Webssh and submit the job:

-Here, we use the packaged jar and submit to Flink cluster in HDInsight on AKS
+`bin/flink run -c contoso.example.onelakeDemo -j OneLakeDemo-1.0-SNAPSHOT.jar`

-:::image type="content" source="./media/fabric-lakehouse-flink-datastream-api/jar-submit-flink-step-1.png" alt-text="Screenshot showing How to submit packaged jar and submitting to Flink cluster - step 1." border="true" lightbox="./media/fabric-lakehouse-flink-datastream-api/jar-submit-flink-step-1.png":::
+:::image type="content" source="./media/fabric-lakehouse-flink-datastream-api/package-the-jar-file.png" alt-text="Screenshot showing how to package the jar file." border="true" lightbox="./media/fabric-lakehouse-flink-datastream-api/package-the-jar-file.png":::
+
+Check Job running on Flink UI:
+
+:::image type="content" source="./media/fabric-lakehouse-flink-datastream-api/check-job-runs-on-flink-ui.png" alt-text="Screenshot showing how to check job runs on Flink UI." border="true" lightbox="./media/fabric-lakehouse-flink-datastream-api/check-job-runs-on-flink-ui.png":::

-:::image type="content" source="./media/fabric-lakehouse-flink-datastream-api/jar-submit-flink-step-2.png" alt-text="Screenshot showing How to submit packaged jar and submitting to Flink cluster - step 2." border="true" lightbox="./media/fabric-lakehouse-flink-datastream-api/jar-submit-flink-step-2.png":::

 ### Results on Microsoft Fabric
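Editor's aside: the `onelakeDemo` job writes its results to a OneLake location in Microsoft Fabric. As a hedged sketch only, assuming the documented OneLake ABFS addressing scheme (`<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Files/...`) and made-up workspace and lakehouse names, the destination path can be assembled like this:

```python
# Build the OneLake ABFS path a Flink FileSink could write to.
# Workspace and lakehouse names below are hypothetical placeholders.
def onelake_files_path(workspace: str, lakehouse: str, subpath: str) -> str:
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Files/{subpath.lstrip('/')}"
    )

path = onelake_files_path("contoso-ws", "demo", "flink/output")
print(path)
# abfss://contoso-ws@onelake.dfs.fabric.microsoft.com/demo.Lakehouse/Files/flink/output
```

Keeping the workspace, lakehouse item, and `Files` segment in one helper avoids the easy mistake of dropping the `.Lakehouse/Files` portion of the URI when configuring the sink.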

Binary image files changed: 69.3 KB, 126 KB, 74.6 KB, -55.3 KB, -48.2 KB, 109 KB, 6.36 KB.
