articles/hdinsight/kafka/apache-kafka-streams-api.md (20 additions, 21 deletions)
@@ -1,13 +1,13 @@
 ---
 title: 'Tutorial: Use the Apache Kafka Streams API - Azure HDInsight '
 description: Tutorial - Learn how to use the Apache Kafka Streams API with Kafka on HDInsight. This API enables you to perform stream processing between topics in Kafka.
-ms.service: hdinsight
 author: hrasheed-msft
 ms.author: hrasheed
 ms.reviewer: jasonh
+ms.service: hdinsight
 ms.custom: hdinsightactive
 ms.topic: tutorial
-ms.date: 06/25/2019
+ms.date: 10/08/2019
 #Customer intent: As a developer, I need to create an application that uses the Kafka streams API with Kafka on HDInsight
 ---
@@ -57,9 +57,9 @@ The important things to understand in the `pom.xml` file are:
     ```xml
     <!-- Kafka client for producer/consumer operations -->
     <dependency>
-        <groupId>org.apache.kafka</groupId>
-        <artifactId>kafka-clients</artifactId>
-        <version>${kafka.version}</version>
+        <groupId>org.apache.kafka</groupId>
+        <artifactId>kafka-clients</artifactId>
+        <version>${kafka.version}</version>
     </dependency>
     ```
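If you want to confirm which Kafka client version the project actually resolves, a quick check from the project root is shown below. This is a sketch that assumes Maven is on the path and that `kafka.version` is defined as a property in the `pom.xml`, which the surrounding text implies but doesn't show here.

```bash
# List the resolved dependency tree and filter for the Kafka client artifact.
# The reported version should match the kafka.version property in pom.xml.
mvn dependency:tree | grep kafka-clients
```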
@@ -68,7 +68,7 @@
 * Plugins: Maven plugins provide various capabilities. In this project, the following plugins are used:

     * `maven-compiler-plugin`: Used to set the Java version used by the project to 8. Java 8 is required by HDInsight 3.6.
-    * `maven-shade-plugin`: Used to generate an uber jar that contains this application as well as any dependencies. It is also used to set the entry point of the application, so that you can directly run the Jar file without having to specify the main class.
+    * `maven-shade-plugin`: Used to generate an uber jar that contains this application as well as any dependencies. It's also used to set the entry point of the application, so that you can directly run the Jar file without having to specify the main class.

 ### Stream.java
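Because the shade plugin writes the entry point into the jar's manifest, the uber jar can be built and run directly. A minimal sketch, assuming a standard Maven layout; the jar file name below is illustrative, not taken from this article.

```bash
# Build the project; maven-shade-plugin bundles the application and its
# dependencies into a single uber jar under target/.
mvn clean package

# The shade plugin sets Main-Class in the manifest, so no main class is needed
# on the command line. Substitute the jar name your build actually produces.
java -jar target/kafka-streaming-1.0-SNAPSHOT.jar
```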
@@ -155,32 +155,31 @@ To build and deploy the project to your Kafka on HDInsight cluster, use the foll
        sudo apt -y install jq
        ```

-3. Set up environment variables. Replace `PASSWORD` and `CLUSTERNAME` with the cluster login password and cluster name respectively, then enter the command:
+3. Set up password variable. Replace `PASSWORD` with the cluster login password, then enter the command:

     ```bash
     export password='PASSWORD'
-    export clusterNameA='CLUSTERNAME'
     ```

-4. Extract correctly cased cluster name. The actual casing of the cluster name may be different than you expect, depending on how the cluster was created. This command will obtain the actual casing, store it in a variable, and then display the correctly cased name, and the name you provided earlier. Enter the following command:
-
+4. Extract correctly cased cluster name. The actual casing of the cluster name may be different than you expect, depending on how the cluster was created. This command will obtain the actual casing, and then store it in a variable. Enter the following command:
+   > If you're doing this process from outside the cluster, there is a different procedure for storing the cluster name. Get the cluster name in lower case from the Azure portal. Then, substitute the cluster name for `<clustername>` in the following command and execute it: `export clusterName='<clustername>'`.
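For step 4, one way to obtain the correctly cased name from an SSH session on the cluster is to query the local Ambari REST API. The following is a sketch rather than the article's exact command: the `headnodehost` endpoint and the `jq` filter are assumptions based on the standard Ambari API.

```bash
# Ask the local Ambari REST API for the cluster entry and extract cluster_name,
# which carries the correct casing. Uses the $password variable from step 3.
export clusterName=$(curl -u admin:$password -sS -G "http://headnodehost:8080/api/v1/clusters" \
  | jq -r '.items[].Clusters.cluster_name')
echo $clusterName
```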
 5. To get the Kafka broker hosts and the Apache Zookeeper hosts, use the following commands. When prompted, enter the password for the cluster login (admin) account. You are prompted for the password twice.
+   > These commands require Ambari access. If your cluster is behind an NSG, run these commands from a machine that can access Ambari.
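The commands referenced in step 5 typically follow the pattern below in the HDInsight Kafka tutorials: query Ambari for the service's host components, append the service port, and keep two hosts. This is a hedged sketch; the host names, ports, and `jq` filters are assumptions, so prefer the article's exact commands.

```bash
# Query Ambari over the cluster's public endpoint for the Zookeeper and Kafka
# broker hosts. curl prompts for the admin password once per command, which is
# why you are prompted twice.
export KAFKAZKHOSTS=$(curl -sS -u admin -G "https://$clusterName.azurehdinsight.net/api/v1/clusters/$clusterName/services/ZOOKEEPER/components/ZOOKEEPER_SERVER" \
  | jq -r '["\(.host_components[].HostRoles.host_name):2181"] | join(",")' | cut -d',' -f1,2)

export KAFKABROKERS=$(curl -sS -u admin -G "https://$clusterName.azurehdinsight.net/api/v1/clusters/$clusterName/services/KAFKA/components/KAFKA_BROKER" \
  | jq -r '["\(.host_components[].HostRoles.host_name):9092"] | join(",")' | cut -d',' -f1,2)
```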
 6. To create the topics used by the streaming operation, use the following commands:

    > [!NOTE]
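The topic-creation commands in step 6 have roughly the following shape on an HDInsight 3.6 cluster. The topic names, partition count, and replication factor below are illustrative placeholders rather than the article's values.

```bash
# kafka-topics.sh ships with the Kafka broker installation on HDInsight.
# The streaming job reads from an input topic and writes word counts to an
# output topic; names and settings here are placeholders.
/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create --topic test \
  --zookeeper $KAFKAZKHOSTS --partitions 8 --replication-factor 3
/usr/hdp/current/kafka-broker/bin/kafka-topics.sh --create --topic wordcounts \
  --zookeeper $KAFKAZKHOSTS --partitions 8 --replication-factor 3
```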
@@ -227,7 +226,7 @@
 The `--property` parameters tell the console consumer to print the key (word) along with the count (value). This parameter also configures the deserializer to use when reading these values from Kafka.

 The output is similar to the following text:
-
+
     dwarfs 13635
     ago 13664
     snow 13636
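A console-consumer invocation matching the description of the `--property` parameters looks like the sketch below. The topic name and broker variable are carried over from the earlier steps as assumptions; the key is printed as a String while the count is deserialized as a Long.

```bash
# Print each record's key (the word) alongside its value (the running count).
# Counts produced by the Streams job are Longs, so a Long deserializer is used.
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh \
  --bootstrap-server $KAFKABROKERS --topic wordcounts --from-beginning \
  --formatter kafka.tools.DefaultMessageFormatter \
  --property print.key=true \
  --property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
  --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
```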
@@ -269,4 +268,4 @@ To remove the resource group using the Azure portal:
 In this document, you learned how to use the Apache Kafka Streams API with Kafka on HDInsight. Use the following to learn more about working with Kafka.