articles/hdinsight/kafka/apache-kafka-ssl-encryption-authentication.md (+22 −11)
@@ -17,8 +17,9 @@ This article shows you how to set up Transport Layer Security (TLS) encryption,
 > [!Important]
 > There are two clients which you can use for Kafka applications: a Java client and a console client. Only the Java client `ProducerConsumer.java` can use TLS for both producing and consuming. The console producer client `console-producer.sh` does not work with TLS.
 
-> [!Note]
+> [!Note]
 > HDInsight Kafka console producer with version 1.1 does not support SSL.
+
 ## Apache Kafka broker setup
 
 The Kafka TLS broker setup will use four HDInsight cluster VMs in the following way:
@@ -131,7 +132,7 @@ To complete the configuration modification, do the following steps:
 
-1. Add new configuration properties to the server.properties file.
+1. For HDI version 3.6, go to the Ambari UI and add the following configurations under **Advanced kafka-env** and the **kafka-env template** property.
 
     ```bash
    # Configure Kafka to advertise IP addresses instead of FQDN
    ```
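The hunk above truncates the kafka-env template after its first comment line. The following is a hedged sketch of the kind of additions that step describes; the property names, paths, and passwords are illustrative assumptions (not the article's exact template), and the sketch writes to a local example file rather than the live `server.properties`:

```shell
# Sketch only: append TLS listener and keystore settings to an example copy of
# server.properties. All values below are assumptions for illustration.
CONF=server.properties.example
IP_ADDRESS=$(hostname -i 2>/dev/null || echo 10.0.0.4)

# Configure Kafka to advertise IP addresses instead of FQDN
{
  echo "advertised.listeners=PLAINTEXT://${IP_ADDRESS}:9092,SSL://${IP_ADDRESS}:9093"
  echo "ssl.keystore.location=/home/sshuser/ssl/kafka.server.keystore.jks"
  echo "ssl.keystore.password=MyServerPassword123"
  echo "ssl.key.password=MyServerPassword123"
  echo "ssl.truststore.location=/home/sshuser/ssl/kafka.server.truststore.jks"
  echo "ssl.truststore.password=MyServerPassword123"
} > "$CONF"

# Count the ssl.* properties that were written
grep -c '^ssl\.' "$CONF"
```

On a real cluster these lines would go into the kafka-env template via Ambari so they are applied to every broker's `server.properties`.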
@@ -146,18 +147,17 @@ To complete the configuration modification, do the following steps:
+1. Start the admin client with producer and consumer options to verify that both producers and consumers are working on port 9093. Refer to the [Verification](apache-kafka-ssl-encryption-authentication.md#verification) section below for the steps needed to verify the setup using the console producer/consumer.
+
 ## Client setup (with authentication)
 
 > [!Note]
@@ -273,17 +275,24 @@ The details of each step are given below.
    scp ca-cert sshuser@HeadNode1_Name:~/ssl/ca-cert
    ```
 
-1. Create client store with signed cert, and import ca cert into the keystore and truststore:
+1. Sign in to the client machine (standby head node) and navigate to the ssl directory.
-1. Create a file `client-ssl-auth.properties`. It should have the following lines:
+1. Create a file `client-ssl-auth.properties` on the client machine (hn1). It should have the following lines:
 
    ```bash
    security.protocol=SSL
    ```
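The excerpt cuts the `client-ssl-auth.properties` file off after its first line. A plausible completion is sketched below, assuming the client keystore/truststore file names and the placeholder password used elsewhere in this kind of setup; your actual paths and passwords will differ:

```bash
security.protocol=SSL
ssl.truststore.location=/home/sshuser/ssl/kafka.client.truststore.jks
ssl.truststore.password=MyClientPassword123
ssl.keystore.location=/home/sshuser/ssl/kafka.client.keystore.jks
ssl.keystore.password=MyClientPassword123
ssl.key.password=MyClientPassword123
```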
@@ -296,6 +305,8 @@ The details of each step are given below.
 
 ## Verification
 
+Run these steps on the client machine.
+
 > [!Note]
 > If HDInsight 4.0 and Kafka 2.1 are installed, you can use the console producer/consumer to verify your setup. If not, run the Kafka producer on port 9092 and send messages to the topic, and then use the Kafka consumer on port 9093, which uses TLS.
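The verification commands themselves fall outside this excerpt. A hedged sketch of what console-based verification over TLS might look like is below, written to a helper script so the broker endpoint (`wn0-kafka:9093` and `testtopic` are placeholders, not values from the article) can be substituted before running on the client head node:

```shell
# Sketch only: write the two TLS verification commands to a helper script.
# wn0-kafka:9093 and testtopic are assumed placeholders; substitute your own
# broker endpoint and topic before running this on the client machine.
cat > verify-tls.sh <<'EOF'
#!/bin/bash
KAFKABROKERS=wn0-kafka:9093

# Produce over TLS using the client properties file created earlier
kafka-console-producer.sh --broker-list "$KAFKABROKERS" --topic testtopic \
    --producer.config ~/ssl/client-ssl-auth.properties

# Consume the same topic over TLS
kafka-console-consumer.sh --bootstrap-server "$KAFKABROKERS" --topic testtopic \
    --consumer.config ~/ssl/client-ssl-auth.properties --from-beginning
EOF
chmod +x verify-tls.sh
```

If messages typed into the producer appear in the consumer's output, encryption on port 9093 is working end to end.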