Commit ef85801

Merge pull request #110668 from dagiro/kafka6 (kafka6)
2 parents 6762972 + 9e83ff8

1 file changed: +22 additions, −11 deletions

articles/hdinsight/kafka/apache-kafka-ssl-encryption-authentication.md
@@ -17,8 +17,9 @@ This article shows you how to set up Transport Layer Security (TLS) encryption,
 > [!Important]
 > There are two clients which you can use for Kafka applications: a Java client and a console client. Only the Java client `ProducerConsumer.java` can use TLS for both producing and consuming. The console producer client `console-producer.sh` does not work with TLS.

-> [!Note]
+> [!Note]
 > HDInsight Kafka console producer with version 1.1 does not support SSL.
+
 ## Apache Kafka broker setup

 The Kafka TLS broker setup will use four HDInsight cluster VMs in the following way:
@@ -131,7 +132,7 @@ To complete the configuration modification, do the following steps:
 ![Editing kafka ssl configuration properties in Ambari](./media/apache-kafka-ssl-encryption-authentication/editing-configuration-ambari2.png)

-1. Add new configuration properties to the server.properties file.
+1. For HDI version 3.6, go to the Ambari UI and add the following configurations under **Advanced kafka-env** and the **kafka-env template** property.

     ```bash
     # Configure Kafka to advertise IP addresses instead of FQDN
@@ -146,18 +147,17 @@ To complete the configuration modification, do the following steps:
     echo "ssl.truststore.password=MyServerPassword123" >> /usr/hdp/current/kafka-broker/conf/server.properties
     ```

-1. Go to Ambari configuration UI and verify that the new properties show up under **Advanced kafka-env** and the **kafka-env template** property.
+1. Here is a screenshot that shows the Ambari configuration UI with these changes.

     For HDI version 3.6:

     ![Editing kafka-env template property in Ambari](./media/apache-kafka-ssl-encryption-authentication/editing-configuration-kafka-env.png)

     For HDI version 4.0:

-    ![Editing kafka-env template property in Ambari four](./media/apache-kafka-ssl-encryption-authentication/editing-configuration-kafka-env-four.png)
+    ![Editing kafka-env template property in Ambari four](./media/apache-kafka-ssl-encryption-authentication/editing-configuration-kafka-env-four.png)

 1. Restart all Kafka brokers.
-1. Start the admin client with producer and consumer options to verify that both producers and consumers are working on port 9093.

 ## Client setup (without authentication)
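The broker-side steps in the hunk above append TLS settings to `server.properties` with `echo`. As a hedged sketch of that append pattern against a stand-in file: only the truststore password line is visible in this diff, so the other property names and the `/var/lib/kafka/ssl` paths here are assumptions modeled on standard Kafka SSL settings, not values from the commit.

```shell
# Sketch: append broker TLS properties to a stand-in server.properties file.
# On a real broker the target is /usr/hdp/current/kafka-broker/conf/server.properties.
# All lines except ssl.truststore.password are assumed, not taken from this diff.
PROPS=./server.properties
echo "ssl.keystore.location=/var/lib/kafka/ssl/kafka.server.keystore.jks" >> "$PROPS"
echo "ssl.keystore.password=MyServerPassword123" >> "$PROPS"
echo "ssl.key.password=MyServerPassword123" >> "$PROPS"
echo "ssl.truststore.location=/var/lib/kafka/ssl/kafka.server.truststore.jks" >> "$PROPS"
echo "ssl.truststore.password=MyServerPassword123" >> "$PROPS"
cat "$PROPS"
```

Appending with `>>` is idempotent only if run once; on a real cluster the Ambari **kafka-env template** is the supported place to make these changes persistent.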
@@ -203,14 +203,16 @@ These steps are detailed in the following code snippets.
     keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert -storepass "MyClientPassword123" -keypass "MyClientPassword123" -noprompt
     ```

-1. Create the file `client-ssl-auth.properties`. It should have the following lines:
+1. Create the file `client-ssl-auth.properties` on the client machine (hn1). It should have the following lines:

     ```config
     security.protocol=SSL
     ssl.truststore.location=/home/sshuser/ssl/kafka.client.truststore.jks
     ssl.truststore.password=MyClientPassword123
     ```

+1. Start the admin client with producer and consumer options to verify that both producers and consumers are working on port 9093. Refer to the [Verification](apache-kafka-ssl-encryption-authentication.md#verification) section below for the steps needed to verify the setup using the console producer and consumer.
+
 ## Client setup (with authentication)

 > [!Note]
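The `client-ssl-auth.properties` file described in the hunk above can be written in a single step. A minimal sketch, using the truststore path and password shown in the diff:

```shell
# Sketch: create the client TLS properties file from the hunk above.
# Path and password are the article's example values.
cat > client-ssl-auth.properties <<'EOF'
security.protocol=SSL
ssl.truststore.location=/home/sshuser/ssl/kafka.client.truststore.jks
ssl.truststore.password=MyClientPassword123
EOF
cat client-ssl-auth.properties
```

The quoted `'EOF'` delimiter keeps the shell from expanding anything inside the here-document, so the property values land in the file verbatim.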
@@ -273,17 +275,24 @@ The details of each step are given below.
     scp ca-cert sshuser@HeadNode1_Name:~/ssl/ca-cert
     ```

-1. Create client store with signed cert, and import ca cert into the keystore and truststore:
+1. Sign in to the client machine (standby head node) and navigate to the ssl directory:

     ```bash
-    keytool -keystore kafka.client.keystore.jks -import -file client-cert-signed -storepass MyClientPassword123 -keypass MyClientPassword123 -noprompt
+    ssh sshuser@HeadNode1_Name
+    cd ssl
+    ```
+
+1. Create the client store with the signed cert, and import the ca cert into the keystore and truststore on the client machine (hn1):
+
+    ```bash
+    keytool -keystore kafka.client.truststore.jks -alias CARoot -import -file ca-cert -storepass "MyClientPassword123" -keypass "MyClientPassword123" -noprompt

-    keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert -storepass MyClientPassword123 -keypass MyClientPassword123 -noprompt
+    keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert -storepass "MyClientPassword123" -keypass "MyClientPassword123" -noprompt

-    keytool -keystore kafka.client.truststore.jks -alias CARoot -import -file ca-cert -storepass MyClientPassword123 -keypass MyClientPassword123 -noprompt
+    keytool -keystore kafka.client.keystore.jks -import -file client-cert-signed -storepass "MyClientPassword123" -keypass "MyClientPassword123" -noprompt
     ```

-1. Create a file `client-ssl-auth.properties`. It should have the following lines:
+1. Create a file `client-ssl-auth.properties` on the client machine (hn1). It should have the following lines:

     ```bash
     security.protocol=SSL
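After the `keytool` imports in the hunk above, listing the stores is a quick sanity check that the CA root and signed certificate landed where expected. A hedged sketch, where the store names and password follow the article's examples, and the listing is skipped gracefully when no JDK `keytool` is on the PATH or the store file is absent:

```shell
# Sketch: list entries in the client trust/key stores created above.
# Skips when keytool is unavailable or the store file does not exist.
for STORE in kafka.client.truststore.jks kafka.client.keystore.jks; do
  if command -v keytool >/dev/null 2>&1 && [ -f "$STORE" ]; then
    keytool -list -keystore "$STORE" -storepass "MyClientPassword123"
  else
    echo "skipping $STORE (keytool or store not available)"
  fi
done
```

On a correctly built keystore, the listing should show the `CARoot` alias plus the signed client certificate entry.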
@@ -296,6 +305,8 @@ The details of each step are given below.

 ## Verification

+Run these steps on the client machine.
+
 > [!Note]
 > If HDInsight 4.0 and Kafka 2.1 is installed, you can use the console producer/consumer to verify your setup. If not, run the Kafka producer on port 9092 and send messages to the topic, and then use the Kafka consumer on port 9093, which uses TLS.
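Per the note above, on HDInsight 4.0 with Kafka 2.1 the setup can be verified with Kafka's console clients on the TLS port. The broker address, topic name, and config path below are placeholder assumptions, not values from this diff; the snippet only prints the commands to run on the client head node:

```shell
# Sketch: console-client verification over TLS on port 9093.
# BROKER, TOPIC, and CONFIG are placeholder assumptions; substitute your
# cluster's worker-node FQDN, topic, and properties-file path, then run
# the printed commands on the client machine (hn1).
BROKER="wn0-kafka.example.com:9093"
TOPIC="topic"
CONFIG="/home/sshuser/ssl/client-ssl-auth.properties"
echo "kafka-console-producer.sh --broker-list $BROKER --topic $TOPIC --producer.config $CONFIG"
echo "kafka-console-consumer.sh --bootstrap-server $BROKER --topic $TOPIC --consumer.config $CONFIG --from-beginning"
```

Messages typed into the producer should appear in the consumer; if the TLS handshake fails, both clients report an SSL error instead.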