Commit 7486eba

Fixes #316: Required Kafka ACLs not documented (#447)
1 parent 88dd281 commit 7486eba

1 file changed: +80 -2 lines

doc/docs/modules/ROOT/pages/kafka-ssl.adoc

Lines changed: 80 additions & 2 deletions
@@ -195,5 +195,83 @@ kafka.sasl.mechanism=PLAIN

For more information, please consult the official Confluent documentation at the following link:

* https://docs.confluent.io/platform/current/kafka/authentication_sasl/index.html

== Authorization with ACLs

To enable authorization with ACLs, the following configuration properties are required:

[source,properties]
----
kafka.authorizer.class.name=kafka.security.authorizer.AclAuthorizer
kafka.zookeeper.set.acl=true
----

[NOTE]
* `kafka.security.authorizer.AclAuthorizer` (the default Kafka authorizer implementation) was introduced in Apache Kafka 2.4/Confluent Platform 5.4.0. If you are running an earlier version, use `kafka.security.auth.SimpleAclAuthorizer` instead. On the Confluent Platform you can also use the LDAP authorizer (please refer to the official Confluent documentation for further details: https://docs.confluent.io/platform/current/security/ldap-authorizer/quickstart.html).
* Keep in mind that `zookeeper.set.acl` is **false** by default.

As the official Kafka documentation states, if a resource has no associated ACLs, then no one other than super users is allowed to access it. If this is the case in your Kafka cluster, you also have to add the following:

[source,properties]
----
kafka.allow.everyone.if.no.acl.found=true
----
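Before enabling this fallback, you may want to check which ACLs (if any) are already defined on a resource. A minimal sketch using the stock `kafka-acls` CLI, where the broker address and topic name are hypothetical placeholders, not values from this document:

```shell
# List the ACLs currently associated with a given topic (hypothetical names)
kafka-acls --bootstrap-server localhost:9092 --list --topic my-topic
```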

[NOTE]
Be very careful with the above property: as its name implies, it allows access to everyone whenever no ACL is found.

If super users are specified, then also include:

[source,properties]
----
kafka.super.users=...
----
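For illustration only, a hypothetical value listing two super-user principals (the principal names are assumptions, not from the original document); note that entries are separated by semicolons:

[source,properties]
----
# Hypothetical principals; adapt to your own environment
kafka.super.users=User:admin;User:neo4j-streams
----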

Moreover, if you changed the default username (principal) mapping rules, you also have to add the following properties:

* If you use SSL encryption:
+
[source,properties]
----
kafka.ssl.principal.mapping.rules=...
----

* If you use SASL authentication (which is likely the case in a Kerberos environment):
+
[source,properties]
----
kafka.sasl.kerberos.principal.to.local.rules=...
----
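To illustrate the shape of these rules, here is a sketch with hypothetical values (the distinguished names, realm, and patterns are assumptions, not from the original document):

[source,properties]
----
# SSL: map a certificate DN such as CN=neo4j-streams,OU=Eng,O=Example
# to just the CN, falling back to the full DN (DEFAULT)
kafka.ssl.principal.mapping.rules=RULE:^CN=(.*?),.*$/$1/,DEFAULT

# Kerberos: strip the realm from principals like user@EXAMPLE.COM
kafka.sasl.kerberos.principal.to.local.rules=RULE:[1:$1@$0](.*@EXAMPLE\.COM)s/@.*//,DEFAULT
----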

Furthermore, if you want to make sure that the brokers also communicate with each other using Kerberos, you have to specify the following property, which is not strictly required for ACL purposes:

[source,properties]
----
kafka.security.inter.broker.protocol=SASL_SSL
----

[NOTE]
This last property is `PLAINTEXT` by default.

To make the plugin work properly, the following operations must be authorized for the Topic and Cluster resource types:

* **Write**, when you want to use the plugin as a Source
* **Read**, when you want to use the plugin as a Sink
* **DescribeConfigs** and **Describe**, because the plugin uses the following two Kafka AdminClient APIs:
** listTopics
** describeCluster

To use the streams procedures, the same operations (Read or Write) must be authorized, depending on which of the procedures you want to use. The permissions required by the procedures and by the source/sink operations are the same.
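The grants above can be defined with the stock `kafka-acls` tool. A minimal sketch, assuming a broker at `localhost:9092`, a plugin principal `User:neo4j-streams`, and a topic `my-topic` (all hypothetical names, not from this document):

```shell
# Source: the plugin produces to the topic
kafka-acls --bootstrap-server localhost:9092 --add \
  --allow-principal User:neo4j-streams \
  --operation Write --topic my-topic

# Sink: the plugin consumes from the topic
kafka-acls --bootstrap-server localhost:9092 --add \
  --allow-principal User:neo4j-streams \
  --operation Read --topic my-topic

# AdminClient calls (listTopics, describeCluster) need Describe/DescribeConfigs
# on the Cluster resource
kafka-acls --bootstrap-server localhost:9092 --add \
  --allow-principal User:neo4j-streams \
  --operation Describe --operation DescribeConfigs \
  --cluster
```

When running as a Sink, the consumer typically also needs **Read** on its consumer group (`--group ...`); the authorizer log on the broker can help diagnose operations that are still denied.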

For further details on how to set up and define ACLs in Kafka, please refer to the official Confluent Kafka documentation:

* https://docs.confluent.io/platform/current/kafka/authorization.html#kafka-authorization

[NOTE]
This section applies only to the Neo4j Streams plugin and not to the Kafka Connect plugin, because in that case it is the Kafka Connect plugin itself that takes care of the authorizations.
The only special case for the Kafka Connect plugin is when you use the DLQ: in that scenario, you have to define the **Write** authorization that the DLQ producer needs.
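If you do use the DLQ with the Kafka Connect plugin, the Write grant for the DLQ producer can be sketched in the same way (the broker address, principal, and DLQ topic name are hypothetical placeholders):

```shell
# Hypothetical: allow the Connect worker principal to produce to the DLQ topic
kafka-acls --bootstrap-server localhost:9092 --add \
  --allow-principal User:connect-worker \
  --operation Write --topic my-dlq-topic
```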
