Commit e75bc5c
[SRVKE-1278]: Add kafka sink security config docs
1 parent aa9f82b commit e75bc5c

5 files changed: +102 −4 lines

modules/serverless-kafka-broker-sasl-default-config.adoc

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
 [id="serverless-kafka-broker-sasl-default-config_{context}"]
 = Configuring SASL authentication for Kafka brokers
 
-_Simple Authentication and Security Layer_ (SASL) is used by Apache Kafka for authentication. If you use SASL authentication on your cluster, users must provide credentials to Knative for communicating with the Kafka cluster, otherwise events cannot be produced or consumed.
+_Simple Authentication and Security Layer_ (SASL) is used by Apache Kafka for authentication. If you use SASL authentication on your cluster, users must provide credentials to Knative for communicating with the Kafka cluster; otherwise events cannot be produced or consumed.
 
 .Prerequisites

modules/serverless-kafka-sasl-channels.adoc

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
 [id="serverless-kafka-sasl-channels_{context}"]
 = Configuring SASL authentication for Kafka channels
 
-_Simple Authentication and Security Layer_ (SASL) is used by Apache Kafka for authentication. If you use SASL authentication on your cluster, users must provide credentials to Knative for communicating with the Kafka cluster, otherwise events cannot be produced or consumed.
+_Simple Authentication and Security Layer_ (SASL) is used by Apache Kafka for authentication. If you use SASL authentication on your cluster, users must provide credentials to Knative for communicating with the Kafka cluster; otherwise events cannot be produced or consumed.
 
 .Prerequisites

modules/serverless-kafka-sasl-source.adoc

Lines changed: 2 additions & 2 deletions
@@ -6,7 +6,7 @@
 [id="serverless-kafka-sasl-source_{context}"]
 = Configuring SASL authentication for Kafka sources
 
-_Simple Authentication and Security Layer_ (SASL) is used by Apache Kafka for authentication. If you use SASL authentication on your cluster, users must provide credentials to Knative for communicating with the Kafka cluster, otherwise events cannot be produced or consumed.
+_Simple Authentication and Security Layer_ (SASL) is used by Apache Kafka for authentication. If you use SASL authentication on your cluster, users must provide credentials to Knative for communicating with the Kafka cluster; otherwise events cannot be produced or consumed.
 
 .Prerequisites
 
@@ -53,7 +53,7 @@ spec:
        secretKeyRef:
          name: <kafka_auth_secret>
          key: password
-      saslType:
+      type:
        secretKeyRef:
          name: <kafka_auth_secret>
          key: saslType
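Note that the renamed `type` field still reads its value from the `saslType` key of the referenced secret. As an illustrative sketch (all names and values are placeholders, following the `secretKeyRef` entries in the diff above), the secret consumed by this `KafkaSource` spec might look like:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: <kafka_auth_secret>
  namespace: <namespace>
type: Opaque
stringData:
  user: <username>            # SASL username
  password: <password>        # SASL password
  saslType: SCRAM-SHA-512     # read by the spec's `type.secretKeyRef`
```

Only the field name in the `KafkaSource` spec changed; existing secrets keyed on `saslType` continue to work.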
modules/serverless-kafka-sink-security-config.adoc

Lines changed: 91 additions & 0 deletions
@@ -0,0 +1,91 @@
+// Module is included in the following assemblies:
+//
+// * serverless/admin_guide/serverless-kafka-admin.adoc
+
+:_content-type: PROCEDURE
+[id="serverless-kafka-sink-security-config_{context}"]
+= Configuring security for Kafka sinks
+
+_Transport Layer Security_ (TLS) is used by Apache Kafka clients and servers to encrypt traffic between Knative and Kafka, as well as for authentication. TLS is the only supported method of traffic encryption for Knative Kafka.
+
+_Simple Authentication and Security Layer_ (SASL) is used by Apache Kafka for authentication. If you use SASL authentication on your cluster, users must provide credentials to Knative for communicating with the Kafka cluster; otherwise events cannot be produced or consumed.
+
+.Prerequisites
+
+* The {ServerlessOperatorName}, Knative Eventing, and the `KnativeKafka` custom resources (CRs) are installed on your {product-title} cluster.
+* Kafka sink is enabled in the `KnativeKafka` CR.
+* You have created a project or have access to a project with the appropriate roles and permissions to create applications and other workloads in {product-title}.
+* You have a Kafka cluster CA certificate stored as a `.pem` file.
+* You have a Kafka cluster client certificate and a key stored as `.pem` files.
+* You have installed the OpenShift CLI (`oc`).
+* You have chosen the SASL mechanism to use, for example, `PLAIN`, `SCRAM-SHA-256`, or `SCRAM-SHA-512`.
+
+.Procedure
+
+. Create the certificate files as a secret in the same namespace as your `KafkaSink` object:
++
+[IMPORTANT]
+====
+Certificates and keys must be in PEM format.
+====
+
+** For authentication using SASL without encryption:
++
+[source,terminal]
+----
+$ oc create secret -n <namespace> generic <secret_name> \
+  --from-literal=protocol=SASL_PLAINTEXT \
+  --from-literal=sasl.mechanism=<sasl_mechanism> \
+  --from-literal=user=<username> \
+  --from-literal=password=<password>
+----
+
+** For authentication using SASL and encryption using TLS:
++
+[source,terminal]
+----
+$ oc create secret -n <namespace> generic <secret_name> \
+  --from-literal=protocol=SASL_SSL \
+  --from-literal=sasl.mechanism=<sasl_mechanism> \
+  --from-file=ca.crt=<my_caroot.pem_file_path> \ <1>
+  --from-literal=user=<username> \
+  --from-literal=password=<password>
+----
+<1> You can omit `ca.crt` to use the system's root CA set if you are using a public cloud managed Kafka service, such as Red Hat OpenShift Streams for Apache Kafka.
+
+** For authentication and encryption using TLS:
++
+[source,terminal]
+----
+$ oc create secret -n <namespace> generic <secret_name> \
+  --from-literal=protocol=SSL \
+  --from-file=ca.crt=<my_caroot.pem_file_path> \ <1>
+  --from-file=user.crt=<my_cert.pem_file_path> \
+  --from-file=user.key=<my_key.pem_file_path>
+----
+<1> You can omit `ca.crt` to use the system's root CA set if you are using a public cloud managed Kafka service, such as Red Hat OpenShift Streams for Apache Kafka.
+
+. Create or modify a `KafkaSink` object and add a reference to your secret in the `auth` spec:
++
+[source,yaml]
+----
+apiVersion: eventing.knative.dev/v1alpha1
+kind: KafkaSink
+metadata:
+  name: <sink_name>
+  namespace: <namespace>
+spec:
+...
+  auth:
+    secret:
+      ref:
+        name: <secret_name>
+...
+----
+
+. Apply the `KafkaSink` object:
++
+[source,terminal]
+----
+$ oc apply -f <filename>
+----
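For reference, the first `oc create secret` command in the procedure above produces a plain `Opaque` secret; it is roughly equivalent to applying a manifest like the following (a sketch: the key names mirror the `--from-literal` flags, and all bracketed values are placeholders):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: <secret_name>
  namespace: <namespace>    # must match the KafkaSink's namespace
type: Opaque
stringData:
  protocol: SASL_PLAINTEXT       # or SASL_SSL / SSL for the TLS variants
  sasl.mechanism: <sasl_mechanism>
  user: <username>
  password: <password>
```

For the TLS variants, the `ca.crt`, `user.crt`, and `user.key` entries supplied with `--from-file` become additional keys in the same secret.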

serverless/admin_guide/serverless-kafka-admin.adoc

Lines changed: 7 additions & 0 deletions
@@ -42,12 +42,19 @@ Kafka clusters are generally secured by using the TLS or SASL authentication met
 Red Hat recommends that you enable both SASL and TLS together.
 ====
 
+// kafka broker security config
 include::modules/serverless-kafka-broker-tls-default-config.adoc[leveloffset=+2]
 include::modules/serverless-kafka-broker-sasl-default-config.adoc[leveloffset=+2]
+
+// kafka channel security config
 include::modules/serverless-kafka-tls-channels.adoc[leveloffset=+2]
 include::modules/serverless-kafka-sasl-channels.adoc[leveloffset=+2]
 include::modules/serverless-kafka-sasl-source.adoc[leveloffset=+2]
 
+// kafka sink security config
+include::modules/serverless-kafka-sink-security-config.adoc[leveloffset=+2]
+
+// kafka broker general configmap
 include::modules/serverless-kafka-broker-configmap.adoc[leveloffset=+1]
 
 [role="_additional-resources"]
