
Commit ab9046d

Merge pull request #42770 from abrennan89/SRVKE-1086
SRVKE-1086: Add Kafka sink TP docs
2 parents 4fbcb6e + 0e9b301 commit ab9046d

5 files changed: +88 -11 lines changed

modules/serverless-create-kafka-channel-yaml.adoc

Lines changed: 1 addition & 2 deletions
@@ -7,14 +7,13 @@
 [id="serverless-create-kafka-channel-yaml_{context}"]
 = Creating a Kafka channel by using YAML

-You can create a Kafka channel by using YAML to create the `KafkaChannel` object.
+You can create a Knative Eventing channel that is backed by Kafka topics. To do this, you must create a `KafkaChannel` object. The following procedure explains how you can create a `KafkaChannel` object by using YAML files and the `oc` CLI.

 .Prerequisites

 * The {ServerlessOperatorName}, Knative Eventing, and the `KnativeKafka` custom resource are installed on your {product-title} cluster.
 * You have installed the `oc` CLI.
 * You have created a project or have access to a project with the appropriate roles and permissions to create applications and other workloads in {product-title}.
-* You have installed the `oc` CLI.

 .Procedure
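Editorial note (not part of the commit): the `KafkaChannel` object that the rewritten introduction refers to is not shown in this hunk. A minimal sketch of such an object, assuming the `messaging.knative.dev/v1beta1` API and placeholder values, might look like:

[source,yaml]
----
apiVersion: messaging.knative.dev/v1beta1   # assumed API group/version for KafkaChannel
kind: KafkaChannel
metadata:
  name: <channel_name>        # placeholder channel name
  namespace: <namespace>      # placeholder project
spec:
  numPartitions: 3            # illustrative number of topic partitions
  replicationFactor: 1        # illustrative replication factor
----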

modules/serverless-install-kafka-odc.adoc

Lines changed: 4 additions & 1 deletion
@@ -28,6 +28,8 @@ spec:
       bootstrapServers: <bootstrap_servers> <5>
       numPartitions: <num_partitions> <6>
       replicationFactor: <replication_factor> <7>
+  sink:
+    enabled: true <8>
 ----
 <1> Enables developers to use the `KafkaChannel` channel type in the cluster.
 <2> A comma-separated list of bootstrap servers from your AMQ Streams cluster.
@@ -41,6 +43,7 @@
 ====
 The `replicationFactor` value must be less than or equal to the number of nodes of your Red Hat AMQ Streams cluster.
 ====
+<8> Enables developers to use a Kafka sink in the cluster.

 .Prerequisites

@@ -68,7 +71,7 @@ endif::[]
 +
 [IMPORTANT]
 ====
-To use the Kafka channel, source, or broker on your cluster, you must toggle the *enabled* switch for the options you want to use to *true*. These switches are set to *false* by default. Additionally, to use the Kafka channel or broker, you must specify the bootstrap servers.
+To use the Kafka channel, source, broker, or sink on your cluster, you must toggle the *enabled* switch for the options you want to use to *true*. These switches are set to *false* by default. Additionally, to use the Kafka channel, broker, or sink, you must specify the bootstrap servers.
 ====
 .. Using the form is recommended for simpler configurations that do not require full control of *KnativeKafka* object creation.
 .. Editing the YAML is recommended for more complex configurations that require full control of *KnativeKafka* object creation. You can access the YAML by clicking the *Edit YAML* link in the top right of the *Create Knative Kafka* page.
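Editorial note (not part of the commit): the callouts above belong to a `KnativeKafka` custom resource in which the new sink switch sits alongside the existing channel, source, and broker options. Assuming the `operator.serverless.openshift.io/v1alpha1` API and the conventional `knative-kafka` object in the `knative-eventing` namespace, the complete spec might look roughly like:

[source,yaml]
----
apiVersion: operator.serverless.openshift.io/v1alpha1   # assumed API group/version
kind: KnativeKafka
metadata:
  name: knative-kafka           # conventional singleton name; verify for your release
  namespace: knative-eventing
spec:
  channel:
    enabled: true               # KafkaChannel channel type
    bootstrapServers: <bootstrap_servers>
  source:
    enabled: true               # Kafka source
  broker:
    enabled: true               # Kafka broker (Technology Preview)
    defaultConfig:
      bootstrapServers: <bootstrap_servers>
      numPartitions: <num_partitions>
      replicationFactor: <replication_factor>
  sink:
    enabled: true               # Kafka sink (Technology Preview), added by this commit
----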

modules/serverless-kafka-sink.adoc

Lines changed: 68 additions & 0 deletions
@@ -0,0 +1,68 @@
+// Module included in the following assemblies:
+//
+// * serverless/develop/serverless-kafka-developer.adoc
+
+:_content-type: PROCEDURE
+[id="serverless-kafka-sink_{context}"]
+= Using a Kafka sink
+
+You can create an event sink, called a Kafka sink, that sends events to a Kafka topic. To do this, you must create a `KafkaSink` object. The following procedure explains how you can create a `KafkaSink` object by using YAML files and the `oc` CLI.
+
+.Prerequisites
+
+* The {ServerlessOperatorName}, Knative Eventing, and the `KnativeKafka` custom resource (CR) are installed on your cluster.
+* You have created a project or have access to a project with the appropriate roles and permissions to create applications and other workloads in {product-title}.
+* You have access to a Red Hat AMQ Streams (Kafka) cluster.
+* You have installed the `oc` CLI.
+
+.Procedure
+
+. Create a `KafkaSink` object definition as a YAML file:
++
+.Kafka sink YAML
+[source,yaml]
+----
+apiVersion: eventing.knative.dev/v1alpha1
+kind: KafkaSink
+metadata:
+  name: <sink-name>
+  namespace: <namespace>
+spec:
+  topic: <topic-name>
+  bootstrapServers:
+  - <bootstrap-server>
+----
+
+. To create the Kafka sink, apply the `KafkaSink` YAML file:
++
+[source,terminal]
+----
+$ oc apply -f <filename>
+----
+
+. Configure an event source so that the sink is specified in its spec:
++
+.Example of a Kafka sink connected to an API server source
+[source,yaml]
+----
+apiVersion: sources.knative.dev/v1alpha2
+kind: ApiServerSource
+metadata:
+  name: <source-name> <1>
+  namespace: <namespace> <2>
+spec:
+  serviceAccountName: <service-account-name> <3>
+  mode: Resource
+  resources:
+  - apiVersion: v1
+    kind: Event
+  sink:
+    ref:
+      apiVersion: eventing.knative.dev/v1alpha1
+      kind: KafkaSink
+      name: <sink-name> <4>
+----
+<1> The name of the event source.
+<2> The namespace of the event source.
+<3> The service account for the event source.
+<4> The Kafka sink name.
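Editorial note (not part of the committed module): a filled-in version of the `KafkaSink` manifest above could look like the following; the name, namespace, topic, and bootstrap address are all hypothetical examples:

[source,yaml]
----
apiVersion: eventing.knative.dev/v1alpha1
kind: KafkaSink
metadata:
  name: my-kafka-sink                  # hypothetical sink name
  namespace: default                   # hypothetical namespace
spec:
  topic: knative-sink-topic            # hypothetical Kafka topic that receives the events
  bootstrapServers:
  - my-cluster-kafka-bootstrap.kafka.svc:9092   # hypothetical AMQ Streams bootstrap address
----

After applying it with `oc apply -f <filename>`, the resource can be inspected like any other custom resource, for example with `oc get kafkasinks -n default`.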

serverless/admin_guide/serverless-kafka-admin.adoc

Lines changed: 2 additions & 4 deletions
@@ -26,10 +26,8 @@ The `KnativeKafka` CR provides users with additional options, such as:

 * Kafka source
 * Kafka channel
-* Kafka broker
-
-:FeatureName: Kafka broker
-include::snippets/technology-preview.adoc[leveloffset=+1]
+* Kafka broker (Technology Preview)
+* Kafka sink (Technology Preview)

 include::modules/serverless-install-kafka-odc.adoc[leveloffset=+1]

serverless/develop/serverless-kafka-developer.adoc

Lines changed: 13 additions & 4 deletions
@@ -21,10 +21,8 @@ Knative Kafka provides additional options, such as:

 * Kafka source
 * Kafka channel
-* Kafka broker
-
-:FeatureName: Kafka broker
-include::snippets/technology-preview.adoc[leveloffset=+1]
+* Kafka broker (Technology Preview)
+* Kafka sink (Technology Preview)

 include::modules/serverless-kafka-event-delivery.adoc[leveloffset=+1]

@@ -61,6 +59,17 @@ include::modules/serverless-kafka-broker.adoc[leveloffset=+2]
 // Kafka channels
 include::modules/serverless-create-kafka-channel-yaml.adoc[leveloffset=+1]

+[id="serverless-kafka-developer-sink"]
+== Kafka sink
+
+:FeatureName: Kafka sink
+include::snippets/technology-preview.adoc[leveloffset=+2]
+
+Kafka sinks are a type of xref:../../serverless/develop/serverless-event-sinks.adoc#serverless-event-sinks[event sink] that is available if a cluster administrator has enabled Kafka on your cluster. You can send events directly from an xref:../../serverless/discover/knative-event-sources.adoc#knative-event-sources[event source] to a Kafka topic by using a Kafka sink.
+
+// Kafka sink
+include::modules/serverless-kafka-sink.adoc[leveloffset=+2]
+
 [id="additional-resources_serverless-kafka-developer"]
 [role="_additional-resources"]
 == Additional resources
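Editorial note (not part of the commit): the new developer-guide paragraph states that events can be sent directly from an event source to a Kafka topic through a Kafka sink. A sketch of that wiring using a PingSource is shown below; the API versions, names, and schedule are assumptions and may differ by release:

[source,yaml]
----
apiVersion: sources.knative.dev/v1          # assumed PingSource API version
kind: PingSource
metadata:
  name: ping-to-kafka                       # hypothetical source name
  namespace: default                        # hypothetical namespace
spec:
  schedule: "*/1 * * * *"                   # emit an event every minute (illustrative)
  data: '{"message": "hello from PingSource"}'
  sink:
    ref:
      apiVersion: eventing.knative.dev/v1alpha1
      kind: KafkaSink
      name: my-kafka-sink                   # must reference an existing KafkaSink
----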
