
Commit ad2fae0

Add docs on creating Kafka source using the kn plugin

Add sections:

1. kn plugins general info
2. Using a Kafka source by using the kn plugin

Improvements in the kn kafka plugin docs
Attempt to fix build failure
Remove problematic xref
Attempt to fix a bad xref
Multiple improvements
Remove one of the two verification options
s/test-topic/my-topic and s/test-consumer-group/my-consumer-group in procedure
Fix kn plugin name
Fix image spec in command invocation
Implement peer review feedback
Minuscule style fixes

1 parent 1fd3c7b commit ad2fae0

4 files changed: +126 -0 lines changed

modules/serverless-kafka-source-kn.adoc

Lines changed: 109 additions & 0 deletions
@@ -0,0 +1,109 @@
// Module included in the following assemblies:
//
// * serverless/event_sources/serverless-kafka-source.adoc

[id="serverless-kafka-source-kn_{context}"]
= Creating a Kafka event source by using the kn CLI

You can create a Kafka event source by using the `kn` CLI.

:FeatureName: Creating a Kafka event source by using the `kn` CLI
include::../modules/technology-preview.adoc[leveloffset=+0]

.Prerequisites

* The {ServerlessOperatorName}, Knative Eventing, Knative Serving, and the `KnativeKafka` custom resource are installed on your cluster.
* You have created a project or have access to a project with the appropriate roles and permissions to create applications and other workloads in {product-title}.
* You have access to a Red Hat AMQ Streams (Kafka) cluster that produces the Kafka messages you want to import.
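
Optionally, you can verify the first prerequisite before you begin. The following is a minimal check for the `KnativeKafka` custom resource, assuming it was created in the default `knative-eventing` namespace:

[source,terminal]
----
$ oc get knativekafka -n knative-eventing
----
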
.Procedure

. To verify that the Kafka event source is working, create a Knative service that dumps incoming events into the service logs:
+
[source,terminal]
----
$ kn service create event-display \
  --image quay.io/openshift-knative/knative-eventing-sources-event-display
----

. Create a `KafkaSource` resource:
+
[source,terminal]
----
$ kn source kafka create mykafkasrc \
  --servers my-cluster-kafka-bootstrap.kafka.svc:9092 \
  --topics my-topic --consumergroup my-consumer-group \
  --sink event-display
----
+
The `--servers`, `--topics`, and `--consumergroup` options specify the connection parameters to the Kafka cluster. The `--consumergroup` option is optional. This command creates a `KafkaSource` object on the cluster; a sketch of the equivalent YAML is shown after this procedure.

. Optional: View details about the `KafkaSource` resource you created:
+
[source,terminal]
----
$ kn source kafka describe mykafkasrc
----
+
.Example output
[source,terminal]
----
Name: mykafkasrc
Namespace: kafka
Age: 1h
BootstrapServers: my-cluster-kafka-bootstrap.kafka.svc:9092
Topics: my-topic
ConsumerGroup: my-consumer-group

Sink:
  Name: event-display
  Namespace: default
  Resource: Service (serving.knative.dev/v1)

Conditions:
  OK TYPE AGE REASON
  ++ Ready 1h
  ++ Deployed 1h
  ++ SinkProvided 1h
----
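
As noted in step 2, the `kn source kafka create` command creates a `KafkaSource` object on the cluster. The following is a minimal sketch of the equivalent YAML, using field names from the `sources.knative.dev` API; the exact `apiVersion` can vary between releases:

[source,yaml]
----
apiVersion: sources.knative.dev/v1beta1
kind: KafkaSource
metadata:
  name: mykafkasrc
spec:
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka.svc:9092
  consumerGroup: my-consumer-group
  topics:
    - my-topic
  sink:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display
----
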
.Verification steps

. Trigger the Kafka instance to send a message to the topic:
+
[source,terminal]
----
$ oc -n kafka run kafka-producer \
  -ti --image=quay.io/strimzi/kafka:latest-kafka-2.7.0 --rm=true \
  --restart=Never -- bin/kafka-console-producer.sh \
  --broker-list my-cluster-kafka-bootstrap:9092 --topic my-topic
----
+
Enter a message at the prompt. This command assumes that:
+
* The Kafka cluster is installed in the `kafka` namespace.
* The `KafkaSource` object has been configured to use the `my-topic` topic.

. Verify that the message arrived by viewing the logs:
+
[source,terminal]
----
$ oc logs $(oc get pod -o name | grep event-display) -c user-container
----
+
.Example output
[source,terminal]
----
☁️ cloudevents.Event
Validation: valid
Context Attributes,
  specversion: 1.0
  type: dev.knative.kafka.event
  source: /apis/v1/namespaces/default/kafkasources/mykafkasrc#my-topic
  subject: partition:46#0
  id: partition:46/offset:0
  time: 2021-03-10T11:21:49.4Z
Extensions,
  traceparent: 00-161ff3815727d8755848ec01c866d1cd-7ff3916c44334678-00
Data,
  Hello!
----
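
If no event appears in the logs, you can check whether the source itself is ready. The following is a minimal check, assuming the `KafkaSource` object was created in the current project; the output typically includes a `READY` column that reports `True` when the source is working:

[source,terminal]
----
$ oc get kafkasource mykafkasrc
----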
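
When you finish testing, you can remove the example resources. The following is a sketch of the cleanup, assuming the resource names used in this procedure:

[source,terminal]
----
$ kn source kafka delete mykafkasrc
$ kn service delete event-display
----
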
modules/serverless-kn-cli-plugins.adoc

Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
// Module is included in the following assemblies:
//
// serverless/installing-kn.adoc

[id="serverless-kn-plugins_{context}"]
= kn plug-ins

The Red Hat distribution of the `kn` CLI includes extensions of `kn` that are known as plug-ins. Plug-ins extend `kn` by providing additional commands for {ServerlessProductName}, such as commands for the Kafka stack API. You use `kn` plug-ins in the same way as the main `kn` functionality.

Currently, Red Hat provides the `kn-source-kafka` plug-in as a Technology Preview feature.
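
Because plug-in commands are invoked like any other `kn` command, you can inspect the commands that a plug-in provides through the standard help output. For example, a minimal check for the Kafka plug-in, assuming it is installed:

[source,terminal]
----
$ kn source kafka --help
----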

serverless/event_sources/serverless-kafka-source.adoc

Lines changed: 1 addition & 0 deletions
@@ -11,6 +11,7 @@ include::modules/technology-preview.adoc[leveloffset=+2]
 The Apache Kafka event source brings messages into Knative. It reads events from an Apache Kafka cluster and passes these events to an event sink so that they can be consumed. You can use the `KafkaSource` event source with {ServerlessProductName}.

 include::modules/serverless-kafka-source-odc.adoc[leveloffset=+1]
+include::modules/serverless-kafka-source-kn.adoc[leveloffset=+1]
 include::modules/serverless-kafka-source-yaml.adoc[leveloffset=+1]

 [id="serverless-kafka-source-additional-resources"]

serverless/installing-kn.adoc

Lines changed: 6 additions & 0 deletions
@@ -20,3 +20,9 @@ include::modules/serverless-installing-cli-linux-rpm.adoc[leveloffset=+1]
 include::modules/serverless-installing-cli-linux.adoc[leveloffset=+1]
 include::modules/serverless-installing-cli-macos.adoc[leveloffset=+1]
 include::modules/serverless-installing-cli-windows.adoc[leveloffset=+1]
+include::modules/serverless-kn-cli-plugins.adoc[leveloffset=+1]
+
+== Additional resources
+
+* For instructions on using the Kafka source `kn` plug-in, see xref:../serverless/event_sources/serverless-kafka-source.adoc#serverless-kafka-source[Creating a Kafka event source by using the `kn` CLI].
+// * For information on {ServerlessProductName} Functions, see link:https://openshift-knative.github.io/docs/docs/functions/about-functions.html[About OpenShift Serverless Functions].
