Commit 669d250

Merge pull request #45969 from abrennan89/SRVKE-1220
[SRVKE-1220 + SRVKE-1185]: Add docs for using an external topic with Kafka brokers, updating kafka left nav headings
2 parents 5a2f678 + cef5268 commit 669d250

Showing 9 changed files with 82 additions and 12 deletions.

_topic_maps/_topic_map.yml

Lines changed: 2 additions & 2 deletions

@@ -3459,7 +3459,7 @@ Topics:
   # Triggers
   - Name: Triggers
     File: serverless-triggers
-  - Name: Knative Kafka
+  - Name: Using Knative Kafka
     File: serverless-kafka-developer
   # Admin guide
   - Name: Administer
@@ -3468,7 +3468,7 @@ Topics:
   - Name: Global configuration
     File: serverless-configuration
   # Eventing
-  - Name: Knative Kafka
+  - Name: Configuring Knative Kafka
     File: serverless-kafka-admin
   - Name: Serverless components in the Administrator perspective
     File: serverless-admin-perspective

_topic_maps/_topic_map_osd.yml

Lines changed: 2 additions & 2 deletions

@@ -275,14 +275,14 @@ Topics:
     File: serverless-using-brokers
   - Name: Triggers
     File: serverless-triggers
-  - Name: Knative Kafka
+  - Name: Using Knative Kafka
     File: serverless-kafka-developer
   - Name: Administer
     Dir: admin_guide
     Topics:
     - Name: Global configuration
       File: serverless-configuration
-    - Name: Knative Kafka
+    - Name: Configuring Knative Kafka
       File: serverless-kafka-admin
     - Name: Serverless components in the Administrator perspective
       File: serverless-admin-perspective

modules/serverless-broker-types.adoc

Lines changed: 4 additions & 4 deletions

@@ -9,7 +9,7 @@
 There are multiple broker implementations available for use with {ServerlessProductName}, each of which have different event delivery guarantees and use different underlying technologies. You can choose the broker implementation when creating a broker by specifying a broker class, otherwise the default broker class is used. The default broker class can be configured by cluster administrators.
 // TO DO: Need to add docs about setting default broker class.

-[id="serverless-using-brokers-channel-based"]
+[id="serverless-broker-types-channel-based"]
 == Channel-based broker

 The channel-based broker implementation internally uses channels for event delivery. Channel-based brokers provide different event delivery guarantees based on the channel implementation a broker instance uses, for example:
@@ -18,10 +18,10 @@ The channel-based broker implementation internally uses channels for event deliv

 * A broker using the `KafkaChannel` implementation provides the event delivery guarantees required for a production environment.

-[id="serverless-using-brokers-kafka"]
+[id="serverless-broker-types-kafka"]
 == Kafka broker

+The Kafka broker is a broker implementation that uses Kafka internally to provide at-least once delivery guarantees. It supports multiple Kafka versions, and has a native integration with Kafka for storing and routing events.
+
 :FeatureName: Kafka broker
 include::snippets/technology-preview.adoc[leveloffset=+2]
-
-The Kafka broker is a broker implementation that uses Kafka internally to provide at-least once delivery guarantees. It supports multiple Kafka versions, and has a native integration with Kafka for storing and routing events.
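
For reference, the broker class described in this module is selected per `Broker` object through the `eventing.knative.dev/broker.class` annotation; omitting the annotation falls back to the default class configured by cluster administrators. A minimal sketch follows, assuming the broker name `example-broker` and the upstream Knative Eventing channel-based class name `MTChannelBasedBroker`, neither of which appears in this commit:

[source,yaml]
----
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: example-broker  # hypothetical name, not part of this commit
  annotations:
    # Channel-based broker class from upstream Knative Eventing (assumed);
    # omit the annotation to use the cluster-wide default broker class.
    eventing.knative.dev/broker.class: MTChannelBasedBroker
----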

modules/serverless-kafka-broker-with-kafka-topic.adoc

Lines changed: 44 additions & 0 deletions

@@ -0,0 +1,44 @@
+// Module included in the following assemblies:
+//
+// * serverless/develop/serverless-kafka-developer.adoc
+// * serverless/develop/serverless-using-brokers.adoc
+
+:_content-type: PROCEDURE
+[id="serverless-kafka-broker-with-kafka-topic_{context}"]
+= Creating a Kafka broker that uses an externally managed Kafka topic
+
+If you want to use a Kafka broker without allowing it to create its own internal topic, you can use an externally managed Kafka topic instead. To do this, you must create a Kafka `Broker` object that uses the `kafka.eventing.knative.dev/external.topic` annotation.
+
+.Prerequisites
+
+* The {ServerlessOperatorName}, Knative Eventing, and the `KnativeKafka` custom resource are installed on your {product-title} cluster.
+
+* You have access to a Kafka instance such as link:https://access.redhat.com/documentation/en-us/red_hat_amq/7.6/html/amq_streams_on_openshift_overview/kafka-concepts_str#kafka-concepts-key_str[Red Hat AMQ Streams], and have created a Kafka topic.
+
+* You have created a project or have access to a project with the appropriate roles and permissions to create applications and other workloads in {product-title}.
+
+* You have installed the OpenShift (`oc`) CLI.
+
+.Procedure
+
+. Create a Kafka-based broker as a YAML file:
++
+[source,yaml]
+----
+apiVersion: eventing.knative.dev/v1
+kind: Broker
+metadata:
+  annotations:
+    eventing.knative.dev/broker.class: Kafka <1>
+    kafka.eventing.knative.dev/external.topic: <topic_name> <2>
+...
+----
+<1> The broker class. If not specified, brokers use the default class as configured by cluster administrators. To use the Kafka broker, this value must be `Kafka`.
+<2> The name of the Kafka topic that you want to use.
+
+. Apply the Kafka-based broker YAML file:
++
+[source,terminal]
+----
+$ oc apply -f <filename>
+----
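
The `...` in the module's example elides the rest of the `Broker` object. A filled-in sketch is shown below, keeping the topic placeholder from the module and using a hypothetical name and namespace; note that the externally managed topic must already exist, because this broker is configured not to create its own:

[source,yaml]
----
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: example-kafka-broker   # hypothetical name
  namespace: my-namespace      # hypothetical namespace
  annotations:
    eventing.knative.dev/broker.class: Kafka
    kafka.eventing.knative.dev/external.topic: <topic_name>
----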

modules/serverless-kafka-broker.adoc

Lines changed: 3 additions & 1 deletion

@@ -12,8 +12,10 @@ Creating Knative resources by using YAML files uses a declarative API, which ena
 .Prerequisites

 * The {ServerlessOperatorName}, Knative Eventing, and the `KnativeKafka` custom resource are installed on your {product-title} cluster.
+
 * You have created a project or have access to a project with the appropriate roles and permissions to create applications and other workloads in {product-title}.
-* Install the OpenShift CLI (`oc`).
+
+* You have installed the OpenShift (`oc`) CLI.

 .Procedure
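
As a usage note for the procedure this module documents, the broker is created with `oc apply` and its readiness can then be checked; a sketch with hypothetical file and broker names follows:

[source,terminal]
----
$ oc apply -f kafka-broker.yaml       # hypothetical file name
$ oc get broker example-kafka-broker  # hypothetical broker name; the READY column should eventually report True
----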

serverless/admin_guide/serverless-kafka-admin.adoc

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 :_content-type: ASSEMBLY
 [id="serverless-kafka-admin"]
-= Knative Kafka
+= Configuring Knative Kafka
 include::_attributes/common-attributes.adoc[]
 :context: serverless-kafka-admin

serverless/develop/serverless-kafka-developer.adoc

Lines changed: 3 additions & 2 deletions

@@ -1,6 +1,6 @@
 :_content-type: ASSEMBLY
 [id="serverless-kafka-developer"]
-= Knative Kafka
+= Using Knative Kafka
 include::_attributes/common-attributes.adoc[]
 :context: serverless-kafka-developer

@@ -45,7 +45,7 @@ include::modules/serverless-kafka-source-yaml.adoc[leveloffset=+2]
 [id="serverless-kafka-developer-broker"]
 == Kafka broker

-If a cluster administrator has configured your {ServerlessProductName} deployment to use Kafka broker as the default broker type, xref:../../serverless/develop/serverless-using-brokers.adoc#serverless-using-brokers-creating-brokers[creating a broker by using the default settings] creates a Kafka-based `Broker` object. If your {ServerlessProductName} deployment is not configured to use Kafka broker as the default broker type, you can still use the following procedure to create a Kafka-based broker.
+include::snippets/serverless-kafka-broker-intro.adoc[]

 :FeatureName: Kafka broker
 include::snippets/technology-preview.adoc[leveloffset=+2]
@@ -56,6 +56,7 @@ The Kafka broker, which is currently in Technology Preview, is not supported on
 ====

 include::modules/serverless-kafka-broker.adoc[leveloffset=+2]
+include::modules/serverless-kafka-broker-with-kafka-topic.adoc[leveloffset=+2]

 // Kafka channels
 include::modules/serverless-create-kafka-channel-yaml.adoc[leveloffset=+1]
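
The assembly also pulls in a module about creating Kafka channels by using YAML (`serverless-create-kafka-channel-yaml.adoc`), which is not shown in this diff. A sketch of what such a channel might look like, assuming the `messaging.knative.dev/v1beta1` `KafkaChannel` API with illustrative values that are not part of this commit:

[source,yaml]
----
apiVersion: messaging.knative.dev/v1beta1
kind: KafkaChannel
metadata:
  name: example-channel   # hypothetical name
  namespace: default      # hypothetical namespace
spec:
  numPartitions: 3        # illustrative partition count
  replicationFactor: 1    # illustrative replication factor
----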

serverless/develop/serverless-using-brokers.adoc

Lines changed: 16 additions & 0 deletions

@@ -21,6 +21,22 @@ include::modules/serverless-creating-broker-labeling.adoc[leveloffset=+2]
 include::modules/serverless-deleting-broker-injection.adoc[leveloffset=+2]
 include::modules/serverless-creating-a-broker-odc.adoc[leveloffset=+2]

+[id="serverless-using-brokers-kafka"]
+== Kafka broker
+
+include::snippets/serverless-kafka-broker-intro.adoc[]
+
+:FeatureName: Kafka broker
+include::snippets/technology-preview.adoc[leveloffset=+2]
+
+[IMPORTANT]
+====
+The Kafka broker, which is currently in Technology Preview, is not supported on FIPS.
+====
+
+include::modules/serverless-kafka-broker.adoc[leveloffset=+2]
+include::modules/serverless-kafka-broker-with-kafka-topic.adoc[leveloffset=+2]
+
 [id="serverless-using-brokers-managing-brokers"]
 == Managing brokers

snippets/serverless-kafka-broker-intro.adoc

Lines changed: 7 additions & 0 deletions

@@ -0,0 +1,7 @@
+// Text snippet included in the following modules:
+//
+// * /serverless/develop/serverless-using-brokers.adoc
+
+:_content-type: SNIPPET
+
+If a cluster administrator has configured your {ServerlessProductName} deployment to use Kafka as the default broker type, creating a broker by using the default settings creates a Kafka-based `Broker` object. If your {ServerlessProductName} deployment is not configured to use Kafka broker as the default broker type, you can use one of the following procedures to create a Kafka-based broker.
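
The first sentence of this snippet assumes that a cluster administrator has already configured Kafka as the default broker type; the `// TO DO` comment in `serverless-broker-types.adoc` notes that documenting that step is still pending. One possible shape for that configuration, based on the upstream Knative Eventing `config-br-defaults` ConfigMap surfaced through the `KnativeEventing` custom resource, is sketched below; the API version and keys are assumptions, not part of this commit:

[source,yaml]
----
apiVersion: operator.knative.dev/v1alpha1   # assumed API version
kind: KnativeEventing
metadata:
  name: knative-eventing
  namespace: knative-eventing
spec:
  config:
    config-br-defaults:        # assumed ConfigMap name from upstream Knative Eventing
      default-br-config: |
        clusterDefault:
          brokerClass: Kafka   # makes Kafka the cluster-wide default broker class
----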
