
Commit ce90592

Merge pull request #40560 from abrennan89/kafkareorg
SRVKE-747: Split knative kafka docs into different personas
2 parents bc7816f + 73cdb4b commit ce90592

13 files changed: +136 -119 lines changed

_topic_maps/_topic_map.yml

Lines changed: 10 additions & 9 deletions
@@ -3186,26 +3186,25 @@ Topics:
     File: serverless-apiserversource
   - Name: Using a ping source
     File: serverless-pingsource
-  - Name: Using a Kafka source
-    File: serverless-kafka-source
   - File: serverless-custom-event-sources
     Name: Custom event sources
   - Name: Creating and deleting channels
     File: serverless-creating-channels
   - Name: Subscriptions
     File: serverless-subs
+  - Name: Knative Kafka
+    File: serverless-kafka-developer
 # Admin guide
 - Name: Administer
   Dir: admin_guide
   Topics:
   - Name: Configuring OpenShift Serverless
     File: serverless-configuration
+  # Eventing
   - Name: Configuring channel defaults
     File: serverless-configuring-channels
-  # Ingress options
-  - Name: Integrating Service Mesh with OpenShift Serverless
-    File: serverless-ossm-setup
-  # Eventing
+  - Name: Knative Kafka
+    File: serverless-kafka-admin
   - Name: Creating Knative Eventing components in the Administrator perspective
     File: serverless-cluster-admin-eventing
   # - Name: Configuring the Knative Eventing custom resource
@@ -3216,6 +3215,9 @@ Topics:
     File: serverless-cluster-admin-serving
   - Name: Configuring the Knative Serving custom resource
     File: knative-serving-CR-config
+  # Ingress options
+  - Name: Integrating Service Mesh with OpenShift Serverless
+    File: serverless-ossm-setup
   # Monitoring
   - Name: Monitoring serverless components
     File: serverless-admin-monitoring
@@ -3255,6 +3257,8 @@ Topics:
     File: serverless-custom-domains
   - Name: Using a custom TLS certificate for domain mapping
     File: serverless-custom-tls-cert-domain-mapping
+  - Name: Security configuration for Knative Kafka
+    File: serverless-kafka-security
 # Knative Eventing
 - Name: Knative Eventing
   Dir: knative_eventing
@@ -3268,9 +3272,6 @@ Topics:
   # Event delivery
   - Name: Event delivery
     File: serverless-event-delivery
-  # Knative Kafka
-  - Name: Knative Kafka
-    File: serverless-kafka
 # Functions
 - Name: Functions
   Dir: functions

modules/serverless-install-kafka-odc.adoc

Lines changed: 23 additions & 4 deletions
@@ -1,18 +1,37 @@
 // Module is included in the following assemblies:
 //
-// serverless/knative_eventing/serverless-kafka.adoc
+// serverless/admin_guide/serverless-kafka-admin.adoc

 [id="serverless-install-kafka-odc_{context}"]
-= Installing Knative Kafka components by using the web console
+= Installing Knative Kafka

-Cluster administrators can enable the use of Knative Kafka functionality in an {ServerlessProductName} deployment by instantiating the `KnativeKafka` custom resource definition provided by the *Knative Kafka* {ServerlessOperatorName} API.
+The {ServerlessOperatorName} provides the Knative Kafka API that can be used to create a `KnativeKafka` custom resource:
+
+.Example `KnativeKafka` custom resource
+[source,yaml]
+----
+apiVersion: operator.serverless.openshift.io/v1alpha1
+kind: KnativeKafka
+metadata:
+  name: knative-kafka
+  namespace: knative-eventing
+spec:
+  channel:
+    enabled: true <1>
+    bootstrapServers: <bootstrap_servers> <2>
+  source:
+    enabled: true <3>
+----
+<1> Enables developers to use the `KafkaChannel` channel type in the cluster.
+<2> A comma-separated list of bootstrap servers from your AMQ Streams cluster.
+<3> Enables developers to use the `KafkaSource` event source type in the cluster.

 .Prerequisites

 * You have installed {ServerlessProductName}, including Knative Eventing, in your {product-title} cluster.
 * You have access to a Red Hat AMQ Streams cluster.
 * You have cluster administrator permissions on {product-title}.
-* You are logged in to the web console.
+* You are logged in to the {product-title} web console.

 .Procedure
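
For context on what callout <1> in the example above enables: once `channel` is enabled, developers can create `KafkaChannel` objects in their own namespaces. The following is a minimal sketch of such a channel; it is not part of this patch, and the name, namespace, and partition settings are illustrative only.

[source,yaml]
----
apiVersion: messaging.knative.dev/v1beta1
kind: KafkaChannel
metadata:
  name: example-channel
  namespace: default
spec:
  numPartitions: 3        # partitions for the Kafka topic backing this channel (illustrative value)
  replicationFactor: 1    # replication factor for that topic (illustrative value)
----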

modules/serverless-kafka-event-delivery.adoc

Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
+// Module included in the following assemblies:
+//
+// * serverless/develop/serverless-kafka-developer.adoc
+
+[id="serverless-kafka-delivery-retries_{context}"]
+= Event delivery and retries
+
+Using Kafka components in an event-driven architecture provides "at least once" event delivery. This means that operations are retried until a return code value is received. This makes applications more resilient to lost events; however, it might result in duplicate events being sent.
+
+For the Kafka event source, there is a fixed number of retries for event delivery by default. For Kafka channels, retries are only performed if they are configured in the Kafka channel `Delivery` spec.
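
The `Delivery` spec mentioned above is typically set on the `Subscription` that connects a Kafka channel to its subscriber. A minimal sketch of what a retry configuration might look like follows; it is not part of this patch, and the channel, namespace, and subscriber names are hypothetical.

[source,yaml]
----
apiVersion: messaging.knative.dev/v1
kind: Subscription
metadata:
  name: example-subscription
  namespace: default
spec:
  channel:
    apiVersion: messaging.knative.dev/v1beta1
    kind: KafkaChannel
    name: example-channel
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display
  delivery:
    retry: 5                     # number of redelivery attempts before the event is dropped
    backoffPolicy: exponential   # or "linear"
    backoffDelay: PT0.5S         # ISO-8601 duration used as the base delay between retries
----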

modules/serverless-kafka-source-kn.adoc

Lines changed: 4 additions & 0 deletions
@@ -1,3 +1,7 @@
+// Module included in the following assemblies:
+//
+// * serverless/develop/serverless-kafka-developer.adoc
+
 [id="serverless-kafka-source-kn_{context}"]
 = Creating a Kafka event source by using the Knative CLI

modules/serverless-kafka-source-odc.adoc

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 // Module included in the following assemblies:
 //
-// * serverless/event_sources/serverless-kafka-source.adoc
+// * serverless/develop/serverless-kafka-developer.adoc

 [id="serverless-kafka-source-odc_{context}"]
 = Creating a Kafka event source by using the web console

modules/serverless-kafka-source-yaml.adoc

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 // Module included in the following assemblies:
 //
-// * serverless/event_sources/serverless-kafka-source.adoc
+// * serverless/develop/serverless-kafka-developer.adoc

 [id="serverless-kafka-source-yaml_{context}"]
 = Creating a Kafka event source by using YAML
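
For reference, creating a Kafka event source by using YAML amounts to applying a `KafkaSource` object along the lines of the following sketch; it is not part of this patch, and the bootstrap server, topic, and sink names are placeholders.

[source,yaml]
----
apiVersion: sources.knative.dev/v1beta1
kind: KafkaSource
metadata:
  name: example-kafka-source
spec:
  consumerGroup: example-group               # Kafka consumer group ID used by this source
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092  # placeholder AMQ Streams bootstrap address
  topics:
    - example-topic
  sink:                                      # destination that receives the events
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display
----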

serverless/admin_guide/serverless-cluster-admin-eventing.adoc

Lines changed: 0 additions & 1 deletion
@@ -29,4 +29,3 @@ include::modules/serverless-creating-subscription-admin-web-console.adoc[levelof
 * See xref:../../serverless/develop/serverless-subs.adoc#serverless-subs[Subscriptions].
 * See xref:../../serverless/knative_eventing/serverless-triggers.adoc#serverless-triggers[Triggers].
 * See xref:../../serverless/discover/serverless-channels.adoc#serverless-channels[Channels].
-* See xref:../../serverless/knative_eventing/serverless-kafka.adoc#serverless-kafka[Knative Kafka].

serverless/admin_guide/serverless-kafka-admin.adoc

Lines changed: 27 additions & 0 deletions
@@ -0,0 +1,27 @@
+include::modules/serverless-document-attributes.adoc[]
+[id="serverless-kafka-admin"]
+= Knative Kafka
+include::modules/common-attributes.adoc[]
+:context: serverless-kafka-admin
+
+toc::[]
+
+In addition to the Knative Eventing components that are provided as part of a core {ServerlessProductName} installation, cluster administrators can install the `KnativeKafka` custom resource (CR).
+
+The `KnativeKafka` CR provides users with additional options, such as:
+
+* Kafka event source
+* Kafka channel
+// * Kafka broker
+
+[NOTE]
+====
+Knative Kafka is not currently supported for IBM Z and IBM Power Systems.
+====
+
+include::modules/serverless-install-kafka-odc.adoc[leveloffset=+1]
+
+[id="additional-resources_serverless-kafka-admin"]
+== Additional resources
+
+* See the link:https://access.redhat.com/documentation/en-us/red_hat_amq/7.6/html/amq_streams_on_openshift_overview/kafka-concepts_str#kafka-concepts-key_str[Red Hat AMQ Streams] documentation for more information about Kafka concepts.

serverless/develop/serverless-kafka-developer.adoc

Lines changed: 41 additions & 0 deletions
@@ -0,0 +1,41 @@
+[id="serverless-kafka-developer"]
+= Knative Kafka
+include::modules/serverless-document-attributes.adoc[]
+include::modules/common-attributes.adoc[]
+:context: serverless-kafka-developer
+
+toc::[]
+
+Knative Kafka functionality is available in an {ServerlessProductName} installation xref:../../serverless/admin_guide/serverless-kafka-admin.adoc#serverless-kafka-admin[if a cluster administrator has installed the `KnativeKafka` custom resource].
+
+Knative Kafka provides additional options, such as:
+
+* Kafka event source
+* xref:../../serverless/develop/serverless-creating-channels.adoc#serverless-creating-channels[Kafka channel]
+// * Kafka broker
+
+[NOTE]
+====
+Knative Kafka is not currently supported for IBM Z and IBM Power Systems.
+====
+
+include::modules/serverless-kafka-event-delivery.adoc[leveloffset=+1]
+
+[id="serverless-kafka-developer-event-source"]
+== Using a Kafka event source
+
+You can create a Knative Kafka event source that reads events from an Apache Kafka cluster and passes these events to a sink.
+
+// dev console
+include::modules/serverless-kafka-source-odc.adoc[leveloffset=+2]
+// kn commands
+include::modules/serverless-kafka-source-kn.adoc[leveloffset=+2]
+include::modules/specifying-sink-flag-kn.adoc[leveloffset=+3]
+// YAML
+include::modules/serverless-kafka-source-yaml.adoc[leveloffset=+2]
+
+[id="additional-resources_serverless-kafka-developer"]
+== Additional resources
+
+* See the link:https://access.redhat.com/documentation/en-us/red_hat_amq/7.6/html/amq_streams_on_openshift_overview/kafka-concepts_str#kafka-concepts-key_str[Red Hat AMQ Streams] documentation for more information about Kafka concepts.
+* See xref:../../serverless/discover/knative-event-sources.adoc#knative-event-sources[Event sources].

serverless/develop/serverless-kafka-source.adoc

Lines changed: 0 additions & 29 deletions
This file was deleted.

0 commit comments
