Commit 225e0ec

[srvls] Reorg Kafka docs
1 parent db412ff commit 225e0ec

13 files changed: +50 -33 lines

_topic_map.yml

Lines changed: 3 additions & 3 deletions
@@ -2922,6 +2922,9 @@ Topics:
 # Event delivery
 - Name: Event delivery
   File: serverless-event-delivery
+# Knative Kafka
+- Name: Knative Kafka
+  File: serverless-kafka
 # Event sources
 - Name: Event sources
   Dir: event_sources
@@ -2938,9 +2941,6 @@ Topics:
   File: serverless-sinkbinding
 - Name: Using a Kafka source
   File: serverless-kafka-source
-# Knative Kafka
-- Name: Using Apache Kafka with OpenShift Serverless
-  File: serverless-kafka
 # Functions - uncomment at tech preview
 # - Name: OpenShift Serverless Functions
 #   Dir: functions

modules/serverless-channels-creating-intro.adoc

Lines changed: 1 addition & 2 deletions
@@ -30,7 +30,6 @@ The `spec.channelTemplate` properties cannot be changed after creation, because
 
 The channel controller then creates the backing channel instance based on the `spec.channelTemplate` configuration.
 
-When this mechanism is used with the example above, two objects are created: a generic backing channel and an `InMemoryChannel` channel.
-If you are using a different default channel implementation, for example, Apache Kafka, a generic backing channel and `KafkaChannel` channel are created.
+When this mechanism is used with the preceding example, two objects are created: a generic backing channel and an `InMemoryChannel` channel. If you are using a different default channel implementation, the `InMemoryChannel` is replaced with one that is specific to your implementation. For example, with Knative Kafka, the `KafkaChannel` channel is created.
 
 The backing channel acts as a proxy that copies its subscriptions to the user-created channel object, and sets the user-created channel object status to reflect the status of the backing channel.
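For context, a user-created channel that relies on this mechanism might look like the following sketch. The metadata names and example partition values are illustrative assumptions, not part of this commit; the `spec.channelTemplate` block is what the channel controller reads when it creates the backing `KafkaChannel`.

[source,yaml]
----
apiVersion: messaging.knative.dev/v1
kind: Channel
metadata:
  name: example-channel        # illustrative name
  namespace: default
spec:
  # The channel controller uses this template to create the backing channel.
  # With Knative Kafka as the implementation, a KafkaChannel is created.
  channelTemplate:
    apiVersion: messaging.knative.dev/v1beta1
    kind: KafkaChannel
    spec:
      numPartitions: 3         # assumed example values
      replicationFactor: 1
----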

modules/serverless-creating-channel-admin-web-console.adoc

Lines changed: 1 addition & 1 deletion
@@ -20,6 +20,6 @@ If you have cluster administrator permissions, you can create a channel by using
 +
 [NOTE]
 ====
-Currently only `InMemoryChannel` channel objects are supported by default. Kafka channels are available if you have installed Apache Kafka on {ServerlessProductName}.
+Currently only `InMemoryChannel` channel objects are supported by default. Kafka channels are available if you have installed Knative Kafka on {ServerlessProductName}.
 ====
 . Click *Create*.
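For context, the Kafka channels that this note refers to are `KafkaChannel` objects. A minimal sketch, assuming an illustrative name and example partition settings, follows:

[source,yaml]
----
apiVersion: messaging.knative.dev/v1beta1
kind: KafkaChannel
metadata:
  name: example-kafka-channel  # illustrative name
  namespace: default
spec:
  numPartitions: 3             # assumed example values
  replicationFactor: 1
----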

modules/serverless-event-delivery-component-behaviors.adoc

Lines changed: 3 additions & 1 deletion
@@ -7,10 +7,12 @@
 
 Different Knative Eventing channel types have their own behavior patterns that are followed for event delivery. Developers can set event delivery parameters in the subscription configuration to ensure that any events that fail to be delivered from channels to an event sink are retried. You must also configure a dead letter sink for subscriptions if you want to provide a sink where events that are not eventually delivered can be stored, otherwise undelivered events are dropped.
 
-== Event delivery behavior for Apache Kafka channels
+[id="serverless-event-delivery-component-behaviors-kafka-channels_{context}"]
+== Event delivery behavior for Knative Kafka channels
 
 If an event is successfully delivered to a Kafka channel or broker receiver, the receiver responds with a `202` status code, which means that the event has been safely stored inside a Kafka topic and is not lost. If the receiver responds with any other status code, the event is not safely stored, and steps must be taken by the user to resolve this issue.
 
+[id="serverless-event-delivery-component-behaviors-status-codes_{context}"]
 == Delivery failure status codes
 
 The channel or broker receiver can respond with the following status codes if an event fails to be delivered:
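For context, the retry and dead letter sink parameters described in this module are set in the `delivery` block of a subscription. A minimal sketch, assuming illustrative channel and service names, follows:

[source,yaml]
----
apiVersion: messaging.knative.dev/v1
kind: Subscription
metadata:
  name: example-subscription   # illustrative name
  namespace: default
spec:
  channel:
    apiVersion: messaging.knative.dev/v1beta1
    kind: KafkaChannel
    name: example-kafka-channel    # assumed channel name
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display          # assumed sink service
  delivery:
    # Retry failed deliveries before giving up.
    retry: 3
    backoffPolicy: exponential
    backoffDelay: PT0.5S
    # Events that still fail are sent here instead of being dropped.
    deadLetterSink:
      ref:
        apiVersion: serving.knative.dev/v1
        kind: Service
        name: dead-letter-service  # assumed sink service
----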

modules/serverless-install-kafka-odc.adoc

Lines changed: 2 additions & 2 deletions
@@ -3,9 +3,9 @@
 // serverless/serverless-kafka.adoc
 
 [id="serverless-install-kafka-odc_{context}"]
-= Installing Apache Kafka components by using the web console
+= Installing Knative Kafka components by using the web console
 
-Cluster administrators can enable the use of Apache Kafka functionality in an {ServerlessProductName} deployment by instantiating the `KnativeKafka` custom resource definition provided by the *Knative Kafka* {ServerlessOperatorName} API.
+Cluster administrators can enable the use of Knative Kafka functionality in an {ServerlessProductName} deployment by instantiating the `KnativeKafka` custom resource definition provided by the *Knative Kafka* {ServerlessOperatorName} API.
 
 .Prerequisites
 
modules/serverless-rn-1-11-0.adoc

Lines changed: 1 addition & 4 deletions
@@ -10,17 +10,14 @@
 == New features
 
 * Knative Eventing on {ServerlessProductName} is now Generally Available (GA).
-* Apache Kafka features such as Kafka channel and Kafka event source are now available as a Technology Preview on {ServerlessProductName}. Kafka integration is delivered through the {ServerlessOperatorName} and does not require a separate community Operator installation. For more information, see the documentation on _Using Apache Kafka with OpenShift Serverless_.
+* Knative Kafka features such as Kafka channel and Kafka event source are now available as a Technology Preview on {ServerlessProductName}. Kafka integration is delivered through the {ServerlessOperatorName} and does not require a separate community Operator installation.
 * {ServerlessProductName} Functions is now delivered as a Developer Preview through the standard Knative `kn` CLI installation. This feature is not yet supported by Red Hat for production deployments, but can be used for development and testing. For more information about using {ServerlessProductName} Functions through the `kn func` CLI, see the link:https://openshift-knative.github.io/docs/docs/functions/about-functions.html[{ServerlessProductName} Functions Developer Preview documentation].
 * {ServerlessProductName} now uses Knative Serving 0.17.3.
 * {ServerlessProductName} uses Knative Eventing 0.17.2.
 * {ServerlessProductName} now uses Kourier 0.17.0.
 * {ServerlessProductName} now uses Knative `kn` CLI 0.17.3.
 * {ServerlessProductName} now uses Knative Kafka 0.17.1.
 
-// [id="fixed-issues-1-11-0_{context}"]
-// == Fixed issues
-
 [id="known-issues-1-11-0_{context}"]
 == Known issues
 
modules/serverless-rn-1-14-0.adoc

Lines changed: 3 additions & 3 deletions
@@ -14,15 +14,15 @@
 * {ServerlessProductName} now uses Kourier 0.20.0.
 * {ServerlessProductName} now uses Knative `kn` CLI 0.20.0.
 * {ServerlessProductName} now uses Knative Kafka 0.20.0.
-* Apache Kafka on {ServerlessProductName} is now Generally Available (GA).
+* Knative Kafka on {ServerlessProductName} is now Generally Available (GA).
 +
 [IMPORTANT]
 ====
-Only the `v1beta1` version of the API for Apache Kafka on {ServerlessProductName} is supported. Do not use the `v1alpha1` version of the API, as this is deprecated.
+Only the `v1beta1` version of the API for Knative Kafka on {ServerlessProductName} is supported. Do not use the `v1alpha1` version of the API, as this is deprecated.
 ====
 * The Operator channel for installing and upgrading {ServerlessProductName} has been updated to `stable` for {product-title} 4.6 and newer versions.
 * {ServerlessProductName} is now supported on IBM Power Systems, IBM Z, and LinuxONE, except for the following features, which are not yet supported:
-** Apache Kafka functionality.
+** Knative Kafka functionality.
 ** {ServerlessProductName} Functions developer preview.
 // Not including Camel-K since we don't document or support that yet for serverless anyway.

serverless/admin_guide/serverless-cluster-admin-eventing.adoc

Lines changed: 1 addition & 2 deletions
@@ -29,5 +29,4 @@ include::modules/serverless-creating-subscription-admin-web-console.adoc[levelof
 * See xref:../../serverless/event_workflows/serverless-subs.adoc#serverless-subs[Subscriptions].
 * See xref:../../serverless/event_workflows/serverless-triggers.adoc#serverless-triggers[Triggers].
 * See xref:../../serverless/event_workflows/serverless-channels.adoc#serverless-channels[Channels].
-* For information about using Apache Kafka components, see xref:../../serverless/serverless-kafka.adoc#serverless-kafka[
-Using Apache Kafka with {ServerlessProductName}].
+* See xref:../../serverless/event_workflows/serverless-kafka.adoc#serverless-kafka[Knative Kafka].

serverless/event_sources/knative-event-sources.adoc

Lines changed: 4 additions & 3 deletions
@@ -13,7 +13,7 @@ Currently, {ServerlessProductName} supports the following event source types:
 API server source:: Connects a sink to the Kubernetes API server.
 Ping source:: Periodically sends ping events with a constant payload. It can be used as a timer.
 Sink binding:: Allows you to connect core Kubernetes resource objects, such as `Deployment`, `Job`, or `StatefulSet` objects, with a sink.
-Apache Kafka source:: Connect a Kafka cluster to a sink as an event source.
+Knative Kafka source:: Connects a Kafka cluster to a sink as an event source.
 
 You can create and manage Knative event sources using the **Developer** perspective in the {product-title} web console, the `kn` CLI, or by applying YAML files.
 
@@ -22,7 +22,8 @@ You can create and manage Knative event sources using the **Developer** perspect
 * Create a xref:../../serverless/event_sources/serverless-sinkbinding.adoc#serverless-sinkbinding[sink binding].
 * Create a xref:../../serverless/event_sources/serverless-kafka-source.adoc#serverless-kafka-source[Kafka source].
 
-[id="knative-event-sources-additional-resources"]
+[id="additional-resources_knative-event-sources"]
 == Additional resources
+
 * For more information about eventing workflows using {ServerlessProductName}, see xref:../../serverless/architecture/serverless-event-architecture.adoc#serverless-event-architecture[Knative Eventing architecture].
-* For more information about using Kafka event sources, see xref:../../serverless/serverless-kafka.adoc#serverless-kafka[Using Apache Kafka with {ServerlessProductName}].
+* See xref:../../serverless/event_workflows/serverless-kafka.adoc#serverless-kafka[Knative Kafka].

serverless/event_sources/serverless-kafka-source.adoc

Lines changed: 9 additions & 2 deletions
@@ -6,13 +6,20 @@ include::modules/common-attributes.adoc[]
 
 toc::[]
 
-The Apache Kafka event source brings messages into Knative. It reads events from an Apache Kafka cluster and passes these events to an event sink so that they can be consumed. You can use the `KafkaSource` event source with {ServerlessProductName}.
+You can create a Knative Kafka event source that reads events from an Apache Kafka cluster and passes these events to a sink.
+
+[id="prerequisites_serverless-kafka-source"]
+== Prerequisites
+
+You can use the `KafkaSource` event source with {ServerlessProductName} after you have xref:../../serverless/admin_guide/installing-knative-eventing.adoc#installing-knative-eventing[Knative Eventing] and xref:../../serverless/event_workflows/serverless-kafka.adoc#serverless-kafka[Knative Kafka] installed on your cluster.
 
 include::modules/serverless-kafka-source-odc.adoc[leveloffset=+1]
 include::modules/serverless-kafka-source-kn.adoc[leveloffset=+1]
 include::modules/serverless-kafka-source-yaml.adoc[leveloffset=+1]
 
-[id="serverless-kafka-source-additional-resources"]
+[id="additional-resources_serverless-kafka-source"]
 == Additional resources
 
+* See xref:../../serverless/event_sources/knative-event-sources.adoc#knative-event-sources[Getting started with event sources].
+* See xref:../../serverless/event_workflows/serverless-kafka.adoc#serverless-kafka[Knative Kafka].
 * See the link:https://access.redhat.com/documentation/en-us/red_hat_amq/7.6/html/amq_streams_on_openshift_overview/kafka-concepts_str#kafka-concepts-key_str[Red Hat AMQ Streams] documentation for more information about Kafka concepts.
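For context, a `KafkaSource` object of the kind this assembly documents might look like the following sketch. The topic, consumer group, bootstrap address, and sink name are illustrative assumptions.

[source,yaml]
----
apiVersion: sources.knative.dev/v1beta1
kind: KafkaSource
metadata:
  name: example-kafka-source   # illustrative name
  namespace: default
spec:
  consumerGroup: knative-group               # assumed consumer group
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092  # assumed Kafka address
  topics:
    - knative-demo-topic                     # assumed topic
  sink:
    # Events read from the topic are delivered to this sink.
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: event-display                    # assumed sink service
----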
