Commit 925f25c

[RHDEVDOCS-5161] Move Kafka sink docs and create new section

1 parent cb26a3c commit 925f25c

9 files changed: +42 −12 lines changed

_topic_maps/_topic_map.yml

Lines changed: 2 additions & 0 deletions

@@ -3960,6 +3960,8 @@ Topics:
   Topics:
   - Name: Event sinks overview
     File: serverless-event-sinks
+  - Name: Creating event sinks
+    File: serverless-creating-sinks
   - Name: Kafka sink
     File: serverless-kafka-developer-sink
   - Name: Brokers

_topic_maps/_topic_map_osd.yml

Lines changed: 2 additions & 0 deletions

@@ -538,6 +538,8 @@ Topics:
   Topics:
   - Name: Event sinks overview
     File: serverless-event-sinks
+  - Name: Creating event sinks
+    File: serverless-creating-sinks
   - Name: Kafka sink
     File: serverless-kafka-developer-sink
   - Name: Brokers

_topic_maps/_topic_map_rosa.yml

Lines changed: 2 additions & 0 deletions

@@ -733,6 +733,8 @@ Topics:
   Topics:
   - Name: Event sinks overview
     File: serverless-event-sinks
+  - Name: Creating event sinks
+    File: serverless-creating-sinks
   - Name: Kafka sink
     File: serverless-kafka-developer-sink
   - Name: Brokers

modules/serverless-creating-a-kafka-event-sink.adoc

Lines changed: 6 additions & 5 deletions

@@ -1,20 +1,20 @@
 // Module included in the following assemblies:
 //
-// * serverless/develop/serverless-event-sinks.adoc
+// * serverless/eventing/event-sinks/serverless-kafka-developer-sink.adoc
 
 :_content-type: PROCEDURE
 [id="serverless-creating-a-kafka-event-sink_{context}"]
+= Creating a Kafka sink by using the {product-title} web console
 
-= Creating a Kafka event sink
-
-As a developer, you can create an event sink to receive events from a particular source and send them to a Kafka topic.
+You can create a Kafka sink that sends events to a Kafka topic by using the *Developer* perspective in the {product-title} web console. By default, a Kafka sink uses the binary content mode, which is more efficient than the structured mode.
 
 .Prerequisites
-* You have installed the Red Hat OpenShift Serverless operator, with Knative Serving, Knative Eventing, and Knative Kafka APIs, from the Operator Hub.
 
+* You have installed the {ServerlessOperatorName}, with Knative Serving, Knative Eventing, and Knative Kafka APIs, from the OperatorHub.
 * You have created a Kafka topic in your Kafka environment.
 
 .Procedure
+
 . In the *Developer* perspective, navigate to the *+Add* view.
 . Click *Event Sink* in the *Eventing catalog*.
 . Search for `KafkaSink` in the catalog items and click it.
@@ -28,5 +28,6 @@ image::create-event-sink.png[]
 . Click *Create*.
 
 .Verification
+
 . In the *Developer* perspective, navigate to the *Topology* view.
 . Click the created event sink to view its details in the right panel.
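The "binary content mode" that the new module text mentions comes from the CloudEvents HTTP protocol binding: event attributes travel as `ce-*` HTTP headers and the event payload is the raw request body, avoiding the JSON envelope that the structured mode wraps around every event. As an illustrative sketch (host, event type, source, and payload below are placeholder values), the same event delivered to a sink in each mode might look like this:

```http
# Binary content mode: attributes as ce-* headers, data as the body
POST / HTTP/1.1
Host: kafka-sink-ingress.example.com
ce-specversion: 1.0
ce-type: com.example.order.created
ce-source: /orders
ce-id: 1234-5678
content-type: application/json

{"orderId": 42}

# Structured content mode: the whole event serialized as one JSON document
POST / HTTP/1.1
Host: kafka-sink-ingress.example.com
content-type: application/cloudevents+json

{"specversion":"1.0","type":"com.example.order.created","source":"/orders","id":"1234-5678","data":{"orderId":42}}
```

Binary mode is cheaper because the event data passes through without being re-encoded into a JSON envelope.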

modules/serverless-kafka-sink.adoc

Lines changed: 2 additions & 2 deletions

@@ -4,9 +4,9 @@
 
 :_content-type: PROCEDURE
 [id="serverless-kafka-sink_{context}"]
-= Using a Kafka sink
+= Creating a Kafka sink by using YAML
 
-You can create an event sink called a Kafka sink that sends events to a Kafka topic. Creating Knative resources by using YAML files uses a declarative API, which enables you to describe applications declaratively and in a reproducible manner. By default, a Kafka sink uses the binary content mode, which is more efficient than the structured mode. To create a Kafka sink by using YAML, you must create a YAML file that defines a `KafkaSink` object, then apply it by using the `oc apply` command.
+You can create a Kafka sink that sends events to a Kafka topic. By default, a Kafka sink uses the binary content mode, which is more efficient than the structured mode. To create a Kafka sink by using YAML, you must create a YAML file that defines a `KafkaSink` object, then apply it by using the `oc apply` command.
 
 .Prerequisites
 
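The retitled module above describes defining a `KafkaSink` object in a YAML file and applying it with `oc apply`. As a minimal sketch of what such a file might contain (the sink name, namespace, topic, and bootstrap server address are placeholder values, and the API group/version shown is the one served by current Knative Kafka releases):

```yaml
apiVersion: eventing.knative.dev/v1alpha1
kind: KafkaSink
metadata:
  name: my-kafka-sink        # placeholder sink name
  namespace: default
spec:
  topic: my-topic            # existing Kafka topic that receives the events
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092   # placeholder bootstrap address
```

You would then apply it with `oc apply -f kafka-sink.yaml`, after which the sink's ingress URL appears in the object's `status.address.url` field.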

serverless/eventing/event-sinks/serverless-creating-sinks.adoc (new file)

Lines changed: 16 additions & 0 deletions

@@ -0,0 +1,16 @@
+:_content-type: ASSEMBLY
+include::_attributes/common-attributes.adoc[]
+[id="serverless-creating-sinks"]
+= Creating event sinks
+:context: serverless-creating-sinks
+
+toc::[]
+
+include::snippets/serverless-about-event-sinks.adoc[]
+
+For information about creating resources that can be used as event sinks, see the following documentation:
+
+* xref:../../../serverless/knative-serving/getting-started/serverless-applications.adoc#serverless-applications[Serverless applications]
+* xref:../../../serverless/eventing/brokers/serverless-using-brokers.adoc#serverless-using-brokers[Creating brokers]
+* xref:../../../serverless/eventing/channels/serverless-creating-channels.adoc#serverless-creating-channels[Creating channels]
+* xref:../../../serverless/eventing/event-sinks/serverless-kafka-developer-sink.adoc#serverless-kafka-developer-sink[Kafka sink]

serverless/eventing/event-sinks/serverless-event-sinks.adoc

Lines changed: 1 addition & 4 deletions

@@ -6,15 +6,12 @@ include::_attributes/common-attributes.adoc[]
 
 toc::[]
 
-When you create an event source, you can specify a sink where events are sent to from the source. A sink is an addressable or a callable resource that can receive incoming events from other resources. Knative services, channels and brokers are all examples of sinks.
+include::snippets/serverless-about-event-sinks.adoc[]
 
 Addressable objects receive and acknowledge an event delivered over HTTP to an address defined in their `status.address.url` field. As a special case, the core Kubernetes `Service` object also fulfills the addressable interface.
 
 Callable objects are able to receive an event delivered over HTTP and transform the event, returning `0` or `1` new events in the HTTP response. These returned events may be further processed in the same way that events from an external event source are processed.
 
-//Creating a Kafka event sink
-include::modules/serverless-creating-a-kafka-event-sink.adoc[leveloffset=+1]
-
 // Using --sink flag with kn (generic)
 include::modules/specifying-sink-flag-kn.adoc[leveloffset=+1]
 
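The `--sink` flag referenced by the included `specifying-sink-flag-kn` module is how the `kn` CLI points a source or trigger at any addressable resource. As an illustrative sketch only (the source and service names are placeholders), a PingSource that sends events to a Knative service might be created like this:

```
$ kn source ping create test-ping-source \
    --schedule "*/2 * * * *" \
    --data '{"message": "Hello world!"}' \
    --sink ksvc:event-display
```

The `ksvc:` prefix selects a Knative service; `broker:` and `channel:` prefixes, or a plain URL, can target other sink types.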

serverless/eventing/event-sinks/serverless-kafka-developer-sink.adoc

Lines changed: 3 additions & 1 deletion

@@ -8,9 +8,11 @@ toc::[]
 
 Kafka sinks are a type of xref:../../../serverless/eventing/event-sinks/serverless-event-sinks.adoc#serverless-event-sinks[event sink] that are available if a cluster administrator has enabled Kafka on your cluster. You can send events directly from an xref:../../../serverless/eventing/event-sources/knative-event-sources.adoc#knative-event-sources[event source] to a Kafka topic by using a Kafka sink.
 
-// Kafka sink
+// Kafka sink via YAML
 include::modules/serverless-kafka-sink.adoc[leveloffset=+1]
 
+// Creating a Kafka sink via ODC
+include::modules/serverless-creating-a-kafka-event-sink.adoc[leveloffset=+1]
 
 // kafka sink security config
 include::modules/serverless-kafka-sink-security-config.adoc[leveloffset=+1]
snippets/serverless-about-event-sinks.adoc (new file)

Lines changed: 8 additions & 0 deletions

@@ -0,0 +1,8 @@
+// Text snippet included in the following modules and assemblies:
+//
+// * /serverless/eventing/event-sinks/serverless-event-sinks
+// * /serverless/eventing/event-sinks/serverless-creating-sinks
+
+:_content-type: SNIPPET
+
+When you create an event source, you can specify an event sink where events are sent to from the source. An event sink is an addressable or a callable resource that can receive incoming events from other resources. Knative services, channels, and brokers are all examples of event sinks. There is also a specific Apache Kafka sink type available.
